Disable GC entirely
Manu
turkeyman at gmail.com
Tue Apr 9 23:02:54 PDT 2013
On 10 April 2013 15:08, Rob T <alanb at ucora.com> wrote:
> On Wednesday, 10 April 2013 at 04:32:52 UTC, Manu wrote:
>
>> final on the class means you can't derive it anymore (what's the point of a class?),
>>
>
> I think that you'd place final on a derived class, not a base class. So it
> can make perfect sense, although final on the methods of a final class
> makes little sense so it should probably be a compiler error.
>
>
>> and the manual final blocks are totally prone to error.
>> In my experience, 90% of functions are not (or rather, should not be)
>> virtual. The common (and well performing) case should be default.
>>
>
> Believe it or not, but I actually have been in the reverse position more
> than once, so which way the default should go is debatable. In your case I
> can certainly see why you'd prefer the opposite as I've done RT programming
> before and will be doing it again. What could perhaps work is a module
> level specifier that indicates what the defaults should be as a matter of
> use case preference, but I can see that going horribly wrong.
>
> The question I have, is why use a class if you do not need to use virtual
> functions? I think the problem goes a bit deeper than the defaults being
> opposite of what you want. I expect that you'd probably want struct
> inheritance or something like that but cannot get it from D?
I do use virtual functions, that's the point of classes. But most functions
are not virtual. More to the point, most functions are trivial accessors, which
really shouldn't be virtual.
OOP by design recommends liberal use of accessors, ie, properties, that
usually just set or return a variable. When would you ever want @property
size_t size() { return m_size; } to be a virtual call?
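For concreteness, here is the trade-off sketched in C++ terms (class and member names are mine, purely illustrative): a virtual accessor called through a base pointer goes via the vtable and can't be inlined across that call, while a non-virtual one is a direct, trivially inlinable load.

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical sketch of the accessor pattern under discussion.
class VirtualBuffer {
public:
    virtual ~VirtualBuffer() = default;
    // Virtual: calls through a base pointer dispatch via the vtable,
    // so the compiler generally cannot inline them.
    virtual std::size_t size() const { return m_size; }
protected:
    std::size_t m_size = 0;
};

class FinalBuffer {
public:
    // Non-virtual: a direct call, trivially inlined to a member load.
    std::size_t size() const { return m_size; }
private:
    std::size_t m_size = 0;
};
```

Both read the same field; the difference is purely in how the call is dispatched, which is the cost Manu is objecting to paying by default.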
A base class typically offers a sort of template of something, implementing
as much shared/common functionality as possible, but which you might
extend, or make more specific in some very controlled way.
Typically the base functionality and associated accessors deal with
variable data contained in the base-class.
The only situations I can imagine where most functions would be virtual are
either a) ridiculously small classes with only 2 functions (at least you'll
only type 'virtual' once or twice in this case), or b) some sort of OOP-tastic
widget that 'can do anything!', or tries to be a generalisation of an
'anything', which is, frankly, horrible and immature software design, and
basically the entire reason OOP is developing a reputation as a terrible
design pattern in the first place...
I wouldn't make the latter case an implicit recommendation through core
language design... but apparently that's just me ;)
No, I don't want struct inheritance (although that would be nice! but I'm
okay with aggregating and 'alias this'); that would insist on using 'ref'
everywhere, and you can't create ref locals, so you can't really use
structs conveniently this way.
Classes are reference types, that's the point. I know what a class is, and
I'm happy with all aspects of the existing design, except this one thing.
Can you demonstrate a high level class, ie, not a primitive tool, but the
sort of thing a programmer would write in their daily work where all/most
functions would be virtual?
I can paste almost any class I've ever written; there are usually 2-4
virtuals among 20-30 functions.
Any language with properties can't have virtual-by-default.
>> Seriously, .length or any other trivial property that can no longer be
>> inlined, or even just called.
>> And human error aside, do you really want to have to type final on every
>> function? Or indent every line an extra tab level?
>>
>
> Mark your properties as final?
>
That's 90% of the class! You are familiar with OOP, right? :)
Almost everything is an accessor...
I usually have 2 virtual functions, update() and draw(), or perhaps there's
a purpose-specific doWork() type of function to perform the derived
object's designated function, but basically everything else is trivial
accessors, or base-class concepts that make absolutely no sense to override.
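The shape described above can be sketched in C++ (names like Entity, update(), and the accessors are illustrative stand-ins, not from any real API): two deliberate extension points, everything else plain non-virtual accessors.

```cpp
#include <cassert>
#include <string>
#include <utility>

// Hypothetical game-style base class: the virtuals are the only two
// functions a derived class is expected to override.
class Entity {
public:
    virtual ~Entity() = default;

    // The two deliberate extension points:
    virtual void update(float dt) { m_age += dt; }
    virtual void draw() const {}

    // Trivial accessors -- no reason for any of these to be overridable:
    const std::string& name() const { return m_name; }
    void setName(std::string n) { m_name = std::move(n); }
    float age() const { return m_age; }
    bool visible() const { return m_visible; }
    void setVisible(bool v) { m_visible = v; }

private:
    std::string m_name;
    float m_age = 0;
    bool m_visible = true;
};
```

This also illustrates the self-documenting point made below: a reader scanning the class sees immediately that update() and draw() are the contract with derived types.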
I also work with lots of C++ middleware, where the virtuals are usually
tightly controlled and deliberately minimised, and there's a good reason
for this too: the fewer virtuals the user is expected to override, the
simpler your class is to understand and work with!
Additionally, it's a nice self-documenting feature; at a glance you can see
the virtuals, ie, what you need to do to make use of a 3rd party OOP API.
[...]
>
>> I don't want to ban the use of virtual functions. I want to ban the use of
>> virtual functions that aren't marked virtual explicitly! ;)
>>
>> Likewise, I like the GC, I just want to be able to control it.
>> Disable auto-collect, explicitly issue collect calls myself at controlled
>> moments, and give the collect function a maximum timeout where it will
>> yield, and then resume where it left off next time I call it.
>>
>
> I agree 100% and have that need too. I'd go further and also prefer the
> ability to optionally ban certain language features from use from within
> selective parts of my code base. As you say, I do not actually want to
> outright ban the GC or any other language feature (I really do use
> them!!!), it's only the desire to be able to have much better control over
> it for the situations that demand precision and certainty.
>
Precisely.
> Having control over D and the GC like what we're talking about in here can
> turn D into a seriously awesome systems language unlike any other.
Correct, it's not quite a systems language while the GC does whatever it
wants. But D needs the GC to be considered a 'modern', and generally
productive language.
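The control being asked for (no automatic collection, explicit collect() calls with a maximum time budget that yields and resumes where it left off) can be sketched as an interface. This is purely illustrative C++: it is not D's actual core.memory GC API, and the "work units" below are a stand-in for real increments of marking and sweeping.

```cpp
#include <cassert>
#include <chrono>
#include <cstddef>

// Hypothetical incremental-GC control surface of the kind described above.
class IncrementalGC {
public:
    void disableAutoCollect() { m_auto = false; }

    // Run collection work until done or the budget expires. Returns true
    // when a full cycle completed, false if it yielded mid-cycle; a later
    // call resumes from where it stopped.
    bool collect(std::chrono::microseconds budget) {
        using clock = std::chrono::steady_clock;
        const auto deadline = clock::now() + budget;
        while (m_cursor < m_workUnits) {
            if (clock::now() >= deadline)
                return false;          // yield; m_cursor marks where to resume
            ++m_cursor;                // stand-in for one increment of GC work
        }
        m_cursor = 0;                  // cycle finished; start fresh next time
        return true;
    }

private:
    bool m_auto = true;
    std::size_t m_cursor = 0;
    std::size_t m_workUnits = 1000;   // stand-in for outstanding GC work
};
```

A frame loop would then call collect() once per frame with whatever slice of the frame budget is spare, which is exactly the "controlled moments" usage described in the quoted text.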
More information about the Digitalmars-d
mailing list