auto classes and finalizers

kris foo at bar.com
Wed Apr 5 20:02:32 PDT 2006


Dave wrote:
> In article <e11ki9$rtq$1 at digitaldaemon.com>, kris says...
> 
>>Jarrett Billingsley wrote:
>>
>>>"kris" <foo at bar.com> wrote in message news:e11fds$m0m$1 at digitaldaemon.com...
>>>
>>>
>>>>Yes, it is. The "death tractors" (dtors in D) are notably less than useful 
>>>>right now. Any dependencies are likely in an unknown state (as you note), 
>>>>and then, dtors are not invoked when the program exits. From
>>>>what I recall, dtors are not even invoked when you "delete" an object? 
>>>>It's actually quite hard to nail down when they /are/ invoked :)
>>>
>>>
>>>They are invoked when you call delete.  This is how you do the deterministic 
>>>"list of special stuff" that you mention - you just 'delete' them all, 
>>>perhaps in a certain order.
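
For anyone following along, a minimal sketch of that (my own illustration, not
Jarrett's code; D 0.x-era syntax, made-up class name):

import std.stdio;

class Resource
{
    ~this() { writefln("resource released"); }
}

void main()
{
    Resource r = new Resource;
    // ... use r ...
    delete r;   // the dtor runs right here, deterministically,
                // not at some later (or never) collection
}
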
>>
> 
> Ok, so for non-auto death tractors (that name is great):
> 
> a) non-auto D class dtors are actually what are called finalizers everywhere
> else, except when delete is explicitly called.
> b) although dtors are eventually all called, it is non-deterministic unless the
> class is auto, or delete is used explicitly.
> c) unless dtors are called deterministically, they could often be considered
> worthless since, with a GC handling memory, the primary reason for dtors is to
> release other expensive external resources.
> d) there is (a lot of) overhead involved with 'dtors for every class'.
> e) All this has been a major sticking point in other languages and runtimes
> (like VB & C#.NET). Because of c) and d), in those languages, the workaround
> they use is finalizers instead of dtors (they also have Dispose, but that needs
> to be called explicitly), and using(...) takes the place of auto/delete. IIRC,
> exactly when these finalizers are called is always non-deterministic and not
> even guaranteed unless an explicit "full collect" is done, and a big part of
> this is precisely because it's so expensive. Although I program in those
> languages day to day, because of this, I don't rely on anything that is going on
> behind the scenes as I've always ended up explicitly "finalizing" things myself
> rather than relying on the GC or the using(...) statement. If you've done a lot
> of DB work in .NET (for example), then you'll know that doing this is sometimes
> as bothersome as malloc/free or new/delete (and Thank God for .NET's
> try/finally). That is a major reason I think finalizers are useless unless
> they're always deterministic.
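
To make (b) and (e) concrete, this is roughly how the three cases look in D
today (my own sketch, from memory; made-up class name, D 0.x syntax):

import std.stdio;

class File
{
    ~this() { writefln("file handle released"); }
}

void deterministic()
{
    auto File f = new File;   // 'auto' storage class: dtor runs at end of
                              // scope, roughly D's answer to using(...)
    File g = new File;
    delete g;                 // explicit delete: dtor runs right here
}                             // <- f's dtor runs here

void nonDeterministic()
{
    File h = new File;        // neither auto nor delete: the dtor runs
                              // whenever (if ever) the GC collects h
}

Only the first two are anything you can hang real resource management on; the
third is the finalizer-style case you're describing.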
> 
> From some tests I've done in the past and recently duplicated in
> http://www.digitalmars.com/drn-bin/wwwnews?digitalmars.D/36258, just attempting
> to set a finalizer is damned expensive, and a lot of that expense is because
> setFinalizer needs to be synchronized. IIRC, in the tests I've run in the past,
> if the finalizer overhead is removed, the current GC can actually run as fast
> as new/delete for C++ classes or malloc/free for C structs, for smallish class
> objects over several collections.
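
I haven't run your benchmark, but I take it the shape of the comparison is
roughly this (my own sketch; the class names are made up, timing left out):

import std.gc;

class Small         { int x; }              // no dtor: no finalizer to register
class SmallWithDtor { int x; ~this() { } }  // dtor: finalizer bookkeeping kicks in

void stress(int n)
{
    // Case 1: GC-managed, no dtor, so no finalizer ever gets set
    for (int i = 0; i < n; i++)
    {
        Small a = new Small;
    }

    // Case 2: GC-managed, but every allocation also pays for the
    // (synchronized) finalizer registration you're measuring
    for (int i = 0; i < n; i++)
    {
        SmallWithDtor b = new SmallWithDtor;
    }

    // Case 3: explicit delete; the dtor runs, but no collection is needed
    for (int i = 0; i < n; i++)
    {
        SmallWithDtor c = new SmallWithDtor;
        delete c;
    }

    std.gc.fullCollect();   // force the collection the first two cases rely on
}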
> 
> There is not only an expense in setting the finalizer; the way the current D GC
> works, there is also overhead in every collection from checking for finalizers,
> even for non-class objects. It looks to me like if all
> the non-deterministic finalization cruft could be removed from the GC, the
> *current* GC may actually be a little faster than malloc/free for class objects
> (at least moderately sized ones).
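
If I follow, the point is that conceptually (made-up types, nothing like the
real gcx structures) the sweep ends up doing something akin to:

struct Block
{
    bool marked;
    bool hasFinalizer;
    void delegate() finalizer;
}

void sweep(Block[] heap)
{
    foreach (Block b; heap)
    {
        if (!b.marked)
        {
            if (b.hasFinalizer)   // this test (and the bookkeeping behind it)
                b.finalizer();    // is paid on every dead block, class or not
            // ... hand the block back to the free lists ...
        }
    }
}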
> 
> The long and short of it is I like Mike's ideas regarding allowing dtors only for
> auto classes. In that way, the GC wouldn't have to deal with finalizers at all,
> or at least not during non-deterministic collections. It would also still allow D to
> claim RAII because 'auto' classes are something new for D compared to most other
> languages.

I could buy that too, if the darned "auto" keyword weren't so overloaded :-P
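
What I mean is that the same four letters already pull double duty (sketch from
memory of current DMD behaviour; Foo is just a placeholder):

class Foo { ~this() { } }

void f()
{
    auto Foo a = new Foo;   // storage class: scoped lifetime, dtor at the '}'
    auto b = new Foo;       // type inference -- whether this one is *also*
                            // scoped is exactly the sort of thing you have
                            // to stop and look up
}

Hanging "the only classes that get dtors" off the same keyword wouldn't exactly
make it less confusing.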

[snip]


