OSNews article about C++09 degenerates into C++ vs. D discussion

Don Clugston dac at nospam.com.au
Wed Nov 22 05:32:46 PST 2006


Kyle Furlong wrote:
> John Reimer wrote:
>> On Tue, 21 Nov 2006 21:51:35 -0800, Kyle Furlong 
>> <kylefurlong at gmail.com> wrote:
>>
>>> Steve Horne wrote:
>>>> On Sun, 19 Nov 2006 15:28:33 -0800, "John Reimer"
>>>> <terminal.node at gmail.com> wrote:
>>>>
>>>>> On Sun, 19 Nov 2006 14:59:19 -0800, BCS <BCS at pathilink.com> wrote:
>>>>>
>>>>>> Mars wrote:
>>>>>>> http://www.osnews.com/comment.php?news_id=16526
>>>>>>
>>>>>> One issue brought up is that of D "requiring" the use of a GC.
>>>>>> What would it take to prove that wrong by making a full-blown 
>>>>>> standard lib that doesn't use a GC, and in fact doesn't have a GC?
>>>>  I don't know. Personally, I am all in favour of having the choice -
>>>> but remember, it's not just a matter of creating that library.
>>>> Maintaining two standard libraries would mean a lot of ongoing
>>>> headaches.
>>>>
>>>>> Note, however, that C++ users, many of whom have grown dependent on 
>>>>> manual memory management, are looking for a reason to fault D.  
>>>>> I've actually heard cases where C++ users lambast GC-based 
>>>>> languages: use of a GC apparently creates "bad programming 
>>>>> practices" -- imagine the laziness of not cleaning up after yourself!
>>>>  I agree - but I also strongly disagree.
>>>>  The problem is that memory management isn't just about allocating and
>>>> freeing memory. It is closely coupled with newing and deleting, with
>>>> constructors and destructors, and therefore with wider resource
>>>> management issues.
>>>>  Two problems can arise...
>>>>  1.  Garbage collection isn't immediate. Resources can stay locked long
>>>>     after they should have been freed, because the garbage collector
>>>>     hasn't got around to destroying those objects yet. This can be a
>>>>     problem if you are trying to acquire further locks or whatever.
>>>>  2.  Reference cycles. Take Java. It can garbage collect when there are
>>>>     reference cycles, sure, but it cannot know what order to destroy
>>>>     those objects in. Calling destructors in the wrong order could
>>>>     cause big problems.
>>>>      Solution - don't call the destructors (sorry, finalisers) at all.
>>>>     Just free the memory, since it doesn't matter what order you do
>>>>     that in.
>>>>      So that's it - Java doesn't guarantee to call finalisers. I don't
>>>>     know for sure that this is why, but it is the only good reason I
>>>>     can think of.
>>>>      If you think reference cycles are a theoretical rather than real
>>>>     problem, well, I'm afraid many practical data structures have
>>>>     them - even the humble doubly-linked list.
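
To make the doubly-linked list point concrete, here is a minimal D sketch
(the names are purely illustrative):

    // Even a two-node doubly-linked list contains a reference cycle.
    class Node
    {
        Node prev;
        Node next;
        ~this() { /* is prev or next still valid here? no way to know */ }
    }

    void makeCycle()
    {
        Node a = new Node;
        Node b = new Node;
        a.next = b;    // a refers to b ...
        b.prev = a;    // ... and b refers back to a
        // Once both become unreachable there is no safe destruction
        // order, which is exactly why Java-style collectors give up
        // and skip the finalisers.
    }
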
>>>>  Either of these problems is sufficient on its own to mean that the
>>>> garbage collector cannot be relied upon. As the programmer, you have
>>>> to take responsibility for ensuring that the cleaning up is done. And
>>>> that, according to black-and-white reasoning, defeats the whole point
>>>> of garbage collection.
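
In D, at least, taking that responsibility is cheap to express. A rough
sketch, where LogFile is a made-up resource wrapper rather than a real
Phobos class:

    void appendLog(char[] msg)
    {
        LogFile f = new LogFile("app.log");  // hypothetical wrapper class
        scope(exit) f.close();  // the handle is released deterministically,
                                // whether or not the GC ever runs
        f.writeLine(msg);       // hypothetical method
    }
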
>>>>  But then these problems, even counted together, only create issues for
>>>> a minority of objects in most code.
>>>>  Awkward persons might observe that the rate of problems tends to
>>>> increase in lower level code, and that this is why the
>>>> applications-oriented language Java has more problems than the
>>>> very-high-level languages that also do GC, such as Python. And those
>>>> same awkward persons might then point out that D explicitly targets
>>>> systems level code, aiming its sights at a somewhat lower level than
>>>> Java.
>>>>  But let's put that point to one side for a bit.
>>>>  Someone intelligent enough to consider shades of grey might still
>>>> argue that it is a good idea to develop good habits early, and to
>>>> apply them consistently. It saves on having these problems arise as
>>>> surprise bugs, and perhaps as a result of third party libraries that
>>>> you don't have source for and cannot fix.
>>>>  I have a lot of sympathy with this point of view, and don't think it
>>>> can be lightly dismissed. It isn't just a matter of taking sides and
>>>> rejecting the other side no matter what. It is a valid view of the
>>>> issue.
>>>>  The trouble is that the non-GC way is also prone to surprise bugs.
>>>>  So, as far as I can see, neither approach is a clear and absolute
>>>> winner. I know it can seem as if GC is the 'modern' way and that
>>>> non-GC is a dinosaur, but good and bad isn't decided by fashions or
>>>> bandwagons. Both GC and non-GC have problems.
>>>>  Now to consider that point I put to one side. D is explicitly aimed at
>>>> systems level code. Well, that's true, but in the context of GC we
>>>> have a rather skewed sense of high-level vs low-level - low level
>>>> would tend to mean data structures and resource management rather than
>>>> bit twiddling and hardware access. D systems level programming is
>>>> probably about as prone to GC problems as Java applications level
>>>> programming.
>>>>  In any case, D is a systems level language in the sense of
>>>> down-to-and-including systems level. Most real world code has a mix of
>>>> high-level and low-level. So in a single app, there can be a whole
>>>> bunch of high-level code where GC is a near perfect approach, and a
>>>> whole bunch of low-level code in which GC cannot be relied upon and is
>>>> probably just an unwanted complication.
>>>>  And when there are two equally valid approaches, each of which has its
>>>> own advantages and disadvantages, and both of which could be valuable
>>>> in the same application, which should the serious programmer demand?
>>>> Particularly the systems-level programmer?
>>>>  Right - Both!
>>>>  But does it make sense to demand a separate non-GC standard library?
>>>> That seems to suggest a world where an application is either all GC or
>>>> all non-GC.
>>>>  GC seems pointless if it doesn't happen by default, so the approach of
>>>> opting out for specific classes when necessary seems, to me, to be as
>>>> close to ideal as you can get. And even then, there's the proviso that
>>>> you should stick to the default approach as much as possible and make
>>>> damned sure that when you opt out, it's clear what you are doing and
>>>> why. It's not a GC-is-superior thing, just a consistency thing -
>>>> minimising confusion and complexity.
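
For the record, opting a class out of the GC heap in D currently means
giving it its own allocator and deallocator. Roughly like this - sketched
from memory, so the exact Phobos module and type names may differ:

    import std.c.stdlib;      // malloc, free
    import std.gc;            // addRange, removeRange
    import std.outofmemory;   // OutOfMemoryException

    class NonGcWidget
    {
        new(size_t sz)
        {
            void* p = std.c.stdlib.malloc(sz);
            if (!p)
                throw new OutOfMemoryException;
            // The collector will never free this block, but it should
            // still scan it for pointers to GC-owned data.
            std.gc.addRange(p, cast(byte*) p + sz);
            return p;
        }

        delete(void* p)
        {
            if (p)
            {
                std.gc.removeRange(p);
                std.c.stdlib.free(p);
            }
        }
    }
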
>>>>  In that case, with GC as the default and with opting out being
>>>> reserved for special cases, you're probably going to carry on using
>>>> the GC standard library anyway.
>>>>  As for embedded platforms, if malloc and free would work, so would
>>>> garbage collection. If not, you probably can't use any conventional
>>>> standard library (and certainly not data structure library code), and
>>>> should be using a specialised embedded development library (probably
>>>> tailored for the specific platform).
>>>>  In other words, the only benefit I can see to having a separate non-GC
>>>> library is marketing. And it seems that a better approach is to
>>>> educate people about the benefits of dropping the old either-or
>>>> thinking and choosing both.
>>>>  AFAIK, there are two competitors in this having-both approach, and
>>>> they are both C++. Managed C++, and C++ with a GC library. And they
>>>> both get it wrong IMO - you have to opt in to GC, not opt out. If GC
>>>> isn't the default, you get new classes of bugs - the 'oh - I thought
>>>> that was GC, but apparently not' and 'damn - I forgot to specify GC
>>>> for this' bugs.
>>>>  So there we are, D is not only already perfect, it is the only
>>>> language available that has achieved this amazing feat ;-)
>>>>
>>>
>>> Wow, that was long, but good. Make it an article, Walter?
>>
>>
>> It was too long, but with good points.  If it were pared down, it 
>> would read more easily and the points might hit home even harder.
>>
>> Concerning D and GC:
>>
>> The problem is that most D apologists /DO/ advertise D as having the 
>> best of both worlds when it comes to memory management, but C++ fans 
>> are bound and determined to see D as practically a GC-only language: 
>> the GC is one of the first points they always bring up.  They keep 
>> seeing it in the same light as Java and other such languages.  It's 
>> unfair and short-sighted, but a typical response.
>>
>> If you really take an honest look at OSNEWS posts and others, you will 
>> realize that some of these people are literally annoyed at D and D 
>> promoters for reasons that run deeper than, and are unrelated to, the 
>> language itself.  You can't argue with that.  Some good considerations, 
>> like Steve's, just don't hit home with those boys.
>>
>> -JJR
> 
> I seriously think there is a sizable group of people who use C++ at 
> their workplace, and for their hobbies, and maybe have written a 
> convoluted something or other for Boost. These people have invested a 
> huge amount of time and effort to carve out something usable from the 
> jungles that are the C++ lands.
> 
> These people fight D because they see how it will simply negate that 
> time investment by making it irrelevant. When it comes down to it, 
> someone who actually understands C++ in depth and can be productive in 
> it is a very valuable person. If D becomes the de facto standard, that 
> skill set becomes much less valuable.
> 
> That's not to say that someone who is at that level of understanding in 
> C++ can easily adapt to D, but the psychology of it is that they have 
> put so much time into actually getting C++ to work for them that it's 
> an abhorrent idea for them to leave that behind.
> 
> Any reason they can grasp on to, they will. Any defect they can find, 
> they'll point it out. Hopefully, over time, the smart ones will realize 
> the dead end and move on to D.

Actually, I think that anyone who's put a lot of effort into Boost-style 
template code will have a huge list of C++ quirks that they wish would 
be fixed. My first impression of D was "there's loads of cool stuff in 
here that I wish was in C++, but the templates aren't good enough, 
because there's no IFTI (implicit function template instantiation)". My 
second impression was that there were enough interesting features (like 
mixins and static if) to give D a go despite the absence of IFTI.
Well, now that we have IFTI and tuples(!), I seriously don't think any 
template aficionado is likely to evaluate D negatively in that regard.
Once the word gets around, I think there'll be a lot of defections.
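
For the curious, here is a rough taste of how IFTI, static if and tuples
work together now (the identifiers are made up purely for illustration):

    import std.stdio;

    // static if selects a branch at compile time based on the type.
    T combine(T)(T a, T b)
    {
        static if (is(T : char[]))
            return a ~ b;        // concatenate strings
        else
            return a + b;        // add numbers
    }

    // Args... is a tuple of argument types; foreach unrolls over it.
    void printAll(Args...)(Args args)
    {
        foreach (arg; args)
            writefln("%s", arg);
    }

    void main()
    {
        // IFTI: T and Args are deduced, no combine!(int) needed.
        int    i = combine(3, 4);
        char[] s = combine("foo".dup, "bar".dup);
        printAll(i, s, 3.14);
    }
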


