Why is D unpopular?

H. S. Teoh hsteoh at qfbox.info
Tue May 31 02:47:21 UTC 2022


On Tue, May 31, 2022 at 02:07:53AM +0000, forkit via Digitalmars-d wrote:
> On Monday, 30 May 2022 at 18:58:38 UTC, H. S. Teoh wrote:
> > ...
> > And D is the only language where having multiple compilers is said
> > to be a bad thing.  C, for example, probably has hundreds of
> > different compilers, yet we never hear anyone complain about why C
> > is bad because it has so many compilers.  Or C++, for that matter.
> > Yet for D this is somehow one of the biggest nails in its supposed
> > coffin.  Tellingly enough, back in the days when dmd was the only
> > compiler, people were singing lamentations on why having only one
> > compiler was bad.  And now this.  Let the reader draw his own
> > conclusions. ;-)
> > ...
> 
> The problem with 'multiple compilers' is the concept of
> 'implementation defined' behaviours.

Portable coding practices dictate that code should not rely on
implementation-defined behaviours.  If you write code that does, be
prepared for it to be non-portable.


> With a single compiler, implementation defined behaviours, which would
> of course exist, are contained to the 'one' compiler, which has
> obvious benefits for developers, and their customers (and C# is a
> prime example), as well as the compiler developers and the language
> designers.

The disadvantage is that the code will rely on said
implementation-defined behaviours, resulting in vendor lock-in.


> Of course people like Stroustrup strongly support and argue for the
> idea of multiple compilers, but his views/arguments really reflect the
> legacy of C and C++. I don't know that they are relevant to the future
> ;-)
> 
> I don't argue against multiple compilers per se. I argue against
> compilers having 'different' definitions of behaviours of the same
> language.

The language spec (ideally) dictates mandatory behaviours, and indicates
which construct(s) may have implementation-defined behaviours. If you're
writing code intended to target multiple runtime environments, you avoid
using these constructs.
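
A concrete D instance of that principle (a minimal sketch): the one notable implementation-defined width in D is `real`, which the spec defines as the largest hardware floating-point type (80-bit x87 on x86, plain 64-bit on most other targets). Code that must behave identically across environments therefore sticks to `float`/`double`, whose sizes the spec fixes:

```d
void main()
{
    // Fixed by the D spec on every implementation:
    static assert(float.sizeof == 4);
    static assert(double.sizeof == 8);

    // Implementation-defined: 80-bit x87 with 10-16 bytes of storage on
    // x86, plain 64-bit elsewhere. Portable code avoids relying on the
    // extra precision.
    pragma(msg, "real.sizeof on this target: ", real.sizeof);
}
```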


> I would like to understand whether this is also a problem with the D
> programming language (I don't know that it is, but I'd like to know).
> If it is a problem, then (to keep in context with the subject of this
> thread), perhaps it is a reason why D is unpopular, given the problems
> it has created in the C/C++ world of programming.

Your second sentence doesn't follow from the first, logically.  What have
implementation-defined behaviours got to do with popularity?  When one
talks about popularity, the first thing that comes to mind is the
question "is language X used by my best friend and my friend's best
friend? If not, it sux. If yes, it must be cool, I'll use it."  It has
little, if anything, to do with technical issues like
implementation-defined behaviours.  Nobody is gonna go "OMG Java has no
implementation-defined behaviours, it must be the best language in the
world!!!!111ROTFLMAOBBQ".  It's not even a consideration as far as
popularity is concerned.

But that aside, D has a lot of facilities built in to help you make your
code maximally portable.

- Fixed sizes for integral types are a prime example of where D shines
  over C/C++. In C, `int` can mean anything from a 16-bit value to a
  64-bit value (according to the spec anyway), and `char` is at least 8
  bits wide but could be wider. This leads to all sorts
  of silliness like stdint.h that has to spell out
  implementation-defined typedefs just so people can write int16_t to be
  100% sure they are the expected size. In D, `short` is ALWAYS 16 bits,
  so you write `short` once, and you get a 16-bit value, and never have
  to worry about "what if on some weird platform `short` is actually 8
  bits?".  And don't get me started on printf %d specs, which may have
  to be written %ld or %lld depending on whether your implementation
  makes `long` 32 or 64 bits wide. In D, `long` is ALWAYS 64 bits.
  End of story. No need for stdint.h, no need for insanity like
  `printf("%"PRIu64"\n", var);` just to get the right print format.
  Heck, std.format.format lets you write "%d" for ALL integral types.
  Nonsense like %ld and %lld doesn't even need to exist in D.

- Then you have std.bitmanip.nativeToBigEndian, et al, to help you deal
  with byte order issues. It deliberately returns ubyte[n] instead of
  native integer types, so that you never accidentally swap byte order
  more than once in your conversions.

- Then there are modules like std.process that abstract away much of the
  OS-dependent details of spawning a subprocess, etc.: details that
  you'd have to grapple with manually in C/C++.
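
The first two points above can be condensed into a short, self-contained D sketch (std.process is left out for brevity; only the std.format and std.bitmanip symbols named above are used):

```d
import std.bitmanip : nativeToBigEndian;
import std.format : format;

void main()
{
    // Integral widths are fixed by the language, not the implementation.
    static assert(short.sizeof == 2 && int.sizeof == 4 && long.sizeof == 8);

    // One format spec covers every integral width: no %ld, %lld, or PRIu64.
    assert(format("%d", short(7)) == "7");
    assert(format("%d", 7L) == "7");

    // nativeToBigEndian returns ubyte[4], a distinct type from uint, so
    // the result cannot accidentally be byte-swapped a second time.
    ubyte[4] wire = nativeToBigEndian(0x01020304u);
    assert(wire[0] == 0x01 && wire[3] == 0x04);
}
```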

These are just random items in no particular order that came to mind. D
is quite far ahead of C/C++ in terms of reducing implementation-defined
behaviour.  I think it's unfair to just lump it with C/C++ in terms of
"problems caused by implementation-defined behaviour".  To be frank, NO
language is 100% free of implementation-defined behaviours.  (Yes, even
Java.)  The only ones that are, are academic toys that have no real
world applications.  As long as you're interacting with the real world
in some way, there WILL be implementation-defined behaviours.

But all that is beside the point.  I've had my projects compiled with
all 3 D compilers without any problems or weirdnesses caused by
differences in implementation-defined behaviours.  In fact, I freely
switch between DMD and LDC when writing D code in the same project --
DMD for the compilation speed, LDC for the runtime speed.  I've never
run into a case where testing something with DMD exhibited different
behaviour from the final executable produced by LDC.  I'm pretty sure
there *are* some cases where you might run into some differences -- but
so far, I haven't encountered any.  And I do write a fair amount of
D code.  So this whole hangup about implementation-defined behaviours is
IMO unduly exaggerated.

And more to the original point: back in the day when D had only one
compiler, the naysayers said that this was a sign that the D ecosystem
was small and immature; that's why it only had one compiler.  Now that
we have 3+ compilers (soon to be 4 once SDC gets more release-ready),
the naysayers say that this is the reason D is small and unpopular.

I have just one word to describe this: strawman.


T

-- 
Creativity is not an excuse for sloppiness.

