It is the year 2020: why should I use / learn D?

Jonathan M Davis newsgroup.d at jmdavisprog.com
Wed Nov 14 23:47:16 UTC 2018


On Wednesday, November 14, 2018 4:25:07 PM MST Walter Bright via 
Digitalmars-d wrote:
> On 11/14/2018 10:47 AM, Dukc wrote:
> > I doubt that the distance is shortening. While C++ does advance, and
> > D isn't moving as fast as it was in 2010 (I think), I still believe
> > that C++ isn't the faster evolver of the two. When the next C++
> > standard comes out, D will have improved too. Examples of what might
> > be there by then:
> C++ is adding lots of new features. But the trouble is, the old features
> remain, and people will still use them, and suffer.
>
> Examples:
>
> 1. The preprocessor remains. There has never been a concerted effort to
> find replacements for it, then deprecate it. It's like allowing
> horse-drawn carts on the road.
>
> 2. Strings are still 0-terminated. This is a performance problem, memory
> consumption problem, and is fundamentally memory unsafe.
>
> 3. Arrays still decay to pointers, losing all bounds information.
>
> 4. `char` is still optionally signed. What a lurking disaster that is.

All of those are definitely problems, and they're not going away - though
occasionally, the preprocessor does come in handy (as much as I agree that
on the whole it's better that it not be there).
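
For example, #3 bites even in completely mundane code - the moment an
array is passed to a function, its length is gone. A contrived C sketch
(mine, not Walter's):

    #include <stdio.h>

    // The parameter declaration looks like an array, but it decays to a
    // plain pointer: inside the function, sizeof(arr) is the size of a
    // pointer, not the size of the caller's 10-element array.
    void print_len(int arr[10])
    {
        printf("%zu\n", sizeof(arr) / sizeof(arr[0])); // 2 on a typical
                                                       // 64-bit system
    }

    int main(void)
    {
        int nums[10];
        printf("%zu\n", sizeof(nums) / sizeof(nums[0])); // 10
        print_len(nums);
        return 0;
    }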

> 5. What size is an `int`?

While I agree with the point that you're trying to make, that particular
type isn't really a problem on modern systems in my experience, since
it's always 32 bits there. Maybe it's a problem on ARM (thus far, I've
only seriously programmed on x86 and x86-64 machines), and it certainly
is if you have to deal with 16-bit machines, but for most applications
on modern systems at this point, int is 32 bits. It's long that shoots
you in the foot, because its size still varies from system to system,
and as such, I've always considered long to be bad practice in any C++
code base I've worked on.
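
As a quick illustration (the sizes in the comments assume the usual
data models, not anything guaranteed by the standard):

    #include <stdio.h>

    int main(void)
    {
        // On an LP64 system (e.g. 64-bit Linux), long is 8 bytes; on an
        // LLP64 system (64-bit Windows), it's 4. Code that assumes
        // either silently breaks when ported.
        printf("sizeof(int)  = %zu\n", sizeof(int));
        printf("sizeof(long) = %zu\n", sizeof(long));
        return 0;
    }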
On the better teams that I've worked on, int has been fine when the
size of a type really didn't matter, but otherwise, an integer type has
been one of the int*_t types. But even then, when you're dealing with
something like printf, you're screwed, because it doesn't understand
the types with fixed sizes. So, you're constantly fighting the language
and the libraries.
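
To be fair, C99's <inttypes.h> does provide printf format macros for
the fixed-size types, but having to spell those out everywhere is
exactly the kind of fighting I mean. Something like:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        int64_t count = 42;

        // printf has no plain conversion for int64_t: depending on the
        // platform it might be long or long long, so %ld or %lld is only
        // sometimes correct. You're stuck with the PRId64 macro instead.
        printf("count = %" PRId64 "\n", count);
        return 0;
    }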
D's approach of fixing the size of most integer and floating-point
types is vastly superior, and the problems that we do have there come
from the few places where we _didn't_ make them fixed - but since that
mostly relates to the size of the memory space, I'm not sure that we
really had much choice there. The main outlier is real, though most of
the controversy there seems to have to do with arguments about the
efficiency of the x87 stuff rather than with differences across systems
like you get with the integers.

- Jonathan M Davis




