Quora: Why hasn't D started to replace C++?

Laeeth Isharc laeeth at nospamlaeeth.com
Thu Feb 8 03:54:22 UTC 2018


On Thursday, 8 February 2018 at 00:09:47 UTC, Ali wrote:
> On Tuesday, 30 January 2018 at 20:45:44 UTC, Andrei 
> Alexandrescu wrote:
>> https://www.quora.com/Why-hasnt-D-started-to-replace-C++
>>
>> Andrei
>
> My modest opinion about this:
>
> D is currently a small player that has an attractive 
> proposition for some:
>
> * it is like C++ (that is, close to the power of C++) but simpler
> * it has some unique advanced features related to 
> metaprogramming
>
> D's best hope is to become a bigger player; it is unrealistic to 
> replace C++.
> Any improvement made to D will help make it a bigger player
>
> And while some D features can be better advertised 
> (like Design by Contract: DbC is a big deal, and few other 
> languages are known to have it),
>
> I think D needs to constantly add features; the idea that D is 
> big enough, or that it needs more idioms rather than features, 
> is in my opinion wrong.
>
> D needs to constantly add features, because all of its 
> competitors are constantly adding features.
>
> As D adds features, it may have a breakthrough at some point.

Maybe features help, but there's also just a natural process of 
maturation that we don't notice because it happens slowly.

In 2014, when I started using D, it wasn't unusual for the 
compilers to segfault.  And since I didn't know D, or even modern 
compilers and their flags (beyond a bit of work in Visual Studio 
in the late 90s, I mostly learnt to program C on 8-bit CP/M, 
which was a bit different then), it was quite a confusing 
experience.  I couldn't have recommended D to somebody else then.

The documentation also was not very accessible to people who 
didn't think formally and were used to Python material.  I tried 
working with one chap, a trader who knew a bit of Python, and he 
was absolutely terrified of the D documentation.

The situation there is also rather better today.

Then again, how can I trust the compiler?  It means something 
that Liran at Weka said they haven't had any data corruption 
bugs, because data at their scale tends to find problems and 
bring them to the fore.

From what I have seen, big changes, much more than is generally 
appreciated, are often not the consequence of one big causal 
factor, but of lots of little things aligning and coming together.

However, if you want a big thing, just consider the growth in 
data set sizes and storage capacity and relate that to trends in 
processor power and memory speed.

Guido says Python is fast enough because you are bottlenecked on 
I/O and network.  But in my office I have a test InfiniBand 
network where the 40 Gbps switch cost about 600 quid (and that's 
old technology now).  NVMe drives do pretty decent throughput.  
JSON parsing is not the cleverest thing one can do with data, but 
how does its speed compare with the throughput from just a couple 
of NVMe drives?
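
As a rough, hypothetical illustration (the record count and field 
layout below are invented just for the example), you can time a 
single-threaded std.json parse of a synthetic document and hold 
the resulting MiB/s figure up against the couple of GB/s that a 
single NVMe drive can sustain:

// Rough, hypothetical benchmark: parse a synthetic JSON document with
// std.json and report MiB/s, to set against multi-GB/s NVMe read speeds.
import std.array : appender;
import std.datetime.stopwatch : AutoStart, StopWatch;
import std.format : formattedWrite;
import std.json : parseJSON;
import std.stdio : writefln;

void main()
{
    // Build a synthetic array of 200_000 small records (the shape is
    // assumed, purely for illustration).
    auto buf = appender!string();
    buf.put("[");
    foreach (i; 0 .. 200_000)
    {
        if (i)
            buf.put(",");
        buf.formattedWrite(`{"id":%s,"price":%s,"sym":"ABC%s"}`,
                           i, i * 0.01, i % 100);
    }
    buf.put("]");
    const doc = buf.data;

    // Time only the parse, then report throughput.
    auto sw = StopWatch(AutoStart.yes);
    const parsed = parseJSON(doc);
    sw.stop();

    const seconds = sw.peek.total!"usecs" / 1_000_000.0;
    const mib = doc.length / (1024.0 * 1024.0);
    writefln("parsed %s records, %.1f MiB in %.3f s -> %.0f MiB/s",
             parsed.array.length, mib, seconds, mib / seconds);
}

If the parser comes out at a few hundred MiB/s, a couple of NVMe 
drives can already feed data faster than one core can parse it, 
which is exactly the bottleneck moving to the CPU.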

And according to the ACM, a fundamental assumption that has held 
true since the dawn of computing is in the process of being 
overturned.  With storage class memory and newer network 
technology (there's a road map to get to 1 Tbps), the bottleneck 
from here isn't storage or network; it's the CPU.

I guess that won't hurt the adoption of D.  Not tomorrow, but 
slowly over time.



