Blocking points for further D adoption

Artem Tarasov via Digitalmars-d digitalmars-d at puremagic.com
Sat Jun 4 06:18:02 PDT 2016


The largest blocking point for me is the community attitude. D constantly
wants to 'rule them all' instead of integrating with other language
ecosystems. This has only recently started to change, and so far only towards
C/C++, not towards dynamic languages. PyD is barely alive, and nobody seems
interested in taking it to the next level, which is making it easy to
distribute the resulting packages.

I'm speaking here from a researcher's perspective. One must realize that in
our universe there is often no time to learn yet another language, so people
consolidate around Python so that everyone stays productive. That situation
will not change until someone rolls out a complete replacement for numpy,
scipy, pandas, and scikit-learn at the very least, and that won't happen any
time soon. A fancy custom Jupyter kernel is nice but often half-baked and not
really necessary. Solving the distribution of shared libraries, however, is a
must if you (still) want to become a C++ replacement.

To me it seems that D currently has a unique advantage: it can easily
generate at compile time all the boilerplate binding code that everybody
hates to write in C++ (or, with boost::python, hates to wait to compile).
Combine that with the fact that many are terrified of C/C++, so much so that
Cython was invented, and D offers a much nicer language with a GC for those
who don't want to know anything about memory management. Research people
would love this, but only if it's a production-ready solution that requires
no extra time investment.
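
For illustration, here is a minimal sketch of that kind of compile-time
boilerplate generation in plain D. This is not PyD's actual API: the
function names, the c_ prefix, and makeWrappers are made up for the example,
and a real Python binding would additionally have to go through the CPython
C API rather than plain extern(C).

module sketch;

import std.stdio : writeln;
import std.traits : Parameters, ReturnType;

double mean(double[] xs)
{
    double s = 0;
    foreach (x; xs) s += x;
    return xs.length ? s / xs.length : double.nan;
}

int add(int a, int b) { return a + b; }

// Build a string of extern(C) wrapper functions at compile time;
// mixing it in compiles them as if they had been written by hand.
string makeWrappers(names...)()
{
    string code;
    static foreach (name; names)
        code ~= "extern(C) ReturnType!" ~ name ~ " c_" ~ name
              ~ "(Parameters!" ~ name ~ " args) { return " ~ name ~ "(args); }\n";
    return code;
}

mixin(makeWrappers!("mean", "add")());

void main()
{
    // The generated wrappers forward straight to the originals.
    writeln(c_add(2, 3));              // 5
    writeln(c_mean([1.0, 2.0, 3.0])); // 2
}

The point is that the metaprogramming needed for this is ordinary D,
evaluated by the compiler, whereas boost::python pays for the equivalent
machinery in C++ template-instantiation time.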

