What is the right level of abstractions for D?

Laeeth Isharc via Digitalmars-d digitalmars-d at puremagic.com
Sun Oct 30 16:25:03 PDT 2016


Those categories - I am not sure how well they fit.

When I learnt to program, C was considered a high level language, 
  and now Swift is considered low level.  The world has changed a 
little, but that isn't my main point.

To grow in a healthy way, D doesn't need to think in terms of 
dominating the world in its totality in a single bound. All that 
is necessary is to appeal to people around the world in different 
lines of work who are unhappy with what they have now and are 
searching for something better, or who know about D but face some 
kind of barrier to adopting it (barriers are not impossible to 
overcome).

It's a big world, and the work of Polanyi and Hayek should remind 
us that it's very difficult to know where users will come from, 
because that requires a knowledge of time and place that we don't 
have. But at D's current size in relation to the total pool of 
programmers there is plenty of room to grow, and Walter's point 
about listening to your current customers who wish to become 
bigger ones is a good one.

Implicit in what you wrote is the idea that low level programmers 
are the ones with real ability, and that people who write in 
Python might be productive, but are of a different level of 
overall ability.

Once, that kind of high level / low level mapping to ability 
might have made sense, but if it's still useful at all, I think 
it's much less applicable now. There are plenty of mediocre 
embedded device programmers, and plenty of people who used to 
write in C and now write in Python. (I am sure ESR still writes 
in C, but he wrote about his love for Python some time back.)

And to call Python a scripting language is misleading terminology 
- conventional, certainly, but misleading all the same - for 
example, AHL, the large quant money management firm, wrote their 
risk management systems entirely in Python. You may be an 
enthusiast of Fischer Black's Noise paper and think that people 
in money management are fooling themselves, and I am sympathetic 
to some of that, but my impression is that technically this firm 
is decent.

And everything gets more mixed up when you can compile a Ruby 
dialect and have it appear at the top of the performance tables. 
It was a scripting language before, and now it's not? (It's 
control and level of abstraction rather than performance that 
distinguish the level of a language, but in the past these things 
went together.)

It seems to me you are reifying your class structure of languages 
when I am not sure it is a good representation of how things are.

The reason scripting applications don't use D, if it's not 
already used for other things, is libraries and polish. D has 
great Python interoperability and also a nice package manager. 
Well, try compiling a program as a Python library depending on 
vibe.d using dub. It won't go very well, because -fPIC isn't 
propagated to the dependency builds and you might need to compile 
in a little C code for the constructors. And what needs to be 
done isn't completely documented anywhere. So at this point it's 
a less appealing proposition, because for the people that 
currently use D these things don't bother them as much, and 
because it's still a small community with a lot of work to do. 
John Colvin is working on it, and maybe it will be fixed soon - 
because it did bother me.
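
To give a flavour of what's involved - a rough sketch from 
memory, with an invented package name and a placeholder version, 
not a complete recipe - the starting point is something like this 
in dub.sdl, and even then you still have to get -fPIC into the 
dependency builds yourself:

    // mypymodule/dub.sdl - build the D code as a shared library
    // that Python can load
    name "mypymodule"
    targetType "dynamicLibrary"
    // needed for a shared object, but at the moment not
    // propagated automatically to dependencies such as vibe-d
    dflags "-fPIC"
    // version constraint here is just a placeholder
    dependency "vibe-d" version="~>0.7.29"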

But this isn't a consequence of the Platonic essence of D as a 
language.   It's merely contingent on the particular stage of 
development and it couldn't have been otherwise because you have 
to get the foundation right before putting sugar on top.

The experience of social learning is that every person who takes 
a course somewhat mysteriously makes it easier for those who come 
after. It's not entirely mysterious, because the signposts and 
guides get better too. D is not an easy language to learn, but 
even today it's far from exceptionally difficult, and it will get 
easier with time.

If you want a C and C++ ABI,  want to have control over memory 
and to go down to a low level if you need it,  but are someone in 
a position where you need to get stuff done,  and don't think 
modern C++ is the answer,  what choices do you have?  Not all 
that many.
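
And the C and C++ ABI part is concrete, not hand-waving. A 
minimal sketch of what it looks like from the D side (the C++ 
function here is invented for illustration):

    // D side: declare a C++ free function and call it directly,
    // with no wrapper layer in between. Link against the object
    // file or library that defines it and the call goes straight
    // through the C++ ABI.
    extern(C++) double priceOption(double spot, double vol);

    void main()
    {
        import std.stdio : writeln;
        writeln(priceOption(100.0, 0.2));
    }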

And I have thought for a while that people were recklessly 
squandering performance and memory usage. That's not good 
craftsmanship - it might be necessary to make compromises in a 
practical situation, but it's hardly something to be proud of, as 
people almost seem to be. What Knuth actually said in his paper 
and speech is not what people who quote him out of context take 
it to mean, and in any case the everyday experience of using 
applications written by people who follow this philosophy 
suggests there must be something wrong with it.

A language doesn't really take off because it starts doing 
something it didn't do before and all of a sudden everything is a 
success. It takes off when you have the seeds of something and 
external conditions change to the point where the environment 
suddenly becomes much more favourable for adoption.

Herb Sutter wrote a long time back about the end of the free 
lunch from Moore's Law. And it's already visible today - somehow 
one ends up being CPU and memory bound more than Guido would have 
us believe (quite often I am not sure that Python is fast 
enough). It's why I adopted D, and why another chap from hedge 
fund land who has spoken at DConf did too. He said to me this 
year that things had a bit of the feel of the Python scene in the 
90s, before it really took off.

Dataset sizes keep growing. And I don't know, but it seems to me 
that the ACM paper I posted earlier about the consequences of new 
fast non-volatile storage technology is right - storage won't be 
the bottleneck for much longer, and increasingly the CPU will be. 
Intel described the technology as the most significant 
development since the Internet. I am not sure that is merely 
hyperbole.

When you see a change in relative price of the sort they are 
talking about, and it's something so basic, then everything 
changes. Not overnight, but over the course of the next five to 
ten years.

I can't imagine this shift will be bad for D, and if that's the 
case then the pattern of people drawn to explore D at the margin 
won't necessarily fit your categories. I should think the 
determinants will be more things like: how much work and data you 
have in relation to easily available processing power (yes, the 
cloud is cheap, but there are institutional and contractual 
barriers to moving data outside the firm, and these change 
slowly); how open you are to new technologies; whether you have 
the authority to try it yourself without persuading a committee; 
whether you and your people can learn something new without 
extensive training courses; whether you are addicted to depending 
on support; whether you can already program in C. And so on.

So look at recent adopters - that probably gives you a little 
idea about what's coming.

It seems to me the recent low level improvements might be seen as 
just a general process of maturation that shows up now in lower 
level stuff, previously in documentation improvements, and 
tomorrow in something else. It's not as though there is any 
effective central direction over what people work on.

A push for C++ interoperability is a good thing,  but 
strategically what has that to do with being low level?   It 
makes it easier to use C++ libraries,  which is also good for 
scripting...


Friday, 28 October 2016 at 08:46:05 UTC, Joakim wrote:
> On Thursday, 27 October 2016 at 17:03:09 UTC, Nick Sabalausky 
> wrote:
>> On 10/27/2016 02:22 AM, Joakim wrote:
>>>
>>> 1. low-level compiled languages like C++, D, Rust, and Swift, 
>>> meant for
>>> performance and usually experts who want to squeeze it out
>>>
>>> 2. mid-level bytecode languages like Java and C#, meant for 
>>> the vast
>>> middle of day-to-day programmers to crank out libraries and 
>>> apps that
>>> perform reasonably well
>>>
>>> 3. high-level "scripting" languages like Ruby and Python, 
>>> meant for
>>> those who don't care too much for performance but just want 
>>> to get
>>> working code
>>>
>>> I think D is positioned somewhere between 1 and 2, though 
>>> closer to 1.
>>> However, there is sometimes talk of using D for all three, 
>>> though
>>> perhaps that is only meant as an added benefit for people 
>>> already using
>>> it for 1 or 2, ie those who already know the language better.
>>>
>>
>> You're falling into the common fallacy that those groups are 
>> mutually exclusive and that a single language can't be 
>> appropriate for more than one. D is all about proving that 
>> wrong, and is meant for, and good at, all three.
>
> There are good reasons for this split, and yes, it is probably 
> impossible for one language to attract all three groups. You 
> could use languages from 1 and 2 for all three, with a bit more 
> work, but I don't see many scripts written in C++. :)
>
> The reason for the split is that there are different levels of 
> software expertise and performance needs, and each of those 
> groups is geared for a different level.  Show templates to a 
> scripting user and they probably run away screaming.  D can 
> probably do well with groups 1 and 2, but the level of power 
> and expertise that is needed for those lower levels will scare 
> away people from 3.  Those already using it for 1 and 2 may 
> also be comfortable with reusing D for scripting, but that's 
> not attracting people from group 3, ie those who only want 
> something easy to use and don't want to know the difference 
> between a static and dynamic array.
>
>> I've noticed that, for many of the people who don't "get" D, 
>> the problem they're hitting is that their minds are so 
>> twisted around by the "polyglot" culture, that they're looking 
>> for "the one" tiny little niche that D is for, not seeing 
>> that, and thus missing the whole entire point.
>
> Yes, this has definitely hurt D, being stuck between 1 and 2.  
> People from 2 probably look at D and think it's too low-level.  
> People from 1 are always looking to squeeze out _more_ 
> performance, so the GC is a way for them to just write off D.  
> You and I think they're making a mistake, but maybe they're not 
> wrong for their own uses.
>
> As I said, the recent push for @nogc and C++ compatibility 
> suggests that a renewed effort is being made to focus on group 
> 1, particularly when combined with the D benchmarks for regex 
> and recently math.  I'm happy in that space between 1 and 2, 
> and the recent push to move languages from 2 to AoT compilation 
> suggests that is a good place to be.  So maybe group 2 will 
> also come to us. :)



