It is the year 2020: why should I use / learn D?
H. S. Teoh
hsteoh at quickfur.ath.cx
Fri Nov 16 22:41:10 UTC 2018
On Thu, Nov 15, 2018 at 07:49:32PM -0700, Jonathan M Davis via Digitalmars-d wrote:
[...]
> Honestly, over time, I've become increasingly convinced that the more
> radical ideas would be incredibly undesirable (e.g. making const or
> immutable the default).
Actually, immutable by default would encourage better coding style, and
could also yield a small performance benefit by letting the optimizer
exploit the guarantee that the data never changes (it can elide certain
loads even in the face of aliasing, infer more loop invariants, etc.).
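A hedged sketch of the kind of thing this enables when you opt into
immutable today (the function and variable names are made up for
illustration):

```d
// With an immutable array, the compiler may assume that no store
// anywhere in the program can change the elements, so repeated loads
// can be cached even in the presence of aliasing.
int sumTwice(immutable(int)[] data)
{
    int total;
    foreach (x; data) total += x;  // first pass loads the elements
    foreach (x; data) total += x;  // loads here may be elided/reused:
    return total;                  // nothing can have mutated data
}

void main()
{
    immutable int[] data = [1, 2, 3];
    assert(sumTwice(data) == 12);
}
```

Immutable-by-default would simply make this guarantee the norm rather
than something you have to ask for explicitly.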
> shared has some rough edges that need to be sorted out, but I don't
> think that it's fundamentally broken as things stand. I think that the
> issue is more that it's misunderstood, and its proper use has not
> really been messaged well - with the related problem being that the
> core synchronization components in druntime have not been entirely
> properly updated to take shared into account like they should have
> been, mostly because no one wanted to mess with them, because they had
> the idea that shared was largely unfinished and might change
> drastically later. So, while shared's implementation needs some tweaks
> to be sure, I'm not the least bit convinced that it needs a serious
> overhaul on the language front so much as some work on the library
> front and an overhaul on the PR front.
Whether shared is fundamentally broken or (mostly) working as designed,
I can't say, but the documentation problem is very real: what little of
it I have read has scared me off from touching shared with a 10-foot
pole thus far. The fact that you have to cast may have been by design,
but it's a scary thing to the under-informed.
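For what it's worth, here is a minimal sketch of the cast-under-lock
idiom that the casting requirement is driving at; the names are
hypothetical, and the safety of the cast rests entirely on convention
(holding the lock), not on anything the compiler checks:

```d
shared int counter;

void increment(Object lock)
{
    synchronized (lock)
    {
        // While we hold the lock, no other thread may touch counter,
        // so it is safe *by convention* to strip shared and mutate:
        auto p = cast(int*) &counter;
        ++*p;
    }
}

void main()
{
    auto lock = new Object;
    increment(lock);
    assert(counter == 1);
}
```

It works, but nothing stops you from doing the same cast without
holding the lock, which is exactly why the under-documented state of
shared is so scary.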
> In any case, I think that _most_ of the things that should go in
> something like D3 can be done in D2 so long as Walter and Andrei can
> be convinced. For instance, we totally _could_ fix the nonsense about
> treating bool as an integer type in D2. There's nothing about that
> that requires D3.
Yes, but cutting the legacy tie with C integer promotion rules would
require D3.
> Unfortunately, of course, Walter and Andrei weren't convinced by the
> DIP that would effectively have fixed that by removing the implicit
> conversions from integer literals to bool, so that's not happening in
> D2 unless something drastic changes, and as such, I see no reason to
> expect that it would happen in D3.
I was very disappointed by the rejection, in fact. I suppose you have a
point that if we were to start over from a blank slate like D3, it
would probably still stay that way. But if I had any say in how D3 was
done, I would definitely treat bool as a non-numerical type, and
require a cast if, for whatever reason, you decide to treat it as a
number.
It makes intent so much clearer, and IMO leads to better code. It's
just like when pointers stopped implicitly converting to bool: I was
initially annoyed, but in retrospect appreciated the much-improved
readability of:
if ((p = func()) !is null) ...
over the terse but cryptic:
if ((p = func())) ...
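For the record, a short sketch of the bool-as-integer behavior being
objected to; all of this compiles under current rules, and the overload
example follows the one cited in the DIP discussion (check it against
your compiler version):

```d
void f(bool x) {}
void f(long x) {}

void main()
{
    bool b = 1;             // integer literal implicitly converts to bool
    int n = true;           // and bool converts right back to int
    int sum = true + true;  // arithmetic on bools: sum == 2
    assert(b && n == 1 && sum == 2);

    f(1);  // famously resolves to f(bool), not f(long), because
           // 1 fits in bool's value range
}
```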
[...]
> In a large program, it can very much be worth going to the extra
> effort of making your program work with a lot of extra attributes, but
> they often just get in the way, and forcing them on all programs by
> default would easily risk making D miserable to work in by default.
Attribute inference is the way to go. *Nobody* wants to waste time
manually annotating everything. Boilerplate is evil. In D2 Walter
couldn't pull off across-the-board attribute inference, mainly because
of backward compatibility, among other issues. In D3 we could
potentially build this in by default from the beginning, and it would
save so many headaches.
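Templates already show what this would look like, since the compiler
infers attributes from their bodies today; only non-template functions
miss out (the function names below are invented for illustration):

```d
// Template: @safe, pure, nothrow, and @nogc are all inferred
// automatically from the body -- zero boilerplate.
auto twice(T)(T x) { return x + x; }

// Non-template function: every attribute is hand-written boilerplate
// that must be kept in sync with the body.
int twiceInt(int x) @safe pure nothrow @nogc { return x + x; }

void caller() @safe pure nothrow @nogc
{
    assert(twice(21) == 42);     // callable thanks to inference
    assert(twiceInt(21) == 42);  // callable thanks to manual annotation
}
```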
> LOL. Walter's comment at dconf this year that he wished D had const as
> the default definitely makes me that much more leery of D3 ever
> arriving, since I increasingly avoid const in D.
So you're essentially going back to D1? ;-)
> Honestly, the only thing I can think of where I'd love the opportunity
> to be able to sit down and start from scratch would be ranges - and
> not just for auto-decoding. I'd want to rework them so that save
> wasn't a thing, and I'd want to figure out how to rework them so that
> the reference and value semantics were cleaner.
Yeah, I think Andrei himself also expressed the wish that ranges could
be defined such that input ranges by definition have by-reference
semantics, whereas forward ranges have by-value semantics, and would
save their current position just by being copied into another variable.
IIRC, the reason .save was introduced in the first place was because at
the time, the language was still lacking some features that made it easy
to determine whether something had by-value or by-reference semantics.
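To illustrate the wart: for a struct-based range, a plain copy already
snapshots the position, yet generic code still has to call .save
because it cannot assume by-value semantics:

```d
import std.range : iota;

void main()
{
    auto r = iota(0, 5);   // a struct (by-value) forward range
    auto copy = r;         // the copy is already a snapshot...
    r.popFront();
    assert(copy.front == 0 && r.front == 1);

    auto s = r.save;       // ...but generic code must use .save,
    r.popFront();          // since a class-backed range would have
    assert(s.front == 1 && r.front == 2);  // copied only a reference
}
```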
[...]
> In general, the kind of changes that I can think of that I'd like to
> see are things that can definitely be done in D2 - assuming that
> Walter and Andrei can be convinced, which is rarely easy, and the
> question of D2 vs D3 probably wouldn't change that much (some, since
> backwards compatibility would be less of an issue, but that doesn't
> mean that they'd then be easy to convince of major changes in
> general).
[...]
Seeing as the default answer to large-scale language changes these days
seems to be "no, because it would break too much code", I think D3
would be an opportunity to make large-scale changes that we otherwise
wouldn't dare make in D2.
One of the warts I'd like to see go the way of the dodo is the whole
fiasco with Object.toString, Object.opEquals, etc., specifically how
they interact with attributes on class methods, along with the implicit
monitor field that every class object carries even when it's not
needed. Andrei had proposed ProtoObject as a D2 solution to this
impasse, but I've yet to see any further progress since the initial
discussion.
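A small sketch of the attribute side of that fiasco (behavior as of the
compilers around the time of this discussion; the class here is
invented for illustration):

```d
class Point
{
    int x, y;
    this(int x, int y) @safe { this.x = x; this.y = y; }

    // An override is allowed to *add* attributes...
    override bool opEquals(Object o) @safe
    {
        auto p = cast(Point) o;  // dynamic downcast is @safe
        return p !is null && x == p.x && y == p.y;
    }
}

void compare(Point a, Point b) @safe
{
    // ...but == on class references dispatches through
    // Object.opEquals, whose un-attributed signature drags the
    // whole comparison down to @system:
    // bool eq = a == b;  // rejected in @safe code on those compilers
}
```

ProtoObject's pitch was precisely a root class with no such
un-attributed methods and no monitor field.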
T
--
Knowledge is that area of ignorance that we arrange and classify. -- Ambrose Bierce