Is it time for D 3.0?

NaN divide at by.zero
Sun Mar 29 09:24:50 UTC 2020


On Sunday, 29 March 2020 at 02:09:37 UTC, krzaq wrote:
> On Sunday, 29 March 2020 at 01:21:25 UTC, NaN wrote:
>> Firstly either way you have to remember something, u16 or 
>> short. So there's memorization whatever way you slice it.
>
> But you don't have to remember anything other than what you 
> want to use. When you want a 16 bit unsigned integer you don't 
> have to mentally lookup the type you want, because you already 
> spelled it. And if you see a function accepting a long you 
> don't have to think "Is this C? If so, is this Windows or 
> not(==is this LP64)? Or maybe it's D? But what was the size of 
> a long in D? oh, 64"

If you're sitting there thinking "wait, what language am I 
using?" you have bigger problems. I've used maybe 10 different 
languages over 30 years, and it has never been a problem for me 
to remember which language I'm using or what its basic types are.
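
For the record, in D those sizes aren't something you have to 
look up per platform; the basic types are fixed everywhere, and 
you can even check that at compile time. (Just a quick sketch of 
my own to illustrate:)

// D guarantees these sizes on every platform
static assert(short.sizeof == 2); // always 16 bits
static assert(int.sizeof == 4);   // always 32 bits
static assert(long.sizeof == 8);  // always 64 bits, unlike C's long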


>> Secondly those processors from the last millennium are still 
>> the dominant processors of this millennium.
>
> Are they really? I have more ARMs around me than I do x86's. 
> Anyway, they're compatible, but not the same. "double 
> precision" doesn't really mean much outside of hardcore number 
> crunching, and short is (almost?) never used as an optimization 
> on integer, but a limitation of its domain. And, at least for C 
> and C++, any style guide will tell you to use a type with a 
> meaningful name instead.

ARMs were outselling x86 by the end of the 90s (in units shipped, 
at least); nobody took much notice until the smartphone boom.


>>> Programming languages should aim to lower the cognitive load 
>>> of their programmers, not the opposite.
>>
>> I agree, but this is so irrelevant it's laughable.
>>
>
> It is very relevant. Expecting the programmer to remember that 
> some words mean completely different things than anywhere else 
> is not good, and the more of those differences you have, the 
> more difficult it is to use the language. And it's just not the 
> type names, learning that you have to use enum instead of 
> immutable or const for true constants was just as mind-boggling 
> to me as learning that inline means anything but inline in C++.

I'm 100% with you on the enum thing. I don't struggle to remember 
it, but it's awful. It's the language equivalent of a "leaky 
abstraction": from an implementation point of view enum members 
and manifest constants are pretty much the same thing, so why not 
use the same keyword? It's like saying a single int is really 
just an array with one member, so from now on you have to declare 
ints as arrays:

int[1] oh_really;
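
To spell out the reuse for anyone who hasn't hit it yet (a quick 
sketch of my own, nothing official):

enum maxUsers = 100;          // "manifest constant": no storage at all
immutable int maxItems = 50;  // what you'd expect to write: a real symbol
enum Color { red, green }     // the thing the keyword was named for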

My other pet hate is nothrow. It actually means "no exceptions", 
not that the function won't throw at all: it can still throw 
Errors.
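
Something like this compiles fine, as far as I know:

nothrow void oops()
{
    throw new Error("still throws");  // allowed: Error isn't an Exception
    // throw new Exception("nope");   // the only thing nothrow actually bans
}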

Oh yeah, and

assert(0)

I hate that too.
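
(For context: assert(0) is D's spelling of "unreachable", and 
unlike normal asserts it's kept as a halt in -release builds 
rather than compiled out. A rough sketch of the usual usage, my 
own example:)

int sign(int x)
{
    if (x > 0) return 1;
    if (x < 0) return -1;
    if (x == 0) return 0;
    assert(0); // "unreachable"; in -release this becomes a halt, not a no-op
}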


>>> To paraphrase your argument:
>>> A mile is 1760 yards
>>> A yard is 3 feet
>>> A foot is 12 inches
>>> What's so hard to understand? If that is causing you problems 
>>> then you probably need to reconsider your career path.
>>
>> If your job requires you to work in inches, feet and yards 
>> every single day then yes you should know that off the top of 
>> your head and you shouldn't even have to try.
>>
>> And if you find it difficult then yes you should reconsider 
>> your career path. If you struggle with basic arithmetic then 
>> you shouldn't really be looking at a career in engineering.
>
> That's circular reasoning. The whole argument is that your day 
> job shouldn't require rote memorization of silly incantations. 
> As for "basic arithmetic" - there is a reason why the whole 
> world, bar one country, moved to a sane unit system.

The reason was that the actual math was easier, not that it was 
hard to remember what a foot was. That doesn't apply here: we're 
only talking about names, not about whether the system makes 
actually working with the units easier.


