Just sayin', fwiw...

Nick Sabalausky (Abscissa) SeeWebsiteToContactMe at semitwist.com
Thu May 16 03:11:15 UTC 2019


Not to beat anyone over the head or push for any change (I know not to 
do that 'round these parts these days ;) )...but I've noticed a very 
frequent pattern in my own usage of D:

1. I write a bunch of code.
2. I take advantage of ranges and std.algorithm and think I'm being all 
hip and cool and awesome.
3. I hit compile.
4. Something in std.algorithm fails to find a matching overload.
5. I double check everything. Everything seems a-ok kosher.
6. Maybe I figure, "well, I haven't bothered to update my default D 
compiler in a while, maybe it's a bug or a hole in phobos's function 
signature that's been fixed", so to be safe I do a "dvm list | sort" and 
a "dvm use [newest]", maybe download & install a new version too.
7. After dub makes me wait a minute or two to recompile all of vibe.d, I 
get the same error.
8. I scratch my head wondering WTF I did wrong.
9. [If I'm lucky] I remember, "Oh, wait, I'm doing string processing...I 
need a bySomethingOrOther to kill autodecode"
10. I toss in ".byCodeUnit", it works, I go about the rest of my day.
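For the record, here's a minimal sketch of the sort of thing that bites me (using sort as the example, since it needs a random-access range — which an autodecoded string isn't):

```d
import std.algorithm : sort;
import std.utf : byCodeUnit;

void main()
{
    char[] s = "hello".dup;

    // sort(s);           // FAILS to compile: autodecoding makes char[]
                          // a bidirectional range of dchar, not random-access

    sort(s.byCodeUnit);   // works: byCodeUnit is a random-access range
                          // over the raw chars, no decoding
    assert(s == "ehllo");
}
```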

And I'm a D veteran. I've been using D since before v1.0 (yes, pre-D1). 
I can't even imagine how painful this would be for a D newcomer.

To be clear...not to badger, not to request, not to hope for D to change 
for the better on any matter that isn't earth-shatteringly large (I know 
better by now), I'm just sayin' for the sake o' sayin': It really 
*would* be nice to be able to (even optionally) get SOME kind of clear 
notice anytime I've failed to explicitly CHOOSE between decode or no-decode.

I'm tempted to just always "alias String = immutable(byte)[]" and 
pedantically convert to that whenever and wherever possible. (Welcome to 
C++-land's "my language sucks so individual projects have to re-invent 
their own string types." Yeee-hah!)
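Something like this, I mean (a rough sketch — I use ubyte here rather than byte, since UTF-8 code units are unsigned, but the idea's the same; ubyte ranges sidestep autodecoding entirely):

```d
import std.algorithm : count;

// Hypothetical project-local string type: raw code units, no autodecoding.
alias String = immutable(ubyte)[];

void main()
{
    string s = "héllo";               // 'é' is 2 UTF-8 code units
    assert(s.count == 5);             // autodecoded: counts code POINTS
    assert((cast(String) s).count == 6); // raw bytes: counts code UNITS
}
```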

That is all.

