Regarding the proposed Binary Literals Deprecation

Don Allen donaldcallen at gmail.com
Sat Sep 10 19:05:38 UTC 2022


On Saturday, 10 September 2022 at 02:17:30 UTC, Walter Bright 
wrote:
> On 9/9/2022 4:43 PM, Adam D Ruppe wrote:
>>> If you're using a lot of octal literals such that this is an 
>>> issue, one wonders, what for? The only use I know of is for 
>>> Unix file permissions.
>> 
>> I keep hitting them in random C code I'm translating. Various 
>> Unix things beyond file permissions, and a hardware manual for 
>> a thing I had to drive (an RFID chip), used them for various 
>> bit triplets too.
>
> octal!433 is really not much different from 0433. It could even 
> be shortened to o!433, exactly the same number of characters as 
> 0o433.
>
> The reasons for adding language syntactic sugar:
>
> 1. it's very commonplace
>
> 2. the workarounds are gross
>
> Of course it's a judgement call, and I understand you see them 
> randomly in C code, but does it really pay off? The downside is 
> the language gets bigger and more complex, the spec gets 
> longer, and people who don't come from a C background wonder 
> why their 093 integer isn't 93.
>
> > the newer imported!"std.conv".octal!433 pattern
>
> Nobody would ever write that unless they used octal exactly 
> once, which suggests that octal literals aren't common enough 
> to justify special syntax.
>
>
>> I often prefer using binary literals anyway, but changing 
>> something like 0o50000 to binary is a little obnoxious.
>
> I first implemented binary literals in the 1980s, thinking they 
> were cool and useful. They were neither. I haven't found a 
> reasonable use for them, or ever wanted them. (I prefer writing 
> them in hex notation, as binary literals take up way too much 
> horizontal space. After all, C3 is a lot easier than 11000011. 
> The latter makes my eyes bleed a bit, too.)
>
> Let's simplify D.
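
For concreteness, the constructs being debated look roughly like 
this in practice. This is a minimal sketch: octal is the Phobos 
template from std.conv that Walter and Adam refer to above, and 
the values are illustrative only.

    import std.conv : octal;

    void main()
    {
        // Library workaround suggested in place of C-style 0433
        // (D rejects 0-prefixed octal literals outright):
        static assert(octal!433 == 283);    // 4*64 + 3*8 + 3
        auto perms = octal!644;             // Unix rw-r--r--, 420 decimal

        // The binary literal whose deprecation is proposed,
        // next to the hex spelling Walter prefers:
        enum flags = 0b1100_0011;
        static assert(flags == 0xC3);
    }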

I couldn't agree more with Walter. I've made it clear that I've 
done some very successful work with D and have been very pleased 
with the outcome. But this work involved porting C code I wrote 
10 years ago that had become ugly (or maybe it always was) and 
difficult to maintain. The D version is a big improvement.

But if I were starting with an empty editor buffer, would I 
choose D? Especially to write a garden-variety application rather 
than bashing hardware registers? Perhaps not. Some of that would 
be simply that a higher-level language would be more suitable, 
e.g., Haskell or Scheme, both personal favorites. But some of 
that would be due to the hangover from C and C++ that D still 
exhibits. My opinion: C was a bad language in 1970 and it is 
horrifying today. C++? Words fail me, unless they are 
scatological. I think the more D can detach itself from its C 
heritage and emphasize modern programming language practice (in 
other words, take advantage of what we have learned in the last 
52 years), the better.



