Annoyance with new integer promotion deprecations
H. S. Teoh
hsteoh at quickfur.ath.cx
Wed Feb 7 00:24:26 UTC 2018
On Tue, Feb 06, 2018 at 10:38:36PM +0000, Luís Marques via Digitalmars-d wrote:
[...]
> Yeah, it's annoying. For my MSP430 work (16-bit, lots of shorts and
> bytes) I created a generic type which works around this, so you would
> do:
>
> byte c = a.nx + b;
>
> where .nx means "non-extending" and converts/wraps the
> (u)byte/(u)short in my special type. The arithmetic operations are
> infectious, so you only need to apply it to one of the operands (and
> you can preserve it across statements by using "auto" instead of
> "byte"). Because D doesn't have automatic type conversions for the
> function calls, the function signatures still take the standard types,
> and you add .nx where needed in the body. Because the higher-order
> bits are discarded after each operation, the optimizer easily
> optimizes away the multi-word operations (at least in all of the cases
> I bothered checking...).
[...]
> Unfortunately, I'm not holding my breath for something like this. At
> least D is flexible enough that my .nx solution works! That's saying a
> lot about the language.
I really like your .nx idea! It neatly sidesteps the nonsensical
mandatory casts and on top of that documents intent (the .nx being a
telltale sign of truncation -- much better than arbitrary implicit
rules). I think I'll adopt it in some form in my code, to make dealing
with narrow ints saner. I don't know how your .nx type is implemented,
but for my purposes I'll probably just use a thin wrapper around
byte/ubyte/etc. with overloaded arithmetic operators that perform the
requisite casts.
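Something along these lines, maybe (just a sketch of the idea, not your
actual .nx code -- the Narrow/nx names are made up). Since byte + byte
promotes to int, and the new deprecation makes assigning that back to a
byte require a cast, the wrapper does the truncating cast once, inside
the overloaded operator:

struct Narrow(T)
    if (is(T == byte) || is(T == ubyte) || is(T == short) || is(T == ushort))
{
    T value;
    alias value this;   // result converts back to T implicitly

    // value op rhs promotes to int as usual; truncate the result back to T
    Narrow opBinary(string op)(T rhs) const
    {
        return Narrow(cast(T)(mixin("value " ~ op ~ " rhs")));
    }

    // keep the wrapper "infectious" when both operands are wrapped
    Narrow opBinary(string op)(Narrow rhs) const
    {
        return opBinary!op(rhs.value);
    }
}

Narrow!T nx(T)(T value) { return Narrow!T(value); }

void main()
{
    byte a = 100, b = 20;
    byte c = a.nx + b;  // no explicit cast needed; wraps on overflow
}

A real version would presumably also want opBinaryRight, opOpAssign,
unary ops, etc., but that's the gist of it.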
And yeah, one thing I really like about D is that it empowers you with
the tools you need to implement types that are (almost) as powerful as
built-in types. While for the most part the built-in stuff is pretty
cool, when it's not up to snuff, in 99% of the cases you can just
replace it with your own solution, and it Just Works(tm).
T
--
Some ideas are so stupid that only intellectuals could believe them. -- George Orwell