Deprecate implicit conversion between signed and unsigned integers

Jonathan M Davis newsgroup.d at jmdavisprog.com
Thu Feb 20 03:37:41 UTC 2025


On Monday, February 17, 2025 3:24:37 PM MST Walter Bright via dip.ideas wrote:
> On 2/17/2025 1:06 AM, Atila Neves wrote:
> >> (Did I mention that explicit casts also hide errors introduced by refactoring?)
> >
> > `cast(typeof(foo)) bar`?
>
> That can work, but when best practices mean adding more code, the result is
> usually failure.
>
> Also, what if `foo` changes to something not anticipated by that cast?

That's part of why, if I were creating a new language, I'd want a level of
conversion in between implicit and explicit, though I don't have a good name
for the idea, since "explicit implicit casts" isn't exactly a good one. But
essentially, it would be nice to have a defined set of conversions like we
get with implicit casts, except that they don't actually happen implicitly.
Rather, you use some sort of explicit cast to tell the compiler that you
want the conversion to occur, but that cast only allows that subset of
"implicit" conversions rather than being the blunt instrument that you
typically get with casts, which will do things like reinterpret the memory.

But of course, we don't have anything like that in D, and it probably
wouldn't make sense to retrofit it in at this point, though we could
certainly define more restrictive casts via templated functions (e.g. like
C++ does with stuff like dynamic_cast and const_cast) in order to allow a
particular piece of code to be more selective about the casting that it
allows, so that it can have a cast without risking it turning into a
reinterpret cast or whatnot.

As for converting between signed and unsigned... I'm definitely mixed on
this one. I follow essentially the rules that you mentioned for using signed
vs unsigned, but I _have_ been bitten by this (quite recently in fact), and
it was hard to catch. On the other hand, I don't know how many casts would
be required in general if we treated conversions between signed and unsigned
as narrowing conversions and thus required a cast. Since I mostly just use
unsigned via size_t (there are exceptions, but they're rare), I suspect that
I wouldn't need many casts in my code, but I don't know. And the code that I
got bitten with recently was templated, which could make handling it
trickier (though in this case, I could have just cast to long, and that's
what I needed to do anyway).

My guess is that we'd be better off with requiring the casts, but I don't
know. It _is_ arguably trading off one set of bugs for another, but it would
also force you to think about what you want with any particular conversion
rather than silently doing something that you don't necessarily want. Casts
do become more problematic with refactoring, but the lack of casts is
similarly problematic, since those also can change behavior silently. It's
just for a different set of types. Realistically, I would expect that some
code would have fewer bugs with the cast requirement, and other code would
have more, but I would _guess_ (based on how I code at least) that the net
result would be fewer.

- Jonathan M Davis




