Deprecate implicit conversion between signed and unsigned integers

Jonathan M Davis newsgroup.d at jmdavisprog.com
Thu Feb 20 03:14:08 UTC 2025


On Wednesday, February 5, 2025 4:43:37 AM MST Quirin Schroll via dip.ideas wrote:
> Those are annoying, yes. Especially unary operators. If you asked
> me right now what `~x` returns on a small integer type, I
> honestly don’t know.

IIRC, the operands of _all_ operations on integer types smaller than int get
converted to int first, and then, if the compiler can determine for certain
that the result would fit in a smaller type, the result can be implicitly
converted back to the smaller type - but in most cases, it can't know that.
~x would probably implicitly convert, but like you, I'd have to test it.

> D has C’s rules because of one design decision early on: If it
> looks like C, it acts like C or it’s an error.

Yes, but the issue with cases like this is more that they could be turned
into errors when they currently aren't, rather than us looking to change the
behavior to something else.

- Jonathan M Davis
