Deprecate implicit conversion between signed and unsigned integers

Atila Neves atila.neves at gmail.com
Fri Feb 7 12:50:55 UTC 2025


On Thursday, 6 February 2025 at 09:10:41 UTC, Walter Bright wrote:
> [I'm not sure why a new thread was created?]
>
> This comes up now and then. It's an attractive idea, and seems 
> obvious. But I've always been against it for multiple reasons.
>
> 1. Pascal solved this issue by not allowing any implicit 
> conversions. The result was casts everywhere, which made the 
> code ugly. I hate ugly code.

I hate ugly code too, but I'd rather have explicit casts.
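
For the record, here is roughly what the explicit version would
look like in D if the implicit conversions were deprecated (the
deprecation itself is hypothetical here, this just sketches the
ergonomics being argued about):

    void main()
    {
        uint u = 42;
        // today this compiles silently; under the proposal it
        // would require an explicit cast:
        int i = cast(int) u;

        int n = -1;
        // likewise in the other direction:
        uint m = cast(uint) n;  // wraps to uint.max, but visibly
    }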

> 3. Is `1` a signed int or an unsigned int?

In Haskell, it could be either, and the type would be inferred. 
Or the programmer chooses:

1 :: Int

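For contrast, current D answers this question with a default plus
literal suffixes rather than inference (this is just existing
behaviour, not part of the proposal):

    void main()
    {
        auto a = 1;   // int: integer literals default to signed
        auto b = 1u;  // uint: the suffix picks unsigned explicitly
        static assert(is(typeof(a) == int));
        static assert(is(typeof(b) == uint));
    }
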
> 4. What happens with `p[i]`? If p is the beginning of a memory 
> object, we want i to be unsigned. If p points to the middle, we 
> want i to be signed. What should be the type of `p - q`? signed 
> or unsigned?

Good questions.
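
For reference, current D answers them the C way: pointer
subtraction yields signed ptrdiff_t, while lengths are unsigned
size_t, and the implicit mixing of the two is exactly where the
classic pitfall lives:

    void main()
    {
        int[4] buf;
        int* p = &buf[2];
        int* q = &buf[0];

        // pointer difference is signed, since it can be negative
        // when p points before q:
        ptrdiff_t d = p - q;  // 2

        // lengths are unsigned, so mixing them with signed
        // arithmetic wraps silently:
        int[] empty;
        auto oops = empty.length - 1;  // size_t.max, not -1
    }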

