Deprecate implicit conversion between signed and unsigned integers

Kagamin spam at here.lot
Sat Feb 15 18:33:52 UTC 2025


On Friday, 14 February 2025 at 00:09:14 UTC, Quirin Schroll wrote:
> What would be a “proper number”? At best, signed and unsigned 
> types represent various slices of the infinite integers.

The problem is that they are incompatible slices which you are 
forced to mix, because unsigned integers are abused everywhere. 
At best an unsigned integer gives you an extra bit, but in 
practice that doesn't cut it: when you want a bigger integer, 
you reach for a much wider integer, not one that is a single 
bit bigger.
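
To illustrate the kind of mixing in question, here is a small D 
sketch (the array, the offset and the printed messages are made 
up for the example):

    import std.stdio;

    void main()
    {
        int[] arr = [1, 2, 3];
        int offset = -1;

        // arr.length is size_t (unsigned). The signed offset is
        // implicitly converted, so -1 wraps around to size_t.max
        // and the condition is false even though -1 < 3.
        if (offset < arr.length)
            writeln("offset is in range");     // what you'd expect
        else
            writeln("offset is NOT in range"); // what actually prints

        // Reaching for a wider signed type instead of the "extra
        // bit" sidesteps the wraparound entirely.
        long wideOffset = offset;
        if (wideOffset < cast(long) arr.length)
            writeln("wide comparison behaves as expected");
    }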

> C# uses signed integers because not all CLR languages support 
> unsigned types.

That demonstrates that the problem is due to the abuse of 
unsigned integers.

