Deprecate implicit conversion between signed and unsigned integers

Quirin Schroll qs.il.paperinik at gmail.com
Fri Feb 14 00:09:14 UTC 2025


On Thursday, 6 February 2025 at 16:39:26 UTC, Kagamin wrote:
> On Monday, 3 February 2025 at 18:40:20 UTC, Atila Neves wrote:
>> https://forum.dlang.org/post/pbhjffbxdqpdwtmcbikh@forum.dlang.org
>
> I agree with Bjarne, the problem is entirely caused by abuse of 
> unsigned integers as positive numbers. And deprecation of 
> implicit conversion is impossible due to this abuse: signed and 
> unsigned integers will be mixed everywhere because signed 
> integers are proper numbers and unsigned integers are 
> everywhere due to abuse.

What would be a “proper number”? At best, signed and unsigned 
types represent various slices of the infinite integers.

> Counterexample is C# that uses signed integers in almost all 
> interfaces and it just works.

C# uses signed integers because not all CLR languages support 
unsigned types. There’s a 
[`CLSCompliantAttribute`](https://learn.microsoft.com/de-de/dotnet/api/system.clscompliantattribute) 
that warns you if you expose unsigned integers in your public 
API. That said, the case for 8-bit types is reversed: C#’s 
`byte` type is unsigned and `sbyte` is the signed, 
non-CLS-compliant variant.
