automatic int to short conversion - the HELL?

Sean Kelly sean at invisibleduck.org
Fri Sep 19 17:11:42 PDT 2008


bearophile wrote:
> Don:
> 
>> But the solution is NOT to leave the language as-is, only disallowing 
>> signed-unsigned comparison. That's a cure that's as bad as the disease.
> 
> May I ask you why?
> 
> 
>> One of the biggest stupidities from C is that 0 is an int. Once you've 
>> done that, you HAVE to have implicit conversions. And once you have 
>> implicit conversions, you have to allow signed-unsigned comparison.
> 
> I don't understand (I know no languages where 0 isn't an int), can you explain a bit better?

I think this actually applies to any integer literal.  For example:

     short i = 0;
     unsigned j = 1;

In C, the above code implicitly converts the int literal 0 to short and 
the int literal 1 to unsigned.  Since literals are typed as int, if 
implicit conversions were disallowed, this code would have to be written:

     short i = (short)0;
     unsigned j = (unsigned)1;

which obviously stinks.  However, typed literals plus allowed conversions 
also make this legal:

     unsigned k = -2;

which makes no sense, given the types involved.
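
To make the wrap-around concrete, here's a small standalone C sketch 
(the variable names are just mine, for illustration) showing what 
actually ends up in k, plus the signed-unsigned comparison surprise 
Don mentioned:

     #include <stdio.h>

     int main(void)
     {
         unsigned k = -2;   /* -2 is converted modulo 2^N; with a 32-bit
                               unsigned this stores 4294967294           */
         int      n = -1;

         printf("k = %u\n", k);

         /* The usual arithmetic conversions turn n into unsigned here,
            so this compares 4294967295u < 2u, which is false.           */
         if (n < 2u)
             printf("n < 2u\n");
         else
             printf("n >= 2u  (surprise)\n");

         return 0;
     }

(Both of these compile quietly at default warning levels on most C 
compilers, which is exactly why the implicit conversions are so easy 
to miss.)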


Sean

