Integer conversions too pedantic in 64-bit
Iain Buclaw
ibuclaw at ubuntu.com
Tue Feb 15 07:17:18 PST 2011
== Quote from dsimcha (dsimcha at yahoo.com)'s article
> Now that DMD has a 64-bit beta available, I'm working on getting a whole bunch
> of code to compile in 64 mode. Frankly, the compiler is way too freakin'
> pedantic when it comes to implicit conversions (or lack thereof) of
> array.length. 99.999% of the time it's safe to assume an array is not going
> to be over 4 billion elements long. I'd rather have a bug the 0.001% of the
> time than deal with the pedantic errors the rest of the time, because I think
> it would be less total time and effort invested. To force me to either put
> casts in my code everywhere or change my entire codebase to use wider integers
> (with ripple effects just about everywhere) strikes me as purity winning out
> over practicality.
I have a similar grudge about shorts being implicitly promoted to ints,
resulting in hundreds of unwanted casts cluttering up code everywhere,
e.g.:
short a, b, c;
a = b + c;  // error: b + c has type int, which won't implicitly narrow to short
Hidden implicit casts should die.