Integer conversions too pedantic in 64-bit

dsimcha dsimcha at yahoo.com
Mon Feb 14 20:54:09 PST 2011


Actually, the more I think about it, the more I realize it was kind of 
a corner case.  Basically, in the two programs that were a royal PITA 
to convert to 64-bit, I was storing arrays of indices into other 
arrays, and this needed to be reasonably memory-efficient.  There is no 
plausible way that any array I was storing an index into could have 
more than 4 billion elements, so I used uints instead of size_ts.
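
Roughly, the pattern looked like this (a minimal sketch with made-up 
names, not the actual code):

import std.conv : to;

// Indices into `data` are stored as uint rather than size_t to halve
// the memory cost on 64-bit, assuming data.length never exceeds uint.max.
uint[] buildIndex(double[] data)
{
    uint[] indices;

    foreach (i, x; data)             // i is size_t: 64 bits wide on x86-64
    {
        if (x > 0.5)
            indices ~= cast(uint) i; // explicit narrowing cast required
        // Safer alternative that throws if the index doesn't fit:
        // indices ~= to!uint(i);
    }
    return indices;
}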

On 2/14/2011 11:47 PM, Jason House wrote:
> The very fact that you didn't have issues with size_t before compiling in 64-bit mode seems like a shortcoming of D. It should be hard to write code that isn't platform independent. One would kind of hope that size_t was a distinct type that could have uints assigned to it without casts. It might even be good to allow size_t to implicitly convert to ulong.
>
> dsimcha Wrote:
>
>> Now that DMD has a 64-bit beta available, I'm working on getting a whole bunch
>> of code to compile in 64-bit mode.  Frankly, the compiler is way too freakin'
>> pedantic when it comes to implicit conversions (or lack thereof) of
>> array.length.  99.999% of the time it's safe to assume an array is not going
>> to be over 4 billion elements long.  I'd rather have a bug the 0.001% of the
>> time than deal with the pedantic errors the rest of the time, because I think
>> it would be less total time and effort invested.  To force me to either put
>> casts in my code everywhere or change my entire codebase to use wider integers
>> (with ripple effects just about everywhere) strikes me as purity winning out
>> over practicality.
>
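
For reference, the kind of error being described boils down to 
something like this (a small hypothetical example, not code from 
either post):

void main()
{
    auto arr = new int[](10);

    // uint n = arr.length;          // 32-bit: compiles; 64-bit: error,
    //                               // cannot implicitly convert ulong to uint

    uint n = cast(uint) arr.length;  // compiles on both, but silently
                                     // truncates if length exceeds uint.max
    size_t m = arr.length;           // portable: no cast on either target
}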


