size_t + ptrdiff_t

Regan Heath regan at netmail.co.nz
Mon Feb 20 05:19:27 PST 2012


On Mon, 20 Feb 2012 11:28:44 -0000, Manu <turkeyman at gmail.com> wrote:

> On 20 February 2012 13:16, Walter Bright <newshound2 at digitalmars.com>  
> wrote:
>
>> On 2/20/2012 3:02 AM, Manu wrote:
>>
>>> ? I must have misunderstood something... I've never seen a 64bit C
>>> compiler
>>> where 'int' is 64bits.
>>>
>>
>> What are you using in C code for a most efficient integer type?
>>
>
> #ifdef. No 2 C compilers ever seem to agree.
> It's a major problem in C, hence bringing it up here. Even size_t is  
> often
> broken in C. I have worked on 64bit systems with 32bit pointers where
> size_t was still 64bit, but ptrdiff_t was 32bit (I think PS3 is like  
> this,
> but maybe my memory fails me)
>
> I want to be confident when I declare a numeric type that can interact  
> with
> pointers, and also when I want the native type.
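
To make that size_t/ptrdiff_t mismatch concrete, here's the sort of
probe I'd compile on each target to see what the toolchain actually
gives you (just a quick C99 sketch; the numbers obviously differ per
platform/ABI, and I'm not claiming anything about the PS3 specifically):

/* Probe the widths the compiler/ABI actually picked. */
#include <stdio.h>
#include <stddef.h>   /* size_t, ptrdiff_t */
#include <stdint.h>   /* intptr_t (optional in C99, but near-universal) */

int main(void)
{
    printf("void*     : %zu bytes\n", sizeof(void *));
    printf("int       : %zu bytes\n", sizeof(int));
    printf("long      : %zu bytes\n", sizeof(long));
    printf("size_t    : %zu bytes\n", sizeof(size_t));
    printf("ptrdiff_t : %zu bytes\n", sizeof(ptrdiff_t));
    printf("intptr_t  : %zu bytes\n", sizeof(intptr_t));
    return 0;
}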

I can imagine situations where you want to explicitly have a numeric type  
that can hold/interact with pointers, or you need /more/ width than the  
native/efficient int type.

But, in /all/ other cases surely we want the **compiler** to pick/use the  
native/most efficient int type/size.  Further, why should we have to state  
this explicitly?  Why shouldn't "int" just /be/ the native/most efficient  
type (as determined by the compiler during compilation of each block of  
code)?  I know, I know, this flies in the face of one of D's initial  
design decisions - being sure of the width of your types without having to  
guess or dig in headers for defines etc. - but remind me why this is a bad  
idea?

Because it just seems to me that we want "int" to be the native/most  
efficient type, and fixed-size types for the special/specific cases  
(like in struct definitions where alignment/size matters, etc.), i.e.

int a;   // native/efficient type
int16 b; // 16 bit int
int32 c; // 32 bit int
int64 d; // 64 bit int
..and so on..
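
For comparison, C99's <stdint.h> already provides the fixed-width half
of that split; it just never pins down what plain "int" is.  Something
like this (a sketch, with a made-up struct purely for illustration):

#include <stdint.h>

struct packet_header {    /* layout matters -> fixed widths */
    int16_t version;
    int32_t length;
    int64_t timestamp;
};

int count;                /* "whatever is convenient" -> plain int,
                             whose width the standard only bounds,
                             never fixes */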

But, assuming that's not going to change any time soon, we might be able  
to go the other way.  What if we had a built-in "nint" type, which we  
could use everywhere we don't care about integer width, and which  
resulted in the compiler picking the most efficient/native int width on a  
case-by-case basis (via code inspection, etc.; I'm not sure of the limits  
of this).
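
The closest existing approximations I can think of in C are the
int_fast*_t and intptr_t typedefs, so a poor man's "nint" might look
like the sketch below ("nint"/"pint" are just my made-up aliases here,
and of course a typedef can't adapt per block of code the way I'm
imagining):

#include <stdint.h>

/* Hypothetical aliases approximating the idea in C today: */
typedef int_fast32_t nint;   /* "fastest" int of at least 32 bits     */
typedef intptr_t     pint;   /* integer wide enough to hold a pointer */

nint total;                  /* don't care about width, just speed    */
pint addr_as_int;            /* needs to round-trip a pointer         */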

Regan

-- 
Using Opera's revolutionary email client: http://www.opera.com/mail/

