Bartosz about Chapel

foobar foo at bar.com
Thu Nov 10 12:33:58 PST 2011


Walter Bright Wrote:

> On 11/9/2011 10:13 PM, Caligo wrote:
> > Something like this would have been better, and if I recall this is how Chapel
> > is doing it:
> >
> > int(32)  a;  // int
> > int(64)  b;  // long
> 
> I did consider something like that early on, but couldn't get past
> 
> 1. how ugly it looks after a while
> 
> 2. the implication that int(17) should work, or int(1), etc.
> 
> 3. the reality that CPUs tend to have 32 as the optimal size, and 'int' being a 
> natural thing to type, it tends to be the default one types in. Otherwise, how 
> would you explain that one should use int(32) rather than int(16) for routine 
> integer use? It just seems (to me) that it would encourage the use of 
> inefficient integer types.
> 
> 4. I bet the first thing programmers would do with such types is:
> 
>     alias int(32) myint;
> 
> and then we're in a mishmash of non-standard types (this happens too often in C 
> where the sizeof(int) is unreliable).
> 
> 5. I don't need constant reminding that ints are 32 bits.

1. This makes sense on 32-bit platforms. What about 64-bit platforms? 
2. What about int(128), int(16), int(8)? For the non-default sizes (!= 32) you do want to know the size. In what way are 'short' and 'byte' better? (See the sketch after this list.) 
3. The default, optimal size should indeed have an alias such as 'int'.
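
To make point 2 concrete, here's what the parametric spelling looks like in Chapel syntax (illustrative only, based on my reading of the Chapel spec):

    // In D you must remember that 'short' is 16 bits and 'byte' is 8.
    // With the parametric form the width is part of the type name:
    var s: int(16);   // vs. D's 'short'
    var b: int(8);    // vs. D's 'byte'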


A much better scheme, IMO, is to define one general parameterized type and predefine an easy-to-remember alias for the default (32). Wait a sec, that's what Chapel did.. 
The problem isn't with "int" but rather with the proliferation of all the other type names for every combination of signedness and size. 
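
For reference, a minimal sketch of that scheme in Chapel syntax (as far as I can tell from the Chapel spec; note that Chapel's predefined 'int' is actually an alias for int(64), not int(32)):

    var a: int(32);        // size spelled out where it matters
    var b: int(64);
    type myint = int(32);  // a user alias, as in Walter's point 4
    var n: int;            // the predefined easy-to-remember default alias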


