Bartosz about Chapel

Walter Bright newshound2 at digitalmars.com
Thu Nov 10 01:39:30 PST 2011


On 11/9/2011 10:13 PM, Caligo wrote:
> Something like this would have been better, and if I recall correctly, this
> is how Chapel does it:
>
> int(32)  a;  // int
> int(64)  b;  // long

I did consider something like that early on, but couldn't get past

1. how ugly it looks after a while

2. the implication that int(17) should work, or int(1), etc.

3. the reality that CPUs tend to have 32 bits as the optimal operand size, and 
since 'int' is a natural thing to type, it tends to be the default one reaches 
for. Otherwise, how would you explain that one should use int(32) rather than 
int(16) for routine integer use? It just seems (to me) that it would encourage 
the use of inefficient integer types.

4. I bet the first thing programmers would do with such types is:

    alias int(32) myint;

and then we're in a mishmash of non-standard types (this happens too often in C, 
where sizeof(int) is unreliable).
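
To make that concrete, here is a minimal C sketch of the kind of mishmash meant 
above (the alias names are hypothetical, chosen only to illustrate the pattern 
that grows up around an unreliable int size):

    #include <stdio.h>

    /* Hypothetical project-local aliases for "a 32-bit integer", each
       invented because sizeof(int) is not pinned down by the C standard. */
    typedef int          myint;   /* assumes int is 32 bits  */
    typedef long         INT32;   /* assumes long is 32 bits */
    typedef unsigned int u32;     /* yet another spelling    */

    int main(void)
    {
        /* On a typical LP64 platform long is 64 bits, so these sizes
           disagree, and code that mixes the aliases across libraries
           silently changes meaning. */
        printf("sizeof(myint) = %zu\n", sizeof(myint));
        printf("sizeof(INT32) = %zu\n", sizeof(INT32));
        printf("sizeof(u32)   = %zu\n", sizeof(u32));
        return 0;
    }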

5. I don't need constant reminding that ints are 32 bits.

