typedefs are useless

BLS nanali at nospam-wanadoo.fr
Tue Dec 4 01:12:10 PST 2007


IMO, in this case Ada is slicker than D.
I remember that, for instance,
type Natural is range 0 .. 255;
is quite often used. (It has been a long time since I learned a bit of Ada.)

Having quality assurance and *reliable software* in mind, the Ada way is
much smarter than using D's assert() or DbC instead.
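
For comparison, here is roughly what that assert()/DbC emulation looks
like in D. This is only a sketch; the Ranged name and its set()/get()
interface are made up for illustration, not an existing library:

struct Ranged(long min, long max)
{
    private long value;

    // checked assignment: fails at run time if v is out of range
    void set(long v)
    in { assert(v >= min && v <= max, "value out of range"); }
    body { value = v; }

    long get() { return value; }
}

void main()
{
    Ranged!(0, 255) x;
    x.set(200);    // fine, the check passes at run time
    // x.set(300); // assertion failure at run time, not a compile error
}

The out-of-range value is only caught when the code runs, whereas with
Ada's range types the compiler can already reject an out-of-range literal.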

I hope we'll see this feature in D; maybe it is worth a feature request.
Bjoern

Peter C. Chapin wrote:
> Steven Schveighoffer wrote:
> 
>> Let's say I want a way to create a type that's like a long, but is not 
>> implicitly convertible from a long.
>>
>> I can do:
>>
>> typedef long mytype;
>>
>> However, I can't create literals of this type.  So if I want to initialize a 
>> mytype value to 6, I have to do:
>>
>> mytype x = cast(mytype)6L;
> 
> FWIW, Ada solves this problem by considering literals in a special type
> called "universal integer." It's special because you can't actually
> declare any variables of that type. However, universal integers can be
> implicitly converted to other types derived from Integer. So, in Ada it
> looks like this:
> 
> type My_Type is range 0 .. 10;    -- Or whatever range you need.
> 
> X : My_Type;
> Y : Integer;
> 
> ...
> 
> X := Y;    -- Type mismatch. Compile error.
> X := 1;    -- Fine. Universal integer converts to My_Type.
> 
> This sounds like what you want for D. Note, by the way, that the range
> constraint on a type definition in Ada must be static. Thus the compiler
> can always tell if the value of the universal integer (which can only be
> a literal) is in the right range.
> 
> Ada also has a concept of universal float to deal with floating-point
> literals in a similar way.
> 
> Peter


