Implicit enum conversions are a stupid PITA

Walter Bright newshound1 at digitalmars.com
Thu Mar 25 10:38:41 PDT 2010


Yigal Chripun wrote:
> Walter Bright Wrote:
>> Pascal has explicit casts. The integer to character one is CHR(i), the
>> character to integer is ORD(c).
> I meant implicit, sorry about that. The Pascal way is definitely the
> correct way. What, in your opinion, are the semantics of ('f' + 3)? What
> about ('?' + 4)? Making such arithmetic valid is wrong.

Yes, that is exactly the opinion of Pascal. As I said, I've programmed in 
Pascal, suffered as it blasted my kingdom, and I don't wish to do that again. I 
see no use in pretending '?' does not have a numerical value that is very useful 
to manipulate.
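
For instance, a quick sketch of what that arithmetic looks like in D (the 
commented values assume the Unicode code points D's char type is defined over):

    import std.stdio;

    void main()
    {
        char c = 'f' + 3;            // 'f' is 0x66, so c becomes 'i'
        int  n = '?' + 4;            // '?' is 0x3F (63), so n is 67
        char u = 'a' - ('a' - 'A');  // the classic case conversion: 'A'
        writefln("%s %s %s", c, n, u);
    }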

> I'm sure that the first Pascal versions had problems which caused you to
> ditch that language (they were fixed later).

They weren't compiler bugs I was wrestling with. They were fundamental design 
decisions of the language.

> I doubt it though that this had a large impact on Pascal's
> problems.

I don't agree. Pascal was a useless language as designed. This meant that every 
vendor added many incompatible extensions. Anyone who used Pascal got locked 
into a particular vendor. That killed it.


>>> The fact that D has 12 integral types is bad design. Why do we need
>>> so many built-in types? To me this clearly shows a need to refactor
>>> this aspect of D.
>> Which would you get rid of? (13, I forgot bool!)
>> 
>> bool byte ubyte short ushort int uint long ulong char wchar dchar enum
> 
> You forgot the cent and ucent types, and what about 256-bit types?

They are reserved, not implemented, so I left them out. In or out, they don't 
change the point.


> Here's how I'd want it designed: First off, a Boolean type should not
> belong to this list at all and shouldn't be treated as a numeric type.
> Second, there are really only a few use cases that are relevant:
> 
> signed types for representing numbers:
> 1) unlimited integral type - int
> 2) limited integral type - int!(bits), e.g. int!16, int!8, etc.
> 3) user-defined range, e.g. [0, infinity) for positive numbers, etc.
> 
> unsigned bit-packs:
> 4) bits!(size), e.g. bits!8, bits!32, etc.
> 
> Of course you can define useful aliases, e.g. alias bits!8 Byte; alias
> bits!16 Word; or you can define the aliases per architecture, so that
> Word above is defined for the current arch (I don't know the native
> word size on, say, ARM and other platforms).

People are going to quickly tire of writing:

bits!8 b;
bits!16 s;

and are going to use aliases:

    alias bits!8 ubyte;
    alias bits!16 ushort;

Naturally, either everyone invents their own aliases (as they do in C with its 
indeterminate int sizes), or they are standardized, in which case we're back to 
pretty much exactly where we are now. I don't see that anything would be 
accomplished.
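
A library bits!(n) would inevitably bottom out in the built-in types anyway. 
Here's a minimal sketch of the idea (the template is illustrative, not an 
actual library facility):

    // Sketch: a library-level bits!(n) just selects one of the
    // existing built-in unsigned types, so nothing new is gained.
    template bits(int n)
    {
        static if (n <= 8)
            alias ubyte bits;
        else static if (n <= 16)
            alias ushort bits;
        else static if (n <= 32)
            alias uint bits;
        else static if (n <= 64)
            alias ulong bits;
        else
            static assert(0, "no built-in type that wide");
    }

    alias bits!8  Byte;   // and we're back to inventing names for ubyte
    alias bits!16 Word;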


> char and relatives should be for text only, per Unicode (perhaps a
> better name is code-point).

There have been many proposals to try and hide the fact that UTF-8 is really a 
multibyte encoding, but that makes for some pretty inefficient code in too many 
cases.
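
The multibyte nature is easy to demonstrate (a small sketch; std.utf.count 
has to decode the whole string to do its job):

    import std.stdio;
    import std.utf;

    void main()
    {
        string s = "héllo";   // 'é' encodes as 2 bytes in UTF-8
        writeln(s.length);    // 6 - code units (bytes)
        writeln(count(s));    // 5 - code points, found only by decoding
    }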

> For other encodings, use the above bit-packs, e.g. alias bits!7 Ascii;
> alias bits!8 ExtendedAscii; etc.
> 
> enum should be an enumeration type. You can find an excellent
> strongly-typed design in Java 5.0.

Those enums are far more heavyweight - they are syntactic sugar around a class 
type complete with methods, interfaces, constructors, etc. They aren't even 
compile-time constants! If you need those in D, it wouldn't be hard at all to 
make a library class template that does the same thing.
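
A rough sketch of what such a library type could look like, using the 
well-known Planet example from the Java documentation (a plain class here 
rather than a general template, and not an actual Phobos API):

    // Each enum member is a singleton object carrying data and methods,
    // built by a static constructor at runtime - which is exactly why
    // these are not compile-time constants.
    class Planet
    {
        immutable double mass;    // in kilograms
        immutable double radius;  // in meters

        private this(double m, double r) { mass = m; radius = r; }

        double surfaceGravity() const
        {
            return 6.67300e-11 * mass / (radius * radius);
        }

        static Planet EARTH;
        static Planet MARS;

        static this()
        {
            EARTH = new Planet(5.976e24, 6.37814e6);
            MARS  = new Planet(6.421e23, 3.3972e6);
        }
    }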



