Inferring an integer literal as ubyte

Shriramana Sharma via Digitalmars-d-learn
Mon Dec 14 05:33:41 PST 2015

Hello. I was trying to do something like this:

ubyte code = to!ubyte(spec, 6) + 16;

and got an error saying:

cannot implicitly convert expression (cast(int)to(spec, 6) + 16) of type int to ubyte
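
If it helps, here is a reduced example, independent of std.conv, that I 
believe shows the same thing (variable names are just for illustration):

void main()
{
    ubyte b = 6;

    // Rejected: b is promoted to int for the addition, so the whole
    // expression is typed int, and its possible range (16 .. 271) does
    // not fit back into ubyte, hence no implicit conversion.
    //ubyte c = b + 16;
}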

Looking at the language spec, sure enough, 16 is specified to be inferred 
as an `int`.

I thought that integer literals used to be inferred as the smallest integral 
type that can fit them – am I mistaken?
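
For comparison, a bare literal on its own does still seem to narrow here; 
it is only the result of the addition that ends up typed as int. A small 
sketch of what does compile for me (assuming a wrapping cast is acceptable, 
or a checked conversion via std.conv.to otherwise):

import std.conv : to;

void main()
{
    ubyte a = 16;                  // compiles: the literal itself fits into ubyte
    ubyte b = 6;

    ubyte c = cast(ubyte)(b + 16); // compiles, but silently wraps on overflow
    ubyte d = to!ubyte(b + 16);    // compiles, throws ConvOverflowException on overflow
}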

Shriramana Sharma, Penguin #395953
