convert char[4] to uint at compile time

Moritz Warning moritzwarning at web.de
Tue Dec 23 11:06:10 PST 2008


On Tue, 23 Dec 2008 13:16:28 +0300, Denis Koroskin wrote:

> On Tue, 23 Dec 2008 11:07:08 +0300, Janderson <ask at me.com> wrote:
> 
>> Moritz Warning wrote:
>>> On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
>>>
>>>> Reply to Moritz,
>>>>
>>>>> Hi,
>>>>>
>>>>> I have problems converting a char[4] to a uint at compile time. All
>>>>> variations I've tried of using an enum crash dmd:
>>>>>
>>>>> union pp { char[4] str; uint num; }
>>>>> const uint x = pp("abcd").num;
>>>>> This also doesn't work:
>>>>>
>>>>> const uint x = cast(uint) x"aa aa aa aa";
>>>>>
>>>>> Any ideas?
>>>>>
>>>>>
>>>> template Go (char[4] arg)
>>>> {
>>>>     const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
>>>>     arg[3];
>>>> }
>>>>
>>>> import std.stdio;
>>>> void main()
>>>> {
>>>>    writef("%x\n", Go!("Good"));
>>>> }
>>>  Thanks!
>>> That workaround should do it.
>>>  Maybe it will be possible to just do cast(uint) "abcd" in the future.
>>> :>
>>
>> That would only cast the pointer.  It should be something like:
>> cast(uint)(*"abcd") or *cast(uint*) "abcd".
>>
>> -Joel
> 
> And what about endianness? You can't have a feature in a language that
> gives different results in different environments.

The use of uint in my example might be confusing.
I only needed an environment-independent bit pattern of 4 bytes.
I used an integer because comparing a uint is faster than comparing
a char[4] with DMD. :/

(GDC doesn't show such behavior)
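For reference, a minimal sketch along the lines of BCS's template (the
FourCC name and the big-endian byte order are my own choices, not from
the thread). Because the bytes are combined with explicit shifts instead
of being reinterpreted through a pointer cast, the constant comes out the
same on little- and big-endian hosts, and matching it at runtime is a
single uint comparison:

import std.stdio;

// Combine the four characters with explicit shifts (most significant
// byte first); the result does not depend on the host's byte order.
template FourCC(char[4] arg)
{
    const uint FourCC = (arg[0] << 24) | (arg[1] << 16)
                      | (arg[2] << 8)  |  arg[3];
}

void main()
{
    const uint magic = FourCC!("abcd");   // computed at compile time

    uint tag = FourCC!("abcd");
    // One integer comparison instead of comparing a char[4].
    writef("0x%x match: %s\n", magic, magic == tag ? "yes" : "no");
}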

