core.simd ubyte16 initialization weirdness.
    realhet 
    real_het at hotmail.com
       
    Sun May  7 21:07:38 UTC 2023
    
    
  
Hello,
```
import std, core.simd;
void main()
{
    enum ubyte16
        good1 = mixin([1, 2, 3, 4]), // good: compiles and works
        bad   = [1, 2, 3, 4];        // bad: compiles, but lanes come out wrong
    static immutable ubyte16
        good2 = mixin([1, 2, 3, 4]), // good
        crash = [1, 2, 3, 4];        // bad
     pragma(msg, good1);
     pragma(msg, bad);
     pragma(msg, good2);
     pragma(msg, crash);
}
```
In the above example I tried four ways to initialize ubyte16 
constants.
I only specify the first 4 values; the remaining lanes should be 
zero automatically.
Two of them work and two of them don't.
One enum version compiles, but after trying the SSE pshufb 
instruction with it, it seems like [1, 2, 3, 4, then 12 zeros] 
is distorted into this: [1,1,1,1,2,2,2,2,3,3,3,3,4,4,4,4]
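To confirm what the constant actually holds, here is a minimal 
check I can run (a sketch; core.simd vectors expose their 
underlying static array through the .array property):
```
import core.simd, std.stdio;

void main()
{
    enum ubyte16 bad = [1, 2, 3, 4];
    ubyte16 v = bad;   // force a runtime copy of the enum value
    writeln(v.array);  // print all 16 lanes as stored in memory
}
```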
I discovered the mixin() trick when I wanted to initialize them 
with a computed series.
Is this a bad way to initialize these constants? Is there a 
better way?
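For the record, this is the kind of computed series I generate 
with it (a sketch mirroring the mixin([1, 2, 3, 4]) trick above; 
iota/map/array are Phobos functions evaluated in CTFE, and 
mixin() stringifies the resulting int[] back into an array 
literal):
```
import core.simd;
import std.algorithm : map;
import std.array : array;
import std.range : iota;

// CTFE builds the int[] [0, 1, 4, 9, ..., 225]; mixin() turns it
// back into source text, which initializes the vector like good1
enum ubyte16 squares = mixin(iota(16).map!(i => i * i).array);
```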
```
cast(immutable(__vector(ubyte[16])))[cast(ubyte)1u, cast(ubyte)2u,
    cast(ubyte)3u, cast(ubyte)4u, cast(ubyte)0u, cast(ubyte)0u,
    cast(ubyte)0u, cast(ubyte)0u, cast(ubyte)0u, cast(ubyte)0u,
    cast(ubyte)0u, cast(ubyte)0u, cast(ubyte)0u, cast(ubyte)0u,
    cast(ubyte)0u, cast(ubyte)0u]
```
This is how the compiler dumps it. But it's not very compact, so 
I'd rather use the mixin() version. I just don't understand why 
the plain int array literal fails -> [1, 2, 3, ...] would look 
so nice.
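Until that's clear, a small helper might sidestep the literal 
entirely (a sketch: vec16 is a hypothetical name, and I'm relying 
on the documented cast between a vector and its same-size static 
array; I haven't checked whether it also works in CTFE for 
static immutable initializers):
```
import core.simd;

// hypothetical helper: copy the given bytes into a zero-filled
// 16-byte static array, then reinterpret it as a vector
ubyte16 vec16(const(ubyte)[] prefix...)
{
    ubyte[16] full;                      // unspecified lanes stay zero
    full[0 .. prefix.length] = prefix[];
    return cast(ubyte16) full;           // same-size static array -> vector
}
```
With that, vec16(1, 2, 3, 4) reads almost as nicely as the plain 
literal.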
    
    