[Issue 23573] New: std.bitmanip.bitfields doesn't respect native endianness
d-bugmail at puremagic.com
Wed Dec 21 10:16:56 UTC 2022
https://issues.dlang.org/show_bug.cgi?id=23573
Issue ID: 23573
Summary: std.bitmanip.bitfields doesn't respect native endianness
Product: D
Version: D2
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: P1
Component: phobos
Assignee: nobody at puremagic.com
Reporter: ibuclaw at gdcproject.org
Given the following definition (compiled with -preview=bitfields):
```
union realbits
{
    real value;
    struct
    {
        version (LittleEndian)
        {
            private ubyte[6] padding;
            ulong significand;
            uint exp : 15;
            bool sign : 1;
        }
        else
        {
            bool sign : 1;
            uint exp : 15;
            ulong significand;
            private ubyte[6] padding;
        }
    }
}
```
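For illustration, usage might look like the sketch below. It reuses the realbits union above, assumes that field order actually matches the target's real representation, and the dumpReal name is made up for this example:
```
import std.stdio : writefln;

// Illustration only: relies on the realbits union above matching the
// target's `real` layout.
void dumpReal(real x)
{
    realbits r;
    r.value = x;
    writefln("sign=%d exp=0x%04x significand=0x%016x",
             r.sign, r.exp, r.significand);
}
```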
You'd have thought that the equivalent std.bitmanip.bitfields declaration would be identical to the union above, but no! You declare the bitfields in the same order regardless of the native endianness of the target.
```
import std.bitmanip : bitfields;

version (LittleEndian)
{
    // ...
    mixin(bitfields!(
        uint, "exponent", 15,
        bool, "sign",      1));
}
else
{
    version (THIS_GENERATES_WRONG_LAYOUT)
    {
        mixin(bitfields!(
            bool, "sign",      1,
            uint, "exponent", 15));
    }
    else
    {
        mixin(bitfields!(
            uint, "exponent", 15,
            bool, "sign",      1));
    }
    // ...
}
```
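For reference, here is a small self-contained check of the allocation order (ExpSign is just an illustrative name, not from the report). std.bitmanip.bitfields allocates fields starting from the least significant bit of the backing integer on every target, which is why the declaration order has to stay the same on both endiannesses, whereas a native/C bitfield on a big-endian target typically allocates from the most significant bit down:
```
import std.bitmanip : bitfields;

struct ExpSign
{
    // Same declaration order on every target.
    mixin(bitfields!(
        uint, "exponent", 15,
        bool, "sign",      1));
}

void main()
{
    ExpSign es;
    es.sign = true;

    // "sign" was declared last, so it occupies the most significant bit of
    // the 16-bit backing store, independently of the target's byte order.
    // A native bitfield declared in the same order would typically put
    // "sign" in the least significant bit on a big-endian target.
    assert(*cast(const ushort*) &es == 0x8000);
}
```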
There could be a valid reason for this behaviour, and if so it should be explicitly documented.

Otherwise, if this is indeed a bug, then it looks like the fix would be something along the lines of iterating over the bitfields with foreach vs. foreach_reverse, depending on whether the target is little or big endian respectively.
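Pending such a fix in Phobos, the same idea can be sketched at user level. This is only a sketch under stated assumptions: nativeBitfields is a hypothetical helper (not part of Phobos), the fields are assumed to exactly fill the backing integer, and the target's C ABI is assumed to allocate big-endian bitfields MSB-first. It reverses the (type, name, width) triples on big-endian targets before delegating to std.bitmanip.bitfields:
```
import std.bitmanip : bitfields;
import std.meta : AliasSeq;

// Hypothetical helper: reverse the (type, name, width) triples on big-endian
// targets so one declaration order yields the native layout everywhere.
template nativeBitfields(Args...)
{
    version (LittleEndian)
    {
        enum nativeBitfields = bitfields!Args;
    }
    else
    {
        // Rebuild the argument list with the triples in reverse order.
        template ReverseTriples(Fields...)
        {
            static if (Fields.length <= 3)
                alias ReverseTriples = Fields;
            else
                alias ReverseTriples =
                    AliasSeq!(ReverseTriples!(Fields[3 .. $]), Fields[0 .. 3]);
        }
        enum nativeBitfields = bitfields!(ReverseTriples!Args);
    }
}

// One declaration, intended to match the native bitfield layout on both endians.
struct ExpSignNative
{
    mixin(nativeBitfields!(
        uint, "exponent", 15,
        bool, "sign",      1));
}
```
With something along these lines, a single declaration order could be written once and produce the native layout on either endianness.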