Why must bitfields sum to a multiple of a byte?

monarch_dodra monarchdodra at gmail.com
Thu Aug 2 02:03:53 PDT 2012


On Wednesday, 1 August 2012 at 07:24:09 UTC, Era Scarecrow wrote:
> On Tuesday, 31 July 2012 at 20:41:55 UTC, Dmitry Olshansky 
> wrote:
>
>> Great to see things moving. Could you please do a separate 
>> pull for bitfields? It should get merged more easily, and it 
>> seems like a small but important bugfix.
>
> https://github.com/rtcvb32/phobos/commit/620ba57cc0a860245a2bf03f7b7f5d6a1bb58312
>
>  I've pushed the next update to my bitfields branch. All 
> unittests pass for me.

I had an (implementation) question for you:
Does the implementation actually require knowing the size of the 
padding?

e.g.:

import std.bitmanip; // for the bitfields template

struct A
{
    int a;
    mixin(bitfields!(
        uint,  "x",    2,
        int,   "y",    3,
        ulong, "",     3  // <- This line right there
    ));
}

Is that highlighted line really mandatory?
I'm fine with having it be optional, in case I'd want, say, a 
59-bit padding, but can't the implementation figure it out on 
its own?
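
For what it's worth, here is a rough sketch of what I mean by 
"figure it out": a hypothetical helper (not anything actually in 
std.bitmanip), assuming the rule would be to round the declared 
bit count up to the next supported storage size (8, 16, 32, or 
64 bits) and treat the remainder as padding:

// Hypothetical helper, not part of Phobos: round the declared bit
// count up to the next supported storage size and return the
// leftover bits as the implicit padding.
size_t inferPaddingBits(size_t declaredBits)
{
    foreach (size_t storage; [8, 16, 32, 64])
        if (declaredBits <= storage)
            return storage - declaredBits;
    assert(0, "bitfields cannot span more than 64 bits");
}

unittest
{
    assert(inferPaddingBits(2 + 3) == 3); // the struct A above: pad 5 bits to 8
    assert(inferPaddingBits(8) == 0);     // already exactly one byte
    assert(inferPaddingBits(5) == 3);     // inference picks 3 here, so an explicit
                                          // 59-bit padding would still have to be
                                          // written out by hand
}

That last case is why I'd only make the explicit padding field 
optional rather than remove it: a rounding rule like this can 
only guess the smallest fit.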

