The bitfields setters give incorrect results if the bitfield is 64 bits long and the first field is 32 bits long or shorter. The following program returns the correct result:

import std.stdio;
import std.bitmanip;
import std.string;

void main()
{
    union S
    {
        // entire 64-bit unsigned integer
        ulong bits = ulong.max;

        // split in two
        mixin (bitfields!(
            ulong, "back",  33,
            ulong, "front", 31)
        );

        const string toHex() { return format("0x%016X", bits); }
    }

    S num;
    num.bits = ulong.max;
    writeln("num1 = ", num.toHex);
    num.back = 1;
    writeln("num2 = ", num.toHex);
}

Output:
num1 = 0xFFFFFFFFFFFFFFFF
num2 = 0xFFFFFFFE00000001

But if you change the bitfields to this:

mixin (bitfields!(
    ulong, "back",  32,
    ulong, "front", 32)
);

Output:
num1 = 0xFFFFFFFFFFFFFFFF
num2 = 0x0000000000000001

The front half, which shouldn't be changed, is converted to zeros. Or:

mixin (bitfields!(
    ulong, "back",  31,
    ulong, "front", 33)
);

Output:
num1 = 0xFFFFFFFFFFFFFFFF
num2 = 0x0000000080000001

I experimented a little bit:

1) The types of the fields (ubyte, ushort, etc.) don't seem to matter as long as they are wide enough to hold the value.
2) It's only the first field that is broken, as far as I can tell.
3) Additional fields don't solve the problem.

I looked at the code for std.bitmanip, but it's beyond my ability to modify, so I'll have to rely on the kindness of strangers for a fix.
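For reference, here is a minimal sketch (plain ulong arithmetic, not std.bitmanip) of what setting a 32-bit "back" field at offset 0 should amount to: clear only back's own bits with a 64-bit mask and leave front untouched.

import std.stdio;

void main()
{
    ulong bits = ulong.max;

    // what "num.back = 1" should do for a 32-bit back field at offset 0
    enum ulong backMask = (1uL << 32) - 1;           // 0x00000000FFFFFFFF
    ulong value = 1;
    bits = (bits & ~backMask) | (value & backMask);  // front bits untouched

    writefln("0x%016X", bits);                       // 0xFFFFFFFF00000001
}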
My current branch resolves this issue; once the pull request is accepted it will be fixed.

https://github.com/rtcvb32/phobos/commit/620ba57cc0a860245a2bf03f7b7f5d6a1bb58312

The problem is in how the code handles the enum mask, which is exactly 32 bits wide: when it is inverted, it inverts to 0, masking nothing.

Guilty code:

extendSign = ~((cast(MasksType)1u << len) - 1);

Resolution:

extendSign = cast(MasksType) ~((1uL << len) - 1);
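A rough numeric illustration of the difference (just the arithmetic, not the Phobos source itself): inverting the 32-bit all-ones mask inside a 32-bit type yields 0, while building the mask in 64 bits before inverting preserves the upper half of the store.

import std.stdio;

void main()
{
    enum int len = 32;

    // built and inverted entirely in 32 bits: the all-ones mask flips to 0
    uint narrow = ~cast(uint)((1uL << len) - 1);
    // built in 64 bits, then inverted: the upper half survives
    ulong wide = ~((1uL << len) - 1);

    writefln("narrow = 0x%08X", narrow);   // narrow = 0x00000000
    writefln("wide   = 0x%016X", wide);    // wide   = 0xFFFFFFFF00000000
}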
https://github.com/D-Programming-Language/phobos/pull/1613