Why are unsigned to signed conversions implicit and don't emit a warning?
Andrej Mitrovic
none at none.none
Sun Apr 10 16:47:54 PDT 2011
I just had a little bug in my code. In the WindowsAPI bindings there's this alias:
alias ubyte BYTE;
Unfortunately I didn't check this, and I erroneously assumed BYTE was a signed type (blame it on my lack of coffee). So I wrote code like this:
alias Tuple!(byte, "red", byte, "green", byte, "blue") RGBTuple;

RGBTuple GetRGB(COLORREF cref)
{
    RGBTuple rgb;
    rgb.red   = GetRValue(cref);
    rgb.green = GetGValue(cref);
    rgb.blue  = GetBValue(cref);
    return rgb;
}
The rgb fields would often end up as -1 (yes, I know all about how signed vs. unsigned representation works). My fault, sure.
But what really surprises me is that these unsigned-to-signed conversions happen implicitly. I didn't even get a warning, even though I have all warning switches turned on.
I'm pretty sure GCC would complain about this in C code, and Visual Studio certainly complains if I enable the appropriate warnings, for example:
warning C4365: '=' : conversion from 'unsigned int' to 'int', signed/unsigned mismatch
warning C4365: '=' : conversion from 'unsigned short' to 'short', signed/unsigned mismatch
More information about the Digitalmars-d-learn mailing list