Why is char initialized to 0xFF?

Andrej Mitrovic andrej.mitrovich at gmail.com
Sun Jun 9 04:01:28 UTC 2019


On Saturday, 8 June 2019 at 18:04:46 UTC, Adam D. Ruppe wrote:
> On Saturday, 8 June 2019 at 17:55:07 UTC, James Blachly wrote:
>> char is a UTF8 character, but 0xFF is specifically 
>> forbidden[3] by the UTF8 specification.
>
> And that is exactly why it is the default: the idea here is to 
> make uninitialized variables obvious, because they will be a 
> predictable, but invalid value when they appear.
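
For concreteness, this is what those defaults look like in
practice (a minimal, runnable sketch; char.init is 0xFF,
float.init is NaN, int.init is 0):

    import std.stdio;

    void main()
    {
        char c;    // char.init == 0xFF, deliberately not valid UTF-8
        float f;   // float.init is NaN
        int i;     // int.init == 0

        writefln("c = 0x%02X", c);  // prints: c = 0xFF
        writeln("f = ", f);         // prints: f = nan
        writeln("i = ", i);         // prints: i = 0
    }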

To me they are neither obvious nor useful, especially when I 
interface with C/C++. If I pass a default-initialized char or 
float to a C/C++ library by mistake, I get weird output written 
to some distant data field. The end result is either broken data 
somewhere down the line or garbled output in the UI.
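
A minimal sketch of the kind of mistake I mean, assuming the C 
side expects a NUL-terminated string:

    import core.stdc.stdio : printf;

    void main()
    {
        char[16] buf;   // every byte default-initializes to 0xFF
        // Passing this to C code without filling it in first makes
        // printf scan past the array for a '\0' that isn't there:
        printf("%s\n", buf.ptr);  // garbled output / undefined behavior
    }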

I much prefer default values which are correct for 99% of the 
intended use-cases. I make full use of the fact that integers 
default-initialize to zero; I think it's a great "feature". If 
there were a NaN for integers, I'd probably hate it.
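
For instance, zero is exactly the starting value an accumulator 
wants, so the default just works; a small illustrative sketch:

    struct Stats
    {
        int count;   // default-initializes to 0
        long total;  // default-initializes to 0
    }

    void record(ref Stats s, long value)
    {
        s.count++;         // counting starts at 0, no explicit init
        s.total += value;  // summing starts at 0, no explicit init
    }

    void main()
    {
        Stats s;           // no field needs to be set by hand
        record(s, 10);
        record(s, 32);
        assert(s.count == 2 && s.total == 42);
    }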

I would prefer it if the compiler (or a tool!) had a switch such 
as --check-use-before-initialize, with code-flow analysis and all 
that good stuff; a sketch of what it could catch follows.
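
The flag name above is just my own strawman, but here is a 
hypothetical example of the kind of code such flow analysis could 
flag ("= void" is D's existing opt-out from default 
initialization):

    int pick(bool cond)
    {
        int x = void;  // explicitly left uninitialized
        if (cond)
            x = 42;
        return x;      // flow analysis could report: x may be read
                       // uninitialized on the path where cond is false
    }

    void main()
    {
        import std.stdio : writeln;
        writeln(pick(false));  // prints whatever garbage was in x
    }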

