Null references redux

Walter Bright newshound1 at digitalmars.com
Sat Sep 26 17:55:44 PDT 2009


Andrei Alexandrescu wrote:
> Walter Bright wrote:
>> Even forcing an explicit initializer doesn't actually solve the 
>> problem - my experience with such features is that programmers simply 
>> insert any old value to get the code past the compiler; even 
>> programmers who know it's a bad idea do it anyway.
> 
> I think you go wrong in not realizing how many bugs come from 
> references that people have forgotten to initialize. Once you 
> acknowledge those, you will start to realize that a reference that 
> must compulsorily be initialized is valuable.

The problem is that it's worse to force people to provide an 
initializer. Most of the time that works out fine, but the one silent 
bad value producing silent bad output outweighs all of it. A null 
pointer dereference, by contrast, fails immediately and loudly; it does 
not produce bad output that can be overlooked.
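
To make the two failure modes concrete, here's a minimal D sketch (the 
Account class is hypothetical, not from Phobos):

    import std.stdio;

    class Account
    {
        int balance;
        this(int b) { balance = b; }
    }

    void main()
    {
        // Forgotten initialization under the status quo: the reference
        // defaults to null, and the first dereference fails loudly.
        Account a;                  // class references default to null
        //writeln(a.balance);      // would crash right here if uncommented

        // Forgotten initialization under compulsory initializers: the
        // programmer types in any old value to satisfy the compiler,
        // and the bad number flows silently into the output.
        Account b = new Account(0); // dummy value just to compile
        writeln(b.balance);         // prints 0: silent bad output
    }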

Providing bad initializers just to shut the compiler up isn't a 
theoretical problem. I have seen it in the wild every time some manager 
required that code compile without warnings and the compiler warned 
about a missing initializer.

I'm very much a fan of increasing D's ability to detect and head off 
common mistakes, but it's easy to tip over into seducing programmers 
into writing bad code just to placate an overly nagging compiler.

There's also the problem of how to represent an "empty" value. You have 
to create a special sentinel object, which you must then either test 
for explicitly or give member functions that throw. You're no better 
off with that, and arguably worse off.
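
For illustration, a sketch of such a sentinel in D (the Connection 
classes are hypothetical): every caller either checks for the sentinel 
at each use or takes an exception at some later, less obvious point, 
which buys nothing over the null dereference it replaces.

    import std.stdio;

    class Connection
    {
        void send(string msg) { writeln("sent: ", msg); }
    }

    // The special "empty" value: an object whose members throw.
    class EmptyConnection : Connection
    {
        override void send(string msg)
        {
            throw new Exception("send() called on an empty Connection");
        }
    }

    void main()
    {
        Connection c = new EmptyConnection(); // used in place of null
        // Either test for the sentinel explicitly at every use site...
        if (cast(EmptyConnection) c is null)
            c.send("hello");
        // ...or skip the test and get the throwing member function later.
    }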


> You think from another perspective: you strongly believe that *most* of 
> the time you can't or shouldn't initialize a reference. Your code in 
> Phobos reflects that perspective. In the RegExp class, for example, you 
> very often define a variable at the top of a long function and 
> initialize it halfway through. I trivially replaced such code with 
> correct code that defines symbols just where they're needed.

That style doesn't reflect anything more than my old C habits, from 
back when C required all declarations to come before any statements. I 
know it's bad style, and I do it less and less over time.
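
A trivial illustration of the two styles in D (hypothetical function, 
not the actual RegExp code):

    import std.conv : to;

    // Old C habit: declared at the top, initialized halfway through.
    int parseOld(string s)
    {
        int result;             // declared long before it has a value
        // ... other work ...
        result = to!int(s);     // finally initialized
        return result;
    }

    // The tighter style: define the symbol just where it's needed.
    int parseNew(string s)
    {
        // ... other work ...
        int result = to!int(s); // declared and initialized together
        return result;
    }

    void main()
    {
        assert(parseOld("42") == parseNew("42"));
    }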


