Null references redux
Andrei Alexandrescu
SeeWebsiteForEmail at erdani.org
Sat Sep 26 14:35:12 PDT 2009
Jeremie Pelletier wrote:
> Andrei Alexandrescu wrote:
>> Walter Bright wrote:
>>> Denis Koroskin wrote:
>>> > On Sat, 26 Sep 2009 22:30:58 +0400, Walter Bright
>>> > <newshound1 at digitalmars.com> wrote:
>>> >> D has borrowed ideas from many different languages. The trick is to
>>> >> take the good stuff and avoid their mistakes <g>.
>>> >
>>> > How about this one:
>>> >
>>> > http://sadekdrobi.com/2008/12/22/null-references-the-billion-dollar-mistake/
>>> >
>>> > :)
>>>
>>> I think he's wrong.
>>>
>>> Getting rid of null references is like solving the problem of dead
>>> canaries in the coal mines by replacing them with stuffed toys.
>>>
>>> It all depends on what you prefer a program to do when it encounters
>>> a program bug:
>>>
>>> 1. Immediately stop and produce an indication that the program failed
>>>
>>> 2. Soldier on and silently produce garbage output
>>>
>>> I prefer (1).
>>>
>>> Consider the humble int. There is no invalid int value whose use
>>> will cause a seg fault. In one case, an uninitialized int is set to
>>> garbage, and erratic results follow. In another, ints are (in D)
>>> default initialized to 0; 0 may or may not be what the logic of the
>>> program requires, and if it isn't, again, silently bad results
>>> follow.
>>>
>>> Consider also the NaN value that floats are default initialized to.
>>> This has the nice characteristic that you know your results are bad
>>> if they are NaN. But it has the bad characteristic that you don't
>>> know where the NaN came from. Don corrected this by submitting a
>>> patch that enables the program to throw an exception upon trying to
>>> use a NaN. Then, you know exactly where your program went wrong.
>>>
>>> It is exactly analogous to a null pointer exception. And it's darned
>>> useful.
>>
>> My assessment: the chances of convincing Walter he's wrong are quite
>> slim... Having a rationale for being wrong is very hard to overcome.
>>
>> Andrei
>
> I actually side with Walter here. I much prefer my programs to crash
> on a null reference, so I can fix the issue, than to add runtime
> overhead that does the same thing. In most cases a simple backtrace
> is enough to pinpoint the location of the bug.
But that's a false choice. You don't choose between a crashing program
and an out-of-control program; that's the fallacy. The problem is
that, the way Walter puts it, it's darn appealing. Who would want a
subtly incorrect program?
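To make the defaults under discussion concrete, here is a minimal
sketch (the Widget class is made up for illustration): ints silently
default to a plausible value, floats default to a visibly bad one, and
references default to null, which crashes on first use.

```d
import std.math : isNaN;
import std.stdio : writeln;

class Widget { int size; }

void main()
{
    int i;      // default-initialized to 0: looks valid, silently usable
    double d;   // default-initialized to NaN: poisons any result it touches
    Widget w;   // default-initialized to null: crashes on first dereference

    writeln(i);        // prints 0, with no hint it was never assigned
    writeln(d.isNaN);  // prints true: the "canary" behavior Walter describes
    // writeln(w.size);  // would crash with a null dereference
    assert(w is null);
}
```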
> Null references are useful for implementing optional arguments
> without the overhead of an Optional!T wrapper. If you disallow null
> references, what would "Object foo;" initialize to then?
The default should be non-nullable references. You can define nullable
references if you so wish. The problem is, Walter doesn't realize that
the default initialization scheme, and the optional lack thereof via
"= void", go straight against his reasoning about null objects.
Andrei
More information about the Digitalmars-d mailing list