Null References and related matters
Nick Sabalausky
a at a.a
Tue Dec 23 18:17:44 PST 2008
"bearophile" <bearophileHUGS at lycos.com> wrote in message
news:giqphn$1aog$1 at digitalmars.com...
> This was already discussed in the past, but I think it doesn't hurt
> to rehash it a little, now that there's an opinion from a famous
> computer scientist and programmer like
> Tony Hoare: "Null References: The Billion Dollar Mistake" presentation:
>
>>I call it my billion-dollar mistake. It was the invention of the null
>>reference in 1965. At that time, I was designing the first comprehensive
>>type system for references in an object oriented language (ALGOL W). My
>>goal was to ensure that all use of references should be absolutely safe,
>>with checking performed automatically by the compiler. But I couldn't
>>resist the temptation to put in a null reference, simply because it was so
>>easy to implement. This has led to innumerable errors, vulnerabilities,
>>and system crashes, which have probably caused a billion dollars of pain
>>and damage in the last forty years. In recent years, a number of program
>>analysers like PREfix and PREfast in Microsoft have been used to check
>>references, and give warnings if there is a risk they may be non-null.
>>More recent programming languages like Spec# have introduced declarations
>>for non-null references. This is the solution, which I rejected in 1965.<
>
> That's why a day ago I said that sooner or later D will have something
> like the ? syntax of Delight (and C#):
> http://delight.sourceforge.net/null.html
>
Interesting. And now that I think about it, null references seem to be
little more than the reference equivalent of sentinel values, which I've
never been a big fan of (i.e., reserving special values to indicate
something other than what the variable normally represents; for instance,
embedding error codes in the return value of a function that normally
returns a meaningful value).
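To make the distinction concrete, here's a minimal sketch in TypeScript (with strictNullChecks enabled), which behaves much like the ? syntax described above: a reference type is non-nullable unless its declaration explicitly says otherwise, so the compiler forces a check before the possibly-null value is used. The names (User, lookup, greet) are hypothetical, just for illustration:

```typescript
// A plain `User` reference can never be null under strictNullChecks;
// nullability must be declared explicitly in the type.
type User = { name: string };

const users: Record<string, User> = { alice: { name: "Alice" } };

// The `| null` in the return type is the opt-in, analogous to `User?`
// in Delight/C#. Without it, returning null is a compile error.
function lookup(id: string): User | null {
  return users[id] ?? null;
}

function greet(id: string): string {
  const u = lookup(id);
  // The compiler rejects `u.name` until this null check is done.
  if (u === null) return "unknown user";
  return `hello, ${u.name}`;
}
```

The contrast with a sentinel is that the "might be absent" case lives in the type and is checked by the compiler, rather than being a special value the caller has to remember to test for.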