Null references (oh no, not again!)

Alex Burton alexibu at mac.com
Thu Mar 5 04:19:56 PST 2009


bearophile Wrote:

> Andrei Alexandrescu:
> > I did some more research and found a study:
> > http://users.encs.concordia.ca/~chalin/papers/TR-2006-003.v3s-pub.pdf
> > ...
> > Turns out in 2/3 of cases, references are really meant to be non-null... 
> > not really a landslide but a comfortable majority.
> 
> Thank you for bringing real data to this debate.
> Note that 2/3 is relative to nonlocal variables only:
> 
> >In Java programs, at least 2/3 of declarations (other than local variables) that are of 
> reference types are meant to be non-null, based on design intent.
> We exclude local variables because their non-nullity can be inferred by intra-procedural 
> analysis<
> 
> So the total percentage may be different (higher?).
> Anyway, nonnullable by default seems the way to go if such feature is added.
> 
I think there is some faulty logic here.

People write code with a design intention of nullability (as shown in the study) precisely because nullable is the default reference type in the language. 

Inferring from these statistics that the default in a new language should be nullable would therefore be a logical error.

Making references non-nullable by default would reduce the amount of intentionally nullable code. It would also greatly improve the quality and maintainability of code, because every reference not explicitly marked as nullable could be safely dereferenced.
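As a sketch of what that default buys you, here is the same idea in Rust, which already works this way: plain references can never be null, and nullability must be opted into with Option<T>. (Rust is used purely as an illustration of the principle; the function names here are made up, and the thread is of course about a hypothetical D feature, not Rust.)

```rust
// A plain &str can never be null, so it is always safe to use directly,
// with no null check required anywhere in the function body.
fn shout(name: &str) -> String {
    name.to_uppercase()
}

// Nullability is opted into explicitly with Option<T>, and the compiler
// forces the caller to handle the None case before the value is used.
fn greet(name: Option<&str>) -> String {
    match name {
        Some(n) => format!("hello, {}", shout(n)),
        None => "hello, stranger".to_string(),
    }
}

fn main() {
    println!("{}", greet(Some("alex")));
    println!("{}", greet(None));
}
```

Under this default, a reader auditing `shout` knows at a glance it cannot fault on a null argument, while every place that genuinely needs "no value" is visibly marked in its signature.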

Alex



More information about the Digitalmars-d mailing list