null [re: spec#]

Jonathan M Davis jmdavisProg at gmx.com
Sat Nov 6 21:20:50 PDT 2010


On Saturday 06 November 2010 19:05:32 Nick Sabalausky wrote:
> "foobar" <foo at bar.com> wrote in messagend in a pointlessly roundabout way.
> 
> > 2. "null" is an a type-system attribute, hence should be checked at
> > compile time and would have ZERO affect on run-time performance.
> > Same as assigning a string value to an int variable.
> 
> I strongly agree with this.
> 
> On a related note, I *hate* that D silently sticks in a default value
> whenever anything isn't properly inited. This is one thing where I really
> think C# got it right, and D got it wrong. And waving the "It's not leaving
> it with an undefined value like C does!" banner is an irritating strawman:
> Yea, it's better than C, but it still sucks.

Well, it _is_ better than C. Going C# or Java's route forces the programmer to
initialize variables even in cases where they know that it's not necessary
(which is annoying but may or may not be worth it), but more importantly (from
Walter's perspective at least), it would require flow analysis, which he
actively avoids. Using default values avoids the memory bugs you get in C,
results in a simpler (and therefore less bug-prone) compiler implementation,
and makes it easier to write other tools for the language. Now, it may be that
Java and C#'s way is ultimately better, but unless you make a practice of
declaring variables without initializing them (which should generally be
avoided anyway), it rarely matters in practice.
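
As a sketch of the difference (a made-up example): Java and C#'s
definite-assignment analysis rejects code like the following unless the
compiler can prove the variable is assigned on every path, even when the
programmer knows that it is. The D version just compiles, because the
variable already has a defined value:

int firstPositive(int[] values)
{
    int result;               // fine in D: result is 0 by default
    foreach (v; values)
    {
        if (v > 0)
        {
            result = v;
            break;
        }
    }

    // Java or C# would reject reading result here, since their flow
    // analysis can't prove that the loop assigned it.
    return result;
}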

Also, it's essentially D's stance that not initializing a variable is a bug, so
every variable is default-initialized to the closest thing to an error value
that exists for its type. null is the obvious choice for pointers and
references.
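
To make that concrete, here's a minimal sketch of what the defaults actually
are (standard D; each type's .init value):

import std.stdio;

void main()
{
    int i;     // int.init is 0 - predictable, but not really an error value
    double d;  // double.init is NaN - an actual error value
    char c;    // char.init is 0xFF - an invalid UTF-8 code unit
    Object o;  // class references default to null

    writeln(i);           // 0
    writeln(d);           // nan
    writeln(cast(int) c); // 255
    writeln(o is null);   // true
}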

I'm moderately divided on the issue, but ultimately, I think that D's decision 
was a good one. Java and C#'s may or may not be better, but I still think that 
what D does works quite well.

> I also dislike that D's reference types being nullable by default is
> inconsistent with its value types. (Yea, naturally reference and value
> types are going to have inherent differences, but nullability shouldn't be
> one of them.)

It's not at all inconsistent if you look at it from the perspective that types
are default-initialized to the closest thing to an error value that they have.
Many of the value types (such as the integral types) don't really have a value
that's an obvious error, so they don't fit that scheme quite as well, but
that's unavoidable: there simply is no obvious error value for them.

And I don't understand why you think that nullability shouldn't be a difference 
between value types and reference types. That's one of the _key_ differences 
between them. Value types _cannot_ be null, while references can. And I'd sure 
hate to _not_ be able to have a null reference. It's irritating enough that
arrays and associative arrays are treated almost the same when they're null as
when they're empty.
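
To illustrate (standard D behavior - a null array acts empty in almost every
way):

void main()
{
    int[] a;                // default-initialized to null
    assert(a is null);
    assert(a.length == 0);  // but it behaves like an empty array
    assert(a == []);        // and compares equal to an empty one
    a ~= 1;                 // appending to null just works; it allocates
    assert(a.length == 1);
}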

I can totally understand wanting non-nullable reference types. There are plenty 
of times where it just doesn't make sense to have a variable which can be null
- even if it's a reference - but there are plenty of cases where it _does_ make 
sense, and I do find the fact that D default initializes to error values to be 
quite useful, since I do consider it bad practice in general to not initialize a 
variable when it's declared. Sometimes you have to for scoping reasons or 
whatnot, but generally, variables _should_ be initialized when declared.
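
For what it's worth, a library type can give you non-nullable references. This
NonNull is purely hypothetical - nothing like it is in Phobos - but it sketches
the idea:

// Hypothetical sketch; not part of Phobos.
struct NonNull(T) if (is(T == class))
{
    private T _value;

    @disable this();             // no default construction - that would be null

    this(T value)
    {
        assert(value !is null);  // check once, at the boundary
        _value = value;
    }

    @property T get() { return _value; }
    alias get this;              // usable wherever a T is expected
}

class Foo { void hello() {} }

void main()
{
    auto f = NonNull!Foo(new Foo);
    f.hello();                   // can never be a null dereference
}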

- Jonathan M Davis

