null [re: spec#]
Nick Sabalausky
a at a.a
Sun Nov 7 00:31:20 PDT 2010
"Jonathan M Davis" <jmdavisProg at gmx.com> wrote in message
news:mailman.144.1289103661.21107.digitalmars-d at puremagic.com...
> On Saturday 06 November 2010 19:05:32 Nick Sabalausky wrote:
>> "foobar" <foo at bar.com> wrote in messagend in a pointlessly roundabout
>> way.
>>
>> > 2. "null" is an a type-system attribute, hence should be checked at
>> > compile time and would have ZERO affect on run-time performance.
>> > Same as assigning a string value to an int variable.
>>
>> I strongly agree with this.
>>
>> On a related note, I *hate* that D silently sticks in a default value
>> whenever anything isn't properly inited. This is one thing where I really
>> think C# got it right, and D got it wrong. And waving the "It's not
>> leaving
>> it with an undefined value like C does!" banner is an irritating
>> strawman:
>> Yea, it's better than C, but it still sucks.
>
> Now, it may be that
> Java and C#'s way is ultimately better, but unless you make a practice of
> declaring variables without initializing them (which generally should be
> avoided
> regardless), it generally doesn't matter.
The problem is values accidentally not being inited. When that happens, D
jumps in and just assumes it should be the type's default (.init), which is
not always correct. "null" is not always the intended starting value for a
reference type. 0 is not always the intended starting value for an integer
type. An invalid code unit is not always the intended starting value for a
character. NaN is not always the intended starting value for a floating point
type, and while it *is* better than, for example, 0 for ints, it still tends
to leave the error undetected until further down the code-path (same for null).
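To be concrete, these are the defaults in question (a minimal sketch; as far
as I know all of these asserts pass with the current compiler):

    class Foo {}

    void main()
    {
        Foo f;     // reference type: defaults to null
        int i;     // integral: defaults to 0
        char c;    // char: defaults to 0xFF, an invalid UTF-8 code unit
        double d;  // floating point: defaults to NaN

        assert(f is null);
        assert(i == 0);
        assert(c == 0xFF);
        assert(d != d);    // NaN is the only value not equal to itself
    }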
>
> Also, it's essentially D's stance that not initializing a variable is a
> bug, so
> every variable is default initialized to the closest to an error value
> that
> exists for that type.
So why respond to a bug by ignoring it and occasionally turning it into
another bug? If it considers something a bug, then it should *say* so.
> It's not at all inconsistent if you look at it from the perspective that
> types are
> default initialized to the closest thing to an error value that they have.
> Many
> of the value types (such as the integral types), don't really have a value
> that's an obvious error, so they don't fit in with that quite so well, but
> it's
> unavoidable given that they just don't have an obvious error value.
>
See, that goes back to what the OP was saying, which I agree with: "Error"
should *not* be a valid value for a type (unless explicitly decreed by the
programmer for a specific variable). It should either be a value that the
programmer explicitly *gives* it, or a compile-time error.
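Roughly what I mean (the function below is made up, purely to illustrate the
failure mode):

    // Hypothetical example, not real code from anywhere.
    int parseCount(string line)
    {
        int count;                       // meant to be assigned below...
        if (line.length > 0)
            count = cast(int)line.length;
        // ...but if the 'else' case is forgotten, D silently hands us 0
        // here instead of complaining. Under the scheme above, reading
        // 'count' on a path where it was never assigned would be a
        // compile-time error, same as assigning a string to an int.
        return count * 2;
    }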
> And I don't understand why you think that nullability shouldn't be a
> difference
> between value types and reference types. That's one of the _key_
> differences
> between them. Value types _cannot_ be null, while references can. And I'd
> sure
> hate to _not_ be able to have a null reference.
It's only a difference in D because D makes it so. It's *not* a fundamental
difference of the concept of a "reference type" and "value type".
Non-nullable reference types can and do exist. And Haxe has value types that
are nullable. I think JS does too. And I never suggested we not be able to
have null references at all. In fact, I already made a big point that they
*should* be allowed:
"there are
plenty of cases where run-time nullability is useful and where lack of it is
problematic at best: A tree or linked list, for example. The "null object"
idiom doesn't count, because all it does is just reinvent null, and in a
pointlessly roundabout way."
It just shouldn't be the default, and it should only be used when it's
actually needed.
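Something like this is what I have in mind (the '?' marker below is
hypothetical syntax, not anything D actually has):

    // Run-time nullability genuinely needed: the end of the list has to
    // be representable somehow.
    class Node
    {
        int  value;
        Node next;   // null marks the end of the list -- fine, and wanted
    }

    // Under non-nullable-by-default, that field would have to opt in
    // explicitly, e.g. something like:
    //
    //     Node? next;   // hypothetical syntax, *not* actual D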
> I can totally understand wanting non-nullable reference types. There are
> plenty
> of times where it just doesn't make sense to have a variable which can
> be null
> - even if it's a reference - but there are plenty of cases where it _does_
> make
> sense,
Agreed. And I've also come across plenty of cases where nullable value types
are useful (but obviously they shouldn't be the default).
Conceptually, nullability is orthogonal to reference-vs-value, but many
languages conflate the two (presumably because that's what falls out most
easily from the way the hardware typically works).
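To make the orthogonality point concrete: an opt-in nullable *value* type is
perfectly expressible today. The name and API below are mine, just for
illustration:

    struct Maybe(T)
    {
        private T    payload;
        private bool hasValue = false;

        void opAssign(T v)      { payload = v; hasValue = true; }
        @property bool isNull() { return !hasValue; }
        @property T get()       { assert(!isNull); return payload; }
    }

    void main()
    {
        Maybe!int m;          // a *value* type that can still be "empty"
        assert(m.isNull);
        m = 42;
        assert(m.get == 42);
    }

The point is that "can this be null?" is a property you choose per type (or
per variable), not something inherently welded to whether the thing lives
behind a pointer.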