References in D
Regan Heath
regan at netmail.co.nz
Mon Sep 24 05:23:02 PDT 2012
On Sun, 16 Sep 2012 23:46:34 +0100, deadalnix <deadalnix at gmail.com> wrote:
> Le 15/09/2012 19:13, Jonathan M Davis a écrit :
>> On Saturday, September 15, 2012 15:24:27 Henning Pohl wrote:
>>> On Saturday, 15 September 2012 at 12:49:23 UTC, Russel Winder wrote:
>>>> On Sat, 2012-09-15 at 14:44 +0200, Alex Rønne Petersen wrote:
>>>> […]
>>>>
>>>>> Anyway, it's too late to change it now.
>>>>
>>>> I disagree. There are always opportunities to make changes to things,
>>>> you just have to manage things carefully.
>>>
>>> I don't know if people really use the ability of references being
>>> null. If so, large amounts of code will be broken.
>>
>> Of course people use it. Having nullable types is _highly_ useful. It
>> would suck if references were non-nullable. That would be _horrible_ IMHO.
>> Having a means to have non-nullable references for cases where that makes
>> sense isn't necessarily a bad thing, but null is a very useful construct,
>> and I'd _hate_ to see normal class references be non-nullable.
>>
>> - Jonathan M Davis
>
> Years of Java have shown me the exact opposite. Nullable is a useful
> construct, but nullable by default is on the wrong side of the force.
I think it depends on your background. Most of my experience has been
with C and C++, and I agree with Jonathan that null is incredibly useful
and something I use a lot. In fact, I am often annoyed that 'int' doesn't
have an equivalent value; instead I have to invent a magic number and
ensure it can never be a valid value.
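
(As an aside, Phobos does offer something along these lines for value
types. A rough sketch using std.typecons.Nullable -- the names below are
made up purely for illustration:

import std.typecons : Nullable;
import std.stdio : writeln;

void main()
{
    Nullable!int age;       // no magic number needed
    assert(age.isNull);     // "not specified" is an explicit state

    age = 42;               // now it holds a real value
    writeln(age.get);       // prints 42

    age.nullify();          // back to "not there yet"
    assert(age.isNull);
}

Not quite the same as a built-in null for int, but it avoids the
magic-number dance.)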
What I've noticed looking at Java code written by others is that null as a
possible state is ignored by the vast bulk of the code, which is
completely the opposite of what I see when reviewing C/C++ code, where
null is checked for and handled where applicable. I think it's a mindset
thing brought on by the language, or how it's taught, or something. It
seems to me that Java would have benefited from non-null references :p
My uses of null all seem to boil down to being able to represent something
as being "not there yet" or "not specified" etc. without having to resort
to a magic value, or a 2nd flag/boolean/parameter. I find null is a nice
clean way to do this.
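
For example, something like this (purely illustrative, I've made the
names up):

import std.stdio : writeln;

class Config
{
    string path;
}

// Returning null means "nothing was specified" -- no magic value,
// no extra boolean parameter needed.
Config loadConfig(string fileName)
{
    if (fileName.length == 0)
        return null;
    auto c = new Config;
    c.path = fileName;
    return c;
}

void main()
{
    auto cfg = loadConfig("");
    if (cfg is null)
        writeln("no config given, using defaults");
    else
        writeln("loaded ", cfg.path);
}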
As to whether it should be the default or not.. well, you might have a
point there. I think I'd probably want to use null less often than I
otherwise do. But all we need is a compiler which says "Oi, you haven't
initialised this", forcing me to either explicitly set it to null where
desired or give it a valid value, and problem solved, right? That,
combined with a construct like NotNull!(T) which would assert in debug
builds that the reference is not null, and you can basically stop doing
null checks in release code. Win win, right?
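
Roughly the sort of thing I have in mind (just a sketch, NotNull isn't an
existing library type):

class Widget { void draw() {} }

struct NotNull(T) if (is(T == class))
{
    private T payload;

    @disable this();    // force construction from an actual reference

    this(T value)
    {
        assert(value !is null, "NotNull constructed from a null reference");
        payload = value;
    }

    // checked in debug builds, compiled away with -release
    invariant() { assert(payload !is null); }

    @property inout(T) get() inout { return payload; }
    alias get this;     // use it like the wrapped reference
}

void render(NotNull!Widget w)
{
    w.draw();           // no null check needed here
}

void main()
{
    auto w = NotNull!Widget(new Widget);
    render(w);
}

Build with -release and the asserts (and the invariant) disappear, so you
pay nothing in the code you ship.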
R
--
Using Opera's revolutionary email client: http://www.opera.com/mail/