C# interview
Don
nospam at nospam.com.au
Tue Oct 7 05:21:23 PDT 2008
Denis Koroskin wrote:
> On Tue, 07 Oct 2008 13:33:16 +0400, Don <nospam at nospam.com.au> wrote:
>
>> Denis Koroskin wrote:
>>> On Mon, 06 Oct 2008 13:58:39 +0400, Don <nospam at nospam.com.au> wrote:
>>>
>>>> Denis Koroskin wrote:
>>>>> The two things that need to be changed to support this feature are:
>>>>> 1) make typeof(null) == void*
>>>>> 2) remove default initializers (for reference types, at least)
>>>>> The latter rule can be relaxed (as done in C#): you can have a
>>>>> variable uninitialized. However, you can't read from it until you
>>>>> initialize it explicitly. This is enforced statically:
>>>>> // The following is ok:
>>>>> Object o;
>>>>> o = new Object();
>>>>> // This one is ok, too:
>>>>> Object o;
>>>>> if (condition) {
>>>>>     o = new Foo();
>>>>> } else {
>>>>>     o = new Bar();
>>>>> }
>>>>> // But this is rejected:
>>>>> Object o;
>>>>> if (condition) {
>>>>>     o = new Foo();
>>>>> }
>>>>> Object o2 = o; // use of (possibly) uninitialized variable
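(As an aside: TypeScript's strict mode enforces essentially the same definite-assignment rule as the C# behaviour described above. A minimal sketch, with illustrative names that are not from the original post:)

```typescript
// Definite-assignment sketch: the variable must be assigned on every
// control-flow path before it is read, just as in the C# examples above.

function make(condition: boolean): string {
  // OK: 'o' is assigned in both branches before use.
  let o: string;
  if (condition) {
    o = "Foo";
  } else {
    o = "Bar";
  }
  return o; // definitely assigned here
}

// Rejected by the compiler (uncomment to see the error):
// function bad(condition: boolean): string {
//   let o: string;
//   if (condition) {
//     o = "Foo";
//   }
//   return o; // error TS2454: 'o' is used before being assigned
// }

console.log(make(true));  // "Foo"
console.log(make(false)); // "Bar"
```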
>>>>
>>>> Why not just disallow uninitialized references?
>>>> So none of the examples would compile, unless you wrote:
>>>>
>>>> Object o = new Object();
>>>>
>>>> or
>>>>
>>>> Object o = null;
>>>>
>>>> The statement that "50% of all bugs are of this type" is consistent
>>>> with my experience. This wouldn't get all of them, but I reckon it'd
>>>> catch most of them.
>>>> I can't see any advantage from omitting the "=null;".
>>> That's the whole point - you won't need to check for (o is null)
>>> ever. All the references are always valid. This might be a good
>>> contract to enforce.
>>
>> I think you misunderstood me. I wrote "=null" not "==null".
>> It's my experience that most null pointer bugs in D are caused by
>> simply forgetting to initialize the reference. Anyone coming from C++
>> is extremely likely to do this. This bug could be completely
>> eliminated by requiring objects to be initialized. If you want them to
>> be uninitialised, fine, set them equal to null. Make your
>> intentions clear.
>>
>
> Well, there is no difference between Object o; and Object o = null; -
> they both will be initialized to null. But I agree that explicit
> initialization makes an intention more clear.
>
>> Then, an object reference could only ever be null if it was explicitly
>> set to null.
>>
>
> Not, actually:
>
> void foo(Bar b) {
>     Bar bb = b; // bb is null iff b is null
> }
But b must have been explicitly set to null. Someone has made a
conscious decision that some reference is permitted to be null. Right
now, it can happen by accident.
>> There are some null pointer exceptions which are caused by returning a
>> null by accident, but they're much rarer, and they're not much
>> different to any other logic error.
> I think that explicitly stating that function *may* return null or
> function always returns valid reference is a good contract. User will be
> aware of the potential null returned.
>
>> Yes, they would be eliminated in your proposal. But the cost-benefit
>> is not nearly as good as eliminating the default null initialisation;
>> the benefit is higher, but the cost is hundreds of times higher.
>>
> Really? How is it so?
I wasn't talking about runtime cost, but cost to the language (increase
in complexity of the spec, Walter's time). It requires _major_ language
changes. New lexing, new types, new name mangling, new code generation.
> Actually, I thought that my proposal would lead to
> better optimizations (while ensuring high robustness) to what we have
> now. For example, now every function that accepts a class reference
> ought to check the pointer against null. With my proposal you should
> check just once, upon casting from T? to T. Once you have a reference, it
> is not null, so no checking against null is necessary. And since you
> operate on T instead of T? all the time (and assigning from T? to T?
> doesn't need any checking either), the cost is small enough to ignore.
True. I don't disagree that non-nullable references would be nice to
have. I just think that it's unlikely that Walter would do it.
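(For what it's worth, the check-once idea maps directly onto languages that already distinguish nullable from non-nullable references. A hedged sketch in TypeScript with strictNullChecks, where `Bar | null` plays the role of T? and `Bar` the role of T; the `Bar` type and helper names are hypothetical:)

```typescript
// T? vs T sketch: 'Bar | null' is checked exactly once at the boundary;
// everything past that point works with a plain 'Bar' and needs no
// further null checks. Names here are illustrative, not from the thread.

interface Bar { value: number; }

// Takes the non-nullable type: no null check needed inside.
function use(b: Bar): number {
  return b.value * 2;
}

// The single check, at the T? -> T boundary.
function boundary(maybe: Bar | null): number {
  if (maybe === null) {
    return -1; // handle the null case once, here
  }
  // From this point the compiler has narrowed 'maybe' to Bar.
  return use(maybe);
}

console.log(boundary({ value: 21 })); // 42
console.log(boundary(null));          // -1
```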
My point is simply that:
Object o;
is almost always an error, and I think that making it illegal would
catch the #1 trivial bug in D code.
More information about the Digitalmars-d
mailing list