null [re: spec#]

Nick Sabalausky a at a.a
Tue Nov 9 11:03:52 PST 2010


"Simen kjaeraas" <simen.kjaras at gmail.com> wrote in message 
news:op.vlwul8tuvxi10f at biotronic-pc.lan...
> Daniel Gibson <metalcaedes at gmail.com> wrote:
>
>> Nick Sabalausky schrieb:
>>> "so" <so at so.do> wrote in message news:op.vlv3iukp7dtt59 at so-pc...
>>>>> There's no usage of an undeclared variable, but the right-hand side
>>>>> of the second line uses 'i' before *the programmer* initializes it.
>>>>> Yes, the D compiler chooses to automatically initialize it, but by
>>>>> doing so it silently creates a bug every time the programmer intends
>>>>> 'i' to start out as anything other than 0. And it's not easily
>>>>> noticed, since 0 is a commonly-used value. (Something like
>>>>> 0xDEADBEEF would at least be an improvement (albeit a small one),
>>>>> since at least that would stand out more and likely fail more
>>>>> spectacularly.)
>>>> So you want the language to force you to type either "int x=0;" or
>>>> "int x=void;". Fair enough, and I agree it *might* be a bit better.
>>>> But you're making it out to be far more important than it is.
>>> I tend to get a bit fired up by it because Walter's reasoning that it's 
>>> *better* to automatically assume some init value baffles me.
>>>
>>
>> It gives deterministic results/errors.
>
> Yup. Also, as opposed to certain other solutions, it does not require
> advanced flow analysis, which is likely to be incomplete. Incomplete flow
> analysis here will make people write code 'to shut the compiler up', and
> that is worse than uninitialized variables.
>

First of all, the risks from "shut the compiler up" initializations are 
highly exaggerated (I've dealt with enough C# to know that Walter's full of 
crap on the "dangers" of that). Secondly, there is absolutely no way it's 
worse than auto-initing. Let's look at an example:

int i;
// Dumb-ass convoluted code that should never pass code review anyway:
if(m)
    i = something;
// Stuff here
if(m)
{
    // Do something with 'i'
}

Suppose the compiler complains that 'i' might be used before being written 
to. The programmer then has two choices:

1. Blindly toss in "=0" with no regard for whether or not it's correct.
2. Fix the damn code properly; see the sketch below. (And believe it or not, 
this *may* actually amount to "int i=0;", just not always.)

Now, let's go back to D's current behavior. Only one thing ever happens. 
The compiler will:

1. Blindly toss in "=0" with no regard for whether or not it's correct.

Note that's word-for-word identical to before, except now option 2, the 
*right* option, doesn't even fucking exist. That does NOT make it better, 
that makes it worse.
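
For reference, here's what D's auto-initialization actually does today (a 
minimal, self-contained sketch; the defaults are per-type: integers get 0, 
floats get NaN, and "=void" is the explicit opt-out):

import std.math : isNaN;

void main()
{
    int i;          // auto-initialized to int.init, i.e. 0
    float f;        // floats default to NaN, which at least fails loudly
    int j = void;   // explicit opt-out: 'j' really is uninitialized
    assert(i == 0);
    assert(isNaN(f));
}

Note that floats already follow the "fail spectacularly" philosophy the 
0xDEADBEEF suggestion is after; it's only the integer types that quietly 
default to the commonly-used 0.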



