null [re: spec#]

Nick Sabalausky a at a.a
Tue Nov 9 10:39:08 PST 2010

"Daniel Gibson" <metalcaedes at> wrote in message 
news:ibbp25$ls8$1 at
> Nick Sabalausky schrieb:
>> "so" <so at> wrote in message news:op.vlv3iukp7dtt59 at so-pc...
>>>> There's no usage of an undeclared variable, but the right-hand-side of
>>>> the second line uses 'i' before *the programmer* initializes it. Yes,
>>>> the D compiler chooses to automatically initialize it, but by doing so
>>>> it silently creates a bug every time the programmer intends 'i' to
>>>> start out as anything other than 0. And it's not easily noticed since
>>>> 0 is a commonly-used value. (Something like 0xDEADBEEF would at least
>>>> be an improvement (albeit a small one) since at least that would stand
>>>> out more and likely fail more spectacularly.)
>>> So you want the language to force you to type either "int x=0;" or
>>> "int x=void;". Fair enough, and I agree it "might" be a bit better.
>>> But you are making it out as if it is something so important.
>> I tend to get a bit fired up by it because Walter's reasoning on it being 
>> *better* to automatically assume some init value baffles me.
> It gives deterministic results/errors.
> For example, when your code works when an int is initialized with 0 (but
> you didn't initialize it), it may work most of the time in C and fail
> randomly. In D it will always work. Same thing the other way round.
> Or if you do some calculation with an uninitialized int value... I guess
> 0 is one of the easiest values to spot: in multiplication it produces 0,
> and in addition it doesn't change the value, so by looking at the
> unwanted result of a calculation you can probably see the error more
> easily than with some other value (or even a random value, which may
> create results that look about right).

Where are people getting the idea that I've said C's behavior is better than 
D's? Once again, I'm not talking about D vs C (ie, "int i;" leaves 'i' in an 
undefined state), I'm talking about D vs C# (ie, "int i;" causes a 
compile-time error when 'i' is read before being written to).
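
To spell out the difference with a minimal sketch (the function and names 
below are made up purely for illustration): in D this compiles and runs, 
with 'total' silently defaulting to 0; in C the same code would leave 
'total' in an undefined state; in C# the equivalent is rejected at compile 
time with a "use of unassigned local variable" error, which is the behavior 
I'm talking about.

import std.stdio;

int finalPrice(int price)
{
    int total;        // D default-initializes this to 0 (int.init)
    total += price;   // 'total' is read before the programmer ever assigned it
    return total;     // always returns just 'price'; no warning, no error
}

void main()
{
    writeln(finalPrice(100));  // prints 100, whatever starting value was intended
}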
