Null references redux
Denis Koroskin
2korden at gmail.com
Sat Sep 26 15:27:16 PDT 2009
On Sun, 27 Sep 2009 02:03:40 +0400, Walter Bright
<newshound1 at digitalmars.com> wrote:
> Denis Koroskin wrote:
>> I don't understand you. You say you prefer 1, but describe the path D
>> currently takes, which is 2!
>> dchar d; // not initialized
>> writeln(d); // Soldier on and silently produce garbage output
>
> d is initialized to the "invalid" unicode bit pattern of 0xFFFF. You'll
> see this if you put a printf in. The bug here is in writeln failing to
> recognize the invalid value.
>
> http://d.puremagic.com/issues/show_bug.cgi?id=3347
>
Change dchar to float or int. It's still not initialized (well,
default-initialized to some garbage value, which may or may not be okay for
the programmer).
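To make the point concrete, a small D sketch of what the default initializers actually are: dchar gets the invalid code point 0xFFFF, float gets NaN, but int gets 0, which looks like a perfectly valid value and so cannot be flagged later.

```d
import std.stdio;

void main()
{
    dchar d; // default-initialized to dchar.init: 0xFFFF, an invalid code point
    float f; // default-initialized to float.nan
    int   i; // default-initialized to 0 -- indistinguishable from a real value

    writeln(cast(uint) d); // 65535
    writeln(f);            // nan
    writeln(i);            // 0 -- may or may not be what the programmer meant
}
```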
>> I don't see at all how is it related to a non-null default.
>
> Both are attempts to use invalid values.
>
No.
>> Non-null default is all about avoiding erroneous situations, enforcing
>> program correctness and stability. You solve an entire class of
>> problem: NullPointerException.
>
> No, it just papers over the problem. The actual problem is the user
> failed to initialize it to a value that makes sense for his program.
> Setting it to a default value does not solve the problem.
>
> Let's say the language is changed so that:
>
> int i;
>
> is now illegal, and generates a compile time error message. What do you
> suggest the user do?
>
> int i = 0;
>
1) We are talking about non-null *references* here.
2) I'd suggest the user initialize it to a proper value.
"int i;" is not the whole function, is it? All I'm saying is that "i" should
be initialized before it is accessed, and that fact should be statically
enforced by the compiler.
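A sketch of what such static enforcement (definite-assignment analysis, as in Java or C#) would look like. D does not perform this check today; the comments mark what a flow-checking compiler would accept and reject:

```d
int lookup(bool found)
{
    int i;           // declared, not yet initialized -- fine so far
    if (found)
        i = 42;      // initialized on this branch only
    return i;        // a definite-assignment check would reject this line:
                     // 'i' may be read uninitialized when !found
}

int lookupFixed(bool found)
{
    int i;
    if (found)
        i = 42;
    else
        i = -1;      // now initialized on every path
    return i;        // accepted: 'i' is definitely assigned here
}
```

The point is that the compiler forces the programmer to supply a meaningful value on every path, rather than a blanket "= 0" at the declaration.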
> The compiler now accepts the code. But is 0 the correct value for the
> program? I guarantee you that programmers will simply insert "= 0" to
> get it to pass compilation, even if 0 is an invalid value for i for the
> logic of the program. (I guarantee it because I've seen it over and
> over, and the bugs that result.)
>
This is absolutely irrelevant to non-null reference types. A programmer
can't write "Object o = null;" to cheat the type system.
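That difference can be sketched even as a library type in D today. The NonNull name below is hypothetical (D has no built-in non-null reference); the key is that the null state is simply unrepresentable, so there is no "= null" escape hatch analogous to "= 0":

```d
// Hypothetical non-null wrapper: a minimal sketch, not a standard type.
struct NonNull(T) if (is(T == class))
{
    private T payload;

    @disable this();            // forbid default construction (no null state)

    this(T value)
    {
        assert(value !is null); // the only runtime check, at the boundary
        payload = value;
    }

    alias payload this;         // use it like a plain T afterwards
}

void use(NonNull!Object o)
{
    // o is guaranteed non-null here; no check, no NullPointerException.
}
```

"NonNull!Object o = null;" fails to compile, and "NonNull!Object o;" fails because default construction is disabled, which is exactly the property a non-null default would give references in the language itself.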