Null references (oh no, not again!)

Nick Sabalausky a at a.a
Wed Mar 4 22:30:48 PST 2009


"Walter Bright" <newshound1 at digitalmars.com> wrote in message 
news:gonnf4$2mnj$1 at digitalmars.com...
>> The code is bug-ridden. It's exactly the kind of maintenance nightmare 
>> where you change one line and 1000 lines below something crashes.
>
> It does occur in various forms. I know this from experimenting with flow 
> analysis. The problem with saying it's "buggy" is then the programmer 
> throws in a dead assignment to "fix" it rather than refactor it.
>
>

If someone has code like that, then the main issue is that it needs to be 
refactored. Neither the current state of D, nor "perfect" flow-analysis, 
would do anything to force, or even nudge, the programmer into doing that 
refactoring. So if C#-style analysis doesn't force the proper refactoring 
either, so what? At least attention will get called to it, and the 
programmer will at least have the *opportunity* to choose to fix it. And if 
they choose to do the dead-assignment hack, well, the code's already crap 
anyway; they're not really all that much worse off.
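
To make that dead-assignment hack concrete, here's a hypothetical sketch (in 
Kotlin, whose definite-assignment rules are close to C#'s; the function and 
names are mine, not from anyone's real code). Suppose the caller guarantees 
the list is non-empty, so the loop always assigns, but the checker can't see 
that invariant:

    fun lastOf(items: List<Int>): Int {
        // var last: Int      // rejected: 'last' may be uninitialized at the return
        var last = 0          // the dead-assignment "fix": 0 is never actually read
        for (x in items) last = x
        return last
    }

The honest fix is to refactor so the invariant is visible in the code, but 
the one-line hack is always cheaper.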

>> Listen to the man. The point is to use non-null as default in function 
>> signatures, but relax that rule non-locally so programmers don't feel 
>> constrained. It's the best of all worlds.
>
> But much of this thread is about tracking where a null assignment to a 
> field is coming from. By doing only local analysis, that case is not dealt 
> with at all.
>

Before I address that, let me get one other thing out of the way first:

I disagree with the notion that C#-style flow-analysis-related constraints 
are a problem, as long as the flow-analysis rules are, as in C#, well-defined 
and easy to understand. I've written plenty of C# and never had a problem 
with it. There are limitations in C# that have forced me into non-trivial 
workarounds, but not once has the conservatively-biased flow-analysis been 
one of them. In fact, I would argue that a "perfect" flow-analysis would be 
much worse, because it would cause code to constantly flip-flop between 
valid and invalid at the tiniest, seemingly-insignificant change. (In other 
words, if you think the "symbol unresolved" emails you get are bad, that's 
nothing compared to what would happen with a perfect "zero reject-valid" 
flow-analysis.)
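
As an illustration, here's a sketch (again hypothetical, in Kotlin, whose 
definite-assignment rules are close to C#'s) of theoretically-valid code 
that the conservative analysis rejects. At runtime 's' is always initialized 
before it's read, but the rule stays simple and predictable because the 
compiler never tries to correlate the two branches:

    fun demo(flag: Boolean) {
        val s: String
        if (flag) s = "set"
        // ...unrelated work...
        if (flag) println(s)   // error: variable 's' must be initialized
    }

A "perfect" analysis would accept this as written, and then reject it again 
the moment some edit broke the correlation between the two tests; that's 
exactly the flip-flopping I mean.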

With that out of the way: Yes, with a "perfect"-style flow-analysis, both 
local and non-local analysis would be needed in order to track the origin of 
a bad null. But with the C#-style, local-only analysis is perfectly 
sufficient to accomplish that: since nothing can be read before it passes 
the local initialization check, a bad null can only come from an explicit 
assignment, and every assignment site is checked right where it occurs.
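
A hypothetical sketch of that assignment-site check, in Kotlin (which has 
the kind of non-null-by-default types being discussed; 2009-era C# doesn't, 
and the class and names here are mine):

    class Config {
        var path: String = "default"          // non-null field

        fun load(source: Map<String, String>) {
            val p: String? = source["path"]   // a map lookup may yield null
            // path = p                       // rejected right here: String? is not String
            path = p ?: "default"             // so the fix is forced to be local too
        }
    }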

So basically, with the C#-style, i.e., disallowing the more esoteric 
patterns of theoretically-valid code, you get two benefits over the 
"perfect"-style analysis:

1. Far easier for the programmer to *know* when something will or won't be 
accepted by initialization-checks, and minor changes don't cause surprises.

2. The initialization-checks can catch all uses of uninited vars with simple 
local-only analysis.




