Wish: Variable Not Used Warning
Bruce Adams
tortoise_74 at yeah.who.co.uk
Thu Jul 10 12:44:27 PDT 2008
On Thu, 10 Jul 2008 11:20:21 +0100, Don <nospam at nospam.com.au> wrote:
> Bruce Adams wrote:
>> I would contend this is a problem with the quality of headers provided
>> by M$.
>> Library code has a greater need to be high quality than regular code.
>> Operating system APIs even more so.
>> Removing warnings from C/C++ headers requires you to write them carefully
>> to remove the ambiguity that leads to the warning. That is, this
>> definition of quality is a measure that increases with decreasing
>> semantic ambiguity.
>
> I think it's a complete fallacy to think that lower-number-of-warnings
> is proportional to better-code-quality.
> Once a warning is sufficiently spurious (e.g. it has a <1% chance of
> indicating an actual error), it's more likely that you'll introduce an
> error in getting rid of the warning.
> In C++, error-free code is clearly defined in the spec. But warning-free
> code is not in the spec. You're at the mercy of any compiler writer who
> decides to put in some poorly thought out, idiotic warning.
>
I didn't say that *overall* quality is related to a lower warning count,
but it is a factor. There are other factors that are typically more
significant. Still, given the choice between code with some warnings and
warning-free code, all other things being equal I would pick the
warning-free code. You obviously shift your quality measure towards that
aspect of readability; personally I think the impact on readability is
minimal.
> If you insist on avoiding all warnings, you're effectively using a
> programming-language spec which one individual has carelessly made up on
> a whim.
>
While some warnings are less useful than others, I don't think it's fair in
general to say they're introduced carelessly on a whim.
> For example, VC6 generates some utterly ridiculous warnings. In some
> cases, the chance of it being a bug is not small; it is ZERO.
>
Before they got Herb Sutter on board, VC++ was notoriously bad.
If that's true then the warning itself is a compiler bug. If you know it to
be true you can disable the warning with a pragma. Similarly, in gcc all
warnings are supposed to have an on/off switch, so you get to choose which
warnings you think are important. I am well aware that some people choose
to ignore all warnings in order to code faster. In general it's a false
economy, like not writing unit tests.
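For illustration, a sketch of what I mean (the specific warning number and
flag are just examples I'm assuming here, not something taken from VC6):

    // MSVC: disable one warning for a limited region, e.g. C4101
    // ("unreferenced local variable"), then restore the previous state.
    #pragma warning(push)
    #pragma warning(disable: 4101)
    void example()
    {
        int unused_local;   // would normally warn at /W3 and above
    }
    #pragma warning(pop)

    // gcc: each warning has a matching -W/-Wno- pair, so you can keep
    // -Wall but switch off the ones you've decided you don't care about:
    //   g++ -Wall -Wno-unused-variable file.cpp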
> In DMD, the signed/unsigned mismatch warning is almost always spurious.
> Getting rid of it reduces code quality.
I have encountered quite a few bugs (in C++) relating to unsigned/signed
mismatches. It's a very subtle and hard-to-spot problem when a simple
addition suddenly changes the sign of your result. It costs an ugly cast to
remove the warning, but that is a trade I'm prepared to make to never have
to worry about such bugs.
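A minimal sketch of the kind of thing I mean (assuming a 32-bit unsigned
int; the names are made up for illustration):

    #include <iostream>

    int main()
    {
        unsigned int length = 5;
        int offset = -10;

        // offset is converted to unsigned before the addition, so instead
        // of -5 the result wraps around to a huge positive value.
        unsigned int sum = length + offset;
        std::cout << sum << '\n';      // prints 4294967291

        // The ugly-but-explicit cast keeps the arithmetic signed, makes
        // the intent visible, and silences the mismatch warning.
        int fixed = static_cast<int>(length) + offset;
        std::cout << fixed << '\n';    // prints -5
        return 0;
    }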
Regards,
Bruce.