Regarding compiler switches

Nick Sabalausky a at a.a
Sat Nov 7 00:56:21 PST 2009


"div0" <div0 at users.sourceforge.net> wrote in message 
news:hd2n6e$135m$1 at digitalmars.com...
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> Nick Sabalausky wrote:
>
> Ok, fair dos, I agree with you that warnings should [not exist at all] be
> warnings, not errors.
>


Ok, I'm glad that we at least agree on that.


> But if you like warnings so much please provide a concrete example of a
> compilation run that's given you a useful warning that you've then used
> to correct a problem with your code.
>


First of all, not all of it is necessarily about identifying a problem that 
already exists. Some of it, like getting a "switch statement has no default" 
warning when switching on an enum, has more to do with defensive coding and 
making sure the code properly handles a non-local change sometime in the 
future (i.e., adding a new enum value). Secondly, see below...


> For every situation I believe that there is a simple and unambiguous
> right/wrong choice and D's design pretty much does away with all those
> pointless warnings and it makes stupid things errors.
>


I think that's overly idealistic. I absolutely agree that warnings should be 
minimized in favor of a clearly defined "ok" vs "error" whenever reasonably 
possible, and that D has done great in this regard, but I definitely don't 
think it's always reasonably possible. Sometimes deciding "that is allowed" 
lets potential problems go unnoticed, while deciding that the same thing 
should be an error causes more trouble than it's worth. There isn't always a 
good decision, or, if there is, it might just not be practical for other 
reasons.

For instance: DMD's "no return at end of function" warning. That definitely 
has saved me from runtime bugs. There have been times when I got that 
warning, and saw that I had built up a return value but forgot to return it, 
and definitely didn't want to be returning T.init.
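
The pattern is always some variation on this (a minimal made-up example):

    int sumOfSquares(int[] xs)
    {
        int total;
        foreach (x; xs)
            total += x * x;
        // Oops: forgot "return total;". Falling off the end of a
        // non-void function quietly gives back a bogus value (or a
        // runtime halt, depending on compiler version) instead of
        // the result that was just built up.
    }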

"Ok, so the language definition should require a return at the end of a 
non-void function, and to omit it would be an error." Nice try, but then the 
language becomes a pain in the ass whenever you have something that's 
guaranteed to either throw or return a value before the final closing curly 
brace.
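
Something like this, say (the function and its delegate parameter are 
invented for illustration):

    int readIntRetrying(int delegate() attempt)
    {
        while (true)
        {
            try
            {
                return attempt(); // either returns a value...
            }
            catch (Exception e)
            {
                // ...or we swallow the failure and retry; control
                // provably never falls out of this loop
            }
        }
        // A blunt "non-void functions must end with a return" rule
        // would demand a dead return right here, at a point that
        // can never be reached.
    }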

"The obvious correct solution is flow analysis." Great, except 1. Perfect 
flow analysis can be impractically slow, and 2. It a big feature that'll 
take time to implement, plus there are more pressing matters on the table, 
so what are you supposed to do in the meantime?

"Then for the meantime, just pick 'ok' or 'error'." First of all, at that 
point, the whole idea that "for every situation [...] there is a simple and 
unambiguous right/wrong choice" has pretty much already come crashing down. 
It was a great idea while it lasted, just like "everyone should always get 
along", but the real world just doesn't work that way, even as much as I 
would love for it to. Secondly, nobody's going to agree which way to go 
anyway, so...you call the damn thing a warning and move on. And that's 
pretty much just what happened. An unenviable, but nevertheless good, move 
on Walter's part.

I can make a similar case for "statement is not reachable": There *have 
been* times when I've had code that needed to get executed, but was within a 
section that, due to an oversight, was dead code. The language could have 
just outlawed such dead code, but that's just a serious pain when debugging. 
There *have been* times when I deliberately stuck a premature, unconditional 
and temporary return statement (or throw) in the middle of a function for 
the sake of debugging. I knew it caused dead code, and I didn't care, and a 
non-fatal warning wouldn't have bothered anyone because I knew damn well I 
was just going to remove it in a minute or two anyway (I have ways to make 
sure such temporary things actually do get removed). So if dead code had 
been declared an error, I would have had to temporarily comment it all out 
just to shut the stupid compiler up.
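
Concretely, the debugging pattern looks something like this (all the names 
are invented):

    string buildReport(Data d)
    {
        auto header = formatHeader(d);
        return header; // TEMP: debugging the header in isolation

        // Everything past the return is now dead code. A "statement
        // is not reachable" warning here is fine -- it's deliberate
        // and short-lived -- but a hard error would force me to
        // comment out the rest of the function just to compile.
        auto body_ = formatBody(d);
        return header ~ body_;
    }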

>
> I've spent 100s of hours over the past 8 years where I work fixing
> shitty code written by dribbling morons who couldn't be bothered to fix
> the warnings that their code generated and who quite frankly should
> never have been employed in the first place.
>
> Given the fact that you are a NG-posting, D-loving weirdo, your code is
> probably very good and well documented, and it probably compiles without
> generating any warnings at all, but you and the code you write are rare &
> odd.
>
> Most of the code that's done out there by 'normal programmers' whose
> only concern is getting paid at the end of the month basically sucks, so
> making things errors, which they have to fix, rather than warnings is
> the way to go, from my pragmatic working coder's point of view.
>
> Though I suppose you can throw the lint tool argument right back at me
> and say that people who employ morons should run a lint tool.
>


I've worked at such places as well. It really is very depressing :(.

Although I don't think a lint tool is the right solution in that case 
(though it might help somewhat). The real solution is to take the idiot who 
keeps hiring idiots in the first place and kick his worthless ass out to 
the curb (it was probably just one of those "recruiters [or managers] who 
use grep" anyway, if you read Joel On Software).

Unfortunately, in my experience, the stupidity in such places usually goes 
all the way to the top (or at least somewhere above the glass ceiling beyond 
which us mere worker peasants hath no right to look upon or speaketh to 
hallowed upper management), so in those cases it doesn't matter what the 
language designers do; those people are just going to find new ways to 
thoroughly fuck everything up anyway (like mandating VBScript, or naming a 
loading function "save", or having the asshole salesmen promise new features 
to new customers before the development team even hears about them, let 
alone has them in existence, and then blaming the devs when things 
inevitably don't work out - and those were all just one specific company, 
and don't even amount to half of the crap they pulled in the mere one year 
I was there).

And I do really mean all that, I'm not using any hyperbole (well, okay, 
maybe the "peasant" and "hallowed" stuff was embellished a little bit - but 
only in word choice, not actual overall meaning).


> lol. I know what you mean, my todo list is growing at an exponential
> rate. By the time I die, I'm going to need at least 4 more lifetimes to
> catch up on it.


Very well-put. One of the reasons I sometimes wish I were Vulcan, or better 
yet, a Trill symbiote ;)


>
>> But one problem is, Walter has already indicated that he's not 
>> particularly
>> interested in adding new warning-related compiler switches, so unless 
>> that
>> changes, there are a lot of other things I can work on that don't have a
>> high likelihood of being ignored just like my #2567 "-ww" "warnings as
>> warnings" patch.
>
> Always a problem. Maybe a patch which does your new idea would have a
> better chance though, as that way people would have the best of both 
> worlds.
>


Yea, that is a good point. (I hope ;) )


>>
>> First of all, just to be clear, I should point out that I really meant
>> "parse+semantics and mixin expansion, etc", not just "parse" because all 
>> of
>> that stuff, everything before optimization and codegen, will need to be 
>> done
>> for a good lint tool.
>
> True, but that is all front end stuff and the front end is open source
> so doing a lint tool would be easy for anybody who feels that strongly
> about it.
>


Oh right, certainly. But it simply isn't there right now, so in the 
meantime all we have is DMD (or LDC) itself...


>> Ok, so with that out of the way, to answer your question of "what's wrong
>> with rerunning the whole parse process?":
>>
>> Well, first of all, why? It's the same damn thing that needs to get done, 
>> so
>> what the hell's the point?
>
> Well one point is, CPU time is very cheap, human time is very expensive.
>


I've heard the CPU-time/programmer-time economics argument many times, and 
the more I hear it the more convinced I get that it's a load of crap. Or at 
least overstated and oversimplified. For one thing, unless all the code 
you're writing is entirely specific to a custom program that's only going 
to get used by a handful of people a handful of times, you have to multiply 
the CPU cost by the number of users and the frequency of usage to get a 
meaningfully comparable figure.
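
To put hypothetical numbers on it: shave one second of CPU time off an 
operation that 10,000 users each run twice a day, and the one-time 
programmer cost buys you roughly

    10,000 users x 2 runs/day x 1 sec = 20,000 sec/day, or about 5.5 hours

of aggregate user time saved every single day. The "programmer time is 
expensive" math only looks favorable when you quietly set the user count 
to one.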

At the very least, blindly ranking one's own (or one's company's own) 
programmer time higher than the CPU time of the product is extremely 
disrespectful to the users. And anyone who disregards the consequences of 
disrespecting users/clients/customers is a terrible manager in both ethics 
*and* bottom-line results. Sure, some companies have been able to compensate 
in other ways for the financial effects of their blatant disrespect to 
society, but it's still a factor working against them.

But even besides that, the whole CPU-time versus programmer-time argument 
isn't even applicable in this case: for something like a build process, CPU 
time *IS* programmer time. After all, what is the programmer doing while 
the CPU is burning its CPU time? Being productive? Hell no, he's just 
sitting there losing his state of flow. In fact, there isn't anything else 
he *can* do without putting his flow in even further jeopardy. And that 
flow is one of the most important factors in programmer productivity, if 
not the single most important.


>> Secondly, templates and CTFE slow down the whole process. And being that
>> those are two of D's killer features, their effect on compilation time is
>> just going to get felt more and more.
>
> really? what's the maximum time you've waited for dmd to finish
> compiling? I've never seen it take more than a second; however my D
> apps aren't big or complicated so maybe I've not run into your problems.
>


The app I just happen to be working on at the moment takes about 12 seconds 
to compile with DMD (4 seconds if nothing has changed). And it's only a 
medium-sized program at most (at least if you count the libraries I've built 
that it uses: exclude them and it's less than 500 lines, but that's probably 
not a fair way to count it).

A full build of all executables in the project it's part of takes, umm, 
let's see... 1 min 44 sec, and that's just the Windows build; I also do 
Linux builds (although usually only when putting out a new release). 
Granted, I could probably cut that "build everything" figure down 
significantly if incremental building worked reliably... and switching to 
xfbuild might help a little (currently I'm using rebuild 0.76 with 
oneatatime turned off), although xfbuild gets its speed by avoiding 
duplicated parsing, which is exactly what I'm defending here. But anyway, 
those are the build times I'm getting right now.


> Sorry but I think that's rubbish. In my experience 99% of developer time
> is spent either debugging problems, sat at a desk scribbling diagrams
> and solving problems or being on the phone to the client.
>
> If saving a few seconds during a compilation really improves your
> productivity you must be a really crappy programmer, which can't be
> true; you are obviously not that stupid.


Like I said above, it's all about breaking the flow.


>> And finally, if a lint tool is built into the compiler, and you really 
>> want
>> compile and lint treated separately, and you don't care about duplicated
>> processing, then all you have to do is change "dmd -w" to 
>> "dmd -w -c"/"dmd".
>> So there you go: Eating cake and still having it.
>
> Good point, but then you need to persuade Walter to support all that
> lint functionality which would be redundant if the language was
> sufficiently well specified. I'd rather see the effort put into closing
> off the corner cases, maybes, and undefined functionality.
>


I'd be happy with just my "-ww" "warnings as warnings" patch getting 
accepted.


>> I'll grant that rerunning the whole parse would be perfectly fine for an
>> initial version of a lint tool, but as a permanent design it's just 
>> simply
>> idiotic wasteful programming.
>
> ? I really don't understand your reluctance to burn cpu cycles.
> To me you seem to be suffering from a bad case of premature optimisation.
>

I'd normally be one of the first to advocate avoiding premature 
optimization, but something like duplicating all the work of a whole 
compiler front-end strikes me as going a bit off the deep end. I'd compare 
it more to sorting large amounts of highly random data with a quicksort 
rather than a bubblesort.

Besides, I don't really mean that a D lint tool would have to avoid that 
duplicated parsing right from the start. It's just the idea that the 
duplication is acceptable in the long term that I have a problem with.




