assert semantic change proposal

Atila Neves via Digitalmars-d digitalmars-d at puremagic.com
Mon Aug 4 02:38:25 PDT 2014


This. 1000x this.

Atila

On Monday, 4 August 2014 at 01:17:23 UTC, John Carter wrote:
> On Sunday, 3 August 2014 at 19:47:27 UTC, David Bregman wrote:
>
>> 2. Semantic change.
>> The proposal changes the meaning of assert(), which will 
>> result in breaking existing code. Regardless of philosophizing 
>> about whether or not the code was "already broken" according 
>> to some definition of assert, the fact is that shipping 
>> programs that worked perfectly well before may no longer work 
>> after this change.
>
> Subject to the caveat, suggested elsewhere, of having two 
> asserts with different names and different meanings, I am in a 
> position to comment on this one from experience.
>
> So assuming we do have a "hard assert" that is used within the 
> standard libraries and a "soft assert" in user code (unless 
> they explicitly choose to use the "hard assert"...)
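>
> A rough D sketch of the distinction (the names are hypothetical; 
> this models the intent of the two asserts, not any actual 
> compiler hook):
>
>     // "soft" assert: a plain runtime check; present when
>     // asserts are compiled in, absent in -release, and never
>     // fed to the optimizer as a fact.
>     void softAssert(bool cond, string msg = "soft assert failed")
>     {
>         version (assert)
>             if (!cond) throw new Error(msg);
>     }
>
>     // "hard" assert: under the proposal, the compiler may take
>     // cond as true from here on, even in -release builds.
>     void hardAssert(bool cond)
>     {
>         assert(cond); // would be treated as an optimizer axiom
>     }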
>
> What happens?
>
> Well, I'm the dogsbody who has the job of upgrading the 
> toolchain and handling the fallout of doing so.
>
> So I have been walking multi-million-line code bases through 
> every gcc version of the last 15 years.
>
> This is relevant because every new version has added stricter 
> warnings and, more importantly, deeper optimizations.
>
> It's especially the deeper optimizations that are interesting 
> here.
>
> They often amount to better data flow analysis, which results 
> in more "insightful" warnings.
>
> So, given that I'm taking millions of lines of C/C++ code from 
> a warnings-free state on gcc version N to warnings-free on 
> version N+1, I'll make some empirical observations.
>
> * They have _always_ highlighted dodgy / non-portable / 
> non-standards-compliant code.
> * They have quite often highlighted existing defects in the 
> code.
> * They have quite often highlighted error handling code as 
> "unreachable", because it is... and the only sane thing to do 
> is delete it.
> * They have often highlighted the error handling code of 
> "defensive programmers", as opposed to DbC (Design by 
> Contract) programmers.
>
> Why? Because around 30% of the code of a defensive programmer 
> is error handling crud that has never been executed, not even 
> in development, and hence is untested and unmaintained.
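>
> Here is the pattern, as a D sketch (the function is made up; 
> it stands in for the C/C++ shapes that gcc's flow analysis 
> flags in our code):
>
>     int first(int* p)
>     {
>         int x = *p;       // the dereference implies p !is null...
>         if (p is null)    // ...so flow analysis proves this branch
>             return -1;    // dead: warn about it, then delete it
>         return x;
>     }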
>
> The clean-up effort was often fairly large, maybe a week or 
> two, but it always resulted in better code.
>
> Customer-impacting defects introduced by the new optimizations 
> have been...
>
> a) very, very rare, and
> b) invariably from really bad code that was blatantly 
> defective, non-standards-compliant and non-portable.
>
> So what, from experience, do I expect from Walter's proposed 
> change?
>
>
> Another guy in this thread complained about the compiler 
> suddenly relying on thousands of global axioms from the core 
> and standard libraries.
>
> Yup.
>
> Exactly what is going to happen.
>
> As you get...
>
> * more and more optimization passes that rely on asserts,
> * and in particular on pre- and post-condition asserts within 
> the standard libraries,
>
> ...you are going to have flocks of user code that used to 
> compile without warning, and ran without any known defect, 
> suddenly spewing error messages and warnings.
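>
> A minimal D sketch of how a library precondition becomes such 
> an axiom (the function is hypothetical; it models what the 
> proposal would permit, not what any compiler does today):
>
>     int mid(int[] a)
>     in { assert(a.length > 0); }   // library precondition
>     body
>     {
>         // With the precondition taken as an axiom, the index
>         // a.length / 2 is provably in bounds, so the runtime
>         // bounds check could be elided in -release builds.
>         return a[a.length / 2];
>     }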
>
> But that's OK.
>
> Because I bet 99.999% of those warnings will be pointing 
> straight at bona fide defects.
>
> And yes, this will be a regular feature of life.
>
> New version of the compiler, new optimization passes, new 
> warnings... That's OK: clean 'em up, and a bunch of latent 
> defects won't come back as customer complaints.


