Final by default?

Manu turkeyman at gmail.com
Fri Mar 14 21:02:52 PDT 2014


On 15 March 2014 10:49, Walter Bright <newshound2 at digitalmars.com> wrote:

> On 3/14/2014 5:06 AM, Manu wrote:
>
>> In my experience, API layout is the sort of performance detail that
>> library authors are much more likely to carefully consider and get
>> right. It's higher level, easier to understand, and affects all
>> architectures equally. It's also something that they teach in uni.
>> People write books about that sort of thing.
>> Not to say there aren't terrible API designs out there, but D doesn't
>> make terrible-api-design-by-default a feature.
>> Stuff like virtual is the sort of thing that only gets addressed when
>> it is reported by a user that cares, and library authors are terribly
>> reluctant to implement a breaking change because some user reported it.
>> I know this from experience.
>> I can say with confidence, poor API design has caused me fewer problems
>> than virtual in my career.
>>
>> Can you honestly tell me that you truly believe that library authors
>> will consider, as a matter of common sense, the implications of virtual
>> (the silent default state) in their API?
>> Do you truly believe that I'm making a big deal out of nothing; that I
>> will never actually, in practice, encounter trivial accessors and
>> properties that can't inline appearing in my hot loops, or other
>> related issues?
>>
>> Inline-ability is a very strong API-level performance influence,
>> especially in a language with properties.
>>
>> Most programmers are not low-level experts; they don't know how to
>> protect themselves from this sort of thing. Honestly, almost everyone
>> will just stick with the default.
>>
>
>
> I find it incongruous to take the position that programmers know all about
> layout for performance and nothing about function indirection? It leads me
> to believe that these programmers never once tested their code for
> performance.
>

They probably didn't. Library authors often don't, unless the library is
specifically intended for aggressive realtime use. Like most programmers,
especially PC programmers, their attitude is often "that's the optimiser's
job".

That said, function inlining is perhaps the single most important API-level
performance detail, and this is especially true in OO code (which advocates
accessors/properties).
Function calls scattered throughout your function serialise your code; they
inhibit the optimiser from pipelining properly in many cases, i.e.,
rescheduling across a function call is often dangerous, and compilers will
always take a conservative approach. Locals may need to be saved to the
stack across even trivial function calls. I'm certain it will make a big
difference in many instances.

Compile some release code without -inline and see what the performance
difference is; that is probably a fairly realistic measure of the penalty
to expect in OO-heavy code.
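
As a rough sketch of that experiment (the file and output names here are
placeholders, and the numbers will obviously depend on the code):

    # identical source and optimisation level; only inlining differs
    dmd -O -release -inline app.d -ofapp_inline
    dmd -O -release app.d -ofapp_noinline

    ./app_inline      # time both and compare
    ./app_noinline

Whatever gap shows up on OO-heavy code is roughly the cost you keep paying
when virtual-by-default stops those calls from ever being inlined.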


> I know what I'm doing, and even I, when I don't test things, always make
> some innocuous mistake that eviscerates performance. I find it very hard to
> believe that final-by-default will fix untested code.
>

I don't find it hard to believe at all; in fact, I find it very likely that
there will be a significant benefit to client code that the library author
will probably never have given a moment's thought to. It's usually
considered fairly reasonable for programmers to trust the optimiser to at
least do a decent job. Virtual-by-default inhibits many of the most
important optimisations: inlining, rescheduling, pipelining; it also
increases pressure on the stack and caches.
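
To make the shape of the problem concrete, here is a minimal sketch (the
class and function names are invented for illustration, not taken from any
real library):

    class Colour
    {
        private float r_;

        // Virtual by default: through a Colour reference the compiler
        // generally can't prove the dynamic type, so it can't inline this.
        float r() { return r_; }

        // Explicitly final: trivially devirtualised and inlinable.
        final float rFast() { return r_; }
    }

    float sum(Colour[] colours)
    {
        float total = 0;
        foreach (c; colours)
            total += c.r();    // indirect call sitting in the hot loop
        return total;
    }

The author of the accessor almost certainly never thinks about this loop;
the client pays for the default anyway.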

And that's the whole thing here... I just don't see this as obscure or
unlikely at all. If I did, I wouldn't care anywhere near as much as I do.
All code has loops somewhere.


> And the library APIs still are fixable. Consider:
>
>     class C {
>         void foo() { ... }
>     }
>
> and foo() needs to be final for performance, but we don't want to break
> existing users:
>
>     class C {
>         void foo() { foo2(); }
>         final void foo2() { ... }
>     }
>

The lengths you're willing to go to in order to resist a relatively minor
breaking change, one with an unusually smooth migration path that virtually
everyone agrees with, are surprising to me.
Daniel Murphy revealed that it only affects 13% of classes in DMD's OO-heavy
code. That is in line with my past predictions; most classes aren't base
classes, so most classes aren't actually affected.
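
As a rough sketch of that migration path (the explicit 'virtual' keyword in
the second snippet is the proposal's suggested syntax rather than something
D compiles today, and the class is invented for illustration):

    // Today: every non-final method of a base class is silently virtual.
    class Widget
    {
        void draw() { }             // overridable, whether intended or not
        int width() { return 0; }   // also virtual, so not inlinable via Widget
    }

    // Under final-by-default, only the deliberately overridable method
    // would need an annotation; everything else becomes final for free.
    class Widget
    {
        virtual void draw() { }     // explicitly part of the override API
        int width() { return 0; }   // final by default, inlinable
    }

Only base classes that actually hand out overridable methods need touching
at all, which fits the 13% figure above.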

I understand that you clearly don't believe in this change, and I grant
that is your prerogative, but I really don't get why... I just can't see it
when considering the balance.
Obviously I care about the compiler's codegen more than the average guy,
but as I see it, that's a compiler's primary purpose, and programmers are
supposed to be able to trust that it can and will do it well.


My questions above were serious. Please, I would really like to hear you
answer them, or rephrase them if you like...

Can you honestly tell me that you truly believe that library authors will
consider, as a matter of common sense, the implications of virtual (the
silent default state) in their API?
Or do you consider that to be something not worth worrying about, i.e., do
you truly believe that I'm making a big deal out of nothing; that I will
never actually, in practice, encounter trivial accessors and properties
that can't inline appearing in my hot loops, or other related issues?


I won't post any more on the topic.