ThinLTO is there. I think that should settle the final/virtual debate

Jonathan M Davis via Digitalmars-d digitalmars-d at puremagic.com
Sun Dec 4 12:31:46 PST 2016


On Sunday, December 04, 2016 19:49:15 NVolcz via Digitalmars-d wrote:
> On Sunday, 4 December 2016 at 01:36:50 UTC, deadalnix wrote:
> > First, presentation:
> > https://www.youtube.com/watch?v=9OIEZAj243g
> >
> > Some of this is available in LLVM today, and everything
> > presented here will be in LLVM 4.0. Long story short: ThinLTO
> > can do most of what LTO does, but at a cost much closer to that
> > of a regular build than to that of a classic LTO build.
> >
> > LTO can devirtualize all functions that do not need to be
> > virtual, and can even use profile information to speculatively
> > devirtualize - i.e. JVM-grade devirtualization.
> >
> > I would love to see this leveraged to finally put the final vs
> > virtual debate to rest. If we use this tech properly, everything
> > that does not need to be virtual can be finalized - except
> > across shared objects, which shouldn't be too much of an issue
> > in practice.
>
> My understanding was that the main argument for final by default
> was that it is easy to wrongly make a method virtual, and that
> going from virtual to final later would break compatibility.

That, and some of the performance-centric guys like Manu work in
environments where non-virtual functions should be the norm and only a
small percentage of functions should be virtual.

Really, whether a function is virtual or not should be a design decision.
For inheritance and polymorphism to work well, overridability needs to be
part of the design of how a class works. The idea that it's reasonable to
just inherit from a class, override random functions, and expect
everything to work well is a faulty one. Good design requires that the
programmer actually decide whether overridability makes sense for that
function and that class and mark the function virtual or not accordingly,
and having a function be virtual when the programmer didn't explicitly
plan for it to be overridden is just a recipe for disaster. So, having
virtual be the default rather than requiring a decision from the
programmer just invites problems. Other languages (e.g. Java and C#) deal
with those problems, and they're survivable (especially if you don't work
in an environment where the virtualness of a function is a performance
problem), but since it's a problem for random functions to be overridden
without the designer of the base class planning for it, and it _is_ a
performance problem for some folks, it arguably really doesn't make sense
for virtual to be the default. But we're stuck with it at this point.
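
In current D, where virtual is the default, making that decision explicit
means opting out with final. A minimal sketch (the class and method names
here are made up) of what that looks like:

class Connection
{
    // Overriding send() is part of this class's design; subclasses are
    // expected to customize it, so it's left virtual.
    void send(const(ubyte)[] data)
    {
        // default implementation
    }

    // Not intended to be overridden; final makes that decision explicit
    // and lets the compiler emit a direct, inlinable call.
    final bool isOpen() const
    {
        return _open;
    }

    private bool _open;
}

class SecureConnection : Connection
{
    // This override was planned for in the base class's design.
    override void send(const(ubyte)[] data)
    {
        // encrypt, then send
    }
}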

Improvements to devirtualization may help with the performance problems
associated with virtual functions, but they don't fix the design problems
that go with virtual by default.
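
To put the performance side of that concretely (just a sketch with made-up
names): in D, a call through a class reference is an indirect call through
the vtable unless the method is final or the optimizer can prove that
nothing overrides it - which is exactly what whole-program LTO/ThinLTO
tries to prove.

class Shape
{
    // Virtual by default in D: normally called through the vtable.
    double area() { return 0; }

    // final: always a direct call, no whole-program analysis needed.
    final string kind() { return "shape"; }
}

double totalArea(Shape[] shapes)
{
    double sum = 0;
    foreach (s; shapes)
        sum += s.area(); // indirect call; LTO/ThinLTO can turn it into a
                         // direct call if it sees no override of area()
    return sum;
}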

Fortunately, D makes it far less of an issue than it would be otherwise,
because we have fully-featured structs, which are non-polymorphic. So,
this really only becomes a problem when you're using classes and don't
need polymorphism for the whole class (and I think that part of Andrei's
argument against final by default was that he didn't think it made any
sense to even use classes if you weren't looking to have virtual
functions).
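
Again just a sketch: struct member functions are never virtual, so there
is nothing to devirtualize in the first place.

struct Point
{
    double x, y;

    // Plain, non-virtual member function: always a direct call, freely
    // inlinable, and the object carries no hidden vtable pointer.
    double lengthSquared() const
    {
        return x * x + y * y;
    }
}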

- Jonathan M Davis


