Slow performance compared to C++, ideas?
Kapps
opantm2+spam at gmail.com
Mon Jun 3 00:30:55 PDT 2013
On Monday, 3 June 2013 at 07:06:05 UTC, Manu wrote:
> There are functions that the author intended to be overridden, and
> functions that have no business being overridden, that the author
> probably never imagined anyone would override.
> What if someone does come along and override one of these, and it was
> never designed to work under that circumstance in the first place?
> At very least, it will have never been tested. That's not a very
> robust API offering if you ask me.
This is just as important as the performance issues. Most of the
time, people will leave functions with whatever the default is for
virtual/final. If the default is final, this works fairly well even
when the author hasn't put in the effort to decide how overriding
should be handled. But with virtual by default, you don't know
whether the author actually considered that people would override
the function, or whether they simply didn't bother specifying. I
know the vast majority of my code is virtual simply because I didn't
write the 'final' keyword 500 times, and didn't think about the fact
that I'd need to. The resulting code is unsafe, because I never
considered those functions actually being overridden.
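To make the default concrete, here is a minimal sketch in Java, which is
virtual-by-default just like current D (the Parser/SloppyParser classes
are my own illustration, not anything from this thread):

```java
// In a virtual-by-default language, every unmarked method is an
// override point, whether or not the author ever intended that.
class Parser {
    // The author never imagined this being overridden,
    // but nothing stops a subclass from doing so.
    String normalize(String s) { return s.trim(); }

    // Forbidding an override requires writing 'final'
    // explicitly, on every such method.
    final String version() { return "1.0"; }
}

class SloppyParser extends Parser {
    // Legal, and it silently changes behavior that
    // other base-class code may rely on.
    @Override
    String normalize(String s) { return s; }
    // Overriding version() here would be a compile error: it is final.
}
```

The override of normalize() compiles without complaint; only the method
the author remembered to mark 'final' is protected.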
A standard example is the addRange vs. add pair of functions. If
these are left at the default, you have no clue whether the author
considered that someone would override them. Does addRange call add,
or do you need to override both? Is this an implementation detail
that may change at any moment? What happens when a subclass
overrides addRange and makes it call add? By forcing the author to
write 'virtual', you know that they at least *considered* that
someone might override it, and hopefully thought through the
consequences or documented whether addRange calls add. Or they may
think about it later, realize that leaving it at the default was a
mistake, and make addRange final. Going from final to virtual is
fine; going from virtual to final breaks code.
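The addRange/add trap above can be sketched as follows (again in Java as a
stand-in for virtual-by-default D; the Bag and CountingBag classes are
hypothetical names of my own):

```java
// A container whose addRange() happens to be implemented via add().
// Nothing in the API documents or guarantees this detail.
class Bag {
    private int count = 0;

    public void add(Object item) { count++; }

    // Implementation detail: routes every element through add().
    public void addRange(Object[] items) {
        for (Object item : items) add(item);
    }

    public int size() { return count; }
}

// A subclass that overrides only add(), betting that
// addRange() funnels through it.
class CountingBag extends Bag {
    int added = 0;

    @Override
    public void add(Object item) {
        added++;
        super.add(item);
    }
}

public class Demo {
    public static void main(String[] args) {
        CountingBag bag = new CountingBag();
        bag.addRange(new Object[] {1, 2, 3});
        // Works only because the base class *happens* to route
        // addRange() through add(); if the author later rewrites
        // addRange() to bump the count directly, 'added' stays 0.
        System.out.println(bag.added); // prints 3
    }
}
```

If the base-class author had been forced to write 'virtual' on add(), a
reader would at least know the interaction was considered rather than
accidental.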
And of course, the vast majority of functions should *not* be
virtual. So why should the default force you to write 'final' on all
of them?
I think there are substantial benefits to final by default, but
breaking every single program that uses inheritance is a pretty big
issue. Still, fixing the code would be quite trivial if you could
see a list of the functions that need to be made virtual (either by
specifying --transition=virtual or just reading the compiler errors
that pop up when you build). It would even be possible to write a
tool that automatically updates code from the results of
--transition, if the flag were implemented in a way that gave enough
detail. Unfortunately, that would only make your code compile again;
you would still have to go through everything and decide what
actually should be virtual.
More information about the Digitalmars-d mailing list