Basic benchmark

Walter Bright newshound1 at digitalmars.com
Sun Dec 14 18:37:22 PST 2008


Bill Baxter wrote:
> Of course that back end was also designed for C/C++ originally, right?

Pretty much all of them are.

> But anyway, I agree with bearophile, that requiring too many special
> features out of a back end will make it hard for any alternative D
> compilers to keep up.

I'm aware of that, but I'm also aware of the crippling workarounds 
cfront had to use to avoid changing *anything* in the back end, because 
cfront had no control over it. There are still some suboptimal things 
in C++ that come from trying to avoid changing the rest of the tool chain.


>> While of course all this can be added to any back end, I understand how to
>> do it to mine, and it would take me a lot of time to understand another well
>> enough to be able to know just where to put the fix in.
> 
> That's understandable, but at some point it becomes worth the effort
> to learn something new.  Many people get by just fine using C++.  They
> may be interested in D, but it just takes too much effort.  However, a
> little effort invested in learning D pays off (at least we all believe
> so or we wouldn't be here).  Likewise, if there were a really solid
> well-maintained back end with a liberal open source license that
> generates great code, it would very likely be worth your time to learn
> it, even though it might be rough going in the short term.

Such a back end doesn't exist, however. I remember efforts back in the 
early '80s to build one (PCC, for example).

>> Another thing I'd be very unwilling to give up on with the dmd back end is
>> how fast it is. DMC is *still* by far the fastest compiler out there.
> 
> I'd gladly trade fast compilation for "has a future" or "supports
> 64-bit architectures" or "generates faster code" or "doesn't crash
> when there are too many fixups in main()". Have you seen the
> messages about how long it can take to compile DWT applications?  DWT
> progs are already desperately in need of some smarter dependency
> tracking and ability to do minimal recompilations.  I think
> implementing that (in a build system or whatever) would more than make
> up for the loss in raw compilation speed.  Besides, I think a chunk of
> the compilation speed is thanks to the module system, and avoiding
> the endless reparsing required for C++ #includes.  So any D compiler
> should benefit.

DMC is the fastest C/C++ compiler. DMD benefits from much of the work 
that went into making it fast. I did design D's semantics to favor fast 
parsing, but there's still the back end's speed, which has nothing to do 
with parsing or semantic analysis.
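
To illustrate the module point with a rough sketch (hypothetical file 
names, and only an illustration, not a benchmark): a C++ header's text 
gets re-read and re-parsed by every translation unit that includes it, 
while a D module is parsed once per compilation and importers just 
reuse its symbols.

    // widget.d -- parsed a single time per compiler invocation;
    // every importer reuses its symbol table
    module widget;

    int area(int w, int h) { return w * h; }

    // app.d -- 'import widget' pulls in those symbols without
    // re-reading and re-parsing widget's source, unlike a C++
    // #include, which re-parses the header text in each
    // translation unit
    module app;

    import widget;
    import std.stdio;

    void main() { writeln(area(3, 4)); }

Build both with, say, dmd app.d widget.d; no matter how many modules 
import widget, its source is parsed only once in that invocation.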

I found out yesterday that gcc still generates *text* assembler files 
for every compile, which are then fed to the assembler. That just cannot 
be made speed-competitive.

> Anyone have the data for the time required to compile tango with DMD
> vs LDC?  It would be interesting to see how bad the difference is.
> 
> Anyway, all that said,  it's not clear that we really do have that
> mythical "uber backend" available right now.
> 
> According to my conversations on the clang mailing list, the current
> target is for LLVM to be able to fully support a C++ compiler by 2010.
>  I'm not quite sure what all that involves, but apparently it includes
> things like making exceptions work on Windows.  So it certainly does
> look a bit premature to move over to LLVM as the primary platform for
> D at this point.

Abandoning dmd's back end now would entail a two-year delay with no 
updates, and I guarantee that there'd be years of wringing bugs out of 
LLVM on top of that. Writing a code generator for a complex instruction 
set like the x86 is, well, pretty complicated <g>, with thousands of 
special cases.

One thing that made D possible was that I was able to use a mature, 
professional-quality, debugged optimizer and back end. The lack of one 
has killed many otherwise promising languages in the past.


