D const design rationale

Walter Bright newshound1 at digitalmars.com
Sat Jun 23 01:04:11 PDT 2007


Bill Baxter wrote:
> Walter Bright wrote:
>> Optimization often makes the difference between a successful project 
>> and a failure. C++ has failed to supplant FORTRAN because, although 
>> C++ is better in every respect but one, that one - optimization of 
>> arrays - matters a whole lot. It drives people using C++ to resort 
>> to inline assembler, and they spend a lot of time on the issue. 
>> Various proposals to fix it, like 'noalias' and 'restrict', consume 
>> vast amounts of programmer time. And time is money.
> 
> FORTRAN is also helped by having a fairly standardized ABI that can be 
> called easily from lots of languages, which C++ lacks.  But C has that, 
> and it has also failed to supplant FORTRAN for numeric code.  But I 
> think Sean's right.  A lot of that is just that the language supports 
> things like actual multi-dimensional arrays (by 'actual' I mean 
> contiguous memory rather than pointers to pointers),

C has them too:
	int array[3][5];
is a single contiguous block of ints, not an array of pointers to arrays.
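
A minimal sketch showing that layout (illustrative only):

	#include <stdio.h>

	int main(void)
	{
		int array[3][5];

		/* One contiguous block of 15 ints - no hidden pointer
		   indirection, so the rows sit back to back in memory. */
		printf("total size: %zu bytes\n", sizeof array);    /* 15 * sizeof(int) */
		printf("row size:   %zu bytes\n", sizeof array[0]); /*  5 * sizeof(int) */
		return 0;
	}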

> and mathematical 
> operations on them right out of the box.

No, FORTRAN does not have array operations out of the box. It has no 
more mathematical operations than C does (in fact, it has fewer).

> Telling a numerics person that 
> C/C++ will give them much better IO and GUI support, but take them a 
> step back in terms of core numerics is like trying to sell a hunter a 
> fancy new gun with a fantastic scope that will let you pinpoint a mouse 
> at 500 yards but -- oh, I should mention it only shoots BBs.

I've read the papers on this. It's very clear that the only technical 
reason FORTRAN is better than C at numerics is that it doesn't have 
array aliasing.

That's it.
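
To make the aliasing point concrete, here's a minimal C99 sketch (the 
function name is just for illustration):

	void add(int n, double * restrict a,
	         const double * restrict b,
	         const double * restrict c)
	{
		/* Without restrict, the compiler must assume the store to
		   a[i] could overwrite elements of b or c, forcing
		   conservative code. With C99's restrict, it may assume
		   the arrays don't overlap - the same guarantee FORTRAN
		   compilers get from the language's argument-passing
		   rules. */
		for (int i = 0; i < n; i++)
			a[i] = b[i] + c[i];
	}

FORTRAN compilers get to assume that the arrays passed to a routine 
don't overlap; 'restrict' is C99's way of letting the programmer make 
the same promise by hand.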

> I guess what I'd like to say in summary is that I'm skeptical about the 
> claim that optimization "often" makes the difference between success and 
> failure.  "Occasionally" I could believe.  Ill-advised premature 
> optimization has probably led to the demise of many more projects than 
> actual optimization problems in the end product.  We'll all gladly take 
> a free 20% speed improvement if the compiler can give it to us, but I 
> don't believe there are that many projects that will fail simply for 
> lack of that 20%.

When you're paying by the minute for supercomputer time, 20% is a big deal.

When you're predicting the weather, a 20% slowdown means you're 
producing history rather than predictions.

If Google could get 20% more speed out of their servers, each machine 
could handle 1.2 times the load, and the same traffic could be served 
with roughly a sixth fewer machines. That's hundreds of millions of 
dollars.

When you're writing a game, numerics performance is what makes your 
game's graphics better than the competition's.

When you're writing code for embedded systems, faster code means you 
might be able to use a slower, cheaper processor, which can translate 
into millions of dollars in cost savings when you're shipping millions 
of units.

I don't want D to be fundamentally locked out of these potential 
markets. If D compilers can produce fundamentally better code than C++, 
that's a big selling point for getting D into companies like Google. And when 
they use it for their critical server farm apps, they'll naturally tend 
to use it for much more.


