std.math performance (SSE vs. real)

via Digitalmars-d digitalmars-d at puremagic.com
Sat Jul 5 09:45:42 PDT 2014


On Saturday, 5 July 2014 at 16:24:28 UTC, Iain Buclaw via 
Digitalmars-d wrote:
> Right, it's a quirk of the CPU.

It's a precision quirk of floating point that has to be defined 
somewhere, and different CPUs follow different definitions. Results 
can differ even within IEEE 754, since the standard does not prevent 
an implementation from computing at higher precision than specified.
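
For illustration, here is a small sketch (my own example, not from 
the thread) of how the extra precision shows up. On x86, real is 
typically the 80-bit x87 format with a 64-bit mantissa, while double 
has 53 bits:

import std.stdio;

double addTinyDouble(double x, double eps) { return (x + eps) - x; }
real   addTinyReal(real x, real eps)       { return (x + eps) - x; }

void main()
{
    // 2^-60 is too small to survive a 53-bit double addition,
    // but fits in the 64-bit mantissa of an 80-bit real.
    writeln(addTinyDouble(1.0, 0x1p-60));   // usually 0: the increment rounds away
    writeln(addTinyReal(1.0L, 0x1p-60L));   // nonzero: extra precision preserves it
}

Note the double case may also print a nonzero value if the compiler 
keeps the intermediate in an x87 register rather than SSE - which is 
precisely the SSE vs. real divergence this thread is about.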

http://en.wikipedia.org/wiki/Denormal_number
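
For completeness, a minimal sketch (again my own) of what a 
subnormal/denormal value looks like in D; arithmetic on such values 
is often much slower on common hardware, which is one source of 
these performance quirks:

import std.stdio;
import std.math : isSubnormal;

void main()
{
    double smallestNormal = double.min_normal;  // ~2.2e-308, smallest normal double
    double denorm = smallestNormal / 2;         // drops into the subnormal range

    writeln(isSubnormal(denorm));               // true
    writeln(denorm);                            // ~1.1e-308, nonzero but reduced precision
}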

