<div class="gmail_quote">On Tue, Nov 30, 2010 at 1:43 PM, Kagamin <span dir="ltr"><spam@here.lot></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im">Walter Bright Wrote:<br>
<br>
> How do you decide how many bits should be enough for any algorithm?<br>
><br>
> The thing is, the FPU has 53 bits of precision and so ought to be correct to the<br>
> last bit.<br>
<br>
</div>It's not me, it's the programmer. He was disgusted that his algorithm produced garbage, which means the error was unacceptable. Maybe it was 1%, maybe 80%; I don't know. That was his decision: the result was unacceptable. The bug description assumes the problem was in the last bit, which means he wanted precision higher than the machine precision.<br>
</blockquote><div><br></div><div>What programmer? What algorithm? As far as I can tell, this was found when testing a library explicitly for accuracy, not in an application, so your argument doesn't apply. </div></div>
<br>