A strange div bug on Linux x86_64, (both dmd & ldc2): long -5000 / size_t 2 = 9223372036854773308
H. S. Teoh
hsteoh at quickfur.ath.cx
Thu Aug 13 19:59:56 UTC 2020
On Thu, Aug 13, 2020 at 07:40:28PM +0000, mw via Digitalmars-d wrote:
> On Thursday, 13 August 2020 at 19:24:11 UTC, Tove wrote:
> > One should always use unsigned whenever possible, as it generates
> > better code; many believe dividing by a factor of 2 is simply a
> > shift, but that is not so for signed values.
> I'm fine with that. In many areas of language design, we need to
> make a choice between correctness vs. raw performance.
> But at the very least we also need an *explicit*, visible warning
> message after we've made that choice:
I agree that the compiler should at least warn about, or prohibit,
implicit conversions between signed and unsigned. They have been the
source of quite a number of frustrating bugs over the years --
frustrating mostly because implicit conversion yields unexpected
results, yet, because fixing it would break existing code, it's
unlikely to ever change.
Unfortunately I don't see the situation changing anytime soon, unless
somebody comes up with a *really* convincing argument that can win
Walter over. After the flop with the recent bool != int DIP, I've kinda
given up hope that this area of D (int promotion rules, including
implicit conversion) will ever improve.
I don't agree with making array length signed, though. The language
should not whitewash the harsh reality of the underlying hardware,
even if we make concessions such as warning the user about
potentially unexpected/unwanted semantics, e.g. when there's an
implicit conversion between signed and unsigned values.
The irony is that Bill Gates claims to be making a stable operating system and Linus Torvalds claims to be trying to take over the world. -- Anonymous