[DMD] Loop index incorrectly optimised out for -release -O
Teodor Dutu
teodor.dutu at gmail.com
Sat Oct 9 21:09:19 UTC 2021
Hi,
I discovered a bug in DMD whereby, during optimisation, the loop
index variable is removed even though it is used after the loop. I
filed [this
issue](https://issues.dlang.org/show_bug.cgi?id=22372) with a
reproduction.
Using run.dlang.io, I tried to find a regression, but all
supported DMD versions (2.060 and newer) print the same
incorrect output (`Exception: i = 0; n = 1`). My experiment can
be found [here](https://run.dlang.io/is/uYfTzl). This seems to be
a backend issue, as both LDC and LDC-beta behave correctly.
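For readers who don't want to open the issue, the shape of the code is roughly as follows. This is only a sketch of the pattern, not the exact code from the issue (which is more involved); the names `i` and `n` are taken from the error message above:

```d
import std.format : format;

void main()
{
    int n = 1;
    int i;
    for (i = 0; i < n; i++)
    {
        // loop body; nothing else writes to i or n
    }
    // After the loop, i should have been incremented to equal n.
    // With `dmd -release -O`, the index update appears to be
    // optimised out, so this check fails.
    if (i != n)
        throw new Exception(format("i = %s; n = %s", i, n));
}
```

Compiled without optimisations the program exits silently; with `dmd -release -O` the (buggy) backend drops the index, producing the exception shown above.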
However, when analysing the CI outputs from [the
PR](https://github.com/dlang/dmd/pull/13116) that led me to this
bug, I noticed that all three of DMD, LDC and GDC fail the same
test, which I have also been able to reproduce myself:
- LDC:
https://cirrus-ci.com/task/6291197929979904?logs=test_druntime#L1449
- DMD:
https://cirrus-ci.com/task/5728247976558592?logs=test_druntime#L1390
- GDC:
https://cirrus-ci.com/task/4883823046426624?logs=test_druntime#L1411
This test failure is caused by the issue mentioned above, and the
code with which I reproduced the bug is based on that test.
I am unsure what to make of this, since the CI logs seem to
contradict my experiment with LDC. Has anyone else encountered
this? How did you proceed?
Thanks,
Teodor