[Issue 11435] -O optimization flag causes invalid 32 bit codegen
via Digitalmars-d-bugs
digitalmars-d-bugs at puremagic.com
Thu Jul 24 01:13:34 PDT 2014
https://issues.dlang.org/show_bug.cgi?id=11435
--- Comment #10 from yebblies <yebblies at gmail.com> ---
And the same thing happens for short (I think):
import core.sys.windows.windows;
import core.stdc.string;

extern(C) int printf(in char*, ...);

alias T = short;

void fun(T c, T b, int v)
{
    printf("%d %d\n", c, b); // shorts promote to int for %d
}

void abc(T[] b, size_t index)
{
    fun(b[0], b[1], 0); // b[1] is the last T before the page boundary
}

void main()
{
    // Commit exactly one page; the address space after it is not mapped.
    auto p = VirtualAlloc(null, 4096, MEM_COMMIT, PAGE_EXECUTE_READWRITE);
    assert(p);
    memset(p, 0, 4096);
    // Place the two-element slice flush against the end of the page, so
    // an over-wide optimized load of b[1] reads past the committed page.
    auto px = cast(T*)(p + 4096 - 2 * T.sizeof);
    printf("%p\n", px + 1);
    abc(px[0..2], 0);
}