[Issue 11435] -O optimization flag causes invalid 32 bit codegen

via Digitalmars-d-bugs digitalmars-d-bugs at puremagic.com
Thu Jul 24 00:56:47 PDT 2014


https://issues.dlang.org/show_bug.cgi?id=11435

--- Comment #9 from yebblies <yebblies at gmail.com> ---
Windows test case (the byte slice is placed just before the end of a single committed page, so any wider-than-byte load emitted by the optimizer reads past the allocation and faults):


import core.sys.windows.windows;
import core.stdc.string;

extern(C) int printf(in char*, ...);

alias T = byte;

void fun(T c, T b, T a)
{
    printf("%d %d %d\n", a, b, c);
}

void abc(T[] b, size_t index)
{
    fun(b[index+1], b[index+2], b[index+3]);
}

void main()
{
    // Commit exactly one page; nothing past byte 4095 is accessible.
    auto p = VirtualAlloc(null, 4096, MEM_COMMIT, PAGE_EXECUTE_READWRITE);
    assert(p);
    memset(p, 0, 4096);
    // Slice covers bytes 4090..4093 of the page. abc reads b[1..3]
    // (bytes 4091..4093); a 32-bit load of b[3] would touch byte 4096,
    // which lies outside the committed page, and raise an access violation.
    abc((cast(T*)(p + 4090))[0..4], 0);
}
