Building for ARM with LDC

Joakim via digitalmars-d-ldc digitalmars-d-ldc at puremagic.com
Mon Sep 29 01:17:05 PDT 2014


On Monday, 15 September 2014 at 07:33:26 UTC, Joakim wrote:
> Got the segfault in the sieve program to go away by changing 
> the default calling convention for all linkage types in 
> gen/abi.cpp to C, just as Kai did on PPC64.  I don't get any 
> "Aborted" tests now either.
>
> One issue cropping up in std.format appears to be with structs 
> that use alias this for a member, specifically these two:
>
> https://github.com/D-Programming-Language/phobos/blob/master/std/format.d#L1307
> https://github.com/D-Programming-Language/phobos/blob/master/std/format.d#L1748
>
> The first one returns false, and the second returns a junk 
> number for the imaginary component of the complex number.  I'm 
> not sure why it fails for some structs that use alias this and 
> not others; I'm going to look into it.
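
For context, the calling-convention change described in the first 
quoted paragraph might look roughly like the sketch below.  This 
is a hypothetical reconstruction, not the actual patch: the 
TargetABI method name and signature are assumptions on my part, 
though llvm::CallingConv::C is the real LLVM constant involved.

    // Hypothetical sketch: force the C calling convention for
    // every D linkage type on ARM, mirroring Kai's PPC64 change.
    // Assumes an LDC TargetABI interface along these lines.
    #include "gen/abi.h"              // TargetABI (assumed)
    #include "llvm/IR/CallingConv.h"  // llvm::CallingConv

    llvm::CallingConv::ID ArmTargetABI::callingConv(LINK /*l*/)
    {
        // Ignore the D linkage type (D, C, C++, ...) entirely
        // and always use the plain C calling convention.
        return llvm::CallingConv::C;
    }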

Hey Kai, I saw that you just mentioned you're still trying ldc 
on linux/ARM.  I never got an answer to my earlier question: how 
many tests pass for you on linux/ARM?  I've detailed my results 
here: most of druntime passes, while most of phobos doesn't.

> Another issue is that the DWARF output for debugging might be 
> corrupted somehow.  sieve.d compiles and runs fine against the 
> non-debug druntime/phobos, but then segfaults in different 
> places when linked against the debug druntime or phobos, which 
> are simply the same libraries compiled with the -g flag.

I figured out what the problem was with debug: codeGenOptLevel() 
in gen/optimizer.cpp turns off all optimizations whenever 
debug-symbol generation is enabled.  Since, as we both noted 
previously, ldc generates buggy code for ARM at -O0, the problem 
is with the generated code, not the debug output.  If I patch 
that ldc function so that debug output can be combined with 
other optimization levels, I can debug again on linux/ARM at 
-O1 or higher, though I haven't tested this extensively.
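
For reference, the patch amounts to something like the sketch 
below: derive the codegen optimization level purely from the 
requested -O level instead of forcing it to None whenever -g is 
given.  The llvm::CodeGenOpt values are real LLVM enums, but 
optLevel() and the exact shape of the original function are 
assumptions on my part.

    // Hypothetical sketch of a patched codeGenOptLevel() in
    // gen/optimizer.cpp (helper name assumed).
    #include "llvm/Support/CodeGen.h"  // llvm::CodeGenOpt::Level

    int optLevel();  // numeric -O level (assumed helper)

    llvm::CodeGenOpt::Level codeGenOptLevel()
    {
        // Before the patch, this returned CodeGenOpt::None
        // whenever debug symbols (-g) were requested, silently
        // discarding the -O level.  The patched version honors
        // the requested -O level unconditionally:
        const int opt = optLevel();
        if (opt >= 3)
            return llvm::CodeGenOpt::Aggressive;
        if (opt == 2)
            return llvm::CodeGenOpt::Default;
        if (opt == 1)
            return llvm::CodeGenOpt::Less;
        return llvm::CodeGenOpt::None;
    }

The key change is simply that -g no longer overrides -O; whether 
the debug info stays fully accurate at higher optimization levels 
is a separate question.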

I had tried using different optimization levels with debug 
before, but didn't know until now that ldc was silently turning 
them off.  I notice that clang doesn't seem to turn optimization 
off for debug, and that the ldc patch that turned it off is more 
than five years old: perhaps it's no longer necessary?

