Calling conventions (was: make install; where do .di files go?)

David Nadlinger see at klickverbot.at
Thu Oct 18 12:56:23 PDT 2012


On Thursday, 18 October 2012 at 18:45:29 UTC, Jacob Carlborg 
wrote:
> On 2012-10-18 14:24, Manu wrote:
>
>> What's distinct about the D calling convention?
>
> GDC uses the C calling convention (or whatever calling 
> convention is used by the system)

There is no single »C calling convention«; it varies between 
different OSes and architectures. But yes, as far as I know the 
calling convention used by GDC for extern(D) is the same one GCC 
(the C compiler) defaults to.
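
Just as an illustration (the function names here are made up): 
even on a single platform there is often more than one common 
convention – on Win32, for example, cdecl (caller cleans up the 
stack) and stdcall (callee cleans up), both of which D can 
already express:

    extern(C)       int f1(int a, int b) { return a + b; } // cdecl on Win32
    extern(Windows) int f2(int a, int b) { return a + b; } // stdcall on Win32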

> where DMD uses a slightly modified version, if I recall 
> correctly. Note that DMD only defines an ABI for x86 and 
> possibly x86-64.
>
> http://dlang.org/abi.html

The situation on x86_64 is actually quite different from the one 
on x86: On x86, DMD uses a pretty much unique calling convention, 
which for example passes the last integer parameter in EAX (if it 
fits in a register).
This calling convention is usually very cumbersome to support in 
alternative compilers – for example, while LLVM allows new calling 
conventions to be added, doing so still isn't a small task, and 
requires directly patching LLVM. It is unlikely that this calling 
convention will ever be supported by an alternative compiler, 
simply because there is little motivation for somebody to step up 
and implement it.
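
To make the difference concrete, here is a minimal sketch (the 
function names are made up): under DMD's x86 extern(D) convention 
the last parameter ends up in EAX, while with extern(C) – and, as 
far as I know, with GDC's extern(D) – both arguments are simply 
pushed on the stack:

    // Compiles with any D compiler; only the calling sequence
    // the compiler generates differs.
    extern(D) int sumD(int a, int b) { return a + b; } // DMD x86: b in EAX
    extern(C) int sumC(int a, int b) { return a + b; } // both on the stack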

On x86_64, however, the »The extern (C) and extern (D) calling 
convention matches the C calling convention used by the supported 
C compiler on the host system« clause from the ABI documentation 
applies, and DMD also tries to follow it. Unfortunately, what DMD 
produces doesn't exactly match the System V x86_64 ABI – for 
example, it passes the parameters in reverse order, for reasons 
unknown (if it's just a hack to make it seem as if LTR parameter 
evaluation were implemented, then it's … well, a huge hack with 
considerable negative consequences).
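
To spell out what the reversal means (this is my reading of the 
observed behaviour, not something the spec says): for a 
declaration like

    extern(D) void f(int a, int b, int c);

the System V x86_64 ABI would pass a in EDI, b in ESI and c in 
EDX, whereas the reversed order puts c in EDI, b in ESI and a in 
EDX – so code following the one convention reads the wrong 
registers when linked against code following the other.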

Actually, Walter, could you please clarify the reasons for this 
deviation from the spec? While LDC currently follows DMD [1], GDC 
doesn't; and since I'm going to overhaul the LDC x86_64 ABI 
implementation soon, it would be nice to just remove the hack and 
make LDC actually conform to the D (!) spec.

Which brings me to my next point: Currently, the extern(D) 
calling convention used by LDC on x86_64 is neither the DMD one 
nor the C one, but a custom design. It dates back to when LDC was 
the first "stable" compiler for x86_64, and I think the intention 
of Frits and the others back then was to make it similar to the 
DMD x86 ABI implementation (the D). I do intend to make extern(D) 
behave the same as extern(C), as mandated by the spec, but I'd 
like to know whether the DMD argument order is going to stay 
reversed, to avoid breaking the ABI twice.

So much for the »low-level« parameter-passing part of the 
ABI/calling convention. There are several more issues regarding 
cross-compiler compatibility which should also not be overlooked. 
For example, the layout of nested contexts (for closures, nested 
structs, …) is currently completely unspecified, and as a 
result differs between DMD and LDC (and GDC probably as well). 
Also, the runtime interface has slight differences, although 
these should be fairly easy to reconcile. Exception handling is 
another difference…
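
As a small, made-up example of the nested-context problem: 
nothing specifies how the hidden context that inner() receives 
below – the heap-allocated frame holding x – is laid out, so 
every compiler currently picks its own layout, which is one of 
the things that breaks mixing object code from different 
compilers:

    int delegate() makeCounter()
    {
        int x = 0;
        int inner() { return ++x; } // nested function capturing x
        return &inner;              // x escapes, so a closure is allocated
    }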

All those differences should definitely be fixed, at least for 
x86_64 and future platforms like ARM, because a unified ABI is a 
very good thing to have – C++ continues to suffer dearly for not 
specifying one. But this is only going to happen if all three 
compiler implementations actively work together.

David


[1] By simply reversing all the parameters in both LLVM function 
declarations and calls – duh!

