Silicon Valley D Meetup - April 15, 2021 - "Compile Time Function Execution (CTFE)"

12345swordy alexanderheistermann at gmail.com
Mon May 3 01:51:28 UTC 2021


On Tuesday, 27 April 2021 at 08:12:57 UTC, FeepingCreature wrote:
> On Monday, 26 April 2021 at 14:01:37 UTC, sighoya wrote:
>> On Monday, 26 April 2021 at 13:17:49 UTC, FeepingCreature 
>> wrote:
>>> On Sunday, 25 April 2021 at 21:27:55 UTC, sighoya wrote:
>>>> On Monday, 19 April 2021 at 06:37:03 UTC, FeepingCreature 
>>>> wrote:
>>>>> Native CTFE and macros are a beautiful thing though.
>>>>
>>>> What did you mean by native?
>>>
>>> When cx needs to execute a function at compiletime, it links 
>>> it into a shared object and loads it back with dlsym/dlopen. 
>>> So while you get a slower startup speed (until the cache is 
>>> filled), any further calls to a ctfe function run at native 
>>> performance.
>>
>> Ah okay, but can't Dlang runtime functions be called at
>> compile time with native performance anyway, too?
>>
>> So generally, cx first parses the program, then filters out
>> what is a macro, then compiles all macro/CTFE functions into a
>> shared lib and executes these macros from that lib?
>>
>
> Sorta: when we hit a macro declaration, "the module at this 
> point" (plus transitive imports) is compiled as a complete 
unit. This is necessary because parser macros can change the 
> interpretation of later code. Then the generated macro object 
> is added to the module state going forward, and that way it can 
> be imported by other modules.
>
>> Isn't it better to use the cx compiler as a service at compile
>> time and compile code in-memory into an executable segment
>> (some kind of JITing, I think) in order to execute it?
>> I think the cling REPL does it like that.
>
> That would also work, I just took the path of least resistance. 
> I already had an LLVM backend, so I just reused it. Adding a 
> JIT backend would be fairly easy, except for the part about 
> writing and debugging a JIT. :P
>
>>
>> And how does cx pass type objects?
>
> By reference. :) Since the compiler is in the search path, you 
> can just import cx.base and get access to the same Type class 
> that the compiler uses internally. In that sense, macros have 
> complete parity with the compiler itself. There's no attempt to 
> provide any sort of special interface for the macro that 
> wouldn't also be used by compiler internal functions. (There's 
> some class gymnastics to prevent module loops, i.e. cx.base 
> defines an interface for the compiler as a whole, that is 
> implemented in main, but that is indeed also used by the 
> compiler's internal modules themselves.)
>
> The downside of all this is that you need to parse and process 
> the entire compiler to handle a macro import. But DMD gives me 
> hope that this too can be made fast. (Right now, compiling 
> anything that pulls in a macro takes about a second, even with 
> a warm object cache.)
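
The shared-object approach described above comes down to something
like this on the loading side - a minimal C sketch, assuming a POSIX
system and a hypothetical fib.so that the compiler has already
emitted and cached (the file and symbol names are made up):

/* ctfe_host.c - load a compile-time function that was emitted into a
 * shared object and call it at native speed. */
#include <dlfcn.h>
#include <stdio.h>

int main(void) {
    /* In cx this .so would be produced by the LLVM backend and cached;
     * here we assume it already exists on disk (hypothetical path). */
    void *handle = dlopen("./ctfe_cache/fib.so", RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* Look up the compiled CTFE function (hypothetical symbol name). */
    long (*fib)(long) = (long (*)(long)) dlsym(handle, "ctfe_fib");
    if (!fib) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
        return 1;
    }

    /* Every call from here on runs at native speed. */
    printf("fib(30) = %ld\n", fib(30));

    dlclose(handle);
    return 0;
}

(Build with: cc ctfe_host.c -ldl -o ctfe_host.) The slow part is
producing and linking the .so in the first place, which matches the
"slower startup until the cache is filled" point above.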

Regarding the JIT backend, it is better to use an existing JIT
framework than to build one on your own.
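
For instance, with LLVM's MCJIT C API - just a sketch of what "use an
existing JIT framework" could look like, not a claim about how cx
should wire it up - compiling and running a function in-process goes
roughly like this:

/* jit_sketch.c - build a tiny function with the LLVM C API and run it
 * in-process, instead of round-tripping through a shared object. */
#include <stdio.h>
#include <llvm-c/Core.h>
#include <llvm-c/ExecutionEngine.h>
#include <llvm-c/Target.h>

int main(void) {
    LLVMLinkInMCJIT();
    LLVMInitializeNativeTarget();
    LLVMInitializeNativeAsmPrinter();

    /* Build a module containing: int sum(int a, int b) { return a + b; } */
    LLVMModuleRef mod = LLVMModuleCreateWithName("ctfe_module");
    LLVMTypeRef params[] = { LLVMInt32Type(), LLVMInt32Type() };
    LLVMTypeRef fn_type = LLVMFunctionType(LLVMInt32Type(), params, 2, 0);
    LLVMValueRef sum = LLVMAddFunction(mod, "sum", fn_type);

    LLVMBuilderRef b = LLVMCreateBuilder();
    LLVMPositionBuilderAtEnd(b, LLVMAppendBasicBlock(sum, "entry"));
    LLVMBuildRet(b, LLVMBuildAdd(b, LLVMGetParam(sum, 0),
                                 LLVMGetParam(sum, 1), "add"));

    /* Hand the module to the JIT and get a native function pointer back. */
    char *err = NULL;
    LLVMExecutionEngineRef engine;
    if (LLVMCreateExecutionEngineForModule(&engine, mod, &err)) {
        fprintf(stderr, "failed to create JIT: %s\n", err);
        return 1;
    }

    int (*fp)(int, int) =
        (int (*)(int, int)) LLVMGetFunctionAddress(engine, "sum");
    printf("sum(2, 3) = %d\n", fp(2, 3));

    LLVMDisposeBuilder(b);
    LLVMDisposeExecutionEngine(engine);
    return 0;
}

That skips the shared-object round trip entirely, and the hard parts
(codegen, relocation, memory management) are maintained by someone
else.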

-Alex

