[std.database] at compile time

Ary Manzana ary at esperanto.org.ar
Sun Oct 16 08:38:19 PDT 2011


On 10/16/11 2:35 AM, Don wrote:
> On 16.10.2011 04:16, Ary Manzana wrote:
>> On 10/15/11 5:00 PM, Marco Leise wrote:
>>> Am 15.10.2011, 18:24 Uhr, schrieb Ary Manzana <ary at esperanto.org.ar>:
>>>
>>>> On 10/14/11 5:16 PM, Graham Fawcett wrote:
>>>>> On Fri, 14 Oct 2011 21:10:29 +0200, Jacob Carlborg wrote:
>>>>>
>>>>>> On 2011-10-14 15:26, Andrei Alexandrescu wrote:
>>>>>>> On 10/14/11 6:08 AM, Jacob Carlborg wrote:
>>>>>>>> On 2011-10-14 12:19, foobar wrote:
>>>>>>>>> Has anyone looked at Nemerle's design for this? They have an SQL
>>>>>>>>> macro which lets you write SQL such as:
>>>>>>>>>
>>>>>>>>> var employName = "FooBar"
>>>>>>>>> SQL (DBconn, "select * from employees where name = $employName");
>>>>>>>>>
>>>>>>>>> What that's supposed to do is bind the variable(s), and it also
>>>>>>>>> validates the SQL query against the database. This is all done at
>>>>>>>>> compile time.
>>>>>>>>>
>>>>>>>>> My understanding is that D's compile-time features are powerful
>>>>>>>>> enough to implement this.
>>>>>>>>
>>>>>>>> You cannot connect to a database in D at compile time. You could do
>>>>>>>> some form of validation and escape the query without connecting to
>>>>>>>> the database.
>>>>>>>
>>>>>>> A little SQL interpreter can be written that figures out e.g. the
>>>>>>> names
>>>>>>> of the columns involved.
>>>>>>>
>>>>>>> Andrei
>>>>>>
>>>>>> But you still won't be able to verify the columns against the actual
>>>>>> database schema?
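Andrei's mini-interpreter is doable today, by the way. A minimal CTFE
sketch of mine (untested) that pulls the column names out of a SELECT at
compile time:

string[] columnsOf(string sql)
{
    string[] cols;
    size_t from = sql.length;
    // naive scan for the FROM keyword; a real interpreter would tokenize
    foreach (i; 0 .. sql.length - 3)
        if (sql[i .. i + 4] == "FROM") { from = i; break; }
    string cur;
    foreach (c; sql[7 .. from])  // skip "SELECT "
    {
        if (c == ',') { cols ~= cur; cur = null; }
        else if (c != ' ') cur ~= c;
    }
    if (cur.length) cols ~= cur;
    return cols;
}

static assert(columnsOf("SELECT id, name FROM employees") == ["id", "name"]);

That gives you the column names, but it can't tell you whether they exist
in the real database, which I think is Jacob's point below.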
>>>>>
>>>>> One approach would be to write a separate tool that connects to the
>>>>> database and writes out a representation of the schema to a source
>>>>> file. At compile time, the representation is statically imported, and
>>>>> used to verify the data model.
>>>>>
>>>>> If we had preprocessor support, the tool could be run as a
>>>>> preprocessing step, checking the model just before passing the source
>>>>> to the compiler.
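For concreteness, a sketch of what the tool's output and the check could
look like (all names made up). The tool writes out something like:

// generated dbschema.d (hypothetical):
enum employeesColumns = ["id", "name", "salary"];

and user code checks against it at compile time:

bool hasColumn(in string[] cols, string c)
{
    foreach (col; cols)
        if (col == c) return true;
    return false;
}

static assert(hasColumn(employeesColumns, "name"),
    "table 'employees' has no column 'name'");

(You could also dump the schema to a text file and pull it in with -J and
import("schema.txt").)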
>>>>
>>>> Yeah, but you need a separate tool.
>>>>
>>>> In Nemerle it seems you can do everything just in Nemerle...
>>>>
>>>> It would be awesome if CTFE were implemented by JITting functions
>>>> instead of reinventing the wheel with a handwritten interpreter...
>>>
>>> I wonder if that would work well with cross-compiling. If you blindly
>>> JIT functions, they may end up using structs of the wrong size, or
>>> integers with different endianness. Compile for 64-bit on a 32-bit
>>> machine. What size is size_t during CTFE?
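To make Marco's size_t point concrete: something like

enum wordsPerPage = 4096 / size_t.sizeof;  // must use the *target's* size_t

has to come out as 512 when compiling for 64-bit and 1024 when compiling
for 32-bit. A JIT that blindly runs host code on a 32-bit machine would
bake 1024 into a 64-bit build.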
>>
>> I don't understand this quite well. I want JITted functions to just
>> generate code that ultimately will be compiled. It's like what CTFE does
>> now, except that instead of interpreting every corner of the language
>> spec, you would compile the function, run it to generate code, and then
>> compile that code for the target machine.
> [snip]
>> Maybe I'm not taking something into account... what is it?
>
> You're assuming that the compiler can run the code it's generating. This
> isn't true in general. Suppose you're on x86, compiling for ARM. You
> can't run the ARM code from the compiler.
>
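Half of this already exists, by the way: CTFE runs a function to produce a
string, and the resulting string is then compiled for the target. E.g.:

string makeGetter(string field)
{
    return "int get_" ~ field ~ "() { return " ~ field ~ "; }";
}

struct S
{
    int x;
    mixin(makeGetter("x"));  // makeGetter runs at compile time; the
                             // generated get_x is compiled for the target
}

The question is only how makeGetter itself gets executed: interpreted (as
today) or JITted (what I'm proposing).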


Compile the functions that are to be evaluated at compile time for x86. 
Compile the functions that will go into the object files/executables for 
ARM. (You'd eventually compile the first ones for ARM too, if they are 
also used at run time.)

What's bad about that?
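I.e., roughly (a sketch of the idea, not how DMD works today):

int fib(int n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }

enum ten = fib(10);  // CTFE: JIT fib for the *host* (x86), run it,
                     // keep only the result (55)

int main()
{
    return fib(10);  // run time: generate code for the *target* (ARM)
}

Only the values computed at compile time cross over; no host code ends up
in the ARM binary.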

