[std.database] at compile time

Ary Manzana ary at esperanto.org.ar
Sat Oct 15 19:16:11 PDT 2011


On 10/15/11 5:00 PM, Marco Leise wrote:
> Am 15.10.2011, 18:24 Uhr, schrieb Ary Manzana <ary at esperanto.org.ar>:
>
>> On 10/14/11 5:16 PM, Graham Fawcett wrote:
>>> On Fri, 14 Oct 2011 21:10:29 +0200, Jacob Carlborg wrote:
>>>
>>>> On 2011-10-14 15:26, Andrei Alexandrescu wrote:
>>>>> On 10/14/11 6:08 AM, Jacob Carlborg wrote:
>>>>>> On 2011-10-14 12:19, foobar wrote:
>>>>>>> Has anyone looked at Nemerle's design for this? They have an SQL
>>>>>>> macro which lets you write SQL such as:
>>>>>>>
>>>>>>> var employName = "FooBar"
>>>>>>> SQL (DBconn, "select * from employees where name = $employName");
>>>>>>>
>>>>>>> What that is supposed to do is bind the variable(s), and it also
>>>>>>> validates the SQL query against the database. This is all done at
>>>>>>> compile time.
>>>>>>>
>>>>>>> My understanding is that D's compile-time features are powerful
>>>>>>> enough to implement this.
>>>>>>
>>>>>> You cannot connect to a database in D at compile time. You could do
>>>>>> some form of validation and escape the query without connecting to
>>>>>> the database.
>>>>>
>>>>> A little SQL interpreter can be written that figures out, e.g., the
>>>>> names of the columns involved.
>>>>>
>>>>> Andrei
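
(The trivial cases of such an interpreter can already run under today's
CTFE, since it's pure string processing. A toy sketch, where columnNames
and its parsing are made up and only handle the simplest queries:)

string[] columnNames(string query)
{
   import std.algorithm : findSplitAfter, findSplitBefore, map, splitter;
   import std.array : array;
   import std.string : strip;

   // Take whatever sits between "select " and " from " and split on commas.
   auto cols = query.findSplitAfter("select ")[1]
                    .findSplitBefore(" from ")[0];
   return cols.splitter(',').map!(c => c.strip).array;
}

// Runs entirely at compile time; no database connection involved.
static assert(columnNames("select id, name from employees") == ["id", "name"]);
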
>>>>
>>>> But you still won't be able to verify the columns against the actual
>>>> database schema?
>>>
>>> One approach would be to write a separate tool that connects to the
>>> database and writes out a representation of the schema to a source
>>> file. At compile time, the representation is statically imported, and
>>> used to verify the data model.
>>>
>>> If we had preprocessor support, the tool could be run as such,
>>> checking the model just before passing the source to the compiler.
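
(With D's string imports that could look roughly like the sketch below.
schema.csv, its "table,column" line format, tableColumns and the expected
columns are all invented here; the file would be written by the external
tool before compiling with the -J flag:)

// The tool dumps the schema as "table,column" lines into schema.csv
// before the build; -J. makes it readable at compile time via import().
enum schemaDump = import("schema.csv");

string[] tableColumns(string dump, string table)
{
   import std.algorithm : filter, map, splitter, startsWith;
   import std.array : array;
   import std.string : lineSplitter;

   return dump.lineSplitter
              .filter!(line => line.startsWith(table ~ ","))
              .map!(line => line.splitter(',').array[1])
              .array;
}

// Break the build if the code's idea of "employees" drifts from the schema.
static assert(tableColumns(schemaDump, "employees") == ["id", "name"]);
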
>>
>> Yeah, but you need a separate tool.
>>
>> In Nemerle it seems you can do everything just in Nemerle...
>>
>> It would be awesome if CTFE were implemented by JITting functions,
>> not by reinventing the wheel with a handcrafted interpreter...
>
> I wonder if that would work well with cross-compiling. If you blindly
> JIT functions, they may end up using structs of the wrong size, or
> integers with different endianness. Say you compile for 64-bit on a
> 32-bit machine: what size is size_t during CTFE?

I don't quite understand this. I want JITted functions to just generate
code that will ultimately be compiled. It's like what CTFE does now,
except that instead of interpreting every last corner of the language
spec, you would compile the function, run it to generate code, and then
compile the generated code for the target machine.

An example:

enum host = "localhost";
enum port = 3306;
enum database = "foo";
enum password = "whatever";

string code_db(string host, int port, string database, string password) {
   auto db = new Database(host, port, database, password);
   auto str = "";
   foreach (table; db) {
     str ~= "class " ~ table.name ~ " : Table {";
     foreach (column; table.columns) {
       // Well, you get the idea...
     }
     str ~= "}";
   }
   return str;
}

// Now code_db must be executed at compile time because it's assigned to
// an enum. Oh, but currently CTFE wouldn't be able to open a connection
// to the database. Well, it could, if you'd JIT it and then execute it.
// code_db just generates a string containing the table classes... so why,
// oh why, does the endianness of size_t matter?
enum db = code_db(host, port, database, password);

// I want to paste the string into the code; not sure this is the syntax.
mixin(db);
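
To make that concrete: if the database had an employees table, mixin(db)
would effectively paste something like the following into the module
(Table, Column and the exact members are invented for the example):

// Stubs only for illustration; in reality these would be defined by the
// database library.
class Table {}
struct Column(T) { T value; }

// Roughly what mixin(db) splices in for one table:
class employees : Table {
   Column!int    id;
   Column!string name;
}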

---

Maybe I'm not taking something into account... what is it?

Thanks,
Ary

