[std.database] at compile time

foobar foo at bar.com
Sun Oct 16 16:29:34 PDT 2011


Don Wrote:

> On 16.10.2011 17:39, Ary Manzana wrote:
> > On 10/16/11 4:56 AM, foobar wrote:
> >> Don Wrote:
> >>
> >>> On 16.10.2011 04:16, Ary Manzana wrote:
> >>>> On 10/15/11 5:00 PM, Marco Leise wrote:
> >>>>> On 15.10.2011, 18:24, Ary Manzana<ary at esperanto.org.ar> wrote:
> >>>>>
> >>>>>> On 10/14/11 5:16 PM, Graham Fawcett wrote:
> >>>>>>> On Fri, 14 Oct 2011 21:10:29 +0200, Jacob Carlborg wrote:
> >>>>>>>
> >>>>>>>> On 2011-10-14 15:26, Andrei Alexandrescu wrote:
> >>>>>>>>> On 10/14/11 6:08 AM, Jacob Carlborg wrote:
> >>>>>>>>>> On 2011-10-14 12:19, foobar wrote:
> >>>>>>>>>>> Has anyone looked at Nemerle's design for this? They have an SQL
> >>>>>>>>>>> macro which allows you to write SQL such as:
> >>>>>>>>>>>
> >>>>>>>>>>> var employName = "FooBar";
> >>>>>>>>>>> SQL (DBconn, "select * from employees where name = $employName");
> >>>>>>>>>>>
> >>>>>>>>>>> What that is supposed to do is bind the variable(s), and it also
> >>>>>>>>>>> validates the SQL query against the database. This is all done at
> >>>>>>>>>>> compile time.
> >>>>>>>>>>>
> >>>>>>>>>>> My understanding is that D's compile-time features are powerful
> >>>>>>>>>>> enough to implement this.
> >>>>>>>>>>
> >>>>>>>>>> You cannot connect to a database in D at compile time. You could do
> >>>>>>>>>> some form of validation and escape the query without connecting to
> >>>>>>>>>> the database.
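
Escaping, at least, is doable in CTFE today. A naive sketch (escapeSql
is an invented name, not an existing Phobos function):

string escapeSql(string s)
{
    string r;
    foreach (c; s)
    {
        if (c == '\'') r ~= "''"; // double any embedded single quote
        else           r ~= c;
    }
    return r;
}

static assert(escapeSql("O'Brien") == "O''Brien");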
> >>>>>>>>>
> >>>>>>>>> A little SQL interpreter can be written that figures out, e.g.,
> >>>>>>>>> the names of the columns involved.
> >>>>>>>>>
> >>>>>>>>> Andrei
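
Something along those lines already works under CTFE. A minimal sketch
that pulls the column names out of a SELECT (columnsOf is an invented
name, and all error handling is omitted):

import std.array : split;
import std.string : indexOf, strip;

string[] columnsOf(string sql)
{
    // naive: take whatever sits between "select " and " from "
    auto start = sql.indexOf("select ") + "select ".length;
    auto end   = sql.indexOf(" from ");
    string[] cols;
    foreach (c; sql[start .. end].split(","))
        cols ~= c.strip;
    return cols;
}

enum cols = columnsOf("select id, name from employees");
static assert(cols == ["id", "name"]);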
> >>>>>>>>
> >>>>>>>> But you still won't be able to verify the columns against the
> >>>>>>>> actual database schema?
> >>>>>>>
> >>>>>>> One approach would be to write a separate tool that connects to the
> >>>>>>> database and writes out a representation of the schema to a source
> >>>>>>> file. At compile time, the representation is statically imported and
> >>>>>>> used to verify the data model.
> >>>>>>>
> >>>>>>> If we had preprocessor support, the tool could be run as a
> >>>>>>> preprocessing step, checking the model just before passing the
> >>>>>>> source to the compiler.
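
For what it's worth, the static-import half of that exists today: with
-J the compiler can read the exported schema at compile time. A sketch,
where the file name and its comma-separated format are assumptions:

// employees.schema is written by the external tool, e.g. "id,name,salary"
enum schemaText = import("employees.schema"); // needs -J<dir>

bool hasColumn(string schema, string col)
{
    import std.algorithm : canFind;
    import std.array : split;
    return schema.split(",").canFind(col);
}

static assert(hasColumn(schemaText, "name"),
              "column `name` is not in the exported schema");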
> >>>>>>
> >>>>>> Yeah, but you need a separate tool.
> >>>>>>
> >>>>>> In Nemerle it seems you can do everything just in Nemerle...
> >>>>>>
> >>>>>> It would be awesome if CTFE were implemented by JITting functions
> >>>>>> instead of reinventing the wheel with a handcrafted interpreter...
> >>>>>
> >>>>> I wonder if that would work well with cross-compiling. If you blindly
> >>>>> JIT functions, they may end up using structs of the wrong size, or
> >>>>> integers with different endianness. Compile for 64-bit on a 32-bit
> >>>>> machine. What size is size_t during CTFE?
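
The current interpreter sidesteps exactly this by evaluating with the
TARGET's types, so size_t is the target's size_t even when the host
differs:

enum ptrSize = size_t.sizeof;                 // evaluated at compile time
pragma(msg, "CTFE sees size_t.sizeof = ", ptrSize); // 8 when targeting 64-bit

A blindly JITted function would pick up the host's sizes instead.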
> >>>>
> >>>> I don't quite understand this. I want JITted functions to just
> >>>> generate code that ultimately will be compiled. It's like what CTFE
> >>>> is doing now, except that instead of interpreting every bit of the
> >>>> language spec, you would compile the function, run it to generate
> >>>> code, and then compile that code for the target machine.
> >>> [snip]
> >>>> Maybe I'm not taking something into account... what is it?
> >>>
> >>> You're assuming that the compiler can run the code it's generating. This
> >>> isn't true in general. Suppose you're on x86, compiling for ARM. You
> >>> can't run the ARM code from the compiler.
> >>>
> >>
> >> This is quite possible in Nemerle's model of compilation.
> >> This is the same concept as XSLT - a macro is a high-level transform
> >> from D code to D code.
> >>
> >> 1. compile the macro ahead of time into a loadable compiler
> >> module/plugin. The plugin is compiled for the HOST machine (x86) either
> >> by a separate compiler or by a cross-compiler that can also compile to
> >> its HOST target.
> 
> YES!!! This is the whole point. That model requires TWO backends. One 
> for the host, one for the target.
> That is, it requires an entire backend PURELY FOR CTFE.
> 
> Yes, of course it is POSSIBLE, but it is an incredible burden to place 
> on a compiler vendor.

How does that differ from the current situation? We already have a separate implementation of a D interpreter for CTFE.
I disagree with the second point as well - nothing forces the SAME compiler to contain two separate implementations, as is now the case.
In fact, you could compile the macros with a compiler from a different vendor. After all, that's the purpose of an ABI, isn't it?
If anything, this model makes the burden on the vendor much smaller, since it removes the need for a separate interpreter.
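
Concretely, the loading side needs nothing exotic. A sketch of how a
compiler could load a host-compiled macro plugin - the interface and
the expandMacro name are invented for illustration, nothing like this
exists in DMD today:

import core.sys.posix.dlfcn : dlopen, dlsym, RTLD_NOW;
import std.string : fromStringz, toStringz;

// the plugin exports a single C-ABI entry point
alias ExpandFn = extern (C) const(char)* function(const(char)*, size_t);

string runMacro(string pluginPath, string source)
{
    auto handle = dlopen(pluginPath.toStringz, RTLD_NOW);
    assert(handle !is null, "could not load macro plugin");

    auto expand = cast(ExpandFn) dlsym(handle, "expandMacro");
    assert(expand !is null, "plugin exports no expandMacro symbol");

    // the plugin runs as HOST code and returns generated D source,
    // which the compiler then compiles for the TARGET as usual
    return expand(source.ptr, source.length).fromStringz.idup;
}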

