Motivation for compile time function execution

Pragma ericanderton at yahoo.removeme.com
Fri Feb 16 09:34:04 PST 2007


Walter Bright wrote:
> The sqrt() example I gave is cute, but not a very interesting reason for 
> doing compile time function execution. So what's the real reason?
> 
> A while back, Eric Anderton and Don Clugston showed how one could use 
> template metaprogramming techniques to do, for example, compile-time 
> regular expressions. It was a marvelous technical demonstration, and 
> showed that essentially any computation could be done, using templates, 
> at compile time.
> 
> There were some serious problems, though:
> 
> 1) They are hard and unintuitive to write (for those not comfortable 
> with functional programming, which is most of us).
> 
> 2) The result looks awful - templates are just syntactically unsuited to 
> this, even though they're a lot easier on the eye than C++ template 
> metaprograms.
> 
> 3) This is off-putting enough that some question even having such 
> facilities in the language, as it results in impenetrable code 
> unmaintainable by joe coders.
> 
> 4) While theoretically capable of any computation, such template 
> metaprogramming had severe practical limitations. Every step of every 
> computation required generating a unique template. This naturally is 
> incredibly slow and memory consumptive (C++ metaprogramming is notorious 
> for taking all night to do a build). Even worse, if you're going to use 
> string templates to parse, say, a 1000 character DSL, the template name 
> generated must include its arguments, so that template identifier will 
> be 1000 characters long. Then, it slices off the first character, and 
> generates another template for the rest (999), then 998, then 997, etc., 
> until 1000 templates are generated averaging 500 characters long. It 
> doesn't take much of that before the whole thing collapses.
> 
> 5) In casting about for a solution, the compile time regex came up 
> again. This was still strongly disliked.
> 
> 6) I promised to try and make template metaprogramming less obtuse, and 
> what better way to do that than to obsolete a whole class of templates, 
> replacing them with ordinary functions?

I mentioned this over in announce, but I'm working on taking this to the next level.

I've found that treating metaprogramming like Lisp (thanks to everyone who drew those parallels this week) is the first 
step.  Adopting this mindset makes #1, #2, and #3 somewhat moot.  You only wind up in trouble if you try to approach it 
like normal D code (DMD 1.006 is going to change that quite a bit).

Then, I found that aggressive use of tuples makes a lot of things much easier to handle and grok; thanks, BCS!  Again, 
this kind of list passing is very Lisp-ish and makes for some very straightforward templates.

To that end, I went about writing a parsing lib in the same vein as Enki.  I currently have a tokenizer (attached) that 
will turn an input string into a tuple of token 'structs', complete with line/col information.  In doing so, I've found 
that using tuples addresses point #4 only slightly, by avoiding the kludge I added to the regex lib.  Passing such large 
tuples around still bloats the template "manglespace" horribly, and generating the list is about as bad.

That said, Walter, I think you're on the right track by making functions compile-time capable.  I might just refactor my 
code to take advantage of this.

This leads me to two questions for DMD 1.006:

Say I convert my tuple-processing code into compile-time functions that use const lists instead.  What is the primary 
tradeoff: increased compile time in exchange for unlimited call depth, thanks to non-bloated namespaces?  Is there 
another tradeoff that is more dominant here, and not as obvious?


-- 
- EricAnderton at yahoo
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: tokenizer.d
URL: <http://lists.puremagic.com/pipermail/digitalmars-d/attachments/20070216/030faa58/attachment.ksh>
