Is metaprogramming useful?

Steve Horne stephenwantshornenospam100 at aol.com
Wed Nov 29 10:11:01 PST 2006


On Wed, 29 Nov 2006 11:25:43 -0500, Brad Anderson <brad at dsource.org>
wrote:

>But the existing prefix notation is exactly why it can be extended so many
>ways with macros.  Change that and you lose most of, or at least a lot of, the
>metaprogramming facilities (see Dylan).

So don't change it. Just add a standard syntax-sugar library on top
for expressions with precedence and associativity (which, sadly, Lisp
macros - or at least Scheme's - can't handle, but which can be handled
using a more procedural, von Neumann-style approach).
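
To make that concrete - in Haskell rather than Lisp, but the principle
carries over - precedence and associativity can be declared by a
library rather than hard-wired into the compiler's grammar. A minimal
sketch ('|+|' and '|*|' are made-up operators, purely for
illustration):

  -- Fixity declarations: the library, not the language grammar,
  -- states each operator's precedence and associativity.
  infixl 6 |+|   -- left-associative, lower precedence
  infixl 7 |*|   -- binds tighter than |+|

  (|+|) :: Int -> Int -> Int
  a |+| b = a + b

  (|*|) :: Int -> Int -> Int
  a |*| b = a * b

  main :: IO ()
  main = print (1 |+| 2 |*| 3)   -- parsed as 1 |+| (2 |*| 3) = 7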

Nemerle has been mentioned recently, and I've been reading up a bit
today; my impression is very positive. You start out, from the
beginning, using real-world, high-level, practical tools. There's a
heavy functional flavour, so it helps to have played with something
like Haskell in the past, but once you get past the 'this is
different' reaction there is real workhorse stuff.

And of course that goes beyond the fact that this is a usable
high-level language out of the box. Making it a .NET language is both
an obvious plus point and my main reservation. On the plus side, there
is a solid set of libraries to use without a whole lot of
Nemerle-specific porting. The obvious downside is that it is limited
to the .NET platform - no systems-level coding, etc.

Anyway, it's not until you've got the tools to do 99% of your work
that the 'by the way, if/else, for loops and so on are just standard
library macros - you can do them differently if you really need to'
aspect even becomes an issue.
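
Nemerle does that with macros; Haskell gets a surprising amount of the
way with plain library functions. A minimal sketch of a control
structure living entirely in library code (unless' is a made-up name -
the real thing is Control.Monad.unless):

  import Control.Monad (when)

  -- A control structure defined as ordinary library code, not a
  -- compiler built-in.
  unless' :: Monad m => Bool -> m () -> m ()
  unless' cond body = when (not cond) body

  main :: IO ()
  main = do
    unless' False (putStrLn "condition is False, so this runs")
    unless' True  (putStrLn "this never runs")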


Some people have mentioned a key problem with metaprogramming/code
generation in terms of tools (e.g. the debugging issue). Well, I'm
glad I've picked up the 'concept-oriented' terminology from that XLR
link, because it helps me say this more easily...

It doesn't matter whether a concept is implemented directly in the
compiler or in a library. What matters is whether the tools understand
the concept. If a standard set of concepts in a library handles 99% of
all requirements, tools like debuggers can be written to be aware of
them, and the problem then only affects the remaining 1% of code. The
principle is not so different from having source-level debugging
instead of assembler-level debugging. And even for that 1%, the
alternatives are all, IMO, just as bad as generated code anyway. Code
that has been force-fitted to badly matched language concepts is hard
to understand and maintain, just like generated code.

Of course, if the library that describes a new concept could also give
the debugger special instructions on how to present it, along with
perhaps documentation-handling instructions and so on, that would be a
very good thing. It would mean that you could treat a mature
metaprogramming library much as you would a built-in compiler feature
- so long as the library itself is working, you only worry about what
you are doing with it, not the internals of how it works.
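
A small-scale version of that already exists in places. In Haskell,
for example, a library controls how GHCi and 'print' display its
values by supplying a Show instance - a crude stand-in for real
debugger presentation hints, but the same principle. A sketch ('Money'
is a made-up type that stores whole cents internally):

  newtype Money = Money Int

  -- The library, not the tool, decides how the concept is shown:
  -- users see dollars and cents, not the internal representation.
  instance Show Money where
    show (Money cents) =
      "$" ++ show (cents `div` 100) ++ "." ++ pad (cents `mod` 100)
      where pad n = (if n < 10 then "0" else "") ++ show n

  main :: IO ()
  main = print (Money 1234)   -- displays $12.34, not "Money 1234"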

-- 
Remove 'wants' and 'nospam' from e-mail.


