Reimplementing the bulk of std.meta iteratively

Bruce Carneal bcarneal at gmail.com
Tue Sep 29 12:44:00 UTC 2020


On Tuesday, 29 September 2020 at 03:14:34 UTC, Andrei 
Alexandrescu wrote:
> On 9/28/20 6:46 PM, Bruce Carneal wrote:
>> [...]
>
> Years ago, I was in a panel with Simon Peyton-Jones, the 
> creator of Haskell. Nicest guy around, not to mention an 
> amazing researcher. This question was asked: "What is the most 
> important principle of programming language design?"
>
> His answer was so powerful, it made me literally forget 
> everybody else's answer, including my own. (And there were no 
> slouches in that panel, save for me: Martin Odersky, Guido van 
> Rossum, Bertrand Meyer.) He said the following:
>
> "The most important principle in a language design is to define 
> a small core of essential primitives. All necessary syntactic 
> sugar lowers to core constructs. Do everything else in 
> libraries."
>
> (He didn't use the actual term "lowering", which is familiar to 
> our community, but rather something equivalent such as 
> "reduces".)
>
> That kind of killed the panel, in a good way. Because most 
> other questions on programming language design and 
> implementation simply made his point shine quietly. Oh yes, if 
> you had a small core and built on it you could do this easily. 
> And that. And the other. In a wonderfully meta way, all 
> questions got instantly lowered to simpler versions of 
> themselves. I will never forget that experience.
>
> D breaks that principle in several places. It has had a 
> cavalier attitude to using magic tricks in the compiler to get 
> things going, at the expense of fuzzy corners and odd limit 
> cases. Look at hashtables. Nobody can create an equivalent 
> user-defined type, and worse, nobody knows exactly why. (I 
> recall vaguely it has something to do, among many other things, 
> with qualified keys that are statically-sized arrays. Hacks in 
> the compiler make those work, but D's own type system rejects 
> the equivalent code. So quite literally D's type system cannot 
> verify its own capabilities.)
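
(To make the asymmetry concrete: the builtin AA side of that story 
looks unremarkable from the outside, which is what makes the 
inability to replicate it in library code - hashing, qualifiers, 
the toHash/opEquals contracts - so jarring. A small sketch of the 
builtin behavior being described:)

```d
void main()
{
    // The builtin AA accepts a statically-sized array as key;
    // the compiler supplies hashing and equality by magic.
    int[2] key = [1, 2];
    int[int[2]] aa;
    aa[key] = 42;
    assert(aa[key] == 42);
}
```
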
>
> Or take implicit conversions. They aren't fully documented, and 
> the only way to figure things out is to read most of the 7193 
> lines of 
> https://github.com/dlang/dmd/blob/master/src/dmd/mtype.d. 
> That's not a small core with a little sugar on top.
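
(Two of the better-known corners, for the curious; the full story 
really does live in mtype.d:)

```d
void main()
{
    // Arithmetic on small integers promotes to int, as in C:
    byte a = 10, b = 20;
    auto c = a + b;
    static assert(is(typeof(c) == int));

    // Value-range propagation: the compiler proves the result
    // fits, so this narrows implicitly even though int -> ubyte
    // normally doesn't.
    int x = 300;
    ubyte y = x & 0xFF;  // OK: range of (x & 0xFF) is 0..255
    // ubyte z = x;      // error: cannot implicitly convert

    // Literals convert if they fit the target:
    short s = 100;       // OK
    // short t = 100000; // error: out of range
}
```
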
>
> Or take the foreach statement. Through painful trial and error, 
> somebody figured out all possible shapes of foreach, and 
> defined `each` to support most:
>
> https://github.com/dlang/phobos/blob/master/std/algorithm/iteration.d#L904
>
> What should have been a simple forwarding problem took 190 
> lines that could be characterized as very involved. And mind 
> you, it doesn't capture all cases because per 
> https://github.com/dlang/phobos/blob/master/std/algorithm/iteration.d#L1073:
>
> // opApply with >2 parameters. count the delegate args.
> // only works if it is not templated (otherwise we cannot count 
> the args)
>
> I know that stuff (which will probably end up on my forehead) 
> because I went and attempted to improve things a bit in 
> https://github.com/dlang/phobos/pull/7638/files#diff-0d6463fc6f41c5fb774300832e3135f5R805, which attempts to simplify matters by reducing the foreach cases to seven shapes.
>
> To paraphrase Alan Perlis: "If your foreach has seven possible 
> shapes, you probably missed some."
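
(For readers who haven't fought this particular battle, a sample 
of the shapes in question:)

```d
struct Table
{
    // opApply lets a type define its own foreach protocol; the
    // number and types of the delegate's parameters determine the
    // foreach "shape" - which is exactly what `each` must count.
    int opApply(scope int delegate(string, int) dg)
    {
        foreach (i, name; ["a", "b"])
            if (auto r = dg(name, cast(int) i))
                return r;
        return 0;
    }
}

void main()
{
    int[3] arr = [10, 20, 30];
    foreach (x; arr) {}              // element only
    foreach (i, x; arr) {}           // index + element
    foreach (k, v; ["one": 1]) {}    // AA key + value
    foreach (name, idx; Table()) {}  // user-defined, via opApply
    foreach (c; "hi") {}             // over code units
}
```
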
>
> Over time, things did get better. We became adept at things 
> such as lowering, and we require precision in DIPs.
>
> I very strongly believe that this complexity, if unchecked, 
> will kill the D language. It will sink under its own weight. We 
> need a precise, clear definition of the language, we need a 
> principled approach to defining and extending features. The 
> only right way to extend D at this point is to remove the odd 
> failure modes created by said cavalier approach to doing 
> things. Why can't I access typeid during compilation, when I 
> can access other pointers? Turns out typeid is a palimpsest 
> that has been written over so many times, even Walter cannot 
> understand it. He told me plainly to forget about trying to use 
> typeid and write equivalent code from scratch instead. That 
> code has achieved lifetime employment.
>
> And no compiler tricks. I am very, very opposed to Walter's 
> penchant to reach into his bag of tricks whenever a difficult 
> problem comes about. Stop messing with the language! My 
> discussions with him on such matters are like trying to talk a 
> gambler out of the casino.
>
> That is a long way to say I am overwhelmingly in favor of 
> in-language solutions as opposed to yet another addition to the 
> language. To me, if I have an in-language solution, it's game, 
> set, and match. No need for language changes, thank you very 
> much. An in-language solution doesn't only mean no addition is 
> needed, but more importantly it means that the language has 
> sufficient power to offer D coders means to solve that and many 
> other problems.
>
> My heart almost skipped a beat when Adam mentioned - 
> mid-sentence! - just yet another little language addition to go 
> with that DIP for interpolated strings. It would make things 
> nicer. You know what? I think I'd rather live without that DIP.
>
> So I'm interested in exploring reification mainly - 
> overwhelmingly - because it already works in the existing 
> language. To the extent it has difficulties, it's because of 
> undue limitations in CTFE and introspection. And these are 
> problems we must fix anyway. So it's all a virtuous circle. 
> Bringing myself to write `reify` and `dereify` around my types 
> is a very small price to pay for all that.
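
Since reify/dereify keeps coming up in the abstract, here is my 
toy understanding of the pattern - a sketch, not your actual 
design: wrap a type in a value so ordinary CTFE code can pass it 
around, then unwrap on the other side.

```d
// A sketch of the reify/dereify idea, with invented names.
struct Reified(T) {}                  // a value standing in for T

enum reify(T) = Reified!T();          // type -> value
alias dereify(R : Reified!T, T) = T;  // value's type -> type

// With types as values, ordinary functions replace recursive
// templates for type-level logic:
auto pickFirst(R1, R2)(R1 a, R2 b) { return a; }

void main()
{
    auto r = reify!int;
    static assert(is(dereify!(typeof(r)) == int));

    auto p = pickFirst(reify!int, reify!long);
    static assert(is(dereify!(typeof(p)) == int));
}
```
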

My thanks for the effort apparent in your composition.  The main 
lesson I take from it is that language primitives should be 
chosen using as much light as we can bring to bear.

I believe that language additions can be used to rebase the 
language on a better set of primitives.  I believe that there is 
more to learn about meta programming and the language features 
that support it and that the D platform may be the best place to 
learn.  Finally, I believe that a postulated ideal of "a small 
set of primitives" will always be better than any actual set 
derived from experience on the frontier.

Perhaps because of your strong stand against additions generally, 
you've not taken a serious look at type functions in particular.  
I say this because I don't have another way to understand your 
writings on the reify/dereify design.

If you elect to investigate type functions, I would very much 
like to hear your opinion.  I see type functions as an "it just 
works", "rebasing" addition that lets us displace inappropriate 
uses of templates today, and that may reveal further 
simplification opportunities with use.
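
To make the comparison concrete, here is today's 
recursive-template style next to a type-function version.  The 
type function syntax below is my reading of the proposal, not 
shipping D, so take it as a sketch of intent rather than a spec.

```d
import std.meta : AliasSeq;

// Today: filtering a type list takes template recursion.
template NumericOnly(Ts...)
{
    static if (Ts.length == 0)
        alias NumericOnly = AliasSeq!();
    else static if (__traits(isArithmetic, Ts[0]))
        alias NumericOnly = AliasSeq!(Ts[0], NumericOnly!(Ts[1 .. $]));
    else
        alias NumericOnly = NumericOnly!(Ts[1 .. $]);
}

static assert(NumericOnly!(int, string, double).length == 2);

// With type functions (hypothetical syntax), the same thing is an
// ordinary CTFE loop over types treated as first-class values:
//
//   alias[] numericOnly(alias[] types)
//   {
//       alias[] result;
//       foreach (t; types)
//           if (__traits(isArithmetic, t))
//               result ~= t;
//       return result;
//   }
```
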

If you do not elect to investigate type functions, I expect we'll 
next interact on a different topic.

Again, my thanks for the thoughtful composition.



More information about the Digitalmars-d mailing list