[just talk] what if implicitly typed literals were disallowed
H. S. Teoh
hsteoh at quickfur.ath.cx
Wed Oct 24 16:17:40 PDT 2012
On Thu, Oct 25, 2012 at 12:55:36AM +0200, Adam D. Ruppe wrote:
> On Wednesday, 24 October 2012 at 19:00:03 UTC, Timon Gehr wrote:
[...]
> >If your goal is to be able to customize the type of literals from
> >druntime, it is also possible to make the compiler invoke some
> >predefined templates/functions in order to transform literals to an
> >arbitrary library-defined type.
>
> OOooh, I like it. Then kill off the keywords while you're at it and
> boom.
But there might be instances where you want to refer to the built-in
types, so keeping the keywords around (perhaps suitably renamed) still
seems necessary.
> alias _int int;
> /* repeat for the rest */
>
> template __d_literal(T, T value) {
> alias value __d_literal;
> }
>
> and you have the current behavior, but complete customization
> potential. That's potentially awesome. You could even wrap literals
> just to identify them and overload functions on them.
>
> whoa i kinda want.
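Concretely, the quoted sketch amounts to something like this (a hedged reading of it; `_int` and `__d_literal` are Adam's hypothetical names, not real compiler hooks):

```d
// The built-in type kept reachable under a reserved name:
alias int _int;

// Default literal hook: the compiler would rewrite every literal, e.g.
// 42, into __d_literal!(_int, 42), which a library could override.
template __d_literal(T, T value)
{
    alias value __d_literal;    // default behavior: the literal as-is
}

static assert(__d_literal!(_int, 42) == 42);
```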
I like this idea too. Coupled with CTFE, it could let you rewire the
language in novel ways. I mean, think about this one: initialize a
complicated multi-dimensional array with a custom string notation,
expressed as a token string, and have your custom class/struct transform
that into a compile-time initialization of said array.
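A minimal sketch of that, assuming a hypothetical `parseMatrix` helper (nothing here is a real API; CTFE already does the heavy lifting today, the proposal would just let a literal trigger it):

```d
// Parse a custom "rows separated by ';', cells by ','" notation into
// a 2-D array. Because this is plain D, CTFE can run it at compile time.
int[][] parseMatrix(string s)
{
    import std.algorithm : map, splitter;
    import std.array : array;
    import std.conv : to;
    return s.splitter(';')
            .map!(row => row.splitter(',').map!(to!int).array)
            .array;
}

enum m = parseMatrix("1,2;3,4");      // evaluated at compile time
static assert(m == [[1, 2], [3, 4]]);
```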
And what about _transparent substitution_ of AA literals for a custom
hash implementation? You could even make std.container accept AA
literals as initializers for hashes, and have CTFE transform them into
the appropriate ctor/init calls. AA literals could serve all kinds of
user-defined types. You could even use them to implement a *library*
solution for Python's named-parameter function calling, by turning them
into the appropriate tuples.
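For instance (a hedged sketch; `draw` and its options are made up): today an AA literal already works as runtime named parameters, and under the proposal the same literal could instead be lowered to a compile-time tuple.

```d
// Python-style named arguments via an AA literal, runtime version:
void draw(int[string] opts)
{
    auto x = opts.get("x", 0);   // named parameter with a default
    auto y = opts.get("y", 0);
    // ... draw at (x, y) ...
}

void main()
{
    draw(["x": 10, "y": 20]);
}
```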
You might even be able to make BigNum literals without using strings (or
am I dreaming too much about this one?).
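To illustrate what I mean (sketch only; the commented-out line is the hypothetical part):

```d
import std.bigint;

// Today, values beyond what fits in a ulong must be spelled as strings:
auto n = BigInt("123456789012345678901234567890");

// With the literal hook, the compiler could hand the raw token to the
// library, so this hypothetically becomes legal:
// auto n = 123456789012345678901234567890;   // lowered to BigInt(...)
```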
Library implementation of quad-precision floats.
Or a library implementation of int.isPrime, which computes an int
literal's primality at compile time?
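The primality check itself is already CTFE-able today (sketch; `isPrime` is not an actual int property, just a free function here):

```d
// Trial division; plain D, so CTFE can evaluate it at compile time.
bool isPrime(int n)
{
    if (n < 2) return false;
    for (int d = 2; d * d <= n; ++d)
        if (n % d == 0) return false;
    return true;
}

static assert(isPrime(7));
static assert(!isPrime(9));
```

The missing piece is only the hook that would let `7.isPrime` resolve through a library-defined literal type.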
(P.S. My perl script doth mock me with the random signature it selected
below. :-P)
T
--
Just because you can, doesn't mean you should.