Getting the overload set of a template

Alex sascha.orlov at gmail.com
Mon Apr 23 16:52:11 UTC 2018


On Monday, 23 April 2018 at 16:16:09 UTC, Arafel wrote:
> ```
> import std.meta;
>
> void main()
> {
>     pragma(msg, __traits(getMember, A, "Foo1").stringof); // Foo1(int N) if (N & 1)
>     pragma(msg, __traits(getAttributes, __traits(getMember, A, "Foo1"))[0]); // tuple("int", "odd")
>     alias f1a = Instantiate!(__traits(getMember, A, "Foo1"), 1); // This is expected
>     pragma(msg, f1a); // A
>     alias f1b = Instantiate!(__traits(getMember, A, "Foo1"), "+"); // Why would I know that I can even instantiate?? Also, can I haz UDA plz?
>     pragma(msg, f1b); // B
> }
>
> class A {
>     @("int", "odd")
>     template Foo1(int N) if (N & 1) {
>         enum Foo1 = "A";
>     }
>     @("string", "+")
>     template Foo1(string op) if (op == "+") {
>         enum Foo1 = "B";
>     }
> }
> ```

I'm not arguing about the case of different interfaces. That case 
is more or less fine, since the different argument types make it 
unambiguous which template will be instantiated. My concern is the 
case of templates that are differentiated only by their structure 
and/or constraints.

In that case, it is almost certain that more than one form of 
implementation exists. However, the forms yield the same semantic 
result, and I'm wondering why the implementation form alone should 
lead to differentiation.
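
To make the concern concrete, here is a minimal sketch (with a 
hypothetical name `Bar`) of two eponymous templates whose parameter 
lists are identical, so only the constraint decides which one is 
instantiated:

```d
// Hypothetical example: both overloads take a single int parameter;
// the parameter lists are indistinguishable, and only the template
// constraints select between the two implementations.
template Bar(int N) if (N & 1)    // matches odd N
{
    enum Bar = "odd";
}

template Bar(int N) if (!(N & 1)) // matches even N
{
    enum Bar = "even";
}

void main()
{
    pragma(msg, Bar!3); // prints "odd" at compile time
    pragma(msg, Bar!4); // prints "even" at compile time
}
```

Here the constraints are mutually exclusive, so instantiation is 
unambiguous; if they overlapped, instantiation would fail with an 
ambiguity error.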


More information about the Digitalmars-d-learn mailing list