Reducing template constraint verbosity? [was Re: Slides from my ACCU Silicon Valley talk]
Max Samukha
spambox at d-coding.com
Tue Dec 14 09:26:34 PST 2010
On 12/14/2010 06:09 PM, Andrei Alexandrescu wrote:
> On 12/14/10 10:08 AM, Steven Schveighoffer wrote:
>>
>> On Tue, 14 Dec 2010 10:47:05 -0500, Andrei Alexandrescu
>> <SeeWebsiteForEmail at erdani.org> wrote:
>>
>>> On 12/14/10 7:33 AM, biozic wrote:
>>
>>>> I have a question about this and some pieces of code in the standard
>>>> library, notably std.algorithm: some of the templated functions use
>>>> template constraints even when no template overloading is taking place.
>>>> Wouldn't some static asserts help print more accurate messages when
>>>> these functions are misused?
>>>>
>>>> Nicolas
>>>
>>> The intent is to allow other code to define functions such as e.g.
>>> "map" or "sort". Generally any generic function should exclude via a
>>> constraint the inputs it can't work on. That way no generic function
>>> chews off more than it can bite.
>>
>> Redirected from another thread.
>>
>> Having written a few of these functions with template constraints, I
>> wondered if there was ever any discussion/agreement on reducing
>> verbosity when specializing template constraints?
>>
>> For instance, if you want two overloads of a template, one which accepts
>> types A and one which accepts types B where B implicitly converts to A
>> (i.e. a specialization), you need to explicitly reject B's when defining
>> the overload for A's. For example:
>>
>>
>> void foo(R)(R r) if(isRandomAccessRange!R) {...}
>>
>> void foo(R)(R r) if(isInputRange!R && !isRandomAccessRange!R) {...}
>>
>>
>> It seems redundant to specify !isRandomAccessRange!R in the second
>> overload, but the compiler will complain otherwise. What sucks about
>> this is the 'definition' of the first overload is partially in the
>> second. That is, you don't really need that clause in the second
>> overload unless you define the first one. Not only that, but it makes
>> the template constraints grow in complexity quite quickly. Just look at
>> a sample function in std.array that handles 'the default' case:
>>
>>
>> void popFront(A)(ref A a) if(!isNarrowString!A && isDynamicArray!A &&
>> isMutable!A && !is(A == void[]))
>>
>> Any idea how this can be 'solved' or do we need to continue doing things
>> like this? My naive instinct is to use the declaration order to
>> determine a match (first one to match wins), but that kind of goes
>> against other overloads in D.
>
> I thought of a number of possibilities; none was good enough. I
> decided this is a small annoyance I'll need to live with.
>
> Andrei
Should we think about making constrained templates more specialized than
unconstrained ones? For example, the case where I need one or more
constrained templates plus an unconstrained catch-the-rest template comes
up quite often. Consider the current behavior:
void foo(T : int)(T x)
{
}
void foo(T)(T x) // catches types not implicitly castable to int
{
}
foo(1); // ok
But:
void foo(T)(T x) if (is(T : int))
{
}
void foo(T)(T x)
{
}
foo(1);
Error: template test.foo(T) if (is(T : int)) foo(T) if (is(T : int))
matches more than one template declaration,
test.d(7): foo(T) if (is(T : int)) and test.d(11): foo(T)
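To be clear, the accepted workaround today is the one Steven described:
negate the first overload's constraint in the catch-the-rest overload. A
minimal sketch, assuming both overloads live in the same module:

void foo(T)(T x) if (is(T : int))
{
}

void foo(T)(T x) if (!is(T : int)) // must explicitly exclude the first
{
}

foo(1); // ok: exactly one constraint is satisfied

This compiles, but it is exactly the redundancy being complained about:
the second overload has to restate (and negate) the first one's condition.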
Also note that the compiler doesn't detect the ambiguity when one of the
overloads uses a specialization in the template parameter list instead:
void foo(T : int)(T x)
{
}
void foo(T)(T x) if (is(T == int))
{
}
foo(1); // first template is instantiated
Essentially, the interactions between template parameter specializations
and if-constraints are not specified. What do you think?
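One way to at least tame the verbosity in the meantime is to name the
compound condition once and reuse it. A minimal sketch based on the
popFront signature Steven quoted from std.array (the helper name
isDefaultArrayCase is hypothetical, not something in Phobos):

import std.traits : isNarrowString, isDynamicArray, isMutable;

// Names the "default array" condition so overloads can share it.
template isDefaultArrayCase(A)
{
    enum isDefaultArrayCase = !isNarrowString!A && isDynamicArray!A
        && isMutable!A && !is(A == void[]);
}

void popFront(A)(ref A a) if (isDefaultArrayCase!A)
{
    a = a[1 .. $];
}

This doesn't remove the need to negate the condition in sibling
overloads, but the negation becomes !isDefaultArrayCase!A instead of four
repeated clauses.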