First Draft (in this forum): Enum Parameters

Steven Schveighoffer schveiguy at gmail.com
Wed May 8 01:37:16 UTC 2024


On Tuesday, 7 May 2024 at 17:44:25 UTC, Paul Backus wrote:
> On Tuesday, 7 May 2024 at 16:34:52 UTC, Steven Schveighoffer 
> wrote:
>> I don't think this is going to be the case. Consider that 
>> without explicitly forwarding, the difficulty is not in 
>> preserving refness, but non-refness. If care is not taken 
>> (e.g. via using forward), then a non-ref parameter suddenly 
>> turns into a ref parameter (implicitly). This is not the case 
>> for enum parameters -- enum-ness is preserved whether the 
>> parameter is an enum or not.
>
> If a single function in a call stack forgets to pass a given 
> parameter by `ref`, it will be copied. If your code relies on 
> the parameter not being copied (e.g., if it has a disabled copy 
> constructor), this failure to preserve `ref` results in a bug.
>
> If a single function in a call stack forgets to pass a given 
> parameter by `enum`, it will be evaluated at runtime. If your 
> code relies on the parameter being evaluable at compile time 
> (e.g., if it uses it as a template argument or in a `static if` 
> condition), this failure to preserve `enum` results in a bug.
>
> The two are exactly analogous.

If an entire ref chain is broken by a single copy, then the 
*semantics* are different: all of a sudden, you aren't affecting 
the original argument.

If an entire enum chain is broken by a single runtime evaluation, 
the result is either a loud error (the *code does not compile*) or 
a silent acceptance *that does not affect correctness*. Of course, 
this assumes that the `auto enum` parameter in question does not 
do something drastically different based on enum-ness.

By example:

```d
void foo(auto ref int x) {
    bar(x);
}

void bar(int x) {
    baz(x); // oops, this does not affect the caller's x
}

void baz(auto ref int x) {
    ++x;
}
```

And with `enum` parameters (using the proposed `auto enum` syntax):

```d
import std.stdio : writeln;

void foo(auto enum int x) {
    bar(x);
}

void bar(int x) {
    baz(x);
}

void baz(auto enum int x) {
    writeln(x); // same whether bar is enum or not, no semantic difference
}
```

Yes, if `baz` accepts by enum only, you will get a *compiler 
error*. Contrast that with `ref`, where you get no indication 
that something is wrong, and it is semantically wrong.
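
To make that failure mode concrete, here is a minimal sketch 
(assuming the DIP's proposed enum-only parameter syntax, which 
does not exist in D today):

```d
// hypothetical enum-only parameter, per the proposal
void baz(enum int x)
{
    // x is usable at compile time here
}

void bar(int x)
{
    baz(x); // error: x is not known at compile time -- a loud failure,
            // rather than the silently wrong behavior in the `ref` example above
}
```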

>
>> Yes, generic code that wishes to forward enum-ness to another 
>> thing must use auto enum to accomplish this. However, 
>> comparing to the current mechanism, things are pretty 
>> inferior. Switching from an enum parameter to a runtime 
>> parameter involves *changing the place where parameters are 
>> put*. You currently have to remember to create a nested 
>> template with an outer template that takes a tuple for this to 
>> work correctly (something which I doubt many generic functions 
>> do).
>>
>> Put another way, if the cost of having the convenience of enum 
>> parameters is that we must also allow auto enum, I don't think 
>> this is a large enough drawback.
>
> The argument here is essentially that it is worth it to make 
> the library *author's* job more difficult in order to give the 
> library *users* a better experience.

No, I think it makes both easier.

Consider the case of `writef`. We have two versions of `writef`: 
one takes the format string as a runtime parameter, and one takes 
it as a compile-time parameter.

They have to be called in different ways (a burden on the user), 
and you have to write two different functions and overload them 
based on template constraints (a burden on the developer). Compare 
the signatures, and the call sites sketched after them:

```d
    // current: compile-time format string (passed as a template argument)
    void writef(alias fmt, A...)(A args)
    if (isSomeString!(typeof(fmt)))
    { ... }

    // current: runtime format string
    void writef(Char, A...)(in Char[] fmt, A args)
    { ... }

// vs

    // proposed: a single overload handles both cases
    void writef(Char, A...)(auto enum Char[] fmt, A args)
    // note the lack of template constraints needed
    { ... }
```
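
For illustration, here is roughly how the call sites compare. The 
`auto enum` calls are only a sketch of the proposal (the syntax 
does not exist today), and `runtimeFmt` is a placeholder name for 
a format string computed at runtime:

```d
import std.stdio : writef;

void main()
{
    // today: two overloads, two call syntaxes
    writef!"%s = %s\n"("x", 42);  // compile-time format, passed as a template argument
    writef("%s = %s\n", "x", 42); // runtime format, passed as an ordinary argument

    // with the proposed `auto enum` parameter, both forms would go
    // through the same parameter list (sketch only):
    //   writef("%s = %s\n", "x", 42); // literal visible at compile time
    //   writef(runtimeFmt, "x", 42);  // format known only at runtime
}
```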

It also has the benefit of being easier to explain and easier to 
document.

> In general, I agree. But there's a point at which this argument 
> falls apart. If you make the library author's job so difficult 
> that they can no longer write correct code, it is library users 
> who will ultimately suffer.

I don't think this case has been made, even with `auto enum`.

> Personally, I think the `ref` storage class *already* crosses 
> this line. The vast majority of generic D code does not handle 
> non-copyable types correctly, and the design of `ref` is 100% 
> to blame for this. Since `enum` uses the exact same design as 
> `ref`, I think it's pretty reasonable to assume that it will 
> cause the same kinds of problems.

I can't see how `ref` is responsible for this. Non-copyable types 
are difficult to work with in general. Can you explain further?
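
For readers following along, here is a minimal sketch, in current 
D, of the kind of non-copyable type in question (a struct with a 
disabled postblit) and why a by-value parameter rejects it:

```d
struct NoCopy
{
    int value;
    @disable this(this); // copying is disabled
}

void byValue(NoCopy n) {}    // needs a copy (or a move from an rvalue)
void byRef(ref NoCopy n) {}  // no copy needed

void main()
{
    auto n = NoCopy(1);
    byRef(n);      // fine
    // byValue(n); // error: NoCopy is not copyable
}
```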

>
> So, when I evaluate this DIP, here's the tradeoff I'm looking 
> at:
>
> * **Pro:** more convenient library APIs (e.g., `format`).
> * **Con:** more bugs in generic library code.

I have a different assessment of both of these points.

-Steve

