Yearly "can we please have void foo() auto" post

Paul Backus snarwin at gmail.com
Fri Jul 28 00:14:15 UTC 2023


On Thursday, 27 July 2023 at 22:08:27 UTC, Steven Schveighoffer 
wrote:
> I think it would be nicer to specify exact attributes than to 
> "opt out" of inference.
>
> For `@safe`, for example, we are covered, because there is 
> `@system`. For `nothrow`, there is `throw` (I think?). But 
> `@nogc` and `pure` have no opposites.
>
> I would like to see some mechanism or additional attributes 
> that make it so you can specify what should not be inferred 
> before turning on full inference.

I'm 100% in favor of adding `@gc` and `@impure`. Unfortunately, 
every time this topic comes up, the discussion degenerates into 
endless bikeshedding and no progress is made.
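
For concreteness, here is the asymmetry in question (a D sketch; 
`@gc` and `@impure` are the proposed attributes and are not valid 
D today):

```d
// @safe has an explicit opposite, so a broad label can be
// selectively undone:
@safe:                       // everything below defaults to @safe
@system void lowLevel() {}   // opted back out, function by function

// But @nogc and pure have no opposites: once a `@nogc:` or `pure:`
// label is in effect, no attribute can exempt a single function.
// The proposed additions (hypothetical) would fill that gap:
// @nogc:
// @gc void allocates() {}   // hypothetical opt-out, not valid D
```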

IMO this would be a perfect opportunity for Walter to exercise 
his BDFL power and push through a decision unilaterally.

> One large problem with all this is people who "don't care" 
> about attributes. Of course, now their code will infer 
> attributes! But that has become part of the public API. If they 
> change something to change the attribute inference, all of a 
> sudden code that is marked explicitly will stop compiling, and 
> the author will say "too bad, not my fault".

Yes, this is the main pain point of universal attribute 
inference. OTOH, with the current system, the pain point is that 
you often can't even use a library in the first place, because 
the author forgot to add an explicit attribute that the compiler 
*could* have inferred--and the author will say, "not my problem, 
I don't care about attributes."

Ultimately, I think universal inference comes out on top, because 
"I can't upgrade `<package>`" is a less severe failure mode than 
"I can't use `<package>` at all." But I accept that reasonable 
people can disagree on this.
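
A minimal illustration of the breakage described above, using an 
`auto`-return function (which D already subjects to attribute 
inference); the library and caller names are hypothetical:

```d
// Hypothetical library function, no explicit attributes.
// With inference, the compiler deduces @nogc nothrow @safe pure.
auto firstByte(string s) { return s.length ? s[0] : char.init; }

// A downstream user relies on the *inferred* attributes:
@nogc void user(string s) { auto c = firstByte(s); }

// If the library author later changes the body to allocate
// (e.g. building an error message with `~`), the inferred @nogc
// silently disappears and `user` stops compiling -- even though
// the function's written signature never changed.
```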

> I think the only way to fix this is to only infer when 
> explicitly stated.
>
> And to that effect... we could just make the storage class 
> `auto` mean that, like the OP says.
>
> I mean, we already allow `static void foo()`, why not `auto 
> void foo()`?
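
The proposal in the quote, sketched in D (the `auto void` form is 
exactly what is being asked for and is rejected by the compiler 
today):

```d
// Today: an `auto` return type already triggers both return-type
// and attribute inference.
auto f() { return 1; }   // attributes inferred

// An explicit return type gets no inference for ordinary functions:
void g() {}              // treated as @system, impure, throwing, GC

// The proposal: allow `auto` as a storage class alongside an
// explicit return type, opting in to attribute inference only.
// auto void foo() {}    // proposed syntax; not valid D today
```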

Requiring the programmer to take *any action at all* to enable 
inference is too much of an obstacle. People naturally take the 
path of least resistance; as language designers, it's our job to 
make that path the correct one.

Not to mention, there is a non-trivial amount of existing D code 
that would likely never be updated to use `auto` for inference, 
but *would* benefit from having inference enabled by default.


More information about the Digitalmars-d mailing list