So why was typedef bad?
Walter Bright via Digitalmars-d
digitalmars-d at puremagic.com
Sun Sep 4 01:23:33 PDT 2016
On 8/31/2016 7:31 AM, Ethan Watson wrote:
> On Wednesday, 31 August 2016 at 14:05:16 UTC, Chris Wright wrote:
>> Specifying the default value for the type.
>
> Alias has the same problem in this case.
Alias is not a different type from the underlying type. So having a different
default value makes no sense.
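A minimal sketch of the point, using `std.typecons.Typedef` from the standard library (its second parameter, the default initializer, is real API; the names `Money` and `Balance` are illustrative):

```d
import std.typecons : Typedef;

alias Money = int;                  // just another name: Money.init is int.init (0)
alias Balance = Typedef!(int, 100); // a distinct type, with its own default value

void main()
{
    Money m;   // an alias cannot have a different default; m is 0
    Balance b; // starts at 100
    assert(m == 0);
    assert(b == 100);
}
```

Because `alias` introduces only a new name, there is no separate type to hang a different `.init` on; `Typedef` can, precisely because it creates a new type.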
> I'm making a distinction between a typedef and a type mimic here because
> C++ interop is a big factor in our usage, so mixing up concepts between a
> language that's meant to make that easy is not ideal. Although looking at
> std.typecons.Typedef, I'd wonder if a typemimic language feature would have been
> a better way to go...
I don't understand. C++ typedef and D alias are semantically equivalent.
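To illustrate the equivalence (a sketch; `Index` is an illustrative name):

```d
// C++:  typedef uint Index;   or equivalently:  using Index = uint;
// D:    alias Index = uint;
alias Index = uint;

void main()
{
    Index i = 3;
    uint  u = i; // no conversion involved: Index *is* uint
    static assert(is(Index == uint)); // the compiler sees one and the same type
    assert(u == 3);
}
```

In both languages the declaration introduces a new name for an existing type, not a new type.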
D's typedef was dropped because nobody could come up with a coherent explanation
of how implicit conversions should work to/from the base type. This is not the
simple problem it appears to be. There are all kinds of issues: how does type
deduction work, how does type inference interact, when are the types the same,
when are they different, what are the promotion rules, and so on. Furthermore,
they add a lot of complexity to generic templates.
We finally just abandoned it.