So, User-Defined Attributes

Timon Gehr timon.gehr at gmx.ch
Sun Jan 6 19:27:04 PST 2013


On 01/07/2013 01:38 AM, Walter Bright wrote:
> On 1/6/2013 2:24 PM, deadalnix wrote:
>> On Saturday, 5 January 2013 at 22:14:47 UTC, Walter Bright wrote:
>>> On 1/5/2013 2:06 PM, Philippe Sigaud wrote:
>>>> But why is @(MyType) accepted, whereas @(int) is not?
>>>
>>> Because it's looking for an expression inside the parens, and int
>>> is not an expression.
>>
>> And MyType is an expression?
>
> Parsing happens before semantic analysis. Hence, MyType looks like an
> expression.

Sure, that is how the compiler currently works. But the compiler is an 
inadequate reference at this point. (E.g., I am still reducing the 
massive breakage introduced by 2.061 regressions -- mostly 'forward 
reference' errors, which are mentioned nowhere in the spec and were 
seemingly introduced in order to 'fix' ICEs.)

But why does this behaviour make sense from the standpoint of 
language design?
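
Concretely, the asymmetry looks like this (a minimal sketch of the 
behaviour described above; the exact diagnostics depend on the 
compiler version):

    struct MyType {}

    @(MyType) int a;  // accepted: MyType parses as a symbol expression
    //@(int)  int b;  // rejected: 'int' is a basic type, not an expression

    void main() {}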

The compiler should obviously use the part of the parser that parses 
template argument lists to parse UDAs. I am surprised this is not what 
is done.
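
Template argument lists already accept types and expressions 
uniformly, which is what makes that reuse attractive (a sketch of the 
argument, not of what 2.061 actually does):

    struct MyType {}
    struct List(Args...) {}

    // A basic type, a user-defined type, and a value expression all
    // parse in a template argument list:
    alias A = List!(int, MyType, 42);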

Also, this particular problem would not exist in the first place if 
basic types were just treated like expressions in the parser. I am not 
familiar with DMD, but I bet the special-casing of built-ins leaves an 
ugly trail in the code base.
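
The parser already has to accept basic type keywords at the head of an 
expression in a few places, so treating them as primary expressions 
outright would not be a big leap (these compile today):

    auto a = int.max;        // basic type keyword in expression position
    auto b = int.sizeof;     // likewise
    auto c = cast(int) 3.5;  // and in type position within a cast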

