Local functions infer attributes?

Manu via Digitalmars-d digitalmars-d at puremagic.com
Mon Sep 29 17:32:59 PDT 2014


On 30 September 2014 09:25, deadalnix via Digitalmars-d
<digitalmars-d at puremagic.com> wrote:
> On Monday, 29 September 2014 at 08:02:45 UTC, Andrei Alexandrescu
> wrote:
>>
>> I understand. The short answer to this is D cannot do that and we cannot
>> afford to change the language to make it do that.
>>
>> There are two longer answers.
>>
>
> I think this is because ref has several conflated meanings:
>   - "I want to mess with the argument" (à la swap). This is the
> meaning it has right now.
>   - "Borrowing". Which is the same as the previous behaviour, except
> for classes and delegates, so the whole scope story, as they are
> natural "reference types".
>   - "Do not copy", aka const ref. This one is for performance
> reasons. It doesn't really matter whether a copy is made, as the
> damn thing is const, but one wants to avoid an expensive copy when
> the thing passed down is fat.
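
A minimal sketch of those three uses, for reference (the names and the
Fat struct are made up for illustration):

import std.algorithm : swap;

struct Fat { int[256] payload; }

// 1. "Mess with the argument": the callee mutates the caller's variable.
void bump(ref int x) { x += 1; }

// 2. "Borrowing": pass an lvalue down without copying or taking ownership.
void inspect(ref Fat f) { f.payload[0] = 42; }

// 3. "Do not copy", aka const ref: purely about avoiding an expensive copy.
int total(const ref Fat f)
{
    int sum = 0;
    foreach (v; f.payload) sum += v;
    return sum;
}

void main()
{
    int a = 1, b = 2;
    swap(a, b);          // the classic "mess with the argument" case

    Fat fat;
    bump(a);
    inspect(fat);
    auto s = total(fat); // no ~1KB copy of Fat is made
}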

I don't see ref that way at all. I see it as much simpler than that:
ref is a kind of pointer. It's effectively T(immutable(*)).
Its uses emerge from what it is: a good way to pass big things around
in argument lists, or to share references to a single instance of
something.
It also offers an advantage because the pointer itself is immutable:
you don't need the pointer syntax baggage (*, &) when dealing with
refs, which is very convenient in user code.
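
A rough sketch of what I mean (the names are made up; both functions
receive the same address under the hood):

struct Big { int[1024] data; }

// Pointer version: the caller takes the address, the callee
// dereferences, and the pointer itself could be reseated elsewhere.
void fillPtr(Big* b) { (*b).data[0] = 1; }

// ref version: the same address is passed underneath, but it can't be
// reseated, and none of the *, & baggage shows up in user code.
void fillRef(ref Big b) { b.data[0] = 1; }

void main()
{
    Big big;
    fillPtr(&big); // explicit address-of
    fillRef(big);  // plain value syntax, passed by reference underneath
}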

I think all this 'meaning' is a mistake; all it does is confuse the
matter. You don't attribute such 'meaning' to a normal pointer; it's a
primitive type.
If ref is defined by some conceptual rubbish, then whenever you try to
use it in a situation that doesn't perfectly fit that conceptual
attribution, you find yourself in awkward logical situations with
weird edge cases.
If you just say "it's an immutable pointer, and follows value syntax
semantics for practical reasons", then it's very clear what it does.
It's also very clear what that means for the ABI, and for
relationships with other languages.
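
The interop story falls straight out of that view; a quick sketch (the
function names are made up, and the C++ signature is how I understand
the extern(C++) mapping to work):

// If ref is just an immutable pointer, the ABI mapping is simple.

extern(C++) void scale(ref double value, double factor);
// should match the C++ declaration: void scale(double& value, double factor);

extern(C) void scaleC(double* value, double factor);
// the C equivalent just spells the pointer out explicitly

void main()
{
    double d = 2.0;
    // scale(d, 3.0);   // would call the C++ implementation, no & needed
    // scaleC(&d, 3.0); // same operation through the C interface
}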

A similar mistake from C is that 'int' doesn't mean '32 bits'; it
means some conceptual nonsense that seemed like a good idea at the
time. The result is that you don't really know what 'int' is, and
everyone reinvents it so it works reliably (typedef'ing their own
int32 or the like).


> Each of them have their own set of cases where you want and do
> not want ref.

I agree, the cases where ref should be used are non-trivial. The
decision may come from external factors, some user logic, or an
explicit request from user code. The fact that ref lives outside the
type system makes that whole reality very awkward indeed.
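
To illustrate "lives outside the type system", a quick sketch; the
commented-out lines are the sort of thing the compiler rejects, as far
as I know:

struct S
{
    // ref int member;  // rejected: ref isn't part of the type, so no ref fields
}

void demo()
{
    int x = 10;
    // ref int r = x;   // rejected: ref is only accepted on parameters,
    //                  // returns and foreach variables
    foreach (ref v; [1, 2, 3])
        v += 1;          // one of the few places a ref local is allowed
}

// alias R = ref int;   // rejected: 'ref int' isn't a type you can name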


