Ruling out arbitrary cost copy construction?

Andrei Alexandrescu SeeWebsiteForEmail at erdani.org
Wed Oct 6 16:03:45 PDT 2010


On 10/6/10 16:38 CDT, Michel Fortin wrote:
> On 2010-10-06 16:14:58 -0400, Andrei Alexandrescu
> <SeeWebsiteForEmail at erdani.org> said:
>
>> On 10/6/10 13:59 CDT, Michel Fortin wrote:
>>> Could algorithms use "move" when possible (when "front" returns a ref)
>>> and do a copy when move isn't available? That'd be a good compromise I
>>> think.
>>>
>>> T tmp = moveOrCopy(r1.front);
>>> r1.front = moveOrCopy(r2.front);
>>> r2.front = moveOrCopy(tmp);
>>
>> The problem is, people sort ranges of their own objects and are
>> amazed at the unbearable performance.
>
> I'll repeat myself in clearer terms:
>
> I agree with you that copy-constructor should be of linear complexity.

But I'm asking for constant complexity. We can only keep the analysis 
simple if copy construction has no impact on overall complexity.

> I agree that you should use a copy when you can't do a move (when front
> doesn't return a ref).

But that makes it impossible for an algorithm to guarantee good 
complexity bounds.

> I disagree that you should use a copy even when you can do a move (when
> front *does* return a ref).

Alright.

> Therefore, I suggest that you use "move" when you can (when front
> returns a ref) and fallback to a copy otherwise (when front doesn't
> return a ref). Hence "moveOrCopy", which moves when it can move and
> copies when it cannot move.
>
> How can this result in poorer performance than to always copy? It can
> only increase performance because you're copying less. (I understand
> that it was probably just a misunderstanding of my suggestion.)

It doesn't lead to poorer performance than copying. The only issue is 
that now you need to carry the cost of copying everywhere as a possible 
term/factor in the complexity analysis. Also, many algorithms would 
approach things differently if the copy were O(n). This complicates, 
not simplifies, things.


Andrei



More information about the Digitalmars-d mailing list