Can assumeSafeAppend() grab more and more capacity?
Ali Çehreli via Digitalmars-d-learn
digitalmars-d-learn at puremagic.com
Mon Jun 5 16:17:46 PDT 2017
On 06/05/2017 03:16 PM, ag0aep6g wrote:
> The spec says [1]: "one may use the .capacity property to determine how
> many elements can be appended to the array without reallocating." So the
> space indicated by `.capacity` is reserved for the array.
Cool. Thanks!
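For what it's worth, that guarantee is easy to check at runtime (a minimal sketch; the exact capacity value depends on the runtime's block sizes, so the code only relies on capacity >= length):

```d
void main() {
    int[] arr = [1, 2, 3, 4];
    const cap = arr.capacity;   // elements appendable without reallocating
    const oldPtr = arr.ptr;

    // Appending up to .capacity must not move the array.
    while (arr.length < cap)
        arr ~= 0;

    assert(arr.ptr is oldPtr);  // still in the same memory block
}
```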
>> 3) Bonus: Shouldn't the array specialization of std.algorithm.remove
>> call assumeSafeAppend if the array has capacity to begin with? (The
>> equivalent of following code?)
>>
>> const oldCap = arr.capacity;
>> // ... do std.algorithm.remove magic on arr ...
>> if (oldCap) {
>>     arr.assumeSafeAppend();
>> }
>>
>> I'm aware that there can be multiple slices with non-zero capacity
>> until one of them grabs the capacity for itself but it's ok for
>> remove() to give the capacity to just one of them.
>
> Seems safe, but you'll have to justify claiming the capacity like that.
My justification was that it already feels like a bug to have multiple 
slices into data that one of them is about to remove() elements from 
(since that jumbles the other slices' elements). My thinking was: if 
capacity isn't guaranteed for any particular slice to begin with, why 
not pull it out from under some slices arbitrarily? But I agree with you 
that remove() should still not decide that on its own.
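For concreteness, here is the pattern under discussion as a runnable snippet (a sketch; remove() here is std.algorithm.mutation.remove, which shuffles elements in place and returns the shortened slice):

```d
import std.algorithm.mutation : remove;

void main() {
    int[] arr = [1, 2, 3, 4, 5];
    const oldCap = arr.capacity;

    arr = arr.remove(2);          // drops the element at index 2, in place
    assert(arr == [1, 2, 4, 5]);

    if (oldCap) {
        // The slice got shorter but still owns the block's tail;
        // reclaim it so future appends don't reallocate.
        arr.assumeSafeAppend();
    }
}
```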
However, I noticed an inconsistency while writing the previous 
paragraph: if capacity is guaranteed reserved space, then multiple 
slices start their lives with a lie! :) From my earlier program:
auto a = [ 1, 2, 3, 4 ];
auto b = a;
Both of those slices report non-zero capacity, yet only one of them will 
be the lucky one to grab it. Such semantic issues make me unhappy. :-/
Ali