assumeSafeAppend and purity

Jonathan M Davis jmdavisProg at
Mon Feb 6 21:35:14 PST 2012

On Monday, February 06, 2012 21:40:55 Steven Schveighoffer wrote:
> I thought of a better solution:
> pure T[] pureSafeShrink(T)(ref T[] arr, size_t maxLength)
> {
>     if(maxLength < arr.length)
>     {
>         bool safeToShrink = (arr.capacity == arr.length);
>         arr = arr[0..maxLength];
>         if(safeToShrink) arr.assumeSafeAppend(); // must work around purity here
>     }
>     return arr;
> }
> This guarantees that you only affect data you were passed.

Does it really? What if I did this:

auto arr = new int[](63);
auto saved = arr;
assert(arr.capacity == 63);
assert(saved.capacity == 63);
pureSafeShrink(arr, 0);

This happens to pass on my computer, though the exact length needed to make 
capacity equal length will probably vary. So, saved is still a slice of data 
which is supposed to be no longer part of any array.
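To make the hazard concrete, here is a small sketch (not from the original post) of what can go wrong after that call: once assumeSafeAppend has run, appending to arr can stomp the memory that saved still refers to. Whether the guarded branch is actually taken depends on the allocator's block size, so the length 63 is just the value that happened to work above.

```d
void main()
{
    auto arr = new int[](63); // a length that happens to match the block capacity
    auto saved = arr;         // second slice of the same underlying data
    saved[] = 42;

    if (arr.capacity == arr.length) // the condition pureSafeShrink tests
    {
        arr = arr[0 .. 0];
        arr.assumeSafeAppend(); // runtime now treats saved's data as unused
        arr ~= 7;               // may overwrite saved[0] in place
        // saved still exists, but its contents can no longer be trusted
    }
}
```

So the capacity check does not actually guarantee that no other slice of the data exists; it only checks a property of the memory block.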

Also, allocating a new array and then immediately trying to shrink it with 
pureSafeShrink will only use assumeSafeAppend if you just so happen to have 
picked a length that lines up with the block size allocated, which makes it 
pretty much useless IMHO. I'm only going to use assumeSafeAppend if I _know_ 
that it's safe. pureSafeShrink is therefore trying to protect me when I don't 
need it and is ruining the guarantees that assumeSafeAppend gives me, since 
it's only better than arr = arr[0 .. maxLength]; if the array just so happens 
to have the same length as its capacity.
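To illustrate the common case: a freshly allocated array almost always has spare room in its memory block, so capacity exceeds length, safeToShrink is false, and the call degenerates to plain slicing. A sketch, assuming Steven's pureSafeShrink above (the exact capacity values are allocator-dependent):

```d
auto arr = new int[](10);
// Typically the GC block is larger than the requested length,
// so arr.capacity != arr.length and safeToShrink is false.
pureSafeShrink(arr, 5);
assert(arr.length == 5);
// Equivalent to arr = arr[0 .. 5]; assumeSafeAppend was never called,
// so appending to arr will reallocate rather than reuse the block.
```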

So, I don't think that this function really buys us anything. I'm inclined to 
just make assumeSafeAppend pure.

- Jonathan M Davis
