D array expansion and non-deterministic re-allocation

Bartosz Milewski bartosz-nospam at relisoft.com
Wed Nov 25 20:30:41 PST 2009


Andrei Alexandrescu Wrote:

> How about creating a struct Value!T that transforms T (be it an array or 
> a class) into a value type? Then if you use Value!(int[]), you're 
> effectively dealing with values throughout (even though internally they 
> might be COWed). Sometimes I also see a need for DeepValue!T which would 
> e.g. duplicate transitively arrays of arrays and full object trees. For 
> the latter we need some more introspection though. But we have 
> everything in the language to make Value and DeepValue work with arrays.
> 
> What do you think?
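
For concreteness, a minimal sketch of how such a Value wrapper might look for the array case, using a postblit to duplicate on copy. The constructor and the forwarding surface here are my own assumptions for illustration, not a design anyone in this thread has committed to:

struct Value(T : E[], E)
{
    T payload;

    // Postblit: duplicate the underlying array so copies never alias.
    this(this) { payload = payload.dup; }

    // Forward a minimal array surface.
    ref E opIndex(size_t i) { return payload[i]; }
    size_t length() const { return payload.length; }
}

unittest
{
    auto a = Value!(int[])([1, 2, 3]);
    auto b = a;        // postblit runs: b gets its own copy of the data
    b[0] = 42;
    assert(a[0] == 1); // a is unaffected: value semantics
}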

I'm afraid this would further muddle the message: "If you want safe arrays, use the Value device; if you want to live dangerously, use the built-in type." I'd rather see the reverse: D arrays are safe to use. They have the usual reference semantics of dynamic arrays, but if you expand one, the sharing goes away and you get a unique reference to a copy. This is "no gotcha" semantics: totally predictable and easy to reason about.
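
To spell out the semantics I'm arguing for, here is a sketch of the proposed behavior, not of what today's compiler guarantees; with in-place expansion the last assert can fail, which is exactly the gotcha:

void main()
{
    int[] a = [1, 2, 3];
    int[] b = a;         // plain assignment: a and b share the same data
    b[0] = 42;
    assert(a[0] == 42);  // reference semantics, same as today

    b ~= 4;              // expansion: under the proposed rule b always
                         // re-allocates and becomes a unique copy
    b[1] = 99;
    assert(a[1] == 2);   // a no longer sees writes through b
}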

How the compiler supports that semantics while performing clever optimizations is another story. It's fine if this part is hard. The language can even impose complexity requirements, as long as they are possible to implement (a single compliant implementation is enough to prove the point).

By the way, which library algorithms rely on the amortized O(1) behavior of array append?
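
For example (a hypothetical pattern, not a specific Phobos algorithm), the usual append-in-a-loop idiom is only linear overall if each individual append is amortized O(1):

int[] collect(int n)
{
    int[] result;
    foreach (i; 0 .. n)
        result ~= i;   // linear in total only with amortized O(1) append;
                       // if every expansion copied, the loop would be O(n^2)
    return result;
}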


