const?? When and why? This is ugly!

Jason House jason.james.house at gmail.com
Sun Mar 8 08:08:28 PDT 2009


Christopher Wright Wrote:

> Burton Radons wrote:
> >    int [] a = new int [1];
> > 
> >    a [0] = 1;
> > 
> >    auto b = cast (invariant (int) []) a;
> > 
> >    a [0] += b [0];
> >    a [0] += b [0];
> >    writef ("%s\n", a [0]);
> >    // Normal result: 4.
> >    // Optimiser which assumes invariant data can't change: 3
> > 
> > Yes, the code is an abuse of the const system. THAT'S EXACTLY MY POINT. Casting mutable data to invariant leads to situations like these. Only data which will never change can be made invariant. Putting "alias invariant (char) [] string" in object.d induces these situations and makes it seem like it's a good idea.
> 
> You're going into undefined territory and complaining that it doesn't 
> work as you expect. Perhaps that should issue a warning, but you're 
> doing something wrong: you're saying that the same array is both 
> mutable and immutable.
> 
> Think of the other approach: once you cast an array to invariant, the 
> compiler finds all aliases of that array and turns them invariant. You'd 
> be even more upset in that case. It would have long-reaching effects 
> that are hard to track down.
> 
> Or you could forbid the cast. But it's a useful cast, and you really 
> can't get rid of it if you ever want to convert something that is 
> mutable to something that is invariant without copying.

The cast could be avoided if the compiler tracked unique mutable references. Assignment to invariant could then be done implicitly, with the mutable reference ceasing to exist at that point, so no mutable alias would survive. That would require escape analysis for mutable references, though. I like allowing implicit invariant conversion, but I seem to be in the minority; it does add further language complexity.
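For what it's worth, Phobos already offers a library-level version of that idea: `assumeUnique` (in `std.exception` in current Phobos, `std.contracts` in older releases) does the cast and nulls out the mutable reference, so the programmer, rather than the compiler, asserts uniqueness. A minimal sketch, using `immutable` (the newer spelling of `invariant`):

```d
import std.exception : assumeUnique; // std.contracts in older Phobos

void main()
{
    int[] a = new int[1];
    a[0] = 1;

    // Safe but allocates: .idup always makes a fresh immutable copy.
    immutable(int)[] copy = a.idup;

    // No copy: assumeUnique casts to immutable and nulls the mutable
    // reference, so no mutable alias survives the conversion.
    immutable(int)[] b = assumeUnique(a);
    assert(a is null);   // the only mutable reference is gone
    assert(b[0] == 1);
}
```

It's still the programmer's promise (nothing stops a second mutable alias taken before the call), which is exactly why compiler-checked escape analysis would be the stronger solution.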



More information about the Digitalmars-d mailing list