Cannot compare object.opEquals is not nogc

Lodovico Giaretta via Digitalmars-d-learn digitalmars-d-learn@puremagic.com
Sun Jul 24 02:03:04 PDT 2016


On Sunday, 24 July 2016 at 02:17:27 UTC, Rufus Smith wrote:
> This just isn't right. What you're saying is that because 
> someone screwed up, we must live with the screw-up and build 
> everything around it. This mentality is why everyone is so 
> screwed up in the first place, do you not see that?
>
> And I think you really have a misconception about the GC vs 
> nogc. One can rewrite GC code, such as a GC-based opEquals, 
> without limitations. They can allocate on the stack, use malloc 
> and free when done, etc. opEquals generally doesn't have state, 
> so it is possible to work around it. It's probably always 
> possible to rewrite a GC opEquals as nogc without too much 
> difficulty.

Now you are telling me to "program by trust", because there's 
nothing ensuring that I remember to free everything I allocated 
with malloc, while a GC would guarantee no memory leaks. Again, 
there's nothing stopping me from returning pointers to things 
allocated on the stack. And now there are lots...
Earlier you told me that programming by trust is the wrong 
attitude, and now you propose that I use it, risking memory 
leaks in a function that may be executed hundreds of times per 
second.
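To make the trade-off concrete, here is a minimal sketch (the type and helper names are invented, not from this thread) of a @nogc opEquals that takes the malloc route. The compiler verifies @nogc, but nothing verifies that the memory is ever freed; delete the scope(exit) line and every comparison leaks:

```d
import core.stdc.stdlib : malloc, free;
import core.stdc.string : memcmp;

struct IntBag
{
    int[] items;

    // Order-insensitive equality: sort malloc'd copies and compare.
    // scope(exit) is the only leak protection; @nogc does not check it.
    bool opEquals(const IntBag rhs) const @nogc nothrow
    {
        const n = items.length;
        if (n != rhs.items.length) return false;
        if (n == 0) return true;
        auto a = cast(int*) malloc(n * int.sizeof);
        auto b = cast(int*) malloc(n * int.sizeof);
        scope(exit) { free(a); free(b); } // forget this and every call leaks
        if (a is null || b is null) return false;
        a[0 .. n] = items[];
        b[0 .. n] = rhs.items[];
        insertionSort(a[0 .. n]);
        insertionSort(b[0 .. n]);
        return memcmp(a, b, n * int.sizeof) == 0;
    }
}

// Allocation-free sort helper, so the whole path stays @nogc.
void insertionSort(scope int[] xs) @nogc nothrow
{
    foreach (i; 1 .. xs.length)
    {
        const v = xs[i];
        size_t j = i;
        for (; j > 0 && xs[j - 1] > v; --j) xs[j] = xs[j - 1];
        xs[j] = v;
    }
}
```

It works, but the correctness burden (pairing every malloc with a free, handling allocation failure) is exactly the "trust" being objected to above.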

> BUT, it is impossible to use a GC opEquals in nogc code! This 
> means there is no workaround. What you claim is that we accept 
> the impossibility (which makes nogc useless) just to avoid 
> having to rewrite some GC opEquals code. We can't rewrite the 
> nogc side; it's set in stone. We are screwed from the start 
> when we attempt to write nogc code, because at some point we 
> will have to do comparisons. It's the same problem with purity 
> and any other transitive relationship.
>
> All "workarounds" are just as limited, because basically we 
> added the relationship: if A is nogc and A uses B, then B must 
> be nogc. Yet we start with B being GC. Hence we can never have 
> A use B, because A can only use nogc.
>
> To see it more simply: what if everything in D was GC-based? 
> All code was marked GC (even statements, types, etc.). Do you 
> agree that nogc would be absolutely useless? We couldn't build 
> up anything, because we couldn't include any code. This is the 
> extreme case.
>
> Conversely, if everything was built up using nogc, we could 
> write GC based code just fine, could we not?

No. If you put a big @nogc attribute on Object.opXXX, then nobody 
can write GC code in their classes. So if everything is @nogc, 
you cannot write GC code, because it wouldn't interact with 
Phobos. Example: if you mark @nogc an algorithm that takes a 
delegate, then you cannot pass GC delegates to it. So you cannot 
use it in GC code.
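A minimal sketch of that point (the function names are invented): once the callback parameter is marked @nogc, only @nogc delegates convert to it, so a caller whose callback touches the GC is rejected at compile time.

```d
// An algorithm that demands a @nogc callback.
void each(scope void delegate(int) @nogc dg) @nogc
{
    foreach (i; 0 .. 3)
        dg(i);
}

void demo()
{
    int sum;
    // OK: the literal's body allocates nothing, so it is inferred @nogc.
    each((int i) { sum += i; });

    int[] log;
    // Rejected if uncommented: `log ~= i` allocates from the GC,
    // so the delegate is not @nogc and does not convert.
    // each((int i) { log ~= i; });
}
```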

> Therefore, claiming that we stay with GC-based code just 
> prevents using more and more nogc code. The more GC-based code 
> D gets, the less useful nogc gets, and we are back where we 
> started.
>
> Since nogc is more critical in the foundational layers, as it 
> affects everything built on it, all core features should be 
> nogc. This way, the user can decide when to break away from 
> the GC code, which will only affect everything after that 
> point.

Yes. All building blocks must be as @nogc as possible. But 
customization points (virtual functions, delegate arguments, ...) 
must not be @nogc, otherwise it is not possible to have classes 
that use the GC or callbacks that use the GC.

> This is a one-way street; pretending it is two-way only 
> enriches the lawyers and eventually makes everyone unhappy. 
> Making D dependent on the GC was a mistake that many wasted 
> man-hours will go into trying to unravel. Trying to carry on 
> this mistake just wastes more hours.
>
> I understand that it is a mess, but it got that way from the 
> mistake in the first place, not from trying to undo the 
> mistake (which is illogical, because there would be no nogc if 
> there wasn't a GC in the first place). I also understand that 
> there is some desire to keep things "backwards compatible". 
> This is also a mistake: not only does it prolong the pain and 
> suffering, it is irrational. 1. A fork can be made. Those 
> people that have based their code on the GC can continue using 
> an older version. Their code works at that point, does it not? 
> So just stop going down that dead-end path. They have what they 
> need; it's not like they will lose anything (virtually nothing, 
> in most cases). Moving in the correct direction is always 
> better, regardless of whose panties get in a wad.

I still don't understand why you want Object.opXXX @nogc. As I 
already said, you can still make your functions @nogc by 
accepting only parameters of @nogc types. It's obvious. If I wrote a 
wonderful library that uses the GC, you will not use it. If I 
have a class that uses the GC in opXXX (and I am free to have it, 
because maybe I need it, and maybe it's the most efficient way 
for my use case), you will not use it. The same applies here. 
You'll have your algorithms work only on classes that declare 
opXXX as @nogc.
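In code, that constraint can be expressed today without touching Object. A sketch (`contains` and `Frac` are invented names) of an algorithm that only instantiates for types whose equality is @nogc:

```d
// Accepts only element types whose `==` compiles under @nogc nothrow.
bool contains(T)(scope const(T)[] haystack, const T needle) @nogc nothrow
    if (is(typeof(() @nogc nothrow { return T.init == T.init; })))
{
    foreach (ref x; haystack)
        if (x == needle)
            return true;
    return false;
}

// A type that opts in by declaring a @nogc opEquals.
struct Frac
{
    int num, den;

    bool opEquals(const Frac o) const @nogc nothrow
    {
        // Cross-multiplication: no allocation needed.
        return num * o.den == o.num * den;
    }
}
```

A type with a GC-using opEquals simply fails the constraint and cannot be passed, which is the point being made: the restriction lives in the @nogc algorithm, not in the base class.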

Not all memory allocation patterns are good for malloc/free. Not 
all of them are good for stack allocations. Some of them are not 
even good for reference counting. Every class shall use the best 
solution for its job. And everybody must still be able to extend 
the base class.
If you want to use a method specific to a subclass, you downcast. 
If you want to use the @nogc opXXX when the base does not enforce 
it, you downcast. It's the same principle: more advanced 
functionalities require more derived types (and @nogc is more 
derived, because it is covariant with not-@nogc). Basic OOP.
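A sketch of that last point (class names invented): the base class leaves opEquals unconstrained, a subclass adds a tighter @nogc overload, and @nogc code reaches it through a downcast, bypassing the non-@nogc object.opEquals path that `==` takes for classes.

```d
class Shape
{
    // Base contract: equality is allowed to use the GC.
    override bool opEquals(Object o) { return this is o; }
}

class Point : Shape
{
    int x, y;
    this(int x, int y) @nogc { this.x = x; this.y = y; }

    // Tighter contract in the more derived type.
    bool opEquals(const Point p) const @nogc nothrow
    {
        return p !is null && x == p.x && y == p.y;
    }

    // Keep the virtual, GC-permitted overload working as well.
    override bool opEquals(Object o) { return opEquals(cast(Point) o); }
}

// @nogc code opts in by downcasting: more guarantees from a more
// derived type, exactly the usual OOP pattern.
bool samePoint(Shape a, Shape b) @nogc
{
    auto pa = cast(Point) a;
    auto pb = cast(Point) b;
    return pa !is null && pa.opEquals(pb);
}
```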

