[xmlp] the recent garbage collector performance improvements

Michael Rynn michaelrynn at optusnet.com.au
Sat Feb 25 00:07:48 PST 2012


On Thu, 02 Feb 2012 15:44:56 +0000, Richard Webb wrote:



With the xml package (xmlp), and its linked-node DOM, the GC is likely 
to fail cleanup.  I divided the generated test file, with its 2-layer 
elements, into files of 5, 50, 500, and 5000 elements.  I put a mixin on 
the Node class to keep static counts of Node objects created and deleted.  
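The counting mixin was along these lines (a sketch only, not the actual xmlp code; `GCStats` and the counter names are my labels here):

```d
import core.memory : GC;

/// Sketch of a counting mixin: each host class gets static
/// created/destroyed tallies, bumped in its ctor and dtor.
mixin template GCStats()
{
    static size_t created;
    static size_t destroyed;

    this() { created++; }
    ~this() { destroyed++; }
}

class Node
{
    mixin GCStats;
    Node parent_;     // back-pointer to parent
    Node[] children_;
}
```

After forcing a GC.collect(), comparing Node.created against Node.destroyed shows how many instances the collector actually reclaimed.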

I also set the document to be read multiple times, calling GC.collect 
between reads, and timed it. The increase in read time, and the increase 
in GC.collect time, was strong evidence that the GC was unable to clean 
it up, even though, to my naive code inspection, the previous document 
instance was no longer reachable from GC "roots", i.e. its stack 
reference should have been directly replaced.
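The repeated-read timing loop was roughly this shape (a sketch; `parseDocument` is a stand-in for the actual xmlp entry point):

```d
import core.memory : GC;
import std.datetime : StopWatch; // 2012-era API; newer D moved this to std.datetime.stopwatch
import std.stdio : writefln;

void timeRepeatedReads(string path, int runs)
{
    foreach (run; 0 .. runs)
    {
        StopWatch sw;
        sw.start();
        auto doc = parseDocument(path); // hypothetical parse entry point
        sw.stop();
        long parseMs = sw.peek().msecs;

        sw.reset();
        sw.start();
        GC.collect();
        sw.stop();
        writefln("run %s: parse %s ms, collect %s ms",
                 run, parseMs, sw.peek().msecs);
        // doc goes out of scope here; if both times keep climbing
        // run over run, the previous documents were never collected
    }
}
```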

The linkdom rather too faithfully copies the Java DOM model, with doubly-
linked pointers.  Elements have an "owner document" (I felt like ditching 
this).  All element children derive from ChildNode (Element, Text, CDATA 
and Comment), and have a pointer to the parent_ node (this could maybe be 
ditched for CharData).   Anyway, it's a pretty linked-up conventional model. 
Java's GC has evidently gotten around the problems of collecting XML 
documents. 

When I looked at the C code for libxml2, for instance, all the nodes have 
that sort of interlocked linkage. I once tried to see how far I could get 
(on Windows) calling libxml2 from D.  I did not get very far, and 
gave up after crashes and a headache. 

On the GC cleanup stats, the 5-node generated test was cleanable, but at 
50,000, well, the poor GC could not delete a single Node object. I've 
only done this with dmd2_32 on Windows; 64-bit may be different. It's 
also the GitHub release of 2.058 - 162, and may not be in a releasable 
state for GC.  

The original std.xml seemed to fare best on GC cleanup, but still 
developed troubles at each larger document size.   std.xmlp.arraydom was 
similar.  Both of these have no back-pointers to parents, and use an 
array to hold children.

I thought of just nulling the back-pointers, and wrote a version of an 
explode method. I tried this with the Node version doing, or not doing, 
a delete.  With a big document, it currently has to do a delete, or the 
GC will not clean up everything.

I turned the GC stats counters into a mixin template, and put them on 
std.xml1, on std.arraydom, and on any intermediate objects from the 
parsing that I wanted to confirm were not hanging around. For too many 
of them, I found it necessary to chase them down and force a delete. 
(I now prefer explode, because delete implies a complete cleanup, whereas 
to me explode means to split into little non-referencing pieces that the 
GC can manage.)
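My explode is roughly this shape (member names are illustrative; linkdom's actual fields differ):

```d
/// Sever the back-pointers and child links so that what remains is a
/// set of small, non-referencing pieces the GC can reclaim independently.
void explode(Node n)
{
    if (n is null)
        return;
    foreach (child; n.children_)
        explode(child);
    n.children_ = null;  // drop the downward links
    n.parent_   = null;  // drop the back-pointer
    // in linkdom, the ownerDocument_ pointer would be nulled here too
}
```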

This means I use too many pointer-linked structures for the GC. It may be 
just as difficult for the red-black tree in std.container; I haven't checked.

For me, one lesson at least is: do not assume that a set of interlocked 
classes with circular references can always be isolated in a single 
full GC collection and properly deleted. For internal temporaries that 
will never be seen by external code, it seems a good idea to explode them.

For large complex tree structures, it might take less CPU time to explode 
them than to have this particular version of the GC work it out with less 
information. Do explode again.  

After exploding, repeated runs and bigger loads ran optimally.


So I read with interest other posts on mechanisms of GC and how to 
integrate with the compiler to tag classes and stack layout, so as to 
know pointers precisely. I particularly liked the suggestion of a mixin 
on each class for GC layout. 

What could be done, even ad hoc, is to gather GC stats on various kinds 
of complex self-referring classes, to examine the effectiveness of GC 
collection. This could even be a compiler flag, or a version. It was 
useful for me in spotting where I could reuse a class rather than 
construct a new one. 

A lot of D applications may run once, so GC doesn't matter so much.
But for long-running games and servers, verified GC behaviour will bring 
more confidence.

Aggressive measurement may allow better detection of where and when the 
GC fails, and allow better assessment of effects of GC code changes.


GC is a tough problem, and it's a hard thing to build a whole language 
around.





More information about the Digitalmars-d mailing list