Bulk allocation and partial deallocation for tree data structures.

Filip Bystricky via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Thu Jun 29 19:07:00 PDT 2017


Hello!

I'm implementing a persistent hash trie (like the ones in 
Clojure/Scala). Every 'persisting' insertion involves allocating 
a fixed number (6) of nodes; each node is a fixed-width chunk, 
with widths ranging from 1 to ~33 words.

Basically, this data structure always allocates a whole branch at 
a time, but then nodes are deallocated individually.
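To make the pattern concrete, here is a rough sketch of what one 
persisting insert does (hypothetical names, not my actual code):

enum Depth = 6;

struct Node
{
    uint bitmap;       // which of the 32 branches are populated
    Node*[] children;  // 1 .. ~32 slots, fixed once the node is built
}

Node* persistInsert(Node* root /*, key, value */)
{
    Node*[Depth] path;
    foreach (level; 0 .. Depth)
        path[level] = new Node;  // today: Depth separate allocations per insert
    // ... fill in each copy, sharing untouched children with the old version ...
    return path[0];
}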

Is there a way to tell an allocator to allocate n chunks at a 
time? Or, alternatively, is there a way to allocate all the 
memory needed at once and then free individual chunks of it 
later? It seems like this would provide at least some speed 
improvement. It could also be useful for batch operations on 
other node-based data structures (such as adding a whole range 
of nodes to a graph at once).
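
For the second option, the kind of thing I'm picturing looks 
roughly like this (a rough sketch with made-up names; it just 
carves one Mallocator block into per-node chunks and returns the 
block to the allocator once the last chunk is released):

import std.experimental.allocator.mallocator : Mallocator;

struct Branch
{
    void[] block;  // one allocation backing every node in the branch
    size_t live;   // chunks still in use
}

// Carve one allocation into per-node chunks (sizes are in bytes
// and assumed word-aligned, so the sub-chunks stay aligned too).
void[][] allocateBranch(Branch* b, const size_t[] sizes)
{
    size_t total = 0;
    foreach (s; sizes) total += s;

    b.block = Mallocator.instance.allocate(total);
    b.live = sizes.length;

    auto chunks = new void[][](sizes.length);
    size_t offset = 0;
    foreach (i, s; sizes)
    {
        chunks[i] = b.block[offset .. offset + s];
        offset += s;
    }
    return chunks;
}

// "Freeing" a node just drops the count; the backing block is
// deallocated together with the last chunk.
void releaseChunk(Branch* b)
{
    if (--b.live == 0)
        Mallocator.instance.deallocate(b.block);
}

But I'd rather not hand-roll this if an existing allocator 
already supports something like it.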

Thanks!
