Ternary Search Trees

Robert Fraser fraserofthenight at gmail.com
Fri Apr 24 07:01:43 PDT 2009


Robert Fraser wrote:
> bearophile wrote:
>> Does anyone have a need for Ternary Search Trees in Phobos (for 
>> D1, and eventually for D2 too)?
>> TSTs allow finding keys, key prefixes, or even keys with holes 
>> (partial-match search). Keys are arrays of T, where T is the 
>> template type.
>> They can be designed to store the keys alone, or as an associative 
>> data structure mapping keys to values.
>>
>> In some benchmarks I have seen, a simple TST implementation is 
>> about as fast as D's built-in AAs (but much slower than Python 
>> dicts).
>>
>> Bye,
>> bearophile
> 
> I implemented a version of the TST you posted using Tango + D1... Here 
> are my results:
> 
> Test                Part                Mean      Median    Max
> ----                ----                ----      ------    ---
> TST                 Insertion           0.0428    0.0338    0.0886
> TST                 Iteration           0.0022    0.0022    0.0024
> TST                 Lookup              0.0225    0.0223    0.0237
> HashMap             Insertion           0.0621    0.0421    0.2205
> HashMap             Iteration           0.0035    0.0034    0.0036
> HashMap             Lookup              0.0169    0.0168    0.0184
> TreeMap             Insertion           0.1045    0.1041    0.1058
> TreeMap             Iteration           0.0041    0.0041    0.0044
> TreeMap             Lookup              0.0895    0.0892    0.0917
> AssocArray          Insertion           0.0262    0.0262    0.0268
> AssocArray          Iteration           0.0015    0.0015    0.0016
> AssocArray          Lookup              0.0130    0.0129    0.0132
> 
> (TreeMap and HashMap are in tango.util.container, AssocArray is the 
> built-in D associative array; testing code is attached. Compiled with -O 
> -release -inline using DSSS 0.78 and DMD 1.043 on Windows XP x86 SP3).
> 
> Indeed, TSTs seem to beat Red-Black BSTs (the implementation Tango's 
> TreeMap uses) quite handily, and have faster insert times than hash 
> maps, though slower lookup. The big winner here, though, appears to be 
> D's built-in associative arrays. I thought they were supposed to be very 
> slow, but Tango's implementation, at least, looks pretty good (without 
> any re-hashing).
> 
> For order-sensitive collections, I'm definitely using TSTs; I'm sold on 
> the concept. Not only are they faster, but the ability to search by 
> prefix could be very useful. However, the times here are very low (the 
> word file is only 1.1MB, and my system is fairly fast, though this is 
> memory-bound rather than CPU-bound)... I'll try with a larger word list 
> & see what results I get.
> 

Results are similar on a much larger word list:
Test                Part                Mean      Median    Max
----                ----                ----      ------    ---
TST                 Insertion           0.1906    0.1630    0.2495
TST                 Iteration           0.0032    0.0031    0.0035
TST                 Lookup              0.1650    0.1651    0.1662
HashMap             Insertion           0.1637    0.1504    0.2679
HashMap             Iteration           0.0029    0.0028    0.0030
HashMap             Lookup              0.1212    0.1210    0.1241
TreeMap             Insertion           0.6133    0.6120    0.6276
TreeMap             Iteration           0.0035    0.0035    0.0035
TreeMap             Lookup              0.6310    0.6292    0.6446
AssocArray          Insertion           0.1131    0.1129    0.1154
AssocArray          Iteration           0.0014    0.0014    0.0016
AssocArray          Lookup              0.0939    0.0928    0.1045
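
For reference, here is a minimal sketch of the kind of TST described
above -- not bearophile's posted code or the attached benchmark, just an
illustration (TstNode, Tst, insert and lookup are made-up names). Keys
are arrays of a template type T, and each complete key carries a Value,
so the tree doubles as an associative container:

// One node per key element: lo/eq/hi subtrees for <, ==, > comparisons.
class TstNode(T, Value)
{
    T splitChar;        // the element this node discriminates on
    TstNode lo, eq, hi; // subtrees
    bool terminal;      // true if a complete key ends at this node
    Value value;        // payload for that key
}

struct Tst(T, Value)
{
    TstNode!(T, Value) root;

    // Insert key with its value, creating nodes along the search path.
    void insert(T[] key, Value value)
    {
        if (key.length == 0)
            return;
        root = insertAt(root, key, 0, value);
    }

    private TstNode!(T, Value) insertAt(TstNode!(T, Value) node,
                                        T[] key, size_t i, Value value)
    {
        if (node is null)
        {
            node = new TstNode!(T, Value);
            node.splitChar = key[i];
        }
        if (key[i] < node.splitChar)
            node.lo = insertAt(node.lo, key, i, value);
        else if (key[i] > node.splitChar)
            node.hi = insertAt(node.hi, key, i, value);
        else if (i + 1 < key.length)
            node.eq = insertAt(node.eq, key, i + 1, value);
        else
        {
            node.terminal = true;  // key fully consumed: mark and store
            node.value = value;
        }
        return node;
    }

    // Exact-match lookup; returns a pointer to the stored value, or null.
    Value* lookup(T[] key)
    {
        if (key.length == 0)
            return null;
        auto node = root;
        size_t i = 0;
        while (node !is null)
        {
            if (key[i] < node.splitChar)
                node = node.lo;
            else if (key[i] > node.splitChar)
                node = node.hi;
            else if (++i == key.length)
                return node.terminal ? &node.value : null;
            else
                node = node.eq;
        }
        return null;
    }
}

With T = char and Value = int, inserting "tree" and looking it up gives
back a pointer to its value (under D2 the string literals would need
.dup, since the sketch takes mutable T[]).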

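And a rough idea of how the prefix search mentioned above could work on
the same sketch: walk the tree to the node that completes the prefix,
then report every terminal key beneath it (matchPrefix and collect are
likewise just illustrative names):

// Call emit for every stored key that starts with prefix.
void matchPrefix(T, Value)(TstNode!(T, Value) root, T[] prefix,
                           void delegate(T[] key, Value value) emit)
{
    if (root is null || prefix.length == 0)
        return;
    auto node = root;
    size_t i = 0;
    // Descend to the node matching the last element of the prefix.
    while (node !is null)
    {
        if (prefix[i] < node.splitChar)
            node = node.lo;
        else if (prefix[i] > node.splitChar)
            node = node.hi;
        else if (++i == prefix.length)
            break;
        else
            node = node.eq;
    }
    if (node is null)
        return;
    if (node.terminal)
        emit(prefix, node.value);   // the prefix itself is a stored key
    collect(node.eq, prefix, emit); // everything below extends the prefix
}

// In-order walk that rebuilds each key as it descends.
private void collect(T, Value)(TstNode!(T, Value) node, T[] sofar,
                               void delegate(T[] key, Value value) emit)
{
    if (node is null)
        return;
    collect(node.lo, sofar, emit);
    auto key = sofar ~ node.splitChar;
    if (node.terminal)
        emit(key, node.value);
    collect(node.eq, key, emit);
    collect(node.hi, sofar, emit);
}

The same collect walk started from the root visits every key in sorted
order, which is part of what makes TSTs appealing as order-sensitive
collections.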

