XML Benchmarks in D

BCS ao at pathlink.com
Fri Mar 14 21:43:27 PDT 2008


Reply to Alexander,

> BCS wrote:
> 
>> Not as I understand it (I looked this up about a year ago, so I'm a
>> bit rusty). On 32 bits, you can't map in 4GB because you need space
>> for the program's code (and on Windows, user space is normally limited
>> to 2GB, or 3GB with the /3GB boot switch, since the OS reserves the
>> rest). Also, what about a 10GB file? My idea is to make some sort of
>> lib that lets you handle large data sets (64-bit?). You would ask for
>> a file to be "mapped in" and then you would get an object that
>> syntactically looks like an array. Index ops would actually map in
>> pieces; slices would generate new objects (with a ref to the parent)
>> that would, on demand, map stuff in. Some sort of GC-ish thing would
>> start unmapping/moving things when space gets tight. If you never
>> have to actually convert the data to a "real" array, you never need
>> to copy the stuff; you can just leave it in the file. I'm not sure
>> it's even possible or how it would work, but it would be cool. (And
>> highly useful.)
>> 
> I've got this strange feeling in my stomach that shouts out "WTF?!"
> when I read about >3-4GB XML files. I know, it's about the "if" and
> "whens", but still; if you find yourself needing such a beast of an
> XML file, you might possibly think of other forms of data structuring
> (a database, perhaps?).
> 

Truth be told, I'm not that far from agreeing with you (on seeing that, I'd 
think: "WTF?!?!... Um... OoooK... well..."). I can't think of a justification 
for the lib I described if the only thing it would ever be used for were an 
XML parser. It might be used for managing parts of something like... a 
database table. <G>
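For what it's worth, the "map pieces in on demand" idea sketches out in a few lines. Below is a rough illustration in Python rather than D (Python's mmap module keeps it short); the FileView class and its behavior are hypothetical, not anyone's actual implementation. Index and slice ops mmap only the page-aligned chunk that covers the requested range, so the whole file never has to fit in the address space. (The copy at the end is exactly what the real design would avoid; it's only there to hand the bytes back.)

```python
import mmap
import os

# mmap offsets must be multiples of the allocation granularity.
PAGE = mmap.ALLOCATIONGRANULARITY

class FileView:
    """Hypothetical array-like view over a large file.

    Instead of mapping the whole file (impossible for a >4GB file in a
    32-bit address space), each access maps in just the chunk it needs
    and unmaps it again afterward.
    """

    def __init__(self, path):
        self.fd = os.open(path, os.O_RDONLY)
        self.size = os.fstat(self.fd).st_size

    def __len__(self):
        return self.size

    def __getitem__(self, key):
        if isinstance(key, slice):
            start, stop, _ = key.indices(self.size)
        else:
            start, stop = key, key + 1
        aligned = start - (start % PAGE)      # align the mapping offset
        length = stop - aligned               # map only the covering chunk
        with mmap.mmap(self.fd, length, offset=aligned,
                       access=mmap.ACCESS_READ) as m:
            data = m[start - aligned:stop - aligned]
        return data if isinstance(key, slice) else data[0]

    def close(self):
        os.close(self.fd)
```

A slice like `view[7:15]` on a multi-gigabyte file would map in one small window, copy out 8 bytes, and unmap, never touching the rest of the file. The "GC-ish" part of the original idea would amount to caching a few recent windows and evicting them under address-space pressure instead of unmapping eagerly as this sketch does.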




