Performant method for reading huge text files
Rene Zwanenburg
renezwanenburg at gmail.com
Mon Feb 3 15:44:28 PST 2014
I'm running into a problem I've come across before but never
found a satisfactory solution for.
There's a pretty large ASCII file I need to process, currently
about 3 GB, but it will grow in the future. D's ranges in
combination with std.algorithm are simply perfect for what I'm
doing, and it's trivial to write nice code which doesn't load the
entire file into memory.
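For context, the kind of pipeline I mean looks roughly like this (a
minimal sketch; the file name and the line count are made-up sample
data, and in reality the processing is more involved than counting):

```d
import std.algorithm : count;
import std.stdio : File;
static import std.file;

void main()
{
    // Tiny stand-in for the real 3 GB input file.
    std.file.write("sample.txt", "alpha\nbeta\ngamma\n");
    scope(exit) std.file.remove("sample.txt");

    // byLine yields one line at a time, so memory use stays
    // constant no matter how large the file is.
    auto lines = File("sample.txt").byLine.count;
    assert(lines == 3);
}
```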
The problem is speed. I'm using LockingTextReader in std.stdio,
but it's not nearly fast enough. On my system it only reads about
3 MB/s, with one core spending all its time in IO calls.
Sadly I need to support 32-bit, so memory-mapped files aren't an
option. Does someone know of a way to increase throughput while
still allowing me to use a range API?
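One direction I've considered is reading in large blocks with
File.byChunk and flattening the chunks with std.algorithm.joiner, so
the rest of the pipeline still sees an ordinary input range of bytes.
A sketch of that idea (the file name and contents are placeholder
sample data, and the 64 KiB chunk size is an arbitrary choice; note
that byChunk reuses its buffer, so this only suits single-pass use):

```d
import std.algorithm : count, joiner;
import std.stdio : File;
static import std.file;

void main()
{
    // Stand-in for the real multi-gigabyte input.
    std.file.write("big.txt", "one\ntwo\nthree\n");
    scope(exit) std.file.remove("big.txt");

    // byChunk does one I/O call per 64 KiB block; joiner flattens
    // the ubyte[] chunks into a single input range of ubyte.
    auto bytes = File("big.txt")
        .byChunk(64 * 1024)
        .joiner;

    // Example single-pass consumer: count the newline bytes.
    auto newlines = bytes.count(cast(ubyte) '\n');
    assert(newlines == 3);
}
```

I don't know whether that would actually beat LockingTextReader in
practice, which is partly why I'm asking.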