Why does array appending add a page size instead of doubling?

timotheecour thelastmammoth at gmail.com
Sat Feb 2 18:22:30 PST 2013


What's the rationale behind array appending adding a page size to 
capacity instead of doubling it, once the array grows past a 
certain size (cf. what std::vector does)? That seems like an 
example of (evil?) premature optimization.

----
T[] a;
int n = 10_000_000; // some big value
foreach (i; 0 .. n)
    a ~= T.init;
----
Depending on T.sizeof and n, the code above has linear complexity 
in n (while the capacity stays below a page size) or quadratic 
complexity (once the capacity grows by only one page at a time).
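
To put numbers on it: with doubling, the total copying work over n 
appends is at most about 2n element copies, whereas growing by a 
fixed increment of c elements costs on the order of n^2/(2c) 
copies. Here's a minimal sketch (assuming a 1-byte element type so 
the capacity is roughly a byte count; the exact growth policy is up 
to the runtime) that counts how often the capacity changes:
----
import std.stdio;

void main()
{
    ubyte[] a;                    // 1-byte elements, so capacity ~ bytes
    size_t lastCap, growthSteps;
    foreach (i; 0 .. 1_000_000)
    {
        a ~= ubyte.init;
        if (a.capacity != lastCap)    // the runtime grew the block
        {
            lastCap = a.capacity;
            ++growthSteps;
        }
    }
    writeln("growth steps: ", growthSteps);
    // Geometric growth needs only ~log2(n) steps; page-sized increments
    // need ~n/pageSize steps, and every step that can't extend in place
    // copies the whole array, which is where the quadratic cost comes from.
}
----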

The C++ version doesn't have this problem, as essentially all 
implementations grow the capacity geometrically (typically by doubling).
----
std::vector<T> a;
int n = 10000000; // some big value
for (int i = 0; i < n; i++)
    a.push_back(T());
----

Note that we can't always use a.reserve(n), since in more complex 
examples we don't know the final size in advance, and 
std.array.Appender is more restrictive (a sketch of both 
workarounds, for when n is known, is below). Would love some 
explanations!
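
For reference, a minimal sketch of the two workarounds when n is 
known up front (the function name fill is just illustrative):
----
import std.array : appender;

void fill(T)(size_t n)
{
    // Workaround 1: pre-allocate when the final size is known.
    T[] a;
    a.reserve(n);          // one allocation up front; appends stay in place
    foreach (i; 0 .. n)
        a ~= T.init;

    // Workaround 2: std.array.Appender tracks its capacity itself
    // and amortizes reallocations, at the cost of a wrapper type.
    auto app = appender!(T[])();
    foreach (i; 0 .. n)
        app.put(T.init);
    T[] b = app.data;
}
----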

