Out of memory error (even when using destroy())
Jordan Wilson via Digitalmars-d-learn
digitalmars-d-learn at puremagic.com
Fri May 26 01:28:05 PDT 2017
On Friday, 26 May 2017 at 06:31:49 UTC, realhet wrote:
> Hi,
>
> I'm kinda new to the D language and I love it already. :D So
> far I haven't had any serious problems, but this one seems to
> be beyond me.
>
> import std.stdio;
> void main(){
>     foreach(i; 0..2000){
>         writeln(i);
>         auto st = new ubyte[500_000_000];
>         destroy(st); // <- this doesn't matter
>     }
> }
>
> Compiled with DMD 2.074.0 Win32 it produces the following
> output:
> 0
> 1
> 2
> core.exception.OutOfMemoryError at src\core\exception.d(696):
> Memory allocation failed
>
> It doesn't matter whether I call destroy() or not. This is OK,
> because as I learned, destroy() only calls the destructor and
> marks the memory block as unused.
>
> But I also learned that the GC will start to collect when it
> runs out of memory. Here, though, the following happens: three
> half-GB allocations succeed, and on the fourth the process
> exceeds the 2GB limit of the 32-bit address space, which is
> expected. At that point the GC already has 1.5GB of freed
> memory, but instead of reusing it, it throws an
> OutOfMemoryError. Why?
>
> Note: This is not a problem when I use smaller blocks (like
> 50MB). But I want to use large blocks, without writing a slow
> wrapper that emulates a large block using smaller GC-allocated
> blocks.
>
> Is there a solution to this?
>
> Thank You!
I believe the general solution is to limit allocation inside
loops (given the issue Johnathan mentioned). The following, I
think, achieves the spirit of your code but without the memory
error:
import std.stdio;

void main(){
    ubyte[] st;
    foreach(i; 0..2000){
        writeln(i);
        st.length = 500_000_000; // instead of: auto st = new ubyte[500_000_000];
        st.length = 0;           // instead of: destroy(st)
        // Tell the runtime it is safe to overwrite what's currently
        // in st, so the next length increase reuses the same block
        // instead of allocating a new one.
        st.assumeSafeAppend;
    }
}
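
If you really do want a fresh block each iteration, another
option (not from your original code, just a sketch) is to free
the block deterministically with core.memory.GC.free instead of
waiting for a collection. This assumes you keep no other
references to the block, since using it after GC.free is
undefined behavior:

import std.stdio;
import core.memory : GC;

void main(){
    foreach(i; 0..2000){
        writeln(i);
        auto st = new ubyte[500_000_000];
        // ... use st ...
        GC.free(st.ptr); // release the block immediately;
                         // st must not be touched after this
        st = null;
    }
}

That way the 500MB region is returned to the allocator before
the next iteration, so the 2GB 32-bit address space never has
to hold more than one block at a time.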