Out of memory error (even when using destroy())

Jonathan M Davis via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Fri May 26 00:45:13 PDT 2017


On Friday, May 26, 2017 06:31:49 realhet via Digitalmars-d-learn wrote:
> Hi,
>
> I'm kinda new to the D language and I love it already. :D So far
> I haven't had any serious problems, but this one seems to be beyond
> me.
>
> import std.stdio;
> void main(){
>      foreach(i; 0..2000){
>          writeln(i);
>          auto st = new ubyte[500_000_000];
>          destroy(st); // <- this doesn't matter
>      }
> }
>
> Compiled with DMD 2.074.0 Win32 it produces the following output:
> 0
> 1
> 2
> core.exception.OutOfMemoryError@src\core\exception.d(696): Memory
> allocation failed
>
> It doesn't matter whether I call destroy() or not. That part is OK,
> because as I learned, destroy() only calls the destructor and marks
> the memory block as unused.
>
> But I also learned that the GC will start to collect when it runs
> out of memory. In this case, though, the following happens:
> 3 allocations and deallocations of half a GB each, and on the 4th
> the system runs out of the 2GB limit, which is expected. At this
> point the GC already has 1.5GB of free memory, but instead of using
> that, it returns a memory error. Why?
>
> Note: this is not a problem when I use smaller blocks (like 50MB),
> but I want to use large blocks without writing a slow wrapper that
> emulates a large block out of smaller GC-allocated blocks.

It's likely an issue with false pointers. The GC scans memory conservatively:
any value on the stack or in a scanned block that happens to look like a
pointer into your 500MB allocation makes the GC think the allocation is still
referenced when it isn't, so the block never gets freed.
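
As a minimal sketch of one way to sidestep that (assuming the slice is the
only reference to the block): keep the buffer under GC control but don't
depend on a collection to reclaim it, i.e. allocate it with GC.malloc and
release it explicitly with GC.free, so a false pointer never gets the chance
to pin it.

import std.stdio;
import core.memory : GC;

void main()
{
    foreach (i; 0 .. 2000)
    {
        writeln(i);
        // NO_SCAN: the block contains no pointers, so the GC won't scan it.
        auto p = cast(ubyte*) GC.malloc(500_000_000, GC.BlkAttr.NO_SCAN);
        auto st = p[0 .. 500_000_000];
        // ... use st ...
        // Explicit deallocation returns the block to the GC immediately, so
        // a stray value that merely looks like a pointer into it cannot keep
        // it alive until the next collection.
        GC.free(p);
    }
}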

> Is there a solution to this?

Use 64-bit. False pointers don't tend to be a problem with 64-bit, whereas
they can be with 32-bit, especially when you're allocating large blocks of
memory like that: the bigger the block, the more likely a random 32-bit value
is to fall within its address range and pin it.
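
If you're stuck on 32-bit, a rough alternative sketch is to take a buffer
that large out of the GC's hands entirely and manage it with the C heap:

import std.stdio;
import core.stdc.stdlib : malloc, free;

void main()
{
    foreach (i; 0 .. 2000)
    {
        writeln(i);
        enum size = 500_000_000;
        auto p = cast(ubyte*) malloc(size);
        assert(p !is null, "malloc failed");
        auto st = p[0 .. size];
        // ... use st; the GC neither scans nor owns this memory ...
        free(p); // deterministic release on every iteration
    }
}

std.experimental.allocator's Mallocator wraps the same idea if you'd rather
not call malloc and free by hand.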

- Jonathan M Davis


