A program leaking memory.

Johan Grönqvist johan.gronqvist at gmail.com
Mon Mar 13 06:56:27 PST 2006


I have now ported a program from C++ to D, but when I run it, it
uses all available memory and crashes.

I try to minimise a function of many variables iteratively using a
steepest descent method. The iteration steps are performed in a function
named step.

Step uses two vectors, and these are allocated anew in each step. After
the step function returns, I expected these vectors to be
garbage-collected and to disappear.

I allocate the vectors with:
  real[] grad;
  grad.length = 1000;
  for (uint ix = 0; ix < grad.length; ix++) grad[ix] = gradient(ix);

I then use the vectors but do nothing to deallocate them or to indicate
that I am done using them.
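To make the pattern concrete, here is a minimal sketch of what I mean; the names step() and gradient() are placeholders for my real code, and the loop bound is arbitrary:

```d
import std.stdio;

// Placeholder for the real gradient computation.
real gradient(uint ix) { return cast(real) ix; }

// Each call allocates a fresh vector; once the caller drops its
// reference, I expect the garbage collector to reclaim it.
real[] step()
{
    real[] grad;
    grad.length = 1000;
    for (uint ix = 0; ix < grad.length; ix++)
        grad[ix] = gradient(ix);
    return grad;
}

void main()
{
    for (uint iter = 0; iter < 10_000; iter++)
    {
        auto grad = step(); // the previous iteration's grad is now unreachable
        // ... use grad ...
    }
}
```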

Memory usage grows until the program needs too much memory. My
interpretation is that new vectors are allocated at each iteration, but
the old vectors are never freed.

The behaviour is the same using DMD and GDC.

Thanks in advance

/ johan



More information about the Digitalmars-d-learn mailing list