-v1 doesn't assume std.gc.setV1_0()

Chris Miller chris at dprogramming.com
Wed Feb 21 12:28:12 PST 2007


Should -v1 assume std.gc.setV1_0()? I don't believe it does.

Some objects in some of my programs are being collected too early unless
I call std.gc.setV1_0(). I don't believe I'm doing any pointer voodoo
either, but it's possible.
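
For reference, the workaround is just a one-line call at the top of
main() before anything gets allocated; a minimal sketch (the surrounding
program is hypothetical, only the setV1_0() call is the actual fix):

    import std.gc;

    void main()
    {
        // Revert to the D 1.0 collector's behavior for the rest of
        // the run, so the affected objects stop being collected early.
        std.gc.setV1_0();

        // ... rest of the program runs with the old GC semantics.
    }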

Perhaps as a compromise, -v1 could assume std.gc.setV1_0() as long as
the new GC isn't considered stable.
