Can you please tell me what hardware the box this failed on is running? Also, if by some chance you still have GDB attached, registers and disassembly would be tremendously useful.<br><br>I don't think it's a recursive call leading to a deadlock, though. The trace makes perfect sense to me: unittest2 calls into opApply, which calls into the foreach body code of unittest2. There's something very strange going on with the test where I append to an Appender from multiple threads, manually synchronizing, and it looks like this is related. I wonder if there's some subtle reason why Appender can't be appended to from multiple threads even if you manually synchronize, or if synchronizing on a global mutex, i.e. synchronized { /* Do stuff */ }, is subtly broken.<br>
<br><div class="gmail_quote">On Sat, Jun 4, 2011 at 10:15 PM, Brad Roberts <span dir="ltr"><<a href="mailto:braddr@puremagic.com">braddr@puremagic.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Could be the same as one of the previous threads. Sorry for being a little lazy (it's that kinda weekend):<br>
<br>
I just noticed that the phobos linux 64/32 build had been running for almost 10 hours (I guess I need to add some<br>
timeout logic still). I attached gdb to get a stack trace:<br>
<br>
#0 0x55573425 in __kernel_vsyscall ()<br>
#1 0x555b0469 in __lll_lock_wait () at ../nptl/sysdeps/unix/sysv/linux/i386/i686/../i486/lowlevellock.S:142<br>
#2 0x555ab659 in _L_lock_835 () from /lib32/libpthread.so.0<br>
#3 0x555ab4eb in __pthread_mutex_lock (mutex=0x8139264) at pthread_mutex_lock.c:82<br>
#4 0x080f4065 in _d_criticalenter ()<br>
#5 0x080d4ea3 in std.parallelism.__unittest2() ()<br>
#6 0x080ed3a5 in std.parallelism.__T19ParallelForeachTaskTS3std5range13__T4iotaTiTiZ4iota6ResultTDFKiZiZ.ParallelForeachTask.impl() ()<br>
#7 0x080d3253 in std.parallelism.AbstractTask.job() ()<br>
#8 0x080d3523 in std.parallelism.TaskPool.tryDeleteExecute() ()<br>
#9 0x080ed1e1 in std.parallelism.__T19ParallelForeachImplTS3std5range13__T4iotaTiTiZ4iota6ResultTDFKiZiZ.ParallelForeachImpl.__T17ResubmittingTasksZ.submitAndExecute() ()<br>
#10 0x080e44a2 in std.parallelism.__T15ParallelForeachTS3std5range13__T4iotaTiTiZ4iota6ResultZ.ParallelForeach.opApply() ()<br>
#11 0x080d49db in std.parallelism.__unittest2() ()<br>
#12 0x080edba9 in std.parallelism.__modtest() ()<br>
#13 0x080ff9dc in core.runtime.runModuleUnitTests() ()<br>
#14 0x080efca2 in object.ModuleInfo.opApply() ()<br>
#15 0x080ff8f7 in runModuleUnitTests ()<br>
#16 0x080f4980 in rt.dmain2.main() ()<br>
#17 0x080f45e8 in rt.dmain2.main() ()<br>
#18 0x080f4594 in main ()<br>
<br>
Without having looked at the code, the stack suggests a recursive call resulting in a deadlock. Notice that frames 11 and 5<br>
are in the same function.<br>
<br>
Have fun with it.<br>
<br>
Later,<br>
Brad<br>
_______________________________________________<br>
phobos mailing list<br>
<a href="mailto:phobos@puremagic.com">phobos@puremagic.com</a><br>
<a href="http://lists.puremagic.com/mailman/listinfo/phobos" target="_blank">http://lists.puremagic.com/mailman/listinfo/phobos</a><br>
</blockquote></div><br>