If stdout is __gshared, why does this throw / crash?

Anon via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Sat Mar 5 17:10:58 PST 2016


On Saturday, 5 March 2016 at 14:18:31 UTC, Atila Neves wrote:
> With a small number of threads, things work as intended in the 
> code below. But with 1000, on my machine it either crashes or 
> throws an exception:
>
>
> import std.stdio;
> import std.parallelism;
> import std.range;
>
>
> void main() {
>     stdout = File("/dev/null", "w");
>     foreach(t; 1000.iota.parallel) {
>         writeln("Oops");
>     }
> }

Note that `1000.iota.parallel` does *not* run 1000 threads. 
`parallel` just splits the work of the range up between the 
worker threads (likely 2, 4, or 8, depending on your CPU). I see 
the effect you describe with any parallel workload. With smaller 
numbers in place of 1000, the work isn't necessarily split off 
to additional threads at all, which is why smaller counts avoid 
the multi-threaded problems you are encountering.

> I get, depending on the run, "Bad file descriptor", "Attempting 
> to write to a closed file", or segfaults. What am I doing wrong?
>
> Atila

`File` uses ref-counting internally so that it can auto-close. 
`stdout` and friends are initialized in a special way such that 
they have a high initial ref-count and so are never closed this 
way. When you assign a new file to stdout, the ref count becomes 
one. As soon as one of your threads exits, the ref count drops to 
zero and stdout closes, producing the odd errors you are 
encountering on all the other threads.
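
As a rough, single-threaded illustration of File's ref-counting 
(just to show when the close actually happens; not specific to 
stdout):

import std.stdio;

void main() {
    auto f = File("/dev/null", "w"); // ref count is 1
    {
        auto g = f;                  // copy: ref count is 2
        g.writeln("still open");
    }                                // g destroyed: back to 1
    f.writeln("still open here too");
}   // last copy destroyed: the underlying file is closed here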

If I were you, I would avoid reassigning `stdout` and friends in 
favor of using a logger or manually specifying the file to write 
to.
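
For example, a minimal sketch of the latter (reusing the file 
name from your example), writing through an explicit File instead 
of touching the global stdout:

import std.parallelism;
import std.range;
import std.stdio;

void main() {
    auto sink = File("/dev/null", "w");
    foreach (t; 1000.iota.parallel) {
        // Each writeln on a File locks the underlying handle for
        // the duration of the call, so the threads don't step on
        // each other here.
        sink.writeln("Oops");
    }
}   // sink is closed once, when its last copy goes out of scope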

