Practical parallelization of D compilation

H. S. Teoh hsteoh at
Wed Jan 8 19:44:22 UTC 2020

On Wed, Jan 08, 2020 at 09:13:18AM +0000, Chris Katko via Digitalmars-d-learn wrote:
> On Wednesday, 8 January 2020 at 06:51:57 UTC, H. S. Teoh wrote:
> > Generally, the recommendation is to separately compile each package.
> What's the downsides / difficulties / "hoops to jump through" penalty
> for putting code into modules instead of one massive project?

Are you talking about *modules* or *packages*?  Generally, the advice is
to split your code into modules once it becomes clear that certain bits
of code ought not to know about the implementation details of other bits
of code in the same file.  Some people insist that the cut-off is
somewhere below 1000 LOC, though personally I'm less interested in
arbitrary limits than in how cohesive/self-contained the code is.
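As a minimal sketch of that idea (module and symbol names here are
invented for illustration), `private` symbols in one module are simply
invisible to the other once the code is split into separate files:

```d
// lexer.d -- implementation details stay private to this module
module lexer;

private size_t position;        // not visible from any other module

string nextToken(string src)
{
    // trivial placeholder tokenizer: one character per token
    if (position >= src.length) return "";
    return [src[position++]].idup;
}

// parser.d -- sees only lexer's public interface
module parser;

import lexer;                   // can call nextToken, cannot touch position

string firstToken(string src)
{
    return nextToken(src);
}
```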

The difference between modules and packages is a bit blurrier, since
you can create a package.d to make a package behave essentially like a
single module.  But it just so happens that D's requirement that
package structure match directory structure maps rather well onto
separate compilation: just compile each directory separately and link
everything at the end.
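Concretely, that might look like this (directory names and file paths
are hypothetical; the dmd flags shown in the comments are the standard
-c / -of options):

```d
// Hypothetical layout, with packages mapped to directories:
//
//   src/app.d            -- module app
//   src/util/package.d   -- module util (the package "front door")
//   src/util/strings.d   -- module util.strings
//
// src/util/package.d:
module util;

public import util.strings;   // the package now imports like one module

// Separate compilation then follows the directory structure:
//
//   dmd -c src/util/*.d -ofutil.o      // compile the util package alone
//   dmd -c src/app.d -ofapp.o          // compile the top level alone
//   dmd app.o util.o -ofapp            // link at the end
```

Each directory recompiles independently, so touching util.strings only
rebuilds util.o, not the whole program.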

> Is it just a little extra handwriting/boilerplate, or is there a
> performance impact talking to other modules vs keeping it all in one?

What performance impact are we talking about here, compile-time or
runtime?  Compile times might increase slightly because the compiler
has to open more files and look up directories, but the difference
should be minimal.  There is no runtime penalty: modules are mainly a
tool for code organization and management, and have little bearing on
the actual machine code generated at the end.


It always amuses me that Windows has a Safe Mode during bootup. Does that mean that Windows is normally unsafe?
