Practical parallelization of D compilation

H. S. Teoh hsteoh at quickfur.ath.cx
Wed Jan 8 19:31:13 UTC 2020


On Wed, Jan 08, 2020 at 06:56:20PM +0000, Guillaume Lathoud via Digitalmars-d-learn wrote:
> Thanks to all for the answers.
> 
> The package direction is precisely what I am trying to avoid. It is
> still not obvious to me how much work (how many trials) would be
> needed to decide on granularity, nor how much work it would take to
> automate the decision of whether or not to recompile a package; and
> finally, when a given package has to be recompiled because only one
> or a few files changed, one would most likely wait (much) longer
> than with the current solution - and within a single process.

This is the problem that build systems set out to solve.  Existing
tools like make would work (I dislike make for various reasons, but
for simple projects it may well suffice); you just write a handful of
rules for compiling .d files into object files and then linking them,
as in the sketch below.  Personally I prefer SCons
(https://scons.org/), but there are plenty of similar build systems
out there, like tup, Meson, CMake, etc.  There are also fancier
offerings that double as package managers, like Gradle, but from the
sounds of it you're not interested in that just yet.
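
For instance, a minimal makefile for a flat source directory might
look like this (a sketch, assuming GNU make and dmd; the file and
target names are illustrative):

	DMD  = dmd
	SRCS = $(wildcard *.d)
	OBJS = $(SRCS:.d=.o)

	# Link all object files into the final executable.
	myapp: $(OBJS)
		$(DMD) -of$@ $(OBJS)

	# Compile each module separately, so only changed files rebuild.
	%.o: %.d
		$(DMD) -c -of$@ $<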

As for using packages or not: I do have some projects where I compile
different subsets of .d files separately, for various reasons.
Sometimes it's because I'm producing multiple executables that share a
subset of source files.  Other times it's for performance reasons --
specifically, Vibe.d Diet templates are an absolute *bear* to compile,
so I write my SCons rules such that the Diet-dependent modules are
compiled separately from everything else.  That way, if no Diet
templates change, I cut quite a lot off my compile times.
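
As a rough sketch of that setup (with hypothetical directory names,
and assuming SCons's bundled dmd tool; my real rules are more
involved):

	env = Environment(tools=['default', 'dmd'])

	# Hypothetical split: Diet-heavy modules live under views/,
	# everything else under src/.
	diet_objs = env.Object(Glob('views/*.d'))
	other_objs = env.Object(Glob('src/*.d'))

	# SCons rebuilds only objects whose sources (or dependencies)
	# changed, so untouched Diet modules are skipped entirely.
	env.Program('server', diet_objs + other_objs)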

So it *is* certainly possible; you just have to be comfortable with
getting your hands dirty and writing a few build scripts every now and
then. IMO, the time investment is more than worth the reduction in
compilation waiting times.

Furthermore, in medium to largish projects I sometimes find myself
separately compiling a single module plus its subtree of imports (via
dmd -i), usually when I'm developing a new module and want to run its
unittests, or when there's a problem with a particular module and I
want to be able to run unittests or test code (in the form of
temporary unittest blocks) without waiting for the entire program to
compile. In such cases, I do:

	# -i: pull in imported modules automatically; -unittest: compile
	# in unittest blocks; -main: add an empty main(); -run: compile
	# and immediately execute the result
	dmd -i -unittest -main -run mymod.d

and let dmd -i figure out which subset of source files to pull in. It's
convenient, and cuts down quite a bit on waiting times because I don't
have to recompile the entire program each iteration.


[...]
> having a one-liner solution (no install, no config file) delivered
> along with the compiler, or as a compiler option, could fill a sweet
> spot between a toy app (1 or 2 source files) and a more complex
> architecture relying on a package manager. This might remove a few
> obstacles to D usage. This is of course purely an opinion.

I find myself in the same place -- my projects are generally more than
1 or 2 files, but not so many that I need a package manager (plus I
dislike package managers for various reasons).  I find that a modern,
lightweight build system like SCons or tup fills that need very well.
And to be frank, a 200+ file project is *far* larger than any of mine;
surely it's worth the comparatively small effort of spending a couple
of hours writing a build script (makefile, SConscript, what have you)?
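
For a sense of scale, a complete Tupfile for a flat source layout can
be just two rules. This is a sketch from memory, with an illustrative
"myapp" name; double-check the %-flags against tup's manual:

	# Compile each .d module to its own object file; tup rebuilds
	# only what changed.
	: foreach *.d |> dmd -c %f -of%o |> %B.o

	# Link all the objects into the executable.
	: *.o |> dmd -of%o %f |> myapp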


T

-- 
ASCII stupid question, getty stupid ANSI.

