Build using D

H. S. Teoh hsteoh at quickfur.ath.cx
Wed Apr 29 17:09:54 UTC 2020


On Fri, Apr 17, 2020 at 08:24:06AM +0000, Guillaume Piolat via Digitalmars-d wrote:
[...]
> A declarative build description trades power for being
> conventional and for deriving rules that work on every project
> (through "lack of power").

IMNSHO, this is a flawed argument.  The basis of a build system is a
directed acyclic graph, with sources, intermediate artifacts, and
final targets represented by the nodes, and derivation rules
represented by the edges.  This model works for every project, yet
gives up none of that "power".

The problem with most modern build systems is that instead of
tackling this essential structure directly, they try to "simplify" it
by tacking on various assumptions and conventions, which impose
artificial limitations on the system.  Just like the old version
control systems -- RCS, CVS, Subversion, etc. -- they assume a
particular workflow / revision history structure, and when you
encounter a case that wasn't considered in the original design, it's
a pain to work around (creating a branch in CVS takes an incredibly
long time, for example; SVN is much better in that a branch is just a
copy, but it still requires server roundtrips, and merges can
sometimes be nightmarish).  Modern systems like Git directly address
the underlying structure of a revision history -- it's a directed
acyclic graph, you see -- and therefore allow extremely efficient
branching and merging, and other graph manipulations unimaginable in
the older revision control systems.

It's about time the same thing happened in the build system world:
something that tackles the underlying DAG directly, and provides
modern guarantees like 100% build reproducibility, the ability to
handle any task that fits into the DAG structure,[1] not just canned
operations like "derive executable from C++ sources", and build times
proportional to the size of the change rather than the size of the
entire workspace.
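
Change-proportional build times fall out of the same model: key each
node by a content hash of its inputs instead of a timestamp, and
untouched subgraphs drop out of the rebuild entirely.  A sketch of
one way to do it (not any particular tool's scheme):

    import std.digest.sha : SHA256;
    import std.file : read;

    // Fingerprint a rule's inputs; rebuild only when the result
    // differs from the fingerprint recorded after the last
    // successful build.
    ubyte[32] fingerprint(const string[] inputs)
    {
        SHA256 h;
        h.start();
        foreach (f; inputs)
            h.put(cast(const(ubyte)[]) read(f));
        return h.finish();
    }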

I have a website project, for example, where the sources include
mathematical model files that get transformed into POVRay input
files, which are fed to povray to produce images, which are then
post-processed by imagemagick into .png files and installed onto a
webserver -- nowhere is a "compiler" involved, but the structure of
the task remains the same: input -> transform -> output.  Another
project takes 3D model data as input and transforms it into runtime
data plus accessor code in the form of D source code, which then gets
cross-compiled to ARM and packaged into an APK for Android.
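
In terms of the hypothetical Rule type sketched earlier, the whole
website pipeline is nothing but data (file names invented, and
model2pov stands in for my own conversion helper):

    Rule[] pipeline = [
        Rule(["scene.model"], ["scene.pov"],
             "model2pov scene.model > scene.pov"),
        Rule(["scene.pov"],   ["scene.tga"],
             "povray +Iscene.pov +Oscene.tga"),
        Rule(["scene.tga"],   ["scene.png"],
             "convert scene.tga scene.png"),
    ];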

In all of these cases, the details of the input -> output
transformation differ widely, but they nonetheless all fit into the
framework of dependency resolution over a DAG.  This is not even a
tough problem like satisfying versioned dependencies, which is
NP-complete; current technology handles it with ease.
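
Indeed, resolving dependencies over a DAG is just a topological sort,
linear in the size of the graph.  Continuing the sketch from before
(plain Kahn's algorithm over the hypothetical Rule type):

    // Order rules so that each rule runs after the rules producing
    // its inputs.  O(rules + edges); no search, no backtracking.
    Rule[] topoOrder(Rule[] rules)
    {
        size_t[string] producer;  // output file -> producing rule
        foreach (i, r; rules)
            foreach (o; r.outputs)
                producer[o] = i;

        auto pending = new size_t[rules.length]; // unresolved inputs
        auto dependents = new size_t[][rules.length];
        foreach (i, r; rules)
            foreach (inp; r.inputs)
                if (auto p = inp in producer) // no producer: source
                {
                    pending[i]++;
                    dependents[*p] ~= i;
                }

        size_t[] ready;
        foreach (i; 0 .. rules.length)
            if (pending[i] == 0)
                ready ~= i;

        Rule[] order;
        while (ready.length)
        {
            auto i = ready[$ - 1];
            ready = ready[0 .. $ - 1];
            order ~= rules[i];
            foreach (d; dependents[i])
                if (--pending[d] == 0)
                    ready ~= d;
        }
        assert(order.length == rules.length, "cycle in build graph");
        return order;
    }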

The only real obstacle, IMNSHO, is the inertia of developers too used
to their aging tools to try something new, plus the lack of a
well-known project to propel such a build system from obscurity into
the limelight (without Linux, for instance, I honestly doubt most
people would have even heard of git -- Torvalds probably wouldn't
have bothered to write it in the first place).


> For example, dub test, dub dustmite, etc., are impossible with a
> powerful imperative build (as opposed to a declarative one).

I don't agree with this.  A "powerful" build system can still be
fully declarative, and amenable to auto-testing, bisecting, etc.
Nothing about the DAG model requires that your build be expressed in
an imperative language; you just have to think outside the box a bit
to arrive at a workable solution.
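
Concretely: when the build description is plain data, as in the Rule
sketch earlier, external tooling (test runners, bisectors, IDEs) can
query the graph without executing a single build command:

    // Inspect the graph without running anything; dub test or
    // dustmite-style bisection then become graph queries plus
    // selective execution.
    void listTargets(const(Rule)[] rules)
    {
        import std.stdio : writeln;
        foreach (r; rules)
            foreach (o; r.outputs)
                writeln(o);
    }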


> Your "unique" build combination has to be justified.  If a standard
> build system cannot express your program, _perhaps_ it's up to you to
> simplify your build, rather than the build system to adapt to unique
> complexity.

I can't agree with this.  Why should I change the structure of my
build just because some build system stuck in a dated paradigm can't
handle its "complexity"?  Nothing could be more elementary than the
concept of transforming a set of inputs into a set of outputs via
some specified transformation.  I'm not even asking for something as
complex as solving an NP-complete problem to satisfy versioned
dependencies.  If a build system can't handle something that simple,
_perhaps_ that build system ought to be replaced with something more
competent! ;-)

(And I note, ironically, that dub *does* include a solver for the
NP-complete problem of satisfying versioned dependencies, *yet* it
cannot handle something as simple as compiling a helper program from
a subset of the input files, running said program to generate some
more source files, and compiling those into the final executable.
Each of these steps is again a simple input -> transform -> output
step; something capable of solving an NP-complete problem ought to
find such a thing beyond trivial.  Yet it cannot, because of the
artificial restrictions arbitrarily placed upon it -- *needless*
restrictions, IMNSHO.)
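
For the record, in the hypothetical Rule terms sketched earlier, that
entire scenario is three rules' worth of data (file names invented
for illustration):

    Rule[] codegen = [
        Rule(["tools/gen.d"], ["tools/gen"],
             "dmd -of=tools/gen tools/gen.d"),
        Rule(["tools/gen", "data/models.dat"], ["src/generated.d"],
             "tools/gen data/models.dat > src/generated.d"),
        Rule(["src/app.d", "src/generated.d"], ["bin/app"],
             "dmd -of=bin/app src/app.d src/generated.d"),
    ];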


> That said, there is always a large class of programs that cannot
> easily be built, that domain is cursed.

It's only cursed if one refuses to drop one's outmoded way of thinking
and embrace a better paradigm. ;-)


T

-- 
What's an anagram of "BANACH-TARSKI"?  BANACH-TARSKI BANACH-TARSKI.

