Make DMD emit C++ .h files same as .di files

H. S. Teoh hsteoh at quickfur.ath.cx
Mon Feb 25 22:55:18 UTC 2019


On Mon, Feb 25, 2019 at 10:14:18PM +0000, Rubn via Digitalmars-d wrote:
> On Monday, 25 February 2019 at 19:28:54 UTC, H. S. Teoh wrote:
[...]
> > <off-topic rant>
> > This is a perfect example of what has gone completely wrong in the world
> > of build systems. Too many assumptions and poor designs over an
> > extremely simple and straightforward dependency graph walk algorithm,
> > that turn something that ought to be trivial to implement into a
> > gargantuan task that requires a dedicated job title like "build
> > engineer".  It's completely insane, yet people accept it as a fact of
> > life. It boggles the mind.
> > </off-topic rant>
[...]
> I don't think it is as simple as you make it seem. Especially when you
> need to start adding components that need to be build that isn't
> source code.

It's very simple. The build description is essentially a DAG whose nodes
represent files (well, any product, really, but let's say files for a
concrete example), and whose edges represent commands that transform
input files into output files. All the build system has to do is perform
a topological walk of this DAG, executing the command associated with
each edge to derive the outputs from the inputs.

This is all that's needed. The rest is all fluff.
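The core walk really is that small. Here's a minimal sketch in Python; the file names and commands are invented for illustration, and `graphlib` is in the standard library from Python 3.9 on:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical build description: each output node maps to its input
# nodes and the command (edge) that derives it from them.
rules = {
    "hello.o": {"inputs": ["hello.c"], "cmd": "cc -c hello.c -o hello.o"},
    "hello":   {"inputs": ["hello.o"], "cmd": "cc hello.o -o hello"},
}

def build_order(rules):
    """Topologically sort the DAG and return the commands to run, in order."""
    graph = {out: rule["inputs"] for out, rule in rules.items()}
    order = TopologicalSorter(graph).static_order()
    # Leaf nodes (plain source files) have no rule and hence no command.
    return [rules[node]["cmd"] for node in order if node in rules]

print(build_order(rules))
# A real build system would hand each command to the shell at this point.
```

Everything else a build tool does is layered on top of (or squeezed around) this loop.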

The basic problem with today's build systems is that they impose
arbitrary assumptions on top of this simple DAG. For example, all input
nodes are arbitrarily restricted to source code files, or in some bad
cases, source code of some specific language or set of languages. Then
they arbitrarily limit edges to be only compiler invocations and/or
linker invocations.  So the result is that if you have an input file
that isn't source code, or if the output file requires invoking
something other than a compiler/linker, then the build system doesn't
support it and you're left out in the cold.

Worse yet, many "modern" build systems assume a fixed depth of paths in
the graph, i.e., you can only compile source files into binaries, you
cannot compile a subset of source files into an auxiliary utility that
in turn generates new source files that are then compiled into an
executable.  So automatic code generation is ruled out, preprocessing is
ruled out, etc., unless you shoehorn all of that into the compiler
invocation, which is a ridiculous idea.

None of these restrictions are necessary, and they only needlessly limit
what you can do with your build system.

I understand that these assumptions are primarily to simplify the build
description, e.g., by inferring dependencies so that you don't have to
specify edges and nodes yourself (which is obviously impractical for
large projects).  But these additional niceties ought to be implemented
as a SEPARATE layer on top of the topological walk, and the user should
not be arbitrarily prevented from directly accessing the DAG
description.  The way so many build systems are designed, either you
have to do everything manually (like makefiles, which everybody hates),
or the hood is welded shut and you can only do what the authors decided
you should be able to do, and nothing else.
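An inference layer needs no special status: it can simply emit ordinary edges into the same rule table the user is free to edit by hand. A hypothetical sketch, with invented file contents and commands:

```python
import re

def infer_c_deps(text):
    """Scan C source text for local #include lines; return them as edge inputs."""
    return re.findall(r'#include\s+"([^"]+)"', text)

# The inferred edges land in the same rule table a user could write directly.
sources = {"main.c": '#include "util.h"\n#include "log.h"\nint main(void){}\n'}
rules = {}
for src, text in sources.items():
    obj = src.replace(".c", ".o")
    rules[obj] = {"inputs": [src] + infer_c_deps(text),
                  "cmd": f"cc -c {src} -o {obj}"}

# Nothing stops the user from appending a hand-written, non-compiler edge:
rules["assets.bin"] = {"inputs": ["assets.txt"], "cmd": "pack assets.txt assets.bin"}
print(rules["main.o"]["inputs"])
```

The convenience layer and the user write to the same DAG, so the niceties never wall off the underlying graph.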


[...]
> It's easy to say build-systems are overly complicated until you
> actually work on a big project.

You seem to think that I'm talking out of an ivory tower.  I assure you
I know what I'm talking about.  I have written actual build systems that
do things like this:

- Compile a subset of source files into a utility;

- Run said utility to transform certain input data files into source
  code;

- Compile the generated source code into executables;

- Run said executables on other data files to transform the data into
  PovRay scene files;

- Run PovRay to produce images;

- Run post-processing utilities on said images to crop / reborder them;

- Run another utility to convert these images into animations;

- Install these animations into a target directory.

- Compile another set of source files into a different utility;

- Run said utility on input files to transform them to PHP input files;

- Run php-cli to generate HTML from said input files;

- Install said HTML files into a target directory.

- Run a network utility to retrieve the history of a specific log file
  and pipe it through a filter to extract a list of dates;

- Run a utility to transform said dates into a gnuplot input file for
  generating a graph;

- Run gnuplot to create the graph;

- Run postprocessing image utilities to touch up the image;

- Install the result into the target directory.

None of the above are baked-in rules. The user is fully capable of
specifying whatever transformation he wants on whatever inputs he wants
to produce whatever output he wants.  No straitjackets, no stupid hacks
to work around stupid build system limitations. Tell it how you want
your inputs to be transformed into outputs, and it handles the rest for
you.
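A multi-stage pipeline like the ones above needs no baked-in rules, only nodes and edges: a code generator sits in the graph like any other product. A sketch with placeholder tool and file names:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Placeholder rules: build a generator, run it over data, render the
# result, post-process.  None of these shapes is special-cased.
rules = {
    "gen":       {"inputs": ["gen.c"],           "cmd": "cc gen.c -o gen"},
    "scene.pov": {"inputs": ["gen", "data.txt"], "cmd": "./gen data.txt > scene.pov"},
    "frame.png": {"inputs": ["scene.pov"],       "cmd": "povray scene.pov +Oframe.png"},
    "anim.gif":  {"inputs": ["frame.png"],       "cmd": "convert frame.png anim.gif"},
}

graph = {out: rule["inputs"] for out, rule in rules.items()}
order = [n for n in TopologicalSorter(graph).static_order() if n in rules]
print(order)
```

The depth of the chain is whatever the user declares; nothing limits it to "compile, then link".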

Furthermore, the build system is incremental: if I modify any of the
above input files, it automatically runs the necessary commands to
derive the updated output files AND NOTHING ELSE (i.e., it does not
needlessly re-derive stuff that hasn't changed).  Better yet, if any of
the intermediate output files are identical to the previous outputs, the
build stops right there and does not needlessly recreate other outputs
down the line.
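That early-cutoff behaviour can be sketched with content hashes: a node whose re-derived output is byte-identical reports "no change", so nothing downstream runs. A toy model, with an in-memory dict standing in for the filesystem:

```python
import hashlib

# Toy "filesystem" and the recorded content hash of each node.
files = {"a.txt": b"input", "b.txt": b"derived", "c.txt": b"final"}
hashes = {name: hashlib.sha256(data).hexdigest() for name, data in files.items()}

def rebuild(node, new_content):
    """Store a node's re-derived content; report whether downstream work is needed."""
    new_hash = hashlib.sha256(new_content).hexdigest()
    if new_hash == hashes[node]:
        return False          # identical output: cut the build off here
    files[node], hashes[node] = new_content, new_hash
    return True

# Re-deriving b.txt to the same bytes triggers no further work:
print(rebuild("b.txt", b"derived"))   # False: early cutoff
print(rebuild("b.txt", b"changed"))   # True: c.txt must now be re-derived
```

A real tool would persist the hashes between runs, but the decision rule is the same.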

The build system is also reliable: running the build in a dirty
workspace produces identical products as running the build in a fresh
checkout.  I never have to worry about doing the equivalent of 'make
clean; make', which is a stupid thing to have to do in 2019. I have a
workspace that hasn't been "cleaned" for months, and running the build
on it produces exactly the same outputs as a fresh checkout.
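One way to get that reliability is to judge up-to-dateness by content hash rather than timestamp, so a stale file with a fresh mtime can't be mistaken for current. A sketch, with an invented manifest format:

```python
import hashlib
import os
import tempfile

def input_signature(paths):
    """Combined hash of a rule's input files' contents (order-independent)."""
    h = hashlib.sha256()
    for p in sorted(paths):
        with open(p, "rb") as f:
            h.update(hashlib.sha256(f.read()).digest())
    return h.hexdigest()

def is_up_to_date(manifest, target, inputs):
    # The manifest records the input signature each target was built from.
    return manifest.get(target) == input_signature(inputs)

# Touching a file (new mtime, identical bytes) does not trigger a rebuild:
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "a.c")
    with open(src, "wb") as f:
        f.write(b"int main(void){return 0;}")
    manifest = {"a.o": input_signature([src])}
    os.utime(src)                                  # fresh timestamp, same content
    print(is_up_to_date(manifest, "a.o", [src]))   # True
```

With this rule, building in a dirty workspace and building from a fresh checkout converge on the same products.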

There's more I can say, but basically, this is the power that having
direct access to the DAG can give you.  In this day and age, it's
inexcusable not to be able to do this.

Any build system that cannot do all of the above is a crippled build
system that I will not use, because life is far too short to waste
fighting with your build system rather than getting things done.


T

-- 
English has the lovely word "defenestrate", meaning "to execute by throwing someone out a window", or more recently "to remove Windows from a computer and replace it with something useful". :-) -- John Cowan
