Scaling rdmd up: build package at a time
anonymous via Digitalmars-d
digitalmars-d at puremagic.com
Sat Jun 6 13:27:11 PDT 2015
On Saturday, 6 June 2015 at 19:44:15 UTC, Andrei Alexandrescu
wrote:
> On 6/6/15 11:47 AM, Jacob Carlborg wrote:
[...]
>> I mean that rdmd should compile all files that have changed,
>> including their dependencies, no more, no less. It should
>> compile all these files in one go.
>
> Yah, that's the traditional C-style module-at-a-time approach.
I think the traditional approach would be to compile modules
separately, not "all [...] in one go".
> Somewhat paradoxically, for D it's faster to compile several
> files at once, even though not all were affected by the change.
>
> So in the package-at-a-time approach if at least one module in
> a package is affected by a change, the entire package gets
> rebuilt.
This may beat separately compiling the modules that need
rebuilding. But it doesn't beat compiling all of them at once,
does it?
My understanding of things:
Current behaviour: When any module has changed, pass all source
files to one dmd invocation.
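To make the variants concrete, take a hypothetical layout: app.d plus a package util containing util/a.d and util/b.d (the file names and -of/-od flags below are just illustration, not actual rdmd output). The current behaviour boils down to:

```shell
# Hypothetical layout: app.d imports util/a.d and util/b.d.
# Current rdmd behaviour: once anything changed, every source
# file is handed to a single dmd invocation.
CURRENT="dmd -ofapp app.d util/a.d util/b.d"
echo "$CURRENT"
```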
The makefile approach: Separately recompile every module that
needs rebuilding; then link the new object files with the cached
ones.
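With the same hypothetical layout, and supposing only util/b.d changed, the makefile approach would look roughly like (object file names assumed):

```shell
# Makefile approach: recompile just the changed module,
# then link its fresh object file with the cached ones.
STEP1="dmd -c util/b.d -odobjs"
STEP2="dmd -ofapp objs/app.o objs/a.o objs/b.o"
echo "$STEP1"
echo "$STEP2"
```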
Your proposal, variant 1: For every module that needs rebuilding,
separately recompile its whole package; then link the new object
files with the cached ones of other packages.
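Under variant 1, the same change to util/b.d would drag its whole package into one compile step, again as a sketch with assumed file names:

```shell
# Variant 1: util/b.d changed, so all of package util is
# recompiled in one dmd call; other packages (here just app)
# keep their cached object files for the link step.
STEP1="dmd -c util/a.d util/b.d -odobjs"
STEP2="dmd -ofapp objs/app.o objs/a.o objs/b.o"
echo "$STEP1"
echo "$STEP2"
```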
Variant 2: For every module that needs rebuilding, determine what
package it belongs to; then pass the source files of those
packages and the cached object files of other packages to one dmd
invocation.
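Variant 2 skips the separate link step: dmd accepts object files on its command line and passes them through to the linker, so (sketching with the same assumed layout):

```shell
# Variant 2: one dmd invocation mixing the sources of the
# affected package with the cached objects of untouched packages.
CMD="dmd -ofapp util/a.d util/b.d objs/app.o"
echo "$CMD"
```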
The seemingly obvious thing to do: Pass the source files that
need rebuilding and the object files of other modules to one dmd
invocation.
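That last strategy, in the same hypothetical setup with only util/b.d changed, would be the smallest invocation of all:

```shell
# "Obvious" strategy: only the changed sources themselves, plus
# the cached object files of every other module, in one call.
CMD="dmd -ofapp util/b.d objs/app.o objs/a.o"
echo "$CMD"
```

The open question the thread raises is whether this actually wins, given that dmd is reportedly faster when compiling several related files at once.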