Compilation memory use
Anonymouse
zorael at gmail.com
Mon May 4 17:00:21 UTC 2020
TL;DR: Is there a way to tell which module or other section of a
codebase is eating memory during compilation?
I'm keeping track of compilation memory usage with zsh's `time`
builtin and some environment variables. It typically looks like this:
```
$ time dub build -c dev
Performing "debug" build using /usr/bin/dmd for x86_64.
[...]
Linking...
To force a rebuild of up-to-date targets, run again with --force.
dub build -c dev 9.47s user 1.53s system 105% cpu 10.438 total
avg shared (code): 0 KB
avg unshared (data/stack): 0 KB
total (sum): 0 KB
max memory: 4533 MB
page faults from disk: 1
other page faults: 1237356
```
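For context, the extra lines after the usual `time` summary come from
zsh's `TIMEFMT` parameter. A format string along these lines should
reproduce the output above; the exact spacing is reconstructed from
the output, so treat it as a sketch:

```
# Reconstructed from the output above (see TIMEFMT in zshparam(1)):
# %X/%D/%K are averages/totals in KB, %M is max memory in MB,
# %F/%R are major (disk) and minor page fault counts.
TIMEFMT=$'%J %U user %S system %P cpu %*E total\n'\
$'avg shared (code):         %X KB\n'\
$'avg unshared (data/stack): %D KB\n'\
$'total (sum):               %K KB\n'\
$'max memory:                %M MB\n'\
$'page faults from disk:     %F\n'\
$'other page faults:         %R'
```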
So it tells me the maximum memory that was required to compile
everything. However, it tells me only that; there's no way to know
which parts of the code are expensive and which aren't.
I can copy dub's dmd command line, rerun it with `-v`, and try to
infer that the modules that are slow to pass semantic3 are also the
memory-hungry ones. But are they? The workflow I mean is sketched
below.
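Something like this, where the flags and module names are just
placeholders for whatever dub actually generates:

```
# dub --verbose echoes the full dmd command line it runs
$ dub build -c dev --force --verbose
[...]

# Rerunning that command with -v added makes dmd log each pass
# per module, so a module that lingers in semantic3 stands out.
$ /usr/bin/dmd -v <flags copied from dub> source/app.d
[...]
semantic  app
semantic2 app
semantic3 app
code      app
[...]
```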
Is there a better metric?