DIP11: Automatic downloading of libraries

Andrei Alexandrescu SeeWebsiteForEmail at erdani.org
Tue Jun 14 14:56:58 PDT 2011


On 6/14/11 4:38 PM, Nick Sabalausky wrote:
> - Putting it in the compiler forces it all to be written in C++. As an
> external tool, we could use D.

Having the compiler communicate with a download tool supplied with the 
distribution seems to be a very promising approach that would address 
this concern.
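
For illustration, here is a minimal sketch of the companion tool's job, 
written in D (the command-line protocol, argument order, and cache 
layout are my assumptions, not part of the DIP): dmd would pass the URL 
of a missing module and a cache directory; the tool downloads the file 
and prints the local path back for dmd to compile.

    // sketch: fetch one module and report where it landed
    import std.net.curl : download;
    import std.path : baseName, buildPath;
    import std.stdio : writeln;

    void main(string[] args)
    {
        auto url  = args[1];                           // module URL from dmd
        auto dest = buildPath(args[2], baseName(url)); // cache dir from dmd
        download(url, dest);
        writeln(dest); // dmd reads this path and adds the file to the build
    }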

> - By default, it ends up downloading an entire library one inferred source
> file at a time. Why? Libraries are a packaged whole. Standard behavior
> should be for libraries to be treated as such.

Fair point, though in practice the effect is that one ends up 
downloading exactly the modules used from that library (and potentially 
from others).

Although it may seem that libraries are packaged as a whole, that view 
ignores the interdependencies across them. This proposal handles those 
interdependencies organically: fetching a module also fetches whatever 
that module transitively imports.

> - Are we abandoning zdmd now? (Or is it "dmdz"?)

It is a related topic. That project, although implemented, has 
unfortunately failed to capture people's interest.

> - Does it automatically *compile* the files it downloads or merely use them
> to satisfy imports?

We need to arrange things such that the downloaded files are also 
compiled and linked into the project.
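
Concretely (the tool name and cache path here are hypothetical), once 
the tool has fetched libX's modules into its cache, the net effect 
should be the same as if the user had written:

    dmd myapp.d ~/.dget/cache/dsource/libX/util.d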

> - Does every project that uses libX have to download it separately? If not
> (or really even if so), how does the compiler handle different versions of
> the lib and prevent "dll hell"? Versioning seems to be an afterthought in
> this DIP - and that's a guaranteed way to eventually find yourself in dll
> hell.

Versioning is a policy matter that can, I think, be addressed within the 
URL structure. This proposal tries to support versioning without 
explicitly imposing it or standing in its way.
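
For example (the pragma's name and shape here are illustrative, not 
mandated by the proposal), a project could pin a version simply by the 
URL it points at:

    pragma(importpath, "dsource.libX", "http://dsource.org/libX/1.2/");   // pinned
    pragma(importpath, "dsource.libX", "http://dsource.org/libX/trunk/"); // tracks head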

> - How do you tell it to "update libX"? Not by expecting the user to manually
> clear the cache, I hope.

The external tool that would work in conjunction with dmd could have 
such a flag.
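
Something along these lines, say (tool name and flag spelling are 
hypothetical):

    dget --update dsource.libX

which would re-fetch the cached modules under that package.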

> - With a *real* package management tool, you'd have a built-in (and
> configurable) list of central data sources.

I don't see why you couldn't have that with this approach too.

> If you want to use something you
> don't have installed, and it exists in one of the stores (maybe even one of
> the built-in ones), you don't have to edit *ANYTHING AT ALL*. It'll just
> grab it, no changes to your source needed at all, and any custom steps
> needed would be automatically handled. And if it was only in a data store
> that you didn't already have in your list, all you have to do is add *one*
> line. Which is just as easy as the DIP, but that *one* step will also
> suffice for any other project that needs libX - no need to add the line for
> *each* of your libX-using projects. Heck, you wouldn't even need to edit a
> file, just do "package-tool addsource http://...". The DIP doesn't even
> remotely compare.

I think it does. Clearly a command-line equivalent for the pragma needs 
to exist, and the appropriate pragmas can be added to dmd.conf. With 
such a setup, a program would just issue:

import dsource.libX;

and get everything automatically.
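
For instance (the flag spelling is hypothetical), the same mapping 
could be supplied on the command line:

    dmd -importurl=dsource=http://dsource.org/ myapp.d

or once and for all in dmd.conf's DFLAGS, so that individual projects 
need no per-project configuration at all.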

> - I think you're severely overestimating the number of extra dmd invocations
> that would be needed by using an external build tool.

I'm not estimating much. It's Adam who shared impressions from actual use.

> I believe this is
> because your idea centers around discovering one file at a time instead of
> properly handling packages at the *package* level.

The issue with working at the package level is that HTTP offers no 
standard way to list the files in a directory. However, if we arrange 
to support zip files, the tool could detect that a zip file sits at the 
package's location and download it whole.
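
A sketch of that detection in D, using std.net.curl (the URL layout and 
the idea of probing with a HEAD request are my assumptions):

    // probe for a zip at the package URL; fall back to per-module fetch
    import std.net.curl : HTTP, download;

    bool tryFetchZip(string pkgUrl, string zipPath)
    {
        auto http = HTTP(pkgUrl ~ ".zip");
        http.method = HTTP.Method.head;      // cheap existence check
        try
        {
            http.perform();
            if (http.statusLine.code == 200)
            {
                download(pkgUrl ~ ".zip", zipPath);
                return true;                 // caller unzips into the cache
            }
        }
        catch (Exception) {}                 // network error: treat as no zip
        return false;
    }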

> Consider this:
>
> You tell BuildToolX to build MyApp. It looks at MyApp.config to see what
> libs it needs. It discovers LibX is needed. It fetches LibX.config, and
> finds its dependencies. Etc., building up a dependency graph. It checks for
> any problems with the dependency graph before doing any real work (something
> the DIP can't do). Then it downloads the libs, and *maybe* runs some custom
> setup on each one. If the libs don't have any custom setup, you only have
> *one* DMD invocation (two if you use RDMD). If the libs do have any custom
> setup, and it involves running dmd, then that *only* happens the first time
> you build MyApp (until you update one of the libs, causing its one-time
> setup to run once more).
>
> I think this proposal is a hasty idea that just amounts to chasing after
> "the easy way out".

I'm just trying to define a simple backend that facilitates sharing code 
and using shared code, without arrogating the role and merits of a 
more sophisticated package management tool and without standing in the 
way of one. Ideally, the backend should be useful to such a tool - e.g. 
I imagine a tool could take a plain file format and transform it into a 
series of pragmas directing library locations.
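
For example (the file format and pragma spelling are invented for 
illustration), such a tool could read a plain dependency list:

    dsource.libX    http://dsource.org/libX/1.2/
    dsource.libY    http://dsource.org/libY/0.9/

and emit the corresponding pragmas:

    pragma(importpath, "dsource.libX", "http://dsource.org/libX/1.2/");
    pragma(importpath, "dsource.libY", "http://dsource.org/libY/0.9/");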

As always, criticism is appreciated, particularly of the kind that 
prompts pushing things forward - as was the case with the idea of a 
download tool that's a separate executable, companion to dmd.


Thanks,

Andrei

