DIP11: Automatic downloading of libraries

Nick Sabalausky a at a.a
Tue Jun 14 20:14:05 PDT 2011


"Adam D. Ruppe" <destructionator at gmail.com> wrote in message 
news:it927g$c7c$1 at digitalmars.com...
> Nick Sabalausky:
>> - By default, it ends up downloading an entire library one inferred
>> source file at a time. Why? Libraries are a packaged whole.
>> Standard behavior should be to treat libraries as such.
>
> I don't agree. You don't import a library - you import a module.
> It's natural to just download that module and get what you need
> that way.
>

You import a module *from* a library. Even if you only import one module, 
that module is likely going to import others from the same lib, which may 
import others still, and chances are you'll end up needing most of the 
modules anyway.

Also, if a library needs any special "setup" step, this approach won't work 
at all.

Plus, I see no real benefit in having a "partial" library installation.

>> Does every project that uses libX have to download it separately?
>
> My approach is to download the libraries to a local subdirectory.
>
> $ cd foo
> $ dir
>   app.d  # btw, app.d uses "import foo.bar;"
> $ build app
> $ dir
>   app  app.o  app.d  foo/
>
>
> If you want to share a library, you can link the local subdir
> to a central lib dir using your operating system's features.
> (symlinks, junctions, whatever)
>
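For reference, the linking step being described would amount to something 
like this (a minimal sketch; the $HOME/dlibs path is just an assumed 
location for the central lib dir, and "foo" is the package from the example 
above):

```shell
# Assumed central library dir containing the "foo" package:
mkdir -p "$HOME/dlibs/foo"

# Stand-in for a project directory:
cd "$(mktemp -d)"

# Link the package into the project
# (on Windows this would be: mklink /J foo C:\dlibs\foo):
ln -s "$HOME/dlibs/foo" foo

test -d foo && echo linked   # prints "linked"
```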

I think a substantial number of people (*especially* Windows users - it's 
unrealistic to expect them to use anything like junctions) would expect to 
be able to use an already-installed library without special setup for every 
single project that uses it.

And here's a real killer: If someone downloads your lib, or the source for 
your app, should they *really* be expected to wire up all your lib's/app's 
dependencies manually? That defeats the whole point of easy package 
management.

> doing, and packing your application for distribution is as simple
> as zipping up the directory. Dependencies included automatically.
>

That can't always be done, shouldn't always be done, and not everyone wants 
to. There *are* benefits to packages being independent, but this throws them 
away. Yes, there are downsides to not having dependencies included 
automatically, but those are already solved by a good package management 
system.

>> It'll just grab it, no changes to your source needed at all, and
>> any custom steps needed would be automatically handled
>
> My approach again allowed a central repo, which may direct you
> elsewhere using standard http.
>
> It builds the default url by:
>
> http://centraldomain.com/repository/package/module.d
>
>
> I think the DIP should do this too if a liburl is not specified.
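Concretely, the default-URL scheme being described maps a module like 
foo.bar to a path under the repo root. A minimal shell sketch 
(centraldomain.com is the placeholder from the example above; the package 
and module names are hypothetical):

```shell
# Build the default download URL for "import foo.bar;" from a repo root:
root="http://centraldomain.com/repository"
pkg=foo
mod=bar
url="$root/$pkg/$mod.d"
echo "$url"   # prints http://centraldomain.com/repository/foo/bar.d

# The actual fetch would then be something like:
# curl -fsSL "$url" -o "$pkg/$mod.d"
```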

A central repo, per se, isn't really a good idea. What there should be is a 
standard built-in list of official repos (even if there's initially only 
one), to which others can be added. The system shouldn't have a "single 
point of failure" built in.

I think things like apt-get and 0install are very good models for us to 
follow. In fact, we should probably consider whether we want to actually 
just *use* 0install, either outright or behind the scenes.





More information about the Digitalmars-d mailing list