DIP11: Automatic downloading of libraries
Robert Clipsham
robert at octarineparrot.com
Wed Jun 15 07:56:54 PDT 2011
On 15/06/2011 15:33, Andrei Alexandrescu wrote:
> On 6/15/11 9:13 AM, Steven Schveighoffer wrote:
>> We have been getting along swimmingly without pragmas for adding local
>> include paths. Why do we need to add them using pragmas for network
>> include paths?
>
> That doesn't mean the situation is beyond improvement. If I had my way
> I'd add pragma(liburl) AND pragma(libpath).
pragma(lib) doesn't (and can't) work as it is, so why do you want to add
more useless pragmas? Command-line arguments are the correct way to go
here. Not to mention that paths most likely won't be standardized across
machines, so the latter would be useless.
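To spell out what's on the table (pragma(lib) exists today; liburl and
libpath are only the names Andrei suggested above, so this is just a
sketch):

pragma(lib, "curl");                          // exists today
pragma(liburl, "http://example.org/libfoo");  // proposed, hypothetical
pragma(libpath, "/usr/local/lib");            // proposed, hypothetical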
>> Also, I don't see the major difference in someone who's making a piece
>> of software from adding the include path to their source file vs. adding
>> it to their build script.
>
> Because in the former case the whole need for a build script may be
> obviated. That's where I'm trying to be.
This can't happen in a lot of cases, eg if you're interfacing with a
scripting language, or you need certain files automatically generated
during the build, etc. Admittedly, for the most part you'll just want to
be able to build libraries given a directory, or an executable given a
file with _Dmain() in it. There'll still be plenty of cases where you
want to specify that some things should be dynamic libs, others static
libs, and what, if any, of it ends up in the resulting binary.
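Even a trivial layout needs that sort of per-target choice, eg
(ordinary dmd flags; the file names are made up):

$ dmd -lib -ofliba.a a.d b.d   # bundle two modules into a static library
$ dmd -ofapp main.d liba.a     # build the executable and link it in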
>> But in any case, it doesn't matter if both options are available -- it
>> doesn't hurt to have a pragma option as long as a config option is
>> available. I just don't want to *require* the pragma solution.
>
> Sounds good. I actually had the same notion, just forgot to mention it
> in the dip (fixed).
I'd agree with Steven that we need command-line arguments for it; I
completely disagree about pragmas, though, given that they don't work
(as mentioned above). Just because I know you're going to ask:
# a.d has a pragma(lib) in it
$ dmd -c a.d
$ dmd -c b.d
$ dmd a.o b.o   # the pragma(lib) from a.d never reaches this link step
<Linker errors>
This is unavoidable unless you put metadata in the object files, and
even then you'd leave clutter in the resulting binary unless you could
tell the linker to strip it (I don't know if it can).
>> dget would just add the appropriate path:
>>
>> import dcollections.TreeMap =>
>> get http://www.dsource.org/projects/dcollections/import/dcollections/TreeMap.d
>>
>> hm.. doesn't work
>> get http://www.dsource.org/projects/dcollections/import/dcollections/TreeMap.di
>>
>> ok, there it is!
>
> This assumes the URL contains the package prefix. That would work, but
> imposes too much on the URL structure. I find the notation -Upackage=url
> more general.
I personally think there should be a central repository listing packages,
their URLs etc, which would massively simplify what needs passing on the
command line. Eg -RmyPackage would cause myPackage to be looked up on
the central server, which would hold the relevant URL etc.
Of course, there should be some sort of override mechanism for private
remote servers.
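A minimal sketch of the resolution I have in mind, in D (the -R flag,
registry URL and function name are all made up for illustration):

import std.string : format;

// Resolve a package name the way a hypothetical -RmyPackage would:
// private overrides win, otherwise fall back to the central registry.
string resolvePackage(string pkg, string[string] overrides,
                      string registry = "http://registry.example.org")
{
    if (auto url = pkg in overrides)
        return *url;                                // private remote server
    return format("%s/packages/%s", registry, pkg); // central lookup
}

unittest
{
    assert(resolvePackage("myPackage", null)
           == "http://registry.example.org/packages/myPackage");
}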
>> As I said in another post, you could also specify a zip file or tarball
>> as a base path, and the whole package is downloaded instead. We may need
>> some sort of manifest instead in order to verify the import will be
>> found instead of downloading the entire package to find out.
>
> Sounds cool.
I don't believe this tool should exist without compression being the default.
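For what it's worth, the manifest Steven mentions could be as simple as
this (layout purely illustrative, archive URL made up), so the tool can
check that dcollections.TreeMap is present before fetching the whole
archive:

name:    dcollections
archive: http://www.dsource.org/projects/dcollections/dcollections.tar.gz
modules:
    dcollections.TreeMap
    dcollections.HashMap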
--
Robert
http://octarineparrot.com/