DIP11

Steven Schveighoffer schveiguy at yahoo.com
Fri Aug 12 06:49:30 PDT 2011


On Fri, 12 Aug 2011 03:23:30 -0400, Jacob Carlborg <doob at me.com> wrote:

> On 2011-08-11 20:31, Steven Schveighoffer wrote:
>> On Thu, 11 Aug 2011 14:19:35 -0400, Jacob Carlborg <doob at me.com> wrote:
>>> So how would that be different if the compiler drives everything? Say
>>> you begin with a few local files. The compiler then scans through them
>>> looking for URL imports. Then asks a tool to download the dependencies
>>> it found and starts all over again.
>>
>> Forgive my compiler ignorance (not a compiler writer), but why does the
>> compiler have to start over? It's no different than importing a file, is
>> it?
>
> Probably it's no different. What's different is that it will first parse
> a couple of files, then download a few files, then parse the downloaded
> files, then fetch some more files, and so on.
>
> To me this seems inefficient, but since it's not implemented I don't
> know. It feels more efficient if it could download all the needed files
> in one step and then compile all the files in the next step.
>
> I don't know what's possible with this DIP, but it seems to me that the
> current suggestion will download individual files. This also seems
> inefficient; my solution deals with packages, i.e. zip files.

The extensibility is in the URL.  For example, yes, http://server/file.d
downloads a single file, and it would be slow if every import needed to
download an individual file.  But with some other protocol, or some other
cue to the download tool (like a path component that ends in .tgz or
something), you download all the files at once, and on subsequent imports
the cached files are used (no download necessary).  I think there's even a
suggestion in there for passing directives back to the compiler, like
"file already downloaded, open it here..."
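To make that concrete, here's a minimal sketch of what a packaged
download cue might look like in source.  The pragma name and URL are
assumptions for illustration; the DIP doesn't pin down this exact syntax:

   // hypothetical pragma name; DIP11's final syntax may differ
   pragma(importpath, "http://server/project/source.tgz");

   import project.moduleA;  // first unresolved import: the tool fetches
                            // and unpacks the whole archive once
   import project.moduleB;  // served from the local cache, no download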

I think unless a single file *is* the package, it's going to be foolish to  
download individual files.

I also think a protocol which defines a central repository would be
beneficial.  That way you'd only need one -I parameter to include a whole
community of D code (like dsource).
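Assuming -I accepts URLs as the DIP proposes, a single hypothetical
invocation (the URL is made up) could make a whole repository importable:

   dmd -Ihttp://dsource.org/repository myapp.d

The tool would then resolve any import it can't find locally against
that repository.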

>>> This is how my package manager will work. You have a local file
>>> containing all the direct dependencies needed to build your project.
>>> When invoked, the package manager tool fetches a file containing all
>>> packages and all their dependencies, from the repository. It then
>>> figures out all dependencies, both direct and indirect. Then it
>>> downloads all dependencies. It does all this before the compiler is
>>> even invoked once.
>>>
>>> Then, preferably, but optionally, it hands over to a build tool that
>>> builds everything. The build tool would need to invoke the compiler
>>> twice, first to get all the dependencies of all the local files in the
>>> project that is being built. Then it finally runs the compiler to
>>> build everything.
>>
>> The benefit of using source is that the source code is already written
>> with import statements; there is no need to write an external build file
>> (all you need is a command line that configures the compiler).
>
> I don't see the big difference. I think most projects (not the
> smallest ones) will end up with a special file anyway, containing the
> pragmas declaring these imports.

Note that the pragmas are specific to the file they appear in.  So you
don't have an import file which defines pragmas.  This is to prevent
conflicts between two files that declare the same package override.
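For example (again with a hypothetical pragma name), two files can map
the same package to different locations without clashing, because each
mapping is local to the file declaring it:

   // a.d -- maps the "util" package to one site
   pragma(importpath, "http://siteA.example/source");
   import util.strings;

   // b.d -- maps the same package elsewhere; no conflict, since the
   // override applies only inside b.d
   pragma(importpath, "http://siteB.example/source");
   import util.strings;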

> In addition to that, as soon as you need to pass flags to the compiler
> you will most likely put them in a file of some kind. In that case
> you can just as easily put them in a build script and use a build tool.

A batch file/shell script should suffice; no need for a "special" tool.
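For instance, the whole build can be one line (flags and layout made up
for illustration):

   # build.sh
   dmd -O -release -inline src/*.d -ofmyapp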

>> Essentially, the import statements become your "build file". I think
>> dsss worked like this, but I don't remember completely.
>
> Yes, this is similar to how DSSS worked. The difference is that you
> didn't need a pragma to link a package to a URL; you just wrote the
> import declarations as you do now.

IIRC, dsss still had a global config file that defined where to import
things from.  The DIP defines that -I switches can also specify internet
resources along with pragmas, so sticking those in dmd.conf would
probably be the equivalent.
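Assuming -I accepts URLs as the DIP proposes, the dmd.conf equivalent
might be something like this (the repository URL is made up):

   [Environment]
   DFLAGS=-I%@P%/../../src/phobos -Ihttp://dsource.org/repository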

> One problem I think DSSS has is, as far as I know, it can't handle top
> level packages with the same name. Or at least not in any good way. If
> you go with the Java package naming scheme and name your top level
> package after your domain, for example:
>
> module com.foo.bar;
> module com.foo.foobar;
>
> And another project does the same:
>
> module com.abc.defg;
>
> Then these two projects will both end up in the "com" folder. Not very
> good in my opinion. In my solution every package has a name independent
> of the packages it contains, and all packages are placed in a folder
> named after the package, including the version number.

These all seem like implementation details.  I don't care how the tool  
caches the files.

>> My ideal solution, no matter how it's implemented is, I get a file
>> blah.d, and I do:
>>
>> xyz blah.d
>>
>> and xyz handles all the dirty work of figuring out what to build along
>> with blah.d as well as where to get those resources. Whether xyz == dmd,
>> I don't know. It sure sounds like it could be...
>>
>>
>> -Steve
>
> Yeah, I would like that too. But as I said above, as soon as you need
> compiler flags you need an additional file. With a build tool it can
> then be just:
>
> "$ build"

Or instructions on the web site "use 'dmd -O -inline -release  
-version=SpecificVersion project.d' to compile"

Or build.sh (build.bat).  Note that dcollections has no makefile;
everything is built from shell scripts.  I almost never have to edit the
build file, because the lines look like:

dmd -lib -O -release -inline dcollections/*.d dcollections/model/*.d

Any new files get included automatically.  And it takes a second to build,  
so who cares if you rebuild every file every time?

Interestingly, libraries would still need to specify all the files, since
their modules may not import each other :)  I don't know if there's a
"good" solution for that which isn't too coarse.
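One coarse workaround (my speculation, not something in the DIP) is an
aggregator module that publicly imports everything, so a single root
pulls in the whole library:

   // dcollections/all.d -- hypothetical aggregator module
   module dcollections.all;

   public import dcollections.TreeMap;
   public import dcollections.HashMap;
   public import dcollections.LinkList;
   // ...one line per module in the library

But that still has to be maintained by hand as files are added.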



All of this discussion is good to determine the viability, and clarify  
some misinterpretations of DIP11, but I think unless someone steps up and  
tries to implement it, it's a moot conversation.  I certainly don't have  
the time or knowledge to implement it.  So is there anyone who is  
interested, or has tried (to re-ask the original question)?

-Steve

