DIP11

Jacob Carlborg doob at me.com
Fri Aug 12 00:23:30 PDT 2011


On 2011-08-11 20:31, Steven Schveighoffer wrote:
> On Thu, 11 Aug 2011 14:19:35 -0400, Jacob Carlborg <doob at me.com> wrote:
>> So how would that be different if the compiler drives everything? Say
>> you begin with a few local files. The compiler then scans through them
>> looking for URL imports. Then asks a tool to download the dependencies
>> it found and starts all over again.
>
> Forgive my compiler ignorance (not a compiler writer), but why does the
> compiler have to start over? It's no different than importing a file, is
> it?

Probably it's no different. What is different is that it will first 
parse a couple of files, then download a few files, then parse the 
downloaded files, then fetch some more files, and so on.

To me this seems inefficient, but since it's not implemented I don't 
know. It feels more efficient if it could download all the needed files 
in one step and then compile all the files in the next step.

I don't know what's possible with this DIP, but it seems to me that the 
current suggestion will download individual files. That also seems 
inefficient; my solution deals with packages, i.e. zip files.
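
Something like this, roughly (the tool name and flags are made up, just 
to illustrate the two steps):

$ pkgtool fetch          # resolve and download every package in one go
$ dmd -Ipackages src/*.d # then a single pass over the sources

instead of the compiler pausing to download things in the middle of 
parsing.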

>> This is how my package manager will work. You have a local file
>> containing all the direct dependencies needed to build your project.
>> When invoked, the package manager tool fetches a file containing all
>> packages and all their dependencies, from the repository. It then
>> figures out all dependencies, both direct and indirect. Then it
>> downloads all dependencies. It does all this before the compiler is
>> even invoked once.
>>
>> Then, preferably, but optionally, it hands over to a build tool that
>> builds everything. The build tool would need to invoke the compiler
>> twice, first to get all the dependencies of all the local files in the
>> project that is being built. Then it finally runs the compiler to
>> build everything.
>
> The benefit of using source is the source code is already written with
> an import statement, there is no need to write an external build file
> (all you need is command line that configures the compiler).

I don't see the big difference. I think most projects (except the 
smallest ones) will end up with a special file anyway, containing the 
pragmas declaring these imports.
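
For example, something like this in a file of its own (I don't remember 
the exact pragma name the DIP proposes, so the syntax below is only a 
guess):

// urlimports.d -- hypothetical file name
pragma(importpath, "http://example.com/projects/foo");
pragma(importpath, "http://example.com/projects/bar");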

In addition to that, as soon as you need to pass flags to the compiler 
you will most likely put them in a file of some kind. In that case you 
can just as easily put them in a build script and use a build tool.
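
For example:

$ cat build.sh
dmd -O -release -Isrc src/main.d -ofmyapp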

> Essentially, the import statements become your "build file". I think
> dsss worked like this, but I don't remember completely.

Yes, this is similar to how DSSS worked. The difference is that you 
didn't need a pragma to link a package to a URL; you just wrote the 
import declarations as you do now.
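
That is, you only wrote:

import com.foo.bar;

and DSSS figured out, from its own configuration, which package the 
module belonged to and where to fetch it from (as far as I remember).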

One problem I think DSSS has, as far as I know, is that it can't handle 
top level packages with the same name, or at least not in any good way. 
Say you go with the Java package naming scheme and name your top level 
package after your domain, for example:

module com.foo.bar;
module com.foo.foobar;

And another project does the same:

module com.abc.defg;

Then these two projects will both end up in the "com" folder, which is 
not very good in my opinion. In my solution every package has a name 
independent of the packages it contains, and every package is placed in 
a folder named after the package, including the version number.
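
Something like this (the names and version numbers are made up):

foo-1.2.0/
    com/foo/bar.d
    com/foo/foobar.d
defg-0.3.0/
    com/abc/defg.d

instead of both projects being merged into a single "com" folder.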

> My ideal solution, no matter how it's implemented is, I get a file
> blah.d, and I do:
>
> xyz blah.d
>
> and xyz handles all the dirty work of figuring out what to build along
> with blah.d as well as where to get those resources. Whether xyz == dmd,
> I don't know. It sure sounds like it could be...
>
>
> -Steve

Yeah, I would like that too. But as I said above, as soon as you need 
compiler flags you need an additional file. With a build tool it can 
then be just:

"$ build"

-- 
/Jacob Carlborg

