DIP11

Steven Schveighoffer schveiguy at yahoo.com
Thu Aug 11 10:07:36 PDT 2011


On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley  
<wiley.andrew.j at gmail.com> wrote:

> On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer
> <schveiguy at yahoo.com> wrote:
>> I think the benefit of this approach over a build tool which wraps the
>> compiler is that the compiler already has the information needed for
>> dependencies, etc.  To a certain extent, the wrapping build tool has to
>> re-implement some of the compiler pieces.
>>
>
> This last bit doesn't really come into play here because you can already
> ask the compiler to output all that information and easily use it in a
> separate program.  That much is already done.

Yes, but then you have to restart the compiler to figure out what's next.   
Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d  
are on the network. You potentially need to run the compiler 3 times just  
to make sure you have all the files, then run it a fourth time to compile.
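
To make that concrete, here is roughly the loop a wrapping build tool ends
up running today.  This is only a sketch: the -deps and -o- switches are
real, but the error-message pattern and the fetch() helper are placeholders
I'm making up for illustration, not anything the compiler or DIP11 defines.

import std.process : executeShell;
import std.regex : matchAll, regex;

// hypothetical helper: download the named module from the network into
// the local import path (e.g. by shelling out to curl or wget)
void fetch(string moduleName) { /* ... */ }

void main()
{
    bool missing = true;
    while (missing)  // one full compiler restart per pass
    {
        auto r = executeShell("dmd -o- -deps=deps.txt main.d");
        missing = false;
        // assume the compiler reports each unreadable import in its output
        foreach (m; matchAll(r.output, regex(`module (\S+) .*cannot be read`)))
        {
            fetch(m[1]);   // grab the file, then run the compiler again
            missing = true;
        }
    }
    executeShell("dmd main.d");  // finally, the real compile
}

With a.d pulling in b.d, that loop restarts the compiler once per missing
file before the actual build can even start.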

And there is no parsing of the output data; the problem boils down to a
simple get tool.  Running a simple get tool over and over doesn't consume
as much time or as many resources as running the compiler over and over.
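
For comparison, the whole get tool could be about this small.  Again just a
sketch -- the argument handling and the "imports" target directory are
assumptions for illustration; any existing downloader would do the same job.

import std.net.curl : download;
import std.path : baseName, buildPath;

// given a URL handed over by the compiler, save the file into the local
// import directory -- nothing to parse, nothing to restart
void main(string[] args)
{
    auto url  = args[1];
    auto dest = buildPath("imports", baseName(url));
    download(url, dest);
}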

There are still problems with the DIP -- there is no way yet to say "oh
yeah, compiler, you have to build this file that I downloaded too".  But
if nothing else, I like the approach of having the compiler drive
everything.  It reduces the problem space to a smaller, more focused task
-- get a file based on a URL.  We also already have many existing tools
that can parse a URL and download a file/package.

-Steve

