OT: on IDEs and code writing on steroids

BCS ao at pathlink.com
Thu May 21 12:07:32 PDT 2009


Reply to Yigal,

> BCS wrote:
> 
>> Reply to Yigal,
>> 
>>> if you compile each file separately, then you parse all 4 files for
>>> each object file, which is completely redundant and makes little
>>> sense since you'll need to recompile all of them anyway because of
>>> their dependencies.
>>> 
>> All of the above is (as far as D goes) an implementation detail[*].
>> What I'm railing on is that in c# 1) you have no option BUT to do it
>> that way and 2) the only practical way to build is from a config file
>> 
> it's as much an implementation detail in D as it is in C#. nothing
> prevents you from creating your own compiler for C# as well.
> 

I disagree, see below:

>> [*] I am working very slowly on building a compiler and am thinking
>> of building it so that along with object files, it generates "public
>> export" (.pe) files that have a binary version of the public
>> interface for the module. I'd set it up so that the compiler never
>> parses more than one file per process. If you pass it more, it forks,
>> and when it runs into imports, it loads the .pe files, first forking
>> off a process to generate them if needed.
>> 
> sounds like an interesting idea - basically your compiler will
> generate the meta data just as an IDE does for C#.
> 

Maybe that's the confusion: No it won't! 

That's not the metadata I've been talking about. The metadata that C# needs, 
the kind I'm referring to, is the list of files that the compiler needs to look 
at. In D this information can be derived from the text of the import statements 
in the .d files (well, it also needs the import directory list). In C# this 
can't be done, even within a single assembly. Without me explicitly telling 
the compiler what files to look in, it can't find anything! It can't even 
just search the local directory for files that have what it's looking for, 
because I could have old copies of files lying around that shouldn't be used.
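
To make that concrete, here's a minimal sketch (the module and function 
names are made up): given only main.d and the import path, the compiler can 
work out on its own that foo/bar.d must also be read.

// foo/bar.d -- a hypothetical dependency
module foo.bar;
void doBar() { }

// main.d -- the root module; the import alone tells the compiler
// that foo/bar.d is needed.
module main;
import foo.bar;   // resolved via the import path to foo/bar.d
void main() { doBar(); }

That module-name-to-file-name mapping is the whole trick; a C# "using Foo;" 
names a namespace, not a file, so there is no such mapping to follow.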

>> I didn't say that the only tool you can use is the compiler. I'm fine
>> with bud/DSSS/rebuild being used. What I don't want is a language
>> that effectively _requires_ that some config file be maintained
>> along with the code files. I suspect that the bulk of pure D projects
>> (including large ones) /could/ have been written so that they didn't
>> need a dsss.conf file, and I'd almost bet that many of those that do
>> have a dsss.conf could be handled without it. IIRC, all that DSSS
>> really needs is what file to start with (whereas C# needs to be handed
>> the full file list at some point).
>> 
> you miss a critical issue here: DSSS/rebuild/etc can mostly be used
> without a config file _because_ they embed the DMDFE which generates
> that information (dependencies) for them. There is no conceptual
> difference between that and using an IDE. you just moved some
> functionality from the IDE to the build tool.
> both need to parse the code to get the dependencies.

Again, in C# you /can't get that information/ by parsing the code. And that 
is my point exactly.
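
To spell out the contrast, here is a rough, purely illustrative sketch (not 
DSSS's or rebuild's actual code, and it ignores things like selective and 
multi-module imports) of how a D build tool can derive the complete file 
list from nothing but one root module and an import directory, because the 
import declarations name the modules:

// depwalk.d -- naive dependency walker, illustration only
import std.file  : readText, exists;
import std.regex : regex, matchAll;
import std.path  : buildPath;
import std.array : replace;
import std.stdio : writeln;

void collect(string file, string importDir, ref bool[string] seen)
{
    if (file in seen) return;
    seen[file] = true;

    // scan the source text for import declarations
    foreach (m; matchAll(readText(file), regex(`import\s+([\w.]+)`)))
    {
        // map module name a.b.c to <importDir>/a/b/c.d
        auto dep = buildPath(importDir, replace(m[1], ".", "/") ~ ".d");
        if (dep.exists) collect(dep, importDir, seen);
    }
}

void main(string[] args)
{
    bool[string] seen;
    // args[1]: root module, optional args[2]: import directory
    collect(args[1], args.length > 2 ? args[2] : ".", seen);
    foreach (f; seen.byKey) writeln(f);   // the derived file list
}

In practice the tools lean on the DMD front end rather than a regex, but the 
point stands: in D the information is sitting in the source text. Run the 
same kind of scan over C# source and all you get is namespace names from 
"using" directives, which tell you nothing about which .cs files to pull in.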

>> I think we won't converge on this.
>> 
>> I think I'm seeing a tools dependency issue that I don't like in the
>> design of C# that I _know_ I'm not seeing in D. You think that D is
>> already just as dependent on the tools and don't see that as an
>> issue.
>> 
>> One of the major attractions for me to DMD is its build model, so I
>> tend to be very conservative and resistant to change on this point.
>> 
>
> you're right that we will not converge on this. you only concentrate on
> the monolithic executable case and ignore the fact that in real life
> projects the common case is to have sub-components, be it Java jars, C#
> assemblies, C/C++ dll/so/a or D DDLs.

Yes, it's the common case, but that doesn't make it the right case. See below.

> in any of those cases you still need to manage the sub-components and
> their dependencies.
> one of the reasons for "dll hell" is because c/c++ do not handle this
> properly and that's what Java and .net and DDL try to solve. the
> dependency is already there for external tools to manage this
> complexity.

I assert that it is very rare that a program NEEDS to use a DLL/so/DDL type of 
system. The only unavoidable reasons to use them that I see are:

1) you are forced to use code that can't be had at compile time (rare outside 
of plugins, and they don't count because they are not your code)
2) you have lots of code that is mostly never run and you can't load it all 
(and that sounds like you have bigger problems)
3) you are running into file size limits (outside of something like a kernel 
image, this is unlikely)
4) booting takes too long (and that says you're doing something else wrong)

It is my strongly held opinion that the primary argument for DLLs and friends, 
code sharing, is attempting to solve a completely intractable problem. As 
soon as you bring in versioning, installers, and uninstallers, the problem 
becomes flat-out impossible to solve. (The one exception is for low-level 
system things like KERNEL32.DLL and stdc*.so.)

In this day and age, where HDDs are ready to be measured in TB and people 
ask how many gigs of RAM you have, *who cares* about code sharing?




