OT: on IDEs and code writing on steroids

Yigal Chripun yigal100 at gmail.com
Mon May 18 22:47:47 PDT 2009


Yigal Chripun wrote:
> grauzone wrote:
>>
>> Just because it doesn't work on your shitty (SCNR) platform doesn't 
>> mean it's wrong. On Unix, there's a single ABI for C, and 
>> linking Just Works (TM).
> 
> Do YOU want D to succeed?
> That shitty platform is 90% of the market.
>>
>> But I kind of agree. The most useful thing about compiling each module 
>> to an object file is to enable separate compilation. But this is 
>> useless: it doesn't work because of bugs, and it doesn't "scale" 
>> (because a single module is likely to have way too many transitive 
>> dependencies).
>>
>>> I'm not suggesting copying Java's model letter for letter or using a 
>>> VM either, but rather using a better representation.
>>
>> Ew, that's even worse. Java's model is outright retarded.
>>
>> I'd just compile a D project to a single (classic) object file. That 
>> would preserve C compatibility. Because the compiler knows _all_ D 
>> modules at compile time, we could enable some spiffy stuff, like 
>> virtual template functions or inter-procedural optimization.
> 
> Instead of compiling per module, it should be more coarse-grained, at 
> the package/project level. In C# you can compile a single file and 
> get a "module" file (IIRC), but that's a rare thing; usually you work 
> with assemblies.
> 
Oh, I forgot my last point:
For C link-time compatibility you need to be able to _read_ C object 
files and link them into your executable; you gain little from the 
ability to _write_ object files.
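
For example, something like this (just a sketch; checksum.o and 
c_checksum are made-up names for a plain C object file and the C 
function it defines):

// use_c.d -- a D program that links against an existing C object file
extern (C) uint c_checksum(const(char)* buf, uint len);

import std.stdio;

void main()
{
    string data = "hello";
    // The extern (C) prototype is all the D compiler needs to see;
    // the linker resolves the symbol from the C object file, e.g.
    //     dmd use_c.d checksum.o
    writeln(c_checksum(data.ptr, cast(uint) data.length));
}
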
If you want to do a reverse integration (use D code in your C project), 
you can and IMO should create a library anyway instead of passing 
around object files, and the compiler should support this as a separate 
option via a flag, e.g. --make-so or whatever.
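
The library side of that reverse integration could look something like 
this (again just a sketch; a real D library would also have to take 
care of initializing the D runtime, which this trivial function 
sidesteps):

// dlib.d -- D code meant to be consumed from a C project as a library
extern (C) int d_add(int a, int b)
{
    // Plain C-ABI entry point: a C caller only needs the prototype
    //     int d_add(int a, int b);
    // and to link against whatever library the compiler produces
    // (via --make-so or whatever the flag ends up being called).
    return a + b;
}

The C project then treats it like any other library; no D object files 
ever leak into the C build.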


