lint for D

Yigal Chripun yigal100 at gmail.com
Sat Jul 12 00:15:03 PDT 2008


Nick Sabalausky wrote:
> "Yigal Chripun" <yigal100 at gmail.com> wrote in message 
> news:g57vk8$tma$1 at digitalmars.com...
>> Markus Koskimies wrote:
>>> On Thu, 10 Jul 2008 15:37:16 +0900, Bill Baxter wrote:
>>>
>>>> Markus Koskimies wrote:
>>>>> On Thu, 10 Jul 2008 01:42:17 +0100, Bruce Adams wrote:
>>>>> I must disagree with this. Just like D has an integrated doc
>>>>> generator, I would like to see it bundled with a tool doing all kinds
>>>>> of static checks as well; it need not be integral to the compiler, but
>>>>> released with it and automatically invoked by it. The reason for this
>>>>> hope is very simple - it would make things simple from the developer's
>>>>> point of view. Just write:
>>>>>
>>>>> dmd <all my D files>
>>>>>
>>>>> ...And it would not only compile the executable (possibly recognizing
>>>>> which files need to be recompiled, normally done by make), but also
>>>>> run the static checker. That's also the reason why I like the
>>>>> "-od"-style options (telling the compiler which directories to put
>>>>> things in).
>>>> If the hypothetical lint tool existed, you could just write a script
>>>> containing:
>>>>
>>>>    <split up options>
>>>>    dmd $options_meant_for_dmd
>>>>    dlint $options_meant_for_dlint
>>>>
>>>> There's just a bit of work to do there to filter out the options and
>>>> split them into separate variables.  But if someone can write a lint
>>>> tool, they should be able to figure out how to do that.  :-)
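The wrapper-script idea above can be sketched as follows. This is a minimal illustration, not an existing tool: dlint is hypothetical, and the "--lint-" prefix convention for routing options is invented here purely for the example.

```python
import subprocess

# Hypothetical convention: options starting with "--lint-" go to dlint
# (prefix stripped), other options go to dmd, and source files (arguments
# not starting with "-") go to both tools.
def split_options(args):
    dmd_args, dlint_args = [], []
    for arg in args:
        if arg.startswith("--lint-"):
            dlint_args.append("--" + arg[len("--lint-"):])
        elif arg.startswith("-"):
            dmd_args.append(arg)
        else:  # a source file: both tools need it
            dmd_args.append(arg)
            dlint_args.append(arg)
    return dmd_args, dlint_args

def run(args):
    """Lint first; compile only if the (hypothetical) dlint passes."""
    dmd_args, dlint_args = split_options(args)
    if subprocess.call(["dlint"] + dlint_args) != 0:
        return 1
    return subprocess.call(["dmd"] + dmd_args)
```

As the thread notes, the real work is deciding on a routing convention; once the options are split, chaining the two tools is a one-liner.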
>>> In an ideal world that's true. But in practice (my experience with C++
>>> & lint), if the lint-like tool is not part of the compiler release, you
>>> spend a lot of time configuring the lint tool to understand the
>>> compiler's options and features.
>>>
>>> But that's just my opinion. I would prefer a compiler release, where:
>>>
>>> dmd -w <all the files & options>
>>> or:
>>> gdc -w <all the options & files>
>>>
>>> ....is alias for:
>>>
>>> dlint <all the files & options> \
>>> && \
>>> dmd/gdc <all the files & options>
>>>
>>> It really does not matter whether the lint-like tool is an integral
>>> part of the compiler, but it would help a lot if the compiler could be
>>> instructed to perform static analysis before starting to generate code.
>>> Sure, it would be great if the lint tool and the compiler understood
>>> the same language (i.e. they shared the same parser) and the same
>>> options.
>>>
>>> That is just the way I normally work: I enable all possible warnings
>>> and then try to get rid of them all.
>> IMO, the UNIX way is the correct way. I would actually prefer to remove
>> documentation generation from DMD; the compiler should only compile
>> code and nothing else.
>> If you want to do something more complex than just compiling a file,
>> you should use a build tool. A build tool is where you want to
>> integrate all the separate tools together. Look at Ant, for example:
>> you can define an Ant task for compilation, a different task for doc
>> generation, another to run lint tools, etc. Ant itself uses XML files
>> (which is bad, IMO) to define those tasks, and it is extensible (which
>> is good).
>> There are build tools that are written as libraries in a scripting
>> language, so your "makefile" then just becomes a regular Python script
>> (SCons) or Ruby script (rake/rant).
>> Also, build tools like that can be integrated into your IDE. (If I'm
>> not mistaken, Eclipse can work with Ant projects, so you can use a GUI
>> for everything.)
>>
>> Just use ant/rake/scons/whatever instead of dmd in your example.
>> All of those are orders of magnitude better than make, and even with
>> make you can have:
>> $> make all
>> statically check your files, compile and link them, generate docs, and
>> order from Pizza Hut all in one go.
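The "one command runs everything" workflow can be sketched as a tiny Python task runner in the spirit of the SCons/rake tools mentioned above. All tool names and flags here are illustrative (dlint does not ship with DMD); the executor is injectable so the plan can be checked without invoking real tools.

```python
import subprocess

# Each task maps to the external commands it runs; running them in ORDER
# mirrors `make all`: lint, then compile, then docs.
TASKS = {
    "lint":    [["dlint", "src/main.d"]],
    "compile": [["dmd", "-ofapp", "src/main.d"]],
    "docs":    [["dmd", "-D", "-Dddocs", "src/main.d"]],
}
ORDER = ["lint", "compile", "docs"]

def run_all(executor=subprocess.check_call):
    """Run every task in order; `executor` receives each command list."""
    executed = []
    for task in ORDER:
        for cmd in TASKS[task]:
            executor(cmd)
            executed.append(task)
    return executed
```

A real SCons or rake file adds dependency tracking on top of this, but the integration point is the same: one entry command fans out to all the separate tools.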
>>
> 
> How can having extra *optional* functionality in a tool possibly cause 
> any actual *practical* problems (i.e., other than the highly abstract 
> notion of purity)? I can't think of a single realistic scenario where 
> the difference between the following two snippets would actually make a 
> real difference: 
> 
> makecoffee  ...args...
> jumpupanddown ...args...
> 
> vs:
> 
> multitool -makecoffee ...args...
> multitool -jumpupanddown ...args...
> 
> I can understand that keeping separate tasks in separate tools can be 
> considered nice and clean and give me a warm fuzzy feeling. But what 
> real, concrete difference could it possibly make? 
> 
> 
Here's a scenario for you:
Ddoc and DMD - Ddoc reads the AST that the compiler front-end generates
and changes it by mistake, thus affecting code generation. Bundling both
in the same executable creates an integration bug where doc generation
affects the compiled executable.
This is just a hypothetical bug, of course, but one that could
eventually happen.
If Ddoc were a separate executable (sharing a common library with the
compiler's CLI front end), such a bug could not happen. Since the
compiler front-end lives in that shared library, you do not need to
implement it twice, but you do get two separate executables that cannot
affect each other.

You can also link that same library into other tools and reuse it there:
editors, IDEs, lint tools, doc generators, build tools, etc.
There is no need to make one huge executable combining them all.

IMO, the compiler should be a shared lib used by different executables
to perform different tasks.
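The shared-front-end idea might look like this in outline. This is a toy sketch with entirely hypothetical names, not D's actual front end: the point is that the library parses once and hands each tool a read-only view of the AST, so a doc generator physically cannot perturb code generation.

```python
from types import MappingProxyType

# Hypothetical shared front-end: parse once, hand out an immutable view.
def parse(source):
    # Stand-in for a real D parser: record one "declaration" per line.
    decls = {i: line.strip() for i, line in enumerate(source.splitlines())}
    return MappingProxyType(decls)  # read-only: consumers cannot mutate it

# Two separate "executables" built on the same library.
def gen_docs(ast):
    return ["DOC: " + d for d in ast.values()]

def gen_code(ast):
    return ["CODE: " + d for d in ast.values()]

ast = parse("int x;\nvoid main() {}")
docs = gen_docs(ast)
code = gen_code(ast)
```

Any attempt by the doc tool to write into the AST raises an error instead of silently corrupting the compile, which is exactly the class of integration bug the separate-executables argument is about.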



More information about the Digitalmars-d mailing list