Article on programming language adoption (x-post from /r/programming)

H. S. Teoh hsteoh at quickfur.ath.cx
Thu Aug 1 09:53:47 PDT 2013


On Thu, Aug 01, 2013 at 05:39:23PM +0200, Dicebot wrote:
> On Thursday, 1 August 2013 at 15:01:39 UTC, Tofu Ninja wrote:
> >Whenever you are trying to optimize for speed you always need to
> >be aware of your bottlenecks: for streaming video it's internet
> >speed, for a CUDA application it's main memory, for coding it's
> >the keyboard.
> 
> I don't buy it. In daily programming, actually writing code takes
> no more than 10% of my time: 30% is planning what needs to be done,
> 30% figuring out what some piece of code does, 30% debugging.

Wow, you're optimistic. IME, debugging takes up 90% of development time.

OTOH, if you're talking about debugging *D* code, then you may have a
point. :) I found that after falling in love with D's unittest blocks,
the number of bugs in my code has dropped drastically (or rather, I
catch 90% of them during coding time, instead of after I start using the
program).
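For the curious, the idea is that the tests live right next to the code they
cover and run on every build. A rough analogue in Python (with a hypothetical
`slugify` helper, not anything from this thread) would look like:

```python
# Rough Python analogue of D's unittest blocks: tests colocated with
# the code they cover, run cheaply so bugs surface at coding time.

def slugify(title):
    """Turn a title into a URL slug (hypothetical helper)."""
    return "-".join(title.lower().split())

# In D this would be a `unittest { ... }` block right below the
# function, compiled in with `dmd -unittest`; here a plain assert-based
# check run at import time serves the same purpose.
def _test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Spaced   Out  ") == "spaced-out"

_test_slugify()
```

The point isn't the testing framework; it's that the tests are close enough to
the code that you actually write and run them while coding.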


> Even full elimination of the typing phase (literally, imagine some
> magic tool that directly translates your thoughts to code) won't be
> as useful as something that halves the time for _any_ of the three
> other parts.

I dread the day such a magic tool is made to translate thoughts into
code... most of the time what is imagined in the mind is far from the
consistent, interpretable code that a machine can understand. You'd just
end up with a glob of incoherent mess. And it changes over time, too. I
don't know about you, but I've a hard time keeping more than 3-4 lines
of code in my mind's eye simultaneously (and by that I mean visualize it
down to every last punctuation, i.e., in the form that the compiler can
actually parse), though I have no problem manipulating vague ideas that
take their final form through my fingers typing on the keyboard, with
the visual feedback helping to clarify the picture in my head bit by
bit.


> And static strong typing helps them all. As well as any
> compile-verifiable correctness.
> 
> People who have bottlenecks in actually writing code must be
> geniuses who never make mistakes.

My guess is that people got bitten by the verbosity of mandatory type
specification in C/C++, and got the wrong idea that *all*
statically-typed languages must look like that.


On Thu, Aug 01, 2013 at 06:15:10PM +0200, Tofu Ninja wrote:
> On Thursday, 1 August 2013 at 15:39:25 UTC, Dicebot wrote:
[...]
> >People who have bottlenecks in actually writing code must be
> >geniuses who never make mistakes.
> 
> That is why I chose to say that it was a bottleneck in coding, not
> in development. The amount of text required to code something
> affects the things you mentioned; for instance, having a more
> verbose language can help prevent errors by making the writer think
> about everything he is writing,

*ahem*Java*cough* ;-)


> but can also cause more errors, as more code means more places for
> typos.

Not to mention that verbose languages violate DRY so much that after a
while, all the code looks approximately the same, and your brain tunes
out and is unable to locate the subtle discrepancy in a single line
where a bug is hiding.


> Text amount can also affect code comprehension. For instance, a
> very wordy language can sometimes be hard to understand, as it is
> simply more for your brain to process.

*cough*Java*ahem* ;-)


> On the other hand, a very compact language can be hard to
> understand, as there might be too many assumptions and implicit
> information that the reader may not have.

This is one of the reasons I don't like dynamically-typed languages.
Sometimes I see code like this:

	function func(x,y,z) { ... }

which tells you NOTHING about what exactly x, y, and z are supposed to
be. And unlike a statically-typed language, where you'll get a
compile-time error if you passed in a wrong type, the dynamically-typed
language will just merrily barge forward with no indication of any
problem whatsoever, until the faulty path happens to trigger in the
customer's environment and all of a sudden the code dies with some
esoteric error
like "cannot use arithmetic operator on string" somewhere deep down the
call tree, and it's up to you to trace the code backwards before you
finally realize that you passed in the wrong type several function calls
up the call tree.

The only real solution to this problem is to insert type-checking
assertions into the body of func, by which time you might as well just
use a statically-typed language. Sigh...
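To make the failure mode concrete, here is a sketch in Python (all names here
are hypothetical, chosen just for illustration):

```python
# Sketch of the failure mode described above: the wrong type is
# accepted silently at the call site and only dies deep in the call
# tree, far from the actual mistake.

def total(prices):             # nothing says what `prices` must be
    return sum(prices)

def checkout(cart):
    return total(cart) * 1.08  # tax applied, deep in the call tree

# checkout("12.50")  # accepted happily here, but blows up inside
#                    # sum() with a TypeError about '+' on int and str

# The workaround: type-checking assertions in the body, which is
# exactly the hand-written substitute for static type checking.
def checkout_checked(cart):
    assert isinstance(cart, list), "cart must be a list of numbers"
    return total(cart) * 1.08
```

The assertion at least moves the error to the boundary where the mistake was
made, instead of several calls further down, but you write and maintain it by
hand for every function.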


> Debugging kind of falls under this as well, as it requires you to
> fully comprehend what is going on to be able to find the bug.
[...]

Y'know, one feature I've always wanted is the equivalent of preprocessed
C code -- with all mixins expanded, aliases substituted with their final
target, templates fully expanded, all syntactic sugar lowered, with the
original code lines in comments, so that you can see exactly how your
code was translated, and whether it matches what you *think* it does.
This would also be invaluable for debugging, as the result would map to
the assembly code much better, which would help you trace where things
went wrong.
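I don't know of such a tool for D, but a loose analogue of the idea exists in
Python: the standard dis module shows the bytecode a function is lowered to,
annotated with the original source line numbers, so you can see what the
implementation actually does with your code.

```python
# Loose analogue of "show me the lowered code": dis prints the
# bytecode a function compiles to, prefixed with source line numbers,
# so generated operations map back to the lines you wrote.
import dis

def clamp(x, lo, hi):
    return max(lo, min(x, hi))

dis.dis(clamp)  # each group of bytecode ops is tagged with its source line
```

It is not a full source-to-source expansion like preprocessed C, but it serves
the same diagnostic purpose: verifying that what the compiler produced matches
what you *think* you wrote.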

Another pet peeve I have with debuggers is that they are too often
line-based, rather than language-unit based. Sometimes if you have a
complicated nested function call:

	auto x = fun(gun(x,y*hun(z)),iun(w+jun(x)));

you want to be able to tell exactly which part of the expression is
being evaluated, and to be able to step through each part individually.
Ideally, the debugger should be able to tell you in what order the
function arguments are being evaluated (which may not correspond with
the code), and to allow you to step over the evaluation of individual
arguments.
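Absent debugger support, you can at least make the evaluation order visible
with side effects. A small Python experiment (all function names hypothetical)
shows the kind of information a debugger could surface directly:

```python
# Make argument evaluation order observable through side effects.
order = []

def trace(name, value):
    order.append(name)   # record when this sub-expression is evaluated
    return value

def fun(a, b):
    return a + b

x = fun(trace("first", 1) + trace("second", 2), trace("third", 3))
# CPython evaluates arguments (and operands) left to right, so `order`
# ends up ["first", "second", "third"] and x == 6. Languages like C
# leave argument evaluation order unspecified, which is exactly why a
# debugger that reports the actual order would be so useful.
```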


T

-- 
Let X be the set not defined by this sentence...


More information about the Digitalmars-d mailing list