Why is there no "or" or "and"?

H. S. Teoh hsteoh at quickfur.ath.cx
Fri Feb 17 10:32:50 PST 2012


On Fri, Feb 17, 2012 at 09:58:06AM -0500, Nick Sabalausky wrote:
> "F i L" <witte2008 at gmail.com> wrote in message 
[...]
> > Seeing as how we're all educated with mathematical symbols as
> > children, a language design which reflects what is most familiar
> > will be the easiest to initially understand. Less friction means
> > more productivity.
> >
> 
> You're talking about very minor details that are trivial to learn (I
> was only about 12 or 13 when I learned C). The productivity drop in
> these cases is *purely* a *minor* upfront cost, with no ongoing
> cost (but it does have ongoing *benefits*, because it's designed
> specifically with *its own* domain in mind instead of being hampered by
> unnecessary ties to some other domain).
[...]

That's not 100% true, though. Programming has always had close ties with
math, even though, as you said, they aren't the same thing. In fact,
computing came out of math, if you want to be historically accurate. I
mean, the symbols chosen do have *some* ties with other domains, even if
the mapping is not 1-to-1. If it were truly completely arbitrary, why
don't we use $ for assignment, ` for equality, and % for addition? After
all, we're writing a programming language, not math, so we can just
assign arbitrary symbols to mean whatever we want, right?
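
To make the contrast concrete, here is a minimal C sketch; the '$', '`'
and '%' alternatives in the second comment are purely hypothetical
stand-ins for the arbitrary mapping described above, not real syntax:

    #include <stdio.h>

    int main(void)
    {
        /* Conventional notation: '=' for assignment, '==' for equality,
         * '+' for addition -- all loosely borrowed from mathematics. */
        int a = 3;
        int b = 7;
        int total = a + b;

        if (total == 10)
            printf("total is %d\n", total);

        /* A truly arbitrary mapping, as in the rhetorical question
         * above, might instead read (hypothetical, not valid C):
         *
         *     int total $ a % b;
         *     if (total ` 10) ...
         */
        return 0;
    }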

And as for C syntax, I have to admit that I'm most comfortable with C
(or C-like) syntax, too. Even though it has its own flaws, I have become
accustomed to its quirks, and I spontaneously type C syntax when I think
of code. But that doesn't mean we shouldn't step back and re-examine the
system to see if there are better ways of doing things. I find Pascal
syntax, for example, truly painful. But if I were to disregard my
background in C (and its derivatives), perhaps there is something to be
learned from Pascal syntax.


T

-- 
Once bitten, twice cry...
