Why is there no "or" or "and"?
a at a.a
Fri Feb 17 22:24:19 PST 2012
"H. S. Teoh" <hsteoh at quickfur.ath.cx> wrote in message
news:mailman.492.1329503475.20196.digitalmars-d at puremagic.com...
> That's not 100% true, though. Programming has always had close ties with
> math, even though, as you said, they aren't the same thing. In fact,
> computing came out of math, if you want to be historically accurate. I
> mean, the symbols chosen do have *some* ties with other domains, even if
> the mapping is not 1-to-1. If it were truly completely arbitrary, why
> don't we use $ for assignment, ` for equality, and % for addition? After
> all, we're writing a programming language, not math, so we can just
> assign arbitrary symbols to mean whatever we want, right?
> And as for C syntax, I have to admit that I'm most comfortable with C
> (or C-like) syntax, too. Even though it has its own flaws, I have become
> accustomed to its quirks, and I spontaneously type C syntax when I think
> of code. But that doesn't mean we shouldn't step back and re-examine the
> system to see if there are better ways of doing things. I find Pascal
> syntax, for example, truly painful. But if I were to disregard my
> background in C (and its derivatives), perhaps there is something to be
> learned from Pascal syntax.
I do agree with all of this, FWIW.