Why is there no 'or' or 'and'?

Nick Sabalausky a at a.a
Thu Feb 16 23:28:40 PST 2012


"H. S. Teoh" <hsteoh at quickfur.ath.cx> wrote in message 
news:mailman.460.1329459948.20196.digitalmars-d at puremagic.com...
> On Fri, Feb 17, 2012 at 06:47:20AM +0100, F i L wrote:
>> I would use them over '||' and '&&' for the reasons bearophile gave.
>> Highlighted as keywords, they're easily set apart, easier to type,
>> and more distinguished... then again, if I had my way I'd remove the
>> '('/')' brackets, ending marks, and auto keyword; switch the
>> definition name-type placement and change if/else/return/contract
>> syntax...
>
> Well, if you're going to reinvent the language syntax, I'd like to
> replace:
>
> = with :=
> == with =
>
> These two are the most annoying syntax inherited from C, IMNSHO. I mean,
> mathematically speaking, = is equality, not assignment. Traditionally :=
> has been used for assignment; so why mix them up? Besides, what on earth
> is == anyway? Equal-equal? It makes no sense. And even worse, languages
> like Javascript that copied C's lousy choice of equality operator made
> it worse by introducing ===, which is both nonsensical in appearance and
> semantically a symptom of language maldesign.
>
> Next on the list is, of course:
>
> && with and (or perhaps &)
> || with or (or perhaps even |)
>
> The symbol '&' is commonly used to mean 'and', such as "John & Wiley's".
> So why the && stutter? Bitwise operations aren't used very much anyway,
> so they shouldn't be hogging single-character operators. Let bitwise AND
> be &&, and I'd be OK with that. But C has gotten it the wrong way round.
>
> Similarly '|' *has* been used traditionally to separate alternatives,
> such as in BNF notation, so there's no reason for that silly || stutter.
> Bitwise OR isn't used very often anyway, so if anything, | should be
> logical OR, and I suppose it's OK for || to be bitwise OR. Again, C has it
> the wrong way round.
>
> But more importantly:
>
> ^^ with ^
> ^ with something else altogether
>
> I mean, c'mon. Everybody knows ^ means superscript, that is,
> exponentiation. So why waste such a convenient symbol on bitwise XOR,
> which is only rarely used anyway?! It should simply be called 'xor' at
> best.  Nobody who hasn't learned C (or its derivatives) knows what '^'
> means (in C) anyway.
>
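
FWIW, current D already has both of those: '^^' is the exponentiation 
operator and '^' is bitwise XOR, so this one is purely a question of which 
operation gets the shorter spelling. A quick illustration of what the two 
do today:

    import std.stdio;

    void main()
    {
        writeln(2 ^^ 8);  // exponentiation: prints 256
        writeln(2 ^ 8);   // bitwise XOR:    prints 10
    }
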
> And then:
>
> ! with not
>
> Everyone knows ! means exclamation mark, or factorial. Having it also
> mean logical NOT is just needlessly confusing. What's wrong with 'not'?
> Or, since we have Unicode, what about ¬? Much clearer.
>
> As for bitwise NOT, '~' is about the most counterintuitive symbol for
> such a thing. My presumptuous guess is that Kernighan ran out of symbols
> on the keyboard for operators, so he resorted to ~. The symbol '~'
> should've been reserved for an "approximately equal" operator, useful in
> comparing floating-point numbers (which as we know usually shouldn't be
> compared with equality due to roundoff errors), like this:
>
> if (a ~ b) { ... }
>
> rather than today's baroque dance of:
>
> if (fabs(b-a) < EPSILON) { ... }
>
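
That dance is at least easy to hide behind a helper. A rough sketch in D 
(the name approxEq and the 1e-9 default are just made up for illustration; 
tune the epsilon to taste):

    import std.math : fabs;
    import std.stdio;

    // "close enough" comparison for doubles, to avoid comparing
    // floating-point results with == directly
    bool approxEq(double a, double b, double epsilon = 1e-9)
    {
        return fabs(a - b) < epsilon;
    }

    void main()
    {
        writeln(approxEq(0.1 + 0.2, 0.3));  // true, despite roundoff
        writeln(0.1 + 0.2 == 0.3);          // false
    }
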
> And what about:
>
> . with : or ;
>
> OK. The symbol '.' is supposed to be used for the end of a sentence. At
> least, so we were told in grade school. In the case of programming, it
> should denote the end of a statement. So why is it that ';' is used to
> end statements, and '.' to access struct/class members? It seems so
> bass-ackwards. A semicolon (or a colon) is much more suitable for what
> amounts to a name composed of parts (module:object:property), because
> they signify partial stop, implying there's more to come. The period (or
> full-stop for you brits) '.' should be used to *end* statements, not to
> *continue* a multi-part name.
>
> But who am I to speak out against more than four decades of historical
> accidents, right? I think I'll shut up now.
>

Meh.

All of the syntaxes you're advocating are every bit as arbitrary as the ones 
you're against. So what if there's some other convention used in a 
completely different discipline? Code isn't English, and code isn't math: 
the needs, use cases, etc. are all totally different, so it makes sense that 
a different set of conventions would be much more appropriate.

These arguments remind me of a non-programmer I once talked to who was 
complaining that different languages don't all use the same syntax for 
comments. My reaction was: 1. Who the hell cares? 2. They're *different* 
languages; why can't they *be* different?



