Why is there no 'or' or 'and'?

H. S. Teoh hsteoh at quickfur.ath.cx
Thu Feb 16 22:27:24 PST 2012


On Fri, Feb 17, 2012 at 06:47:20AM +0100, F i L wrote:
> I would use them over '||' and '&&' for the reasons bearophile gave.
> Highlighted as keywords, they're easily set apart, easier to type,
> and more distinguished... then again, if I had my way I'd remove the
> '('/')' brackets, ending marks, and the auto keyword; switch the
> definition name-type placement; and change the if/else/return/contract
> syntax...

Well, if you're going to reinvent the language syntax, I'd like to
replace:

	=	with	:=
	==	with	=

These two are the most annoying pieces of syntax inherited from C,
IMNSHO. Mathematically speaking, = is equality, not assignment.
Traditionally := has been used for assignment; so why mix them up?
Besides, what on earth is == anyway? Equal-equal? It makes no sense.
Worse, languages like JavaScript that copied C's lousy choice of
equality operator compounded it by introducing ===, which is both
nonsensical in appearance and semantically a symptom of language
misdesign.
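
For concreteness, here's the classic bug this conflation invites,
sketched in D (all standard language features; D, to its credit,
actually rejects the bad form at compile time):

	void main()
	{
		int x = 5;

		// In C, the one-character slip below compiles and quietly
		// assigns 0 to x, making the condition always false.
		// D refuses to use an assignment as a condition:
		//if (x = 0) { }

		if (x == 0) { }	// what was actually meant
	}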

Next on the list is, of course:

	&&	with	and	(or perhaps &)
	||	with	or	(or perhaps even |)

The symbol '&' is commonly used to mean 'and', as in "John & Wiley's".
So why the && stutter? Bitwise operations aren't used very much anyway,
so they shouldn't be hogging single-character operators. Let bitwise AND
be &&, and I'd be OK with that. But C has gotten it the wrong way round.

Similarly '|' *has* been used traditionally to separate alternatives,
such as in BNF notation, so there's no reason for that silly || stutter.
Bitwise OR isn't used very often anyway, so if anything, | should be
logical OR, and I suppose it's OK for || to be bitwise OR. Again, C has
it the wrong way round.
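
Whatever glyphs they end up with, though, the two families aren't
interchangeable: the logical forms short-circuit, the bitwise forms
evaluate both operands. A minimal D illustration (standard operators,
nothing hypothetical):

	import std.stdio : writeln;

	void main()
	{
		int* p = null;

		// Logical && short-circuits: *p is never dereferenced here.
		if (p !is null && *p > 0)
			writeln("positive");

		// Bitwise & and | touch every bit and evaluate both sides:
		writeln(0b1100 & 0b1010);	// 8  (0b1000)
		writeln(0b1100 | 0b1010);	// 14 (0b1110)
	}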

But more importantly:

	^^	with	^
	^	with something else altogether

I mean, c'mon. Everybody knows ^ means superscript, that is,
exponentiation. So why waste such a convenient symbol on bitwise XOR,
which is only rarely used anyway?! At best it should simply be spelled
'xor'. Nobody who hasn't learned C (or one of its derivatives) knows
what '^' means in C anyway.
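
D does at least give exponentiation its own operator, ^^, so the clash
is easy to show side by side (plain D, runnable as-is):

	import std.stdio : writeln;

	void main()
	{
		writeln(2 ^^ 10);	// 1024: exponentiation
		writeln(2 ^ 10);	// 8: bitwise XOR (0b0010 ^ 0b1010)
	}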

And then:

	!	with	not

Everyone knows ! means exclamation mark, or factorial. Having it also
mean logical NOT is just needlessly confusing. What's wrong with 'not'?
Or, since we have Unicode, what about ¬? Much clearer.

As for bitwise NOT, '~' is about the most counterintuitive symbol for
such a thing. My presumptuous guess is that Ritchie ran out of symbols
on the keyboard for operators, so he resorted to ~. The symbol '~'
should've been reserved for an "approximately equal" operator, useful
for comparing floating-point numbers (which, as we know, usually
shouldn't be compared for exact equality due to roundoff error), like
this:

	if (a ~ b) { ... }
	
rather than today's baroque dance of:

	if (fabs(b-a) < EPSILON) { ... }
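
Here's the dance as runnable D, next to Phobos' approxEqual, which
wraps the same idea behind a name (the epsilon value here is just an
illustrative choice):

	import std.math : fabs, approxEqual;
	import std.stdio : writeln;

	void main()
	{
		double a = 0.1 + 0.2;
		double b = 0.3;

		writeln(a == b);		// false: roundoff strikes
		enum EPSILON = 1e-9;
		writeln(fabs(b - a) < EPSILON);	// true: the baroque dance
		writeln(approxEqual(a, b));	// true: what 'a ~ b' could mean
	}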

And what about:

	.	with	: or ;

OK. The symbol '.' is supposed to be used for the end of a sentence. At
least, so we were told in grade school. In the case of programming, it
should denote the end of a statement. So why is it that ';' is used to
end statements, and '.' to access struct/class members? It seems so
bass-ackwards. A semicolon (or a colon) is much more suitable for what
amounts to a name composed of parts (module:object:property), because
it signifies a partial stop, implying there's more to come. The period
(or full stop, for you Brits) '.' should be used to *end* statements,
not to *continue* a multi-part name.
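
And the dot already pulls double duty in D as the fully-qualified-name
separator, which is exactly that "continue" role (a small sketch using
std.stdio; nothing hypothetical):

	import std.stdio;

	void main()
	{
		// module.object.member, dots all the way down:
		std.stdio.stdout.writeln("hello");
	}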

But who am I to speak out against more than four decades of historical
accidents, right? I think I'll shut up now.


T

-- 
If you want to solve a problem, you need to address its root cause, not
just its symptoms. Otherwise it's like treating cancer with Tylenol...

