food for thought - swift 5 released - bottom types, string interpolation, and stuff.
H. S. Teoh
hsteoh at quickfur.ath.cx
Sat Apr 13 05:18:22 UTC 2019
On Sat, Apr 13, 2019 at 12:15:02AM -0400, Nick Sabalausky (Abscissa) via Digitalmars-d wrote:
> On 4/12/19 7:03 AM, Walter Bright wrote:
[...]
> > Knowing what (x & 1) is is a superpower?
[...]
> Naturally I can't say this for sure, but based on your arguments, it
> sounds like you may very well be in the latter, uncommon category,
> meaning you would possess the genuine benefit of lacking a mental
> distinction between "& 1" and "evenness". In effect, it means you see
> straight through to the core truth (analogous to rain man's counting,
> hence the allusion to "superpowers"). If so, then that's fantastic,
> and I can certainly understand why you would balk at an isOdd/etc.
> It's just like how I would balk at anyone wrapping "x++" syntax with an
> "x.increment()" function. Like most with C-like experience, I look at
> "x++" and I instinctually "see" an increment, thus abstracting it
> would be pointless.
Personally, even though I understand perfectly well what (x & 1) means,
I would much rather write it as (x % 2) and let the optimizer implement
it as (x & 1), because that conveys intent better. I consider
using (x & 1) for testing evenness/oddness akin to writing gotos in
place of loops. It's semantically equivalent, but IMO at the wrong
level of abstraction.
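For instance, in D (a throwaway sketch; the variable and value are just
for illustration):

    import std.stdio;

    void main()
    {
        int x = 7;
        // Written at the level of the mathematical concept; an
        // optimizing compiler lowers this to the same single bit test
        // as (x & 1).
        if (x % 2 != 0)
            writeln(x, " is odd");
    }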
> But understand that such lack of mental distinction between "& 1" and
> "evenness" isn't common, even among good programmers with low-level
> experience. And that distinction is exactly what high-level languages
> are all about: not mixing high-level intent with mechanical
> implementation, in order to cope with the human brain's difficulty in
> handling different levels of abstraction simultaneously.
[...]
I don't find it difficult to parse (x & 1) as testing for odd/even, but
I do find it distasteful the same way most people would find writing
gotos instead of loops or if-statements distasteful. `&` is a bitwise
operator, and as such has different connotations from (x % 2), which is
more obviously equivalent to the mathematical concept of odd/even.
The machine doesn't care about the difference. In the old days, & was
preferred because it produced better machine code, though nowadays any
non-joke compiler would optimize away the difference. But writing code
isn't merely telling the machine what to do; it's also documenting to
the next human reader what you intended the machine to do. For the
latter purpose, writing (x % 2) conveys intent much better than
(x & 1).
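And the two really are interchangeable on D's two's-complement ints, so
long as you compare against zero. A quick sanity check (a throwaway
sketch, nothing more):

    void main()
    {
        // (x % 2 != 0) and ((x & 1) != 0) agree for every value,
        // negative or not.  Note that comparing (x % 2) against 1 would
        // NOT work for negative x, where the remainder is -1.
        foreach (x; -1000 .. 1000)
            assert((x % 2 != 0) == ((x & 1) != 0));
    }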
OTOH, writing an entire function to abstract away even/oddness is going
a bit too far. That'd be like writing a function ifThen(condition,
trueBranch, falseBranch) that abstracts away if-statements: redundant,
needless complexity for something already expressible with built-in
constructs. Such a function would be justifiable only if you're doing
something like runtime metaprogramming, or writing an interpreter where
you need to package built-in constructs into runtime-manipulable
objects.
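(For the record, such a hypothetical ifThen might look like this in D;
the lazy parameters exist solely to stop both branches from being
evaluated eagerly, which the built-in construct gives you for free:)

    // Hypothetical ifThen, shown only to illustrate the redundancy;
    // 'lazy' prevents eager evaluation of both branches.
    T ifThen(T)(bool condition, lazy T trueBranch, lazy T falseBranch)
    {
        return condition ? trueBranch : falseBranch;
    }

    unittest
    {
        assert(ifThen(1 + 1 == 2, "yes", "no") == "yes");
    }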
T
--
Question authority. Don't ask why, just do it.