food for thought - swift 5 released - bottom types, string interpolation, and stuff.
Nick Sabalausky (Abscissa)
SeeWebsiteToContactMe at semitwist.com
Fri Apr 12 06:14:05 UTC 2019
On 4/11/19 10:29 PM, Walter Bright wrote:
>
>> Creating functions like `isOdd` or `GetLeastSignificantBit` help
>> convey intent without having to supplement code with comments.
>
> No they don't. For example, if I saw "isOdd" in code, I'd wonder what
> the heck that was doing. I'd go look up its documentation,
> implementation, test cases, etc., find out "oh, it's just doing the
> obvious thing", and be annoyed at the waste of my time.
>
> It wastes the time of everyone reading the code, documenting the
> function, and the QA staff. It hides the useful stuff in the library
> when it's alongside a morass of junk like this. It wastes the time of
> the poor shlub perusing the library. It makes the library look like a
> bunch of filler rather than useful stuff.
>
> I don't believe anybody's code is so well-documented that this is all
> that's left. Meaning you're wasting your time documenting the wrong things.
>
> I call this so
Pardon me here, but: Bullshit.
I'll qualify that: For code that you write for yourself, I have no doubt
that your approach here is both valid and good, and even optimal. But in
the general case...umm, no...
See, here's the thing: You, undeniably, have vastly more low-level
programming experience than 99.99% of programmers (I would know, I have
greater-than-average low-level experience myself)...but to be more
specific...to YOU, the pattern "xyz & 1" is (presumably) cognitively
synonymous with BOTH "look at the LSB" and "check evenness/oddness". And
that's reasonable, because, after all, both interpretations are
certainly technically true.
But here's the problem: That's your superpower. It's enviable, it's
impressive, but it's absolutely NOT shared by the vast majority of GOOD
programmers, let alone rank-and-file ones.
Granted, I will be the FIRST person to jump out and shout "The vast
majority of programmers are incompetent hacks, have no idea what they're
doing, and don't deserve their own jobs." And yet...Well...look at my
low-level experience: I've done videogame programming targeting 386 via
Watcom's Doom-famed DOS extender, plus PalmOS equivalent (we're talking
Dragonball processor days here, not ARM). I've studied and scrutinized
every word publicly written by such low-level greats as Michael Abrash
about all of their famed low-level tweaks. I've written homebrew for
such now-simplistic systems as the GameBoy Advance and Parallax's
"Propeller" microcontroler, including the Propeller's very first audio
driver (multichannel, written in 100% ASM, supporting frequency sweeps,
music notes A-G, ADSR envelopes, general PCM, and a realtime interactive
Piano demo), and the Propeller's first EEPROM driver, plus the
Propeller's very first pseudo-3D applications (despite the
microcontroller being between NES and SNES in capabilities). I was the
first person to get D code running on a GameBoy Advance, period. I
implemented Bresenham's line drawing algorithm inside a freaking web
browser *loooong* before HTML5. Heck, I entered a GBA coding competition
and won the highest spot awarded to an entry by a one-person team. I was
writing assembler code on the Apple II when my age was still in the single digits. And
I wrote playable software for the Atari VCS/2600 AND designed and built
the real-world EEPROM burner and PC driver I used to run said software
on a physical Atari VCS/2600.
In short: I...know...low-level. Period.
And yet, even *I* look at "x & 1" or "x % 2", and my first instinct
is..."WTF is going on...? Some bit twiddling, but to what purpose...".
*Then* I work out what it's actually doing.
And then, if it's "x & 1", I think "This environment uses two's
complement, so what are the implications for negative values, and do I
need to worry about them?"
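(And that worry is real, by the way. Here's a quick D sketch of exactly
what I mean; these are just the raw expressions, nothing from any
library, and the comments are mine:)

    import std.stdio;

    void main()
    {
        int x = -3;

        // Two's complement: the low bit of a negative odd number is
        // set, so "& 1" happens to give the right answer for oddness:
        writeln(x & 1);       // 1 -- the "LSB" reading and the
                              // "oddness" reading coincide here

        // The "obvious" modulo spelling is another story: D's %
        // truncates toward zero, so -3 % 2 is -1, not 1...
        writeln(x % 2);       // -1

        // ...which means the common test "x % 2 == 1" silently misses
        // negative odd values:
        writeln(x % 2 == 1);  // false, even though -3 is odd
        writeln(x % 2 != 0);  // true: this is the safe spelling
    }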
If YOU look at "x & 1" and *instinctually* see an even/odd check that
also supports, and is not thwarted by, two's complement negatives...then
I tip my hat to you good sir for your impressive skills, but you are
clearly NOT, by any stretch of the mind, a normal, typical programmer.
You're not even a normal/typical *GOOD* programmer. You're an elite. One
of the rare, the proud, the few.
For the rest of us, this "x & 1" (heck, even my first instinct would've
been to do "x % 2") is *an implementation detail*.
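And implementation details belong behind a name. Something like this
sketch of the isOdd being argued about upthread (hypothetical code,
mine, not anything actually in Phobos):

    import std.traits : isIntegral;

    /// True iff x is odd. Correct for negative values too, since in
    /// two's complement the low bit of any odd number is set.
    bool isOdd(T)(T x) if (isIntegral!T)
    {
        return (x & 1) != 0;
    }

    unittest
    {
        assert( isOdd(3));
        assert( isOdd(-3));  // the case a naive "x % 2 == 1" gets wrong
        assert(!isOdd(0));
        assert(!isOdd(-4));
    }

Work out the two's-complement argument ONCE, pin it down in a unittest,
and nobody reading a call site ever has to re-derive it again.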