D Programmer Jobs at Sociomantic Labs

H. S. Teoh hsteoh at quickfur.ath.cx
Wed Nov 6 10:32:35 PST 2013


On Tue, Nov 05, 2013 at 10:37:38PM -0500, Nick Sabalausky wrote:
[...]
> I've actually tried to trace back just when it was I started on
> Applesoft BASIC, and best I can figure I must have been around 7 or
> 8. I know I had already learned to read (obviously), and I also
> remember it was definitely before second grade (but not
> *immediately* before, IIRC). Not intending to compete on ages of
> course, it's just that when you're a kid and you're at the
> grandparents' place for the evening, all the adults are watching the
> news, and the only thing remotely toy-like is an Apple II...well,
> what else you gonna do? :)

Funny, I got an Apple II when I was 8, and was mostly just playing games
on it. When I was 10 or 11, I got so sick of playing games that I
decided to learn programming instead (i.e., to write my own games :P).
Sadly, I never got very far on that front.


[...]
> Heh, again, somewhat similar: I *tried* to learn C at 13, but found
> it awkward. So I went back to QBASIC, and later on Visual BASIC 3.
> When I was around 15, maybe 16, I found this book at the local
> "Electronics Boutique" as part of a Starter Kit bundle:
> 
> http://www.amazon.com/Teach-Yourself-Game-Programming-Cd-Ro/dp/0672305623
> 
> That book made C (and pointers) finally "click" for me (plus the
> occasional 386/486/586 asm in the bottleneck sections).

Really? I understood pointers right away 'cos they were just fancy
terminology for addresses in assembly language. :)


> Then at around 16-17, I got a Breakout/Arkanoid clone in a few of
> those several-budget-games-on-one-CD packages that were in all the
> computer stores at the time. Didn't make much off it (even McDonald's
> paid more), but I was thrilled :)

Nice! I don't think I ever got that far in my attempts to write games at
the time. I was bogged down with too many radical ideas and not enough
skill to actually implement them. :P


[...]
> > But anyway, w.r.t. the OP, if I were to be in charge of designing a
> > curriculum, I'd put assembly language as the first language to be
> > learned, followed by a good high-level language like D. On this, I
> > agree with Knuth's sentiments:
> >
> >     By understanding a machine-oriented language, the programmer
> >     will tend to use a much more efficient method; it is much closer
> >     to reality. -- D. Knuth
> >
>     People who are more than casually interested in computers should
>     have at least some idea of what the underlying hardware is like.
>     Otherwise the programs they write will be pretty weird.
>     -- D. Knuth
> >
> 
> If I were designing a Programming 101 curriculum, I honestly don't
> know what language I'd pick. In many ways I don't think a lot of the
> details really matter much. But what I do think are the most
> important things in a first language are instant-gratification and a
> strong emphasis on flow-of-execution. Heck, I might even pick
> Applesoft BASIC ;)

True. For a total beginner's intro to programming, assembly is probably
a bit too scary. :P  Applesoft might have been a good choice, but its
age is definitely showing. I dunno. Maybe D? :) At least, the simplest
parts of it. But for computer science majors, I'd say dump assembly on
them, and if they can't handle it, let them switch majors, since they
won't turn out to be good programmers anyway. :P


> Of course, a lot of the choice would depend on the audience. For
> graduate-level math students, Haskell would be a real possibility.

IMNSHO, for graduate-level math students I would *definitely* start with
assembly language. It would serve to dispel the misconception that the
machine is actually capable of representing arbitrary natural numbers or
real numbers, or of computing the exact value of transcendental
functions. Mathematicians tend to think in terms of idealized entities
-- infinite precision, infinite representation length, etc. -- all of
which are impossible on an actual computer. In order to program
effectively, the first order of business is to learn the limitations of
the machine, and then to learn how to translate idealized mathematical
entities into the machine's limited representation in such a way that it
can still compute the desired result.
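
For instance, here's a minimal (untested) D sketch of that translation
step: the idealized exp(x) is an infinite series, but the machine
version has to truncate it once further terms fall below what a double
can even represent (naive, and only sensible for smallish x):

    import std.stdio;
    import std.math : E, fabs;

    // Idealized: exp(x) = sum of x^k/k! over all k -- an infinite sum.
    // Machine version: stop once the terms drop below the precision
    // the hardware can represent anyway.
    double myExp(double x)
    {
        double term = 1.0, sum = 1.0;
        for (int k = 1; fabs(term) > double.epsilon * fabs(sum); ++k)
        {
            term *= x / k;
            sum += term;
        }
        return sum;
    }

    void main()
    {
        writeln(myExp(1.0)); // approximately e -- but only approximately
        writeln(E);          // the library's own finite approximation
    }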

In fact, now that I think of it, the first lecture would be devoted to
convincing them of how much computers suck: they can't represent natural
numbers, can't store real numbers, can't compute transcendental
functions, and don't have infinite speed/time, so only a small subset of
mathematical functions is actually computable.
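
All of that can be demonstrated in a few lines (D shown here, but any
language with fixed-width integers and IEEE floats behaves the same
way):

    import std.stdio;
    import std.math : PI, sin;

    void main()
    {
        // Fixed-width integers wrap around; they are not the naturals.
        int n = int.max;
        writeln(n + 1);            // -2147483648, not 2147483648

        // Binary floating-point can't represent most decimals exactly.
        writeln(0.1 + 0.2 == 0.3); // false

        // "Transcendental" results are just finite approximations.
        writeln(sin(PI));          // tiny nonzero value, not exactly 0
    }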

Then the second lecture would explain what the computer *can* do. That
would be when they'd start learning assembly language and see for
themselves what the machine is actually doing (thereby firmly dispelling
any remaining illusions they may have about mathematical entities vs.
what is actually representable on the machine).
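
You don't even have to leave D for a first taste of that level: DMD's
inline assembler will do (a rough sketch; the register names and syntax
assume DMD on x86):

    import std.stdio;

    void main()
    {
        int a = 40, b = 2, r;
        // Down here there are no "numbers", only fixed-width machine
        // words shuffled between registers and memory.
        asm
        {
            mov EAX, a;
            add EAX, b;
            mov r, EAX;
        }
        writeln(r); // 42
    }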

Then maybe at the end of the course, after having built up a realistic
notion of what the machine is capable (and incapable) of, I'd explain
how things may be put together to produce the illusion of mathematical
computation, say with a functional language like Haskell.
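
(Haskell's arbitrary-precision Integer is the classic example of that
illusion, but even D's std.bigint shows the trick: software simulating
the naturals on top of fixed-width machine words, at a cost in speed
and memory.)

    import std.stdio;
    import std.bigint;

    void main()
    {
        // No machine word holds 2^1000, but a BigInt happily builds
        // it out of many fixed-width limbs.
        BigInt n = BigInt(2) ^^ 1000;
        writeln(n); // all 302 digits of it
    }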

Later on, when they learn the theory of computation, I'd emphasize the
fact that even though we computer people speak of Turing completeness
all the time, no physical computer is *actually* Turing-complete,
because all physical computers have finite storage, and are therefore no
more than glorified finite-state machines. :) Of course, that in no way
makes the theory of Turing machines useless -- it's a useful idealized
abstraction -- but we shouldn't be under the illusion that we actually
have access to the capabilities of a real Turing machine with an
infinite tape. Some things are computable in theory, but outright
infeasible in practice. Like computing the Ackermann function of
Graham's number, say. :-P


T

-- 
What do you call optometrist jokes? Vitreous humor.

