Breaking backwards compatibility

Nick Sabalausky a at a.a
Mon Mar 12 12:15:32 PDT 2012


"H. S. Teoh" <hsteoh at quickfur.ath.cx> wrote in message 
news:mailman.531.1331533449.4860.digitalmars-d at puremagic.com...
> On Mon, Mar 12, 2012 at 01:36:06AM -0400, Nick Sabalausky wrote:
>> "H. S. Teoh" <hsteoh at quickfur.ath.cx> wrote in message
>> news:mailman.510.1331520028.4860.digitalmars-d at puremagic.com...
> [...]
>> Personally, I found discrete math to be the easiest class I took since
>> kindergarten (*Both* of the times they made me take discrete math.
>> Ugh. God that got boring.) It was almost entirely the sorts of things
>> that any average coder already understands intuitively. Like
>> DeMorgan's: I hadn't known the name "DeMorgan", but just from growing
>> up writing "if" statements I had already grokked how it worked and how
>> to use it. No doubt in my mind that *all* of us here have grokked it
>> (even any of us who might not know it by name) *and* many of the
>> coworkers I've had who I'd normally classify as "incompetent VB-loving
>> imbeciles".
>
> It's not that I didn't already know most of the stuff intuitively,

Didn't mean to imply that you didn't, of course.
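
(For anyone who doesn't know it by name: it's just the rule for flipping a 
negated compound condition. A quick sketch in D of the sort of thing I mean; 
the names are just made up for illustration:)

import std.stdio;

void main()
{
    // De Morgan's laws, i.e. the rewrite you do instinctively in "if" statements:
    //   !(a && b)  is the same as  (!a || !b)
    //   !(a || b)  is the same as  (!a && !b)
    // e.g. if (!(loggedIn && hasKey))  ==  if (!loggedIn || !hasKey)
    foreach (a; [false, true])
        foreach (b; [false, true])
        {
            assert(!(a && b) == (!a || !b));
            assert(!(a || b) == (!a && !b));
        }
    writeln("holds for all four combinations");
}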

>
>
>> Then there was the Pigeonhole principle, which was basically just obvious
>> corollaries to preschool-level spatial relations. Etc.  All pretty
>> much BASIC-level stuff.
>
> Oh reeeeaally?! Just wait till you learn how the pigeonhole principle
> allows you to do arithmetic with infinite quantities... ;-)
>

Well, the discrete math courses offered at the places I went to didn't take 
things that far. Just explained the principle itself.
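
(The principle, for anyone who hasn't run into it: put n+1 items into n boxes 
and at least one box has to hold two. The programmer's version is that any 
mapping from a bigger set into a smaller one must collide somewhere. A 
throwaway sketch in D; the particular mapping is arbitrary:)

import std.stdio;

void main()
{
    // 11 items, 10 buckets: some bucket is forced to receive at least two,
    // no matter how the items are assigned.
    int[10] buckets;
    foreach (item; 0 .. 11)
        buckets[item * 7 % 10]++;   // any assignment rule at all will do

    foreach (i, count; buckets)
        if (count > 1)
            writefln("bucket %s holds %s items: a guaranteed collision", i, count);
}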

> (And before you shoot me down with "infinite quantities are not
> practical in programming", I'd like to say that certain non-finite
> arithmetic systems actually have real-life consequences in finite
> computations. Look up "Hydra game" sometime. Or "Goodstein sequences" if
> you're into that sorta thing.)
>

Yea, I don't doubt that. While no game programmer, for example, would be 
caught dead having their code crunch calculus at runtime, some of the 
computations games *do* perform use formulas that were derived in the first 
place by doing calculus (mostly the physics, IIRC). Not exactly the same 
thing, but I get that the applicability of theory isn't limited to what the 
computer is actually calculating.
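
(A toy example of what I mean, with made-up numbers: the integration was done 
once on paper, and the game loop only ever evaluates the closed-form result:)

import std.stdio;

void main()
{
    // Integrating a(t) = g twice (offline, by hand) gives the closed form
    //   y(t) = y0 + v0*t + 0.5*g*t^2
    // which is all the running game ever computes.
    double y0 = 10.0, v0 = 20.0, g = -9.81;

    foreach (frame; 0 .. 5)
    {
        double t = frame * 0.1;                    // hypothetical fixed timestep
        double y = y0 + v0 * t + 0.5 * g * t * t;  // no calculus at runtime
        writefln("t = %.1f s  height = %.2f", t, y);
    }
}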

>
> [...]
>> > However, I also found that most big-name colleges are geared toward
>> > producing researchers rather than programmers in the industry.
>>
>> The colleges I've seen seemed to have an identity crisis in that
>> regard: Sometimes they acted like their role was teaching theory,
>> sometimes they acted like their role was job training/placement, and
>> all the time they were incompetent at both.
>
> In my experience, I found that the quality of a course depends a LOT on
> the attitude and teaching ability of the professor. I've had courses
> which were like mind-openers every other class, where you just go "wow,
> *that* is one heck of a cool algorithm!".
>

Yea, I *have* had some good instructors. Not many. But some.

> Unfortunately, (1) most professors can't teach; (2) they're not *paid*
> to teach (they're paid to do research), so they regard it as a tedious
> chore imposed upon them that takes away their time for research. This
> makes them hate teaching, and so most courses suck.
>

#1 I definitely agree with. #2 I don't doubt for at least some colleges, 
although I'm uncertain how applicable it is to public party schools like 
BGSU. There didn't seem to be much research going on there as far as I could 
tell, though I could be wrong.

>
> [...]
>> I once made the mistake of signing up for a class that claimed to be
>> part of the CS department and was titled "Optimization Techniques". I
>> thought it was obvious what it was and that it would be a great class
>> for me to take.  Turned out to be a class that, realistically,
>> belonged in the Math dept and had nothing to do with efficient
>> software, even in theory. Wasn't even in the ballpark of Big-O, etc.
>> It was linear algebra with large numbers of variables.
>
> Ahhhhahahahahaha... must've been high-dimensional polytope optimization
> stuff, I'll bet.

Sounds about right. I think the term "linear programming" was tossed around 
a bit, which I remember from high school as being an application of linear 
algebra rather than anything to do with software.
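
(For anyone curious what that actually looks like: "linear programming" just 
means maximizing a linear expression subject to a pile of linear inequalities. 
A toy sketch in D with made-up numbers, leaning on the fact that the optimum 
of a linear objective over a bounded feasible region sits at a corner:)

import std.stdio;
import std.math : abs;

void main()
{
    // Maximize 3x + 2y subject to:  x + y <= 4,  x <= 3,  x >= 0,  y >= 0.
    // Each constraint is stored as a*x + b*y <= c.
    double[][] cons = [
        [ 1.0,  1.0, 4.0],
        [ 1.0,  0.0, 3.0],
        [-1.0,  0.0, 0.0],
        [ 0.0, -1.0, 0.0],
    ];

    enum eps = 1e-9;
    double best = -double.infinity, bestX = 0, bestY = 0;

    // Intersect every pair of constraint boundaries (Cramer's rule), keep the
    // feasible corners, and take the one with the largest objective value.
    foreach (i; 0 .. cons.length)
        foreach (j; i + 1 .. cons.length)
        {
            auto det = cons[i][0] * cons[j][1] - cons[i][1] * cons[j][0];
            if (abs(det) < eps) continue;   // parallel boundaries, no corner
            auto x = (cons[i][2] * cons[j][1] - cons[i][1] * cons[j][2]) / det;
            auto y = (cons[i][0] * cons[j][2] - cons[i][2] * cons[j][0]) / det;

            bool feasible = true;
            foreach (c; cons)
                if (c[0] * x + c[1] * y > c[2] + eps) { feasible = false; break; }
            if (!feasible)
                continue;

            auto obj = 3.0 * x + 2.0 * y;
            if (obj > best) { best = obj; bestX = x; bestY = y; }
        }

    writefln("max 3x + 2y = %s at (%s, %s)", best, bestX, bestY);   // 11 at (3, 1)
}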

> That stuff *does* have its uses...

Yea, I never doubted that. Just not what I expected. Really caught me 
off guard.

> but yeah, that was a
> really dumb course title.
>
> Another dumb course title that I've encountered was along the lines of
> "computational theory" where 95% of the course talks about
> *uncomputable* problems. You'd think they would've named it
> "*un*computational theory". :-P
>

Yea, that is kinda funny.

>
>> I'm sure it would be great material for the right person, but it
>> wasn't remotely what I expected given the name and department of the
>> course.  (Actually, similar thing with my High School class of
>> "Business Law" - Turned out to have *nothing* to do with business
>> whatsoever. Never understood why they didn't just call the class "Law"
>> or "Civic Law".) Kinda felt "baited and switched" both times.
> [...]
>
> That's why I always took the effort to read course descriptions VERY
> carefully before signing up. It's like the fine print in contracts. You
> skip over it at your own peril.
>

Our course descriptions didn't have much fine print. Just one short 
vaguely-worded paragraph. I probably could have asked around and gotten a 
syllabus from previous semesters, but I didn't learn advanced student tricks 
like that until a few years into college. ;) Plus, there were other concerns, 
like scheduling and requirements. I found that a lot of my course selections 
were dictated more by scheduling and availability than by much of anything 
else.



