[OT] Re: Andrei's list of barriers to D adoption

Jonathan M Davis via Digitalmars-d digitalmars-d at puremagic.com
Fri Jun 10 11:59:02 PDT 2016


On Friday, June 10, 2016 17:20:29 Ola Fosheim Grøstad via Digitalmars-d wrote:
> On Friday, 10 June 2016 at 15:27:03 UTC, Jonathan M Davis wrote:
> > Most developers have titles like "Software Engineer" or "Senior
> > Software Engineer." They're frequently called programmers and/or
> > software developers when not talking about titles.
>
> Neither academia nor businesses use Computer Scientist as a job
> title... though?

In academia, you'd be a professor of Computer Science or a professor in the
Computer Science department. You wouldn't normally be called a Computer
Scientist - certainly not as a job title. And in businesses, the only
companies that even _might_ have Computer Scientist as a title would be
ones where it was a very research-heavy job, which would not be at all
normal. Research-heavy jobs like that do exist in some large companies,
but in the vast majority of cases, programmers are hired as Software
Engineers to write code for actual products.

> > Yeah. Most universities in the US have a Computer Science
> > degree, but some have Software Engineering as a separate
> > degree. My college had Computer Science, Software Engineering, and
> > Computer Engineering, which is atypical. All of them took
> > practical courses, but the SE guys didn't have to take some of
> > the more theoretical stuff and instead took additional classes
> > focused on working on projects in teams and whatnot.
>
> Sounds like a good setup. At my uni we could pick freely what
> courses we wanted each semester, but needed a certain combination
> of fields and topics to get a specific degree. Like for entering
> computer science you would need the most feared topic, Program
> Verification, taught by Ole-Johan Dahl (co-creator of Simula), who
> was very formal on the blackboard... I felt it was useless at the
> time, but there are some insights you have to be force-fed...
> only to be appreciated later in life. It is useless, but still
> insightful.
>
> Not sure if those narrower programs are doing their students a
> favour, as oftentimes the hardest part is getting a good
> intuition for the basics of a topic, while getting the "expert"
> knowledge for a specific task is comparatively easier, especially
> now that we have the web. So, being "forced" to learn the basics
> of a wider field is useful.

I tend to be of the opinion that the best college program has all of the
more theoretical stuff, because it provides a solid base for real life
programming, but project-based, real world stuff is also very important to
help prepare students for actual jobs. Too many college programs do very
little with helping prepare students for actual programming jobs, but at the
same time, I think that skipping a lot of the theoretical stuff will harm
students in the long run. But striking a good balance isn't exactly easy,
and it's definitely the case that a lot of the more theoretical stuff isn't
as obviously useful then as it is later. In some ways, it would actually
be very beneficial to go back to school to study that stuff after
having programmed professionally for a while, but that's a pain to pull off
time-wise, and the classes aren't really designed with that in mind anyway.

> I'm rather sceptical of choosing C++ as a teaching language, for
> instance. Seems like you would end up wasting a lot of time on
> trivia and end up with students hating programming...

Choosing the right language for teaching is an endless debate with all kinds
of pros and cons. Part of the problem is that languages which are good for
professional work tend to be complicated, with features aimed at getting
work done rather than at teaching, whereas picking a language that skips a
lot of those complications means that students aren't necessarily
well-prepared to deal with the more complicated aspects of a real-world
language.

When I started out in school, C++ was the main language, but it quickly
changed to Java, which removes a whole class of problems, but it still has
a lot of extra cruft (like forcing everything to be in a class and the
pile of modifiers required on main - see the sketch after this paragraph),
and it doesn't at all prepare students to properly deal with pointers and
memory. So, students starting out with Java have some fun problems when
they then have to deal with C or C++.
Alternatively, there are folks in favor of starting with functional
languages, which has certain advantages, but it's so different from how
folks would program normally that I'm not sure that it's ultimately a good
idea. All around, it's a difficult problem, and I don't know what the right
choice is. In general, there are serious problems with teaching with real
world languages, and teaching with a language that was designed for teaching
doesn't necessarily prepare students for the real world.  I don't envy
teachers having to figure out how to teach basic programming concepts.
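
Just to illustrate the ceremony involved, here's roughly the smallest
complete Java program you can write - even "hello world" requires a class
wrapper and the full set of modifiers on main:

    // The smallest complete Java program: the class wrapper and every
    // modifier on main are mandatory.
    public class Hello {
        public static void main(String[] args) {
            System.out.println("Hello, world!");
        }
    }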

Regardless, I think that students should be at least exposed to both the
imperative/OO languages and the functional languages over the course of
school if they're going to be well-rounded programmers. So, a lot of the
question is more about how best to teach the beginning concepts than about
what to use later in the curriculum. To some extent, once you've got the
basic stuff down, the language doesn't necessarily matter much.

I did think that it was funny, though, when the teacher in the networking
course I took said that we were doing it in C, because if we did it in
Java, then there wouldn't be much of a class. We basically implemented TCP
on top of UDP as part of the course, whereas in Java, it would be a lot more
likely to use RMI or the like and not even deal with sockets, let alone
memory.
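
For a flavor of that kind of exercise, here's a rough sketch of the core
idea - a stop-and-wait sender that layers a sequence number and
retransmit-on-timeout on top of raw UDP, which is the first step toward
TCP-like reliability. It's in Java with DatagramSocket rather than C
sockets, purely for illustration, and the host, port, and class name are
made up:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    // Hypothetical sketch: send one datagram reliably by prepending a
    // one-byte sequence number and retransmitting until it's acked.
    public class StopAndWaitSender {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket();
            socket.setSoTimeout(500); // retransmit if no ack in 500 ms
            InetAddress dest = InetAddress.getByName("localhost");
            int port = 9000; // made-up port for illustration

            byte seq = 0;
            byte[] payload = "hello".getBytes();
            byte[] packet = new byte[payload.length + 1];
            packet[0] = seq; // header: just a sequence number
            System.arraycopy(payload, 0, packet, 1, payload.length);

            while (true) {
                socket.send(new DatagramPacket(packet, packet.length,
                                               dest, port));
                try {
                    byte[] ackBuf = new byte[1];
                    socket.receive(new DatagramPacket(ackBuf, 1));
                    if (ackBuf[0] == seq) {
                        break; // receiver acked this sequence number
                    }
                } catch (java.net.SocketTimeoutException e) {
                    // no ack in time - loop around and retransmit
                }
            }
            socket.close();
        }
    }

A real implementation would alternate the sequence number, track a sliding
window, and handle duplicate acks, but even this much is exactly the kind
of machinery that RMI would hide entirely.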

- Jonathan M Davis



