Lawrence D'Oliveiro
2024-06-08 01:18:20 UTC
I’m just curious to know whether a phenomenon I experienced, way back when
I first entered University, when the first 8-bit micros were just starting
to scuttle through the undergrowth between the feet of the “real computer”
dinosaurs, still applies today.
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.
By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic about it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside the
lecture hall, worked.
For example, one second-year course was on “system software”. This was
supposed to be about operating systems, some details about how they worked
and how an application program would interface to them. But the lecturer
who gave the course only seemed to have experience of Univac 1100-series
mainframes, which were of course nowhere to be found on or near our
campus.
Meanwhile, I was reading actual manuals (at the computer centre) about how
that kind of stuff worked on the PDP-11, and I was able to write real code
to try it out.
Nowadays of course all the students can carry around their own personal
hardware more powerful than that. But maybe there’s still a related
phenomenon at play: there was a discussion in another newsgroup a few days
ago involving someone who was doing a Comp Sci course around late 200x.
The lecture material was completely Microsoft-centric, talking only about
the Windows platform. They did look a little bit at Linux, but not much.
To me, this represented an opportunity missed. Linux is a system you can
completely pull apart, to learn how every part of it works. And you can
learn things from comparing it with Windows, to see the pros and cons of
different design decisions. Yet the lecturers didn’t seem to have a clue.
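Just to illustrate what I mean by “pulling apart” (my own sketch here, not
something out of any course material): on Linux you can watch exactly how a
program talks to the kernel, for instance by bypassing the C library and
making a system call directly.

    /* minimal sketch: invoke the kernel's write(2) entry point directly,
       bypassing printf/stdio, to see the application/OS boundary up close */
    #include <unistd.h>
    #include <sys/syscall.h>

    int main(void)
    {
        const char msg[] = "hello from a raw system call\n";
        syscall(SYS_write, 1, msg, sizeof msg - 1);
        return 0;
    }

Run it under strace and you can watch the call cross into the kernel, then go
and read the kernel source to see what happens on the other side. Try doing
that on Windows.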
Does that sort of thing still apply today?