Discussion:
Disconnect Between Coursework And Real-World Computers
Lawrence D'Oliveiro
2024-06-08 01:18:20 UTC
Permalink
I’m just curious to know whether a phenomenon I experienced, way back when
I first entered University, when the first 8-bit micros were just starting
to scuttle through the undergrowth between the feet of the “real computer”
dinosaurs, still applies today.

From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.

By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside the
lecture hall, worked.

For example, one second-year course was on “system software”. This was
supposed to be about operating systems, some details about how they worked
and how an application program would interface to them. But the lecturer
who gave the course only seemed to have experience of Univac 1100-series
mainframes, which were of course nowhere to be found anywhere on or near
our campus.

Meanwhile, I was reading actual manuals (at the computer centre) about how
that kind of stuff worked on the PDP-11, and I was able to write real code
to try it out.

Nowadays of course all the students can carry around their own personal
hardware more powerful than that. But maybe there’s still a related
phenomenon at play: there was a discussion in another newsgroup a few days
ago involving someone who was doing a Comp Sci course in the late 2000s.
The lecture material was completely Microsoft-centric, talking only about
the Windows platform. They did look a little bit at Linux, but not much.

To me, this represented an opportunity missed. Linux is a system you can
completely pull apart, to learn how every part of it works. And you can
learn things from comparing it with Windows, to see the pros and cons of
different design decisions. Yet the lecturers didn’t seem to have a clue.

Does that sort of thing still apply today?
John Dallman
2024-06-08 09:06:00 UTC
Permalink
I'm just curious to know whether a phenomenon I experienced, way
back when I first entered University, when the first 8-bit micros
were just starting to scuttle through the undergrowth between the
feet of the _real computer_ dinosaurs, still applies today.
By contrast, most of the Comp Sci lecturers didn't seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside
the lecture hall, worked.
For example, one second-year course was on _system software_. This
was supposed to be about operating systems, some details about how
they worked and how an application program would interface to them.
But the lecturer who gave the course only seemed to have experience
of Univac 1100-series mainframes, which were of course nowhere to
be found anywhere on or near our campus.
Most of my lecturers in 1980-83 were genuinely interested, and had
practical research projects running using real hardware. There were two
subgroups that didn't:

There were some who were essentially mathematicians, and their research
was on theorems and proofs. That was appropriate for them.

There were also "business systems" guys who seemed to see their role as
passing on the established practices of business data processing - of ten
or so years earlier. One of them wore a suit and tie to give lectures. I
decided they were dull, and used the course options to turn my "Data
Processing" degree, ostensibly in business data processing, into a
microprocessors, operating systems and graphics course. That has served
me well.

John
Lawrence D'Oliveiro
2024-06-09 00:37:06 UTC
Permalink
Post by John Dallman
There were also "business systems" guys who seemed to see their role as
passing on the established practices of business data processing - of
ten or so years earlier. One of them wore a suit and tie to give
lectures. I decided they were dull, and used the course options to turn
my "Data Processing" degree, ostensibly in business data processing,
into a microprocessors, operating systems and graphics course. That has
served me well.
I studiously avoided COBOL during my degree. Then in my first job after
graduation, the company (a software house) wanted a “Unix expert”
(something else I had had limited exposure to at the time), and ... guess
what? The biggest chunk of my time was doing COBOL programming on CTOS
machines, creating a leasing system for a finance company. Also some work
on a Wang VS system running a “4GL” (remember those?) called “SPEED II”.
And eventually, some actual Unix work, to port an accounting system
between various different proprietary Unixes of the time (yes, their
differences were a pain).

I didn’t actually do any database courses, but I have done a fair amount
of database work since then. (There is some satisfaction out of getting
the needed info with a single query using a seven-way join...) Also the
opportunities for playing with decent graphics hardware were limited while
I was doing my degree, but what was once expensive and high-end has become
commonplace now, and it’s easy enough to get to grips with graphics APIs
(Cairo, OpenGL) and graphics tools (Blender, Inkscape, GIMP) on your own
hardware.
John Dallman
2024-06-09 13:32:00 UTC
Permalink
Post by Lawrence D'Oliveiro
I studiously avoided COBOL during my degree.
I didn't get to do that. In first year, we learned Pascal and an
artificial assembly language created for teaching. In second year,
Algol-68R, FORTRAN and COBOL. So I did one term of COBOL, learned the
basics, and wanted nothing further to do with it.
Post by Lawrence D'Oliveiro
And eventually, some actual Unix work, to port an accounting system
between various different proprietary Unixes of the time (yes,
their differences were a pain).
Yup. I've done ports to HP-UX, Solaris, IRIX, AIX, MacOS X (three
different architectures), Linux, and Android. iOS is a UNIX-based system,
but has to be treated as /sui generis/.


John
songbird
2024-06-09 16:57:34 UTC
Permalink
Post by John Dallman
Post by Lawrence D'Oliveiro
I studiously avoided COBOL during my degree.
I didn't get to do that. In first year, we learned Pascal and an
artificial assembly language created for teaching. In second year,
Algol-68R, FORTRAN and COBOL. So I did one term of COBOL, learned the
basics, and wanted nothing further to do with it.
heh, yeah, i had one short course of it and that was
enough, but funny enough some 12 years later the uni
took out their custom software and replaced it with an
Oracle DB system underneath, but all the reports were
done in COBOL. so being able to at least understand
it was a positive for me having to deal with integrating
and coming up with a job scheduling script. this was
the mid 90s. i left when that conversion project was
winding down and i sure did not want to get stuck with
a certain cow-orker (because they were notorious for
making lists of things to do but not actually getting
any of those items done).
Post by John Dallman
Post by Lawrence D'Oliveiro
And eventually, some actual Unix work, to port an accounting system
between various different proprietary Unixes of the time (yes,
their differences were a pain).
Yup. I've done ports to HP-UX, Solaris, IRIX, AIX, MacOS X (three
different architectures), Linux, and Android. iOS is a UNIX-based system,
but has to be treated as /sui generis/.
i was glad to have the experience from mainframes, to
minis and then to PCs. most of the big programs i wrote
were on the mainframe or minis, but they now would run
pretty well on this fairly old-generation PC. i love
that i can do some graphics tinkering if i want to and
then when i'm not doing anything very intensive at all
the machine here will idle at about 20 watts as i'm
typing along.


songbird
Charlie Gibbs
2024-06-09 17:22:11 UTC
Permalink
Post by John Dallman
Post by Lawrence D'Oliveiro
I studiously avoided COBOL during my degree.
I didn't get to do that. In first year, we learned Pascal and an
artificial assembly language created for teaching. In second year,
Algol-68R, FORTRAN and COBOL. So I did one term of COBOL, learned the
basics, and wanted nothing further to do with it.
Our computer science curriculum studiously avoided COBOL, since it
was one of those filthy real-world languages. If you really wanted
to try it, they could bring up an emulator overnight to run DOS/360
under MTS, and use the IBM compiler. I don't think anybody ever tried.

The real-world machine I was programming part-time was too small
to run a COBOL compiler - everything was done in assembly language
(which I loved) and RPG. I tried mentioning RPG to a CS weenie
and he almost threw up.

It was about ten years later that I got involved with COBOL,
and became rather adept. It was (IMHO) more fun than all the
weird languages at the university, where one course hit you
with a new one every two weeks. LISP, UMIST, SNOBOL4, more
flavours of Algol than you could throw a stick at (including
pl360, the misbegotten bastard child of Algol and assembly
language)... Fortunately I dropped out before I had to
take that one.
--
/~\ Charlie Gibbs | The Internet is like a big city:
\ / <***@kltpzyxm.invalid> | it has plenty of bright lights and
X I'm really at ac.dekanfrus | excitement, but also dark alleys
/ \ if you read it the right way. | down which the unwary get mugged.
Lawrence D'Oliveiro
2024-06-09 23:14:51 UTC
Permalink
Our computer science curriculum studiously avoided COBOL, since it was
one of those filthy real-world languages.
I never quite understood the design goals of COBOL. Was there a class of
programmers that dumb, that they preferred all that ADD/MULTIPLY/etc stuff
as opposed to the kinds of expressions you would write in Fortran and just
about every other language?

Here’s another thing: COBOL was supposed to be specifically designed for
“business” needs. This seemed to mean, among other things, no clever
BASIC-style string handling. Yet what happens in the 1980s onwards, but
relational databases become very common for “business” needs. And what’s
the best way to interface to a relational database? Via SQL, an entirely
separate query language. This means being able to construct SQL strings in
your program suddenly becomes very handy.

So what next? The COBOL implementors add various nonstandard extensions to
their compilers to cope with this SQL interface need. All of them clunky,
imperfect solutions. This becomes particularly obvious later with the rise
of the Web, and the proliferation of business sites built on something
like a MySQL/MariaDB DBMS, using dynamic languages like Perl and Python
(and PHP, if you must) that can do construction of complex strings without
breaking a sweat.

Luckily, by that time, everybody recognizes that COBOL has very firmly
become a “legacy technology”.
The real-world machine I was programming part-time was too small to run
a COBOL compiler - everything was done in assembly language (which I
loved) and RPG. I tried mentioning RPG to a CS weenie and he almost
threw up.
I never quite understood why RPG needed to exist. Another reflection of
the deficiencies of COBOL in constructing SQL query strings?
Kerr-Mudd, John
2024-06-10 09:06:19 UTC
Permalink
On Sun, 9 Jun 2024 23:14:51 -0000 (UTC)
Lawrence D'Oliveiro <***@nz.invalid> wrote:

[]
Post by Lawrence D'Oliveiro
I never quite understood why RPG needed to exist. Another reflection of
the deficiencies of COBOL in constructing SQL query strings?
RPG existed way back; seemingly it was a kind of computerised plugboard.
It was initially intended as a Report Program Generator, but (just like
spreadsheets later) got twisted into doing other things it wasn't good at.
Oh look:
https://en.wikipedia.org/wiki/IBM_RPG#History
--
Bah, and indeed Humbug.
Louis Krupp
2024-06-10 21:15:49 UTC
Permalink
Post by Lawrence D'Oliveiro
Our computer science curriculum studiously avoided COBOL, since it was
one of those filthy real-world languages.
I never quite understood the design goals of COBOL. Was there a class of
programmers that dumb, that they preferred all that ADD/MULTIPLY/etc stuff
as opposed to the kinds of expressions you would write in Fortran and just
about every other language?
Here’s another thing: COBOL was supposed to be specifically designed for
“business” needs. This seemed to mean, among other things, no clever
BASIC-style string handling. Yet what happens in the 1980s onwards, but
relational databases become very common for “business” needs. And what’s
the best way to interface to a relational database? Via SQL, an entirely
separate query language. This means being able to construct SQL strings in
your program suddenly becomes very handy.
So what next? The COBOL implementors add various nonstandard extensions to
their compilers to cope with this SQL interface need. All of them clunky,
imperfect solutions. This becomes particularly obvious later with the rise
of the Web, and the proliferation of business sites built on something
like a MySQL/MariaDB DBMS, using dynamic languages like Perl and Python
(and PHP, if you must) that can do construction of complex strings without
breaking a sweat.
Luckily, by that time, everybody recognizes that COBOL has very firmly
become a “legacy technology”.
The real-world machine I was programming part-time was too small to run
a COBOL compiler - everything was done in assembly language (which I
loved) and RPG. I tried mentioning RPG to a CS weenie and he almost
threw up.
I never quite understood why RPG needed to exist. Another reflection of
the deficiencies of COBOL in constructing SQL query strings?
From what I've been able to find, FORTRAN was first compiled correctly
in 1958, and ALGOL dates back to 1958, so when COBOL was developed in
1959, there weren't a lot of widely available alternatives. COBOL was
good with unit record equipment; if your data were on punched cards,
with (for example) a social security number in columns 1 through 9, a
name in columns 10 through 30, an interest rate in columns 31 through 34
with an implied decimal point between columns 32 and 33, COBOL was the
way to go. Sure, you could read those fields with FORTRAN or ALGOL
formats, but decimal to binary conversion could be slower than letting
COBOL do its decimal arithmetic, and CPUs weren't what they are today.
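
Purely as an illustration, here is a hypothetical C sketch of reading that
layout by hand (the field positions are the ones described above; in COBOL
the record description would be declared once in the DATA DIVISION rather
than parsed like this):

#include <stdio.h>
#include <string.h>

/* Hypothetical card layout: SSN in columns 1-9, name in columns 10-30,
   interest rate in columns 31-34 with an implied decimal point between
   columns 32 and 33. */
struct card {
    char ssn[10];     /* 9 digits + terminator */
    char name[22];    /* 21 characters + terminator */
    double rate;      /* "0425" -> 4.25 */
};

static int parse_card(const char *image, struct card *out)
{
    int whole, frac;
    if (strlen(image) < 34)
        return -1;                            /* short record */
    memcpy(out->ssn, image, 9);
    out->ssn[9] = '\0';
    memcpy(out->name, image + 9, 21);
    out->name[21] = '\0';
    if (sscanf(image + 30, "%2d%2d", &whole, &frac) != 2)
        return -1;                            /* non-numeric rate field */
    out->rate = whole + frac / 100.0;         /* implied decimal point */
    return 0;
}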

FORTRAN, as far as I know, has no built-in sort, but COBOL does. FORTRAN
doesn't have a search statement; COBOL does.

Once upon a time, there were programmers who wrote engineering
applications in FORTRAN and went to ACM meetings, and there were
programmers who wrote business applications in COBOL and went to DPMA
conferences. You probably wouldn't have asked a business programmer to
write something in FORTRAN, and you definitely wouldn't ask an engineer
to write COBOL.

(See
https://en.wikipedia.org/wiki/Association_of_Information_Technology_Professionals
for information on DPMA.)

The COBOL interface to SQL might be clunky and imperfect, but some
people can live with that. Others can't, and they're the ones who go on
to develop things like Perl and Python and modern Fortran and the COBOL
report writer module because yes, even COBOL has progressed, in its own way.

I've never used RPG. It was apparently introduced in 1959, so it
predates COBOL, and it was probably the only game in town for a while.
My guess is that it survived because it did some of the same things as
COBOL with unit record data, but with less overhead. Remember, computers
were slow in those days, and compilation was expensive.

String handling has been late to the party in a number of languages; it
involves dynamic memory allocation, and that's always been a challenge.
As far as I know, Fortran 90 has it, but FORTRAN 77 doesn't. Unisys
ALGOL has it, but it's limited; the 2017 manual says:

===
The number of strings that can be declared in a program is limited by
the operating system to 500. If this limit is exceeded, the message
STRING POOL EXCEEDED is given.
===

and anyway it's Unisys and it's ALGOL and nobody cares.

Louis
John Levine
2024-06-11 15:00:01 UTC
Permalink
Post by Lawrence D'Oliveiro
I never quite understood why RPG needed to exist. Another reflection of
the deficiencies of COBOL in constructing SQL query strings?
You could run it on a 4K 360/20 with only a card reader/punch and printer.
People who programmed in RPG told me that for the kinds of stuff it was
intended for, it was way easier to slap together a little RPG program
than to do it in a "real" language.

COBOL needed TOS or DOS which meant 16K, preferably 32K, and a disk or a couple of tape drives.

Unix users who are familiar with awk may recognize the feeling. Yes, you
can write big programs in awk but the sweet spot is short scripts that
use the read/process/write loop.
--
Regards,
John Levine, ***@taugh.com, Primary Perpetrator of "The Internet for Dummies",
Please consider the environment before reading this e-mail. https://jl.ly
Bob Eager
2024-06-09 21:18:39 UTC
Permalink
Post by John Dallman
Post by Lawrence D'Oliveiro
I studiously avoided COBOL during my degree.
I didn't get to do that. In first year, we learned Pascal and an
artificial assembly language created for teaching. In second year,
Algol-68R, FORTRAN and COBOL. So I did one term of COBOL, learned the
basics, and wanted nothing further to do with it.
Post by Lawrence D'Oliveiro
And eventually, some actual Unix work, to port an accounting system
between various different proprietary Unixes of the time (yes, their
differences were a pain).
I taught on a UK CS degree for 37 years. Here's an idea of what we were
doing ten years ago.

First year:
- Intensive Java, group project with a meaningful product.
- Intensive maths
- Operating system theory
- Intensive UNIX course (command line only)
- Writing proper documentation
- HCI
- Theory of computation

Second year:
- Advanced operating systems, practical project
- Machine architecture
- Software engineering (projects)
- Assembler programming
- C programming
- Functional programming (Haskell)
- Optional modules

Final year:
- Group project, or individual research project. Substantial
- Choice of optional modules

Most students took a year in industry, working on real projects (e.g.
arena-side during the 2012 Olympics).
--
Using UNIX since v6 (1975)...

Use the BIG mirror service in the UK:
http://www.mirrorservice.org
Lawrence D'Oliveiro
2024-06-09 23:03:55 UTC
Permalink
Post by John Dallman
In first year, we learned Pascal and an
artificial assembly language created for teaching. In second year,
Algol-68R, FORTRAN and COBOL. So I did one term of COBOL, learned the
basics, and wanted nothing further to do with it.
I did Pascal as well, and enjoyed it. A lecturer got hold of something
called “NBS Pascal”, which was a two-pass compiler written in itself. It
had to be two passes because, as small a language as Pascal was, it could
not be handled in a single program that fitted into the 64K address space
of the PDP-11.

That compiler had some omissions (e.g. no gotos) and some interesting bugs
in it. But given we had the source code, there was fun to be had in
finding and fixing those bugs.

(We didn’t call it “open source” in those days, but I guess that’s what it
was.)

For example, the “extern” feature for linking to non-Pascal code didn’t
work. (Later I did find and fix that bug.) But I discovered that it was
possible to replace dummy Pascal routines with assembly-language ones of
the same names at link time. The first routine in your program would be
named “III001” in the object code, the second one “III002” etc, so it
helped to keep those dummies near the top of the program. This allowed me
full access to OS functionality that Pascal itself did not support.

As for Algol-68 ... did you find that an interesting language? I found the
“Revised Report” in the University library in my first year, and the whole
concept just blew my mind. But I was never able to get my hands on a
working implementation, until within the last few years Algol-68 Genie came
along.
Post by John Dallman
Yup. I've done ports to HP-UX, Solaris, IRIX, AIX, MacOS X (three
different architectures), Linux, and Android. iOS is a UNIX-based
system, but has to be treated as /sui generis/.
I’m mildly surprised you didn’t have trouble with MacOS X as well.
John Dallman
2024-06-09 23:24:00 UTC
Permalink
Post by Lawrence D'Oliveiro
As for Algol-68 ... did you find that an interesting language? I
found the _Revised Report_ in the University library in my first
year, and the whole concept just blew my mind. But I was never able
to get my hands on a working implementation, until within the last
few years Algol-68 Genie came along.
It was a really interesting language. It was the first one we did in
second year. I remember, 42 years later, looking at the one-page summary
we were handed out in the first lecture, and feeling everything from
Pascal falling into place. No imperative procedural language was hard
after that. We used the original 68-R implementation on an ICL mainframe.
Post by Lawrence D'Oliveiro
Post by John Dallman
Yup. I've done ports to HP-UX, Solaris, IRIX, AIX, MacOS X (three
different architectures), Linux, and Android. iOS is a UNIX-based
system, but has to be treated as /sui generis/.
I'm mildly surprised you didn't have trouble with MacOS X as well.
I work on libraries, not applications. The test harness is a command line
program, and the debug graphics are X11. I've never had to write a single
Mac OS X dialog.

John
Lawrence D'Oliveiro
2024-06-09 23:43:30 UTC
Permalink
Post by John Dallman
It was a really interesting language. It was the first one we did in
second year. I remember, 42 years later, looking at the one-page summary
we were handed out in the first lecture, and feeling everything from
Pascal falling into place. No imperative procedural language was hard
after that.
Niklaus Wirth hated Algol-68. He felt it was way too complex. He quit the
committee and went off to create Pascal instead.

Did you look at the two-level grammar? I felt that was just amazing, that
it could specify, entirely in the production rules, that a variable being
referenced not only had to be declared in scope, but also given a
compatible type (after any permissible conversions) as well.

I do wish C had adopted Algol-style “:=” for assignment and “=” for
equality testing ...
Post by John Dallman
We used the original 68-R implementation on an ICL mainframe.
That was the first-ever substantial implementation of Algol-68, as I
recall, done at RSRE Malvern?
Post by John Dallman
Post by Lawrence D'Oliveiro
Post by John Dallman
Yup. I've done ports to HP-UX, Solaris, IRIX, AIX, MacOS X (three
different architectures), Linux, and Android. iOS is a UNIX-based
system, but has to be treated as /sui generis/.
I'm mildly surprised you didn't have trouble with MacOS X as well.
I work on libraries, not applications. The test harness is a command
line program, and the debug graphics are X11. I've never had to write a
single Mac OS X dialog.
Ah. So your test harness required installing an X server on MacOS X, even
if the deliverable did not?

But now, X11 is on its way out in the Linux world, and some in the BSD
world are following suit, too. Wayland is the new way to do cross-platform
GUIs.
John Dallman
2024-06-10 08:27:00 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by John Dallman
We used the original 68-R implementation on an ICL mainframe.
That was the first-ever substantial implementation of Algol-68, as
I recall, done at RSRE Malvern?
As best I remember, yes - it's been over 40 years. The main restriction
was that everything had to be declared before use, which was not
difficult.
Post by Lawrence D'Oliveiro
Ah. So your test harness required installing an X server on MacOS
X, even if the deliverable did not?
Yes, but that isn't much of a problem.
Post by Lawrence D'Oliveiro
But now, X11 is on its way out in the Linux world, and some in the
BSD world are following suit, too. Wayland is the new way to do
cross-platform GUIs.
We will use X11 while we can, because routing graphics across the network
is really useful.

John
Ahem A Rivet's Shot
2024-06-10 09:42:04 UTC
Permalink
On Mon, 10 Jun 2024 09:27 +0100 (BST)
Post by John Dallman
As best I remember, yes - it's been over 40 years. The main restriction
was that everything had to be declared before use, which was not
difficult.
I only ever met 68C but presumably 68R also had typography
dependent stropping?

It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue, variables
in red, comments in light grey ...

Oh hang on vim does that to my code anyway!
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
John Dallman
2024-06-10 12:17:00 UTC
Permalink
Post by Ahem A Rivet's Shot
I only ever met 68C but presumably 68R also had typography
dependent stropping?
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'. This is not too bad when you're using
punched cards, since you have to be quite careful with the typography
anyway.

The machine in question had history. The CPU was old enough that the
brand name on it was ICT (International Computers & Tabulators) rather
than ICL (International Computers Limited). It was a 1904S*. The * was
not an official designation: we'd scrounged equipment from other
universities when they replaced their ICLs, and found that we could plug
in an extra memory cabinet above the spec for this model. Worked fine.

The operating system was George 2L, the Loughborough-enhanced version of
George 2, which we ran in preference to George 3.
Post by Ahem A Rivet's Shot
It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue,
variables in red, comments in light grey ...
No, thanks. Colour highlighting with my partial sight makes things
harder.

John
Kerr-Mudd, John
2024-06-10 16:17:27 UTC
Permalink
On Mon, 10 Jun 2024 13:17 +0100 (BST)
Post by John Dallman
Post by Ahem A Rivet's Shot
I only ever met 68C but presumably 68R also had typography
dependent stropping?
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'. This is not too bad when you're using
punched cards, since you have to be quite careful with the typography
anyway.
The machine in question had history. The CPU was old enough that the
brand name on it was ICT (International Computers & Tabulators) rather
than ICL (International Computers Limited). It was a 1904S*. The * was
not an official designation: we'd scrounged equipment from other
universities when they replaced their ICLs, and found that we could plug
in an extra memory cabinet above the spec for this model. Worked fine.
The operating system was George 2L, the Loughborough-enhanced version of
George 2, which we ran in preference to George 3.
All I know about George III (and that includes the Kings - was that the
fellah with speech trouble?) was that it had actual IF...ELSE...ENDIF,
as opposed to IBM's JCL with its arcane reverse condition-code testing.
Post by John Dallman
Post by Ahem A Rivet's Shot
It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue,
variables in red, comments in light grey ...
No, thanks. Colour highlighting with my partial sight makes things
harder.
John
--
Bah, and indeed Humbug.
Lawrence D'Oliveiro
2024-06-11 07:08:32 UTC
Permalink
All I know about George III (and that includes the Kings- was that the
fellah with speech trouble?) ...
No, that George was from the 18th century, and was the one that Uranus was
originally named for. Yes, did you know Uranus was originally called
“Georgium Sidus” (“George’s Star”)? Maybe they thought “Uranus” was more
dignified. Fnarr-fnarr.

The stutterer was a later, 20th-century George.
Ahem A Rivet's Shot
2024-06-11 07:41:14 UTC
Permalink
On Tue, 11 Jun 2024 07:08:32 -0000 (UTC)
Maybe they thought “Uranus” was more dignified. Fnarr-fnarr.
Then there was Patrick Moore who thought Urinous sounded more
dignified.
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Charlie Gibbs
2024-06-10 17:40:06 UTC
Permalink
Post by John Dallman
Post by Ahem A Rivet's Shot
I only ever met 68C but presumably 68R also had typography
dependent stropping?
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'. This is not too bad when you're using
punched cards, since you have to be quite careful with the typography
anyway.
Oh, I remember that on the IBM mainframe at university. Yech.

Mind you, by then I was hooked on the joys of assembly language,
and becoming adept at pushing bits around. I hated all Wirthian
languages and still do; I didn't like having some snooty compiler
slapping my wrist and telling me I couldn't do something that a
couple of machine instructions would accomplish.

The head of our computer science department was one J.E.L. Peck,
who was on the Algol 68 design committee. I wrote a term project
in assembly language, partly because I liked it better and partly
out of sheer defiance. During the review he would occasionally
stop and say, with a pained expression on his face, "Why did you
write it in assembly language?"

By this time I was writing assembly language and RPG part-time
at a small commercial shop, so I dropped out and became a
real-world programmer.
--
/~\ Charlie Gibbs | The Internet is like a big city:
\ / <***@kltpzyxm.invalid> | it has plenty of bright lights and
X I'm really at ac.dekanfrus | excitement, but also dark alleys
/ \ if you read it the right way. | down which the unwary get mugged.
Bob Eager
2024-06-10 21:38:51 UTC
Permalink
The head of our computer science department was one J.E.L. Peck, who was
on the Algol 68 design committee. I wrote a term project in assembly
language, partly because I liked it better and partly out of sheer
defiance. During the review he would occasionally stop and say, with a
pained expression on his face, "Why did you write it in assembly
language?"
I knew John Peck; he flew me over from the UK to Vancouver to advise them
on something, and I stayed in his house. He and his wife also stayed with
us for a week.

At that time, he was a fan of BCPL, which is about as low level as you can
get for a high level language. He was responsible for a nice line editor
called CHEF, which was rather like the UNIX 'ed' on steroids. Much nicer,
too.

I implemented that editor on CP/M, VAX/VMS, UNIX and one or two others.
I still have the sources.
--
Using UNIX since v6 (1975)...

Use the BIG mirror service in the UK:
http://www.mirrorservice.org
R Daneel Olivaw
2024-06-10 18:26:04 UTC
Permalink
Post by John Dallman
Post by Ahem A Rivet's Shot
I only ever met 68C but presumably 68R also had typography
dependent stropping?
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'. This is not too bad when you're using
punched cards, since you have to be quite careful with the typography
anyway.
The machine in question had history. The CPU was old enough that the
brand name on it was ICT (International Computers & Tabulators) rather
than ICL (International Computers Limited). It was a 1904S*. The * was
not an official designation: we'd scrounged equipment from other
universities when they replaced their ICLs, and found that we could plug
in an extra memory cabinet above the spec for this model. Worked fine.
The operating system was George 2L, the Loughborough-enhanced version of
George 2, which we ran in preference to George 3.
Post by Ahem A Rivet's Shot
It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue,
variables in red, comments in light grey ...
No, thanks. Colour highlighting with my partial sight makes things
harder.
John
My experiences with Algol 68-R were on an ICL 1903 under George 3, and
they predate yours by around 6-8 years. The implementation was fairly
good, except that Heap storage was not an option, and I can't remember
being able to link/map in subroutines written in other languages.
The ICL 19xx character set was a bit strange, it seemed to be based on
Ascii. All values being Octal, it was:
ICL Ascii Text
000-017 060-077 "0" - "?" (yes, character zero was binary zero)
020-037 040-057 space - "/"
040-077 100-137 "@" - "_"
With only 6 bits to play with it's hardly surprising that they only had
upper case characters. An ICL 19xx word was 24 bits, 4 6-bit bytes.

After having worked extensively with Algol 68 - a language which tries
to make it difficult for you to screw up - running across C years later
was a big shock, a language which embraces obfuscation. I worked on a
mainframe and took to writing system-interface subroutines in Assembler
rather than C because it was simpler.
Want to pass a variable length text string as an argument to a
subroutine? Write the subroutine in assembler and it could extract the
information as to the length of the string from the control information.
No "termination with a null character", no buffer overflows either.
This was on Unisys 2200s, the successors of the Univac 1100s mentioned
at the top of this thread.
Peter Flass
2024-06-10 23:53:05 UTC
Permalink
Post by R Daneel Olivaw
Post by John Dallman
Post by Ahem A Rivet's Shot
I only ever met 68C but presumably 68R also had typography
dependent stropping?
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'. This is not too bad when you're using
punched cards, since you have to be quite careful with the typography
anyway.
The machine in question had history. The CPU was old enough that the
brand name on it was ICT (International Computers & Tabulators) rather
than ICL (International Computers Limited). It was a 1904S*. The * was
not an official designation: we'd scrounged equipment from other
universities when they replaced their ICLs, and found that we could plug
in an extra memory cabinet above the spec for this model. Worked fine.
The operating system was George 2L, the Loughborough-enhanced version of
George 2, which we ran in preference to George 3.
Post by Ahem A Rivet's Shot
It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue,
variables in red, comments in light grey ...
No, thanks. Colour highlighting with my partial sight makes things
harder.
John
My experiences with Algol 68-R were on an ICL 1903 under George 3, and
they predate yours by around 6-8 years. The implementation was fairly
good, except that Heap storage was not an option, and I can't remember
being able to link/map in subroutines written in other languages.
The ICL 19xx character set was a bit strange, it seemed to be based on
ICL Ascii Text
000-017 060-077 "0" - "?" (yes, character zero was binary zero)
020-037 040-057 space - "/"
With only 6 bits to play with it's hardly surprising that they only had
upper case characters. An ICL 19xx word was 24 bits, 4 6-bit bytes.
After having worked extensively with Algol 68 - a language which tries
to make it difficult for you to screw up - running across C years later
was a big shock, a language which embraces obfuscation. I worked on a
mainframe and took to writing system-interface subroutines in Assembler
rather than C because it was simpler.
Want to pass a variable length text string as an argument to a
subroutine? Write the subroutine in assembler and it could extract the
information as to the length of the string from the control information.
No "termination with a null character", no buffer overflows either.
This was on Unisys 2200s, the successors of the Univac 1100s mentioned
at the top of this thread.
Null-terminated strings are probably the biggest error in C’s design. At
the cost of an extra byte they could have had Pascal (and PL/I) strings
with an actual length.
--
Pete
R Daneel Olivaw
2024-06-11 06:10:27 UTC
Permalink
Post by Peter Flass
Post by R Daneel Olivaw
Post by John Dallman
Post by Ahem A Rivet's Shot
I only ever met 68C but presumably 68R also had typography
dependent stropping?
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'. This is not too bad when you're using
punched cards, since you have to be quite careful with the typography
anyway.
The machine in question had history. The CPU was old enough that the
brand name on it was ICT (International Computers & Tabulators) rather
than ICL (International Computers Limited). It was a 1904S*. The * was
not an official designation: we'd scrounged equipment from other
universities when they replaced their ICLs, and found that we could plug
in an extra memory cabinet above the spec for this model. Worked fine.
The operating system was George 2L, the Loughborough-enhanced version of
George 2, which we ran in preference to George 3.
Post by Ahem A Rivet's Shot
It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue,
variables in red, comments in light grey ...
No, thanks. Colour highlighting with my partial sight makes things
harder.
John
My experiences with Algol 68-R were on an ICL 1903 under George 3, and
they predate yours by around 6-8 years. The implementation was fairly
good, except that Heap storage was not an option, and I can't remember
being able to link/map in subroutines written in other languages.
The ICL 19xx character set was a bit strange, it seemed to be based on
ICL Ascii Text
000-017 060-077 "0" - "?" (yes, character zero was binary zero)
020-037 040-057 space - "/"
With only 6 bits to play with it's hardly surprising that they only had
upper case characters. An ICL 19xx word was 24 bits, 4 6-bit bytes.
After having worked extensively with Algol 68 - a language which tries
to make it difficult for you to screw up - running across C years later
was a big shock, a language which embraces obfuscation. I worked on a
mainframe and took to writing system-interface subroutines in Assembler
rather than C because it was simpler.
Want to pass a variable length text string as an argument to a
subroutine? Write the subroutine in assembler and it could extract the
information as to the length of the string from the control information.
No "termination with a null character", no buffer overflows either.
This was on Unisys 2200s, the successors of the Univac 1100s mentioned
at the top of this thread.
Null-terminated strings are probably the biggest error in C’s design. At
the cost of an extra byte they could have had Pascal (and PL/I) strings
with an actual length.
I skimmed an interview with one of the creators of the language, and he
saw things the same way - a big mistake.
Lawrence D'Oliveiro
2024-06-11 07:10:13 UTC
Permalink
Post by Peter Flass
Null-terminated strings are probably the biggest error in C’s design. At
the cost of an extra byte they could have had Pascal (and PL/I) strings
with an actual length.
I suppose they didn’t want to have some arbitrary maximum length, at the
cost of a representation that is not completely data-independent.

For example, Pascal implementations (at least on micros) commonly provided
a “Str255” type, which could hold a maximum of 255 bytes.
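
Roughly, such a counted string is just a length byte followed by the data,
which is where the 255-byte ceiling comes from. A hypothetical C sketch (not
any particular compiler's actual layout):

#include <stdio.h>
#include <string.h>

/* Pascal-style counted string: one length byte, then up to 255 bytes. */
typedef struct {
    unsigned char length;
    char data[255];
} Str255;

static void str255_assign(Str255 *s, const char *src)
{
    size_t n = strlen(src);
    if (n > 255)
        n = 255;                     /* silently truncate, as many runtimes did */
    s->length = (unsigned char)n;
    memcpy(s->data, src, n);         /* no terminator byte needed */
}

int main(void)
{
    Str255 s;
    str255_assign(&s, "hello, world");
    printf("%.*s (%u bytes)\n", (int)s.length, s.data, (unsigned)s.length);
    return 0;
}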
John Ames
2024-06-11 15:44:02 UTC
Permalink
On Tue, 11 Jun 2024 07:10:13 -0000 (UTC)
Post by Lawrence D'Oliveiro
I suppose they didn’t want to have some arbitrary maximum length, at
the cost of a representation that is not completely data-independent.
That's ultimately the thing - every solution for the problem of "array
of arbitrary size" involves *some* trade-off. If you have a definite
length specifier, then size is no longer completely arbitrary; if you
use in-band signaling, it's vulnerable to malformed data; if you use a
linked-list-of-sized-chunks structure, traversing the array is no
longer a simple walk. You have to pick *one,* or you have no solution
at all.

One might argue that defining the length property with the same size as
the pointer type should be as near to truly-arbitrary as makes no
difference, which is true enough, I s'pose; but then you're assuming
memory is cheap enough that an extra N bytes per string/array will
never add up to a problem.

Which may well be true, on modern desktop and server hardware, in this
era where tens of gigabytes of DRAM can be had for liquor money, if not
beer money - but it *definitely* wasn't true Back In The Day, and there
are still plenty of spaces today where that's not the case. So, then,
d'you choose another solution to the original problem across-the-board,
or do you resign yourself to doing things differently in embedded
spaces than you do on Real Computers? At the end of the day, there's no
perfect answer to that question.
John Ames
2024-06-11 18:12:43 UTC
Permalink
Additionally, it's completely possible to have strings be both null-
terminated *and* size-tagged! If you care about that sort of thing,
it's as simple as defining a new type:

typedef struct {
    size_t length;
    char * text;
} safe_string;

...which you can handle safely in your own code, or pass to functions
expecting a C-string as some_string.text - of course, that only solves
the issue within your own libraries/applications, but it's entirely
feasible.
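
A hypothetical usage sketch (the names are made up, not from any library);
note that it is the text member itself, not its address, that goes to
anything expecting a plain C string:

#include <stdio.h>
#include <stddef.h>

typedef struct {
    size_t length;
    char *text;
} safe_string;

int main(void)
{
    safe_string s = { 5, "hello" };   /* the literal is NUL-terminated anyway */
    printf("%.*s has length %zu\n", (int)s.length, s.text, s.length);
    puts(s.text);                     /* s.text, not &s.text */
    return 0;
}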
Stefan Ram
2024-06-11 18:55:58 UTC
Permalink
Post by John Ames
Additionally, it's completely possible to have strings be both null-
terminated *and* size-tagged!
[Disclaimer: not directly related to "folklore", but to strings in C]

Yes, one can say that C is in some sense agnostic to that question.

String literals do yield null-terminated strings, so one
needs to forgo or wrap them, just as one needs to forgo the str*
functions of the C library. But then one can write one's own library
functions for size-tagged strings, which is precisely what
happens when one implements Python in C. And this work has
already been done for C programmers, who can use CPython as
a library for size-tagged strings in their C programs.
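
A minimal sketch of that last point, assuming CPython's development headers
are installed (build with something like
cc demo.c $(python3-config --cflags --embed --ldflags)); error checks omitted:

#include <Python.h>
#include <stdio.h>

int main(void)
{
    Py_Initialize();

    /* CPython str objects carry their own length; embedded NULs are fine. */
    PyObject *a = PyUnicode_FromStringAndSize("hello\0world", 11);
    PyObject *b = PyUnicode_FromString(" and more");
    PyObject *c = PyUnicode_Concat(a, b);

    printf("length = %ld\n", (long)PyUnicode_GetLength(c));   /* 20 */

    Py_DECREF(a);
    Py_DECREF(b);
    Py_DECREF(c);
    Py_FinalizeEx();
    return 0;
}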
Charlie Gibbs
2024-06-11 20:40:12 UTC
Permalink
Post by Stefan Ram
Post by John Ames
Additionally, it's completely possible to have strings be both null-
terminated *and* size-tagged!
[Disclaimer: not directly related to "folklore", but to strings in C]
Yes, one can say that C is in some sense agnostic to that question.
String literals do yield null-terminated strings, so one
needs to forgo or wrap them, just as one needs to forgo the str*
functions of the C library. But then one can write one's own library
functions for size-tagged strings, which is precisely what
happens when one implements Python in C. And this work has
already been done for C programmers, who can use CPython as
a library for size-tagged strings in their C programs.
The new C function getline() is a step in that direction, and indeed
served as an inspiration for me to write my own library of string
functions in C.
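
For anyone who hasn't met it, a minimal sketch of the getline() idiom (it is
POSIX rather than ISO C; the library allocates the buffer and hands back the
length, so no strlen() is needed):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *line = NULL;    /* getline() allocates and grows this as needed */
    size_t cap = 0;
    ssize_t len;

    while ((len = getline(&line, &cap, stdin)) != -1)
        printf("%zd bytes: %s", len, line);

    free(line);
    return 0;
}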
--
/~\ Charlie Gibbs | The Internet is like a big city:
\ / <***@kltpzyxm.invalid> | it has plenty of bright lights and
X I'm really at ac.dekanfrus | excitement, but also dark alleys
/ \ if you read it the right way. | down which the unwary get mugged.
Lawrence D'Oliveiro
2024-06-12 03:14:44 UTC
Permalink
Post by Charlie Gibbs
The new C function getline() is a step in that direction, and indeed
served as an inspiration for me to write my own library of string
functions in C.
A summary of the current state of play:
<https://manpages.debian.org/7/string_copying.en.html>.
Mike Spencer
2024-06-11 22:39:52 UTC
Permalink
Post by Stefan Ram
Post by John Ames
Additionally, it's completely possible to have strings be both null-
terminated *and* size-tagged!
[Disclaimer: not directly related to "folklore", but to strings in C]
Yes, one can say that C is in some sense agnostic to that question.
May I digress further from aFc to ask about strings in modern data bases?

I learned about dBase II (old enough for folklore?) where a data base
was composed of records and each record was of specified size. That
meant that either (a) Lake Chargoggagoggmanchauggagoggchaubunagungamaugg
wouldn't fit in some hypothetical db of lake names or (b) lots of
precious space was wasted.

How do modern DB systems handle strings of arbitrary length or even
large blocks of text?

I can think of ways, but that's just guesswork from a purely avocational techie.
--
Mike Spencer Nova Scotia, Canada
John Ames
2024-06-11 23:11:59 UTC
Permalink
On 11 Jun 2024 19:39:52 -0300
Post by Mike Spencer
How do modern DB systems handle strings of arbitrary length or even
large blocks of text?
My knowledge on this is limited compared to real DB gurus, I'm sure,
but: fixed-length fields are still relatively common, when you can be
reasonably sure that a limit of X characters will fit the bill. But to
cite the example I'm familiar with, MS Access's - go ahead and laugh,
DB gurus - Memo type essentially stores a pointer into a string heap
that exists somewhere outside of the normal record block. The trade-off
is flexibility vs. speed - fixed-length fields are right there in the
record, while variable-length requires an extra read from elsewhere.

(The other trade-off is fragmentation, naturally, but Access is already
a garbage fire where that's concerned...)
Lawrence D'Oliveiro
2024-06-12 03:12:48 UTC
Permalink
.. Access is already a garbage fire where that's concerned...)
I don’t know why people still want to use Access. Even SQLite is better
than that. And LibreOffice Base offers you a choice of add-on DBMS
backends, including both MySQL/MariaDB (full transactions and multiuser!)
and SQLite (also does transactions!).
Peter Flass
2024-06-11 23:59:57 UTC
Permalink
Post by Mike Spencer
Post by Stefan Ram
Post by John Ames
Additionally, it's completely possible to have strings be both null-
terminated *and* size-tagged!
[Disclaimer: not directly related to "folklore", but to strings in C]
Yes, one can say that C is in some sense agnostic to that question.
May I digress further from aFc to ask about strings in modern data bases?
I learned about dBase II (old enough for folklore?) where a data base
was composed of records and each record was of specified size. That
meant that either (a) Lake Chargoggagoggmanchauggagoggchaubunagungamaugg
wouldn't fit in some hypothetical db of lake names or (b) lots of
precious space was wasted.
How do modern DB systems handle strings of arbitrary length or even
large blocks of text?
I can think of ways, but that's just guesswork from a purely avocational techie.
I’m not an expert, but DB2 can store arbitrary data as “blobs” (binary
large objects)
--
Pete
Lawrence D'Oliveiro
2024-06-12 03:03:35 UTC
Permalink
... DB2 can store arbitrary data as “blobs” (binary large objects)
So can all modern DBMSes. And “blob” doesn’t stand for anything, it just
means what it says (“Binary Large OBject” is what they call a
“backronym”).
Lawrence D'Oliveiro
2024-06-12 03:11:10 UTC
Permalink
Post by Mike Spencer
How do modern DB systems handle strings of arbitrary length or even
large blocks of text?
The main DBMSes I have experience with are MySQL/MariaDB and SQLite.

MariaDB has both fixed-length and variable-length types for character/text
or binary data <https://mariadb.com/kb/en/data-types/>. The variable-
length types can have maximum lengths of 2**16 - 1, 2**24 - 1 and 2**32 -
1 bytes (a.k.a. “short”, “medium” and “long”).

SQLite is rather less rigid in its interpretation of types
<https://sqlite.org/datatype3.html>. The maximum length of data types
defaults to 10**9, according to
<https://sqlite.org/limits.html#max_length>, and this can be raised to no
more than 2**31 - 1.
Ahem A Rivet's Shot
2024-06-11 20:08:32 UTC
Permalink
On Tue, 11 Jun 2024 11:12:43 -0700
Post by John Ames
Additionally, it's completely possible to have strings be both null-
terminated *and* size-tagged! If you care about that sort of thing,
typedef struct {
size_t length;
char * text;
} safe_string;
...which you can handle safely in your own code, or pass to functions
expecting a C-string as some_string.text - of course, that only solves
the issue within your own libraries/applications, but it's entirely
feasible.
Absolutely - and had something like that been added around (say)
the time that ANSI reworked the syntax it might have caught on. Although
the overhead of a size_t for a one character string is a bit heavy (another
reason for null terminated strings, they're space efficient).
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Ahem A Rivet's Shot
2024-06-11 07:20:32 UTC
Permalink
On Mon, 10 Jun 2024 16:53:05 -0700
Post by Peter Flass
Null-terminated strings are probably the biggest error in C’s design. At
the cost of an extra byte they could have had Pascal (and PL/I) strings
with an actual length.
Right, but would that have been an 8-bit byte with a 256-character
limit or a 9-bit one with a 512-character limit? How would you handle
large strings? One of the reasons for null-terminated strings was to
avoid issues like that. Another reason was the sheer elegance of things
like strcpy and strtok.
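
The oft-quoted K&R-style copy loop is the usual exhibit for that elegance - a
sketch, not the actual library implementation:

/* Copy the NUL-terminated string s into d; d must be large enough. */
void my_strcpy(char *d, const char *s)
{
    while ((*d++ = *s++) != '\0')
        ;   /* the terminator itself ends the loop */
}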

Search the archives of this group and you will find Dennis Ritchie
explaining the reasoning behind the choice.
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Lawrence D'Oliveiro
2024-06-11 08:31:19 UTC
Permalink
Another reason was the sheer elegance of things like strcpy and strtok.
Neither of which is recommended nowadays.
Ahem A Rivet's Shot
2024-06-11 09:08:11 UTC
Permalink
On Tue, 11 Jun 2024 08:31:19 -0000 (UTC)
Post by Lawrence D'Oliveiro
Another reason was the sheer elegance of things like strcpy and strtok.
Neither of which is recommended nowadays.
For sure, but that was then when memory and cycles counted for more
than safety. Also it was a systems programming language designed to make
writing an efficient OS easier, it was never designed to be an application
programming language for widespread use - so once again efficiency over
safety was the right call in context.

It is still the language of choice for kernel code.

It was, and is, a poor language choice for commercial code.
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Andy Walker
2024-06-10 22:17:45 UTC
Permalink
Post by John Dallman
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'.
Before the myths get too much traction, /your/ ICL 1900 may
have been u-c only, but /ours/ definitely also had l-c, and the usual
way to write Algol 68R was using u-c stropping [my memory confirmed
by reference to the A68R User Guide]:

( REAL x; read (x); print (sqrt (ABS x)) )

or whatever. Of course, that was for paper tape; cards were u-c only.
I was "brought up" on paper tape, and have [therefore] never understood
why some people preferred cards and horrid u-c programs.
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Handel
R Daneel Olivaw
2024-06-11 06:52:13 UTC
Permalink
Post by John Dallman
The ICL 1900 was an uppercase-only machine, so you had to put Algol
keywords in 'SINGLE-QUOTES'.
    Before the myths get too much traction, /your/ ICL 1900 may
have been u-c only, but /ours/ definitely also had l-c, and the usual
way to write Algol 68R was using u-c stropping [my memory confirmed
  ( REAL x; read (x); print (sqrt (ABS x)) )
or whatever.  Of course, that was for paper tape;  cards were u-c only.
I was "brought up" on paper tape, and have [therefore] never understood
why some people preferred cards and horrid u-c programs.
As I wrote above, an ICL 19xx word was 24 bits = 4 x 6-bit bytes. ICL's
next offering was the 2900 series and that had a 32-bit word, the 2900
character set was EBCDIC although I think it could also use Ascii.

The 2900 was pretty advanced when it came on the market, but it was
utterly incompatible with the 1900 systems and one of my early jobs was a
conversion project from ICL 1900 to Univac 1100 - if they had to convert
their 1900 programs then it was going to be away from that manufacturer.

May I refer you to "Layout of 24-bit word" in
https://www.ict1900.com/1-3-1%201900%20System%20Architecture.htm (the
ICL 1900 series was originally ICT 1900).

As to the Algol 68-R User Guide, the syntax was supposed to be
applicable to other implementations (I'm not sure there were any) and
not just the ICL 1900. The Preface is somewhat misleading there,
Appendix 1 modifies it.
Take a good look at section 1.5 and at Appendix 1.
(my copy is the second edition, published by HMSO in 1974)
Bob Eager
2024-06-11 09:46:56 UTC
Permalink
Post by R Daneel Olivaw
As I wrote above, an ICL 19xx word was 24 bits = 4 x 6-bit bytes. ICL's
next offering was the 2900 series and that had a 32-bit word, the 2900
character set was EBCDIC although I think it could also use Ascii.
That's correct. ICL software didn't use ASCII, but you could flip a bit
(bit 10) in the system status register to make it ASCII or EBCDIC. All
that did, however, was make the BCD instructions act appropriately.
Post by R Daneel Olivaw
The 2900 was pretty advanced when it came on the market, but it was
utterly incompatible with the 1900 systems and one of my early jobs was a
conversion project from ICL 1900 to Univac 1100 - if they had to convert
their 1900 programs then it was going to be away from that manufacturer.
Although all the microcoded 2900s (all except the 298x) could also run
1900 code. Either as a machine that IPLd as a 1900 from the start, or
dynamically (i.e. you could have one process running in 1900 mode). That
was another set of bits in the SSR (bits 12-15). The multiple bits were
because there was provision for other emulations (e.g. LEO and System 4).
--
Using UNIX since v6 (1975)...

Use the BIG mirror service in the UK:
http://www.mirrorservice.org
Bob Eager
2024-06-11 09:50:01 UTC
Permalink
Post by Bob Eager
Post by R Daneel Olivaw
As I wrote above, an ICL 19xx word was 24 bits = 4 x 6-bit bytes.
ICL's next offering was the 2900 series and that had a 32-bit word, the
2900 character set was EBCDIC although I think it could also use Ascii.
That's correct. ICL software didn't use ASCII, but you could flip a bit
(bit 10) in the system status register to make it ASCII or EBCDIC. All
that did, however, was make the BCD instructions act appropriately.
Post by R Daneel Olivaw
The 2900 was pretty advanced when it came on the market, but it was
utterly incompatible with the 1900 systems and one of my early jobs was a
conversion project from ICL 1900 to Univac 1100 - if they had to
convert their 1900 programs then it was going to be away from that
manufacturer.
Although all the microcoded 2900s (all except the 298x) could also run
1900 code. Either as a machine that IPLd as a 1900 from the start, or
dynamically (i.e. you could have one process running in 1900 mode). That
was another set of bits in the SSR (bits 12-15). The multiple bits were
because there was provision for other emulations (e.g. LEO and System 4).
Correction. It was just the 2980 (and I think the 2982) that were hard
wired. The later machines (e.g. 2988) were microcoded.
--
Using UNIX since v6 (1975)...

Use the BIG mirror service in the UK:
http://www.mirrorservice.org
Andy Walker
2024-06-11 14:02:31 UTC
Permalink
[...]
     Before the myths get too much traction, /your/ ICL 1900 may
have been u-c only, but /ours/ definitely also had l-c, [...].
Of course, that was for paper tape;  cards were u-c only. [...]
As I wrote above, an ICL 19xx word was 24 bits = 4 x 6-bit bytes. [...]
Yes, but that was internal. Paper tape equipment used shifts to
extend the 64 available characters. Wiki at

https://en.wikipedia.org/wiki/ICT_1900_series#Character_sets

gives an example and more info. I assume this was inherited from Atlas,
which used a similar scheme, dictated by the available Flexowriters.
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Bendel
R Daneel Olivaw
2024-06-11 15:52:11 UTC
Permalink
[...]
     Before the myths get too much traction, /your/ ICL 1900 may
have been u-c only, but /ours/ definitely also had l-c, [...].
Of course, that was for paper tape;  cards were u-c only. [...]
As I wrote above, an ICL 19xx word was 24 bits = 4 x 6-bit bytes. [...]
    Yes, but that was internal.  Paper tape equipment used shifts to
extend the 64 available characters.  Wiki at
  https://en.wikipedia.org/wiki/ICT_1900_series#Character_sets
gives an example and more info.  I assume this was inherited from Atlas,
which used a similar scheme, dictated by the available Flexowriters.
Now that falls under "very strange".
#74 seems to be dual purpose - Alpha-shift, but also ECMA "$".
#75 can be Beta-shift or "]"
#76 can be Delta-shift or ECMA up-arrow
#77 can be fill-char or ECMA left-arrow.
The $, ], up-arrow, left-arrow usages (and pound for Ascii $) are
confirmed by Algol 68-R Appendix 1, but the other ECMA character (` as
_) does not make sense because "`" is outside the #0 - #77 range.

In any case, the three shift characters (+ fill) cannot have been used
when programming, which rules out having lower case chars in programs.

Maybe I knew how all this worked 45 years ago but it's long gone now.
Andy Walker
2024-06-11 19:29:26 UTC
Permalink
On 11/06/2024 16:52, R Daneel Olivaw wrote:
[...]
Post by R Daneel Olivaw
As I wrote above, an ICL 19xx word was 24 bits = 4 x 6-bit bytes. [...]
     Yes, but that was internal.  Paper tape equipment used shifts to
extend the 64 available characters.  Wiki at
   https://en.wikipedia.org/wiki/ICT_1900_series#Character_sets
gives an example and more info.  I assume this was inherited from Atlas,
which used a similar scheme, dictated by the available Flexowriters.
Now that falls under "very strange".
#74 seems to be dual purpose - Alpha-shift, but also ECMA "$".
#75 can be Beta-shift or "]"
#76 can be Delta-shift or ECMA up-arrow
#77 can be fill-char or ECMA left-arrow.
The $, ], up-arrow, left-arrow usages (and pound for Ascii $) are
confirmed by Algol 68-R Appendix 1, but the other ECMA character (` as
_) does not make sense because "`" is outside the #0 - #77 range.

In any case, the three shift characters (+ fill) cannot have been
used when programming, which rules out having lower case chars in
programs.
You're conflating the characters on the paper tape [and so
available for writing programs] with the available characters [type
"CHAR"] for output to a lineprinter. Compare a modern keyboard: eg,
"'" and "@" are on the same physical key, as are "6" and "^", etc.,
so would generate the same code on physical paper tape. Likewise,
my keyboard generates both "A" and "a" from one physical key. There
is no necessary connexion with which [if any] lineprinter character
is generated. These days, keyboards can do clever things before the
computer gets at the key presses, but Flexowriters simply punched one
row of holes for each key you pressed [inc shift keys].
Post by R Daneel Olivaw
Maybe I knew how all this worked 45 years ago but it's long gone now.
Likewise, so I don't recall what happened if you had in your
program [for example]

CHAR c = "a"; print (c)

[assuming an UC-only lineprinter]. Luckily it no longer matters.
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Bendel
Ahem A Rivet's Shot
2024-06-11 20:12:24 UTC
Permalink
On Tue, 11 Jun 2024 20:29:26 +0100
Post by Andy Walker
You're conflating the characters on the paper tape [and so
available for writing programs] with the available characters [type
"CHAR"] for output to a lineprinter. Compare a modern keyboard: eg,
so would generate the same code on physical paper tape.
The shift and control keys both modify what gets punched onto the
paper tape so these do not produce the same code on the paper tape.
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Andy Walker
2024-06-11 22:12:38 UTC
Permalink
On 11/06/2024 21:12, Ahem A Rivet's Shot wrote:
[I wrote:]
Post by Ahem A Rivet's Shot
Post by Andy Walker
You're conflating the characters on the paper tape [and so
available for writing programs] with the available characters [type
"CHAR"] for output to a lineprinter. Compare a modern keyboard: eg,
so would generate the same code on physical paper tape.
The shift and control keys both modify what gets punched onto the
paper tape so these do not produce the same code on the paper tape.
Not so for a 1960s Flexowriter. There was no control key, and
shifts were characters in their own right. Essentially -- press a key,
any key, and a row of holes is punched. With 7-track [6-bit + parity]
tape, there was no room to encode any more information. It was up to
the computer to keep track of which case the tape was in. [There were,
of course, other more "modern" electric typewriters, but by the time we
got our hands on one, Unix was "in" and everything went 8-bit Ascii.]
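A minimal sketch of the bookkeeping that implies, in Python (the #74/#75/#76
alpha/beta/delta shifts and the #77 fill are the codes quoted earlier in the
thread; the character tables here are invented placeholders, not the real
1900 graphics):

# Hypothetical decoder for a 6-bit shifted paper-tape stream.
ALPHA, BETA, DELTA, FILL = 0o74, 0o75, 0o76, 0o77

# Placeholder tables: code -> character, one table per shift state.
TABLES = {
    ALPHA: {0o01: 'A', 0o02: 'B', 0o03: 'C'},   # upper case (made up)
    BETA:  {0o01: 'a', 0o02: 'b', 0o03: 'c'},   # lower case (made up)
    DELTA: {0o01: '1', 0o02: '2', 0o03: '3'},   # digits etc. (made up)
}

def decode(frames):
    """Decode 6-bit frames; the computer, not the Flexowriter, tracks case."""
    shift = ALPHA                  # assume the tape starts in alpha shift
    out = []
    for code in frames:
        if code in (ALPHA, BETA, DELTA):
            shift = code           # shift codes change state, emit nothing
        elif code == FILL:
            continue               # fill/erase: ignore
        else:
            out.append(TABLES[shift].get(code, '?'))
    return ''.join(out)

# 'A', switch to beta shift, 'a', switch back to alpha, 'B'  ->  "AaB"
print(decode([0o01, BETA, 0o01, ALPHA, 0o02]))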
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Bendel
Charlie Gibbs
2024-06-11 20:40:11 UTC
Permalink
Post by Andy Walker
You're conflating the characters on the paper tape [and so
available for writing programs] with the available characters [type
"CHAR"] for output to a lineprinter. Compare a modern keyboard: eg,
so would generate the same code on physical paper tape. Likewise,
my keyboard generates both "A" and "a" from one physical key. There
is no necessary connexion with which [if any] lineprinter character
is generated. These days, keyboards can do clever things before the
computer gets at the key presses, but Flexowriters simply punched one
row of holes for each key you pressed [inc shift keys].
A similar thing was true of the keypunches at the university.
They were in the process of converting from an IBM 7044 to a
360/67, and had two sets of keypunches. The older ones printed
BCD glyphs on the cards, while the new ones were EBCDIC. I
quickly noticed that the correspondence between the keys and
the holes punched was the same for either. So while other
students waited for one of the right keypunches to become
available, I would sit down at one of the "wrong" ones and
start touch typing, ignoring the fact that many of the special
characters printed on the card were the only thing that was
wrong. Between this and my ability to clear a jammed keypunch
with nothing more than a blank card torn in half lengthwise,
I never had to wait for a keypunch, even at peak times.
(Plus, I left the cleared punch for others to use afterwards.)

But getting back to paper tape, several years later I was in a
job where I had to write a program to read paper tapes punched
on an IBM Selectric with an attached tape punch. These tapes
were not punched in EBCDIC, ASCII, or any code I recognized.
I found that the "Correspondence Code" punched into these tapes
was nothing more than the tilt and rotate codes used to set the
"golf ball" in the proper position for each character.
--
/~\ Charlie Gibbs | The Internet is like a big city:
\ / <***@kltpzyxm.invalid> | it has plenty of bright lights and
X I'm really at ac.dekanfrus | excitement, but also dark alleys
/ \ if you read it the right way. | down which the unwary get mugged.
Lawrence D'Oliveiro
2024-06-12 03:01:52 UTC
Permalink
Between this and my ability to clear a jammed keypunch with nothing more
than a blank card torn in half lengthwise,
I never had to wait for a keypunch, even at peak times.
I never encountered a jammed keypunch. Of course, the only time I used one
was a period of about six weeks in a summer job at the end of my first
year. (An IBM 129, I think it was.)

The researchers normally left their coding forms for the data-entry staff
to punch on cards. But I found they made so many mistakes confusing “I”
with “1” and “O” with “0”, that I started punching my own cards, using the
single public-access keypunch machine that was mainly intended for
making corrections to small numbers of cards.

When they found out that yes, I really could type that fast, they agreed
to let me continue doing my own data entry, as long as I gave up my seat
every few minutes to let others in the queue have their chance.
Lawrence D'Oliveiro
2024-06-11 07:03:24 UTC
Permalink
Post by Andy Walker
Of course, that was for paper tape; cards were u-c only.
I was "brought up" on paper tape, and have [therefore] never understood
why some people preferred cards and horrid u-c programs.
I heard that ICL preferred paper tape, IBM preferred cards. ;)
John Dallman
2024-06-11 08:03:00 UTC
Permalink
Post by Andy Walker
Post by John Dallman
The ICL 1900 was an uppercase-only machine, so you had to put
Algol keywords in 'SINGLE-QUOTES'.
Before the myths get too much traction, /your/ ICL 1900 may
have been u-c only, but /ours/ definitely also had l-c . . .
Of course, that was for paper tape; cards were u-c only.
We only had punched card access: there were no paper tape readers or
punches accessible to students, and I don't know if there were any on the
machine.

John
R Daneel Olivaw
2024-06-11 08:55:47 UTC
Permalink
Post by John Dallman
Post by Andy Walker
Post by John Dallman
The ICL 1900 was an uppercase-only machine, so you had to put
Algol keywords in 'SINGLE-QUOTES'.
Before the myths get too much traction, /your/ ICL 1900 may
have been u-c only, but /ours/ definitely also had l-c . . .
Of course, that was for paper tape; cards were u-c only.
We only had punched card access: there were no paper tape readers or
punches accessible to students, and I don't know if there were any on the
machine.
John
That applied where I was as well (AFAIR), although I'm pretty sure we
had a card punch peripheral. This is approaching 50 years ago now and
some details have become a bit hazy.
Lawrence D'Oliveiro
2024-06-11 04:36:20 UTC
Permalink
Post by Ahem A Rivet's Shot
It occurred to me to think that if anyone were to go down that
rat hole today they'd have colour to play with, keywords in blue,
variables in red, comments in light grey ...
Oh hang on vim does that to my code anyway!
What you have there is some automatic system for highlighting things
without actually conveying extra information. What Algol-68 needs is a
whole separate set of characters for representing keywords, as distinct
from the one for representing regular identifiers.
Stefan Ram
2024-06-10 10:03:26 UTC
Permalink
Post by John Dallman
As best I remember, yes - it's been over 40 years. The main restriction
was that everything had to be declared before use, which was not
difficult.
I think sometimes the textual "preceding" relationship is
getting mixed up with the temporal "before" relationship
here. It's completely fine to write

BEGIN
a := 10;
END

/first/, and then /later/ go back up with the cursor and turn this
into

BEGIN
INT a;
a := 10;
END

(before submitting it to the compiler). You could even have an
assistant knock that latter part out.

Back in the day, folks would sometimes say, "The requirement
to declare variables before use is a good thing, as it forces
one to think through one's program before banging it out." -
which is wrong on multiple levels.
Ahem A Rivet's Shot
2024-06-10 10:45:07 UTC
Permalink
On 10 Jun 2024 10:03:26 GMT
Post by Stefan Ram
Back in the day, folks would sometimes say, "The requirement
to declare variables before use is a good thing, as it forces
one to think through one's program before banging it out." -
which is wrong on multiple levels.
As far as I can tell it has only one real benefit - it can catch
typos when using long meaningful variable names eg:

int thing_counter;
...
thong_counter = 0;
...
thing_conter += 1;
...
return rging_counter;
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Stefan Ram
2024-06-10 11:22:52 UTC
Permalink
Post by Ahem A Rivet's Shot
Post by Stefan Ram
Back in the day, folks would sometimes say, "The requirement
to declare variables before use is a good thing, as it forces
one to think through one's program before banging it out." -
which is wrong on multiple levels.
As far as I can tell it has only one real benefit - it can catch
int thing_counter;
...
thong_counter = 0;
...
thing_conter += 1;
...
return rging_counter;
Python 3.10 in 2024:

def function():
    thong_counter = 0;
    thing_conter += 1;
    return rging_counter;

. No error messages when running that script! Python is more of
a dynamic language. But, when the function is actually /called/:

def function():
    thong_counter = 0;
    thing_conter += 1;
    return rging_counter;

function()

|UnboundLocalError: local variable 'thing_conter' referenced before assignment
the standard Python implementation

. Then, after changing the third line,

def function():
    thong_counter = 0;
    thong_counter += 1;
    return rging_counter;

function()

|NameError: name 'rging_counter' is not defined. Did you mean: 'thong_counter'?
the standard Python implementation

. So, Python programmers strive to test or check their
programs, but then such errors /are/ found /even without
declarations/. (Often, they are already found the moment one
simply starts to execute one's program for the first time.)

Here's a widely used static code checker named "pyflakes" applied to
the four-line script (without a call of the function):

def function():
    thong_counter = 0;
    thing_conter += 1;
    return rging_counter;

|2:5: local variable 'thong_counter' is assigned to but never used
|3:5: undefined name 'thing_conter'
|3:5: local variable 'thing_conter' is assigned to but never used
|4:12: undefined name 'rging_counter'
pyflakes.
Lawrence D'Oliveiro
2024-06-11 04:34:25 UTC
Permalink
. So, Python programmers strive to test or check their programs, but
then such errors /are/ found /even without declarations/.
But it does require the code in question to be actually executed. If your
test cases do not provide sufficient code coverage in this way, such
errors can escape undetected.
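A tiny made-up illustration of that point: the misspelled name sits in a
branch the tests never take, so the test run passes even though the bug is
there (a static checker such as pyflakes would still flag it; runtime tests
alone do not):

def report(count):
    if count >= 0:
        return f"{count} items"
    else:
        return f"invalid: {cuont}"   # typo, only reached for count < 0

assert report(3) == "3 items"        # "test suite" passes: branch never taken
report(-1)                           # NameError: name 'cuont' is not defined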
Peter Flass
2024-06-10 23:53:04 UTC
Permalink
Post by Ahem A Rivet's Shot
On 10 Jun 2024 10:03:26 GMT
Post by Stefan Ram
Back in the day, folks would sometimes say, "The requirement
to declare variables before use is a good thing, as it forces
one to think through one's program before banging it out." -
which is wrong on multiple levels.
As far as I can tell it has only one real benefit - it can catch
int thing_counter;
...
thong_counter = 0;
...
thing_conter += 1;
...
return rging_counter;
Compilers should issue warnings for undeclared variables. It’s not rocket
science.
--
Pete
R Daneel Olivaw
2024-06-11 06:55:19 UTC
Permalink
Post by Peter Flass
Post by Ahem A Rivet's Shot
On 10 Jun 2024 10:03:26 GMT
Post by Stefan Ram
Back in the day, folks would sometimes say, "The requirement
to declare variables before use is a good thing, as it forces
one to think through one's program before banging it out." -
which is wrong on multiple levels.
As far as I can tell it has only one real benefit - it can catch
int thing_counter;
...
thong_counter = 0;
...
thing_conter += 1;
...
return rging_counter;
Compilers should issue warnings for undeclared variables. It’s not rocket
science.
I have used a Fortran implementation which permitted IMPLICIT NONE
That meant all constants and variables had to be declared (with type)
before use.
Lawrence D'Oliveiro
2024-06-11 07:01:45 UTC
Permalink
Post by R Daneel Olivaw
I have used a Fortran implementation which permitted IMPLICIT NONE
“IMPLICIT NONE” has been *permitted* in Fortran since, I don’t know, ANSI
Fortran 66 or something. I am looking forward to the day when Fortran
assumes implicit “IMPLICIT NONE”.
Peter Flass
2024-06-10 23:53:03 UTC
Permalink
Post by Stefan Ram
Post by John Dallman
As best I remember, yes - it's been over 40 years. The main restriction
was that everything had to be declared before use, which was not
difficult.
I think sometimes the textual "preceding" relationship is
getting mixed up with the temporal "before" relationship
here. It's completely fine to write
BEGIN
a := 10;
END
/first/, and then /later/ go back up with the cursor and turn this
into
BEGIN
INT a;
a := 10;
END
(before submitting it to the compiler). You could even have an
assistant knock that latter part out.
Back in the day, folks would sometimes say, "The requirement
to declare variables before use is a good thing, as it forces
one to think through one's program before banging it out." -
which is wrong on multiple levels.
Although I mostly declare variables before use, I find it very limiting.
Often I like to stick “includes” at the end of the program, so the program
can start right out without all the boilerplate.
--
Pete
Lawrence D'Oliveiro
2024-06-11 04:32:48 UTC
Permalink
Post by Peter Flass
Although I mostly declare variables before use, I find it very limiting.
In Algol-68, the rule was, all the local identifiers had to be declared
before the first label (for a GOTO). That way, there was never any
possibility that they could be “executed” (or at least, encountered during
the flow of control) more than once within their scope.

Modula-2 was explicitly designed as a two-pass language, to allow
declaration of names before use.

In Python, a name has to have been defined at some point in the flow of
control before use. The (first) definition can occur textually after the
point of reference, just so long as this rule is followed.
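A small sketch of that last rule: the body of a function may mention a name
whose definition appears later in the file, as long as the definition has
been executed by the time the call happens (the names here are made up):

def main():
    return helper(21)     # looked up at call time, not at definition time

def helper(x):
    return 2 * x

print(main())             # -> 42

Calling main() textually above the "def helper" would raise NameError,
because what matters is the flow of control, not the textual order.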
Scott Lurndal
2024-06-11 15:04:09 UTC
Permalink
Post by John Dallman
Post by Lawrence D'Oliveiro
But now, X11 is on its way out in the Linux world, and some in the
BSD world are following suit, too. Wayland is the new way to do
cross-platform GUIs.
We will use X11 while we can, because routing graphics across the network
is really useful.
Yes, we as well will continue to use X11 for the same reason.
D
2024-06-11 20:15:59 UTC
Permalink
Post by Scott Lurndal
Post by John Dallman
Post by Lawrence D'Oliveiro
But now, X11 is on its way out in the Linux world, and some in the
BSD world are following suit, too. Wayland is the new way to do
cross-platform GUIs.
We will use X11 while we can, because routing graphics across the network
is really useful.
Yes, we as well will continue to use X11 for the same reason.
X11 for the win! When I was young I did use the routing across the
network, but now I no longer have a home lab, so no need really.
Andy Walker
2024-06-10 23:10:02 UTC
Permalink
On 10/06/2024 00:03, Lawrence D'Oliveiro wrote:
[Algol 68:]
[...] I was never able to get my hands on a
working implementation,
Um. 68R/68RS were available from the early 1970s for ICL 1900
and 2900 computers, 68C was available starting at about the same time
for IBM 360, and was ported to Unix/PDP11, 68S came in the late 1970s
for assorted minis and smaller [inc Unix], and FLACC was also late
1970s for IBM 370 and compatible. I don't dispute your experience,
but suspect a working implementation must have been available if you
had only known where to look. Most, if not all, of the above versions
were free, at least to academics.
until within the last few years Algol-68 Genie came
along.
68G is now over 20 years old [2002], so qualifies as folklore.
Yay!
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Handel
Bob Eager
2024-06-10 23:18:29 UTC
Permalink
Post by Andy Walker
68G is now over 20 years old [2002], so qualifies as folklore.
Yay!
I keep meaning to build and install that. I wrote a couple of small
programs in ALGOL-68(R) for my postgrad comparative languages module.
--
Using UNIX since v6 (1975)...

Use the BIG mirror service in the UK:
http://www.mirrorservice.org
Lynn Wheeler
2024-06-10 01:29:07 UTC
Permalink
Post by John Dallman
I didn't get to do that. In first year, we learned Pascal and an
artificial assembly language created for teaching. In second year,
Algol-68R, FORTRAN and COBOL. So I did one term of COBOL, learned the
basics, and wanted nothing further to do with it.
Los Gatos lab used Metaware's TWS for various VLSI tools, including a
(mainframe) pascal (later released as vs/pascal). Besides some number of
VLSI tools, it was also used for implementing IBM's mainframe TCP/IP. I
used it for prototype rewrite of VM/370's spool file system running in
virtual address space. I had HSDT project, T1 and faster computer links
which I could drive with TCP/IP ... but VM/370's VNET/RSCS used
(diagnose) synchronous interface which was somewhat limited to around
32kbyte-40kbyte aggregate (degrading if contention by others using spool
concurrently) and I needed more like 300kbytes per T1 link ... so needed
asynchronous interface, contiguous allocation, multiple block transfers,
read-ahead, and write-behind ... with other enhancements including fast
filesystem recovery in case of restart after ungraceful crash.

HSDT posts
http://www.garlic.com/~lynn/subnetwork.html#hsdt

Also the communication group was fighting hard to block releasing the TCP/IP
product, but when that got reversed, they changed their strategy ... aka
since they had corporate strategic ownership of everything that crossed
datacenter walls, it had to be released through them; what shipped got
44kbyte/sec aggregate using nearly the whole 3090 processor. I then did changes
for RFC1044 support and in some tuning tests at Cray Research between
Cray and IBM 4341, got sustained 4341 channel throughput using only
modest amount of 4341 processor (something like 500 times improvement in
bytes used per instruction executed).

rfc1044 posts
http://www.garlic.com/~lynn/subnetwork.html#1044

Late 80s, got HA/6000 project, originally for porting NYTimes newspaper
system (ATEX) from VAXCluster to RS/6000. I renamed it HA/CMP when I started
doing technical/scientific cluster scale-up with national labs and
commercial cluster scale-up with RDBMS vendors (Oracle, Sybase,
Informix, Ingres) that had VAXCluster support in same source base with
Unix. Then late Jan92, cluster scale-up was transferred for announce as
IBM supercomputer (for technical/scientific *ONLY*) and we were told we
couldn't work on anything with more than four processors ... and we
leave IBM a few months later.

HA/CMP posts
http://www.garlic.com/~lynn/subtopic.html#hacmp

IBM then has worst loss in the history of US companies and was being
reorganized into the 13 baby blues in preparation for breaking up the
company.
https://web.archive.org/web/20101120231857/http://www.time.com/time/magazine/article/0,9171,977353,00.html
https://content.time.com/time/subscriber/article/0,33009,977353-1,00.html

We had already left the company, but get a call from bowels of
(corporate hdqtrs) Armonk asking us to help with the corporate
breakup. However before we get started, the board brings in former
president of AMEX as CEO who (somewhat) reverses the breakup
... although there is still quite a bit of divesture and offloading
... which included a lot of VLSI tools to industry VLSI tools
company. Now the VLSI industry standard platform was SUN and so all the
tools had to be 1st ported to SUN. I get a contract to port a (Los
Gatos) 50,000 Pascal statement VLSI application to SUN. Eventually I got
the impression that SUN Pascal had been used for little other than
educational purposes and it would have been easier to port from IBM
Pascal to SUN C. Also while SUN hdqtrs was just up the road, they had
outsourced Pascal to operation on the opposite side of the world (Space
City) ... reporting problems and help with work arounds had lots of
delays.

IBM downturn/downfall/breakup posts
https://www.garlic.com/~lynn/submisc.html#ibmdownfall
--
virtualization experience starting Jan1968, online at home since Mar1970
Scott Lurndal
2024-06-11 14:50:42 UTC
Permalink
Post by John Dallman
Most of my lecturers in 1980-83 were genuinely interested, and had
practical research projects running using real hardware.
Likewise, in the 1979-1983 timeframe. I had already been
programming for three years (self-taught in BASIC, HP3000 SPL,
and PAL-D on the PDP-8); with that head start I finished all
the undergraduate degree requirements by the end of my sophomore
year and was allowed to take graduate-level classes for the next
two years (Distributed systems, Semaphores et. alia, Security/Crypto,
advanced operating systems, complexity theory) all of which were
taught from current journal articles (e.g. Lamport, Diffie Hellman,
RSA, C.A.R Hoare, et alia).

All of which was of great value during my career developing distributed
operating systems.
Post by John Dallman
There were two
There were some who were essentially mathematicians, and their research
was on theorems and proofs. That was appropriate for them.
Yes.

We also had Professor Emeritus John Atanasoff who stopped by every
now and then (he was in his late 70's).
Post by John Dallman
There were also "business systems" guys who seemed to see their role as
passing on the established practices of business data processing - of ten
We had one of those, who taught the COBOL course. And mentioned 8-track
tapes (laugh) as a storage mechanism.
D
2024-06-08 11:35:04 UTC
Permalink
I’m just curious to know whether a phenomenon I experienced, way back when
I first entered University, when the first 8-bit micros were just starting
to scuttle through the undergrowth between the feet of the “real computer”
dinosaurs, still applies today.
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.
By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside the
lecture hall, worked.
For example, one second-year course was on “system software”. This was
supposed to be about operating systems, some details about how they worked
and how an application program would interface to them. But the lecturer
who gave the course only seemed to have experience of Univac 1100-series
mainframes, which were of course nowhere to be found anywhere on or near
our campus.
Meanwhile, I was reading actual manuals (at the computer centre) about how
that kind of stuff worked on the PDP-11, and I was able to write real code
to try it out.
Nowadays of course all the students can carry around their own personal
hardware more powerful than that. But maybe there’s still a related
phenomenon at play: there was a discussion in another newsgroup a few days
ago involving someone who was doing a Comp Sci course around late 200x.
The lecture material was completely Microsoft-centric, talking only about
the Windows platform. They did look a little bit at Linux, but not much.
To me, this represented an opportunity missed. Linux is a system you can
completely pull apart, to learn how every part of it works. And you can
learn things from comparing it with Windows, to see the pros and cons of
different design decisions. Yet the lecturers didn’t seem to have a clue.
Does that sort of thing still apply today?
Yes! I teach and the force of AWS/Azure/GCP is strong in the educational
programs. They sneak into the mgmt groups defining the curriculum and try
to argue that anything except their own services is completely
unnecessary.

My biggest victory in one school was when I managed to ban Windows from
being used as a teaching platform for a program.

I tried multiple times at another school but they insist on keeping
windows because one of the sponsors of the program is a Microsoft only
consulting company and they are saying that there's no need in the market
for linux, and if anything, linux should be scratched from the program.

I think it's going that way, because I no longer teach linux and the new
teacher is from the windows side and his idea of teaching linux is to
teach the students only the graphical mgmt tools.

Of course when they reach my courses I need to start from the beginning
since they lack the knowledge of linux to take the steps into the
container and cloud worlds.

But let's see the class of 2025, maybe by then I'll have managed to change
the second school as well!
Charlie Gibbs
2024-06-08 18:49:38 UTC
Permalink
Post by Lawrence D'Oliveiro
I’m just curious to know whether a phenomenon I experienced, way back when
I first entered University, when the first 8-bit micros were just starting
to scuttle through the undergrowth between the feet of the “real computer”
dinosaurs, still applies today.
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.
That sounds like me back in the day, although I would stay late after
work playing with my employer's small mainframe. I had a lot of fun
and wrote many useful utility programs, although I was a bit envious
of the DEC shops; their character-at-a-time I/O was so simple,
but it was unheard of on mainframes, which used block-mode I/O
exclusively. This might have been more efficient for sending files
across the country, but it was a pain in the ass for sending messages
across the room, and made interactive programming quite cumbersome.
Post by Lawrence D'Oliveiro
By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside
the lecture hall, worked.
I attended university computer science classes for three years starting
in 1968. I became disenchanted when I realized CS consisted of a bunch
of people sitting in their ivory towers contemplating the whichness
of what, with very little regard for the Real World [tm]. During
the summer break between my second and third years I managed to get
a programming job in a small service bureau, and finally got some
exposure to real-world programming. For my third year at university
I arranged my schedule so I had Thursdays off, and I continued working
at the service bureau on those days. At the end of what turned out
to be a disastrous third year (for various reasons), I dropped out and
joined the service bureau full-time. I've been programming ever since.
Post by Lawrence D'Oliveiro
Nowadays of course all the students can carry around their own personal
hardware more powerful than that. But maybe there’s still a related
phenomenon at play: there was a discussion in another newsgroup a few days
ago involving someone who was doing a Comp Sci course around late 200x.
The lecture material was completely Microsoft-centric, talking only about
the Windows platform. They did look a little bit at Linux, but not much.
At the time of my story above, Microsoft didn't yet exist.
I got to see the scourge spread from its beginnings.
--
/~\ Charlie Gibbs | The Internet is like a big city:
\ / <***@kltpzyxm.invalid> | it has plenty of bright lights and
X I'm really at ac.dekanfrus | excitement, but also dark alleys
/ \ if you read it the right way. | down which the unwary get mugged.
Mike Spencer
2024-06-09 05:33:34 UTC
Permalink
Post by Charlie Gibbs
I attended university computer science classes for three years starting
in 1968. I became disenchanted when I realized CS consisted of a bunch
of people sitting in their ivory towers contemplating the whichness
of what, with very little regard for the Real World [tm]. During
the summer break between my second and third years I managed to get
a programming job in a small service bureau, and finally got some
exposure to real-world programming.
In 1964, my 5th year as a chemistry major (yeah, some bumps in year
3), I signed up for a "computer programming" course. Turned out it
was being taught by a non-academic seconded from some
industrial/commercial employer. The required textbook was on
numerical methods and the instructor commenced lecturing on numerical
methods.

We complained that we wanted to actually program a computer, not study
more math. The instructor wailed something like, "But you can't write
a program for a computer unless you understand the numerical
principles." But more than half the class threatened to drop the
course unless we could learn to write programs. So he taught us some
variant of Fortran. I wrote an extremely simple little gem that did
elementary statistics calculations on something or other from the
chemistry lab. Punched cards, cycled through the IBM 1620, output
cards, line printer, debugging, rinse and repeat.

Didn't have occasion to touch a computer again for 20 years but then
things computerish picked up a bit. CP/M, Unix, VMS, MS-DOS, now I'm
a Linux weenie but never had another comp. sci. course.
--
Mike Spencer Nova Scotia, Canada
Lawrence D'Oliveiro
2024-06-09 07:48:42 UTC
Permalink
Post by Mike Spencer
We complained that we wanted to actually program a computer, not study
more math. The instructor wailed something like, "But you can't write a
program for a computer unless you understand the numerical principles."
In theory, computers, and in fact the whole of maths, are just about numbers
(see “Gödel numbering”). In practice, what makes Comp Sci a bit more than
just some branch of maths is its emphasis on the practicalities of
programming: efficiency, debugging, maintainability and so on.
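For anyone who hasn't met Gödel numbering, the standard construction is
small enough to sketch in a few lines of Python: a sequence of symbol codes
c1..cn becomes the single number p1^c1 * p2^c2 * ... (p_i the i-th prime),
which can be decoded again by factoring out each prime in turn.

def primes():
    """Yield 2, 3, 5, 7, ... (simple trial division, fine for a demo)."""
    n = 2
    while True:
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            yield n
        n += 1

def encode(codes):
    g, gen = 1, primes()
    for c in codes:
        g *= next(gen) ** c
    return g

def decode(g):
    codes, gen = [], primes()
    while g > 1:
        p, e = next(gen), 0
        while g % p == 0:
            g //= p
            e += 1
        codes.append(e)
    return codes

seq = [3, 1, 4, 1, 5]          # e.g. symbol codes of a formula
n = encode(seq)                # 2^3 * 3^1 * 5^4 * 7^1 * 11^5
print(n, decode(n) == seq)     # one number carries the whole sequence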
Post by Mike Spencer
So he taught us some variant of Fortran.
My first exposure to program code was a few lines of Fortran in the
“Computers” article in Encyclopedia Britannica. Not having access to a
real computer at that point, I spun a whole little world out of those few
lines.

My second, and far more intensive, exposure to program code was a book
called “The Complete Cybernaut”, and one thing I remember about the
author’s bio was it ended by saying she had moved to some British city I
can’t remember, to become its only practising blacksmith.

That was a pretty good introduction to the basics of Fortran programming. I
swallowed it all up over a single weekend. Still without ever having
touched a real computer.

Then, when I finally got to University, and the first-year course was
using something called “WATFOR”, I thought “hey, this looks familiar” ...
Lawrence D'Oliveiro
2024-06-09 07:51:04 UTC
Permalink
Post by Lawrence D'Oliveiro
My second, and far more intensive, exposure to program code was a book
called “The Complete Cybernaut”
Make that “The Compleat Cybernaut”. Gotta get it right ...
Ahem A Rivet's Shot
2024-06-09 09:16:51 UTC
Permalink
On Sun, 9 Jun 2024 07:48:42 -0000 (UTC)
Post by Lawrence D'Oliveiro
In theory, computers, in fact the whole of maths, is just about numbers
Numbers are a higher level construct derived from set theory which
is the true underpinning of mathematics (it *all* derives from the axioms
of set theory). Even concepts like continuity are most generally expressed
in terms of open sets rather than epsilons, deltas and open intervals.
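For the record, the open-set formulation is short enough to state: for
topological spaces $X$ and $Y$,

  $f \colon X \to Y$ is continuous $\iff$ $f^{-1}(V)$ is open in $X$ for every open $V \subseteq Y$,

which for metric spaces reduces to the familiar

  $\forall x_0\, \forall \varepsilon > 0\, \exists \delta > 0\, \forall x:\ d_X(x, x_0) < \delta \implies d_Y(f(x), f(x_0)) < \varepsilon$.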
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Andreas Eder
2024-06-09 14:22:45 UTC
Permalink
Post by Ahem A Rivet's Shot
On Sun, 9 Jun 2024 07:48:42 -0000 (UTC)
Post by Lawrence D'Oliveiro
In theory, computers, in fact the whole of maths, is just about numbers
Numbers are a higher level construct derived from set theory which
is the true underpinning of mathematics (it *all* derives from the axioms
of set theory).
Yes, you can base everything on set theory. But that is not the only
way. You can equally well base it on category theory and there is an
even newer approach based on homotopy type theory.
Post by Ahem A Rivet's Shot
Even concepts like continuity are most generally expressed
in terms of open sets rather than epsilons, deltas and open intervals.
Yes, topology is a basis for real (and complex) analysis.

'Andreas
--
ceterum censeo redmondinem esse delendam
Ahem A Rivet's Shot
2024-06-09 15:26:25 UTC
Permalink
On Sun, 09 Jun 2024 16:22:45 +0200
Post by Andreas Eder
Yes, you can base everything on set theory. But that is not the only
way. You can equally well base it in category theory and there is an
True enough.
Post by Andreas Eder
even newer approach based on homotopy type theory.
That one I did not know about - but then it's been a long time
since I called myself a mathematician.
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Mike Spencer
2024-06-10 06:27:10 UTC
Permalink
Post by Lawrence D'Oliveiro
My second, and far more intensive, exposure to program code was a book
called "The Complete Cybernaut", and one thing I remember about the
author's bio was it ended by saying she had moved to some British city I
can't remember, to become its only practising blacksmith.
Interesting coincidence. I dropped out of science 50 years ago to
become (among lesser entertainments) a blacksmith. I only came to
programming/computing in my mid-40s.

All good. I'm probably not smart enough to have been a good scientist
and in any case, in retrospect, temperamentally unsuited for the
science working environment. Modest success as an artist blacksmith
q.g.
--
Mike Spencer Nova Scotia, Canada
Lars Poulsen
2024-06-10 21:35:00 UTC
Permalink
Post by Lawrence D'Oliveiro
Then, when I finally got to University, and the first-year course was
using something called “WATFOR”, I thought “hey, this looks familiar” ...
I remember WATFOR ... and its successor WATFIV (which AFAICR was a more
complete implementation of Fortran IV).

The Waterloo Fortran was a godsend for introductory programming classes
on OS/360. The "normal" FORTCLG procedure invoked the job step setup
overhead 3 times. The Waterloo package stayed in memory all day
processing one student trial compilation after another. The tradeoff was
that you did not get billing records written to SMF for each job, so the
installation I knew limited each mini-job to 5 seconds, and did not
bill for them.

At "my" computer center, we had a Univac 1106 (later 1108), and did not
have to worry about such inefficiencies, but in order to compete with
the IBM shop, we had to institute a 5-second job class that was "free".
And then, after an enterprising physics student got hold of the OS
reference manual, learned about checkpoint/restart and used that to
slice a 2-hour nuclear simulation into 4.5-second stops, we had to
disable checkpointing.
Lynn Wheeler
2024-06-11 02:50:27 UTC
Permalink
Post by Lars Poulsen
I remember WATFOR ... and its successor WATFIV (which AFAICR was a
more complete implementation of Fortran IV).
The Waterloo Fortran was a godsend for introductory programming
classes on OS/360. The "normal" FORTCLG procedure invoked the job step
setup overhead 3 times. The Waterloo package stayed in memory all day
processing one student trial compilation after another. The tradeoff
was that you did not get billing records written to SMF for each job,
so the installation I knew, limited each mini-job to 5 seconds, and
did not bill for them.
I took two credit hr intro to computers/fortran and at end of semester
was hired to rewrite 1401 MPIO (unit record front end for 709) in
assembler for 360/30. Univ was getting a 360/67 for tss/360, replacing the
709/1401, and pending availability of the 360/67, got a 360/30
temporarily replacing the 1401 (for getting 360 experience). The
univ. shut down the datacenter over the weekend and I would have it
dedicated, although 48hrs w/o sleep made monday classes hard. They gave
me a bunch of hardware & software manuals and I got to design my own
monitor, device drivers, interrupt handler, error recovery, storage
management, etc ... within a few weeks, I had a 2000 card program.

Within a year of intro class, 360/67 arrives and I was hired fulltime
responsible for os/360 (tss/360 never came to production fruition and
ran as 360/65). 709 ran student fortran tape->tape in less than a second;
initially on os/360 they ran over a minute. I install HASP which cuts the
time in half. I then redo STAGE2 SYSGEN, carefully placing datasets and
PDS members to optimize arm seek and multi-track search cutting another
2/3rds to 12.9secs (nearly all job setup, 3step fortgclg, 4.3secs/step).

Student fortran never got better than 709 until I install Univ. of
Waterloo WATFOR. WATFOR on 360/65 ran about 20,000 cards/min (about 333 cards/sec)
.... univ. ran about tray of cards/batch ... 2000-2500 cards
... 6sec-7.5secs plus 4.3secs for job step, or 10.3-11.8sec per tray,
typically 40-60 cards/job ... .17-.3sec/student-job.
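Redoing that arithmetic explicitly (all the figures are the ones quoted
above; only the rounding is mine):

cards_per_min = 20_000                       # WATFOR throughput as quoted
cards_per_sec = cards_per_min / 60           # ~333 cards/sec
tray_low, tray_high = 2000, 2500             # cards per batch (one tray)
setup = 4.3                                  # job-step overhead, seconds

tray_time = (tray_low / cards_per_sec + setup,    # ~10.3 s
             tray_high / cards_per_sec + setup)   # ~11.8 s

jobs_low, jobs_high = tray_low // 60, tray_high // 40   # ~33 .. 62 jobs/tray
per_job = (tray_time[0] / jobs_high, tray_time[1] / jobs_low)

print(tray_time)   # ~ (10.3, 11.8) seconds per tray
print(per_job)     # roughly 0.17 to 0.36 s per student job, the quoted ballpark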
--
virtualization experience starting Jan1968, online at home since Mar1970
Lawrence D'Oliveiro
2024-06-11 04:26:53 UTC
Permalink
Post by Lars Poulsen
And then, after an enterprising physics student got hold of the OS
reference manual, learned about checkpoint/restart and used that to
slice a 2-hour nuclear simulation into 4.5-second stops, we had to
disable checkpointing.
I did something similar, entirely by accident, on a VAX-11/750 I think it
was. I wrote a program which interrupted itself so often, it was almost
never running when the 100Hz interrupt that accumulated CPU usage for the
current process was triggered. And yet at the same time it was CPU-bound.
So I was bringing the entire system to its knees, while registering hardly
any CPU usage at all.
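A toy model of that effect (a sketch of tick-sampled accounting in general,
not of the actual VMS mechanism): time advances in 0.1 ms steps, the
accounting clock fires every 10 ms and bills a whole tick to whatever is
running at that instant, and the process deliberately sleeps across each
tick.

STEP_US = 100                      # one simulation step = 0.1 ms
TICK_STEPS = 100                   # 100 steps = 10 ms = one 100 Hz tick
TOTAL_STEPS = 10_000               # simulate one second

actual_steps = 0                   # steps really spent computing
charged_ticks = 0                  # ticks the sampler billed us for

for step in range(TOTAL_STEPS):
    phase = step % TICK_STEPS
    running = 5 <= phase <= 94     # give up the CPU around each tick
    if running:
        actual_steps += 1
    if phase == 0 and running:     # the accounting sample
        charged_ticks += 1

actual_s = actual_steps * STEP_US / 1e6
charged_s = charged_ticks * TICK_STEPS * STEP_US / 1e6
print(f"really used ~{actual_s:.2f}s of CPU, accounted for {charged_s:.2f}s")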
Charlie Gibbs
2024-06-11 20:23:00 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Lars Poulsen
And then, after an enterprising physics student got hold of the OS
reference manual, learned about checkpoint/restart and used that to
slice a 2-hour nuclear simulation into 4.5-second stops, we had to
disable checkpointing.
I did something similar, entirely by accident, on a VAX-11/750 I think it
was. I wrote a program which interrupted itself so often, it was almost
never running when the 100Hz interrupt that accumulated CPU usage for the
current process was triggered. And yet at the same time it was CPU-bound.
So I was bringing the entire system to its knees, while registering hardly
any CPU usage at all.
We discovered that WATFOR only checked whether time and page limits
had been exceeded after completely processing the current statement.
Our first program that tested this generated several times as much
data as would fit within the page limit, but stored it in arrays
and printed it with a single PRINT statement full of implied DOs.
Yes, the program was aborted with a "page limit exceeded" message -
but by then everything had been printed.

A subsequent program stuffed the bulk of its calculations into
a similar PRINT statement, and gave us the desired result while
using several times our CPU time allowance.
--
/~\ Charlie Gibbs | The Internet is like a big city:
\ / <***@kltpzyxm.invalid> | it has plenty of bright lights and
X I'm really at ac.dekanfrus | excitement, but also dark alleys
/ \ if you read it the right way. | down which the unwary get mugged.
Peter Flass
2024-06-11 23:59:56 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Lars Poulsen
And then, after an enterprising physics student got hold of the OS
reference manual, learned about checkpoint/restart and used that to
slice a 2-hour nuclear simulation into 4.5-second stops, we had to
disable checkpointing.
I did something similar, entirely by accident, on a VAX-11/750 I think it
was. I wrote a program which interrupted itself so often, it was almost
never running when the 100Hz interrupt that accumulated CPU usage for the
current process was triggered. And yet at the same time it was CPU-bound.
So I was bringing the entire system to its knees, while registering hardly
any CPU usage at all.
We used to do something (else) similar on a Sigma7. Interactive programs
got a performance boost from the scheduler vs. non-interactive. We’d start
a big job, say a compile, and periodically press interrupt (or whatever) on
the TTY and then just continue. Every time we pressed interrupt the job got
marked interactive for a time interval.
--
Pete
Andy Walker
2024-06-09 14:41:23 UTC
Permalink
In 1964, [...]. The required text book was on
numerical methods and the instructor commenced lecturing on numerical
methods.
We complained that we wanted to actually program a computer, not study
more math. The instructor wailed something like, "But you can't write
a program for a computer unless you understand the numerical
principles." [...]
My first real exposure to computing was a first-year NA course
given by Maurice Wilkes in which was included a few lectures on Edsac
Autocode. I believe this was a pioneering idea, as undergrads weren't
normally allowed anywhere near a computer in those days [UK, 1961].
But you can see the point; real computers cost millions of pounds [call
it 1000 years of typical salary], and were reserved for Real Work. When
I started lecturing at Nott'm [1969], I had to fight for undergrads to
be allowed to write programs; luckily, Eric Foxley, then Director of the
Computing Centre, agreed, over the dead bodies of senior professors.

Somewhere around 1965, I was using Atlas to write poetry. When
this was discovered, I was Severely Warned as to my future conduct; Atlas
was for Serious Computing only. Luckily, my previous programs to count the
numbers of 4x4, 5x5 and 3x3x3 "magic" squares/cubes went undiscovered, as
did the efforts of a friend who discovered how to half-space the Flexowriter
and so generate quite realistic black, white and red pictures. [We both did
lots of real work as well, keeping Atlas busy looking at stellar structure
for around 30 hours every weekend, when the time was not charged against the
departmental allocation. This was ~5 min/day, despite pleas that we needed
more time as colleagues were working on the moon programme, but luckily the
algorithm for allocations took into account the freebie time used, so we
got it up to ~20 min/day.]
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Lange
Lawrence D'Oliveiro
2024-06-09 23:19:39 UTC
Permalink
[We both did lots of real work as well, keeping Atlas
busy looking at stellar structure for around 30 hours every weekend,
when the time was not charged against the departmental allocation. This
was ~5 min/day, despite pleas that we needed more time as colleagues
were working on the moon programme, but luckily the algorithm for
allocations took into account the freebie time used, so we got it up to
~20 min/day.]
Presumably this was a batch system. How long did you have to wait for the
results of your 5 minutes of computer time a day? Was overnight usage
heavily booked (and charged for) as well? (I imagine it was, in the early
days.)
Andy Walker
2024-06-10 22:41:31 UTC
Permalink
On 10/06/2024 00:19, Lawrence D'Oliveiro wrote:
[Re Atlas:]
Post by Lawrence D'Oliveiro
Presumably this was a batch system. How long did you have to wait for the
results of your 5 minutes of computer time a day? Was overnight usage
heavily booked (and charged for) as well? (I imagine it was, in the early
days.)
The Astronomy Dept was practically next door to the computer; we
put our tapes into a box and every hour or three someone would trog over
and pick up any output that was waiting. No usage was "booked", it was
up to the operators to sort the tapes into priority order and run them.
Other universities served by the Manchester Atlas were less lucky; there
was a regular van, typically once per day, that delivered all the tapes,
returning the output the following day. There were three priority levels;
depts were charged triple for priority ["P"] usage, and not at all for
low priority ["X"] usage, which commonly had to wait for the weekend.
Our Professor and Head of Department was inordinately proud of Atlas:
"The old computer [Mercury] was booked by the half hour, the new one is
booked /by the second/!".

Some years later, I was at a conference and we got around, as so
often, to bemoaning the computing service. "How long do you have to wait
for your output, Andy?" "Oh, half an hour or so. How about you?" "Oh,
usually five or ten minutes." At which point another friend said "Oh, in
my university, we count it a failure if you can get from the tape reader
to the lineprinter before your output." Grr! That was probably 1974-ish.
--
Andy Walker, Nottingham.
Andy's music pages: www.cuboid.me.uk/andy/Music
Composer of the day: www.cuboid.me.uk/andy/Music/Composers/Handel
Peter Flass
2024-06-10 23:53:06 UTC
Permalink
Post by Andy Walker
Some years later, I was at a conference and we got around, as so
often, to bemoaning the computing service. "How long do you have to wait
for your output, Andy?" "Oh, half an hour or so. How about you?" "Oh,
usually five or ten minutes." At which point another friend said "Oh, in
my university, we count it a failure if you can get from the tape reader
to the lineprinter before your output." Grr! That was probably 1974-ish.
This was my first experience with the Burroughs 5500. Output started
printing even before all the cards had been read in.
--
Pete
Lawrence D'Oliveiro
2024-06-11 04:24:00 UTC
Permalink
Post by Peter Flass
This was my first experience with the Burroughs 5500. Output started
printing even before all the cards had been read in.
Interesting. When John McCarthy (the father of Lisp) became director of
the Stanford AI Lab (I think it was), IBM donated a machine for him to
work with. But they already had a Burroughs machine. McCarthy, an IBM man,
went out of his way to make use of the Burroughs as unattractive as
possible, so users would switch to the IBM instead.
Rich Alderson
2024-06-12 01:38:28 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Peter Flass
This was my first experience with the Burroughs 5500. Output started
printing even before all the cards had been read in.
Interesting. When John McCarthy (the father of Lisp) became director of
the Stanford AI Lab (I think it was), IBM donated a machine for him to
work with. But they already had a Burroughs machine. McCarthy, an IBM man,
went out of his way to make use of the Burroughs as unattractive as
possible, so users would switch to the IBM instead.
The Stanford Artificial Intelligence Laboratory never used IBM or Burroughs
equipment. Their first machine was a PDP-1, followed by a brand new PDP-6
(serial #2, IIRC) when that became available. From there, it's PDP-10s all the
way down.
--
Rich Alderson ***@alderson.users.panix.com
Audendum est, et veritas investiganda; quam etiamsi non assequamur,
omnino tamen proprius, quam nunc sumus, ad eam perveniemus.
--Galen
Lawrence D'Oliveiro
2024-06-12 03:15:35 UTC
Permalink
Post by Rich Alderson
Post by Lawrence D'Oliveiro
Post by Peter Flass
This was my first experience with the Burroughs 5500. Output started
printing even before all the cards had been read in.
Interesting. When John McCarthy (the father of Lisp) became director of
the Stanford AI Lab (I think it was), IBM donated a machine for him to
work with. But they already had a Burroughs machine. McCarthy, an IBM man,
went out of his way to make use of the Burroughs as unattractive as
possible, so users would switch to the IBM instead.
The Stanford Artificial Intelligence Laboratory never used IBM or
Burroughs equipment. Their first machine was a PDP-1, followed by a
brand new PDP-6 (serial #2, IIRC) when that became available. From
there, it's PDP-10s all the way down.
Ah. So where was McCarthy director, then? Because those well-known “CAR”
and “CDR” names that persist in Lisp to this day came from some IBM
machine.

Scott Lurndal
2024-06-11 15:11:46 UTC
Permalink
[...] still applies today.
Post by Charlie Gibbs
Post by Lawrence D'Oliveiro
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.
That sounds like me back in the day, although I would stay late after
work playing with my employer's small mainframe.
In the 7th grade, my math teacher arranged to get a
teletype and acoustic coupler for a week to dial into
a B5500 at the local university.

That got me hooked, but it wasn't until the 10th grade
when I started high school that I got to learn programming -
we had a teletype/coupler to a PDP-8 that the school district
had retired. It was running TSS/8.24 and was used by several
of the high-schools in the eastern part of the state. We
learned a great deal playing with that after classes. Then
they allowed timesharing access to the production HP-3000,
which provided additional languages to play with (SPL/3000);
went to the HP sales office (30 miles away) and they gave me
a bunch of manuals. The teletype was replaced by a decwriter
LA120 (300 baud!) for the last two years of high school.
D
2024-06-11 20:17:22 UTC
Permalink
Post by Lawrence D'Oliveiro
[...] still applies today.
Post by Charlie Gibbs
Post by Lawrence D'Oliveiro
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.
That sounds like me back in the day, although I would stay late after
work playing with my employer's small mainframe.
In the 7th grade, my math teacher arranged to get a
teletype and acoustic coupler for a week to dial into
a B5500 at the local university.
That got me hooked, but it wasn't until the 10th grade
when I started high school that I got to learn programming -
we had a teletype/coupler to a PDP-8 that the school district
had retired. It was running TSS/8.24 and was used by several
of the high-schools in the eastern part of the state. We
learned a great deal playing with that after classes. Then
they allowed timesharing access to the production HP-3000,
which provided additional languages to play with (SPL/3000);
went to the HP sales office (30 miles away) and they gave me
a bunch of manuals. The teletype was replaced by a decwriter
LA120 (300 baud!) for the last two years of high school.
Teachers that go "above and beyond" are very powerful and inspiring! I try
to help my good students who show interest get into internships at various IT
companies if I can. It's a good feeling to then hear they graduated and
are appreciated at their new jobs! =)
Lawrence D'Oliveiro
2024-06-12 00:13:25 UTC
Permalink
The teletype was replaced by a decwriter LA120 (300 baud!) for the last
two years of high school.
What a waste of an LA120, is all I can say. The old LA36 might have been a
better fit for such a slow line speed. The LA120, by comparison,
positively flew -- it would go into bidirectional print mode if you could
feed it data fast enough. No way that would happen at 300 baud.

Plus all the other fun features it had -- almost like a VT100 but with
paper, not a screen.
Julieta Shem
2024-06-09 01:20:42 UTC
Permalink
Lawrence D'Oliveiro <***@nz.invalid> writes:

[...]
Post by Lawrence D'Oliveiro
By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside the
lecture hall, worked.
[...]
Post by Lawrence D'Oliveiro
Does that sort of thing still apply today?
It does. A lot of computer science lecturers are not the
programmer-type, say. Some get their positions because they're
math-smart just enough to score higher than the competition in some
theoretical exam; some are book-smart enough to know a few things about
other areas, say, involving hardware, the so-called scientific
computing, interfaces, formal methods et cetera. Some of them are not
even in computer science proper; some are people in business and teach
things like processes; some are more like psychologists and study design
of interfaces and do research such as interviewing people to find out
what they think of a certain interface and so on. Many are effectively
students of statistics and analyse some data, plot stuff and write
reports.

What about students? They all report loving programming, but in truth
what they love is the idea of programming. They come in interested in
the computer, but they lack so much culture (which they won't get in the
university anyhow) and they don't know what to do about it. They come
in, they take courses such as C programming, but they don't know who
Dennis Ritchie is or who Brian Kernighan is or who Ken Thompson is or
Doug McIlroy. Nobody tells them about Lisp or John McCarthy. They have
never heard of the USENET or have any idea who Bill Joy is or Richard
Stallman. They don't really know how to use e-mail. (They use it, but
they have no culture of it.) They have never joined a mail list with
an automated system for managing subscriptions. They seem unable to
handle the amount of information that goes into a few-messages-weekly
mail list. (Too much information. They get lost.) They're very well
aware of systems such as Whatsapp, Telegram, Discord. NNTP is a
4-letter unknown acronym and so on.

I would bet that many departments use services like Google Workspace, so
they don't even have their own servers around. Students just attend
classes and are never invited to run the computer services for the
department. They have pretty much no opportunity to maintain systems
and software that people actually use.

On the bright side, they do seem to like to get exposed to all of this,
but, as we might expect, they're very slow to catch on and to learn to
walk on their own. Math classes are hard and the non-math classes are
boring. As an observer, I get the feeling that most of them are not
really interested in computers, which has actually been a well-known
phenomenon since the dot-com bubble.

You can count on one hand how many programmer-types exist in the average
computer science department across the whole world. Surely if you look
into the very best schools, you'll find more, but I'm not making any
promises.

Now is that the case only with computer science or are we seeing a
generalized phenomenon in all or most other areas?
Lawrence D'Oliveiro
2024-06-09 03:18:50 UTC
Permalink
Post by Julieta Shem
A lot of computer science lecturers are not the
programmer-type, say. Some get their positions because they're
math-smart just enough to score higher than the competition in some
theoretical exam; some are book-smart enough to know a few things about
other areas, say, involving hardware, the so-called scientific
computing, interfaces, formal methods et cetera. Some of them are not
even in computer science proper; some are people in business and teach
things like processes; some are more like psychologists and study design
of interfaces and do research such as interviewing people to find out
what they think of a certain interface and so on. Many are effectively
students of statistics and analyse some data, plot stuff and write
reports.
Nothing wrong with a diversity of backgrounds, though: none of that
precludes programming ability. At my first job at a software house after
graduation, one colleague had a psychology degree, another accounting, and
a third history. I was the first actual Comp Sci graduate they had
hired.
Post by Julieta Shem
On the bright side, [students] do seem to like to get exposed to all
of this, but, as we might expect, they're very slow to catch on and
to learn to walk on their own. Math classes are hard and the
non-math classes are boring.
The only Comp Sci classes I remember being “boring” were the ones in a
graduate-level compiler course. All that stuff about LL(k), LR(k) parsers
and so on just left me cold, so I pulled out.

Hands-on practice helps make stuff interesting. There were these matchbox-
sized logic gates being used in one course, which could be connected
together with wires to build working digital circuitry (adders etc.). One
lecturer built an entire, very simple CPU out of these. And just for fun, I
wrote a cross-assembler in Pascal for it.
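
(An aside for anyone who never met those gate kits: a one-bit full adder is
about the simplest useful thing you can wire up from them. Here is a minimal
sketch in C, treating each gate as a function -- purely illustrative names,
nothing to do with the actual course hardware:

    /* One-bit full adder built from gate primitives.
       Hypothetical software stand-in for the wired-up kit. */
    #include <stdio.h>

    static int AND(int a, int b) { return a & b; }
    static int OR (int a, int b) { return a | b; }
    static int XOR(int a, int b) { return a ^ b; }

    /* sum and carry-out of bits a, b and carry-in cin */
    static void full_adder(int a, int b, int cin, int *sum, int *cout)
    {
        int half = XOR(a, b);                   /* half-adder sum  */
        *sum  = XOR(half, cin);                 /* final sum bit   */
        *cout = OR(AND(a, b), AND(half, cin));  /* carry out       */
    }

    int main(void)
    {
        int sum, cout;
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                for (int cin = 0; cin <= 1; cin++) {
                    full_adder(a, b, cin, &sum, &cout);
                    printf("%d+%d+%d -> carry %d, sum %d\n",
                           a, b, cin, cout, sum);
                }
        return 0;
    }

Chain the carry-out of one such adder into the carry-in of the next and you
have a ripple-carry adder -- essentially what those wired-together matchboxes
were doing.)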
Ahem A Rivet's Shot
2024-06-09 09:59:30 UTC
Permalink
On Sat, 8 Jun 2024 01:18:20 -0000 (UTC)
Post by Lawrence D'Oliveiro
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the
main campus system was a DEC PDP-11/70 running RSTS/E.
Yep - except the University machine was a 370 running Phoenix.

Although that was far from my first experience with computing -
that was with an 1130 at the local tech college and a teletype in a tiny
room at school. Programs ready to run were put on paper tape and sent across
a modem link, while the results of the previous day's offerings were sent
back, printed and recorded on tape.

Then I found out that I could go to where the machine was, and that once
a day, from 5pm to 6pm, it was open for anyone to walk in and drop a deck of
cards in the hopper - run time limited to five minutes. After a few weeks I
got my very own card drawer and permission to book the machine to myself
for up to an hour. HEAVEN!

One evening I even walked past the opening of the first CAMRA pub
in Cambridge to get there - although if I'd known that they were giving
away the first thousand pints I might have pushed into the crowd instead.

In between were some of the first TRS-80s in the UK because, in
my year break, I worked at a Tandy store run by an ex-computer-consultant
who had been trying to get away from the business. It followed him: that
Tandy store became the Cambridge Computer Store.
Post by Lawrence D'Oliveiro
By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside the
lecture hall, worked.
There our experiences differed: our lecturers knew their subject.
It wasn't perfect - one major blinkered viewpoint was a strong resistance to
C and Unix - but then Martin Richards worked there, so we got BCPL and
Tripos instead.

The course focus was on breadth so we used many languages and
learned about more - while learning data structures and algorithms. They
missed out on OO and functional styles which were a bit too new and racy
for them in 1981. I gather they turned up in the courses soon after my
time.

Similarly we covered machine architectures, including Harvard,
von Neumann and data flow. Operating systems was very generic and
conceptual, so we learned more about the choices faced by OS designers
than the details of any particular OS. Somewhere along the line we covered
Turing and Minsky machines and proved them equivalent, plus a bunch of
numerical analysis (error bars, badly conditioned calculations ...). A few
bits of archaic terminology stick in the mind, like "deadly embrace".
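
("Deadly embrace" is what most people now just call deadlock: two threads
each holding a lock the other needs. A minimal sketch in C with POSIX
threads -- the names and structure are purely illustrative, not anything
from that course:

    /* Two threads, two locks, opposite acquisition order. Build with -pthread. */
    #include <pthread.h>
    #include <unistd.h>

    static pthread_mutex_t lock_a = PTHREAD_MUTEX_INITIALIZER;
    static pthread_mutex_t lock_b = PTHREAD_MUTEX_INITIALIZER;

    static void *worker_1(void *arg)
    {
        (void)arg;
        pthread_mutex_lock(&lock_a);
        sleep(1);                      /* let worker_2 grab lock_b first     */
        pthread_mutex_lock(&lock_b);   /* blocks forever: worker_2 holds it  */
        pthread_mutex_unlock(&lock_b);
        pthread_mutex_unlock(&lock_a);
        return NULL;
    }

    static void *worker_2(void *arg)
    {
        (void)arg;
        pthread_mutex_lock(&lock_b);
        sleep(1);
        pthread_mutex_lock(&lock_a);   /* blocks forever: worker_1 holds it  */
        pthread_mutex_unlock(&lock_a);
        pthread_mutex_unlock(&lock_b);
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker_1, NULL);
        pthread_create(&t2, NULL, worker_2, NULL);
        pthread_join(t1, NULL);        /* never returns: the embrace holds   */
        pthread_join(t2, NULL);
        return 0;
    }

Each thread blocks forever on the lock the other already holds; the classic
remedy is equally old - make everyone acquire the locks in the same order.)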

It was only a one-year course; a two-year version was started up
just as I was taking the one-year one in my final year. These days it is of
course a full three-year degree, rather than a place for mathematicians to
escape to when they discover they're not the next Gauss and that they have
limits past which the concepts refuse to gel.

Once I got out, the world was CP/M and hardware was MSI, LSI and
single-chip micros - but I knew that already, having worked on the Newbrain
in my holidays, and was ready for the wire-wrap gun and ZSID. Then there was
MP/M, XENIX and a dozen or more flavours of Unix; after that things just
kept getting bigger and faster, with clustering thrown in for good measure.

Apparently some people have been using a thing called Windows, not
sure why.
--
Steve O'Hara-Smith
Odds and Ends at http://www.sohara.org/
For forms of government let fools contest
Whate're is best administered is best - Alexander Pope
Lawrence D'Oliveiro
2024-06-09 23:36:09 UTC
Permalink
Post by Ahem A Rivet's Shot
There our experiences differed: our lecturers knew their subject.
OK, I should say, not all my lecturers were bad. There were some good
ones.

For example, the lecturer who ran the main first-year programming course
(which I took) knew how to pace things: there would be a programming
assignment that had to be handed in every week, obviously small enough to
be done in the assigned lab time for that week.

For comparison, the second-year lecturer I previously mentioned, who only
knew about Univac, handed out a list of all the assignments at the start
of the course (we had our choice of some minimum number of them to
actually do), to be handed in at the end. A lot of students hadn’t learned
to pace themselves, and got caught out by lack of time as the deadline
approached.
Post by Ahem A Rivet's Shot
It wasn't perfect - one major blinkered viewpoint was a strong resistance
to C and Unix - but then Martin Richards worked there, so we got BCPL and
Tripos instead.
I suppose you have to remember what Unix was like at that point. It may
have had some neat ideas, but the proprietary OSes usually had greater
functionality.

When I was doing my Masters, one of the lecturers was a Unix fan. Back
then, VAX/VMS was the coolest OS I knew. So we would have lots of
arguments over that. The fact that Unix systems were not so easily
available on campus was another factor ...

What changed since then was that VMS stagnated, while Unix-type systems
developed and grew. More recently, I managed to annoy quite a few VMS
diehards on comp.os.vms by suggesting that the long-overdue port to AMD64
could have been finished years sooner if they had got rid of most of it
and implemented the essential parts as an emulation layer on top of a
Linux kernel.
Post by Ahem A Rivet's Shot
The course focus was on breadth so we used many languages and
learned about more - while learning data structures and algorithms.
It is good to pick up a range of flavours of programming languages, and
to be able to discern the significant features that each one has. But it can
take time to acquire that kind of experience, and a lot of learners don’t
have the patience or the motivation.
Peter Flass
2024-06-10 23:53:02 UTC
Permalink
Post by Lawrence D'Oliveiro
I’m just curious to know whether a phenomenon I experienced, way back when
I first entered University, when the first 8-bit micros were just starting
to scuttle through the undergrowth between the feet of the “real computer”
dinosaurs, still applies today.
From the first time I came in contact with an actual computer, I loved
programming. Friends and I would spend hours in the terminal room, doing
all kinds of stuff not strictly related to our studies. Back then the main
campus system was a DEC PDP-11/70 running RSTS/E.
By contrast, most of the Comp Sci lecturers didn’t seem to be very
enthusiastic for it. Not only that, but they seemed to have little
awareness of how the real computer system, accessible just outside the
lecture hall, worked.
For example, one second-year course was on “system software”. This was
supposed to be about operating systems, some details about how they worked
and how an application program would interface to them. But the lecturer
who gave the course only seemed to have experience of Univac 1100-series
mainframes, which were of course nowhere to be found anywhere on or near
our campus.
Meanwhile, I was reading actual manuals (at the computer centre) about how
that kind of stuff worked on the PDP-11, and I was able to write real code
to try it out.
Nowadays of course all the students can carry around their own personal
hardware more powerful than that. But maybe there’s still a related
phenomenon at play: there was a discussion in another newsgroup a few days
ago involving someone who was doing a Comp Sci course around late 200x.
The lecture material was completely Microsoft-centric, talking only about
the Windows platform. They did look a little bit at Linux, but much.
To me, this represented an opportunity missed. Linux is a system you can
completely pull apart, to learn how every part of it works. And you can
learn things from comparing it with Windows, to see the pros and cons of
different design decisions. Yet the lecturers didn’t seem to have a clue.
Does that sort of thing still apply today?
Linux is the future, while Windows is the dodo.
--
Pete