Discussion:
HA - Found a CP/M-86 image and C compiler for VBox
Andreas Kohlbach
2021-09-02 18:51:05 UTC
Permalink
The IBM "portable" PC was the same idea - medium-suitcase sized,
heavy as hell - very orange mini-screen - BUT you could put 640k
and an 8087 in that one. Z80s, well, not so much .....
Didn't IBM come up with that after they saw the tremendous success of the
COMPAQ Portable (I think it was called "COMPAQ" when advertised). I seem
to remember the look came to be when two of the employees were on a lunch
break and put the design on a napkin.
OK, OK ... at 300 baud those pictures might load up just
a TAD slow ......
There was a Computer Chronicles episode about computer security in the
mid 80s, where a young hacker demonstrated how to break into a BBS. The
text appeared slow and the hacker mentioned something like "This 300 baud
is slow. I wished we had a 1200 baud modem - that would speed things up a
great deal". *g*

I F'up this into the folklore group. Cannot remember seeing you there. You
might enjoy it.
--
Andreas
SixOverFive
2021-09-03 07:19:54 UTC
Permalink
Post by Andreas Kohlbach
The IBM "portable" PC was the same idea - medium-suitcase sized,
heavy as hell - very orange mini-screen - BUT you could put 640k
and an 8087 in that one. Z80s, well, not so much .....
Didn't IBM come up with that after they saw the tremendous success of the
COMPAQ Portable (I think it was called "COMPAQ" when advertised). I seem
to remember the look came to be when two of the employees were on a lunch
break and put the design on a napkin.
Yep, it was intended as a direct competitor to the
Compaq "portable" ... and, at the time, also Osbourne
and KayPro.

But, like them, it was basically a desktop PC shoved
into a suitcase-sized box with a handle.

I was using the IBM-PPC for agricultural-product
research at the time. Wanted to see how certain
dusts would disperse in the wind across the fields.
Kept track of wind direction/speed and there were
sticky-slides at certain intervals in the bushes.
Got lots of neato pretty-colored wind-drift
charts out of that. From that, "average" dispersal
data could be gleaned, useful for real-world
application charts.
Post by Andreas Kohlbach
OK, OK ... at 300 baud those pictures might load up just
a TAD slow ......
There was a Computer Chronicles episode about computer security in the
mid 80s, where a young hacker demonstrated how to break into a BBS. The
text appeared slow and the hacker mentioned something like "This 300 baud
is slow. I wished we had a 1200 baud modem - that would speed things up a
great deal". *g*
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........

Somewhere I have a Radio Shack "laptop" ... last
thing Gates actually wrote some code for. This was
WAY before real 'laptops'. They were VERY popular
with the Press - you could fit the acoustic coupler
into any phone in the world and send in your story.
(also had a direct-connect phone line capability -
it could dial tone or pulse AND deal with common
foreign systems). Ran on actual dry/alk BATTERIES
you could buy at any store.
Post by Andreas Kohlbach
I F'up this into the folklore group. Cannot remember seeing you there. You
might enjoy it.
Having LIVED a lot of this "folklore" it doesn't seem
like nostalgic "lore" to me .....

But I did mostly miss the mainframe/mini days ...
only had to use punchcards/paper-tape ONCE in a
college class (which I dropped out of because
the school already had serial terminals that'd
do all that PLUS). The class was years behind
the reality ......
J. Clarke
2021-09-03 18:11:42 UTC
Permalink
Post by SixOverFive
Post by Andreas Kohlbach
The IBM "portable" PC was the same idea - medium-suitcase sized,
heavy as hell - very orange mini-screen - BUT you could put 640k
and an 8087 in that one. Z80s, well, not so much .....
Didn't IBM come up with that after they saw the tremendous success of the
COMPAQ Portable (I think it was called "COMPAQ" when advertised). I seem
to remember the look came to be when two of the employees were on a lunch
break and put the design on a napkin.
Yep, it was intended as a direct competitor to the
Compaq "portable" ... and, at the time, also Osbourne
and KayPro.
But, like them, it was basically a desktop PC shoved
into a suitcase-sized box with a handle.
I was using the IBM-PPC for agricultural-product
research at the time. Wanted to see how certain
dusts would disperse in the wind across the fields.
Kept track of wind direction/speed and there were
sticky-slides at certain intervals in the bushes.
Got lots of neato pretty-colored wind-drift
charts out of that. From that, "average" dispersal
data could be gleaned, useful for real-world
application charts.
Post by Andreas Kohlbach
OK, OK ... at 300 baud those pictures might load up just
a TAD slow ......
There was a Computer Chronicles episode about computer security in the
mid 80s, where a young hacker demonstrated how to break into a BBS. The
text appeared slow and the hacker mentioned something like "This 300 baud
is slow. I wished we had a 1200 baud modem - that would speed things up a
great deal". *g*
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........
Somewhere I have a Radio Shack "laptop" ... last
thing Gates actually wrote some code for. This was
WAY before real 'laptops'. They were VERY popular
with the Press - you could fit the acoustic coupler
into any phone in the world and send in your story.
(also had a direct-connect phone line capability -
it could dial tone or pulse AND deal with common
foreign systems). Ran on actual dry/alk BATTERIES
you could buy at any store.
Post by Andreas Kohlbach
I F'up this into the folklore group. Cannot remember seeing you there. You
might enjoy it.
Having LIVED a lot of this "folklore" it doesn't seem
like nostalgic "lore" to me .....
But I did mostly miss the mainframe/mini days ...
only had to use punchcards/paper-tape ONCE in a
college class (which I dropped out of because
the school already had serial terminals that'd
do all that PLUS). The class was years behind
the reality ......
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
Ahem A Rivet's Shot
2021-09-03 18:53:45 UTC
Permalink
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
--
Steve O'Hara-Smith | Directable Mirror Arrays
C:\>WIN | A better way to focus the sun
The computer obeys and wins. | licences available see
You lose and Bill collects. | http://www.sohara.org/
Peter Flass
2021-09-03 20:59:26 UTC
Permalink
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
--
Pete
J. Clarke
2021-09-04 00:18:04 UTC
Permalink
Post by Peter Flass
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
They are. Unfortunately these days to get a job you have to move to
India and be willing to work for an Indian wage. It's a _good_ Indian
wage mind you, I understand you can live comfortably on it, but it's
below US minimum.
SixOverFive
2021-09-05 06:14:26 UTC
Permalink
Post by J. Clarke
Post by Peter Flass
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
They are. Unfortunately these days to get a job you have to move to
India and be willing to work for an Indian wage. It's a _good_ Indian
wage mind you, I understand you can live comfortably on it, but it's
below US minimum.
Know a guy who got a job fairly recently at a govt
op ... one requirement was that he learn COBOL because
they'd heavily invested in *perfect* COBOL apps way
back in the day and would not, could not afford to,
have them re-written in anything else. Important
customized stuff like payrolls, scheduling ...

COBOL was a wonder-language back in the day, perfect for
all kinds of biz apps and (sort of) self-documenting
because of the quasi-natural-language code. Its "PIC"
statement was great, could do everything printf() can
do, help you out with format conversions and forms.
It was assumed you were using TTY terminals and serial
ASCII printers. There ARE a couple of COBOL development
tools for Linux ... one, I think, will even set up
tinted columns for the older, more anal, COBOL versions
where you had to put certain codes in EXACTLY the
right columns. DID make the compilers simpler ...
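
Just to put the printf() comparison in concrete terms, here's a rough C
analogue of the "edited" output a PIC clause gives you declaratively. The
field name and picture are made up for illustration - a sketch, not real
COBOL:

    /* COBOL (roughly):  77 GROSS-PAY  PIC $ZZ,ZZ9.99.
       displays 1234.5 as something like "$ 1,234.50".
       In C you approximate it with a format string, though plain
       printf() won't float the currency sign or insert the comma
       the way the picture does: */
    #include <stdio.h>

    int main(void)
    {
        double gross_pay = 1234.50;     /* hypothetical field */
        printf("GROSS PAY: $%9.2f\n", gross_pay);
        return 0;
    }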

And for the science types, FORTRAN. That's ALSO still
used. There are HUGE libraries of heavy-duty sci-related
code that NOBODY dares throw away and don't have the
time/budget to recreate. I remember having to translate
a FORTRAN stats collection into IBMPC BASICA ... long
"poke" lines for working the 8087 .... yuk !

Despite the similar-looking chip number, the 8087 was
NOT like the 8088 (or most any other microprocessor).
A very different paradigm. A couple years later the
compiler-makers all added '87 math libraries and that
made it all disappear, but BEFORE .... let's say I
was *thrilled* that Turbo Pascal included an '87
code switch - right when I had to write something that
did a lot of graphic transforms.

I'd suggest "Foley & Van-Dam Fundamentals Of Interactive
Computer Graphics" ... STILL worth looking at. That's how
it really gets done "under the hood".
The Natural Philosopher
2021-09-05 08:47:51 UTC
Permalink
Post by Peter Flass
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
    Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
They are.  Unfortunately these days to get a job you have to move to
India and be willing to work for an Indian wage.  It's a _good_ Indian
wage mind you, I understand you can live comfortably on it, but it's
below US minimum.
  Know a guy who got a job fairly recently at a govt
  op ... one requirement was that he learn COBOL because
  they'd heavily invested in *perfect* COBOL apps way
  back in the day and would not, could not afford to,
  have them re-written in anything else. Important
  customized stuff like payrolls, scheduling ...
  COBOL was a wonder-language back in the day, perfect for
  all kinds of biz apps and (sort of) self-documenting
  because of the quasi-natural-language code. Its "PIC"
  statement was great, could do everything printf() can
  do, help you out with format conversions and forms.
  It was assumed you were using TTY terminals and serial
  ASCII printers. There ARE a couple of COBOL development
  tools for Linux ... one, I think, will even set up
  tinted columns for the older, more anal, COBOL versions
  where you had to put certain codes in EXACTLY the
  right columns. DID make the compilers simpler ...
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding and can be used on machines with
extremely low RAM. It is extremely *efficient* in execution (though
massively wordy in source code). What it didn't have back then was a
database language to run under it. I particularly liked an idea which I
believe it originated - and that is a formal way of specifying the data
structures - tables and fields - in advance. When building SQL-style
applications this is massively useful, even in a small project coded by
a single person.
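
To make the "specify the data structures in advance" point concrete, here
is a rough C sketch of the same idea - the record layout is pinned down
before any procedural code touches it. The names and PIC comments are
invented for illustration, and COBOL would use fixed-point decimal where
this sketch lazily uses double:

    #include <stdio.h>

    /* roughly an 01-level record out of a COBOL DATA DIVISION */
    struct employee_rec {
        char   id[8];        /*  05 EMP-ID     PIC X(7).    */
        char   name[31];     /*  05 EMP-NAME   PIC X(30).   */
        double gross_pay;    /*  05 GROSS-PAY  PIC 9(5)V99. */
    };

    int main(void)
    {
        struct employee_rec e = { "E001234", "A. Example", 1234.50 };
        printf("%-7s  %-30s  %9.2f\n", e.id, e.name, e.gross_pay);
        return 0;
    }

Declaring the layout once, up front, is exactly what makes it easy to map
onto an SQL table definition later.
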
And for the science types, FORTRAN. That's ALSO still
used. There are HUGE libraries of heavy-duty sci-related
code that NOBODY dares throw away and don't have the
time/budget to recreate. I remember having to translate
a FORTRAN stats collection into IBMPC BASICA ... long
"poke" lines for working the 8087 .... yuk !
Despite the similar-looking chip number, the 8087 was
NOT like the 8088 (or most any other microprocessor).
A very different paradigm. A couple years later the
compiler-makers all added '87 math libraries and that
made it all disappear, but BEFORE .... let's say I
was *thrilled* that Turbo Pascal included an '87
code switch - right when I had to write something that
did a lot of graphic transforms.
I'd suggest "Foley & Van-Dam Fundamentals Of Interactive
Computer Graphics" ... STILL worth looking at. That's how
it really gets done "under the hood".
Again, FORTRAN is a fully functional efficient compiled procedural
language. There is no need to 'improve' it.

The IT world got a lot worse when computer scientists started writing
languages. They just couldn't keep it simple. They had to show off.

Hence 'object oriented' rubbish and 'operator overloading' - sheesh the
worst idea EVER. Making an expression symbol dependent on the context in
which it's being used.
--
“when things get difficult you just have to lie” ― Jean-Claude Juncker
David W. Hodgins
2021-09-05 16:05:10 UTC
Permalink
Post by The Natural Philosopher
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding and can be used on machines with
extremely low RAM. It is extremely *efficient* in execution (though
massively wordy in source code). What it didn't have back then was a
database language to run under it. I particularly liked an idea which I
believe it originated - and that is a formal way of specifying the data
structures - tables and fields - in advance. When building SQL style
applications this is massively useful even in a small project coded by
a single person
Again, FORTRAN is a fully functional efficient compiled procedural
language. There is no need to 'improve' it.
Then there was PL/1, which was basically a combination of Fortran and Cobol.

Regards, Dave Hodgins
--
Change ***@nomail.afraid.org to ***@teksavvy.com for
email replies.
John Levine
2021-09-05 22:20:43 UTC
Permalink
Post by David W. Hodgins
Post by The Natural Philosopher
Again, FORTRAN is a fully functional efficient compiled procedural
language. There is no need to 'improve' it.
Then there was PL/1, which was basically a combination of Fortran and Cobol.
It also had a fair amount of Algol mixed in to give it block structure and recursion.

Considering that PL/I was invented by a committee at IBM under intense time pressure
to ship an all-purpose language to use on their all-purpose S/360 mainframes, it's
a remarkably good language. Sixty years later we can see some of the mistakes like
the wild abandon with which you can mix datatypes, but they got a lot of it right.
--
Regards,
John Levine, ***@taugh.com, Primary Perpetrator of "The Internet for Dummies",
Please consider the environment before reading this e-mail. https://jl.ly
SixOverFive
2021-09-06 03:00:03 UTC
Permalink
On Sun, 05 Sep 2021 04:47:51 -0400, The Natural Philosopher
Post by The Natural Philosopher
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding and can be used on machines with
extremely low RAM. It is extremely *efficient* in execution (though
massively wordy in source code). What it didn't have back then was a
database language to run under it. I particularly liked an idea which I
believe it originated - and that is a formal way of specifying the data
structures - tables and fields - in advance. When building SQL style
applications this is massively useful even in a small project coded  by
a single person
Again, FORTRAN is a fully functional efficient compiled procedural
language. There is no need to 'improve' it.
Then there was PL/1, which was basically a combination of Fortran and Cobol.
Regards, Dave Hodgins
Found a DR-PL/I compiler image that'll run in
a DosBox or VirtualBox DOS environment. No MANUAL
though. There was no such thing as a "standard
implementation" back then (and hardly NOW) so
you need a DR-PL/I manual to effectively use
DR-PL/I.

PL/I always seemed to be a bit of a "kitchen sink"
language - you could spot bits of several other languages,
even BASIC, in there. There were several viable approaches
to every task ... which ain't terrible.

Anyway, except for the wordiness and a few annoying quirks,
COBOL really is a perfectly viable language for commercial
programming. It does what you need done. Certainly not
"visual", though - but then I really don't like Object
Oriented (still always write basically K&R 'C') so I don't
miss THAT in COBOL.

FORTRAN likewise is a perfectly good "sci/math oriented"
language. STILL widely used, often because SO many great
and powerful code snippets were writ back in the 60s and
nobody wants to take the time to RE-write them in anything
else.
Charlie Gibbs
2021-09-08 00:17:23 UTC
Permalink
BTW has anyone else noticed that Newsguy has died?
https://www.reddit.com/r/usenet/comments/ph9lde/newsguycom_whats_happening/
Bankrupt.
Shame, but they brought it on themselves. Too many outages--I finally
changed over to Forteinc. One of the things that annoyed me about
Newsguy was the periodic shutdowns when they moved their servers to a
new location--reminds me of the kids who move to a different apartment
every time the lease is up.
Too bad. I've been with them a long time, and never noticed too many
outages. I dug through my stuff and in the configuration for Pan
(which I used to use to download from the binaries groups but haven't
used for several years) I found the login information for Astraweb.
Fortunately, that site and my account on it are still alive - so I'm back.
--
/~\ Charlie Gibbs | They don't understand Microsoft
\ / <***@kltpzyxm.invalid> | has stolen their car and parked
X I'm really at ac.dekanfrus | a taxi in their driveway.
/ \ if you read it the right way. | -- Mayayana
The Natural Philosopher
2021-09-06 07:50:59 UTC
Permalink
On Sun, 05 Sep 2021 04:47:51 -0400, The Natural Philosopher
Post by The Natural Philosopher
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding and can be used on machines with
extremely low RAM. It is extremely *efficient* in execution (though
massively wordy in source code). What it didn't have back then was a
database language to run under it. I particularly liked an idea which I
believe it originated - and that is a formal way of specifying the data
structures - tables and fields - in advance. When building SQL style
applications this is massively useful even in a small project coded  by
a single person
Again, FORTRAN is a fully functional efficient compiled procedural
language. There is no need to 'improve' it.
Then there was PL/1, which was basically a combination of Fortran and Cobol.
which turned out to be less handy.

ALGOL was also there, and IIRC that morphed into B, BCPL and then C..
Regards, Dave Hodgins
--
I would rather have questions that cannot be answered...
...than to have answers that cannot be questioned

Richard Feynman
David W. Hodgins
2021-09-06 15:16:55 UTC
Permalink
Post by The Natural Philosopher
On Sun, 05 Sep 2021 04:47:51 -0400, The Natural Philosopher
Post by The Natural Philosopher
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding and can be used on machines with
extremely low RAM. It is extremely *efficient* in execution (though
massively wordy in source code). What it didn't have back then was a
database language to run under it. I particularly liked an idea which I
believe it originated - and that is a formal way of specifying the data
structures - tables and fields - in advance. When building SQL style
applications this is massively useful even in a small project coded by
a single person
Again, FORTRAN is a fully functional efficient compiled procedural
language. There is no need to 'improve' it.
Then there was PL/1, which was basically a combination of Fortran and Cobol.
which turned out to be less handy.
Depends on the task. PL/1 supported 15-dimensional arrays. I don't remember if
Fortran did, but even if it did, PL/1 allowed better variable names to be
used, which kept the code understandable. If a variable name exceeded 30
characters it took the first and last 15 characters. As long as that was unique,
it worked. COBOL had a max of 3 dimensions and a max of 30-character variable
names.
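
The name rule is easy to picture with a tiny sketch in C - this just
implements the truncation as described above (first 15 plus last 15
characters); it's based on the description in this thread, not checked
against a PL/I manual:

    #include <stdio.h>
    #include <string.h>

    /* Collapse an over-long name the way the post describes PL/1 doing it. */
    static void pl1_name(const char *name, char out[31])
    {
        size_t n = strlen(name);
        if (n <= 30) {
            strcpy(out, name);
        } else {
            memcpy(out, name, 15);               /* first 15 characters */
            memcpy(out + 15, name + n - 15, 15); /* last 15 characters  */
            out[30] = '\0';
        }
    }

    int main(void)
    {
        char out[31];
        pl1_name("TOTAL_ACCUMULATED_INTEREST_FOR_CURRENT_PERIOD", out);
        printf("%s\n", out);   /* -> TOTAL_ACCUMULAT_CURRENT_PERIOD */
        return 0;
    }
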
Post by The Natural Philosopher
ALGOL was also there, and IIRC that morphed into B, BCPL and then C..
I never used ALGOL, B, or BCPL, and only did a tiny bit of work in C.

Regards, Dave Hodgins
--
Change ***@nomail.afraid.org to ***@teksavvy.com for
email replies.
Quadibloc
2021-09-16 12:50:30 UTC
Permalink
Post by David W. Hodgins
Depends on the task. PL/1 supported 15 dimensional arrays. I don't remember if
Fortran did,
Some dialects of FORTRAN IV only went up to seven-dimensional arrays. This
may have had something to do with the fact that the IBM 7094 had seven
index registers.

John Savard
Rich Alderson
2021-09-06 19:04:10 UTC
Permalink
Post by The Natural Philosopher
ALGOL was also there, and IIRC that morphed into B, BCPL and then C..
Order of invention was CPL, BCPL, B, and C.

There were jokes in the 1980s and 1990s about the proper name for the next
language in the sequence: D, or P?

(Note that at least one object-oriented C successor named D was created.)
--
Rich Alderson ***@alderson.users.panix.com
Audendum est, et veritas investiganda; quam etiamsi non assequamur,
omnino tamen proprius, quam nunc sumus, ad eam perveniemus.
--Galen
John Levine
2021-09-05 22:16:53 UTC
Permalink
Post by The Natural Philosopher
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding
Hah. You can write bad code in any language, and COBOL gives you
plenty of rope. ALTER clause anyone?
No worse than the Fortran assigned GOTO. Sensible programmers stopped using them
as soon as there were other ways to write subroutines, like about 1962.

Academics sneered at COBOL because it was so wordy and looked like pseudo-English,
but it had a lot of good ideas that modern programmers have no idea originated there.

C structures and C++ class data structures are COBOL data (via PL/I,
which got them from COBOL and uses essentially the COBOL syntax). The
COBOL report writer was one of the first uses of coroutines, but
unfortunately few COBOL programmers understood it well enough to use
it effectively.
--
Regards,
John Levine, ***@taugh.com, Primary Perpetrator of "The Internet for Dummies",
Please consider the environment before reading this e-mail. https://jl.ly
Quadibloc
2021-09-16 12:46:34 UTC
Permalink
Hah. You can write bad code in any language, and COBOL gives you
plenty of rope. ALTER clause anyone?
No worse than the Fortran assigned GOTO. Sensible programmers stopped using them
as soon as there were other ways to write subroutines, like about 1962.
Actually, the ALTER mechanism in COBOL is much, much worse than the assigned
GOTO in FORTRAN, since there is no evidence in COBOL at the site of the GOTO
being altered that an alteration has taken place.

That is a language construct tailor-made for malfeasance.
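
For anyone who never met either construct, here's a rough analogue in GNU
C (labels-as-values and "goto *p" are a gcc extension, not standard C, and
the names are invented). The jump target lives in a variable, so the GOTO
itself tells you nothing about where control goes - and ALTER was worse
still, because a statement somewhere else silently rewired an
ordinary-looking GO TO:

    #include <stdio.h>

    int main(void)
    {
        /* roughly FORTRAN's  ASSIGN 100 TO K ... GO TO K  */
        void *targets[] = { &&normal_path, &&error_path };
        void *target = targets[0];

        /* ...far away, something may have done  target = targets[1];
           and nothing at the goto below shows it... */

        goto *target;                  /* where does this go? */

    normal_path:
        puts("normal path");
        return 0;

    error_path:
        puts("error path");
        return 1;
    }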

John Savard
Jerry Peters
2021-09-09 02:04:16 UTC
Permalink
Post by The Natural Philosopher
Post by Peter Flass
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
    Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
They are.  Unfortunately these days to get a job you have to move to
India and be willing to work for an Indian wage.  It's a _good_ Indian
wage mind you, I understand you can live comfortably on it, but it's
below US minimum.
  Know a guy who got a job fairly recently at a govt
  op ... one requirement was that he learn COBOL because
  they'd heavily invested in *perfect* COBOL apps way
  back in the day and would not, could not afford to,
  have them re-written in anything else. Important
  customized stuff like payrolls, scheduling ...
  COBOL was a wonder-language back in the day, perfect for
  all kinds of biz apps and (sort of) self-documenting
  because of the quasi-natural-language code. Its "PIC"
  statement was great, could do everything printf() can
  do, help you out with format conversions and forms.
  It was assumed you were using TTY terminals and serial
  ASCII printers. There ARE a couple of COBOL development
  tools for Linux ... one, I think, will even set up
  tinted columns for the older, more anal, COBOL versions
  where you had to put certain codes in EXACTLY the
  right columns. DID make the compilers simpler ...
COBOL was and is a damned good language for commercial programming: It
enforces a discipline on coding and can be used on machines with
extremely low RAM. It is extremely *efficient* in execution (though
ROTFL, certainly *not* the IBM compilers. I had a habit of looking at
the generated code -- it was *horrible*. It was so bad that a company
called Capex wrote an optimizer for it. It greatly improved the
efficiency of the programs.

Jerry
Quadibloc
2021-09-16 12:37:55 UTC
Permalink
Hence 'object oriented' rubbish and 'operator overloading' - sheesh the
worst idea EVER. Making an expression symbol dependent on the context in
which it's being used.
In FORTRAN, + - * and / work with INTEGER, REAL, DOUBLE PRECISION, and
COMPLEX variables equally well.

So operator overloading is not a bad idea - if it is used properly, instead of abused.
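
The same point can be made even in plain C99, where the built-in operators
are already "overloaded" across the numeric types - a tiny sketch, nothing
FORTRAN-specific about it:

    #include <complex.h>
    #include <stdio.h>

    int main(void)
    {
        /* one "+" symbol, three different built-in types */
        int            i = 2 + 3;
        double         d = 2.5 + 0.75;
        double complex z = (1.0 + 2.0*I) + (3.0 - 1.0*I);

        printf("%d  %g  %g%+gi\n", i, d, creal(z), cimag(z));
        return 0;
    }

The symbol means something different in each line, yet nobody calls that
abuse; the trouble starts when user-defined overloads stop behaving like
arithmetic.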

John Savard
J. Clarke
2021-09-16 17:50:40 UTC
Permalink
On Thu, 16 Sep 2021 05:37:55 -0700 (PDT), Quadibloc
Post by Quadibloc
Hence 'object oriented' rubbish and 'operator overloading' - sheesh the
worst idea EVER. Making an expression symbol dependent on the context in
which it's being used.
In FORTRAN, + - * and / work with INTEGER, REAL, DOUBLE PRECISION, and
COMPLEX variables equally well.
So operator overloading is not a bad idea - if it is used properly, instead of abused.
In APL they work if one argument is scalar and the other an array. In
Python they also work if the left argument is a string.
Thomas Koenig
2021-09-16 20:22:02 UTC
Permalink
Post by J. Clarke
On Thu, 16 Sep 2021 05:37:55 -0700 (PDT), Quadibloc
Post by Quadibloc
Hence 'object oriented' rubbish and 'operator overloading' - sheesh the
worst idea EVER. Making an expression symbol dependent on the context in
which it's being used.
In FORTRAN, + - * and / work with INTEGER, REAL, DOUBLE PRECISION, and
COMPLEX variables equally well.
So operator overloading is not a bad idea - if it is used properly, instead of abused.
In APL they work if one argument is scalar and the other an array.
Same in Fortran (but not in FORTRAN).
Robin Vowels
2021-09-16 21:05:25 UTC
Permalink
Post by Quadibloc
Hence 'object oriented' rubbish and 'operator overloading' - sheesh the
worst idea EVER. Making an expression symbol dependent on the context in
which it's being used.
In FORTRAN, + - * and / work with INTEGER, REAL, DOUBLE PRECISION, and
COMPLEX variables equally well.
You mean equally badly.
Things overflow in integer arithmetic, without warning.
What about integer complex numbers?
Post by Quadibloc
So operator overloading is not a bad idea - if it is used properly, instead of abused.
And anyway, that isn't operator overloading.

maus
2021-09-05 10:44:00 UTC
Permalink
Post by J. Clarke
Post by Peter Flass
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
They are. Unfortunately these days to get a job you have to move to
India and be willing to work for an Indian wage. It's a _good_ Indian
wage mind you, I understand you can live comfortably on it, but it's
below US minimum.
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
2) In India (according to a friend who worked there), you are expected
to have servants if you earn a good salary.
3) Unfortunately, the universal trend to lower earnings will lower US
salaries in time.
J. Clarke
2021-09-05 13:44:32 UTC
Permalink
Post by maus
Post by J. Clarke
Post by Peter Flass
Post by Ahem A Rivet's Shot
On Fri, 03 Sep 2021 14:11:42 -0400
Post by J. Clarke
and that idiot insisted that his students use
cards because that's what they'd be working with in the real world.
Reminds me of the A level computer science course I wished I hadn't
taken - the year before it had been all about machine architecture,
assembly language programming, data structures and algorithms, fun stuff.
But I wasn't allowed to take it that year (because I was taking my O
levels) I had to wait and take it the following year and so the course I
got to take was COBOL, systems analysis and data validation and not what I
had been looking forward to at all. But it was "what we'd be working with
in the real world" or so I was told when I moaned about the change.
COBOL programmers are still in demand, apparently.
They are. Unfortunately these days to get a job you have to move to
India and be willing to work for an Indian wage. It's a _good_ Indian
wage mind you, I understand you can live comfortably on it, but it's
below US minimum.
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
2) In India (according to a friend who worked there), you are expected
to have servants if you earn a good salary.
3) Unfortunately, the universal trend to lower earnings will lower US
salaries in time.
I don't know what it is about Asia in general, but the accents just do
not sound good and are sometimes difficult to understand. Noticing
that will get one harshly criticized in some circles though. Our
Indians don't talk to customers though.

And the real problem is that the people in India are taking their good
wage and buying stuff in India, not in the US, and they can't afford
US made so they are buying local or Chinese.

And management doesn't see what's wrong with this.
Quadibloc
2021-09-16 12:43:17 UTC
Permalink
Post by J. Clarke
And the real problem is that the people in India are taking their good
wage and buying stuff in India, not in the US, and they can't afford
US made so they are buying local or Chinese.
And management doesn't see what's wrong with this.
That's not a problem for management. If they choose to
purchase other than the least expensive labor as an input
into their products, they will be destroyed by the competition.

If you want to prevent the free-market system from working
as intended, you need to enlist the aid of the government
in interfering with it, through things like tariff legislation.

John Savard
J. Clarke
2021-09-16 17:51:38 UTC
Permalink
On Thu, 16 Sep 2021 05:43:17 -0700 (PDT), Quadibloc
Post by Quadibloc
Post by J. Clarke
And the real problem is that the people in India are taking their good
wage and buying stuff in India, not in the US, and they can't afford
US made so they are buying local or Chinese.
And management doesn't see what's wrong with this.
That's not a problem for management.
Well, yes, it is.
Post by Quadibloc
If they choose to
purchase other than the least expensive labor as an input
into their products, they will be destroyed by the competition.
You mean like Henry Ford was?
Post by Quadibloc
If you want to prevent the free-market system from working
as intended, you need to enlist the aid of the government
in interfering with it, through things like tariff legislation.
You mean like it helped Henry Ford?
Andreas Kohlbach
2021-09-05 17:01:10 UTC
Permalink
Post by maus
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
When I call a service hot line and have Apu (from the Simpsons) on the
line I hang up. I don't really understand them (not being a native English
speaker?), they don't understand me. Because in the past I ended up
"signing up" for an additional service I didn't ask for, while the
initial problem wasn't solved. Went through a lot of trouble because of this.
--
Andreas
maus
2021-09-06 17:27:38 UTC
Permalink
Post by Andreas Kohlbach
Post by maus
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
When I call a service hot line and have Apu (from the Simpsons) on the
line I hang up. I don't really understand them (not being a native English
speaker?), they don't understand me. Because in the past I ended up
"signing up" for an additional service I didn't ask for, while the
initial problem wasn't solved. Went through a lot of trouble because of this.
There are almost twice as many English speakers in India as in the UK.
Many non-Hindi speakers find English a more likable common tongue than
Hindi.
I should stress that I find most Indians easy to deal with and
competent.
Andreas Kohlbach
2021-09-07 03:59:43 UTC
Permalink
Post by maus
Post by Andreas Kohlbach
Post by maus
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
When I call a service hot line and have Apu (from the Simpsons) on the
line I hang up. I don't really understand them (not being a native English
speaker?), they don't understand me. Because in the past I ended up
"signing up" for an additional service I didn't ask for, while the
initial problem wasn't solved. Went through a lot of trouble because of this.
There are almost twice as many English speakers in India as in the UK.
Many non-Hindi speakers find English a more likable common tongue than
Hindi.
The owner of one of the convenience stores is from India. He also has a
lot of Indian customers he talks to in Hindi. Interestingly when it comes
to numbers he says them in English.
Post by maus
I should stress that I find most Indians easy to deal with and
competent.
My problem might be that I am not a native English speaker. Already have
problems to understand the Welsh. ;-)
--
Andreas
maus
2021-09-07 09:41:48 UTC
Permalink
Post by Andreas Kohlbach
Post by maus
Post by Andreas Kohlbach
Post by maus
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
When I call a service hot line and have Apu (from the Simpsons) on the
line I hang up. I don't really understand them (not being a native English
speaker?), they don't understand me. Because in the past I ended up
"signing up" for an additional service I didn't ask for, while the
initial problem wasn't solved. Went through a lot of trouble because of this.
There are almost twice as many English speakers in India as in the UK.
Many non-Hindi speakers find English a more likable common tongue than
Hindi.
The owner of one of the convenience stores is from India. He also has a
lot of Indian customers he talks to in Hindi. Interestingly when it comes
to numbers he says them in English.
I had an argument recently with a man from the Delhi area. My contention
was that Hindi and Urdu are essentially the same language. I should have
kept my mouth shut.
Post by Andreas Kohlbach
Post by maus
I should stress that I find most Indians easy to deal with and
competent.
My problem might be that I am not a native English speaker. Already have
problems to understand the Welsh. ;-)
Nobody understands the Welsh language, but they are nice about it. Far
better than those who want to make Gaelic the only language in Ireland.
gareth evans
2021-09-07 10:54:13 UTC
Permalink
Post by maus
Nobody understands the Welsh language,
Cymraeg is the oldest European language in continuous use
but, coming as it does from long before any significant technological
developments, it now has to borrow words, mainly from English.

(Although English also borrows words such as, "Bungalow")

This borrowing has resulted in the non-native letter, "J"
appearing in the Welsh alphabet.

Although he did not teach us any of it, being brought up
in Somerset, my father was a Welsh speaker and for his
school certificate (Precursor to the 'O' Levels) he
did English as the foreign language.

After he died, I inherited a large fraction of his
Welsh library, and I am trying to learn some of
the language in order to be able to read those books,
especially several volumes of the proceedings of the
National Eisteddfod.
Niklas Karlsson
2021-09-07 10:58:27 UTC
Permalink
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
English doesn’t borrow from other languages. English follows other
languages down dark alleys, knocks them over and goes through their
pockets for loose grammar. -- James Nicoll

Niklas
--
To the typical Mac end-user, Unix is mysterious, and ancient, and strong. It's
made of cast iron and the bones of heroic programmers of old.
-- Table and Chair, Slashdot
maus
2021-09-07 13:42:51 UTC
Permalink
Post by Niklas Karlsson
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
English doesn’t borrow from other languages. English follows other
languages down dark alleys, knocks them over and goes through their
pockets for loose grammar. -- James Nicoll
Niklas
++
Quadibloc
2021-09-16 12:58:24 UTC
Permalink
Post by Niklas Karlsson
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
English doesn’t borrow from other languages. English follows other
languages down dark alleys, knocks them over and goes through their
pockets for loose grammar. -- James Nicoll
This is quite unfair to English. The Norman Conquest is hardly something
England did to France; it is rather quite the other way around.

John Savard
Niklas Karlsson
2021-09-16 13:42:14 UTC
Permalink
Post by Quadibloc
Post by Niklas Karlsson
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
English doesn’t borrow from other languages. English follows other
languages down dark alleys, knocks them over and goes through their
pockets for loose grammar. -- James Nicoll
This is quite unfair to English. The Norman Conquest is hardly something
England did to France; it is rather quite the other way around.
There's a lot more than French involved, I'd say.

Niklas
--
Q: What sort of alcohol are you drinking in this heatwave?
A: Ethyl alcohol. Some might be drinking methyl alcohol, but they're probably
in no shape to answer the question.
Ahem A Rivet's Shot
2021-09-16 14:12:01 UTC
Permalink
On Thu, 16 Sep 2021 05:58:24 -0700 (PDT)
Post by Quadibloc
Post by Niklas Karlsson
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
English doesn’t borrow from other languages. English follows other
languages down dark alleys, knocks them over and goes through their
pockets for loose grammar. -- James Nicoll
This is quite unfair to English. The Norman Conquest is hardly something
England did to France; it is rather quite the other way around.
That is true and accounts for the presence of a lot of French in
English but it doesn't account for all the bits the English nicked from
elsewhere while running an empire. Bungalow certainly did not come from
French for example.
--
Steve O'Hara-Smith | Directable Mirror Arrays
C:\>WIN | A better way to focus the sun
The computer obeys and wins. | licences available see
You lose and Bill collects. | http://www.sohara.org/
Charlie Gibbs
2021-09-16 17:40:33 UTC
Permalink
Post by Ahem A Rivet's Shot
On Thu, 16 Sep 2021 05:58:24 -0700 (PDT)
Post by Quadibloc
Post by Niklas Karlsson
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
English doesn’t borrow from other languages. English follows other
languages down dark alleys, knocks them over and goes through their
pockets for loose grammar. -- James Nicoll
This is quite unfair to English. The Norman Conquest is hardly something
England did to France; it is rather quite the other way around.
That is true and accounts for the presence of a lot of French in
English but it doesn't account for all the bits the English nicked from
elsewhere while running an empire. Bungalow certainly did not come from
French for example.
"The trouble with the French is that they
don't have a word for entrepreneur."
-- George W. Bush

Oh all right, nobody can prove he actually said this.
But it's fun - and English has stolen words from so
many other languages that James Nicoll's quote is
both witty and true.
--
/~\ Charlie Gibbs | They don't understand Microsoft
\ / <***@kltpzyxm.invalid> | has stolen their car and parked
X I'm really at ac.dekanfrus | a taxi in their driveway.
/ \ if you read it the right way. | -- Mayayana
maus
2021-09-07 13:42:10 UTC
Permalink
Post by gareth evans
Post by maus
Nobody understands the Welsh language,
Cymraeg is the oldest European language in continuous use
but coming as it does long before any significant technological
developments has to now borrow words, mainly from English.
I don't think `AFAR', painted frequently on Welsh roads, is derived from
English :)
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
This borrowing has resulted in the non-native letter, "J"
appearing in the Welsh alphabet.
Although he did not teach us any of it, being brought up
in Somerset, my father was a Welsh speaker and for his
school certificate (Precursor to the 'O' Levels) he
did English as the foreign language.
After he died, I inherited a large fraction of his
Welsh library, and I am trying to learn some of
the language in order to be able to read those books,
especially several volumes of the proceedings of the
National Eisteddfod.
How is the language served by computers?
gareth evans
2021-09-07 13:53:02 UTC
Permalink
Post by maus
I don't think `AFAR', painted frequently on Welsh roads, is derived from
English :)
ARAF
Post by maus
How is the language served by computers?
AIUI from correspondence with the Speech Prof at Sheffield,
extremely well!
J. Clarke
2021-09-07 16:45:01 UTC
Permalink
Post by maus
Post by gareth evans
Post by maus
Nobody understands the Welsh language,
Cymraeg is the oldest European language in continuous use
but coming as it does long before any significant technological
developments has to now borrow words, mainly from English.
I don't think `AFAR', painted frequently on Welsh roads, is derived from
English :)
Do you mean "ARAF"? Why would Welsh have to borrow a word for "slow"?
Post by maus
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
This borrowing has resulted in the non-native letter, "J"
appearing in the Welsh alphabet.
Although he did not teach us any of it, being brought up
in Somerset, my father was a Welsh speaker and for his
school certificate (Precursor to the 'O' Levels) he
did English as the foreign language.
After he died, I inherited a large fraction of his
Welsh library, and I am trying to learn some of
the language in order to be able to read those books,
especially several volumes of the proceedings of the
National Eisteddfod.
How is the language served by computers?
Quadibloc
2021-09-16 13:00:26 UTC
Permalink
Post by maus
Post by gareth evans
After he died, I inherited a large fraction of his
Welsh library, and I am trying to learn some of
the language in order to be able to read those books,
especially several volumes of the proceedings of the
National Eisteddfod.
How is the language served by computers?
I think Apple ran an ad once about how the Macintosh
was available with Welsh or Icelandic support, as
opposed to Windows, which wasn't.

However, since then, it has become possible to connect
a color monitor to the Macintosh, and Windows has
expanded its language support options.

John Savard
Peter Flass
2021-09-07 13:45:25 UTC
Permalink
Post by gareth evans
Post by maus
Nobody understands the Welsh language,
Cymraeg is the oldest European language in continuous use
but coming as it does long before any significant technological
developments has to now borrow words, mainly from English.
It’s been borrowing at least since the Welsh fought the Romans. A lot of
borrowings from Latin.
Post by gareth evans
(Although English also borrows words such as, "Bungalow")
This borrowing has resulted in the non-native letter, "J"
appearing in the Welsh alphabet.
Although he did not teach us any of it, being brought up
in Somerset, my father was a Welsh speaker and for his
school certificate (Precursor to the 'O' Levels) he
did English as the foreign language.
After he died, I inherited a large fraction of his
Welsh library, and I am trying to learn some of
the language in order to be able to read those books,
especially several volumes of the proceedings of the
National Eisteddfod.
--
Pete
Rich Alderson
2021-09-08 20:17:06 UTC
Permalink
Post by maus
I had an argument recently with a man from the Delhi area. My contention
was that Hindi and Urdu are essentially the same language. I should have
kept my mouth shut.
Linguistically, they are. Politically, less so.
--
Rich Alderson ***@alderson.users.panix.com
Audendum est, et veritas investiganda; quam etiamsi non assequamur,
omnino tamen proprius, quam nunc sumus, ad eam perveniemus.
--Galen
Quadibloc
2021-09-16 12:55:53 UTC
Permalink
Post by maus
Far
better than those who want to make Gaelic the only language in Ireland.
Fortunately, however, most of the Irish are not interested
in the idea, as they themselves do not have a command
of that language.

John Savard
Charlie Gibbs
2021-09-16 17:40:34 UTC
Permalink
Post by Quadibloc
Far better than those who want to make Gaelic the only
language in Ireland.
Fortunately, however, most of the Irish are not interested
in the idea, as they themselves do not have a command
of that language.
Still, I appreciate their efforts to keep the language alive.
On our trip to Ireland in 2017 I enjoyed listening to the
recorded announcements on the train - the voice was so
beautiful that it even made mundane announcements like
"please keep your feet off the seats" sound like poetry.

The Irish attitude toward Gaelic is refreshingly mature
compared to Quebec's attitude toward French.
--
/~\ Charlie Gibbs | They don't understand Microsoft
\ / <***@kltpzyxm.invalid> | has stolen their car and parked
X I'm really at ac.dekanfrus | a taxi in their driveway.
/ \ if you read it the right way. | -- Mayayana
maus
2021-09-16 18:16:36 UTC
Permalink
Post by Charlie Gibbs
Post by Quadibloc
Far better than those who want to make Gaelic the only
language in Ireland.
Fortunately, however, most of the Irish are not interested
in the idea, as they themselves do not have a command
of that language.
Still, I appreciate their efforts to keep the language alive.
On our trip to Ireland in 2017 I enjoyed listening to the
recorded announcements on the train - the voice was so
beautiful that it even made mundane announcements like
"please keep your feet off the seats" sound like poetry.
The Irish attitude toward Gaelic is refreshingly mature
compared to Quebec's attitude toward French.
The Welsh have a better attitude. There is something in Finland at the
moment, I think, about making English a second official language.

As far as I can make out, Tolkien's books have a kinda Finnish
undertone.
--
***@mail.com
Down the wrong mousehole.
Ahem A Rivet's Shot
2021-09-16 18:19:17 UTC
Permalink
On Thu, 16 Sep 2021 17:40:34 GMT
Post by Charlie Gibbs
The Irish attitude toward Gaelic is refreshingly mature
compared to Quebec's attitude toward French.
Well yes, Gaelic is what they speak in Scotland; here it's Irish, or
Gaeilge in Gaeilge. Some people even get upset when it's called Gaelic, but
most will trot out that correction with a smile.
--
Steve O'Hara-Smith | Directable Mirror Arrays
C:\>WIN | A better way to focus the sun
The computer obeys and wins. | licences available see
You lose and Bill collects. | http://www.sohara.org/
maus
2021-09-16 18:12:30 UTC
Permalink
Post by Quadibloc
Post by maus
Far
better than those who want to make Gaelic the only language in Ireland.
Fortunately, however, most of the Irish are not interested
in the idea, as they themselves do not have a command
of that language.
May turn nasty at the moment. Prey for us that only want peace.
Post by Quadibloc
John Savard
--
***@mail.com
Down the wrong mousehole.
Kerr-Mudd, John
2021-09-16 19:55:12 UTC
Permalink
On 16 Sep 2021 18:12:30 GMT
Post by maus
Post by Quadibloc
Post by maus
Far
better than those who want to make Gaelic the only language in Ireland.
Fortunately, however, most of the Irish are not interested
in the idea, as they themselves do not have a command
of that language.
May turn nasty at the moment. Prey for us that only want peace.
Prey?!

Naively I'd hoped that the UK Brexit customs "border" (in the Irish Sea) would make it easier for the people in the North to realise they're better off trading with no tariffs to the RoI.
--
Bah, and indeed Humbug.
Peter Flass
2021-09-07 13:45:23 UTC
Permalink
Post by Andreas Kohlbach
Post by maus
Post by Andreas Kohlbach
Post by maus
1) Indian accents are a real turnoff for British users, even if the
Indian is living in the UK (or Ireland). Poor answering soured the
idea.
When I call a service hot line and have Apu (from the Simpsons) on the
line I hang up. I don't really understand them (not being a native English
speaker?), they don't understand me. Because in the past I ended up
"signing up" for an additional service I didn't ask for, while the
initial problem wasn't solved. Went through a lot of trouble because of this.
There are almost twice as many English speakers in India as in the UK.
Many non-Hindi speakers find English a more likable common tongue than
Hindi.
The owner of one of the convenience stores is from India. He also has a
lot of Indian customers he talks to in Hindi. Interestingly when it comes
to numbers he says them in English.
Post by maus
I should stress that I find most Indians easy to deal with and
competent.
My problem might be that I am not a native English speaker. Already have
problems to understand the Welsh. ;-)
Many British dialects are impenetrable to outsiders. Being a native English
speaker had nothing to do with it. Of course Germany has a lot of areas
where they speak dialect, too.
--
Pete
Andreas Kohlbach
2021-09-07 18:12:58 UTC
Permalink
Post by Peter Flass
Post by Andreas Kohlbach
My problem might be that I am not a native English speaker. Already have
problems to understand the Welsh. ;-)
Many British dialects are impenetrable to outsiders. Being a native English
speaker had nothing to do with it. Of course Germany has a lot of areas
where they speak dialect, too.
That applies to many languages. Anyone in the US understands people in
Louisiana? ;-)
--
Andreas
Andreas Eder
2021-09-10 16:25:51 UTC
Permalink
Anyone in the US understands people in Louisiana? ;-)
If they speak Cajun French, then that is no problem.

'Andreas
Quadibloc
2021-09-16 13:09:03 UTC
Permalink
Post by Andreas Kohlbach
That applies to many languages. Anyone in the US understands people in
Louisiana? ;-)
Mostly. As far as I can tell, I can understand people, from anywhere in the
US, with greater or lesser difficulty. I have trouble with a lot of British
dialects, but, of course, I only hear them infrequently.
Odd, I could not find Andreas Kohlbach's post to which you responded.

I would say it is indeed likely that people in, say, *Paris*, would have
trouble understanding the Cajuns in Louisiana when they were speaking
French... but most people in Louisiana are not of Acadian origin, and
as far as I know, the accent of people from Louisiana when speaking
English is not all that strong.

John Savard
Mike Spencer
2021-09-05 21:31:33 UTC
Permalink
Post by maus
2) In India (according to a friend who worked there), you are expected
to have servants if you earn a good salary.
This was true in the USA up through the 1950's.
I was there in the 1950s. I didn't then know any really wealthy
people but many of my friends were from families with "good salary" --
professionals, CEOs, small biz owners. None had servants although at
least one lived in a large house with (then disused) servants'
quarters. My family's landlady in 1954, over 90 and widow of a
locally prestigious clergyman, owned a house with elegant woodwork and
servants' quarters on the 3rd floor (albeit with a less than superb
kitchen), and bemoaned the impossibility of having servants.

OTOH, all of the senior academics I met during a decade-long,
intermittent excursion from my own rustication into up-scale academia
in the 80s & 90s were married to other professionals of one sort or
another, and they all had daily or live-in housekeepers.

Re India: My son had an Indian math teacher whose accent he found
impenetrable. When he said so, the teacher was greatly offended;
after all, he was highly educated (in India) and English was his first
language.
--
Mike Spencer Nova Scotia, Canada
J. Clarke
2021-09-05 23:45:54 UTC
Permalink
On 05 Sep 2021 18:31:33 -0300, Mike Spencer
Post by Mike Spencer
Post by maus
2) In India (according to a friend who worked there), you are expected
to have servants if you earn a good salary.
This was true in the USA up through the 1950's.
I was there in the 1950s. I didn't then know any really wealthy
people but many of my friends were from families with "good salary" --
professionals, CEOs, small biz owners. None had servants although at
least one lived in a large house with (then disused) servants'
quaarters. My family's landlady in 1954, over 90 and widow of a
locally prestigious clergyman, owned a house with elegant woodwork and
serants' quarters on the 3rd floor, albeit with less than superb
kitchen, bemoaned the impossibility of having servants.
OTOH, all of the senior academics I met during a decade-long,
itermittent excursion from my own rustication into up-scale academia
in the 80s & 90s, were married to other professionals of one sort or
another and they all had daily or live-in house keepers.
Re India: My son had an Indian math teacher whose accent he found
impenetrable. When he said so, the teacher was greatly offended;
after all, he was highly educated (in India) and English was his first
language.
I find "say again" and "sorry, could you say that more slowly, I'm old
and my hearing isn't so good" is efficacious without giving offense.
It's really bad if it's a woman in the upper vocal range--my highs are
long gone.
SixOverFive
2021-09-05 05:40:06 UTC
Permalink
Post by J. Clarke
Post by SixOverFive
Post by Andreas Kohlbach
The IBM "portable" PC was the same idea - medium-suitcase sized,
heavy as hell - very orange mini-screen - BUT you could put 640k
and an 8087 in that one. Z80s, well, not so much .....
Didn't IBM came up with that after they saw the tremendous success of the
COMPAQ Portable (I think it was called "COMPAQ" when advertised). I seem
to remember the look came to be when two of the employees were on a lunch
break and put the design on a napkin.
Yep, it was intended as a direct competitor to the
Compaq "portable" ... and, at the time, also Osbourne
and KayPro.
But, like them, it was basically a desktop PC shoved
into a suitcase-sized box with a handle.
I was using the IBM-PPC for agricultural-product
research at the time. Wanted to see how certain
dusts would disperse in the wind across the fields.
Kept track of wind direction/speed and there were
sticky-slides at certain intervals in the bushes.
Got lots of neato pretty-colored wind-drift
charts out of that. From that, "average" dispersal
data could be gleaned, useful for real-world
application charts.
Post by Andreas Kohlbach
OK, OK ... at 300 baud those pictures might load up just
a TAD slow ......
There was a Computer Chronicles episode about computer security in the
mid 80s, where a young hacker demonstrated how to break into a BBS. The
text appeared slow and the hacker mentioned something like "This 300 baud
is slow. I wished we had a 1200 baud modem - that would speed things up a
great deal". *g*
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........
Somewhere I have a Radio Shack "laptop" ... last
thing Gates actually wrote some code for. This was
WAY before real 'laptops'. They were VERY popular
with the Press - you could fit the acoustic coupler
into any phone in the world and send in your story.
(also had a direct-connect phone line capability -
it could dial tone or pulse AND deal with common
foreign systems). Ran on actual dry/alk BATTERIES
you could buy at any store.
Post by Andreas Kohlbach
I F'up this into the folklore group. Cannot remember seen you there. You
might enjoy it.
Having LIVED a lot of this "folklore" it doesn't seem
like nostalgic "lore" to me .....
But I did mostly miss the mainframe/mini days ...
only had to use punchcards/paper-tape ONCE in a
college class (which I dropped out of because
the school already had serial terminals that'd
do all that PLUS). The class was years behind
the reality ......
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.

So, the prof wasn't necessarily stupid - but perhaps
just a SINGLE year behind the curve.

Some of the bigger data centers DID hang on to cards
and tape for quite a while after they were officially
obsolete. They'd invested big $$$ in that equipment
and were gonna USE it (and had blown their new
equipment budgets on it). So, the punch-card experience
wasn't necessarily useless, depending on where the
student was going to wind up.

Oh, go shopping ... they DO still sell cartridge-tape
backup units - they're up to three or four TB
now and cost lots of $$$. Clearly there's a market
for that 60s/70s/80s style tech, in certain niches.

Microprocessors were a New Frontier back then, and a
certain democratization of invention that hadn't been
seen since the latter 1800s with Edison/Tesla/Marconi/
Deforest. You didn't need a 200 IQ and a big development
team to come up with and make use of really neat
innovations. A lot of what came out of that period is
STILL in play now. Often, laziness was the Mother
Of All Invention - "This SUCKS ... how can I make
it easier/better?".
The Natural Philosopher
2021-09-05 08:38:56 UTC
Permalink
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest is history
--
“when things get difficult you just have to lie”

― Jean Claud Jüncker
Ahem A Rivet's Shot
2021-09-05 09:49:59 UTC
Permalink
On Sun, 5 Sep 2021 09:38:56 +0100
Post by The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history
More like Stallman's and before that a certain group at Berkeley
and before that a handful of people at Bell and before that ...

It's giants' shoulders all the way down, all stacked up like a
circus pyramid.
--
Steve O'Hara-Smith | Directable Mirror Arrays
C:\>WIN | A better way to focus the sun
The computer obeys and wins. | licences available see
You lose and Bill collects. | http://www.sohara.org/
The Natural Philosopher
2021-09-05 10:22:00 UTC
Permalink
Post by Ahem A Rivet's Shot
On Sun, 5 Sep 2021 09:38:56 +0100
Post by The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history
More like Stallman's and before that a certain group at Berkeley
and before that a handful of people at Bell and before that ...
It's giant's shoulders all the way down, all stacked up like a
circus pyramid.
And then you get Computer Scientists, and their attitude is 'this is too
easy/simple, how can we make it more complicated?'
--
There’s a mighty big difference between good, sound reasons and reasons
that sound good.

Burton Hillis (William Vaughn, American columnist)
gareth evans
2021-09-05 10:38:46 UTC
Permalink
AIUI, you failed your Computer Science PhD assessment if you used
"very" instead of "highly".
Sorry, an afterthought ... I thought that to get a PhD you had to
either invent something or else discover something previously
unfathomable, but the plague of PhDs awarded these days does not
seem to be accompanied by widespread increases of knowledge
or developments in technology.
The Natural Philosopher
2021-09-05 11:15:23 UTC
Permalink
Post by gareth evans
AIUI, you failed your Computer Science PhD assessment if you used,
"very" instead of, "highly".
Sorry, an afterthought ... I thought that to get a PhD you had to
either invent something or else discover something previously
unfathomable but the plague of PhDs awarded these days do not
seem to be accompanied by widespread increases of knowledge
or developments in technology.
No, you just have to write some impenetrable gobbledygook and get a chum
to peer review it.


"The influence of Patriarchal Modality on the development of unconscious
homophobia in the Victorian institution' etc etc


Clever people strive to make difficult stuff easy to understand.
Not so clever people strive to make simple stuff more complicated so
only they can understand it, and thus preserve their status and their
careers

As a wonderful example of prime communist blatherskite, Eric Hobsbawm is
the go-to idiot.

“The test of a progressive policy is not private but public, not just
rising income and consumption for individuals, but widening the
opportunities and what Amartya Sen calls the 'capabilities' of all
through collective action. But that means, it must mean, public
non-profit initiative, even if only in redistributing private
accumulation. Public decisions aimed at collective social improvement
from which all human lives should gain. That is the basis of progressive
policy—not maximising economic growth and personal incomes. Nowhere will
this be more important than in tackling the greatest problem facing us
this century, the environmental crisis. Whatever ideological logo we
choose for it, it will mean a major shift away from the free market and
towards public action, a bigger shift than the British government has
yet envisaged. And, given the acuteness of the economic crisis, probably
a fairly rapid shift. Time is not on our side.”


― Eric Hobsbawm

Contrast Scruton


“When, in the works of Lacan, Deleuze and Althusser, the nonsense
machine began to crank out its impenetrable sentences, of which nothing
could be understood except that they all had “capitalism” as their
target, it looked as though Nothing had at last found its voice.”
― Roger Scruton, Thinkers Of The New Left

“The greatest task on the right, therefore, is to rescue the language of
politics: to put within our grasp what has been forcibly removed from it
by jargon.”
― Roger Scruton, Fools, Frauds and Firebrands: Thinkers of the New Left
--
WOKE is an acronym... Without Originality, Knowledge or Education.
maus
2021-09-06 17:30:47 UTC
Permalink
Post by The Natural Philosopher
Post by gareth evans
AIUI, you failed your Computer Science PhD assessment if you used,
"very" instead of, "highly".
Sorry, an afterthought ... I thought that to get a PhD you had to
either invent something or else discover something previously
unfathomable but the plague of PhDs awarded these days do not
seem to be accompanied by widespread increases of knowledge
or developments in technology.
No, you just have to write some impenetrable gobbledygook and get a chum
to peer review it.
"The influence of Patriarchal Modality on the development of unconscious
homophobia in the Victorian institution' etc etc
Clever people strive to make difficult stuff easy to understand.
Not so clever people strive to make simple stuff more complicated so
only they can understand it, and thus preserve their status and their
careers
As a wonderful example of prime communist blatherskite Eric Hobsbawn is
the go-to idiot
As one who left school early, I sorta educated myself via Hobsbawm
Post by The Natural Philosopher
“The test of a progressive policy is not private but public, not just
rising income and consumption for individuals, but widening the
opportunities and what Amartya Sen calls the 'capabilities' of all
through collective action. But that means, it must mean, public
non-profit initiative, even if only in redistributing private
accumulation. Public decisions aimed at collective social improvement
from which all human lives should gain. That is the basis of progressive
policy—not maximising economic growth and personal incomes. Nowhere will
this be more important than in tackling the greatest problem facing us
this century, the environmental crisis. Whatever ideological logo we
choose for it, it will mean a major shift away from the free market and
towards public action, a bigger shift than the British government has
yet envisaged. And, given the acuteness of the economic crisis, probably
a fairly rapid shift. Time is not on our side.”
― Eric Hobsbawm
Contrast Scruton
“When, in the works of Lacan, Deleuze and Althusser, the nonsense
machine began to crank out its impenetrable sentences, of which nothing
could be understood except that they all had “capitalism” as their
target, it looked as though Nothing had at last found its voice.”
― Roger Scruton, Thinkers Of The New Left
“The greatest task on the right, therefore, is to rescue the language of
politics: to put within our grasp what has been forcibly removed from it
by jargon.”
― Roger Scruton, Fools, Frauds and Firebrands: Thinkers of the New Left
Charlie Gibbs
2021-09-07 20:12:50 UTC
Permalink
Post by Ahem A Rivet's Shot
On Sun, 5 Sep 2021 09:38:56 +0100
Post by The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history
More like Stallman's and before that a certain group at Berkeley
and before that a handful of people at Bell and before that ...
It's giant's shoulders all the way down, all stacked up like a
circus pyramid.
With Bill Gates hacking away at the bottom...
--
/~\ Charlie Gibbs | They don't understand Microsoft
\ / <***@kltpzyxm.invalid> | has stolen their car and parked
X I'm really at ac.dekanfrus | a taxi in their driveway.
/ \ if you read it the right way. | -- Mayayana
Stéphane CARPENTIER
2021-09-05 10:13:33 UTC
Permalink
Post by The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history
No, Linus' attitude was more "I want to have fun, do you want to have
fun too?"
--
Si vous avez du temps à perdre :
https://scarpet42.gitlab.io
J. Clarke
2021-09-05 13:50:06 UTC
Permalink
On Sun, 5 Sep 2021 09:38:56 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history
Works for me. The two that go with it are that "it is easier to gain
forgiveness than permission" and "initiative is when you did something
you weren't supposed to do and it turned out good".

Much of my work is boring and annoying. When I get annoyed I automate
whatever is annoying me.
SixOverFive
2021-09-06 00:18:08 UTC
Permalink
Post by J. Clarke
On Sun, 5 Sep 2021 09:38:56 +0100, The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history.
I think Linus was also too broke to buy UNIX :-)

So, BUILD YOUR OWN.
Post by J. Clarke
Works for me. The two that go with it are that "it is easier to gain
forgiveness than permission" and "initiative is when you did something
you weren't supposed to do and it turned out good".
Much of my work is boring and annoying. When I get annoyed I automate
whatever is annoying me.
The only way to stay sane. Often the automation work
is MUCH more interesting than whatever project you're
automating :-)
J. Clarke
2021-09-06 02:38:57 UTC
Permalink
Post by SixOverFive
Post by J. Clarke
On Sun, 5 Sep 2021 09:38:56 +0100, The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history.
I think Linus was also too broke to buy UNIX :-)
So, BUILD YOUR OWN.
I understand he started out making some improvements on Minix, and
improved right into a full-blown Unix workalike.
Post by SixOverFive
Post by J. Clarke
Works for me. The two that go with it are that "it is easier to gain
forgiveness than permission" and "initiative is when you did something
you weren't supposed to do and it turned out good".
Much of my work is boring and annoying. When I get annoyed I automate
whatever is annoying me.
The only way to stay sane. Often the automation work
is MUCH more interesting than whatever project you're
automating :-)
Ahem A Rivet's Shot
2021-09-06 07:53:51 UTC
Permalink
On Sun, 05 Sep 2021 22:38:57 -0400
Post by J. Clarke
Post by SixOverFive
Post by J. Clarke
On Sun, 5 Sep 2021 09:38:56 +0100, The Natural Philosopher
Post by SixOverFive
Often, laziness was the Mother
  Of All Invention - "This SUCKS ... how can I make
  it easier/better ?".
Which was pretty much Linus' attitude, and the rest, is history.
I think Linus was also too broke to buy UNIX :-)
So, BUILD YOUR OWN.
I understand it started out making some improvements on Minix, and he
improved right into a full blown Unix workalike.
The earliest story of the origin I read (sometime before 1.0) was
that he started with a clever idea for a more efficient context switch,
developed it under Minix and built out from there picking up fellow
travellers along the way to help round it out into enough of a unix kernel
clone to run GNU tools. The first working code was said to be two
processes, one printing As, the other Bs.
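For anyone who never saw that story written up: here is a purely
illustrative sketch, in modern Python rather than anything resembling
Torvalds' original 386 assembly, of what that two-process A/B demo
amounted to (names and details are my own, not his):

    # Hypothetical re-creation of the "two processes, one printing As,
    # the other Bs" demo, using the OS scheduler instead of a
    # hand-rolled task switcher.
    import multiprocessing
    import sys
    import time

    def chatter(letter, count=40, delay=0.05):
        # Print one letter repeatedly, flushing so the interleaving shows.
        for _ in range(count):
            sys.stdout.write(letter)
            sys.stdout.flush()
            time.sleep(delay)

    if __name__ == "__main__":
        a = multiprocessing.Process(target=chatter, args=("A",))
        b = multiprocessing.Process(target=chatter, args=("B",))
        a.start()
        b.start()
        a.join()
        b.join()
        print()  # output is a mix like ABABAB..., showing the switching

The whole point of the original, of course, was that the switching
itself was the hand-written part; here the operating system does it
for us.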
--
Steve O'Hara-Smith | Directable Mirror Arrays
C:\>WIN | A better way to focus the sun
The computer obeys and wins. | licences available see
You lose and Bill collects. | http://www.sohara.org/
Andreas Kohlbach
2021-09-05 16:22:00 UTC
Permalink
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.

Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
--
Andreas
The Natural Philosopher
2021-09-05 16:26:54 UTC
Permalink
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
It's not even that - it's more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
--
Karl Marx said religion is the opium of the people.
But Marxism is the crack cocaine.
J. Clarke
2021-09-05 21:29:32 UTC
Permalink
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Dave Garland
2021-09-06 04:35:29 UTC
Permalink
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
maus
2021-09-06 17:37:11 UTC
Permalink
Post by Dave Garland
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
I agree thoroughly with the above. The problem is now that more people
are discovering it as well, and old laptops with good keyboards are
getting expensive.

A friend who visited me in hospital was praising his chromebook, so
I got one after getting home. They are far less usable than even
Windows.
J. Clarke
2021-09-06 17:46:08 UTC
Permalink
Post by maus
Post by Dave Garland
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
I agree thoroughly with the above. The problem is now that more people
are discovering it aw well, and old laptops with good keyboards are
getting expensive.
A friend who visited me in hospital was praising his chromebook, so
I got one after getting home. They are far less usable than even
Windows.
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.

You are not constrained to use the keyboard attached to the laptop you
know. I haven't touched the keyboard on the one my employer issued me
in well over a year. You also aren't constrained to use the built-in
display.
Peter Flass
2021-09-06 17:58:06 UTC
Permalink
Post by J. Clarke
Post by maus
Post by Dave Garland
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
I agree thoroughly with the above. The problem is now that more people
are discovering it aw well, and old laptops with good keyboards are
getting expensive.
A friend who visited me in hospital was praising his chromebook, so
I got one after getting home. They are far less usable than even
Windows.
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Security is a big reason.
Post by J. Clarke
You are not constrained to use the keyboard attached to the laptop you
know. I haven't touched the keyboard on the one my employer issued me
in well over a year. You also aren't constrained to use the built-in
display.
--
Pete
J. Clarke
2021-09-06 18:01:58 UTC
Permalink
Post by Peter Flass
Post by J. Clarke
Post by maus
Post by Dave Garland
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
I agree thoroughly with the above. The problem is now that more people
are discovering it aw well, and old laptops with good keyboards are
getting expensive.
A friend who visited me in hospital was praising his chromebook, so
I got one after getting home. They are far less usable than even
Windows.
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Security is a big reason.
How is having every keystroke I type sent across the Internet
more secure than having it go across a wire on my desk?

How is _anything_ that relies on the Web for everything more secure
than something that can be air-gapped and locked in a drawer when not
in use?
Post by Peter Flass
Post by J. Clarke
You are not constrained to use the keyboard attached to the laptop you
know. I haven't touched the keyboard on the one my employer issued me
in well over a year. You also aren't constrained to use the built-in
display.
Peter Flass
2021-09-07 00:42:04 UTC
Permalink
Post by J. Clarke
Post by Peter Flass
Post by J. Clarke
Post by maus
Post by Dave Garland
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
I agree thoroughly with the above. The problem is now that more people
are discovering it aw well, and old laptops with good keyboards are
getting expensive.
A friend who visited me in hospital was praising his chromebook, so
I got one after getting home. They are far less usable than even
Windows.
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Security is a big reason.
How is having having every keystroke I type sent across the Internet
more secure than having it go across a wire on my desk?
How is _anything_ that relies on the Web for everything more secure
than something that can be air-gapped and locked in a drawer when not
in use?
Some of my knowledge is dated. As originally shipped I think Chrome OS
didn't allow users to install software, and all data was stored in the
cloud, so no viruses. I think that all went away; now that Chrome runs
native, Linux, and windoze applications, a lot of the security benefits
have gone away.
Post by J. Clarke
Post by Peter Flass
Post by J. Clarke
You are not constrained to use the keyboard attached to the laptop you
know. I haven't touched the keyboard on the one my employer issued me
in well over a year. You also aren't constrained to use the built-in
display.
--
Pete
J. Clarke
2021-09-07 01:07:25 UTC
Permalink
Post by Peter Flass
Post by J. Clarke
Post by Peter Flass
Post by J. Clarke
Post by maus
Post by Dave Garland
Post by J. Clarke
On Sun, 5 Sep 2021 17:26:54 +0100, The Natural Philosopher
Post by The Natural Philosopher
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
Its not even that - its more cache, more cores, and SSD - RAM and CPU
clock speed are stuck.
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the "IT
Professionals" don't know that and think that replacing a 3 year old
machine with a new one is still going to bring about a big performance
improvement.
Sad? I guess on a global scale. But it has provided me with "good
enough" hardware for many years (what clients were throwing away
always worked fine for me, so I'd offer to recycle it for them). My
current desktop (I hate laptop usability) was $40 at a thrift store
(move the old display/keyboard/mouse over), likely a lease return. And
I could do video editing on its predecessor.
I agree thoroughly with the above. The problem is now that more people
are discovering it aw well, and old laptops with good keyboards are
getting expensive.
A friend who visited me in hospital was praising his chromebook, so
I got one after getting home. They are far less usable than even
Windows.
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Security is a big reason.
How is having having every keystroke I type sent across the Internet
more secure than having it go across a wire on my desk?
How is _anything_ that relies on the Web for everything more secure
than something that can be air-gapped and locked in a drawer when not
in use?
Some of my knowledge is dated. As originally shipped I think Chrome OS
didn’t allow users to install software, and all data was stored in the
cloud, so no viruses.
What leads you to believe that data being in the cloud means no
viruses? The viruses go on the cloud server, not your local machine,
and breach security for everybody who uses that service, not just you.
And that leaves aside the possibility that an insider in the cloud
provider might steal your data.

And that leaves totally aside "man in the middle" attacks.

Hacking me is not worth the effort--too much work, too little reward.
Hacking AWS though is very tempting, because the hacker gets not only
my data but the data of millions of others, some of whom have very
deep pockets.
Post by Peter Flass
I think that all went away, and Chrome now runs
native, Linux, and windoze applications, a lot of the security benefits
have gone away.
Which were mostly imaginary.
Post by Peter Flass
Post by J. Clarke
Post by Peter Flass
Post by J. Clarke
You are not constrained to use the keyboard attached to the laptop you
know. I haven't touched the keyboard on the one my employer issued me
in well over a year. You also aren't constrained to use the built-in
display.
Ahem A Rivet's Shot
2021-09-06 19:01:25 UTC
Permalink
On Mon, 6 Sep 2021 10:58:06 -0700
Post by Peter Flass
Post by J. Clarke
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Security is a big reason.
More like a good excuse; the industry (for that read Wall Street)
much prefers an income model based on subscriptions rather than sales
because it is predictable, which gives investors warm fuzzy feelings.
--
Steve O'Hara-Smith | Directable Mirror Arrays
C:\>WIN | A better way to focus the sun
The computer obeys and wins. | licences available see
You lose and Bill collects. | http://www.sohara.org/
maus
2021-09-08 09:22:59 UTC
Permalink
On Tue, 07 Sep 2021 20:12:52 GMT
Post by Ahem A Rivet's Shot
On Mon, 6 Sep 2021 10:58:06 -0700
Post by Peter Flass
Post by J. Clarke
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Security is a big reason.
More like a good excuse, the industry (for that read Wall
Street) much prefers an income model based on subscriptions rather than
sales because it is predictable which gives investors warm fuzzy
feelings.
I find it ironic that these huge corporations - who are no doubt
strongly right-wing - have made it their goal to eliminate private
... for other people, they rather like it for themselves.
the dream of Karl Marx.
Russia is currently powered by turbines attached to his grave.
So that is what the Nord Stream pipes are really for!
Andreas Kohlbach
2021-09-07 04:02:47 UTC
Permalink
Post by J. Clarke
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Chromebooks probably target the casual user doing Facebook, Twitter, some
chat (text, audio, video), email or some games. And they remove the virus
threat known on Windows computers. Unless an app is bugged of course.
--
Andreas
Peter Flass
2021-09-07 13:45:24 UTC
Permalink
Post by Andreas Kohlbach
Post by J. Clarke
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Chromebooks probably target the casual user doing Facebook, Twitter, some
chat (text, audio, video), email or some games. And removes the virus
thread known on Windows computers. Unless an app is bugged of course.
Schools use them a lot, too. All the apps and data are in Google Docs.
--
Pete
Andreas Kohlbach
2021-09-07 18:15:14 UTC
Permalink
Post by Peter Flass
Post by Andreas Kohlbach
Post by J. Clarke
It seems like the software industry is trying to move the world into
the chromebook model, why I don't really understand.
Chromebooks probably target the casual user doing Facebook, Twitter, some
chat (text, audio, video), email or some games. And removes the virus
thread known on Windows computers. Unless an app is bugged of course.
Schools use them a lot, too. All the apps and data are in Google Docs.
For the casual user I don't see anything wrong with that.

But if you have confidential documents or deal with other sensitive data,
Cloud Computing (and so the use of a Chromebook) is a bad idea.

Anyway, they are cheap. And if you want, install Linux on them.
--
Andreas

PGP fingerprint 952B0A9F12C2FD6C9F7E68DAA9C2EA89D1A370E0
Dave Garland
2021-09-07 05:22:23 UTC
Permalink
Post by J. Clarke
You are not constrained to use the keyboard attached to the laptop you
know. I haven't touched the keyboard on the one my employer issued me
in well over a year. You also aren't constrained to use the built-in
display.
Of course. But once you've replaced the pointing device, keyboard, and
display with decent ones, why would you want to be stuck with a
computer that has cooling issues and is difficult to economically
upgrade or repair? Unless it's on your employer's dime, of course.

I do use an old Win 7 netbook (now running Linux) as an always-on
server (with a proper keyboard/mouse/display). Its requirements
aren't great, though (mostly it just sits there and listens for the rare
times that anyone wants to talk to it), and the low-power aspect is nice.
Kerr-Mudd, John
2021-09-15 17:08:28 UTC
Permalink
On Mon, 6 Sep 2021 09:04:26 +0100
On Sun, 05 Sep 2021 17:29:32 -0400
Post by J. Clarke
And the end result is that for single-threaded code, a ten year old
machine is pretty much as good as a new one. Sad part is that the
"IT Professionals" don't know that and think that replacing a 3
year old machine with a new one is still going to bring about a big
performance improvement.
Works fine when the application is a kubernetes administered
swarm of docker images running on virtual machines connected by
virtual networks and using data on big SANs running NVMe over fabric
with only hypervisors running directly on the hardware.
That kind of architecture just loves running on lots and lots
and lots of cores that don't need to be too fruity as long as the
concurrency is good.
I'm so out of it I only recognise 'SAN' and 'virtual machine'
--
Bah, and indeed Humbug.
SixOverFive
2021-09-06 03:14:32 UTC
Permalink
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
IF ... lots of promises but Real World units are few,
weak and VERY quirky. PRICE is gawdawful too. QM
may wind up being a flash in the pan due to all the
little issues. The Spooks will probably keep a few
units for codebreaking, but that'll be about it.

Anyway, depending on the era, I'll absolve the Prof.
It was literally punch-cards/tape one month in the
high holy Computer Room and a bunch of serial terminals
in the library building the next. Minor issue - NOBODY
knew how to USE them. No instructions, no manuals. We
kinda hacked around and got some BASIC programs going,
but that was about it. This was, I think, 1979.
J. Clarke
2021-09-06 04:01:28 UTC
Permalink
Post by SixOverFive
Post by Andreas Kohlbach
Post by SixOverFive
Post by J. Clarke
Some professors are like that. I remember helping an undergrad find
the keypunch (I was a computer science grad student and I didn't even
know the school _had_ a keypunch or card reader until that came up).
Seems he was taking a programming course from somebody in the
chemistry department and that idiot insisted that his students use
cards because that's what they'd be working with in the real world. I
ended up having to go dig up an operator.
PART of the problem is that the computer world was
evolving SO fast back then. What was State Of The
Art one year was Obsolete Crap the next.
I think that is only true until the mid 80s. From 1971 to around 1984 the
address/data bus size of a (microprocessor based) CPU doubled several
times to 32-Bit. Since around 2000 mainstream hardware is stuck at
64-bit.
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
IF ... lots of promises but Real World units are few,
weak and VERY quirky. PRICE is gawdawful too. QM
may wind up being a flash in the pan due to all the
little issues. The Spooks will probably keep a few
units for codebreaking, but that'll be about it.
Anyway, depending on the era, I'll absolve the Prof.
It was literally punch-cards/tape one month in the
high holy Computer Room and a bunch of serial terminals
in the library building the next. Minor issue - NOBODY
knew how to USE them. No instructions, no manuals. We
kinda hacked around and got some BASIC programs going,
but that was about it. This was, I think, 1979.
The school had had 3270s as long as any current students could
remember. And the VAX had had CRTs long enough for there to be a song
about them on campus sufficiently well known that a sophomore
journalism major knew it.
Andreas Kohlbach
2021-09-06 20:08:57 UTC
Permalink
Post by SixOverFive
Post by Andreas Kohlbach
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
IF ... lots of promises but Real World units are few,
weak and VERY quirky. PRICE is gawdawful too. QM
may wind up being a flash in the pan due to all the
little issues. The Spooks will probably keep a few
units for codebreaking, but that'll be about it.
Is your real name Thomas J. Watson by chance? ;-)

He worked as CEO at IBM and said in 1943 "I think there is a world market
for maybe five computers". ;-)

Today some tech companies have them to let customers use them in the
cloud. Again back to the late 70s and before, when people used cloud
computing *ahem* time sharing, because no ordinary user was able to
purchase a mainframe. In a decade or two the phone in your pocket (or maybe
transplanted into your brain by then) may run on a quantum computer.
--
Andreas
maus
2021-09-06 21:09:24 UTC
Permalink
Post by Andreas Kohlbach
Post by SixOverFive
Post by Andreas Kohlbach
Today it's more RAM, faster CPU, more RAM, faster CPU... - how
boring. That should only change once Quantum-Computing becomes
mainstream.
IF ... lots of promises but Real World units are few,
weak and VERY quirky. PRICE is gawdawful too. QM
may wind up being a flash in the pan due to all the
little issues. The Spooks will probably keep a few
units for codebreaking, but that'll be about it.
Is you real name Thomas J. Watson by chance? ;-)
He worked as CEO at IBM and said 1943 "I think there is a world market
for maybe five computers". ;-)
Today some tech companies have them to let customers using them in the
cloud. Again back to the late 70s and before, when people used cloud
computing *ahem* time sharing, because no ordinary user was able to
purchase a mainframe. In a decade or two the phone in your pocket (or may be
transplanted in brains by then) runs on a Quantum-Computer.
I believe the head of the Bank of England once remarked, `Why phones?
We have lots of messengers'.
Rich Alderson
2021-09-08 20:14:13 UTC
Permalink
Post by Andreas Kohlbach
Is you real name Thomas J. Watson by chance? ;-)
He worked as CEO at IBM and said 1943 "I think there is a world market
for maybe five computers". ;-)
No, $DEITY DAMN IT, he did NOT!

That entire meme is a misrepresentation of what he actually said in IBM's
annual report to stockholders in 1952. In his report, he stated that he
accompanied a presentation of the brand new IBM 701 computer to 20 scientific
laboratories; the expected sales from this dog-and-pony show was about 5
systems. He was happy to report that there were orders for *19* of them.

The report can be found on the Web. Look it up for yourself; I don't have the
time.

But please, PLEASE, PLLLEEEAASSSSSSE, stop spreading this stupid lie.
--
Rich Alderson ***@alderson.users.panix.com
Audendum est, et veritas investiganda; quam etiamsi non assequamur,
omnino tamen proprius, quam nunc sumus, ad eam perveniemus.
--Galen
Questor
2021-09-03 19:34:53 UTC
Permalink
Post by SixOverFive
But I did mostly miss the mainframe/mini days ...
only had to use punchcards/paper-tape ONCE in a
college class (which I dropped out of because
the school already had serial terminals that'd
do all that PLUS). The class was years behind
the reality
I'm glad I had the opportunity to have some first-hand experience with the
"submit a deck of punched cards" model of computing in college. I'm even
gladder that timesharing was an option and I didn't have to use punched
cards very long.

I also used paper tape as one of the boot loaders for a DEC KI-10.

I'm thankful I got to use those things and get my feet wet, so to speak, in that
part of computing history. It's even better that those experiences were
peripheral, not central, to my computing activities.
Mike Spencer
2021-09-03 20:43:48 UTC
Permalink
Post by SixOverFive
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........
Try 110 baud with an ASR-33. Switching to 300 baud on
a LA-120 was blazing fast...
Would that be the same as the DEC-Writer II? Somebody gave me one of
those w/ an acoustic coupler circa 1990 that I lugged home and
connected just for entertainment value. IIRC, it had a 110/300
selector switch on the keyboard. I used it a few times to connect to
a dialup BBS that wasn't very fast anyhow. The optical whatsit with
the slotted wheel finally gave up so its only operational mode was
"slam the printer head against the left stop and hold it there."

I still have the cute slotted wheel somewhere and the base now
supports my computer desk.
--
Mike Spencer Nova Scotia, Canada
Scott Lurndal
2021-09-03 20:51:34 UTC
Permalink
Post by Mike Spencer
Post by SixOverFive
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........
Try 110 baud with an ASR-33. Switching to 300 baud on
a LA-120 was blazing fast...
Would that be the same as the DEC-Writer II?
LA-120 was the DECwriter III.
Quadibloc
2021-09-05 04:29:29 UTC
Permalink
Post by Scott Lurndal
Post by Mike Spencer
Post by SixOverFive
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........
Try 110 baud with an ASR-33. Switching to 300 baud on
a LA-120 was blazing fast...
Would that be the same as the DEC-Writer II?
LA-120 was the DECwriter III.
And the DECwriter II was the LA-36.

John Savard
Mike Spencer
2021-09-05 07:19:45 UTC
Permalink
Post by Quadibloc
Post by Scott Lurndal
Post by Mike Spencer
Would that be the same as the DEC-Writer II?
LA-120 was the DECwriter III.
And the DECwriter II was the LA-36.
Right. Thanks. I still have the user manual here somewhere but I
can't find it.
--
Mike Spencer Nova Scotia, Canada
gareth evans
2021-09-04 14:46:54 UTC
Permalink
Post by Mike Spencer
Post by SixOverFive
Heh ... yea yea ... I *remember*. Had one of those
'acoustic modems' in the beginning - you literally
squished the phone handpiece into them. My first
1200 baud was Anchor Robotics. VASTLY better. At
300 baud you could actually read the text real-time
as it came in. Oh well, there WERE slower baud rates
before then .........
Try 110 baud with an ASR-33. Switching to 300 baud on
a LA-120 was blazing fast...
Would that be the same as the DEC-Writer II? Somebody gave me one of
those w/ an acoustic coupler circa 1990 that I lugged home and
connected just for entertainment value. IIRC, it had a 110/300
selector switch on the keyboard. I used it a few times to connect to
a dialup BBS that wasn't very fast anyhow. The optical whatsit with
the slotted wheel finally gave up so its only operational mode was
"slam the printer head against the left stop and hold it there."
I still have the cute slotted wheel somewhere and the base now
supports my computer desk.
The DECwriter was delivered on a pallet that had solid sheets
of plywood rather than slats. As a result, when we first married
47 years ago, our first bookcase, which we still have and use,
was made from those sheets of plywood!

After the clanking Teletypes, how we praised the DECwriter for its
quietness, but perhaps nowadays we'd think it a noisy thing!
Kurt Weiske
2021-09-03 15:00:00 UTC
Permalink
To: SixOverFive
-=> SixOverFive wrote to alt.folklore.computers,comp.os.linux.misc <=-

Si> But I did mostly miss the mainframe/mini days ...
Si> only had to use punchcards/paper-tape ONCE in a
Si> college class (which I dropped out of because
Si> the school already had serial terminals that'd
Si> do all that PLUS). The class was years behind
Si> the reality ......

I had a high school teacher insist we spend a day writing punch card decks
on an old system he'd brought in. He said "trust me, one day this will make
sense..."

So, I get to say I started out on punch cards. :)

My first job in college was an homage to the old mainframe days. The
bookstore had a computer room with a raised floor, a Microdata midrange
system running a variant of PICK OS, 48 serial terminals, a line printer and
a 9-track tape drive for nightly backups. Spent the first years of my career
feeding 11x17 greenbar paper and tapes into the beast.



... Retrace your steps
--- MultiMail/DOS v0.52
--- Synchronet 3.19a-Win32 NewsLink 1.113
* realitycheckBBS - Aptos, CA - telnet://realitycheckbbs.org
J. Clarke
2021-09-04 19:09:36 UTC
Permalink
On Fri, 3 Sep 2021 08:00:00 -0700, "Kurt Weiske"
Post by Kurt Weiske
To: SixOverFive
-=> SixOverFive wrote to alt.folklore.computers,comp.os.linux.misc <=-
Si> But I did mostly miss the mainframe/mini days ...
Si> only had to use punchcards/paper-tape ONCE in a
Si> college class (which I dropped out of because
Si> the school already had serial terminals that'd
Si> do all that PLUS). The class was years behind
Si> the reality ......
I had a high school teacher insist we spend a day writing punch card decks
on an old system he'd brought in. He said "trust me, one day this will make
sense..."
So, I get to say I started out on punch cards. :)
My first job in college was an homage to the old mainframe days. The
bookstore had a computer room with a raised floor, a Microdata midrange
system running a variant of PICK OS, 48 serial terminals, a line printer and
a 9-track tape drive for nightly backups. Spent the first years of my career
feeding 11x17 greenbar paper and tapes into the beast.
Where was that? I thought OSU was a big school, but I don't recall
their bookstore having anything near that elaborate.
Post by Kurt Weiske
... Retrace your steps
--- MultiMail/DOS v0.52
--- Synchronet 3.19a-Win32 NewsLink 1.113
* realitycheckBBS - Aptos, CA - telnet://realitycheckbbs.org
Charlie Gibbs
2021-09-07 20:12:49 UTC
Permalink
Post by Kurt Weiske
I had a high school teacher insist we spend a day writing punch card decks
on an old system he'd brought in. He said "trust me, one day this will make
sense..."
So, I get to say I started out on punch cards. :)
My first job in the Real World was in a shop that had a machine
with 16K of RAM and no disks or tapes - only cards. At least
we had three card readers so we didn't need a collator to merge
input decks - or a sorter to separate them again afterwards.
--
/~\ Charlie Gibbs | They don't understand Microsoft
\ / <***@kltpzyxm.invalid> | has stolen their car and parked
X I'm really at ac.dekanfrus | a taxi in their driveway.
/ \ if you read it the right way. | -- Mayayana
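For anyone who never met a collator: the job it did (and the job three card
readers plus a little program logic can do instead) is simply a k-way merge
of decks that are already sorted on a key field. A minimal sketch of that
idea follows; the card images and the five-column key are invented for
illustration, not taken from the post above.

    # Illustrative sketch only: merging pre-sorted "card decks" in software,
    # the way a collator (or a program fed by several card readers) would.
    # The card images and the 5-column key field are made up for the example.
    import heapq

    deck_a = ["00010 JONES", "00040 SMITH"]
    deck_b = ["00020 BROWN", "00050 WILSON"]
    deck_c = ["00030 TAYLOR"]

    # Treat the first five columns of each card as the sort key.
    merged = heapq.merge(deck_a, deck_b, deck_c, key=lambda card: card[:5])

    for card in merged:
        print(card)

The output interleaves the three decks into one sequence ordered by the key,
which is all the hardware collator was doing, one card at a time.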