Discussion:
August 7, 1944: today is the 65th Anniversary of the Birth of the Computer
Quadibloc
2009-08-07 15:07:18 UTC
It wasn't electronic.

It didn't get a conditional branch instruction until much later.

But it was the first working digital computer whose existence was
generally known to the public - as work on it had been made public in
1939, before the U.S. entered World War II. Howard Aiken saw his
machine as "Babbage's dream come true", which means that Charles
Babbage was not, as some have claimed, totally obscure in the computer
field until being rediscovered in the late 1960s. (And I have a copy
of the Encyclopedia Britannica, predating the book that made such a
claim, which says that the connection between Hamilton's quaternions
and vector calculus wasn't an obscure secret that needed rediscovering
either.)

Torres y Quevedo nearly gave the world the computer decades before,
but it was not to be.

John Savard
Chris Barts
2009-08-10 10:41:25 UTC
Post by Quadibloc
It wasn't electronic.
It didn't get a conditional branch instruction until much later.
Then I'd deny that it's a 'computer' in the modern sense of the
word. Under that definition we could call the abacus a computer: It
isn't electronic or Turing-complete, either, but it still 'works with
numbers' in some ill-defined sense.

This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe. You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
jmfbahciv
2009-08-10 11:46:03 UTC
Post by Chris Barts
Post by Quadibloc
It wasn't electronic.
It didn't get a conditional branch instruction until much later.
Then I'd deny that it's a 'computer' in the modern sense of the
word. Under that definition we could call the abacus a computer: It
isn't electronic or Turing-complete, either, but it still 'works with
numbers' in some ill-defined sense.
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe.
Quite the contrary: "personal" was redefined when those little
toys started to be sold a decade later.
Post by Chris Barts
You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
People who used TOPS-10 viewed the system as _their_ computer.
The services provided by that computer system gave users the
impression that the whole computer was available to them.
If that ain't personal, I wouldn't know what is.

/BAH
p***@spamlessxhost.org
2009-08-10 13:54:30 UTC
Post by jmfbahciv
People who used TOPS-10 viewed the system as _their_ computer.
The services provided by that computer system gave users the
impression that the whole computer was available to them.
If that ain't personal, I wouldn't know what is.
That ain't "personal". If it were "personal", the whole computer *would* be
available to them, and only them, and it would be small enough to be kept in a
normal *personal* space.

-- Larry
Quadibloc
2009-08-11 07:31:41 UTC
Post by p***@spamlessxhost.org
That ain't "personal". If it were "personal", the whole computer *would* be
available to them, and only them, and it would be small enough to be kept in a
normal *personal* space.
All right, that does eliminate the Bendix G-15 or the IBM 1620.

There's still the Honeywell 316, except that Neiman-Marcus never sold
one.

The Wang 500 programmable calculator might be considered... and the
PDP-8 almost qualified, particularly the 8/e.

Despite the Mark-8, one can still note that it was the Altair 8800
that made the revolution. It had the right combination of power and
affordability. That was 1975. Only 6 years later, in 1981, the
revolution was consolidated (and/or co-opted, depending on your point
of view) with the IBM PC... and then, in 1984, the wonders presaged
first by experimental machines from Xerox, then by Apple's Lisa,
became affordable with the Macintosh.

Somewhat later, Windows 3.1 came out, and the rest is the history
we're living in.

John Savard
Chris Barts
2009-08-11 11:49:05 UTC
Post by Quadibloc
Post by p***@spamlessxhost.org
That ain't "personal". If it were "personal", the whole computer
*would* be available to them, and only them, and it would be small
enough to be kept in a normal *personal* space.
All right, that does eliminate the Bendix G-15 or the IBM 1620.
Yes. Thank you. ;)
Post by Quadibloc
There's still the Honeywell 316, except that Neiman-Marcus never sold
one.
True.
Post by Quadibloc
The Wang 500 programmable calculator might be considered... and the
PDP-8 almost qualified, particularly the 8/e.
The PDP-8 was still too expensive to really be a /useful/ personal
computer, especially if you think that disk storage and enough RAM to
be useful are essential attributes of a workable computer system.
Post by Quadibloc
Despite the Mark-8, one can still note that it was the Altair 8800
that made the revolution. It had the right combination of power and
affordability. That was 1975.
Right.

Just as a point of reference, in 1987 The Computer History Museum named
the Kenbak-1 the First Personal Computer. Around 40 were produced at
about $750 each in 1971. It was a TTL design with no microprocessor
(entirely hand-made CPU) and it had 256 bytes of RAM.

http://www.kenbak-1.net/
http://www.kenbak-1.net/index_files/page0005.htm

* Opcode reference: Yes, it *did* have conditional jumps!

http://www.vintage-computer.com/kenbak-1.shtml
Post by Quadibloc
Somewhat later, Windows 3.1 came out, and the rest is the history
we're living in.
There are high-schoolers who don't remember Windows 3.1 and who have
never seen a desktop computer without a mouse. In short: Yer old! :)
Walter Bushell
2009-08-10 14:50:02 UTC
Post by jmfbahciv
Post by Chris Barts
Post by Quadibloc
It wasn't electronic.
It didn't get a conditional branch instruction until much later.
Then I'd deny that it's a 'computer' in the modern sense of the
word. Under that definition we could call the abacus a computer: It
isn't electronic or Turing-complete, either, but it still 'works with
numbers' in some ill-defined sense.
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe.
Quite the contrary: "personal" was redefined when those little
toys started to be sold a decade later.
Post by Chris Barts
You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
People who used TOPS-10 viewed the system as _their_ computer.
The services provided by that computer system gave users the
impression that the whole computer was available to them.
If that ain't personal, I wouldn't know what is.
/BAH
Did not. Performance varied wildly as load changed. I ended up working
nights basically because there was no way to run my programs in the
daytime.
Charles Richmond
2009-08-10 16:53:54 UTC
Post by Walter Bushell
[snip...] [snip...] [snip...]
People who used TOPS-10 viewed the system as _their_ computer.
The services provided by that computer system gave users the
impression that the whole computer was available to them.
If that ain't personal, I wouldn't know what is.
/BAH
Did not. Performance varied wildly as load changed. I ended up working
nights basically because there was no way to run my programs in the
daytime.
At the college I attended, there was access to a PDP-10 from MCRC
(Medical Computing Resource Center, part of the Southwestern
Medical School in Dallas). The max recommended number of users was
64, but the site allowed 80 users at a time. It was s-o s-l-o-w
that working with it was impossible. (I did *not* try 3 am...)

Later our college got a DECsystem 20 and things were much better.
Few folks at the college knew how to use it, so it was *not*
loaded down.

As for a "personal computer", definitions vary wildly and ISTM
that <a.f.c.> will *never* reach a consensus on a single
definition. Ditto for the definition of a "computer" at all.

It is my opinion that the "first computer" may have been Konrad
Zuse's machines in Nazi Germany. They were general-purpose machines
built from relays: electric, but *not* electronic.
--
+----------------------------------------+
| Charles and Francis Richmond |
| |
| plano dot net at aquaporin4 dot com |
+----------------------------------------+
Anne & Lynn Wheeler
2009-08-10 17:13:00 UTC
Post by Charles Richmond
At the college I attended, there was access to a PDP-10 from MCRC
(Medical Computing Resource Center, part of the Southwestern
Medical School in Dallas). The max recommended number of users was
64, but the site allowed 80 users at a time. It was s-o s-l-o-w
that working with it was impossible. (I did *not* try 3 am...)
Later our college got a DECsystem 20 and things were much better.
Few folks at the college knew how to use it, so it was *not*
loaded down.
re:
http://www.garlic.com/~lynn/2009l.html#8 August 7, 1944: today is the 65th Anniversary of the Birth of the Computer
http://www.garlic.com/~lynn/2009l.html#9 August 7, 1944: today is the 65th Anniversary of the Birth of the Computer
http://www.garlic.com/~lynn/2009l.html#10 August 7, 1944: today is the 65th Anniversary of the Birth of the Computer

I had done a huge amount of pathlength optimization, general I/O thruput
optimization, page replacement algorithm optimization and page I/O
optimization for cp67.

I also did dynamic adaptive resource management ... frequently referred
to as fair share scheduling ... because the default policy was resource
fair share.
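
the basic idea, as a toy sketch in C (hypothetical names, nothing like
the actual cp67 internals): each user has an entitled share and a
decayed record of consumed resource, and the dispatcher picks the
runnable user with the lowest consumed-to-share ratio.

  #include <stddef.h>

  struct user {
      double share;      /* entitled fraction of the machine */
      double consumed;   /* decayed record of resource consumed */
      int    runnable;
  };

  /* return index of the next user to dispatch, or -1 if none runnable */
  int pick_next(const struct user *u, size_t n)
  {
      int best = -1;
      double best_ratio = 0.0;
      for (size_t i = 0; i < n; i++) {
          if (!u[i].runnable)
              continue;
          double ratio = u[i].consumed / u[i].share;
          if (best < 0 || ratio < best_ratio) {
              best = (int)i;
              best_ratio = ratio;
          }
      }
      return best;
  }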

Grenoble science center had 360/67 similar to cambridge science center
machine ... but had 1mbyte of real storage instead of only 768kbytes
(the Grenoble machine netted 50% more memory for paging ... after cp67
fixed-storage requirements ... than the cambridge machine).

At one point, Grenoble took cp67 and modified it to implement a
relatively straight forward "working set" dispatching ... pretty much as
described in the computer literature of the time ... and published an
article in ACM with some amount of detailed workload, performance and
thruput study.
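
the "working set" notion from that literature is roughly: a page is in
a process's working set if it was referenced within the last tau time
units, and a process is only dispatched if its working set fits in
real storage. a toy sketch (made-up names, not the grenoble code):

  /* count pages referenced within the window ending at virtual time
     `now`; TAU is the working-set window, a tuning parameter. */
  #define TAU 100000L

  struct page {
      long last_ref;    /* virtual time of last reference */
  };

  long working_set_size(const struct page *p, long npages, long now)
  {
      long in_set = 0;
      for (long i = 0; i < npages; i++)
          if (now - p[i].last_ref <= TAU)
              in_set++;
      return in_set;
  }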

It turned out that the (modified) Grenoble cp67 system with 35 users
(and 50% more real storage for paging), running nearly the same
workload, got about the same thruput as the Cambridge cp67 system did
with 80 users (and my dynamic adaptive resource management).

On the Cambridge system, trivial interactive response degraded much more
gracefully ... even under extremely heavy workloads (including during
extended periods of 100% cpu utilization, high i/o activity and high
paging i/o activity).

misc. past posts mentioning cambridge science center
http://www.garlic.com/~lynn/subtopic.html#545tech

misc. past posts mentioning grenoble science center:
http://www.garlic.com/~lynn/93.html#7 HELP: Algorithm for Working Sets (Virtual Memory)
http://www.garlic.com/~lynn/94.html#1 Multitasking question
http://www.garlic.com/~lynn/99.html#18 Old Computers
http://www.garlic.com/~lynn/2001h.html#26 TECO Critique
http://www.garlic.com/~lynn/2001l.html#6 mainframe question
http://www.garlic.com/~lynn/2002c.html#49 Swapper was Re: History of Login Names
http://www.garlic.com/~lynn/2002o.html#30 Computer History Exhibition, Grenoble France
http://www.garlic.com/~lynn/2002q.html#24 Vector display systems
http://www.garlic.com/~lynn/2003f.html#50 Alpha performance, why?
http://www.garlic.com/~lynn/2004.html#25 40th anniversary of IBM System/360 on 7 Apr 2004
http://www.garlic.com/~lynn/2004c.html#59 real multi-tasking, multi-programming
http://www.garlic.com/~lynn/2004g.html#13 Infiniband - practicalities for small clusters
http://www.garlic.com/~lynn/2004q.html#73 Athlon cache question
http://www.garlic.com/~lynn/2005d.html#37 Thou shalt have no other gods before the ANSI C standard
http://www.garlic.com/~lynn/2005d.html#48 Secure design
http://www.garlic.com/~lynn/2005f.html#47 Moving assembler programs above the line
http://www.garlic.com/~lynn/2005h.html#10 Exceptions at basic block boundaries
http://www.garlic.com/~lynn/2005h.html#15 Exceptions at basic block boundaries
http://www.garlic.com/~lynn/2005n.html#23 Code density and performance?
http://www.garlic.com/~lynn/2006e.html#7 About TLB in lower-level caches
http://www.garlic.com/~lynn/2006e.html#37 The Pankian Metaphor
http://www.garlic.com/~lynn/2006f.html#0 using 3390 mod-9s
http://www.garlic.com/~lynn/2006i.html#31 virtual memory
http://www.garlic.com/~lynn/2006i.html#36 virtual memory
http://www.garlic.com/~lynn/2006i.html#37 virtual memory
http://www.garlic.com/~lynn/2006i.html#42 virtual memory
http://www.garlic.com/~lynn/2006j.html#1 virtual memory
http://www.garlic.com/~lynn/2006j.html#17 virtual memory
http://www.garlic.com/~lynn/2006j.html#25 virtual memory
http://www.garlic.com/~lynn/2006l.html#14 virtual memory
http://www.garlic.com/~lynn/2006o.html#11 Article on Painted Post, NY
http://www.garlic.com/~lynn/2006q.html#19 virtual memory
http://www.garlic.com/~lynn/2006q.html#21 virtual memory
http://www.garlic.com/~lynn/2006r.html#34 REAL memory column in SDSF
http://www.garlic.com/~lynn/2006w.html#46 The Future of CPUs: What's After Multi-Core?
http://www.garlic.com/~lynn/2007i.html#15 when was MMU virtualization first considered practical?
http://www.garlic.com/~lynn/2007s.html#5 Poster of computer hardware events?
http://www.garlic.com/~lynn/2007u.html#79 IBM Floating-point myths
http://www.garlic.com/~lynn/2007v.html#32 MTS memories
http://www.garlic.com/~lynn/2008c.html#65 No Glory for the PDP-15
http://www.garlic.com/~lynn/2008h.html#70 New test attempt
http://www.garlic.com/~lynn/2008h.html#79 Microsoft versus Digital Equipment Corporation
http://www.garlic.com/~lynn/2008r.html#21 What if the computers went back to the '70s too?
--
40+yrs virtualization experience (since Jan68), online at home since Mar1970
jmfbahciv
2009-08-11 11:46:06 UTC
Post by Walter Bushell
Post by jmfbahciv
Post by Chris Barts
Post by Quadibloc
It wasn't electronic.
It didn't get a conditional branch instruction until much later.
Then I'd deny that it's a 'computer' in the modern sense of the
word. Under that definition we could call the abacus a computer: It
isn't electronic or Turing-complete, either, but it still 'works with
numbers' in some ill-defined sense.
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe.
Quite the contrary: "personal" was redefined when those little
toys started to be sold a decade later.
Post by Chris Barts
You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
People who used TOPS-10 viewed the system as _their_ computer.
The services provided by that computer system gave users the
impression that the whole computer was available to them.
If that ain't personal, I wouldn't know what is.
/BAH
Did not. Performance varied wildly as load changed.
And where did I say that performance was instantaneous? At least
it was "faster" than fucking MS crap.
Post by Walter Bushell
I ended up working
nights basically because there was no way to run my programs in the
daytime.
You still had all the resources available to you, didn't you? You
didn't have to wait in a human line to have your turn. You also
could do your small computing tasks at other times.

/BAH
Anne & Lynn Wheeler
2009-08-10 11:49:30 UTC
Post by Chris Barts
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe. You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
I was differentiating between "personal computer" and "personal
computing" ... where CMS provided a virtual machine "personal computing"
environment (it wasn't just CMS personal computing sessions on shared
mainframe, it was CMS virtual machine personal computing sessions on
shared mainframe).

There is some resurgence of virtual machine use ... with some number of
shared virtual machine sessions concurrent on real machines.

CMS was originally developed as a single-individual, personal computing
environment on a dedicated 360/40 (non-shared) personal computer
(mainframe, individual sitting at the dedicated 360/40 1052-7 keyboard).

CMS development went on concurrently (on real 360/40, non-shared
personal computer mainframe) with the 360/40 hardware changes to support
virtual memory and cp40 development to support virtual machines. When
cp40 virtual machine support was far enough along, CMS personal
computing development was able to move into a virtual machine
environment (not all that different than a lot of virtual machine
activity today).

Later, the science center acquired a 360/67 (that came standard with
virtual memory hardware support) ... and virtual machine cp40 morphed
into virtual machine cp67. however, cms design point continued to pretty
much remain the 360/40 dedicated personal computing environment ... that
happened to be running in a virtual machine.

A lot of the early CMS personal computing characteristics had come from
CTSS ... which I've claimed both UNIX and MULTICS trace back to (as a
common ancestor). I've commented that in the late 80s, I had to deal
with some unix (scheduling) code that was nearly identical to some cp67
code that I had replaced two decades earlier (and conjectured that
possibly both had inherited it from CTSS).

for some additional details ... see Melinda's history ... which goes
into more detail regarding CTSS, MULTICS, and early virtual machine
period:
http://www.princeton.edu/~melinda/
--
40+yrs virtualization experience (since Jan68), online at home since Mar1970
Anne & Lynn Wheeler
2009-08-10 13:05:50 UTC
re:
http://www.garlic.com/~lynn/2009l.html#8 August 7, 1944: today is the 65th Anniversary of the Birth of the Computer

the traditional cms virtual machine configuration definition basically
specified the mapping of virtual devices to real ones. the basic cms configuration was
the 360/40 dedicated personal computing mainframe, 1052-7 keyboard,
256kbytes (virtual) memory, 2540 card reader, 2540 card punch, 1403
printer, 2311 (mini-)disks.

cp40 (and later cp67) virtual machine support handled the translation of
2741 keyboard/terminal to emulated mainframe 1052-7 keyboard. It also
handled mapping of the virtual unit record devices to real devices
(again, analogous to what goes on in most current day virtual machine
implementations).
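
in outline (a toy sketch with made-up names, nothing like the real
cp67 tables): the hypervisor intercepts i/o to a virtual device
address and forwards it to the backing real device, applying a
per-device-type translation (e.g. presenting a real 2741 line to the
guest as a 1052-7 console).

  struct vdev_map {
      unsigned short vaddr;        /* device address the guest sees */
      unsigned short raddr;        /* backing real device address */
      int (*xlate)(void *req);     /* per-device-type translation hook */
  };

  /* find the mapping entry backing a guest i/o request, or 0 if none */
  const struct vdev_map *vdev_lookup(const struct vdev_map *map, int n,
                                     unsigned short vaddr)
  {
      for (int i = 0; i < n; i++)
          if (map[i].vaddr == vaddr)
              return &map[i];
      return 0;
  }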

in the 70s, there were some pathlength shortcuts added to cp67 with
cms having support to differentiate whether it was running on a "real"
machine (or virtual equivalent) ... or a virtual machine with the
pathlength shortcuts. however, cms could still be ipled/booted and
executed on a dedicated, real-machine, personal computer mainframe (not
being shared for any other purpose). it was part of the morph to vm370
that cms was artificially crippled so that it would no longer ipl/boot
on a real machine ... along with changing cms from "cambridge monitor
system" to "conversational monitor system".
--
40+yrs virtualization experience (since Jan68), online at home since Mar1970
Anne & Lynn Wheeler
2009-08-10 14:29:28 UTC
re:
http://www.garlic.com/~lynn/2009l.html#8 August 7, 1944: today is the 65th Anniversary of the Birth of the Computer
http://www.garlic.com/~lynn/2009l.html#9 August 7, 1944: today is the 65th Anniversary of the Birth of the Computer

as an aside, by jan68 when 3 people came out from the science center to
install cp67 at the univ., I had thousands of hrs of hands-on "personal
computer" time using 360/30 and 360/67 mainframes.

earlier, the univ. had 709 running ibsys ... and installed a 360/30 as
part of plan to move from 709/ibsys to 360/67 running tss/360. the
360/30 had 1401 hardware emulation mode for the 1401 it replaced (as
unit-record front-end for 709). It was planned that the 360/30 would
be used for some amount of time in 360 mode ... to gain 360 experience
at the univ.

During this period ... the univ would also shut down the computing center
at 8am saturday ... and wouldn't re-open until 8am monday. As a result,
they would let me have the computer center and everything in it for
48hrs straight (monday classes were a little hard, having not slept for
got to use the 64kbyte 360/30 as my personal computer for 48hrs straight
... it was similar to later 64kbyte personal computers ... except in a
larger form factor. Later, the 360/30 was replaced with 360/67 ... and I
was still allowed to have the computing center all to myself for the
weekend and used the 768kbyte 360/67 as my personal computer.

in any case, by the time cp67 was installed at the univ. ... I already
had thousands of hrs of hands-on mainframe personal computer
experience.

misc. past posts mentioning having use of 360/30 as my personal computer
http://www.garlic.com/~lynn/98.html#55 Multics
http://www.garlic.com/~lynn/2004d.html#10 IBM 360 memory
http://www.garlic.com/~lynn/2004g.html#0 Usenet invented 30 years ago by a Swede?
http://www.garlic.com/~lynn/2005b.html#18 CAS and LL/SC
http://www.garlic.com/~lynn/2005b.html#54 The mid-seventies SHARE survey
http://www.garlic.com/~lynn/2005h.html#35 Systems Programming for 8 Year-olds
http://www.garlic.com/~lynn/2005n.html#8 big endian vs. little endian, why?
http://www.garlic.com/~lynn/2006k.html#27 PDP-1
http://www.garlic.com/~lynn/2006o.html#43 "25th Anniversary of the Personal Computer"
http://www.garlic.com/~lynn/2008r.html#19 What if the computers went back to the '70s too?
--
40+yrs virtualization experience (since Jan68), online at home since Mar1970
Chris Barts
2009-08-11 12:04:51 UTC
Post by Anne & Lynn Wheeler
Post by Chris Barts
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe. You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
I was differentiating between "personal computer" and "personal
computing" ... where CMS provided a virtual machine "personal computing"
environment (it wasn't just CMS personal computing sessions on shared
mainframe, it was CMS virtual machine personal computing sessions on
shared mainframe).
This makes more sense. I still think that 'personal' implies much
greater individual control over hardware and software (installing your
own system software, no disk quotas, etc.), and the ability to develop
and modify your own software with nobody looking over your shoulder
(or inspecting your data sets) to make sure your developments aren't
going to destroy the company or offend a government.

My example, as you recall, was the development of PGP, strong
encryption that, for a while, was a legally touchy piece of software:
The government *really* *really* wanted it to be illegal for
individuals to use encryption good enough to keep them out.

Had Zimmermann been developing PGP on a shared computer he did not own,
like a 370 running VM/370, he could have been effectively shut down by
the owner of the computer denying him resources. Since he owned his
own computer, though, nobody could do that to him.
Post by Anne & Lynn Wheeler
There is some resurgence of virtual machine use ... with some number of
shared virtual machine sessions concurrent on real machines.
This is true. It's probably related to the cycle of reincarnation.
Post by Anne & Lynn Wheeler
CMS was originally developed as a single-individual, personal computing
environment on a dedicated 360/40 (non-shared) personal computer
(mainframe, individual sitting at the dedicated 360/40 1052-7 keyboard).
That sounds like the most expensive PC ever. Did they ever think they
could get companies to buy an OS like that?

<snip>
Post by Anne & Lynn Wheeler
for some additional details ... see Melinda's history ... which goes
into more detail regarding CTSS, MULTICS, and early virtual machine
http://www.princeton.edu/~melinda/
jmfbahciv
2009-08-11 12:23:16 UTC
Post by Chris Barts
Post by Anne & Lynn Wheeler
Post by Chris Barts
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe. You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
I was differentiating between "personal computer" and "personal
computing" ... where CMS provided a virtual machine "personal computing"
environment (it wasn't just CMS personal computing sessions on shared
mainframe, it was CMS virtual machine personal computing sessions on
shared mainframe).
This makes more sense. I still think that 'personal' implies much
greater individual control over hardware and software (installing your
own system software, no disk quotas, etc.), and the ability to develop
and modify your own software with nobody looking over your shoulder
(or inspecting your data sets) to make sure your developments aren't
going to destroy the company or offend a government.
Anybody who bought a PDP-nn could do that in the 60s and 70s. :-)
Post by Chris Barts
My example, as you recall, was the development of PGP, strong
encryption that, for a while, was a legally touchy piece of software:
The government *really* *really* wanted it to be illegal for
individuals to use encryption good enough to keep them out.
Had Zimmermann been developing PGP on a shared computer he did not own,
like a 370 running VM/370, he could have been effectively shut down by
the owner of the computer denying him resources. Since he owned his
own computer, though, nobody could do that to him.
One of the reasons computer manufacturers made money was supplying
the "basic" (not the interpreter) software so that bright people
didn't have to reinvent the wheel and could go on to make new
stuff.
Post by Chris Barts
Post by Anne & Lynn Wheeler
There is some resurgence of virtual machine use ... with some number of
shared virtual machine sessions concurrent on real machines.
This is true. It's probably related to the cycle of reincarnation.
Post by Anne & Lynn Wheeler
CMS was originally developed as a single-individual, personal computing
environment on a dedicated 360/40 (non-shared) personal computer
(mainframe, individual sitting at the dedicated 360/40 1052-7 keyboard).
That sounds like the most expensive PC ever. Did they ever think they
could get companies to buy an OS like that?
Back then companies didn't buy an OS. They bought hardware and the
software came with it. Just like you can buy a car and the color
comes with it; you don't have to paint it yourself.

<snip>

/BAH

Ben Pfaff
2009-08-10 17:38:34 UTC
Post by Chris Barts
This touches on a pet peeve of mine that I've already bugged Anne &
Lynne Wheeler about: Claiming to be the 'first' with something by
redefining some essential term. The Wheelers did it by claiming IBM
was 'first' with personal computing by redefining 'personal' to
include an individual's CMS sessions on a shared mainframe. You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term. This is
'history as boosterism' or 'history as propaganda' and I dislike
remaining quiet while it's being done in my presence.
A virtual machine on a shared mainframe isn't what we would call
a personal computer today. But I do think that it might have
been thought of as a personal computer at the time. I wrote
about this in my thesis (http://benpfaff.org/papers/thesis.pdf):

An article about virtual machines in 1970 contrasted the two
models this way [9]:

Remember the bad old days when you could sit at the console
and develop programs without being bothered by a horde of
time-hungry types? Then things got worse and they closed
the door and either you took a 24 or 48 hour turnaround, or
they let you have 15 minutes at 1:15 AM on Sunday night.

[...]

Once time-sharing became the goal, the next question was how
to design the user interface for these new time-sharing
systems. To anyone of the era who had had the opportunity to use
a machine interactively, the obvious answer was that it should
look as though the user had a computer to himself. The early
discussions of time-sharing systems emphasized this aspect. For
example, in a 1962 lecture, John McCarthy described the goal of
time-sharing as: "From the user's point of view, the solution
clearly is to have a private computer" [6]. Similarly, in an MIT
report proposing research into time-sharing systems, Herbert
Teager described its goal as presenting ". . . all the
characteristics of a user's own personal computer. . . " [14].

This orientation naturally carried over to early time-sharing
system implementations. The authors of the APEX time-sharing
system built in 1964, for example, said that it "simulates an
apparent computer for each console" [15]. A time-sharing system
at UCB was described in a 1965 paper as built on the principle
that ". . . each user should be given, in effect, a machine of
his own with all the flexibility, but onerousness, inherent in a
`bare' machine" [12]. These systems were not exceptional cases,
as reported in a 1967 theoretical treatment of time-sharing
systems [10]: "Time-shared systems are often designed with the
intent of appearing to a user as his personal processor."

It should not be surprising, then, that many of these early
time-sharing systems were almost virtual machine monitors. The
APEX system mentioned above, which ran on the TX2 machine at
MIT, is representative. Its "apparent computers" were described
as "somewhat restricted replicas of TX-2 augmented by features
provided through the executive program." [...]

[...]
--
Ben Pfaff
http://benpfaff.org
Michael Wojcik
2009-08-10 18:57:46 UTC
Post by Chris Barts
Post by Quadibloc
It wasn't electronic.
It didn't get a conditional branch instruction until much later.
Then I'd deny that it's a 'computer' in the modern sense of the
word. Under that definition we could call the abacus a computer: It
isn't electronic or Turing-complete, either, but it still 'works with
numbers' in some ill-defined sense.
You don't need a conditional branch for Turing-completeness. All you
need is iteration and arithmetic. Predicated execution takes care of
conditions; you accumulate the results of both branches, first
multiplying each result by a coefficient 1 or 0 depending on which
result you want. The coefficient is determined by simple arithmetic,
so there's no conditional execution there either.
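
A toy rendering in C (my names, nothing machine-specific): compute
both arms, reduce the condition to a 0/1 coefficient, and sum.

  #include <stdio.h>

  /* branch-free select: returns a < b ? x : y. The comparison yields a
     0/1 coefficient, standing in for the arithmetic a real machine
     would use to derive it; no conditional jump is required. */
  long select_if_less(long a, long b, long x, long y)
  {
      long p = (long)(a < b);          /* 1 if a < b, else 0 */
      return p * x + (1 - p) * y;      /* accumulate both, keep one */
  }

  int main(void)
  {
      printf("%ld\n", select_if_less(2, 3, 10, 20));   /* prints 10 */
      printf("%ld\n", select_if_less(5, 3, 10, 20));   /* prints 20 */
      return 0;
  }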

See for example [1], which explains using this method to demonstrate
that Zuse's Z3 is Turing-complete (modulo storage limits, as with any
real computer).

Note the abacus is still not a computer in this sense, because it does
not incorporate arithmetic or iteration. Abacus+operator is a
computer, but since the operator is already a computer, the abacus is
just an optimization.


[1] http://www.ddj.com/architect/184404241
--
Michael Wojcik
Micro Focus
Rhetoric & Writing, Michigan State University
Joe Pfeiffer
2009-08-10 19:39:44 UTC
Post by Michael Wojcik
Post by Chris Barts
Post by Quadibloc
It wasn't electronic.
It didn't get a conditional branch instruction until much later.
Then I'd deny that it's a 'computer' in the modern sense of the
word. Under that definition we could call the abacus a computer: It
isn't electronic or Turing-complete, either, but it still 'works with
numbers' in some ill-defined sense.
You don't need a conditional branch for Turing-completeness. All you
need is iteration and arithmetic. Predicated execution takes care of
conditions; you accumulate the results of both branches, first
multiplying each result by a coefficient 1 or 0 depending on which
result you want. The coefficient is determined by simple arithmetic,
so there's no conditional execution there either.
See for example [1], which explains using this method to demonstrate
that Zuse's Z3 is Turing-complete (modulo storage limits, as with any
real computer).
Note the abacus is still not a computer in this sense, because it does
not incorporate arithmetic or iteration. Abacus+operator is a
computer, but since the operator is already a computer, the abacus is
just an optimization.
[1] http://www.ddj.com/architect/184404241
Did the Mark I have predicated arithmetic? I'd certainly never heard
that, and if it didn't then it isn't possible to squint hard enough to
make it look like a computer to me.
--
Klingon programs don't have parameters. They have arguments and win
them (Walter Bushell)
Dav Vandenbroucke
2009-08-10 20:43:51 UTC
It is my belief that there was never a first or last anything, because
every such claim always gets lost in a morass of "it depends on what
you mean by..."

Dav Vandenbroucke
davanden at cox dot net
Quadibloc
2009-08-11 07:22:51 UTC
Post by Chris Barts
You're
doing it by redefining the word 'computer' to include things that
aren't even programmable in the modern sense of the term.
The Harvard Mark I definitely was programmable, since it worked from
punched paper tapes with instructions to perform calculations. This is
very much unlike the British Colossus device, even though that machine had the
advantage of being electronic.

John Savard