larsbrinkhoff / its-manual

ITS Reference Manual

Lessons learned from ITS #2

Open larsbrinkhoff opened 1 year ago

larsbrinkhoff commented 1 year ago

@lieber made a good point about writing up the lessons we can learn from ITS:

what the lessons are for today from the ITS experience, so that today's students and researchers can see the relevance. [...] themes like how ITS was a no-bullshit, DIY, community-based environment, and how it developed in tandem with the hacker ethic are vibrant and relevant.

I have thought about it, but I feel a bit hard pressed to come up with many lessons. Maybe the prime example of surviving ITS technology is Emacs. It's not the most popular editor today, but it seems to have found a niche and a thriving user base. I think one lesson learned is the typical ITS user interface: operating an application using short key sequences with immediate feedback. DDT also exemplifies this user interface, but has not captured an audience like Emacs has. I sometimes add a DDT-style user interface to my programs, and I enjoy using them very much!
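To make that interaction style concrete, here is a minimal sketch (my own illustration, not actual ITS or DDT code) of the core idea: single-character commands dispatched immediately to handlers, so the user gets feedback after every keystroke rather than after typing a full command line.

```python
# A sketch of a DDT-style command loop: one-character commands,
# dispatched immediately, each producing instant feedback.
# All names here (COMMANDS, command, dispatch) are hypothetical.

COMMANDS = {}

def command(key):
    """Register a handler for a one-character command."""
    def register(fn):
        COMMANDS[key] = fn
        return fn
    return register

@command("p")
def print_value(state):
    """'p' examines the current value."""
    return f"value = {state['value']}"

@command("+")
def increment(state):
    """'+' modifies the value and echoes the result at once."""
    state["value"] += 1
    return f"value = {state['value']}"

def dispatch(key, state):
    """Run the handler for a keystroke, or complain immediately."""
    handler = COMMANDS.get(key)
    return handler(state) if handler else f"? unknown command {key!r}"

state = {"value": 41}
print(dispatch("+", state))  # feedback after a single keystroke
```

In a real program the loop would read raw keystrokes (e.g. with the `curses` or `termios` modules) instead of being driven by calls to `dispatch`, but the dispatch-table shape is the essential part: no parsing, no Enter key, just key, action, response.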

The ethics and community aspect is still around in some form. GNU and FSF are early examples, but there are also many other projects with similar values.

Can anyone think of more lessons learned (or forgotten!) from ITS?

oilcan-productions commented 1 year ago

I asked ChatGPT the question "What are the lessons learned from the use of ITS at MIT in the 60s and 70s?" Here is what it had to say:

The Incompatible Time-Sharing System (ITS) at MIT in the 1960s and 70s was an extremely influential solution to computer bottlenecks at the time. Here are some of the key lessons and insights gained from this period:

Timesharing as a Solution to Computing Problems: In the 1960s, computers were large and expensive, and there were no personal computers. In the workplace, you would have to reserve a time slot to use the big office computer, bring all of your files down to it, do what you needed to do, then get off and let the next person on. MIT helped advance a solution called timesharing, which allowed hundreds of people to share the same computer to do computations without leaving their seat, drastically boosting productivity and solving one of the largest computer bottleneck issues of the time.

Influence of Multics: MIT's Multics (Multiplexed Information and Computing Service) was an extremely influential early timesharing operating system. It started in 1964 and helped timesharing become the dominant mode of computing in the 1970s, before personal computers became popular.

Similarities Between Timeshare and Cloud Computing: The concept of timesharing has similarities with modern cloud computing. Just as timesharing allowed multiple users to share the same computer resources, cloud computing allows multiple users to store their data on a server and access it from anywhere. This allows people to collaborate, share, and process data without needing to set up heavy infrastructure in their own homes.

Continued Evolution of Computing: The evolution from timesharing to personal computing and now to cloud computing shows the continuous evolution of computing technology. The lessons learned from the use of ITS at MIT in the 60s and 70s continue to inform the development of new computing technologies and systems.

Importance of Collaboration and Funding: The development of the MIT Computation Center, including the Multics system, was made possible through collaboration and funding from various sources, including IBM. This highlights the importance of collaboration and funding in advancing technological innovation.

These lessons provide valuable insights into the evolution of computing and the ongoing importance of innovative solutions to address computing challenges.

larsbrinkhoff commented 1 year ago

Sorry to say, but I'm not surprised: ChatGPT mostly misses the point and has nothing to say specifically about ITS.

lieber commented 1 year ago

I just want to clarify a point Henry was making. He wrote:

 The other was commercial and administratively
 oriented. It imagined computation "as a service"
 that users would pay for and companies would sell.

I don't think that the motive was as commercial as that. The vision of CTSS and Multics was the development of computation as a public utility, to be available to everyone. This involved considerations of security and privacy. It also required administration, accounting, and billing, because computation does require resources to provide.

The AI Lab's ITS was not thought of as the beginnings of a public utility. It was a private research facility designed to rapidly adjust to the needs of the AI Lab's researchers. For example, it was important to provide real-time services, so we could do experiments in robotics. It was necessary to provide easy ways to attach nonstandard I/O devices, like cameras and boxes of potentiometers (for adjusting parameters of a control algorithm). These experimental devices could not be specified until they were built.

Thus, because of this flexibility, the AI Lab was able to host the XGP (the very first laser printer that used the Xerox process). The XGP did not eat a ream of paper -- it sucked paper from a giant roll that was cut into pages by an I/O command from the controlling computer. The interface to ITS was a PDP-11 that controlled the printer. The PDP-11 got its instructions from ITS (running on a PDP-6 or PDP-10) through a direct hardware link.

Quoting from https://xeroxalto.computerhistory.org/xerox_alto_file_system_archive.html

"In the meantime, various people at Xerox were
building a series of experimental raster printers.
The first of these was called XGP, the Xerox
Graphics Printer, and had a resolution of 192 dots
to the inch. Xerox made XGP's available to certain
universities, and by 1972 they were in use at
Carnegie-Mellon, Stanford, MIT, Caltech, and the
University of Toronto. Each of those organizations
produced its own hardware and software interfaces.
The XGP is historically interesting only because it
is the first raster printer to gain substantial use
by computer scientists, and was the arena in which
a lot of mistakes were made and a lot of lessons
learned."

By the way, my PhD thesis (1973) was prepared using the XGP. It turns out to be the first PhD thesis printed on a "laser printer".

GJS

NoelChiappa commented 1 year ago

It's worth noting that the XGP was not, I think, a 'laser printer' as the term would be understood now - i.e. a printer which uses static electricity to hold powdered ink on a drum (where the laser can 'draw' images in the static patterns on the drum) until it can be melted onto a piece of paper by heat from the fuser (the system used by the later Dover). My vague memory, which could easily be wrong, is that the XGP used heat-sensitive paper, with the images drawn directly on it. (I remember the paper the XGP produced; it was just like the paper used in an old photocopier in my dad's office when I was a boy - and much later on, in my fax machine - both of which were heat-sensitive. If you left a printout out in the sun, it went black!)

NoelChiappa commented 1 year ago

So I was wrong (again :-). The XGP was based on something called the Long Distance Xerography (LDX), an early hi-res fax system; I was able to find documentation on it (and the XGP), and it definitely used a xerographic printing process on plain paper.

I'm wondering if I am remembering a thing available on Knight TVs - there was a command you could give ('ALT Q'; I lost track of where I saw it in the docs, and had to resort to reading SYSTEM; TV >) that would print what was on the screen. The output for that may have used a heat-sensitive paper printer; it was a Tektronix printer of some kind, according to the source.

larsbrinkhoff commented 9 months ago

Yes, the Knight TV system had a printer attached. I haven't looked into it much.

Second, we now have an XGP emulator that includes the PDP-11 frontend. Here's some sample output:

[Screenshot from 2024-02-09 08-07-49]