University Computing before Timesharing

End-user computing

During the 1960s, academic computing diverged from the mainstream. Most commercial applications, whether data processing or scientific, were large production jobs that ran for several hours and used the entire computer. Companies hired professionals to write the programs, punch the input, and run the jobs. In universities, the faculty, students, and researchers wrote their own programs and often ran them themselves. They would spend long periods developing programs, hoping for fast turnaround of compilations and small tests, followed by a few large computations. Since many of the programs were run only a few times, priority was given to convenient program development.

Computer hardware was improving rapidly, both in performance and reliability, but hardware was a scarce resource. Processor cycles could not be wasted. Because the economies of scale were so great, the best strategy for a university was to buy a single large computer and find a way to share it among the users. This led to the growth of university computer centers.

The University of Sussex

When we were at the University of Sussex from 1969 to 1972, we had computing facilities that were typical of the better universities at the time. The university computer was an ICL 1904A. The ICL 1900 series was an early third generation system that competed quite successfully with the IBM 360. The architecture used a 24-bit word. I think that the computer at the University of Sussex had 32K words, equivalent to less than 100K bytes.

ICL developed a sequence of operating systems called George. While waiting for ICL to deliver George III, the University of Sussex developed a simple monitor for running small Fortran jobs. It used a circular buffer on magnetic disk that held jobs waiting to be processed. The jobs were card decks of Fortran programs to be compiled and run. Since card reading was slow, the aim was always to have several jobs that had been read into the buffer, waiting to be run, so that the central processor was never idle. Output to the line printer was also buffered.

The Fortran compiler and the monitor used almost all the memory, but about 4K words were spare. My wife and I reached an agreement with the computing center that we could use this small amount of memory for our experiments with online catalogs. The computer had provision for a few terminals and we used one of them. This sounds an absurdly small amount of memory but we were able to use the Fortran IV overlay mechanism. This was a primitive form of paging by which the programmer specified that certain subroutines and data structures could overlay each other in memory. Since the basic Fortran I/O package itself used more than 4K, we wrote a physical I/O routine to control the terminal and never loaded the I/O package.

The university ran the computer for two shifts per day. The third shift, from midnight to 8 a.m., was available for researchers. Several of us went through the operator training course and for one night shift per week we had sole use of the machine. The procedure for rebooting the computer was typical of the era. Nowadays computers hold their boot program in some form of persistent storage, but the boot program for the 1904A was on paper tape. The first step in booting the machine was to set a few instructions using hand switches on the central processor. These instructions were executed when the computer was powered up. They instructed the paper tape reader to read the boot program into memory and the boot program then read the operating system from magnetic tape.

Because the machine room was noisy we would work on our programs in the reception area. We could tell what the machine was doing by listening to the audio monitor. This device, which was common on machines of that era, made a distinctive tone for each category of instruction that was being executed. Since each program had a distinctive pattern of sounds we could tell when a job, such as a tape sort or the Fortran compiler, came to an end.

Punched card equipment

The central computer was used almost entirely for academic work. Administrative data processing used punched card equipment. For example, the library's circulation system used nothing but punched cards and made no use of the computer. For one of my analyses I had more than seventy trays each containing 2,000 cards.

The punched card machines were direct descendants of the Hollerith machines that were built in the early 1900s to tabulate census data. IBM took over the Hollerith company and much of the company's data processing expertise came from its experience with punched card equipment.

Punched card equipment (IBM Archives). In this photograph the operators are men, but in my experience most of them were women. The tidiness is also misleading. In practice, trays of cards and boxes of printer paper were stacked everywhere.

The data processing room at the University of Sussex had about six large devices, each with its specialized function: a card sorter, copier, collator, tabulator, printer, etc. The collator was particularly important as it could merge data from two stacks of cards and punch out a new card combining information from them. Each device was controlled by cables that were inserted into a plug board, thus creating a very simple program.
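The collator's merge of two key-ordered stacks is the same operation that modern programs do in memory. The following is a minimal sketch, not the machine's actual wiring: each tuple stands for one punched card with a key and a payload, and the field layout is invented for illustration (the real plug board specified card columns instead).

```python
def collate(master, detail):
    """Merge two stacks of cards, each already sorted on the same key."""
    merged, i, j = [], 0, 0
    while i < len(master) and j < len(detail):
        # Take the card with the smaller key, just as the collator
        # compared the next card from each input stack.
        if master[i][0] <= detail[j][0]:
            merged.append(master[i]); i += 1
        else:
            merged.append(detail[j]); j += 1
    merged.extend(master[i:])   # one stack is exhausted;
    merged.extend(detail[j:])   # pass the remainder of the other through
    return merged

# Hypothetical cards: (key, payload)
m = [(1, "master"), (3, "master"), (5, "master")]
d = [(2, "detail"), (3, "detail")]
print(collate(m, d))
```

A real master file update would add a step that punches a new combined card whenever a detail key matched a master key; the sketch only interleaves the stacks.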

As an example, the card sorter had one input hopper and ten output hoppers. Sorting was one column at a time. The operator would use the plug board to specify a column of the card and other parameters. She would load the cards into the input hopper, one tray at a time, and the sorter would send each card to the output hopper that corresponded to the number punched in the appropriate column. To sort by a three-digit number, the cards would be passed through the sorter three times, sorting first by the least significant digit. Complex data processing operations, such as a master file update, were carried out by passing trays of cards repeatedly through the various devices.
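The sorter's procedure is what is now called a least-significant-digit radix sort. The sketch below models it directly, with the card keys invented for illustration: each pass distributes the cards into ten "output hoppers" by one digit, and the hoppers are then collected in order 0 to 9 to form the input stack for the next pass.

```python
def card_sort(cards, digits):
    """Sort numeric keys the way the card sorter did: one digit per pass,
    least significant digit first."""
    for pass_no in range(digits):
        hoppers = [[] for _ in range(10)]   # the ten output hoppers
        for card in cards:
            digit = (card // 10 ** pass_no) % 10
            hoppers[digit].append(card)     # card falls into hopper 0..9
        # Collect the hoppers in order to rebuild the stack for the next pass
        cards = [card for hopper in hoppers for card in hopper]
    return cards

stack = [407, 36, 215, 943, 36, 108]
print(card_sort(stack, 3))  # → [36, 36, 108, 215, 407, 943]
```

The method works because each pass is stable: cards with the same digit keep their relative order, so earlier (less significant) passes are never undone by later ones.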