Early computing histories: Chesher slides


Hi

I’m Chris Chesher, a long-termer in the Digital Cultures program. I also teach
Technology and Culture at 3000 level, so we may meet again.

Today I want to explore some of the early history of computing. But I also want to
introduce you to an original way of thinking about computers as a media form. You
may not have thought about it, but what is distinctive about computers is that you
call things up. Googling calls up search results. A game calls up a space, some rules,
an avatar, enemies. TikTok calls up video after video, calling up your own history to
anticipate what you might want. Tinder lets you call up faces you might like. I refer to
these events of calling up things as invocations. I’ll introduce this theory I am
developing in a forthcoming book called Invocational Media: Reconceptualising the
Computer.

The second part of the lecture explores the ways in which computers changed
meanings at different points in their history. They have been perceived as intelligent
electronic brains and as machines that give corporations, governments and the
military power. On the other hand, they have been perceived as sources of magical
empowerment. In this section I relate distinctive moments in popular culture —

movies, books, TV shows — to the technologies of their day. I argue that the
computer of any age reflects the wider priorities and ideas in the society in which it is
embedded.

1
First, I’d like to pay respects to the traditional owners of the land upon which I am
recording this video: the Gadigal people of the Eora Nation. They never ceded
sovereignty and remain custodians of the land.

2
There are many histories of computing. Some of the common genres are biographies
of computer pioneers such as Babbage, Turing, Von Neumann, Ted Nelson, Bill Gates,
Steve Jobs, and so on.
There are histories of particular conceptual breakthroughs: calculating machines,
digital logic, stored programs, object-oriented programming and so on.
Then there are biographies of machines, such as Colossus, ENIAC, ACE, IBM’s
System/360 and the Macintosh.

There are journals dedicated to computing history, such as the IEEE Annals of the
History of Computing and the Journal of the Association for History of Computing.

3
There is no real agreement on when the history of computing began.
O’Regan begins with early civilisations such as the Babylonians and Egyptians. He
then acknowledges the contributions of key figures: Boole and his algebra, Babbage
and the Difference Engine, Turing’s notional machine, Von Neumann’s computing
architecture. The following sections in O’Regan’s book analyse particular
programming languages, the practices of software engineering, the influences that
led to artificial intelligence, the internet ‘revolution’, and then a section on the major
computing companies.

Here there is a mix of the history of ideas, biographies of pioneers, and histories of
industries, companies and sectors.

4
So, for example, Ceruzzi mixes these elements: a broad introduction, followed by
early computers, the stored program, hardware elements (chips and
microprocessors), and networking.
Ceruzzi starts with an overview of the digital age that establishes some background,
such as Pascal’s 17th-century mechanical calculators and Babbage’s 19th-century
experimentation with mechanical computing engines. It was on the basis of Babbage’s
machines that Ada Lovelace developed an early idea of computer programming.
Ceruzzi goes on to the development, later in the 19th century, of Hollerith’s
tabulators, which used cards with punched holes to record the 1890 census, and, in
the twentieth century, the teletypewriter, which established electronic components
that preceded the full-blown computer.

5
In the second chapter, Ceruzzi tells the story of how the digital computer was first
developed between 1935 and 1945, particularly for code-breaking (which we will
come to in the second half of this lecture) and calculations for weapons ballistics:
aiming guns and designing atomic bombs. The Colossus was a code-breaking
computer built at Bletchley Park, near London, to speed up the decryption of enemy
messages during the Second World War. One of the applications of the early computer
called the Harvard Mark I (built in 1944) was in the Manhattan Project to design the
atomic bomb.

6
In the third chapter Ceruzzi identifies the key innovation of stored programs. That is,
it became possible to build a machine that could change into a different machine by
changing the software. This seems obvious in the age of apps, but the idea of a
machine that could be so versatile — by invoking new virtual machines — had to be
developed.
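
To make the stored-program idea concrete, here is a minimal sketch of my own in Python (the instruction set and programs are invented for illustration, not drawn from Ceruzzi). Memory holds both instructions and data, so loading a different program turns the same machine into a different machine:

    # A toy stored-program machine: instructions and data share one memory,
    # so swapping the program changes what kind of machine this is.
    def run(memory):
        acc = 0                            # accumulator register
        pc = 0                             # program counter
        while True:
            op, arg = memory[pc]           # fetch the next instruction from memory
            pc += 1
            if op == "LOAD":               # put a constant in the accumulator
                acc = arg
            elif op == "ADD":              # add a value stored elsewhere in memory
                acc += memory[arg][1]
            elif op == "PRINT":            # output the accumulator
                print(acc)
            elif op == "HALT":
                return

    # Program 1: the machine behaves as an adder (prints 42).
    adder = [("LOAD", 2), ("ADD", 4), ("PRINT", None), ("HALT", None), ("DATA", 40)]
    # Program 2: same hardware, different software; it now just echoes 7.
    echo = [("LOAD", 7), ("PRINT", None), ("HALT", None)]

    run(adder)
    run(echo)

The loop of fetching and executing never changes; only the contents of memory do.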

7
In the fourth and fifth chapters, computing accelerated dramatically with the
development of the chip and the microprocessor, and with the social context near San
Francisco that came to be known as Silicon Valley. Computers gradually became
miniaturized and popularized. Where IBM dominated computing from the 1950s to
the 1970s, it was Microsoft that dominated the retail licensed software market from
the 1980s to the early 2000s. Since the 2000s the platforms of Facebook, Amazon, Apple,
Netflix and Google have come to dominate not only the computer industry, but the
wider society and culture.

The story of computers is one of remarkable change and diversity, but also one of
remarkable continuity.

From this, the key is to find out what distinguishes the digital computer from other
technologies. What is the magic sauce?

8
Here is the basic architecture of a digital computer in the way that it is usually
shown in textbooks.
I won’t go into the detail of the histories that speak of Turing, Von Neumann, Eckert
and Mauchly and other uncredited workers who established the basic architecture
of computing, but this architecture is critical because it establishes the capacities of
the medium. It is also important that these devices work digitally and use binary
operations (the famous ones and zeros, or high and low voltages). They use
algorithms that automate logical and mathematical operations, but also operations
on language, images and sounds.
It is important to understand the basic design of the general-purpose computer:
- Input devices like switches, keyboards, mice, cameras and touch screens
- The processor that steps through one logical instruction at a time
- The storage that holds those instructions and data in retrievable form
- Output devices that make data visible on screens and printers, audible through
speakers, and even feelable through rumble packs and vibrations.

But it is missing some key elements.


What about the user, who sits, or moves, somewhere between the input and output?
They’re the one who makes the inputs, perceives the outputs, writes the programs,
and cares about what is stored.
In fact, it’s better to say that inputs sense all sorts of changes or differences in the
environment. A thermostat can be triggered by changes in temperature. A security
camera with facial recognition can be triggered by a target passing by. This may be
with or without the intentions of a user.
The other thing that is missing is the verb. That is, what is happening when the
general-purpose computer is in operation? The usual metaphors are processing,
running, computing and so on. Then there are the actions of people and
environments acting with the computer: using, interacting, playing, surfing and so on.
These are interesting metaphors but are not particular to the medium.

9
This is where I veer off the usual path to provide my own provocations about what
defines the computer through its history.

Rather than processing or running, I like to think of what happens with a computer
as acts of invocation. It’s a term already used in computing discourse to refer to
operations where a program calls on existing resources such as parts of programs
(subroutines or methods), but it would be accurate to say that computers continuously
invoke data and instructions from memory or from input devices.
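
As a small illustration of that technical usage (my own example, not from the lecture), calling a subroutine in most programming languages is literally termed invoking it:

    def greet(name):
        # A subroutine: a stored, named piece of program.
        return "Hello, " + name

    # Invoking the subroutine: the program calls on a stored resource by name,
    # and the result is called back up to the caller.
    message = greet("Ada")
    print(message)    # prints: Hello, Ada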

If you’re using a voice assistant like Google Assistant or Alexa you already understand
the invocation: calling on an authority for guidance or support at a point of decision.
The authority could use calculations, prior knowledge or procedures to inform their
response.
So I propose that this metaphor can be extended to the way in which all computers
operate.
Computers are invocational media.

10
I’d like to propose another way of diagramming the general-purpose computer in
terms of invocations, with the following diagram.
This is not a diagram of material components (like the previous diagram), but a
diagram of the operations of invocational media. I will take this step-by-step.
- Avocations help users create invocations.
- Invocations call up events, such as performing a calculation, making a sound, or
firing a weapon in a game.
- Evocations express the outcomes: displaying a result, hearing the sound, seeing the
enemy die.
- Invocable domains are logically defined spaces from which results can be invoked,
such as memory.

11
First of all, there are what I call the avocations: the range of material objects,
instructions or affordances that allow users to start and continue using a computer.
The avocation is a minor form of the vocation. The vocation is a calling to a profession
and to an identity: the calling to be a doctor, a historian or an artist. The avocation is
traditionally a hobby, but I prefer to adapt the meaning a little to see the avocation as
a force that urges subjects to change their course. For example, rather than writing by
hand, the subject decides to take up a word processor, even if this requires some
effort to learn, and quite new behaviours and thought processes.
The physical keyboard supports the subject’s hands in composing invocations,
simultaneously offering power and constraints. There must be established protocols
and material arrangements that allow invocations to be performed. Input devices are
key to this, and the sensitivities of inputs establish the limits for the kind of
invocations that can be performed. A typical computer keyboard has 101 keys that
can invoke characters (letters, numbers, punctuation), events (escape) and modifiers
(control, command). But conventions for entering instructions or clicking on icons are
also avocations. Your friend’s help, the advertisements, manuals, Google searches
and so on are all avocations that help constitute you as an invoking subject.

12
Second, invocations are events that take place very rapidly in today’s computers
when the computer is ‘running’. Invocations call on memory for data and
instructions, call on inputs from users or the environment, and call out to the
environment through output devices.

13
Third, what becomes perceptible after invocations I refer to as evocations. That is,
outputs are evocative. They create meanings and affects that exceed the data
themselves. Where data are relatively stable, the event of the evocation is singular:
the text, image or sound takes place at a particular moment in a particular
environment. In most cases of using the computer there is feedback between
evocations (such as a mouse cursor), avocations (such as the movement of the
mouse), and invocations (such as the change in position) that generate further
evocations.

14
Finally, there are invocable domains: everything that can be invoked must be
addressed within invocable domains (or digital domains). Hard disks are indexed, and
processors work by invoking instructions from memory that invoke arithmetic logic
units and store the result back to memory. Even output devices are invocable domains
that allow a pixel to change colour or a sound to be played through digital-to-
analogue converters.
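
To illustrate the addressing point, here is a small sketch of my own (the names and sizes are hypothetical): whether the domain is main memory or a screen, an invocation supplies an address and the domain returns or changes what is stored there.

    # Each invocable domain is an addressed space: supply an address,
    # and its contents can be called up or changed.
    memory = [0] * 256                                       # main memory, addressed by index
    framebuffer = [[(0, 0, 0)] * 640 for _ in range(480)]    # screen, addressed by (row, column)

    memory[42] = 7                         # store a value at an address
    value = memory[42]                     # invoke that cell's contents

    framebuffer[100][200] = (255, 0, 0)    # invoke a single pixel: set it to red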

15
So to sum up, I am arguing that the common metaphor for the computer — a
machine that computes — is barely adequate for explaining a metamedium that takes
on such a wide range of incarnations: automatic teller, artificial intelligence, virtual
space, artificial life, robot, social environment, media production tool and so on.

The distinctive media experiences are made possible courtesy of the first order
invocations at the engineering level that I just diagrammed: avocations, invocations,
evocations and invocable domains.

Then there are second order invocations. Using a computer is experienced as a kind
of magic that allows users to call up all sorts of things: calculations that would take
hours to do manually; personal messages from others anywhere in the world; music
selectable from enormous libraries that plays instantly; spoken responses to natural
language requests from a smart speaker. These experiences occur at the level of the
second order invocations — the everyday experience of making and witnessing
invocations.

Then there are third order invocations.


While the low-level invocational diagram at the first order establishes a platform for
invocations, in order to create meaningful experiences of the second order, the
medium must draw upon the wider culture to inform the programs and experiences
of the first and second orders.

Obviously, it has drawn upon logic and mathematics to formulate programs. I think of
Red Dead Redemption 2, which draws upon the history of the American West, upon
the biology of the horse and the rider, the mechanics of weapons, and the sociology
of crime. All computer programs work with metaphors. Metaphors are tropes that
carry meanings from one domain into another domain. They transpose and encode
all manner of patterns, procedures, protocols, techniques, values and ideologies from
all manner of practices through third order invocations.

In the second part of the lecture, I will revisit the history of computing as a history of
invocational media by focusing upon third order invocations. Third order invocations
are necessarily historically and culturally specific. That is, they reflect the concerns,
values and ideas of their time. So rather than seeing software and hardware
development as simply about improvements in speed and capacity (and there is no
doubt that these improvements have transformed the medium), I’d like to look at the
cultural meanings that have informed the history of the medium.

17
Hi
This second part of the lecture is not your usual history of computing. It refers to
many well-known computing developments since the 1940s, but not all of them.
Rather than giving the story of better and better technical innovations, or biographies
of genius innovators, I want to give a feel for the place of computers, or invocational
media, in a changing world.

18
I am using a conception of history that gets beyond the usual narratives and analyses
— stories of who invented what and when, and what caused what and so on. These
kinds of historical facts are important to know, but they don’t easily capture the
wider conditions or atmosphere of particular historical moments, places and
everyday lives that inform how computers are made, perceived and experienced.
That is, there are qualities — things in the air — that characterise a time and a place.
Cultural Studies pioneer Raymond Williams referred to these patterns as ‘structures
of feeling’.

Structures of feeling are apparent in popular culture — in stories, music and rituals of
the day. They are manifest in the ecology of the available media — theatre, cinema,
television or TikTok. They are manifest in what is taught in schools, what is talked
about in the news, politics and the everyday lives of communities, families,
workplaces and environments.

So for the rest of the lecture, I want to explore some of the structures of feeling that
informed the design, use and popular attitudes to computers. These are what I refer
to as third order invocations: the cultural resources that computer designers, users
and critics draw upon in imagining and using computers, or invocational media.

20
It’s 1943 and Europe is in the midst of the Second World War. A golden collie emerges
from the dark waters of a freezing river, shakes herself dry and continues on her
courageous journey. The dog has a look of tired determination on her face. She must
find her way hundreds of miles across the Scottish and English countryside to find the
family who had to sell her because they were too poor to keep her. In the classic
‘family’ film Lassie Come Home, released in 1943, this dog embodied all the
humanistic virtues of courage, intelligence and loyalty.

21
22
If Lassie got very lost, she might have passed an ultra-secret collection of buildings at
Bletchley Park north of London. Inside one of the buildings was COLOSSUS, the first
large-scale electronic computer to go into service. It was a special-purpose code-
cracking machine used to help decrypt teleprinter messages sent by the German high
command, enciphered with the Lorenz machine (a companion effort to the better-
known attack on the so-called ENIGMA machine).

Like Lassie, it seemed that COLOSSUS was intelligent — a common claim for the
earliest computers. Certainly, it did automatically perform calculations previously
done by human computers: usually women who did the calculations for the
male mathematicians.

In the structure of feeling in the 1940s, intelligence and loyalty were highly prized.

In one sense, the task of the code-breakers was to collect intelligence about enemy
activities.
In another sense, the mathematicians and engineers were considered highly
intelligent, which is why their loyal service (which remained secret for forty years)
was so important in the war effort. The success of the machine showed how
intelligent they were.

23
The machine itself was highly valued because of its apparent intelligence. After the
war, Turing challenged computer programmers to create a machine that could pass as
intelligent after a short conversation. If the human was fooled, he claimed, the
machine could be called intelligent. This famous Turing test helped establish in the
popular imagination the paradigm that computers are machines ‘who’ think
(McCorduck 1979).

24
In the Lassie narrative, the dog was imputed with intelligence and a sense of morality.
She was passing as a cinematic central character, a role conventionally played by
humans. But that’s not all. As England was at war, the scenes of English countryside
were actually shot in Washington State on the west coast of the US. The Pacific North
West was passing as Yorkshire. The female dog who was to play Lassie was unreliable
and started losing her hair, so the film-makers switched her with her stunt double,
Pal. So Lassie was a male dog passing as female. Everything about the Lassie story
seemed to be masking a world of shifting identities: things imputed to be something
else.

In 1950 Turing wrote the famous essay ‘Computing Machinery and Intelligence’, which
addressed the question of whether machines can think. He rejected the argument that
computers lack a soul. He rejected the idea that a digital device — one that works with
discrete states — could not have human-like intelligence, saying that we don’t know
that humans are free of the same limitation. He proposed that a computer
might be programmed as a child that can learn through ‘education and other
experience’. (In this he perhaps anticipated the later AI technique of machine
learning.)

25
But Turing’s ultimate test of intelligence came down to a game — the imitation game.
The computer counts as intelligent if it can win the game by fooling a human into
believing that they are talking to another human. This idea of passing — that
performance is more important than essence, and that passing as something means
invoking another identity — also reflected the capacity of the computer to pass as any
variety of machines. It passes as a human computer. It will pass as a magical
spreadsheet, as electronic mail, as a desktop, as a navigable space.

Early computer science capitalised on the perceived value of intelligence by invoking
the ‘giant brain’ trope (Berkeley 1949). It helped found a new discipline that John
McCarthy dubbed ‘artificial intelligence’. Large populations had been subject to IQ
tests, which were later criticised for their cultural bias. In 1946 the club for people
with a high IQ, Mensa, was founded at Oxford.

26
It’s 1948 and George Orwell is in his sick bed, typing up the manuscript for what
would be his final novel. He was dying of TB and preparing to get married. His novel
was part parody and part nightmare about a world in which the Party, personified as
Big Brother, dominated and oppressed the population of Oceania. The protagonist,
Winston, is a lowly worker whose job is to revise historical facts to show Big
Brother in a better light, and to erase anything that makes Big Brother look bad. At
home and at work, he is constantly surveilled through devices called telescreens.

While Orwell’s attitude to early computers is unknown, he certainly abhorred
technocratic power. He disliked large cities, bureaucracies and modern technologies
like the motorcar and the radio. He drew on the totalitarianism of Fascism and
Communism, as well as his own experience as a propagandist during the Second
World War.

In the nightmare world presented in the famous novel 1984, with the help of
technology, the state has almost absolute knowledge about its subjects, down to
their most inner thoughts and fears. Orwell’s anxiety reflects and informs a structure
of feeling of the day, characterised by fear and anxiety about technology and
government oppression, that would become a subcurrent in the reception of
computers just as governments and corporations were taking up these machines.

29
In this scene from the movie 1984, which was released in 1984, you see the
telescreen in action. It’s uncannily like an exercise session over Zoom. While this was
fanciful technology in Orwell’s day, the capacity of computers to invoke traces of your
life — your movements through the street with security cameras and mobile phone
connections, and through the internet with cookies and IP addresses — arguably
rivals 1984’s bureaucracies and telescreens.

30
Also in 1948, the company International Business Machines, IBM, announced the
release of their latest million-dollar computer: the Selective Sequence Electronic
Calculator or SSEC. This corporation had for fifty years been in the business of
mechanical tabulating machines used for the census, time clocks for workers to
punch in when they arrived and left work, and complex scales for weighing things.
The tabulating machines, the flagship product, were used for tracking products,
railway cars, customers and salespeople.

IBM installed the SSEC in its headquarters on Madison Avenue in Manhattan, where it
could be seen through the shopfront windows, so this was the first computer seen by
the public. It was massive: half a football field in size. It had punch machines and card
readers, thousands of relays and vacuum tubes, flashing lights and dials. One of its
first applications was to calculate the positions of the moon and the planets.

Even in these early days, people were coming to fear the computer. In fact, IBM
deliberately referred to this as a calculator to avoid drawing attention to how it might
displace human workers.

This machine was also a form of public relations. The President of IBM loved the
computer, but hated the structural columns in the middle of the room.

31
So he directed the photographers to remove the columns from their publicity photos,
just as Winston in the Ministry of Truth might have disappeared things and people in
1984.

32
Popular culture has often reflected the structures of feeling around the technology of
the day. Think about movies like Tron, Blade Runner and Ralph Breaks the Internet.
Even in 1957 a romantic comedy movie called Desk Set negotiated popular fears
about computers. Head librarian for the Federal Broadcasting Network, Bunny Watson
(played by Katharine Hepburn), and her team were highly skilled at researching facts
on almost any topic for journalists. However, the company wanted to boost their
productivity by installing a computer (EMERAC). Efficiency expert Richard Sumner
(played by Spencer Tracy) came in to secretly evaluate the installation of the
computer. The team were worried about their jobs if the computer were to be installed.
You’ll be glad to hear that the computerized library became even better at its tasks,
and no one lost their job.

33
Thirty-six years later, in the year 1984, an advertisement shown during the Super Bowl
recalled Orwell’s novel. Directed by Ridley Scott (who directed Blade Runner and
Alien), it alluded to IBM as a kind of oppressive Big Brother with a huge telescreen.

34
The mid-1960s structure of feeling was characterised by the space age technologies
of the military industrial complex, but also by the emergence of a counterculture and
the new age movement. There was a deepening confrontation between rationality
and romanticism, masculinist power versus feminine empowerment, technology and
magic. Alternatives to IBM were emerging.

35
36
By 1965 competitors to IBM’s huge computers had started to arrive: the
minicomputers. The most popular was the PDP-8 by Digital Equipment Corporation,
released in that year. This was much less powerful than the massive corporate
information processors, but was portable enough (as big as a fridge) and cheap
enough (around ten thousand dollars) to be bought by smaller teams of researchers
or business people.

The head of DEC's operations in England, John Leng, circulated a sales report that started:
‘Here is the latest minicomputer activity in the land of miniskirts as I drive around in
my Mini Minor’. As you can see from this ad, this was a sexy technology in contrast to
IBM’s notoriously conservative uniform of pin-striped jackets and white button-down
shirts.

37
In their promotions DEC tried to find a balance between technical concepts and
popular culture. They were technically powerful, but also approachable and non-
technocratic. In this manual for the programming language FOCAL, they show how it’s
possible to make a program more human with a loan calculator. In the sample program,
the computer would address the borrower as ‘good looking’, ‘dear’ and ‘sweets’.
The manual concluded that this was too much, particularly if it was your grandmother
running the program.
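
DEC’s original FOCAL listing is not reproduced here, but a hypothetical sketch in Python of a loan calculator in the same spirit might look like the following; the endearments follow the manual’s description, and the calculation is the standard amortised monthly payment formula.

    def monthly_payment(principal, annual_rate, years):
        # Standard amortised monthly payment formula.
        r = annual_rate / 12               # monthly interest rate
        n = years * 12                     # number of monthly payments
        return principal * r / (1 - (1 + r) ** -n)

    principal = float(input("How much would you like to borrow, good looking? "))
    payment = monthly_payment(principal, annual_rate=0.08, years=5)
    print(f"That will cost you ${payment:.2f} a month, dear.")
    print("Don't spend it all at once, sweets.")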

38
In 1968 the anxiety about high technology appeared once again in 2001: A Space
Odyssey, in the avatar of HAL (an alphabetic shift from IBM). In this scene, before the
machine goes insane, HAL explains in a friendly voice how it never makes mistakes,
but of course the subtext is otherwise.

39
A decade after the minicomputer, the microcomputer emerged, first with the Altair
8800 by MITS in 1974, which sold for hundreds rather than thousands of dollars. In
the 1970s and 1980s dozens of microcomputers were released, none of them
compatible with the others: Apple II, Atari 800, BBC Micro, Commodore 64, Tandy
TRS-80 and various machines built on the Zilog Z80 processor. Byte magazine dubbed
the PET, Apple II and TRS-80 the ‘1977 Trinity’ of commercially successful
microcomputers.

With the Macintosh in 1984, Apple promoted the idea of computers ‘for the rest of us’
(remember the 1984 advertisement?). Meanwhile, the IBM PC, released in 1981,
would establish itself as the dominant standard for personal computers, creating a
lineage which continues today.

40
I choose 1993 as a landmark year in which computers became widely experienced as
a mediator of invoked space because of three pieces of software: Myst, Doom and
NCSA Mosaic.

There had been computer games, graphical interfaces and computer networking
since the 1960s, but coming largely out of the military they were not widely known.
Spacewar! in 1962 is credited as among the earliest graphical games on DEC’s PDP-1,
featuring vector-graphics spaceships bouncing around the screen in simulated
gravity. In 1973 Xerox created the Alto computer, with a graphical interface, though in
fact Doug Engelbart had demonstrated a graphical interface in the famous demo of
NLS in 1968. The Advanced Research Projects Agency Network was conceived in
1961.

But there were three pieces of software released in 1993 that popularized the
experience of navigating a space invoked by the computer. Like other killer apps,
each became reason enough for buying a multimedia computer with a CD-ROM drive,
a colour screen and a modem.

41
Doom, developed by id Software and released in 1993, was the most influential
first-person shooter game. While the company had made the Nazi-killing game
Wolfenstein 3D in 1992, Doom really resonated. It offered a horror universe
reminiscent of Dungeons & Dragons, Aliens and Evil Dead II. As a programming feat,
it allowed players on the PC to experience a gloomily lit, immersive 3D game world
rendered at high speed.
It also had other things going for it. A free cut-down version of the game was
available on the internet, so it was taken up everywhere, a distribution strategy that
later became known as the freemium model. It was also possible for keen gamers to
modify the WAD file (short for ‘Where’s All the Data’) and create their own versions
of the game, supporting game modding.
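
That openness is easy to see in code. Here is a minimal sketch of my own for reading a WAD file’s header and first few directory entries, following the widely documented WAD layout (the file name is hypothetical):

    import struct

    # A WAD file starts with a 12-byte header: a 4-byte magic string
    # ("IWAD" or "PWAD"), the number of lumps, and the directory offset.
    with open("doom1.wad", "rb") as f:
        magic, numlumps, diroffset = struct.unpack("<4sii", f.read(12))
        print(magic.decode(), numlumps, "lumps")

        # The directory is a table of 16-byte entries: offset, size, 8-byte name.
        f.seek(diroffset)
        for _ in range(min(numlumps, 5)):          # peek at the first few lumps
            filepos, size, name = struct.unpack("<ii8s", f.read(16))
            print(name.rstrip(b"\x00").decode(), size, "bytes")

Because the format was simple and self-describing, fans could add or replace lumps (levels, sprites, sounds) without touching the game engine.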

42
Another of the three is the game Myst, created by brothers Robyn and Rand Miller and
released by the company Cyan. Appearing on the Macintosh in 1993, this was
one of the first CD-ROM game releases. It featured rendered 3D graphics, interactive
puzzles, a deep narrative and an atmospheric soundtrack. Notice how the game
opens with a book with a small illustration: a 3D-rendered flyover of the island. The
game pushed the limits of the medium. Because of computer speed and the load
time of the CD-ROM it could only display video in a small part of the screen.
Compared with Doom, this was a contemplative game in which players had to
investigate the island through several ages to unlock the secret of its history.

CD-ROMs, which used the audio format to carry data, were hailed as an entirely new
medium. Releases from musicians Laurie Anderson and Peter Gabriel showed some
of its possibilities, and Microsoft released an encyclopedia called Encarta.

43
The third piece of software released in 1993 is apparently far less spectacular than
the games released in the same year, but arguably it was more significant. This was
NCSA Mosaic, the first widely adopted graphical web browser. Even more than Doom,
the browser proliferated across the internet, bootstrapping itself. While there were
hypertext systems before Mosaic, and network software such as FTP and Gopher, and
graphics programs, Mosaic combined these principles to offer a publishing medium
that gave a strong sense of a navigable information space on a global scale.

44
So in this second part of the lecture, I’ve recounted several vignettes about
computing innovations and the structures of feeling associated with them. In each
case, the computer invoked something different: an intelligent worker or
conversation partner; a surveillance machine, complicit in corporate and government
power, that invoked you; from the 1960s, a consumer device that augments human
experience and even sexuality; and finally a gateway into new spaces.
