Computing Changes the World
Thomas J. Misa
Charles Babbage Institute, University of Minnesota

How can we satisfactorily address the history of computing, recognizing that computing artifacts and practices are often shaped by local circumstances and cultures, and yet also capture the longer-term processes by which computing has shaped the world? This article reviews three traditions of scholarly work, proposes a new line of scholarship, and concludes with thoughts on collaborative, international, and interdisciplinary research.
Everyone knows that computing has changed the world, but, strangely enough, our existing historiography of computing faces numerous difficulties in addressing this question directly. Examples and models for a historical understanding of this key question are surprisingly scarce.1 I believe this is because historians' disciplinary preferences for subject specificity and archival virtuosity have encouraged us to do detailed studies of individual machines, programs, and companies, and occasionally to examine the social construction of specific computing technologies; but our focus on specifics has made it difficult to conceive and conduct the wide-ranging and long-duration studies that can show the longer-term consequences of technical changes for society, culture, economics, and politics. I have suggested elsewhere that the nature of technologies, whether they seem to have impact on society and culture or appear instead to reflect society and culture, depends crucially on the temporal and analytical scale of our inquiries, that is, whether we are looking at them closely with a fine-grained historical microscope or instead taking a wider or longer-term view.2 To take just one example, is Moore's law better understood as an irresistible agent of change, an instance of raw technological determinism as Paul Ceruzzi recently asserted, or rather as a contingent and constructed entity, as Ethan Mollick suggests?3 Of course microscopes and telescopes each can tell us something about the natural world, even if the views are quite distinct and by themselves partial and necessarily incomplete.

How might we develop new modes of analysis and explanation that will address the history of computing in satisfying detail, recognizing that computing practices are often shaped by local circumstances and distinct cultures, and yet capture wider or longer-term processes where computing has manifestly shaped the world? This article first reviews three thematic traditions of scholarship in the history of computing; it then proposes a new line of scholarship for understanding how computing has changed the world; and it concludes with some thoughts on collaborative, international, and interdisciplinary research programs to understand how and when and why this came about.4

Very roughly, the history of computing has progressed through three distinct thematic traditions in the past quarter century or so. First, in an early, machine-centered phase, computer historians and leading practitioners (they were sometimes one and the same person) debated the priority and internal functioning of certain key electronic digital machines at both hardware and software levels. Next, the first generation of professional historians of computing traced the varied roots of the information age. Most recently, historians have directed attention to the institutional context of computing. Of course, many people remain interested in hardware, software, information, and institutions. Clearly these thematic traditions are healthy and can be extended with future research.5 The new line of research outlined here, while drawing on this work, proposes that we shift to focus on the interaction of computing,
including hardware, software, and institutional dimensions, with large-scale transformations in economies, cultures, and societies. Citizens and policymakers know that computing has changed the world, and historians of computing should take a more prominent role in helping understand this history. I cannot think of a more pressing charge for the next generation of our work. Such a project may also help overcome the Anglo-American bias that persists in much of the current literature.6

All societies today have a relationship with computing, not only those that have pioneered computing or even those labeled as early or late adopters. In today's global economy, a country or region that might entirely lack access to computing still has a relationship with the computer-mediated global economy through trade and travel, even if it is entirely frozen out of such trade. (One might imagine an island somewhere that is entirely off the Net and with no computing whatsoever, but to my mind this special situation resembles David Nye's conception of wilderness in contemporary America: a specific and delimited part of society or culture deliberately held apart from the mainstream. Just as Nye insists that wilderness is a part of mainstream urban-technological society, owing at minimum to the need to maintain physical and legal boundaries, so too would this hypothetical computer-free island have a boundary relationship with the wider world where computing is more or less pervasive.7) Indeed, it may be crucial to understand just those countries, regions, or cultures that partially or wholly lack access to first-world computing. The terms "digital divide," "e-junk," and "digital dumping" flag these latter phenomena.
Thematic traditions
The first thematic tradition in the history of computing took form with questions posed by practitioners and pioneers of digital computing. Their key questions directed scholars to identify the first digital computers, and to understand the technical details of how they worked. It was simply assumed that the computer that mattered was the electronic digital computer, its immediate predecessors and obvious offspring; overlooked in this early literature was that the computer was for many decades a person, often a woman, doing numerical calculations of great complexity.8 In round terms the narrative of significant machines, concepts, and pioneers began with one of the several World War II-spawned
machines (Manchester, Enigma, Atlas, ENIAC, or Whirlwind), untangled the genesis of the stored-program concept, and marched forward to Univac and perhaps crested with IBM's conquest of the world. Early numbers of the IEEE Annals of the History of Computing record the several priority debates; Emerson Pugh's several books on IBM continued this tradition;9 and volumes right down to the present have echoed these weighty matters, including most explicitly Alice Burks's Who Invented the Computer? The Legal Battle That Changed Computing History.10

Michael Mahoney provided an early critique of this tradition as insider history. Among the problems he identified were the distinct preference for pinning down facts and firsts as compared with the understanding of historical context; a recitation of technical givens versus a recognition of actors' historical uncertainty and the difficult choices they faced; and a preference for vivid anecdotes over the cultivation of context and perspective. With a focus on the details of computing technology, these accounts do not give an assessment of the social, economic, or cultural changes that computers were presumed to bring about.11 Accordingly, while these works are clearly valuable in documenting what went on, they are of limited help in addressing the question of how computing changed the world.

For instance, the coming of the digital age was no mere technical advance but also an important cultural shift within the technical community. Early historical work on the electronic digital computer entirely ignored the alternate tradition of computation for fire control and the vibrant world of analog computing, recently explored by James Small and David Mindell12 (see Figure 1). Problematizing the coming of the digital age has been another rewarding and insightful approach. In prize-winning articles, Larry Owens drew attention to MIT's rich tradition in analog computing from the 1920s and also provided a sharp cultural analysis of the sea change from analog to digital computing at MIT during and after the war years.13 Owens corrects the common perception that the way forward into the digital future was clear and uncontested. Accordingly, he makes an important step in seeing the history of computing as the history of cultural change.

Figure 1. Analog computing persisted long after the ENIAC launched the digital age in 1946, with active research programs and college-level textbooks. The brainchild of Edwin Harder (pictured here), Westinghouse's Anacom facility in East Pittsburgh, Pennsylvania, opened in 1946, provided an accurate scale model of complex electric power systems, and remained in operation until 1991. (Courtesy Charles Babbage Institute.)

A contextual technical history, devoting close attention to specific details of the machines while situating them in their historical context, should be a vital ongoing tradition. Recall the classic passage in Tracy Kidder's Soul of a New Machine describing the 27 printed-circuit boards constituting a VAX minicomputer and positing the useful theory, whether strictly speaking true or not, that VAX embodied flaws in DEC's corporate organization. Both were too complex and hierarchical, according to the account's protagonist. Kidder proposed that the computer's architecture was a mirror of the architecture of the company. Not all such readings will find a one-to-one correspondence between the technical details and anything else, of course, whether corporate structure or social structure. Donald MacKenzie finds consequential drama in the minutiae of hardware (a riveting story tells how Intel's i8087 floating-point coprocessor handles extremely small denormalized numbers) and a full-scale tragedy in the apparently mundane, but literally deadly, software-timing errors in the Patriot air-defense missile.14 Lawrence Lessig, Jay Kesan, and other legal scholars are devoting attention to understanding how "code is law."15 Clearly, we need more such hardware and software histories attentive to technical details and aware of their wider social, political, and legal implications and meanings.

A second thematic tradition in the history of computing shifted focus to the historical roots of the information age (see Figure 2). A self-described band of "colonizers," including historians William Aspray, Martin Campbell-Kelly, and Paul Ceruzzi, asked a new set of questions, which pivoted on the genesis of the information age. In his essay "The History of the History of Computing," Campbell-Kelly wrote that

Professionals and colonizers emerged in the 1980s. We have become colonists, in the sense of staking a claim for the history of computing to be recognized as a valid historical enterprise. This has involved establishing the usual trappings of academic recognition: a scholarly journal, monographs, conferences, research centers, museums, PhD programmes, and undergraduate courses.16

Figure 2. The information age coupled computing technologies to the routinized processing of information. Here, in 1960, nine operators enter bank transactions into Burroughs F-600 machines, probably at a St. Louis, Missouri, bank. (Courtesy Charles Babbage Institute.)

In this information-age view, computers were machines that first and foremost processed information and only secondarily provided the functions of calculation, control, or communication. Numerous landmark volumes published in the 1990s prominently developed this theme, including Campbell-Kelly and Aspray's Computer: A History of the Information Machine; Chandler and Cortada's A Nation Transformed by Information; and Manuel Castells's Information Age trilogy. Ceruzzi framed his History of Modern Computing around the transformation of the mathematical engines of the 1940s to the networked information appliance of the 1990s. Even Riordan and Hoddeson's tightly focused history of the transistor at Bell Laboratories was subtitled, somewhat grandly, The Birth of the Information Age.17 Attention to electronic digital machines did not disappear in these information-age accounts, of course, but the focus expanded to a broader set of technologies and to the actual use of these machines in insurance, finance, and government. Earlier counting and tabulating machines that processed information mechanically or electromechanically commanded new attention and respect. It became clear that, at least 15 years before the emergence of the electronic digital computer, an entire technical infrastructure of data processing was in place and already thoroughly embedded in business and government routines. In 1933, IBM offered 17 different types of key punches, in various mechanical and electric configurations, for 34-, 45-, and 80-column cards; five distinct sorting machines; and nine different tabulators, each available in multiple models and for different-width punched cards; while in the same decade Burroughs created an entire suite of mechanical bookkeeping and accounting machines.18 On reflection, it was indeed no accident that the office machine giants of the 1920s (IBM, Burroughs, NCR, and Remington Rand) became early leaders in the postwar computer industry (see Figure 3).

Figure 3. Punched cards with digital information came in many forms. Here, a 1948 demonstration of Calvin Mooers's Zatocoding system for coding, classifying, storing, and retrieving information. Mooers is credited with coining the term "information retrieval" in 1950; his papers are at CBI. (Courtesy Charles Babbage Institute.)

The theme of information and society shows ample signs of continued interest and conceptual innovation. Recent works here include Jon Agar's The Government Machine, Campbell-Kelly's pioneering book-length study of the software industry, and JoAnne Yates's Structuring the Information Age.19 Agar's work especially breaks new methodological ground, providing an extended evaluation of the computer as a materialization of bureaucratic action (p. 391) with wide-ranging examples drawn from the 19th-century British Civil Service, turn-of-the-century statistical reformers, and the post-1945 welfare state. Agar surveys the cryptography, radar-based air defense, social-statistical surveys and national registry, as well as the wartime logistics, personnel records, and operations research of World War II, and aptly calls it an information war. Each of these works, in providing a benchmark to evaluate a major social, economic, and political change (the coming of the information age), is obviously promising in the effort to understand how computing changed the world.

A third thematic tradition, in addition to the pioneering machines and the information age, can be discerned in the work of historians who take up the question, How did (certain) institutions shape computing? This is a pronounced shift in emphasis, if not an entirely novel dimension. These accounts move to the background their treatment of individual computing machines or the contours of the information society, foregrounding instead the governmental, engineering, or corporate institutions that brought them about. The US military services, the National Science Foundation, and IBM have received particular attention. Among exemplary works in this tradition I would number Arthur Norberg and Judy O'Neill's institutional study of the wide-ranging ARPA initiatives in computing; Donald MacKenzie's studies of supercomputing; Janet Abbate's Inventing the Internet; Alex Roland's critical evaluation in Strategic Computing; and Steve Usselman's work on business strategies and learning processes within IBM.20 In different ways, these studies each place the story of the technical developments in computing squarely into the context of institutions. Institutional dynamics (that is, the specific situated context of decision-making that exists within a complex organization such as DARPA or IBM) are just as important here as engineers drawing circuit diagrams or executives debating corporate strategy.

To some extent, this literature obviously draws on earlier studies of the federal government's role in computing by Kenneth Flamm as well as the more recent NRC report Funding a Revolution.21 Yet what distinguishes this newer institutional literature, I believe, is explicit attention not only to the rate of technical change but also to its direction.22 These studies largely accept the proposition that directed institutional sponsorship sped up the pace of computing developments; in addition, they often grapple with the question of what difference such institutional sponsorship made in the shaping and direction of computing developments. Considering the multiple potential lines of hardware or
software development that were possible at some time, these authors ask how institutional dynamics influenced the actual developments in favor of one potential outcome or another. If the older studies were strong on description, these studies move more assertively to an analysis of how and why certain paths were chosen as well as how and why certain results came to be, while others did not.23
A tradition to be made?
For a fourth thematic cluster, clearly not yet a tradition, I can tentatively suggest three characteristics. Recognition of them will help historians of computing to better tackle the question of how computing has changed the world and at the same time connect our field to other scholarly concerns. In this fourth cluster, I believe history of computing will be a hybrid field, increasingly drawing on
diverse disciplines and methods. I hope our field will take up the challenge of comprehending the twofold shaping of computing and society. And to do so, I suggest we engage in studies that situate computing within major historical transformations. If we believe that computing has changed the world, this is what we should study.

First, what might the history of computing look like as a hybrid field? Scholarly work in the humanities and social sciences frequently exhibits a version of hybrid vigor, in which a core field or discipline is invigorated through exchange with neighboring fields or disciplines. Conversely, fields that too narrowly define their core concerns are at risk of being cut off from broader scholarly debates. Sometimes a dominant method or influential paradigm has the effect of consolidating a field around a set of key questions, with the attendant risk that the field can become isolated if no one else finds these questions to be compelling. As instances, I would point out that business history, history of technology, and philosophy of technology have all, in the past, flirted with this unhealthy isolation. Each has substantially revived in no small measure owing to sustained interactions with neighboring disciplines. Philosophers of technology have engaged with sociology and politics, while business and technology historians have sought new inspiration in studies of consumption, identity, gender, and politics. Each of these three fields is certainly less focused on core questions than it was two decades ago, but all are the more interesting for it and indeed show many indications of hybrid vigor.

In my view, historians of computing can confidently be looking outward to neighboring fields and disciplines for conceptual inspiration as well as new audiences. Let me make a couple of suggestions. In conceptualizing studies dealing with computing artifacts, computing systems, and their interactions with society and culture, there are obvious overlaps with the concerns of historians of technology, who have been studying diverse artifacts and systems as well as their interactions with culture.24 Historians of computing studying companies, corporate culture, and various levels of industries are finding common cause with business and economic historians as they examine organizations, learning processes, and the flows of information.25 At present both business history and history of technology are themselves hybrid fields, with multiple productive and inspiring overlaps with historians
of labor, gender, culture, and consumption. Histories of labor, gender, and consumption have yet to interact significantly with the history of computing, despite several suggestive articles pointing the way.26 Historians of computing seem ideally positioned for evaluating and extending the rich bodies of theorizing coming from organizational theory and evolutionary and institutional economics.27 Historians of science have been somewhat less avid for cross-field interactions, but studies of computer science as an academic discipline have much to learn from them, as do studies of professionalization in diverse forms from data processing to software engineering.28

There is a second way in which history of computing will become a hybrid field. The historians of science, technology, medicine, and business who recognize that computers have become vital infrastructures that constrain and enable intellectual and institutional developments in their chosen fields of study will in effect become historians of computing. Even though the list of promising topics here could be extended nearly without limit, think about just such instances as the impact of computing on chemistry, physics, biology, and the atmospheric sciences;29 medical informatics in its several incarnations; and the entire information infrastructure of modern business, from financial transactions to point-of-sale terminals or from supply chains to value chains. We need more historical studies of the entire e-revolution in government policies and practices. Computer art also beckons.

Second, to fully engage the question of how computing has changed the world, we need to craft new and embracing narratives that adopt a twofold analytical goal. For some time, I have tried to understand (to introduce a bit of jargon) the social shaping of technology as well as the technological shaping of society.30 Understanding both of these will help in understanding how computing has changed the world. On the one hand, we need to show how developments in computing shaped major historical transformations, that is, how the evolution of computing was consequential for the transformations in work routines, business processes, government activities, cultural formations, and the myriad activities of daily life. It may be a commonplace that computing in some way led to the information revolution, but I would like to know more deeply as well as more precisely how computing in its
various forms and manifestations influenced the rate and direction (to take a term from early evolutionary economics) of these social, cultural, and economic transformations. What specific characteristics of the information age can we trace to the proliferation of computers (or other technical practices), and which characteristics of highly bureaucratized societies were merely enhanced by the availability of computing? After all, standardized and routinized forms of information as a key aspect of society long predate the emergence of analog or digital computers in the 20th century, with the essays in Chandler and Cortada making an impressive case for the 19th century and Headrick's When Information Came of Age making a spirited case for the 18th century.31 What is distinctive about these varied historical manifestations of the information age? How did they come about? Could the present-day computer-saturated information age have been different?

And then there is the question that results when we use the tools of history to think about the present and the future.32 What possibilities exist for using the evolutions in computing theories and practices to shape future social, cultural, political, and economic developments? At the very least, think about the cultural enthusiasms behind Unix, personal computing, or the open source movement, which each drew inspiration from some notion that this was the way to change history.

At the same time, our narratives and analysis should show how major historical transformations shaped the evolution of computing. We know that the office machine giants in the US were a key locus of innovation in information, data processing, and digital computing. Yet how did the specific institutional context (here, commercial information processing) influence the varieties of hardware, software, systems, and services that emerged? Using evolutionary language, we can ask which were the successful variants, and how and why these were selected over the unsuccessful ones. The latter are all too often simply written off as inferior, as if latter-day criteria were clear at the time, or entirely forgotten.33 In the US, the military was a pervasive influence on many sectors of computing through the long decades of the Cold War.34 In The Closed World, Paul Edwards begins to evaluate the impact of the Cold War on the character of computing, finding that a preference for closed-world structures and practices
suffused military institutions and computer designs. Yet pervasive does not imply omniscient or deterministic. Donald MacKenzie's essay "The Influence of the Los Alamos and Livermore National Labs on Supercomputing" shows that the assessment of institutional influence may be complex and yet compelling: while the computational needs of nuclear weapons designers were indeed paramount in supercomputing, their specific technical requirements (deterministic, number-crunching mesh computation versus probabilistic, multiple-branch Monte Carlo techniques) sent a generation of high-performance computing in at least two directions, not down a single path.35 And the specific institutional context of DARPA's relations to computing, where leading figures from within academic computing were placed in charge of relatively large pools of military research funds, meant that purely military and purely academic influences may never be neatly separated.36

Perhaps the largest and most pervasive institution in computing, although it stretches the term, is the high-technology environment of California's Santa Clara County, better known as Silicon Valley. A spate of recent studies has amplified the basic findings of AnnaLee Saxenian's now-classic Regional Advantage: Culture and Competition in Silicon Valley and Route 128, which emphasized risk-taking, entrepreneurship, and networks of innovative companies.37 Stuart Leslie and Rebecca Lowen deal with the interactions of high-technology innovation and military research sponsorship at MIT and Stanford. Ross Bassett, in his To the Digital Age, gives a close technical history of the now-pervasive metal oxide semiconductor technology. Two recently published studies give distinct interpretations of Silicon Valley, with Leslie Berlin focusing on the contributions
of Robert Noyce, while Christophe Lécuyer emphasizes instead the valley's longer history and its firms' ability to master manufacturing, from vacuum tubes forward to semiconductors.38

In analyzing why certain events in computing, as well as broader processes in society, politics, and culture, unfolded in the way they did, and not in some other way, international studies and comparative studies will be crucial. It is a common practice in assessing influence to begin with some given institution or initiating event and then read forward the consequent developments, tracing (so it appears) the influence of the institution or event. A generation of technology assessment exercises attempted to read off the impacts of a given technology in this way. The NSF-funded TRACES study in the late 1960s claimed to show the influence of basic research on technological innovation. Assessments of the military's role in computing often operate in a similar fashion. Vernon Ruttan's recent historical analysis of six general-purpose technologies, including semiconductors and computers, amasses impressive evidence that the US military services played an important role in fostering the development of technology.39 Yet in reading forward from the US military's initiating role during the Cold War decades, he might overestimate the military's influence, substantial though it was, as a force in technology development. (Such a retrospective method of reading forward from case studies of success has a number of inherent biases, such as underestimating the complexity and uncertainty of the innovation process as well as obscuring the presence of blind alleys or dead ends in research and innovation.) A comparative analysis of Japan tells a different story: there private companies worked in concert with the long-fabled guidance from the civilian bureaucrats at MITI (Ministry of International Trade and Industry), with little or no overt military influence, to build up world-beating capabilities in consumer electronics, semiconductors, and certain classes of computers. Whether war is "necessary" (Ruttan's choice of word) for economic growth, then, seems to depend on whether your paradigm case is the US or Japan.

Third, I am suggesting that we devote attention to situating our studies of computing within, and as a vital part of, major historical transformations. If computing has changed the world, surely this is a compelling site to investigate. Keeping in mind my second point above, what we need are studies that examine
the two-way shaping or co-construction of computing alongside such major processes as globalization; the set of "e" institutions (e-commerce, e-government, e-education); and surveillance and privacy. Then there is the profound transformation of research practices: across the board, in industrial, academic, and governmental laboratories and research sites, computers have become not merely helpful research tools but a necessary infrastructure that researchers use in collecting, interpreting, and visualizing data as well as running the models that evaluate the data. Paul Edwards's studies of computing and global climate change are an extremely promising step in this direction.

This vision of studying computing in the context of broad historical transformations almost certainly entails drawing on a much wider set of research methods and archival materials than we have traditionally used. Revisit the birth of the information age, thinking about it as widely as you can. The traditional archival sources, such as the papers of leading computer researchers, engineers, and entrepreneurs, will of course remain important; oral histories and documents will certainly have their place. Still, understanding these wider problems and questions will require engagement with a diverse range of research materials (and, not coincidentally, diverse competences that researchers embracing hybrid fields will gain access to). We will need business historians to help understand businesses as leading users of computers, a step taken by Jim Cortada's Digital Hand trilogy as well as work by JoAnne Yates, Eric von Hippel,40 and others. And we will need specialists in governmental records to probe the varied levels of government as leading users, too. Labor historians might study the untold legions of information-technology industry workers. Social historians might use census microdata to explore fine-grained patterns. And researchers attentive to rhetoric and popular culture will provide insight into cultural change.41
These involve Europe, Moore's law, and globalization, but surely there are many more such topics deserving our attention. For Europeans, setting technical developments squarely in the context of ongoing social, political, economic, and cultural processes is simple: they face a new currency, new food standards, new flows of consumer goods and technologies, and many new and aspiring members of the European community. European integration was launched formally in the 1950s and gained significant force in the 1990s. With Dutch leadership, a group of technology historians set up an international network called Tensions of Europe, with around 150 participants working in 10 parallel research teams, to investigate the role of technology in the making of Europe across the 20th century. Research teams focused on varied sectors and aspects of this immensely complex history: cities, mobility, infrastructures, colonialism, consumption, communication, information, big engineering projects, agriculture, and food.42 A follow-on project funded by the European Science Foundation is being organized under the banner Inventing Europe, and we hope that the history of computing will play a significant role.43

A group of leading European historians of computing, organized by Gerard Alberts, is exploring how Europe took shape through the dissemination and use of software. Multinational companies formed something like a pan-European information-technology network. Even though IBM was a US company, its wide reach and standard-setting technology tended to bind European companies and business cultures together. What resulted, however, was not precisely a single corporate culture. IBM found that its technology and practices interacted with local cultures and expectations: in Finland IBM meant easy access to Western Europe, while in France and the Benelux countries IBM meant access to American culture, even if people traveled to Stuttgart to get it. In Zurich, the site of an important IBM research lab, IBM meant an international technology heavyweight, but not precisely an American one. IBM also had surprising influence in Eastern Europe, through the unauthorized duplication of its machines and their integration into the Soviet planning system. Another topic of interest is IFIP, founded in 1959 as an international forum for computer scientists, and its advocacy of the programming language Algol.

A second research program that addresses how computing has changed the world is one
have provided a clutch of generalizations and assertions ("the death of distance," "the world is flat," and many others) that need thorough and sustained historical evaluation. Clearly, fax machines, airplane reservation systems, networked computers, and many other computer-mediated forms of communication have something to do with globalization. Absent a network of computing, Federal Express and Airbus would each fall out of the skies, while just-in-time logistics would come to a grinding halt. But does this mean that computing, by itself, brought these developments about? A more nuanced approach to globalization and computing is provided by S.E. Goodman, writing in the Communications of the ACM, who notes the very uneven distribution of computing and telecommunications between countries: what is there to compare between the semiconductor industries of Japan and Nigeria?45

There is moreover an important environmental history to computing and globalization. The United Nations Environment Programme estimates that each year between 20 million and 50 million tons of electronics are discarded, an unintended but nonetheless real consequence of Moore's law. At the port in Lagos, Nigeria, each month five hundred 40-foot containers arrive filled with obsolete computer components from the developed world, but up to three-quarters of the equipment is literally junk that is not even useful in the city's active recycling market.46 The contemporary debate about outsourcing is another topic needing further historical analysis.47

We have a great deal of work to do. Look up "global" in the IEEE's search tool for IEEE Annals of the History of Computing and, apart from some book reviews, you get only Jim Cortada's programmatic essay proposing a research agenda to investigate how computing went global.48

In sum, a tremendously exciting era is opening up for historians of computing in the next quarter century or so. If I may hazard a few general predictions, they would be these. In the coming years we (both those writing for and those reading the IEEE Annals of the History of Computing) will be a larger group, and a more heterogeneous group, than in the past quarter century. Historians of many specialties will inevitably see computing interacting with their chosen subject matter, and many of them will become at least part-time historians of computing. One sees early signs of this development even today: business and technology historians, as well as rhetoric and
popular culture scholars, who might not consider themselves authentic card-carrying historians of computing, are doing interesting work in the field. In addition, citizens and policymakers will certainly need historical perspective and practical insight into the world that has, to a significant extent, been shaped by the varied forms of computing: the entire array of machines of calculation, control, information, education, entertainment, communication, and infrastructure. And I hope that historians of computing, however they arrive in the field, will be drawn into considering the big questions of historical transformation. If computing has changed the world, we have an immense opportunity, and also a significant responsibility, to help understand how this came about.
References and notes

Aspray, "The History of Computing within the History of Information Technology," History and Technology, vol. 11, 1994, pp. 7-19; and P.N. Edwards, "From Impact to Social Process: Computers in Society and Culture," Handbook of Science and Technology Studies, S. Jasanoff et al., eds., SAGE Publications, 1994, chapter 12. Also see Tom Haigh's History Resources: http://www.tomandmaria.com/tom/Resources/ResourceFile.htm. For another view, see Martin Campbell-Kelly's list of courses in the history of computing at http://www.dcs.warwick.ac.uk/~mck/HoC_Courses.html.
6. Work by European historians of computing has both broadened and deepened in the past decade or so. One institutional manifestation is the computing history research group within the Tensions of Europe project; see http://www.histech.nl/tensions/Projecten/IT/ITMain.htm and the discussion about Europe in this article. See also "Informatics Goes Global: Methods at a Crossroads," http://rkcsi.indiana.edu/article.php/conferences2/45; and the History of Nordic Computing conferences at http://www.comphist.org. Research on topics in Asia is also emerging: for example, R. Heeks, India's Software Industry: State Policy, Liberalisation and Industrial Development, Sage, 1996.
7. D. Nye, Technology Matters: Questions to Live With, MIT Press, 2006, pp. 194-198.
8. J.S. Light, "When Computers Were Women," Technology and Culture, vol. 40, no. 3, 1999, pp. 455-483; D.A. Grier, When Computers Were Human, Princeton Univ. Press, 2005.
9. E.W. Pugh, Memories That Shaped an Industry: Decisions Leading to IBM System/360, MIT Press, 1984; E.W. Pugh, Building IBM: Shaping an Industry and Its Technology, MIT Press, 1995; and E.W. Pugh, L.R. Johnson, and J.H. Palmer, IBM's 360 and Early 370 Systems, MIT Press, 1991.
10. A.W. Burks, The First Electronic Computer: The Atanasoff Story, Univ. of Michigan Press, 1988; and A. Rowe Burks, Who Invented the Computer? The Legal Battle That Changed Computing History, Prometheus Books, 2003.
11. M.S. Mahoney, "The History of Computing in the History of Technology," Annals of the History of Computing, vol. 10, no. 2, 1988, pp. 113-125.
12. For example, J.S. Small, "General-Purpose Electronic Analog Computing: 1945-1965," IEEE Annals of the History of Computing, vol. 15, no. 2, 1993, pp. 8-18; D. Mindell, Between Human and Machine: Feedback, Control, and Computing Before Cybernetics, Johns Hopkins Univ. Press, 2002. See http://www.umn.edu/~tmisa/biblios/hist_computing.html#Analog-Era.
13. For example, L. Owens, "Vannevar Bush and the Differential Analyzer: The Text and Context of an Early Computer," Technology and Culture, vol. 27, 1986, pp. 63-95; L. Owens, "Where Are We Going, Phil Morse? Changing Agendas and the Rhetoric of Obviousness in the Transformation of Computing at MIT, 1939-1957," IEEE Annals of the History of Computing, vol. 18, no. 4, 1996, pp. 34-41.
14. T. Kidder, The Soul of a New Machine, Little, Brown, 1981, pp. 29-32; D. MacKenzie, Knowing Machines: Essays on Technical Change, MIT Press, 1996.
15. L. Lessig, Code and Other Laws of Cyberspace, Basic Books, 1999; J.P. Kesan and R.C. Shah, "Shaping Code," Harvard J. Law & Technology, vol. 18, no. 2, 2005, pp. 319-399.
16. The essay is at http://www.iee.org/OnComms/pn/History/HistoryWk_History_of_Computing.pdf, 2 Mar. 2006.
17. M. Campbell-Kelly and W. Aspray, Computer: A History of the Information Machine, Basic Books, 1996; A.D. Chandler Jr. and J.W. Cortada, eds., A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present, Oxford Univ. Press, 2000; P.E. Ceruzzi, A History of Modern Computing, MIT Press, 1998, 2nd ed. 2003; M. Riordan and L. Hoddeson, Crystal Fire: The Birth of the Information Age, Norton, 1997.
18. J.W. Cortada, Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865-1956, Princeton Univ. Press, 1993, p. 109.
19. J. Agar, The Government Machine: A Revolutionary History of the Computer, MIT Press, 2003; M. Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry, MIT Press, 2003; J. Yates, Structuring the Information Age: Life Insurance and Technology in the Twentieth Century, Johns Hopkins Univ. Press, 2005.
20. A.L. Norberg and J.E. O'Neill, Transforming Computer Technology: Information Processing for the Pentagon, 1962-1986, Johns Hopkins Univ. Press, 1996; D. MacKenzie, "The Influence of the Los Alamos and Livermore National Labs on Supercomputing," IEEE Annals of the History of Computing, vol. 13, no. 2, 1991, pp. 179-201; J. Abbate, Inventing the Internet, MIT Press, 1999; A. Roland and P. Shiman, Strategic Computing: DARPA and the Quest for Machine Intelligence, 1983-1993, MIT Press, 2002; S.W. Usselman, "IBM and Its Imitators: Organizational Capabilities and the Emergence of the International Computer Industry," Business and Economic History, vol. 22, no. 2, 1993, pp. 1-35.
21. K. Flamm, Creating the Computer: Government, Industry and High Technology, Brookings Institution, 1988; Natl Research Council, Funding a Revolution: Government Support for Computing Research, Washington, D.C.: Natl Academy Press, 1999; http://www.nap.edu/readingroom/books/far/contents.html.
22. See T.J. Misa, "Revisiting the Rate and Direction of Technical Change: Scenarios and Counterfactuals in the Information Technology Revolution," paper presented to the Society for the History of Technology (SHOT), 2006; https://netfiles.umn.edu/users/tmisa/www/papers/Misa_Rate-2006.pdf.
23. P. Edwards, "Making History: New Directions in Computer Historiography," IEEE Annals of the History of Computing, vol. 23, no. 1, 2001, pp. 86-87. For an exploration, see A. Tympas, "From Digital to Analog and Back: The Ideology of Intelligent Machines in the History of the Electrical Analyzer, 1870s-1960s," IEEE Annals of the History of Computing, vol. 18, no. 4, 1996, pp. 42-48.
24. P.N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America, MIT Press, 1996.
25. Recently business and technological historians have strengthened their common concerns, with such figures as JoAnne Yates and others playing prominent roles. See the symposium discussion built around J. Yates, "How Business Enterprises Use Technology: Extending the Demand-Side Turn," Enterprise and Society, vol. 7, no. 3, 2006, pp. 422-455.
26. See, for example, M.S. Mahoney, "Boys' Toys and Women's Work: Feminism Engages Software," Feminism in Twentieth-Century Science, Technology, and Medicine, A.N.H. Creager, E. Lunbeck, and L. Schiebinger, eds., Univ. of Chicago Press, 2001, pp. 169-185.
27. See the work by Yates in Ref. 25 and Usselman in Ref. 20; and J. Fagerberg, D.C. Mowery, and R.R. Nelson, eds., The Oxford Handbook of Innovation, Oxford Univ. Press, 2006.
28. See, for example, M.S. Mahoney, "Software as Science - Science as Software," History of Computing: Software Issues, U. Hashagen, R. Keil-Slawik, and A. Norberg, eds., Springer-Verlag, 2002, pp. 25-48.
29. In 2005 the National Science Foundation established an Office of Cyberinfrastructure to deal proactively with designing, building, and using computer networks across the research enterprise; see http://www.nsf.gov/dir/index.jsp?org=OCI.
30. For analysis and cases attempting this twofold analysis, see T. Misa, P. Brey, and A. Feenberg, eds., Modernity and Technology, MIT Press, 2003; T. Misa, Leonardo to the Internet: Technology and Culture from the Renaissance to the Present, Johns Hopkins Univ. Press, 2004.
31. A.D. Chandler Jr. and J.W. Cortada, A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present, Oxford Univ. Press, 2000; D.R. Headrick, When Information Came of Age: Technologies of Knowledge in the Age of Reason and Revolution, 1700-1850, Oxford Univ. Press, 2002.
32. Thanks to Alex Pang of the Institute for the Future, Palo Alto, California, for this formulation.
33. This attention to distinct lines of research, as well as to different companies that experienced both leadership and problems, certainly distinguishes A.L. Norberg, Computers and Commerce: A Study of Technology and Management at Eckert-Mauchly Computer Company, Engineering Research Associates, and Remington Rand, 1946-1957, MIT Press, 2005.
34. Early computing in the former Soviet Union is investigated in, for example, G.D. Crowe and S.E. Goodman, "S.A. Lebedev and the Birth of Soviet Computing," IEEE Annals of the History of Computing, vol. 16, no. 1, 1994, pp. 4-24, and A. Fitzpatrick, T. Kazakova, and S. Berkovich, "MESM and the Beginning of the Computer Era in the Soviet Union," IEEE Annals of the History of Computing, vol. 28, no. 3, 2006, pp. 4-17. For years Soviet computing depended on the reverse engineering of IBM machines; see N.C. Davis and S.E. Goodman, "The Soviet Bloc's Unified System of Computers," ACM Computing Surveys, vol. 10, no. 2, June 1978, pp. 93-122.
35. D. MacKenzie, "The Influence of the Los Alamos and Livermore National Labs on Supercomputing," IEEE Annals of the History of Computing, vol. 13, no. 2, 1991, pp. 179-201.
36. Valuable studies of computing in the Cold War context include A.C. Hughes and T.P. Hughes, eds., Systems, Experts, and Computers: The Systems Approach in Management and Engineering, World War II and After, MIT Press, 2000; S. Gerovitch, From Newspeak to Cyberspeak: A History of Soviet Cybernetics, MIT Press, 2002; A. Akera, Calculating a Natural World: Scientists, Engineers, and Computers During the Rise of U.S. Cold War Research, MIT Press, 2007.
37. A. Saxenian, Regional Advantage: Culture and Competition in Silicon Valley and Route 128, Harvard Univ. Press, 1994.
38. S.W. Leslie, The Cold War and American Science: The Military-Industrial-Academic Complex at MIT and Stanford, Columbia Univ. Press, 1993; R.S. Lowen, Creating the Cold War University: The Transformation of Stanford, Univ. of California Press, 1997; R. Knox Bassett, To the Digital Age: Research Labs, Start-Up Companies, and the Rise of MOS Technology, Johns Hopkins Univ. Press, 2002; L. Berlin, The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley, Oxford Univ. Press, 2005; C. Lécuyer, Making Silicon Valley: Innovation and the Growth of High Tech, 1930-1970, MIT Press, 2005.
39. V. Ruttan, Is War Necessary for Economic Growth? Oxford Univ. Press, 2006.
40. J. Cortada, The Digital Hand, Oxford Univ. Press, 2004-2008, three volumes; J. Yates, "How Business Enterprises Use Technology," in Ref. 25; E. von Hippel, Democratizing Innovation, MIT Press, 2005.
41. For example studies, see C. Malone, "Imagining Information Retrieval in the Library: Desk Set in Historical Context," IEEE Annals of the History of Computing, vol. 24, no. 3, 2002, pp. 14-22; Edwards, Closed World; T. Friedman, Electric Dreams: Computers in American Culture, New York Univ. Press, 2005; J. Markoff, What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry, Viking Penguin, 2005.
42. See T.J. Misa and J. Schot, "Inventing Europe: Technology and the Hidden Integration of Europe," History and Technology, vol. 21, no. 1, 2005, pp. 1-19; E. van der Vleuten and A. Kaijser, eds., Networking Europe: Transnational Infrastructures and the Shaping of Europe, 1850-2000, Science History Publications, 2006; and M. Hård and T.J. Misa, eds., Urban Machinery: Inside Modern European Cities, MIT Press, forthcoming in 2008.
43. See the European Science Foundation's EUROCORES announcement at http://www.esf.org/inventingeurope and the new project's Web site at http://www.histech.nl/inventing/index.htm.
44. See D. Brock, ed., Understanding Moore's Law: Four Decades of Innovation, Chemical Heritage Press, 2006.
45. S.E. Goodman, "The Globalization of Computing: Perspectives on a Changing World," Comm. ACM, vol. 34, no. 1, 1991, pp. 19-21, quote p. 19.
46. See the Basel Action Network's The Digital Dump: Exporting Re-Use and Abuse to Africa, 24 Oct. 2005; http://www.ban.org/Library/TheDigitalDump.pdf.
47. W. Aspray, F. Mayadas, and M.Y. Vardi, eds., Globalization and Offshoring of Software: A Report of the ACM Job Migration Task Force, ACM Press, 2006.
48. J. Cortada, "How Did Computing Go Global? The Need for an Answer and a Research Agenda," IEEE Annals of the History of Computing, vol. 26, no. 1, 2004, pp. 53-58.

Thomas Misa's biography appears on page 7 of this issue. Readers may contact Misa at http://www.cbi.umn.edu.