Computer


The history of computer development is often described in terms of the different generations of computing devices. Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the devices we use today.

First Generation (1940-1956) Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts. The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer; its first unit was delivered to the U.S. Census Bureau in 1951.

Second Generation (1956-1963) Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in memory, which moved from magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.

Third Generation (1964-1971) Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present) Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand.
The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

Fifth Generation (Present and Beyond) Artificial Intelligence

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.

The Fifth-Generation Computer was to be the end result of a massive government/industry research project in Japan during the 1980s, which aimed to create an "epoch-making computer" that would leapfrog more evolutionary designs by using the Prolog programming language to create a desktop system with supercomputer-like performance and usable artificial intelligence capabilities. The term "fifth generation" was intended to convey the system as a leap beyond existing machines: computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

Throughout these generations since the 1950s, Japan had largely been a follower in computing, building machines that followed US and British leads. The Ministry of International Trade and Industry (MITI) decided to attempt to break out of this follow-the-leader pattern, and in the mid-1970s started looking, on a small scale, into the future of computing. It asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies along with industry and academia. It was during this period that the term "fifth-generation computer" started to be used. The primary fields of investigation for this initial project were:

- Inference computer technologies for knowledge processing
- Computer technologies to process large-scale databases and knowledge bases
- High-performance workstations
- Distributed functional computer technologies
- Supercomputers for scientific calculation

The project imagined a parallel-processing computer running on top of massive databases, as opposed to a file system, using a logic programming language to access the data. They envisioned building a prototype machine with performance between 100M and 1G LIPS, where a LIPS is one Logical Inference Per Second.
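To make the LIPS figure concrete, here is a minimal sketch, in Python rather than Prolog, of what a single logical inference amounts to: applying a rule such as parent(X, Y), parent(Y, Z) => grandparent(X, Z) to known facts to derive a new fact. It is only an illustrative toy under that assumption, not the project's actual inference engine; the facts and the forward_chain function are invented for the example.

# A toy, hypothetical sketch of rule-based inference (the ICOT machines
# actually ran Prolog-style logic programs, not Python). Deriving one new
# fact by applying a rule to known facts is, roughly, one "logical
# inference" -- the operation counted by LIPS.

facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

def forward_chain(facts):
    """Apply the rule parent(X, Y), parent(Y, Z) => grandparent(X, Z)
    until no new facts can be derived; return all facts and the number
    of inferences performed."""
    derived = set(facts)
    inferences = 0
    changed = True
    while changed:
        changed = False
        for rel1, x, y in list(derived):
            for rel2, y2, z in list(derived):
                if rel1 == rel2 == "parent" and y == y2:
                    new_fact = ("grandparent", x, z)
                    if new_fact not in derived:
                        derived.add(new_fact)
                        inferences += 1  # each newly derived fact counts as one inference
                        changed = True
    return derived, inferences

all_facts, count = forward_chain(facts)
print(count)      # 1 -- this tiny knowledge base yields a single inference
print(all_facts)  # includes ('grandparent', 'tom', 'ann')

On this scale, the project's 1G LIPS target meant carrying out on the order of a billion such rule applications every second.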
At the time, typical workstation machines were capable of about 100k LIPS, so the target was three to four orders of magnitude beyond contemporary hardware. They proposed to build this machine over a ten-year period: three years for initial R&D, four years for building various subsystems, and a final three years to complete a working prototype system. In 1982 the government decided to go ahead with the project and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies.

So ingrained was the belief that parallel computing was the future of all performance gains that the Fifth-Generation project generated a great deal of apprehension in the computing field. Having taken over the consumer electronics field during the 1970s, and apparently doing the same in the automotive world, the Japanese in the 1980s had a reputation for invincibility. Soon parallel projects were set up in the US as the Microelectronics and Computer Technology Corporation (MCC), in England as Alvey, and in Europe as the European Strategic Program of Research in Information Technology (ESPRIT).

Over the next ten years the Fifth-Generation project ran into one difficulty after another. A primary problem was that its chosen language, Prolog, did not support concurrency, so the project had to develop its own language for its multi-CPU goals. This never happened cleanly; in fact, a number of languages were developed, all with their own limitations. Another problem was that existing CPU performance quickly pushed through the "obvious" barriers that everyone believed existed in the 1970s, and the value of parallel computing dropped to the point where it is today used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially.

The Fifth-Generation Computer was constantly on the wrong side of the technology curve in software as well. Over the period of its lifespan, Apple Computer introduced the GUI to the masses, the Internet made locally stored large databases a thing of the past, and even simple research projects constantly provided better real-world results in data mining, Google being a good example. Moreover, the project found that the promises of logic programming were largely illusory, and it ran into the same sorts of limitations that earlier artificial intelligence researchers had, albeit at a different scale. Repeated attempts to make the system work after changing one language feature or another simply moved the point at which the computer suddenly seemed stupid.

In fact, it can be said that the project "missed the point" as a whole. It was during this time that the computer industry moved from hardware to software as a primary focus; the Fifth-Generation project never made a clean separation, feeling that, as had been true in the 1970s, hardware and software were inevitably mixed. By any measure the project was an abject failure. At the end of the ten-year period it had burned through over 50 billion yen, and the program was terminated without having met its goals. The workstations had no appeal in a market where single-CPU systems could outrun them, the software systems never worked, and the entire concept was then made obsolete by the Internet.
