Pre-Ss1 Notes New
Data processing
1. HISTORY OF COMPUTING
Most individuals use some form of computing every day whether they realize
it or not, e.g. swiping a debit card, sending an email, or even using a cell
phone to send messages. Computing has ancient roots: long before electronic
devices, people counted with tally sticks, and the first known tally sticks
made use of animal bones.
Later aids handled multiplication and division, as well as functions such as
roots, logarithms and trigonometry. The milestones covered below include
Pascal's calculator (the Pascaline), Babbage's Analytical Engine, and the
first electronic computers built with vacuum tubes.
To find the decimal value of a number written in any base:
i. List the digits in order, and count them off from right to left, starting
with zero; each count is the power of the base for that position.
ii. Multiply each digit by the base raised to its position, then add the
results carefully.
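A minimal Python sketch of this procedure (the function name and sample
values are only illustrative, not part of the original notes):

    def to_decimal(digits: str, base: int) -> int:
        # Convert a number written in the given base to its decimal value.
        total = 0
        # Step i: count the digits from right to left, starting with zero.
        for position, digit in enumerate(reversed(digits)):
            value = int(digit, base)              # the digit's absolute value
            total += value * (base ** position)   # weighted by its positional value
        return total

    print(to_decimal("1011", 2))   # 11 in decimal
    print(to_decimal("157", 8))    # 111 in decimal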
Every number system has three characteristics:
a. base: the number of distinct digits the system uses; binary is base 2 and
octal is base 8.
b. absolute values: the binary system has only two absolute values (0 and 1)
and the octal system has eight absolute values (0, 1, 2, 3, 4, 5, 6, 7); the
decimal system has ten, and the hexadecimal system has sixteen, running from
0 to F.
c. position: in the decimal system the zero position has the positional value
10^0, which is 1.
CONCEPTS OF NUMBER SYSTEM IN COMPUTING
Most computer systems operate using binary logic. The binary number
system works like the decimal number system, except that it uses base 2,
which includes only the digits 0 and 1. The common types of information
represented in binary include:
1. numbers
2. letters (text)
3. microprocessor instructions
4. graphics/video
5. sound
The instructions people write for a computer are expressed in a programming
language. This language is made of words and letters, but the computer does
not directly understand the words and letters. Rather, those words and
letters are translated into binary numbers. The number systems commonly met
in computing are described below.
Binary or base 2: there are only two numbers in binary, 0 and 1. Because
computers store everything as bits that are either on or off (a single binary
digit is called a bit), base 2 works very well for them. Math in base 2 is
pathetically simple, but even small values take many digits to write.
Octal or base 8: uses the numbers 0 to 7. There are eight bits in a byte,
which is used very often in the computer field (a bit is great, but it is too
small to hold any useful data, thus the byte is used). Math in octal is more
compact than in binary, since one octal digit stands for three bits.
Decimal or base 10: uses the numbers 0-9. I'm sure you're familiar with this
system, since it is the one we use every day; computers, however, do all
their work in binary. Math is quite simple with this number system.
Hexadecimal or base 16: uses the numbers 0-F. Because there are 16
values per placeholder, six new digits have to be created. Those digits
are A, B, C, D, E and F. "A" has a value of 10; "B" is 11, and so on. Math in
hexadecimal is compact: one hex digit stands for four bits, so a whole byte
can be written with just two hex digits.
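As a quick illustration (the value 2022 is chosen arbitrarily), Python's
built-in helpers show the same quantity written in each of the four bases
described above:

    value = 2022
    print(bin(value))    # 0b11111100110   (binary, base 2)
    print(oct(value))    # 0o3746          (octal, base 8)
    print(value)         # 2022            (decimal, base 10)
    print(hex(value))    # 0x7e6           (hexadecimal, base 16)
    print(int("F", 16))  # 15, since the hex digit "F" has the value fifteen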
The abacus
The abacus, also called a counting frame, is a calculating tool that was in
use in the ancient Near East, Europe, China, and Russia, centuries before
the adoption of the written Arabic numeral system. The exact origin of the
abacus is not known.
Napier's bones
Napier's bones, created by the Scottish mathematician John Napier, is a
manually operated calculating device made up of a set of rods, each marked
with a counting number at the top and the multiples of that number written
down the rod. It was used to ease multiplication and division.
The slide rule
The slide rule was invented around 1620–1630, shortly after John Napier
published the concept of the logarithm. At about the same time, Wilhelm
Schickard built an early mechanical calculator; he wanted to help his friend
Johannes Kepler, who was calculating Alfonsine tables manually.
Pascal’s Calculator
In the year 1642, Blaise Pascal, a French scientist, invented an adding
machine called the Pascaline, which could mechanically add and subtract
numbers.
Leibniz Calculator
Later in the seventeenth century, the German mathematician Gottfried Wilhelm
Leibniz improved on Pascal's design with a machine, the Stepped Reckoner,
that could also multiply and divide.
Analytical Engine
Charles Babbage called the two main parts of his Analytical Engine the
"store" and the "mill", as both terms are used in the weaving industry. The
store was where numbers were held and the mill was where they were "woven"
into new results. In a modern computer these same parts are called the
memory unit and the central processing unit (CPU).
Hollerith's Tabulating Machine
Herman Hollerith developed a way to tabulate census data far faster than
traditional hand methods. The U.S. Census Bureau had taken eight years to complete the
1880 census, and it was feared that the 1890 census would take even
longer. Hollerith invented and used a punched card device to help analyse
the 1890 U.S. census data. His great breakthrough was his use of electricity
to read, count and sort punched cards whose holes represented data
His machines were used for the 1890 census and accomplished in one year
what would have taken nearly 10 years of hand tabulating. In 1896,
Hollerith founded the Tabulating Machine Company to sell his invention,
the Company became part of IBM in 1924.
Hollerith first got his idea for the punch-card tabulation machine from
watching a train conductor punch tickets. For his tabulation machine, he
used the punch card invented in the early 1800s, by a French silk weaver
called Joseph-Marie Jacquard. Jacquard invented a way of automatically
controlling the warp and weft threads on a silk loom by recording patterns
of holes in a string of cards.
Hollerith's punch cards and tabulating machines were a step toward
automated computation. His device could automatically read information
which had been punched onto a card, an idea adapted from Jacquard's punch
card. Punch card technology was used in computers up
until the late 1970s. Computer "punched cards" were read electronically,
the cards moved between brass rods, and the holes in the cards created an
electric current where the rods would touch.
John von Neumann, an early computer scientist and the inventor of the merge
sort algorithm, described in his "First Draft of a Report on the EDVAC"
(June 30, 1945) a computer architecture in which the data and the program
are both stored in the computer's memory. Most modern computers follow this
von Neumann design.
3. DIGITALIZATION OF DATA
Digitalization (or digitization) is the process of converting information
into a digital format, in which the information is represented as a series of
discrete numbers or samples. This is binary data that computers and many
devices with computing capacity (such as digital cameras and digital hearing
aids) can process. In this process, records such as text, images, video and
audio are converted into digital forms.
Digital technology has spread into everyday life. Modern cameras,
televisions, phones and computers are digital devices. Where an analogue
quantity varies continuously, a digital system represents information as a
limited range of steps. A digital system uses a binary numeric
representation, typically 1 for a high pulse or 0 for a low pulse. Digital
systems can more easily represent, store and manipulate data so that it can
be processed by a computer, and the result of the conversion is a digital
product.
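As a hedged sketch of what digitalization means in practice, the short Python
fragment below turns a piece of text and a made-up analogue reading into
binary numbers; the sample values are invented for illustration only:

    text = "Hi"
    for ch in text:
        # Each character is stored as a code number, held as a pattern of bits.
        print(ch, ord(ch), format(ord(ch), "08b"))
    # H 72 01001000
    # i 105 01101001

    # An analogue signal sampled at discrete moments becomes a list of numbers.
    analogue_samples = [0.12, 0.47, 0.81, 0.60]
    digital_samples = [round(s * 255) for s in analogue_samples]   # 8-bit quantisation
    print(digital_samples)   # [31, 120, 207, 153]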
The computer was born not for entertainment or email but out of a need to
solve a serious number-crunching crisis. By 1880, the U.S. population had
grown so large that it took more than seven years to tabulate the U.S.
Census results. The government sought a faster way to get the job done,
giving rise to punch-card based computers that took up entire rooms.
Today, we carry more computing power on our smartphones than was
available in these early models. The following brief history of computing is a
timeline of how computers evolved from their humble beginnings to the
machines of today that surf the Internet, play games and stream
multimedia in addition to crunching numbers.
1801: In France, Joseph Marie Jacquard invents a loom that uses punched
wooden cards to automatically weave fabric designs. Early computers would
use similar punch cards.
1822: English mathematician Charles Babbage conceives of a steam-driven
calculating machine that would be able to compute tables of numbers. The
project, funded by the English government, is a failure. More than a century
later, however, the world's first computer was actually built.
1890: Herman Hollerith designs a punch card system to tabulate the 1890
census, accomplishing the task in just three years and saving the
government $5 million. He establishes a company that would ultimately
become IBM.
1936: Alan Turing presents the notion of a universal machine, later called
the Turing machine, capable of computing anything that is computable. The
central concept of the modern computer was based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State
University, attempts to build the first computer without gears, cams, belts
or shafts.
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a
Palo Alto, California, garage, according to the Computer History Museum.
1941: Atanasoff and his graduate student, Clifford Berry, design a
computer that can solve 29 equations simultaneously. This marks the first
time a computer is able to store information on its main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and
J. Presper Eckert, build the Electronic Numerical Integrator and Calculator
(ENIAC). Considered the grandfather of digital computers, it fills a 20-foot
by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive
funding from the Census Bureau to build the UNIVAC, the first commercial
computer for business and government applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell
Laboratories invent the transistor. They discovered how to make an electric
switch with solid materials and no need for a vacuum.
1953: Grace Hopper develops the first computer language compiler; her
language work eventually leads to COBOL. Thomas Johnson Watson Jr., son of IBM CEO
Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the
United Nations keep tabs on Korea during the war.
1954: The FORTRAN programming language, an acronym for FORmula
TRANslation, is developed by a team of programmers at IBM led by John
Backus, according to the University of Michigan.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as
the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for
his work.
1964: Douglas Engelbart shows a prototype of the modern computer, with a
mouse and a graphical user interface (GUI). This marks the evolution of the
computer from a specialized machine for scientists and mathematicians to
technology that is more accessible to the general public.
1969: A group of developers at Bell Labs produce UNIX, an operating
system that addressed compatibility issues. Written in the C programming
language, UNIX was portable across multiple platforms and became the
operating system of choice among mainframes at large companies and
government entities. Due to the slow nature of the system, it never quite
gained traction among home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic
Random Access Memory (DRAM) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy
disk," allowing data to be shared among computers.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops
Ethernet for connecting multiple computers and other hardware.
1974-1977: A number of personal computers hit the market, including
Scelbi & Mark-8 Altair, IBM 5100, Radio Shack's TRS-80 — affectionately
known as the "Trash 80" — and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the
Altair 8080, described as the "world's first minicomputer kit to rival
commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer
to write software for the Altair, using the new BASIC language. On April 4,
after the success of this first endeavor, the two childhood friends form their
own software company, Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool's
Day and roll out the Apple I, the first computer with a single-circuit board,
according to Stanford University.
1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It
sold like crazy. For the first time, non-geeks could write programs and make
a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the
first West Coast Computer Faire. It offers color graphics and incorporates
an audio cassette drive for storage.
1978: Accountants rejoice at the introduction of VisiCalc, the first
computerized spreadsheet program.
1979: Word processing becomes a reality as MicroPro International releases
WordStar. "The defining change was to add margins and word wrap,"
said creator Rob Barnaby in email to Mike Petrie in 2000. "Additional
changes included getting rid of command mode and adding a print function.
I was the technical brains — I figured out how to do it, and did it, and
documented it. "
1981: The first IBM personal computer is introduced on Aug. 12; it uses the
MS-DOS operating system.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core
mobile computer, as well as an Intel-based iMac. Nintendo's Wii game
console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin
applications to the taskbar and advances in touch and handwriting
recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and
jumpstarting the dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google
Chrome OS.
2012: Facebook gains 1 billion users on October 4.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
2016: The first reprogrammable quantum computer was created. "Until
now, there hasn't been any quantum-computing platform that had the
capability to program new algorithms into their system. They're usually
each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer at the
University of Maryland, College Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is
developing a new "Molecular Informatics" program that uses molecules as
computers. "Chemistry offers a rich set of properties that we may be able to
harness for rapid, scalable information storage and processing," Anne
Fischer, program manager in DARPA's Defense Sciences Office, said in a
statement. "Millions of molecules exist, and each molecule has a unique
three-dimensional atomic structure as well as variables such as shape, size,
or even color. This richness provides a vast design space for exploring novel
and multi-value ways to encode and process data beyond the 0s and 1s of
current logic-based, digital architectures."
4. TYPES OF COMPUTER
1. Digital computers: these work with data in discrete form, represented by
two states such as ON/OFF or YES/NO. They count numbers and can only handle
discrete values; fractional (floating-point) quantities are stored as encoded
approximations. Most modern machines, from mainframes down to the
microcomputer, are digital.
2. Analog computers: these work with quantities that vary continuously, such
as voltage, temperature or pressure.
3. Hybrid computers: These are the type of computers that combine the
features of digital and analog computers in a single computer.
Components of computer
A computer can be described in terms of a few basic parts: the data path,
control, memory, and the input/output units.
1. data path- manipulates the data coming through the processor. It also
provides a small amount of temporary data storage. The data path consists of
the following components:
The program counter (PC)- holds the address used for fetching instructions.
Multiplexers- have control inputs coming from control. They are used for
routing data through the data path.
Processing elements- compute new data values from old data values. In
simple processors the major processing elements are grouped into an
arithmetic-logic unit (ALU).
2. control- tells the data path, memory and input/output devices what to do,
according to the instructions of the program.
3. memory- holds instructions and most of the data for currently executing
programs. The rest of the data is held in programmable registers, which can
hold only a limited amount of data.
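To make the roles of the program counter, memory and ALU concrete, here is a
deliberately simplified Python sketch of a fetch-and-execute loop; the tiny
instruction set and the values are invented for illustration and do not
describe any real processor:

    # Memory holds the currently executing program as (operation, operand) pairs.
    memory = [
        ("LOAD", 5),    # put 5 into the accumulator
        ("ADD", 3),     # add 3
        ("ADD", 10),    # add 10
        ("HALT", 0),
    ]

    pc = 0              # program counter: address of the next instruction to fetch
    accumulator = 0     # a single programmable register

    while True:
        opcode, operand = memory[pc]   # fetch the instruction the PC points to
        pc += 1                        # advance to the next address
        if opcode == "LOAD":
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand     # the processing element (ALU) computes the new value
        elif opcode == "HALT":
            break

    print(accumulator)   # 18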
System unit
The system unit is the computer casing. It is the core of a computer system.
Usually it’s a rectangular box placed on or underneath your desk. Inside
this box are many electronic components that process information. The
most important of these components is the central processing unit (CPU), or
microprocessor, which acts as the brain of your computer. Another
component is random access memory (RAM), which temporarily stores
information that the CPU uses while the computer is on. The information
stored in RAM is erased when the computer is turned off.