
Pre-SS1 notes

Data processing

1. HISTORY OF COMPUTING 1

Definition of computing: This is the process of utilizing computer technology

to complete a task; simply put, it is the use or operation of computers.

Most individuals use some form of computing every day, whether they realize
it or not; for example, swiping a debit card, sending an email, or even using
a cell phone can all be considered forms of computing.

Concrete Computing Devices

1. Tally stick: This is used to record and document numbers, quantities, or
even messages. The first known tally sticks were made from animal bones.

2. Abacus/Counting frame: This is a calculating tool used for performing
simple mathematical operations such as adding and subtracting.

3. The difference engine: An automatic mechanical calculator designed to

tabulate polynomial functions.

4. Coins / Stones: An ancient adding and subtracting tool.

5. Rope: A piece of strong, thick cord used for marking distance.

6. Slide rule: This is a mechanical analogue computer, used primarily for
multiplication and division, and also for functions such as roots, logarithms
and trigonometry.

7. Computer: A computer is a special multipurpose machine (electronic
device) that is capable of receiving instructions and data, storing and
processing them, and giving the desired result as output accurately and at
incredibly high speed.

8. Adding Machine: The adding machine was invented by Blaise Pascal in

1642; it is used for computing simple arithmetic problems.

COMPUTING DEVICES AND THEIR INVENTORS

1. William Oughtred developed the SLIDE RULE

2. John Napier invented NAPIER'S BONES

3. Blaise Pascal invented THE FIRST MECHANICAL DIGITAL CALCULATOR

(Pascaline)

4. Herman Hollerith invented the PUNCHED CARD TABULATING MACHINE

5. Charles Babbage invented the ANALYTICAL ENGINE AND DIFFERENCE

ENGINE

6. Mauchly and Eckert invented THE FIRST ELECTRONIC DIGITAL

COMPUTER

7. The Atanasoff-Berry Computer was invented by Prof. John V. Atanasoff

8. UNIVAC was the FIRST COMMERCIAL COMPUTER TO USE VACUUM
TUBES.

BASIC CALCULATIONS ON NUMBER BASE

To convert from base 2 to base 10

Example: Convert 101100101 base 2 to base 10

i. List the digits in order, and count them off from right to left, starting
with zero.
ii. Multiply each digit by 2 raised to its position number, then add the
results, as shown in the sketch after this list.
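
For the example above, this gives 1×256 + 0×128 + 1×64 + 1×32 + 0×16 + 0×8 + 1×4 + 0×2 + 1×1 = 357. A short Python sketch of the same working (illustrative only, not part of the original notes):

# Positional working for 101100101 (base 2), as described above.
digits = "101100101"
total = 0
# Count the positions from the right, starting at zero.
for position, digit in enumerate(reversed(digits)):
    total += int(digit) * (2 ** position)
print(total)           # 357
print(int(digits, 2))  # Python's built-in conversion agrees: 357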

To convert from base 10 to base 2

Example: Convert 357 base 10 to base 2

i. To do this you need to divide by 2 repeatedly until the quotient is 0.
ii. Keep track of the remainders at each step.
iii. The answer is obtained by reading the remainders from bottom to top
(last remainder first).

If properly done you should arrive at the answer 101100101 base 2; a sketch
of this working is shown below.
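
A short Python sketch of this repeated-division method (illustrative only, not part of the original notes):

# Repeated division by 2 for 357 (base 10), keeping the remainders.
n = 357
remainders = []
while n > 0:
    remainders.append(n % 2)  # record the remainder at each step
    n //= 2
# Reading the remainders from bottom to top gives the binary digits.
binary = "".join(str(r) for r in reversed(remainders))
print(binary)        # 101100101
print(bin(357)[2:])  # built-in check: 101100101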

SOME NUMBER SYSTEMS AND THEIR DESCRIPTION

1. Binary Number system: Base 2, Digits used 0 and 1

2. Octal number system: Base 8. Digits used: 0 to 7

3. Hexadecimal number system: Base 16. Digits used: 0 - 9, letters used A -

F.

COMMON TERMS OF NUMBER SYSTEMS

a. Base: the base of a number system indicates how many absolute values it
uses. The decimal system has ten absolute values, represented by the digits
0, 1, 2, 3, 4, 5, 6, 7, 8, 9; the binary system has only two absolute values
(0 and 1); the octal system has eight absolute values (0, 1, 2, 3, 4, 5, 6, 7);
and so on.

b. Absolute value: this denotes the whole numbers represented by the symbols
of a system, e.g. 4, 5, 6, 7, 8, etc.

c. Position: in the decimal system the zero position has the positional value
10^0, which is 1 (the next positions have the values 10^1 = 10, 10^2 = 100,
and so on).
CONCEPTS OF NUMBER SYSTEM IN COMPUTING

Most computer systems operate using binary logic. The binary number
system works like the decimal number system, except that the binary number
system uses base 2, which includes only the digits 0 and 1. The common
number systems used in computing are:

1. decimal number system (base 10)

2. binary number system (base 2)

3. octal number system ( base 8)

4. hexadecimal number system (base 16)

All types of information in computers can be represented using binary code.

Some examples are:

1. numbers

2. letters of the alphabet and punctuation marks.

3. microprocessor instructions

4. graphics/video

5. sound
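
For instance, text is usually stored with a character code such as ASCII, in which each letter is a number that the computer keeps as a binary pattern. A tiny Python sketch (illustrative only) for the letter 'A':

# The letter 'A' has ASCII code 65, stored as the 8-bit binary pattern 01000001.
code = ord("A")
print(code)                 # 65
print(format(code, "08b"))  # 01000001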

When humans are speaking to one another, they speak in a particular

language. This language is made of words and letters. The term computer

numbering formats refers to the schemes implemented in digital computer

and calculator hardware and software to represent numbers.


Although we type words and letters in the computer, the computer does not

understand the words and letters. Rather, those words and letters are

translated into numbers. Computers “talk” and understand in numbers.

There are common number systems used in computing.

Binary or base 2: there are only two digits in binary, 0 and 1. Because
computers use a sequence of switches that can be on or off (also called a
bit), base 2 works very well for them. Math in base 2 is pathetically simple,
but incredibly time consuming.

Octal or base 8: uses the numbers 0 to 7. There are eight bits in a byte,
which is used very often in the computer field. (A bit is great, but it's too
small to hold any useful data, thus the byte is used.) Math in octal is more
complicated than in decimal.

Decimal or base 10: uses the numbers 0-9. I’m sure you’re familiar with

the system. Computers only display numbers in decimal; they actually do

all their work in binary. Math is quite simple with this number system,

although some may argue.

Hexadecimal or base 16: uses the digits 0-F. Because there are 16
values per placeholder, six new digits have to be created. Those digits
are A, B, C, D, E and F. "A" has a value of 10; "B" is 11, and so on. Math in
hexadecimal is not very simple compared to decimal.
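
To see how the same value looks in each of these systems, a small Python sketch (illustrative only, not part of the original notes) prints one number in all four bases:

# The decimal value 357 written in the four common number systems.
n = 357
print(format(n, "b"))  # binary:      101100101
print(format(n, "o"))  # octal:       545
print(n)               # decimal:     357
print(format(n, "X"))  # hexadecimal: 165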

The abacus

The abacus, also called a counting frame, is a calculating tool that was in

use in the ancient Near East, Europe, China, and Russia, centuries before
the adoption of the written Arabic numeral system. The exact origin of the

abacus is still unknown. It is known as the first mechanical calculating
device. The abacus is made up of a wooden frame in which rods carrying
sliding beads are mounted.

Napier bones

Napier's bones is a manually-operated calculating device created by

John Napier of Merchiston, Scotland for the calculation of products and

quotients of numbers. The method was based on lattice multiplication, and

also called 'rabdology', a word invented by Napier. John Napier,

the inventor of logarithms, also invented this aid to calculation known as

'Napier's Bones' in 1617. The 'bones' consist of a set of rectangular rods,

each marked with a counting number at the top, and the multiples of that

number down their lengths.


Slide rule

The slide rule was invented around 1620–1630, shortly after John Napier's

publication of the concept of the logarithm. In 1620 Edmund Gunter of

Oxford developed a calculating device with a single logarithmic scale; with

additional measuring tools it could be used to multiply and divide.

Schickard's calculating clock

Wilhelm Schickard was born in Herrenberg, Germany. He was a
mathematician, astronomer, painter, Lutheran minister and Hebrew and
Aramaic teacher. He invented the calculating clock in 1623 because he
wanted to help his friend Johannes Kepler, who was calculating the Alfonsine
tables manually.

Pascal’s Calculator
In the year 1642, Blaise Pascal, a French scientist, invented an adding
machine called Pascal's calculator or Pascaline, which represented the
positions of digits with the help of gears in it.

Leibniz Calculator

In the year 1671, a German mathematician, Gottfried Leibniz, modified the
Pascal calculator and developed a machine which could also perform
calculations based on multiplication and division.

Analytical Engine

In 1833, Babbage designed a machine called the "analytical engine". The
device was as large as a house, powered by 6 steam engines, more general
purpose in nature and programmable thanks to the punched card technology
of Jacquard. Through the connection to the Jacquard loom, Babbage called
the two main parts of his Analytical Engine the "store" and the "mill", as
both terms are used in the weaving industry. The store was where numbers
were held and the mill was where they were "woven" into new results. In
modern computers these same parts are called the memory unit and the
central processing unit (CPU).

Herman Hollerith punch card: In 1881, Herman Hollerith began designing

a machine to tabulate census data more efficiently than by traditional hand

methods. The U.S. Census Bureau had taken eight years to complete the

1880 census, and it was feared that the 1890 census would take even

longer. Hollerith invented and used a punched card device to help analyse

the 1890 U.S. census data. His great breakthrough was his use of electricity

to read, count and sort punched cards whose holes represented data

gathered by the census-takers.

His machines were used for the 1890 census and accomplished in one year
what would have taken nearly 10 years of hand tabulating. In 1896,
Hollerith founded the Tabulating Machine Company to sell his invention,
the Company became part of IBM in 1924.
Hollerith first got his idea for the punch-card tabulation machine from
watching a train conductor punch tickets. For his tabulation machine, he
used the punch card invented in the early 1800s, by a French silk weaver
called Joseph-Marie Jacquard. Jacquard invented a way of automatically
controlling the warp and weft threads on a silk loom by recording patterns
of holes in a string of cards.
Hollerith's punch cards and tabulating machines were a step toward
automated computation. His device could automatically read information
which had been punched onto a card. He got the idea and then saw
Jacquard's punch card. Punch card technology was used in computers up
until the late 1970s. Computer "punched cards" were read electronically:
the cards moved between brass rods, and the holes in the cards created an
electric current where the rods would touch.

John von Neumann's Machine

John von Neumann, a Hungarian-American mathematician, early computer
scientist and the inventor of the merge sort algorithm, described in his First
Draft of a Report on the EDVAC (distributed on June 30, 1945) a computer
architecture in which the data and the program (instructions) are both stored
in the computer's memory in the same address space. The EDVAC is a
computer that by design includes an instruction set and can store in
memory a set of instructions (a program) that details a computation. The
idea of the stored-program computer changed everything.

3. DIGITALIZATION OF DATA

Digitization is the process of converting information into digital format. This
information may represent an object, image, sound, document or a signal
(usually an analog signal) organized into a discrete set of points or
samples. The result is binary data that computers and many devices with
computing capacity (such as digital cameras and digital hearing aids) can
process. Data digitalization is the process by which physical and manual
records such as text, images, video and audio are converted into digital
forms.

Digitalization can also be defined as the integration of digital technologies
into everyday life. Modern cameras, televisions, phones and computers are
all examples of digital technology. Prior to the digital system, most
technologies ran on the analog system.


An analog system uses a continuous signal that varies in amplitude to

represent a variable, such as voice or data, rather than having a limited

range of steps like a digital system. A digital system uses a binary numeric

system in which electronic pulses are represented by either 1 for a high

pulse or 0 for a low pulse. Digital systems can more easily represent
symbols, such as alphanumeric characters that represent real-world data,
than analog systems.

• Note the difference between Digitization and Digitalization.

• Digitization means converting from analog to digital: converting
information of any format into a format that can be read/understood
by a computer.

Digitalization means using digital technologies to change a business
model. For example, a business that has been promoting its product with
newspaper ads switches to promoting it through digital channels instead.

BENEFITS OF DIGITALIZING DATA

Long term preservation of documents

Orderly archiving of documents

Easy and customized access to information

HISTORY OF COMPUTER DEVELOPMENT

The computer was born not for entertainment or email but out of a need to
solve a serious number-crunching crisis. By 1880, the U.S. population had
grown so large that it took more than seven years to tabulate the U.S.
Census results. The government sought a faster way to get the job done,
giving rise to punch-card based computers that took up entire rooms.
Today, we carry more computing power on our smartphones than was
available in these early models. The following brief history of computing is a
timeline of how computers evolved from their humble beginnings to the
machines of today that surf the Internet, play games and stream
multimedia in addition to crunching numbers.
1801: In France, Joseph Marie Jacquard invents a loom that uses punched
wooden cards to automatically weave fabric designs. Early computers would
use similar punch cards.
1822: English mathematician Charles Babbage conceives of a steam-driven
calculating machine that would be able to compute tables of numbers. The
project, funded by the English government, is a failure. More than a century
later, however, the world's first computer was actually built.
1890: Herman Hollerith designs a punch card system to tabulate the 1890
census, accomplishing the task in just three years and saving the
government $5 million. He establishes a company that would ultimately
become IBM.
1936: Alan Turing presents the notion of a universal machine, later called
the Turing machine, capable of computing anything that is computable. The
central concept of the modern computer was based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State
University, attempts to build the first computer without gears, cams, belts
or shafts.
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a
Palo Alto, California, garage, according to the Computer History Museum.
1941: Atanasoff and his graduate student, Clifford Berry, design a
computer that can solve 29 equations simultaneously. This marks the first
time a computer is able to store information on its main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and
J. Presper Eckert, build the Electronic Numerical Integrator and Calculator
(ENIAC). Considered the grandfather of digital computers, it fills a 20-foot
by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive
funding from the Census Bureau to build the UNIVAC, the first commercial
computer for business and government applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell
Laboratories invent the transistor. They discovered how to make an electric
switch with solid materials and no need for a vacuum.
1953: Grace Hopper develops the first computer language, which eventually
becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO
Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the
United Nations keep tabs on Korea during the war.
1954: The FORTRAN programming language, an acronym for FORmula
TRANslation, is developed by a team of programmers at IBM led by John
Backus, according to the University of Michigan.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as
the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for
his work.
1964: Douglas Engelbart shows a prototype of the modern computer, with a
mouse and a graphical user interface (GUI). This marks the evolution of the
computer from a specialized machine for scientists and mathematicians to
technology that is more accessible to the general public.
1969: A group of developers at Bell Labs produce UNIX, an operating
system that addressed compatibility issues. Written in the C programming
language, UNIX was portable across multiple platforms and became the
operating system of choice among mainframes at large companies and
government entities. Due to the slow nature of the system, it never quite
gained traction among home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic
Random Access Memory (DRAM) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy
disk," allowing data to be shared among computers.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops
Ethernet for connecting multiple computers and other hardware.
1974-1977: A number of personal computers hit the market, including
Scelbi & Mark-8 Altair, IBM 5100, Radio Shack's TRS-80 — affectionately
known as the "Trash 80" — and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the
Altair 8080, described as the "world's first minicomputer kit to rival
commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer
to write software for the Altair, using the new BASIC language. On April 4,
after the success of this first endeavor, the two childhood friends form their
own software company, Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool's
Day and roll out the Apple I, the first computer with a single-circuit board,
according to Stanford University.

1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It
sold like crazy. For the first time, non-geeks could write programs and make
a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the
first West Coast Computer Faire. It offers color graphics and incorporates
an audio cassette drive for storage.
1978: Accountants rejoice at the introduction of VisiCalc, the first
computerized spreadsheet program.
1979: Word processing becomes a reality as MicroPro International releases
WordStar. "The defining change was to add margins and word wrap,"
said creator Rob Barnaby in email to Mike Petrie in 2000. "Additional
changes included getting rid of command mode and adding a print function.
I was the technical brains — I figured out how to do it, and did it, and
documented it. "

1981: The first IBM personal computer, code-named "Acorn," is introduced.


It uses Microsoft's MS-DOS operating system. It has an Intel chip, two
floppy disks and an optional color monitor. Sears & Roebuck and
Computerland sell the machines, marking the first time a computer is
available through outside distributors. It also popularizes the term PC.
1983: Apple's Lisa is the first personal computer with a GUI. It also features
a drop-down menu and icons. It flops but eventually evolves into the
Macintosh. The Gavilan SC is the first portable computer with the familiar
flip form factor and the first to be marketed as a "laptop."
1985: Microsoft announces Windows, according to Encyclopedia Britannica.
This was the company's response to Apple's GUI. Commodore unveils the
Amiga 1000, which features advanced audio and video capabilities.
1985: The first dot-com domain name is registered on March 15, years
before the World Wide Web would mark the formal beginning of Internet
history. The Symbolics Computer Company, a small Massachusetts
computer manufacturer, registers Symbolics.com. More than two years
later, only 100 dot-coms had been registered.
1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture
provides speed comparable to that of mainframes.
1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics
laboratory in Geneva, develops HyperText Markup Language (HTML), giving
rise to the World Wide Web.
1993: The Pentium microprocessor advances the use of graphics and music
on PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in
the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big
Adventure" are among the games to hit the market.
1996: Sergey Brin and Larry Page develop the Google search engine at
Stanford University.
1997: Microsoft invests $150 million in Apple, which was struggling at the
time, ending Apple's court case against Microsoft in which it alleged that
Microsoft copied the "look and feel" of its operating system.
1999: The term Wi-Fi becomes part of the computing language and users
begin connecting to the Internet without wires.
2001: Apple unveils the Mac OS X operating system, which provides
protected memory architecture and pre-emptive multi-tasking, among other
benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a
significantly redesigned GUI.
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the
consumer market.
2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the
dominant Web browser. Facebook, a social networking site, launches.
2005: YouTube, a video sharing service, is founded. Google acquires
Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core
mobile computer, as well as an Intel-based iMac. Nintendo's Wii game
console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin
applications to the taskbar and advances in touch and handwriting
recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and
jumpstarting the dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google
Chrome OS.
2012: Facebook gains 1 billion users on October 4.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
2016: The first reprogrammable quantum computer was created. "Until
now, there hasn't been any quantum-computing platform that had the
capability to program new algorithms into their system. They're usually
each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer at the
University of Maryland, College Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is
developing a new "Molecular Informatics" program that uses molecules as
computers. "Chemistry offers a rich set of properties that we may be able to
harness for rapid, scalable information storage and processing," Anne
Fischer, program manager in DARPA's Defense Sciences Office, said in a
statement. "Millions of molecules exist, and each molecule has a unique
three-dimensional atomic structure as well as variables such as shape, size,
or even color. This richness provides a vast design space for exploring novel
and multi-value ways to encode and process data beyond the 0s and 1s of
current logic-based, digital architectures."

4. TYPES OF COMPUTER

Classification based on data representation: computers are basically
1. Analogue computers, 2. Digital computers, and 3. Hybrid computers.

1. Analogue Computers: These handle data in the form of varying
signals/quantities that assume an infinite number of levels during variation,
i.e. they work by measuring changes in a continuous physical or electrical
state rather than counting. Such measurements as temperature, voltage,
volume, chemical composition of petroleum products, the amount of current
flowing through an electric circuit, etc., are used.


2. Digital computers: This type processes data in discrete form, i.e. "ON or
OFF", YES/NO. They count numbers and can only handle discrete values, not
continuously varying ones. All the input must be in "quantized" or integral
form; everything they do is translated into a series of numerals or digits.
They process discrete numbers. Examples are mainframe, mini and
microcomputers.

3. Hybrid computers: These are the type of computers that combine the
functions of both analogue and digital signals/data, using an
analogue-to-digital converter (or vice versa), and work as a single system.
Thus, a hybrid computer consists of an analogue and a digital computer
functioning as a single computer.

5. COMPONENTS OF THE COMPUTER

The classic components of a computer are briefly described below:

Each component is discussed in more detail in its own section. The

operation of the processor is best understood in terms of these components.


[Diagram: components of a computer: the processor (containing the control
and the data path) connected to memory and to external input and output
devices.]
1. data path- manipulates the data coming through the processor. It
provides a small amount of temporary data storage. The data path consists of
the following components:

Programmable registers: small units of data storage that are directly


visible to assembly language programmers. They can be used like simple
variables in a high-level program.

The program counter (PC)- holds the address for fetching instructions.

Multiplexers- have control inputs coming from control. They are used for
routing data through the data path.

Processing elements- compute new data values from old data values. In
simple processors the major processing elements are grouped into an
arithmetic-logic unit (ALU).

Special-purpose registers- hold data that is needed for processor
operation but is not directly visible to assembly language programmers.

2. control- generates control signals that direct the operation of memory
and the data path. These signals:

tell memory to send or receive data;

tell the ALU what operation to perform;

route data between different parts of the data path.

3. memory- holds instructions and most of the data for currently executing
programs. The rest of the data is held in programmable registers, which can
hold only a limited amount of data.

4. input- external devices such as keyboards, mice, disks and networks


that provide input to the processor. In modern processors, the data is
placed in memory before entering the processor. Input handling is largely
under the control of operating system software.
5. output- external devices such as displays, printers, disks and networks
that receive data from the processor. In modern processors, this data is
placed in memory before leaving the processor. Output handling is largely
under the control of operating system software.
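
As a rough illustration of how these components cooperate, the following toy Python sketch (illustrative only, not a real processor design) imitates memory holding a small program, a program counter selecting the next instruction, control decoding it, and an ALU-like step computing a new value:

# Toy sketch: memory holds a tiny "program", the program counter (pc)
# selects the next instruction, control decodes it, and the "ALU" computes.
memory = [("ADD", 5), ("ADD", 7), ("HALT", 0)]
registers = {"ACC": 0}   # one programmable register
pc = 0                   # program counter

while True:
    opcode, operand = memory[pc]   # fetch the instruction at address pc
    pc += 1
    if opcode == "ADD":            # execute: add the operand to the register
        registers["ACC"] += operand
    elif opcode == "HALT":
        break

print(registers["ACC"])  # 12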

System unit

The system unit is the computer casing. It is the core of a computer system.
Usually it’s a rectangular box placed on or underneath your desk. Inside
this box are many electronic components that process information. The
most important of these components is the central processing unit (CPU), or
microprocessor, which acts as the brain of your computer. Another
component is random access memory (RAM), which temporarily stores
information that the CPU uses while the computer is on. The information
stored in RAM is erased when the computer is turned off.
