The Role of Memory in The Computer
People in the computer industry commonly use the term "memory" to refer to RAM (Random
Access Memory). A computer uses RAM to hold the temporary instructions and data needed to
complete tasks. This enables the computer's CPU (Central Processing Unit) to access
instructions and data stored in memory very quickly.
A good example of this is when the CPU loads an application program - such as a word
processing or spreadsheet program - into memory, thereby allowing the application program to
work as quickly and efficiently as possible. In practical terms, having the program loaded into
memory means that you can get work done more quickly with less time spent waiting for the
computer to perform tasks.
The process begins when you enter a command from your keyboard. The CPU interprets the
command and instructs the hard drive to load the command or program into memory. Once the
data is loaded into memory, the CPU is able to access it much more quickly than if it had to
retrieve it from the hard drive.
People often confuse the terms memory and storage, especially when describing the amount
they have of each. The term memory refers to the amount of RAM installed in the computer,
whereas the term storage refers to the capacity of the computer's hard disk. To clarify this
common mix-up, it helps to compare your computer to an office that contains a desk and a file
cabinet: the desk represents memory, the workspace where you spread out whatever you are
working on right now, while the file cabinet represents storage, where documents are filed away
until you need them.
Here's another important difference between memory and storage: the information stored on a
hard disk remains intact even when the computer is turned off. However, any data held in
memory is lost when the computer is turned off. In our desk space metaphor, it's as though any
files left on the desk at closing time will be thrown away. Memory needs power to hold data –
without power, it is as if all the data and files it is currently holding are thrown away. This is
known as “volatility”, and memory is said to be “volatile”.
Adding more memory to a computer system generally increases its performance. If
there isn't enough room in memory for all the information the CPU needs, the computer has to
set up what's known as a virtual memory file. In so doing, the CPU reserves space on the hard
disk to simulate additional RAM. This process, referred to as "swapping", slows the system
down. In an average computer, it takes the CPU approximately 200ns (nanoseconds) to access
RAM compared to 12,000,000ns to access the hard drive. To put this into perspective, this is
equivalent to what's normally a 3 1/2 minute task taking 4 1/2 months to complete!
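To see where that comparison comes from, here is a small Python sketch that works out the ratio and scales a 3 1/2 minute task by it. The access times are the rough, illustrative figures quoted above, not measurements from any particular machine:

```python
# Rough, illustrative access times taken from the text above.
ram_access_ns = 200            # typical RAM access time
disk_access_ns = 12_000_000    # typical hard drive access time

ratio = disk_access_ns / ram_access_ns
print(f"Hard drive access is about {ratio:,.0f}x slower than RAM")

# Scale a 3.5 minute task by the same factor.
task_minutes = 3.5
slowed_minutes = task_minutes * ratio
slowed_months = slowed_minutes / (60 * 24 * 30)   # approx. 30-day months
print(f"A {task_minutes} minute task would take roughly {slowed_months:.1f} months")
```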
What is RAM?
The term 'RAM' is an acronym for Random Access Memory. This is the memory that your
computer uses to run its operating system and any applications (software programs) that you
start. The name means that the computer can access information held anywhere (i.e. at a random
location) in RAM by addressing that part of the RAM directly. In other words, if there is some
information stored in the 1000th location in memory, the system does not have to read the
information in the preceding 999 locations to get there; instead it can access the 1000th location
simply by specifying it. The alternative is called sequential access, an example of which
would be accessing information stored on a hard drive: the hard drive can only read the
information as it passes underneath the read/write heads, so if an application wants information
in, say, sector 14 of a certain track, the hard drive has no option but to read all the information on
that track until it gets to sector 14. So RAM is much quicker at retrieving information.
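A minimal sketch of that difference, using a Python list to stand in for RAM and a generator that must be consumed in order to stand in for a track on a hard drive (the names are illustrative, not a real disk API):

```python
# A list stands in for RAM: any location can be read directly by its index.
ram = [f"data-{i}" for i in range(2000)]
print(ram[999])           # jump straight to the 1000th location (index 999)

# A generator stands in for a disk track: values can only be read in order.
def disk_track(sectors):
    for sector in range(sectors):
        yield f"sector-{sector}"

track = disk_track(64)
for value in track:       # must pass over sectors 0..13 to reach sector 14
    if value == "sector-14":
        print(value)
        break
```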
Why not have everything on your computer stored in RAM? The answer is cost and volatility:
RAM costs far more per GB than a hard drive, and most RAM requires power to maintain the
information stored in it. If you had a RAM-only computer, you would have to reload the
operating system, all your software applications and your data every time you switched off or
there was a power cut. Generally, a system is best served by a mix of RAM and hard drive
storage.
The more applications you open and run simultaneously, the more RAM is required. You might
think that sooner or later you will run out of RAM, and then what? Well, the operating system is
designed to cope with that situation by 'swapping' blocks of RAM to the hard drive. What that
means is that if the system is running out of RAM, it takes the contents of a 'chunk' of RAM (usually
the least used part) and writes it to a reserved area of the hard drive. The 'chunk' of RAM is then
declared free for use. By using the swap space in this way, the system normally never runs out of
RAM. But accessing information on the hard drive is inherently slower than accessing it from
RAM, so the result is that the computer slows down. No-one likes a slow computer, so what do you do
about it? Obviously, you want to add more RAM.
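As a rough illustration of the "least used chunk goes first" idea, here is a toy Python sketch of an operating system swapping pages out when RAM fills up. The page count, capacity and names are invented for the example; real virtual memory managers are far more sophisticated:

```python
from collections import OrderedDict

RAM_CAPACITY = 4          # toy RAM that holds only 4 pages
ram = OrderedDict()       # ordered least recently used -> most recently used
swap_file = {}            # reserved area on the "hard drive"

def access_page(page, data=None):
    """Load a page into RAM, swapping out the least recently used page if full."""
    if page in ram:
        ram.move_to_end(page)             # mark as most recently used
    else:
        if page in swap_file:
            data = swap_file.pop(page)    # swap the page back in from disk
        if len(ram) >= RAM_CAPACITY:
            victim, victim_data = ram.popitem(last=False)   # least recently used
            swap_file[victim] = victim_data                 # write it to swap
        ram[page] = data
    return ram[page]

for p in range(6):                        # touching 6 pages overflows the toy RAM
    access_page(p, data=f"contents of page {p}")
print("in RAM:", list(ram))               # the 4 most recently used pages
print("swapped out:", list(swap_file))    # the 2 least recently used pages
```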
Bus - A group of electrical conductors (e.g., wires) linking different hardware in the
computer. Just as a bus in real life is a means of transporting large numbers of people
from one location to another, so a bus in a computer is a means of transporting large
numbers of signals (or data) from one part to another. For example the front-side bus
(FSB) transports data between the CPU and the Memory (and to other destinations).
Cache Memory - Cache memory is a separate store of SRAM used by the CPU to hold the
most frequently used instructions and data. The cache can be accessed more quickly than
normal RAM, so storing frequently used functions and data there gives an overall speed
increase. There are different "levels" of cache depending on how close they are to the CPU:
Level 1 cache is part of the CPU chip itself, while Level 2 and Level 3 have traditionally been
external to the CPU, usually on the motherboard (a toy simulation of how a cache works
follows this list of terms).
Virtual Memory - Simulated RAM on the hard drive, used when the system runs out of space
in real RAM. It is far slower to access than real RAM, and significant degradation of system
performance occurs if a lot of data resides in virtual memory.
DRAM - Dynamic Random Access Memory - a generic term describing RAM in which
the data needs to be refreshed continually. Very widely used in PCs.
SRAM - Static Random Access Memory - a generic term describing RAM in which the
data is retained without the need to refresh. Faster and more expensive than DRAM, and
physically larger per bit of storage.
DIMM - Dual Inline Memory Module - a memory stick with power and data contacts on
both sides of the board.
[Image: A DIMM module]
RIMM - Rambus Inline Memory Module - the memory stick used in systems using
Rambus RAM.
DDR - Double Data Rate memory - a type of DRAM based on SDRAM technology that
transfers data on both the rising and falling edges of the bus clock, doubling the effective
data rate. It uses 184-pin modules. Released in 2000. This was the mainstream memory
technology to the end of 2005.
DDR2 - Double Data Rate 2 memory - a type of DRAM based on DDR technology whose
I/O bus runs at twice the speed of the memory cells, doubling the effective data rate again.
Released in 2004. This is expected to be the mainstream memory technology to the end of
2007. Not compatible with DDR motherboards.
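To make the idea of cache memory concrete, here is a toy Python simulation of a small direct-mapped cache sitting in front of a slower "main memory". The cache size, line size and access pattern are invented purely for illustration and are nothing like real hardware dimensions:

```python
CACHE_LINES = 8
LINE_SIZE = 4                     # addresses per cache line

main_memory = {addr: addr * 2 for addr in range(256)}   # pretend RAM
cache = {}                        # line index -> (tag, cached line data)
hits = misses = 0

def read(addr):
    """Read one address through a direct-mapped cache."""
    global hits, misses
    line_no = addr // LINE_SIZE
    index = line_no % CACHE_LINES              # which cache line it maps to
    tag = line_no // CACHE_LINES               # identifies which block is cached
    entry = cache.get(index)
    if entry and entry[0] == tag:
        hits += 1                              # found in the fast cache
        line = entry[1]
    else:
        misses += 1                            # fetch a whole line from slow RAM
        base = line_no * LINE_SIZE
        line = [main_memory[a] for a in range(base, base + LINE_SIZE)]
        cache[index] = (tag, line)
    return line[addr % LINE_SIZE]

for addr in range(64):            # sequential access reuses each fetched line
    read(addr)
print(f"hits={hits}, misses={misses}")   # most reads hit the cache
```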
RAM Speed
The RAM in Intel-based computers is accessed by the CPU via the front-side bus (FSB) and the
memory bus. Improvements in technology have changed the speed of the FSB dramatically.
Similarly, the RAM itself has a maximum speed at which it can reliably operate, and this must be
at least as high as the memory bus speed.
What does this mean in terms of the quantity of data that can be transferred per second? Taking
information from a variety of memory manufacturers' sites, we can make a table to show some
comparisons of peak memory performance (a rough calculation based on these speeds follows
the table):
Type of RAM               RAM Speed (MHz)
SDRAM                     100
SDRAM                     133
RIMM                      400
RIMM                      533
DDR                       200
DDR                       266
DDR                       366
DDR                       400
Dual Channel RIMM         400
Dual Channel RIMM         533
Dual Channel DDR2         400
Dual Channel DDR2         533
Dual Channel DDR2         667
Dual Channel DDR2         800
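As a rough guide to turning those figures into bytes per second, the peak rate is the number of transfers per second multiplied by the width of the memory bus. The sketch below assumes the DDR and DDR2 figures are effective transfer rates on a standard 64-bit (8-byte) module and that dual channel doubles the width; the Rambus (RIMM) modules use a narrower bus, so this simple formula only covers the SDRAM/DDR/DDR2 rows. Treat the results as back-of-the-envelope estimates rather than manufacturer specifications:

```python
def peak_mb_per_s(effective_mhz, bus_bytes=8, channels=1):
    """Peak transfer rate in MB/s: transfers per second x bytes per transfer."""
    return effective_mhz * bus_bytes * channels     # MHz * bytes ~= MB/s

print(peak_mb_per_s(400))                # DDR 400 (PC3200): 3200 MB/s
print(peak_mb_per_s(400, channels=2))    # Dual Channel DDR 400: 6400 MB/s
print(peak_mb_per_s(800, channels=2))    # Dual Channel DDR2 800: 12800 MB/s
```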
Data is stored in your computer's memory chips in a similar way to storing data in a spreadsheet
- it is organized in rows and columns and is sequential along a row. Each cell in the chip holds
four bits of data. Part of the chip might look like this:
Address Column 1 Column 2 Column 3 Column 4
Row 1 1101 1001 0100 0110
Row 2 1011 1000 1100 0000
Row 3 1111 1010 0101 1100
Row 4 1011 0011 1010 1100
To read the data in a particular cell in our memory chip, the computer needs to indicate which
Row the data is in and then indicate the Column that holds the cell containing the required data.
It does this by issuing (in binary) an "address" for the Row and then the Column. The data is
then accessed using the address bus. The number of bits (ones and zeros) that make up the
address also determines the size of the address bus. For example, to read the data in Row 3,
Column 2 of the table above (the value 1010), the computer must first address Row 3, and once
that address is fixed it addresses Column 2.
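A small Python sketch of that two-step addressing, using the 4x4 table above; the 2-bit binary row and column addresses are chosen purely for illustration:

```python
# The 4x4 memory chip from the table above; each cell holds four bits.
chip = [
    ["1101", "1001", "0100", "0110"],   # Row 1
    ["1011", "1000", "1100", "0000"],   # Row 2
    ["1111", "1010", "0101", "1100"],   # Row 3
    ["1011", "0011", "1010", "1100"],   # Row 4
]

def read_cell(row_address, column_address):
    """Latch the row address first, then the column address, then return the cell."""
    row = int(row_address, 2)       # e.g. "10" -> row index 2 (Row 3)
    column = int(column_address, 2) # e.g. "01" -> column index 1 (Column 2)
    return chip[row][column]

print(read_cell("10", "01"))        # reads Row 3, Column 2 -> "1010"
```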
"640K ought to be enough for anybody." - a remark commonly attributed to Bill Gates.
Ironic that this quote should come from the founder of Microsoft - the company whose Windows
operating system goes through computer resources like kids go through birthday cake. Users of
Windows Vista will quickly realize that it does not run properly on the minimum system
requirement of 512MB RAM, and many people recommend 1 GB RAM at a bare minimum for
running even low-complexity applications. So how much RAM is enough for anybody these
days? Well, there is lots of advice available on the Internet, including detailed analyses covering
different operating systems such as the 'Kingston Ultimate Memory Guide'.
To cut a long story short, you should look at the maximum RAM your system can accommodate
and the maximum you can afford to spend on the upgrade, and then aim to at least double what
you have now. If you are purchasing a PC with Windows Vista, shoot for 2 GB of RAM, if not
more. Do not go under 1 GB.
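If you want to see how much RAM and swap your own machine currently has before deciding on an upgrade, a short Python sketch using the third-party psutil package (installed with pip install psutil) looks like this:

```python
import psutil

mem = psutil.virtual_memory()
swap = psutil.swap_memory()

gib = 1024 ** 3
print(f"Installed RAM : {mem.total / gib:.1f} GiB")
print(f"RAM in use    : {mem.percent:.0f}%")
print(f"Swap in use   : {swap.used / gib:.1f} GiB of {swap.total / gib:.1f} GiB")
```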