Custom PC Retrograde


FROM THE MAKERS OF

RETROGRADE
THE ULTIMATE GUIDE TO PRE-MILLENNIAL PC HARDWARE

IS YOUR FLOPPY DISK KNOWLEDGE MISSING? FIND OUT HOW THEY WORK INSIDE!

IBM and 100% compatible PC systems
286, 386, 486 and Pentium
CGA, EGA and VGA
3.5in high-density floppy drive
Voodoo, PowerVR and GeForce
Sound Blaster, Roland MT-32 and PC speaker
MS-DOS and Microsoft Windows 1.0

REQUIRED: PDF reader and fondness for old computers

HARDWARE ANALYSIS | INTERVIEWS | BUILD A RETRO GAMING PC | LOADS MORE

THE BEST-SELLING MAG FOR PC HARDWARE, OVERCLOCKING, GAMING & MODDING

ISSUE 228 / SEPTEMBER 2022 / £5.99
Build a mini PC: create a killer gaming PC in a tiny package – fit a full-size graphics card, follow our full build guide, learn our expert tips, choose your case
Mini-ITX motherboards: 10 boards reviewed! Prices from £140 to £373, Intel and AMD options
Gamepad group test: 9 PC game controllers reviewed / AMD FSR 2 / GPU buyers guide / How to spray-paint your PC case

ISSUE 227 / AUGUST 2022 / £5.99
CPU megatest: 12 CPUs tested – find the right CPU for your needs, priced from £165 to £720, tested in games and applications, overclocked and stock speed results
AMD Radeon RX 6750 XT and 6950 XT / Comfortable computing: optimise your desk ergonomics / Response times tested / How to water-cool high-end GPUs / Big box games

ISSUE 228 OUT NOW – VISIT CUSTOMPC.CO.UK TO LEARN MORE
WELCOME

BEYOND NOSTALGIA

On the face of it, the first PC we had in my family home bears little resemblance to the multi-core beasts of today with at least 16GB of RAM and often terabytes of storage. Forget running Crysis – it couldn't even run Eye of the Beholder (it needed 640KB of RAM). It had one floppy drive (no hard drive), it could only play most games in four (hideous) colours at 320 x 200, and the only sound came from a bleeptastic internal speaker.

Even then, there was very clearly room for a lot of improvement, but one of the things I love about the PC is that there's been a continual line of progress from the 1980s to the PCs we have now. Apple has flipped between different instruction sets, console makers have come and gone, and the Amiga and Atari ST have disappeared, but the PC standard has stuck with Intel's x86 instruction set and continued to grow throughout that time.

If you like, you can attach a USB floppy drive to your new PC, stick in a 3.5in boot disk for MS-DOS 3.3 from 1988, set the EFI to boot from the floppy drive, and your brand-new PC will still boot with that archaic operating system, running it natively.

I look back on those neolithic PC days with fondness. Double-checking the system requirements stickers on the corners of game boxes; making multiple boot disks for different upper memory configurations; poring over the multi-page retailer price lists in bumper PC mags; gawping at 256-colour VGA after upgrading from CGA; first hearing synth music in games with a Sound Blaster; and being astounded by the first 3D accelerator cards.

But at Custom PC we like to delve into PC hardware to find what makes it tick, and the same goes for the vintage gear in our Retro tech section. We don't just want to say 'hey, remember this!' in the hope of triggering a bout of fuzzy, rose-tinted nostalgia – we want to get our hands on the old hardware, show you how it worked and tell you the story of its development.

This free digital book contains nearly 100 pages of content from that section, not only covering the inner workings of loads of PC hardware and software from the 1980s and 1990s, but also speaking to some of the people involved and even showing you how to build your own retro PC. If you like what's in this book, and you like PCs as much as we do, there's a good chance you'll also like our magazine. Go and have a look at custompc.co.uk

EDITOR
Ben Hardwidge
[email protected]
@custompcmag

EDITORIAL
Editor: Ben Hardwidge ([email protected])
Features Editor: Edward Chester ([email protected])
Production Editor: Julie Birrell
Contributors: K.G. Orphanides, Stuart Andrews
Photography: Andrzej W K, Ben Hardwidge, Brian O'Halloran, Darklanlan, Fiacre Muller, Henry Mühlpfordt, Konstantin Lanzet, Maddmaxstar, Matt Britt, Peter Short, Qurren, Samuel Demeulemeester, Tullius, Vlask

DESIGN
criticalmedia.co.uk
Head of Design: Lee Allen
Designers: Ty Logan

COMMERCIAL & ADVERTISING
Advertising: Charlotte Milligan ([email protected], +44 (0)7725 368887)

SUBSCRIPTIONS
Unit 6 The Enterprise Centre, Kelvin Lane, Manor Royal, Crawley, West Sussex, RH10 9PE
Phone: 01293 312182
Email: [email protected]
Website: custompc.co.uk/subscribe

PUBLISHING
Publishing Director: Russell Barnes ([email protected])

Custom PC magazine is published by Raspberry Pi Ltd, Maurice Wilkes Building, St. John's Innovation Park, Cowley Road, Cambridge, CB4 0DS. The publisher, editor, and contributors accept no responsibility in respect of any omissions or errors relating to goods, products or services referred to or advertised.

DON'T TRY THIS AT HOME: The information in this digital book is provided in good faith. Raspberry Pi Ltd cannot accept any responsibility for loss, disruption or damage to your data or your computer that may occur as a result of following or attempting to follow advice given in this digital book. If things do go wrong, take a break.


CONTENTS

PROCESSORS
6 Intel 286
11 Intel 386
14 Intel 486
18 Socket 7
20 AMD Athlon
23 Intel Slot 1

GRAPHICS
27 CGA
30 EGA
32 VGA
36 3dfx Voodoo
39 PowerVR
42 Nvidia GeForce

SOUND
47 The Sound Blaster Story
52 The PC speaker
54 Roland MT-32

STORAGE
57 Floppy disks

SOFTWARE
62 Windows 1.0
67 Windows 3.1

THE FIRST PC
72 IBM PC 5150

HOW TO
79 Build a DOS PC with a modern twist
88 Install FreeDOS on vintage hardware
92 Emulate DOS on Raspberry Pi

PROCESSORS


INTEL 286
K.G. Orphanides takes a technical look
at Intel’s16-bit swansong

We're now so used to tiny transistors that the 7nm process used to fabricate AMD's latest Zen 3 CPUs hardly seems worth mentioning now – it's hard to keep track of the numbers of transistors when they get into billions. However, you only have to look at early PC CPUs to see just how far silicon manufacturing has come. Intel's 80286 processor was released in 1982, and fabricated on a 1.5µ (1,500nm) manufacturing process, compared to the 3µ (3,000nm) process used by its predecessor, the 8086. It packed in 134,000 transistors: 4.6 times as many as the 8086. By comparison, AMD's 7nm Zen 2 processors contain up to 9.8 billion transistors.

The 80286 was introduced with an entry-level-model clock speed of just 6MHz. This figure would go as high as 12.5MHz for the popular Intel 80286-12, and up to 25MHz for late-era takes on the CPU by other manufacturers, such as AMD and Harris. It would be the last, fastest 16-bit PC processor Intel made.

Its successor, the 80386, was a true 32-bit processor, with a 32-bit data bus and memory addressing to match. But even as its technology was superseded, the 286 was just hitting its stride in the home PC market, which it would dominate until 386 and 486-based PCs started to become vaguely affordable in the early 1990s.

A VISION FOR THE FUTURE
When development began on the 80286 in 1979, Intel's product requirements document envisioned that the powerful new processor would be primarily used in industrial applications, from telecoms to manufacturing automation and medical instruments. It was explicitly designed to be compatible with the 8086, ensuring that software for the older processor would run without modification on the new device. But unlike the 80186 (see opposite), PCs weren't on the 286's original roadmap.

In Intel's 1984 annual report, which details the 286's development, release and nascent domination of the industry, the company admits that in hundreds of pages of planning materials 'the personal computer – which would eventually become its biggest user – wasn't mentioned once'.

The 80286 was announced in February 1982, and the designers had a working prototype to show industry partners that spring, promising 'about three times the performance of any other 16-bit microprocessor'. However, after initial testing of the first 286 wafers, 'progress just seemed to drop to a snail's pace', according to logic design supervisor Jim Slager, again quoted in Intel's 1984 annual report. The processor wasn't yet running fast enough, and the testing programme for CPUs that would come off the manufacturing line was running late.

But in June 1982, IBM – then the world's largest maker of computers – came calling. IBM had been using Intel's 8088 since 1979 and it was looking to give a power boost to its next generation of PCs: the IBM model 5170, better known as the IBM PC/AT.

THE 80287 COPROCESSOR
Since the 8086, floating-point coprocessor chips – popularly known as maths coprocessors – had been made available as optional additions via a motherboard socket. They allow addition, subtraction, multiplication, division and square root calculations on numbers with decimal points to be carried out more quickly than on a standard integer unit, improving performance in arithmetic-intensive applications.
Originally, that was mostly accounting and computer-aided design (CAD) software, but later games were also able to take advantage of the hardware, notably including 1989's SimCity and flight sims such as Falcon 3 in 1991. The 486SX series was the last range of Intel CPUs to be released without a built-in maths coprocessor – its sibling 486DX integrated a floating point unit into the CPU.

An optional 80287 coprocessor provided the 286 with a floating point unit

Intel pulled together a cross-disciplinary task force to complete the testing tools, address bugs and complete the parallel development of motherboard components. Marketing focused on a new public presentation of the 80286, highlighting its superiority to Motorola's popular 68000 processor and emphasising that it was far more than a minor update to the 8086.

Intel emphasised the 286's multi-user and multi-tasking capabilities, including variable privilege levels to restrict access to specific parts of memory, as well as an instruction set designed to rapidly switch between programs, providing support for Unix as well as DOS.

'Unlike the 80186, PCs weren't on the 286's original roadmap'

The marketing push – and especially IBM's adoption of the processor – worked. Chip samples were delivered to customers later the same year and, in 1983, volume production of the 80286 began. The IBM PC/AT launched in August 1984, prompting a wave of AT-compatible computers from companies including Compaq and NEC. By the end of 1988, Intel estimates, there were around 15 million 286-based PCs in use worldwide.

THE OBSCURE, WILDLY SUCCESSFUL 80186
Released at around the same time as the 286, the 80186 was fully software-compatible with the 8086, with an emphasis on increased performance at the lowest possible cost. It was an instant success, and Intel produced 30 times as many 80186s as 8086s in the new processor's first year of release.
Although Intel at one point envisioned the 186 being used in workstations, word processors and PCs, it was the 286 that ultimately came to dominate the desktop market. Unlike the 286, the 186 had its clock generator, timer and interrupt controller – previously motherboard components – built into the CPU. However, these integrated components weren't compatible with the hardware used in the IBM PC, leading IBM to select the 286 for its PC/AT range of computers.
The 186 was nonetheless massively successful, due to its speed and ease of integration into other systems, appearing in coprocessors, communications controllers, flight management computers and general-purpose microcontrollers. It did appear as the main CPU of a few PCs, including the 1986 Sega AI in Japan, the Tandy 2000 in the USA and the frankly inexplicable RM Nimbus schools PC in the UK. Intel ended production of the 186 in 2007, although fully compatible third-party clones are still available.

The 186 was hugely successful outside of the desktop PC world. Image credit: Konstantin Lanzet

A DIFFERENT MODE
Changes to memory handling were a headline feature of the 286, but software support was slow to emerge. The processor introduced protected mode memory addressing and retained real mode addressing to ensure compatibility with applications designed for the 80186, 8088 and 8086.

In real mode, like the 8086, the 286 can address up to 1MB of memory via 20-bit addressing. In protected mode, it can address up to 16MB of memory using a 24-bit bus. This approach has security and stability benefits in that, in protected mode, different programs and users can't access memory segments in use by others.

Protected mode made the 286 compatible with Unix-based operating systems such as Microsoft Xenix, and its secure memory handling made it possible for up to eight users on terminals to be connected to a 286-based Xenix server.

To ensure backwards compatibility, the system has to boot in real mode and then be switched into protected mode by setting a status register bit. To get out of protected mode, you have to reset the CPU. This switching process was crash-prone in some versions of IBM's OS/2 operating system, where it was used to provide an MS-DOS compatibility mode. Some manufacturers put out specialised motherboards, which integrated additional 'warm reset' capabilities.

However, protected mode simply wasn't used by MS-DOS, the most popular operating system used with the processor. Instead, an undocumented instruction, LOADALL, allowed the CPU to access all memory from real mode. It was critical to the HIMEM.SYS file used to manage memory, and allowed real-mode processes to access up to 16MB of RAM by updating the segment-descriptor cache to point at an extended memory address.
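The real-mode address arithmetic described above can be sketched in a few lines of Python (a simplified model for illustration only: the helper and constant names are ours, and quirks such as the A20 wrap-around are ignored):

```python
# Toy model of 286 memory addressing (illustrative, not emulator-accurate).

def real_mode_address(segment: int, offset: int) -> int:
    """Real mode: physical address = (segment * 16) + offset, as on the 8086."""
    return (segment << 4) + offset

REAL_MODE_LIMIT = 2 ** 20       # 1MB, the 8086-compatible address space
PROTECTED_MODE_LIMIT = 2 ** 24  # 16MB, via the 286's 24-bit address bus

# 0xFFFF:0x000F is the top of the 1MB real-mode space
assert real_mode_address(0xFFFF, 0x000F) == REAL_MODE_LIMIT - 1
```

Protected mode drops this direct arithmetic: segment registers become selectors into descriptor tables, which is what lets the CPU police which program may touch which region of memory.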


Sierra’s King’s Quest II: Romancing the Throne explicitly supported the 286-based IBM PC/AT

Protected mode would evolve with the later adoption


of 32-bit addressing in the 80386. By 1988, Windows
3.0 was able to take advantage of a 16-bit protected
mode environment, compatible with both 286 and 386
processors, and Microsoft released compilers and SDKs for The 80287 add-on maths coprocessor could be used to
The 80286 die was improve performance in some games, including SimCity
built on a 1.5-micron third-party developers.
(1,500nm) process. Windows 3.0’s use of 16-bit, rather than 32-bit, protected
Image credit: Pauli mode memory addressing ensured backwards compatibility
Rautakorpi / CC BY
with the 286, but this would be abandoned with the release NEW INSTRUCTIONS
creativecommons.
org/licenses/ of Windows for Workgroups 3.11, which requires the 32-bit Developed simultaneously, the 286 and 186 shared a number
by/3.0 protected mode introduced with the 386. of new additions to their instruction set architecture, above
and beyond those of the original 8086. Like its predecessor,
the 286 instruction set has a 16-bit word size – the number
of bits (binary on/off switches) on which it can operate with
one instruction.
Shared with the 80186 are the ENTER, LEAVE, BOUND, INS,
OUTS, PUSHA, POPA, PUSH immediate and IMUL immediate

A 10MHz 286 could execute


programs up to six times
faster than a 5MHz 8086
instructions, and a range of immediate shifts and rotates. These
include both mathematical operations, such as the signed
integer multiplication of IMUL, and data handling operations. An
example of the latter is PUSHA (push all registers), which saves
the contents of all eight general registers, used to temporarily
store data, to the stack, to and from which instructions can store
or retrieve data.
The 80286 additionally added ARPL, CLTS, LAR, LGDT, LIDT,
LLDT, LMSW, LSL, LTR, SGDT, SIDT, SLDT, SMSW, STR, VERR
and VERW. Most of these instructions are used for protected
mode memory handling, but a few, such as SMSW (store
machine status word) and LMSW (load machine status word)
are used in real mode.
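The stack behaviour of PUSHA and its counterpart POPA can be modelled in a short Python sketch (a toy model, not emulator-accurate; the register names are Intel's 16-bit general registers, and the push order shown is the documented AX, CX, DX, BX, original SP, BP, SI, DI sequence):

```python
# Toy model of the 286's PUSHA/POPA instructions (illustrative only).

def pusha(registers: dict, stack: list) -> None:
    """Push all eight general registers in one go, as PUSHA does."""
    sp_before = registers["SP"]
    for name in ("AX", "CX", "DX", "BX", "SP", "BP", "SI", "DI"):
        # PUSHA stores the value SP held *before* the instruction began
        stack.append(sp_before if name == "SP" else registers[name])
        registers["SP"] -= 2  # each 16-bit push moves SP down two bytes

def popa(registers: dict, stack: list) -> None:
    """Restore the registers saved by PUSHA; the stored SP is discarded."""
    for name in ("DI", "SI", "BP", "SP", "BX", "DX", "CX", "AX"):
        value = stack.pop()
        registers["SP"] += 2
        if name != "SP":
            registers[name] = value
```

A PUSHA/POPA pair saves and restores a task's whole register file in two instructions, part of what made switching between programs on the 286 quicker than issuing eight separate pushes and pops.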

The 286’s machine word status is used to indicate the
presence of features such as an 80287 maths coprocessor
(see p106), and whether the CPU is supposed to be running
in protected or real mode. The introduction of instructions
to efficiently end the execution of a task, save its state and
switch to another, loading its last state, significantly improved
multitasking performance.

PERFORMANCE
The 286 provided a marked performance boost over the
8086 and 8088. This was in part down to faster clock speeds,
particularly when 12.5MHz, 16MHz and even faster 286 CPUs
became popular. The CPU also benefited from significant
architectural redesigns, enabling a 10MHz 286 to execute
programs up to six times faster than a 5MHz 8086, according
to Intel’s Introduction to the iAPX 286 document.
A 12MHz 286 can calculate between 1.28 and 2.66 million
instructions per second (MIPS), compared to 0.330 MIPS
for a 5MHz 8086 and 0.750 MIPS for a 10MHz 8088. The
286’s instructions per clock (IPC) count works out at 0.21
MIPS per megahertz. To help achieve this, the 80286 CPU
comprises four independent processing units: address unit,
bus unit, instruction unit and execution unit, compared with
the two-unit execution and bus organisation of the 8086. It
has demultiplexed address and data buses to improve bus
efficiency, particularly in protected mode.
The instruction unit can decode and hold a queue of three
prefetched instructions, which it sequentially feeds to the
execution unit. Meanwhile, the presence of a dedicated
address unit, which calculates the physical addresses in
memory of the instructions and data being called upon, offered
a key performance improvement over previous systems.
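As a rough cross-check on the figures above, per-clock throughput is simply MIPS divided by clock speed (a quick Python sketch; the table and helper name are ours, and the MIPS values are the ones quoted from Intel's documentation):

```python
# Rough per-clock throughput from the quoted peak MIPS figures.
cpus = {
    "8086 @ 5MHz":  (5.0, 0.330),   # (clock in MHz, peak MIPS)
    "8088 @ 10MHz": (10.0, 0.750),
    "286 @ 12MHz":  (12.0, 2.66),
}

def mips_per_mhz(clock_mhz: float, mips: float) -> float:
    return mips / clock_mhz

# The 286's peak works out at roughly 0.22 MIPS per MHz, in line with
# the quoted ~0.21 figure, versus roughly 0.07 for the 8086
assert round(mips_per_mhz(*cpus["286 @ 12MHz"]), 2) == 0.22
```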

GAMING
The 286's extra power meant that more was possible for game developers. New instructions for moving data between stacks and registers benefited those working in high-level languages such as C. Although the increasing multimedia capabilities of PC systems through the 1980s also played a significant role, the PC's processor power was becoming apparent. That said, in the 1980s, 286 systems were still prohibitively expensive compared with more family-orientated microcomputers, as well as low-end 8086-based PC-compatible machines.

Despite this, the second instalment in Sierra's King's Quest series, 1985's Romancing the Throne, explicitly supported the 286-based IBM PC/AT, booting directly from a floppy disk. By 1990, popular series, such as Ultima and Wizardry, which had once been developed for rival systems, such as the Apple II and IIGS, were receiving MS-DOS first releases.

It wasn't all positive. Some older games whose performance was fixed to clock cycles became unplayably fast, which led to the widespread use of 'Turbo buttons', which would slow the system down to clock speeds comparable with 8086 and 8088 CPUs. Other 286 PCs had a BIOS option to do the same, and utilities such as Mo'Slo were developed in the 1990s to slow down overspeed games.

By 1990, memory-hungry games, such as the US release of Sorcerian, advertised their need for 'AT-compatible' PCs

The 80286 has a dedicated address unit, bus unit, instruction unit and execution unit


INTEL 386
Ben Hardwidge looks back at the PC's first 32-bit CPU

We often complain about the over-inflated price of graphics cards these days, but the prices of today's PC components are extraordinarily generous in comparison with the early days. If you want the latest top-end Threadripper CPU, the fastest gaming GPU and an enormous amount of storage, a machine such as Chillblast's Fusion Conqueror (see p32) will deliver all of it in a well-built machine for £5,999 inc VAT.

Now, I'm not going to pretend that's a small amount of money – it's unaffordable for most of us. But, to get some perspective, let's take the TARDIS back to September 1986, when Compaq released the Deskpro 386, marketed as the first 'true' 32-bit computer. This was a good 11 months after Intel first launched the first 12MHz 386 CPUs, but seven months before IBM's first 386 machine got out of the doors, marking a new era where 'clone' PCs were becoming dominant.

'Adjust that figure for 34 years of inflation, and the price is £19,890'

The top-end launch model of the Deskpro came with a 16MHz Intel 80386 CPU, 1MB of RAM, a 130MB hard drive and a 1.2MB 5.25in floppy drive. It cost $8,799 US, which works out at around £6,737 (the exchange rate in 1986 was very similar to now). Adjust that figure for 34 years of inflation, and the price is £19,890, and that doesn't even include VAT.

If you couldn't afford superfluous luxuries such as a 130MB hard drive, you could alternatively plump for the cheaper model with a 40MB drive – a bargain at $6,499 US (£14,694 ex VAT, adjusted for inflation). This is why most PC users at this time used machines with much older CPUs, often with no hard drive and small amounts of memory, for many years – I was still using an 8MHz 8086 a good 12 years after 1978, when that CPU was first launched.

Just like the prices, the numbers involved with the manufacturing process of the 386 are staggering compared with today's CPUs. The first 386 chips contained 275,000 transistors, which made them a marvel of miniaturisation at the time, but that's a piddly number compared with the more than 9 billion transistors you'll find in the Ryzen 9 3950X across all its dies. In terms of raw transistor numbers, a Ryzen 9 3950X is like 35,000 386 CPUs.

Inside a 386 die, with 275,000 transistors

Those transistors were massively bigger as well, produced on a 1,000-1,500nm node, compared to 7nm in AMD's latest CPU dies. The very first CPUs off the production


line were clocked at 12MHz, then 16MHz, with 20MHz, 25MHz and 33MHz flavours launching later – even the latter is around 1 per cent of the clock speed we see on today's CPUs. Pin-compatible CPUs were also made by AMD, as well as other manufacturers, including Cyrix.

An Intel marketing shot for the 386 shows a 16MHz 386 CPU, as well as an 80387 CPU and some 1.2MB 5.25in floppy disks

MEMORY MANAGEMENT
The first 32-bit x86 CPU was big news in the computing world though. While Motorola's 68000 (used in the Atari ST and Commodore Amiga, among others) had introduced us to an internal 32-bit CISC CPU architecture back in 1979, it also used a 16-bit external data bus and a 24-bit address bus. Intel's first 80386 CPUs were 32-bit internally and across external buses, offering a huge advance over the previous 16-bit 8088, 8086 and 80286 processors.

In theory, this meant a PC could now address 4GB of RAM (a limit that would only become seriously challenged 20 years later), although realistically the limits of technology at the time meant that most 386 PCs could only address up to 32MB, and even that was considered overkill. For reference, my 386 PC in the 1990s came with 4MB of RAM, but I upgraded it to 8MB using 30-pin SIMMs and it felt decadent.

More importantly for the time, the 386's memory system was designed to be easily extended well beyond the 640KB base memory limit of MS-DOS. The ins and outs of archaic memory systems are well beyond the scope of a two-page nostalgia piece, but the basic gist is that a 16-bit x86 CPU could only address 64KB of memory at a time, so any memory on top of this figure had to be divided into 'segments' that it could address separately.

In order to maintain backwards compatibility, the 386 still retained this segmenting approach in 'real mode', but it also offered a new form of 'protected mode'. This mode was first introduced with the 286 to allow the use of virtual memory (effectively paging to a hard drive). However, the 386 added an on-board paging translation unit to mediate between the segments and the physical address bus, which effectively enabled the computer to present all these segments as one big sea of memory, even though it was technically still segmented. It made for a much friendlier memory system for software developers, particularly for memory-hungry graphical user interfaces, and it paved the way for PCs with ever larger memory allocations.

THE JOY OF SX
The ability to address so much memory was overkill for the home market, though, and the prices of original 386 machines put them well out of the reach of this market anyway. To get the 386 into home machines, Intel introduced a cut-down version called the 386SX, with the original design now getting the 'DX' suffix.

This isn't to be confused with the 'SX' and 'DX' suffixes used on the later 486 chips though. When it came to 486 CPUs, the DX versions had a built-in floating-point unit, called a math coprocessor at the time, while the SX chips only had an integer unit, although you could add an 80487 math coprocessor to most 486SX machines separately.

Conversely, neither the 386SX nor the DX had a built-in floating point unit – you needed a separate 80387 coprocessor if you wanted that. The difference between the 386SX and DX was that the former had a 16-bit data bus, although it kept the CPU's internal 32-bit architecture. The idea was that having a 16-bit data bus would cut down on the need for highly intricate PCBs with loads of traces, reducing the cost of manufacturing. The other knock-on effect of fewer connections was that a 386SX could only address up to 16MB of RAM. However, as we've already covered, this was still way more than enough for the home market at the time.

SOFTWARE
The big problem for the 386 for most of its useful lifespan was mainstream software support. An executable file called EMM386 was made a part of several variations of DOS to allow these primitive operating systems to access a 386's extended memory, but Microsoft's Windows operating system was still stuck in the 16-bit era at this time. There were nods to the 386's capabilities in Windows 3, including a 386 Enhanced Mode (if you had 2MB of RAM) that let you run DOS and Windows software at the same time, but there was no mainstream 32-bit operating system.

A special 386 version of Links gave you gorgeous SVGA graphics for the time

It wasn't until Microsoft introduced Windows NT 3.1 in 1993 that 32-bit Windows became a reality, but even then there was little supporting 32-bit software, and it also ran slowly on most machines at the time. It wasn't until Windows 95 came out, ten years after Intel made the first 386 CPUs, that the 386's internal 32-bit architecture was properly used in everyday software. It introduced the Win32 API, giving you proper 32-bit computing abilities, and it enabled filenames longer than eight characters.

I was thrilled at the time. I was still using a 20MHz 386SX with 8MB of RAM as my main PC, which just satisfied Windows 95's system requirements. It was dog-slow, of course. As a reference point, in the morning I would switch on the PC. Then I would go downstairs, eat a bowl of Weetabix, then make and drink a cup of tea. By the time I got back to my PC, Windows 95 would have just about finished loading. Windows 95 was really designed for 486 and Pentium machines, but you could still run it on a 386, finally fulfilling its 32-bit promise.

That doesn't mean the 386 was useless for all this time though. It still had loads of power when acting as a 16-bit processor – upgrading from a 16MHz 286 to a 33MHz 386 made a huge difference to the performance of Windows 3.1, and for gamers, the 386 was the holy grail. This was before the days of GPUs and 3D accelerators, so every aspect of the number crunching for games was performed on the CPU, which meant you needed all the CPU power you could get.

By the early 1990s, PC gaming had started to progress from basic EGA graphical adventures and platform games, and we were starting to see games that really took advantage of processing power. If you wanted to play Wing Commander II or Strike Commander, you really needed to have a 386, and preferably a 486. Meanwhile, X-Wing, TIE Fighter, Doom, The 7th Guest, Dungeon Master II, Myst, The Elder Scrolls: Arena, SimCity 2000 and UFO: Enemy Unknown (otherwise known as XCOM) all required a 386 CPU as the bare minimum. There was also a special 386 version of the golf game Links, giving you superior graphics at 800 x 600.

That said, I ran many of these games on my 20MHz 386SX in the early 1990s, and while they technically worked, I usually had to run them at extremely low detail, and even then the frame rate would have been unacceptable by today's standards. Running Doom required me to have big bars around a tiny screen in order to make the game playable.

Technically, you could play Doom on a 386, but only on a tiny screen surrounded by a big frame

THE 386'S LEGACY
There was clearly room for improvement where gaming was concerned, but the 386 laid the foundation for what was to come, being the binary blueprint for many of its successors. It introduced us to the IA-32 (sometimes called i386 or x86) standard that's still used by some software today – any Windows software in the 'Program Files (x86)' folder on your C drive will be fundamentally based on this instruction set, and Intel continued to develop new IA-32-only CPUs well into the Pentium 4 era. It was only when AMD launched its first 64-bit AMD64 CPUs in 2003, and PCs started bumping up against that 4GB memory limit, that mainstream CPUs started to push into the 64-bit era.

'It was the 386SX that got these powerful PCs into our homes for gaming'

In many ways, my old HP Vectra 386 is the PC for which I hold the most affection from the past. I was still using my 20MHz 386SX up until 1997, a good 12 years after the first 386 chips came off the production line, and I'd pushed my machine as far as it could go. Every 30-pin SIMM slot was filled; all the IDE channels were occupied by hard drives and a quad-speed CD-ROM drive; most of the 16-bit ISA slots were taken up by a 1MB SVGA card, a 14.4K modem and a 16-bit sound card (with a wavetable daughterboard).

It's so different to my PC now, which only has one of its expansion slots filled. My 386 might have struggled with Windows 95 and Doom, but it ran Windows 3.1 well, and it made for an awesome setup for playing Dune, Civilization and LucasArts adventures.

Plus, while the 386SX was considered to be limited in comparison with the 386DX at the time, it was the 386SX that got these powerful PCs into our homes where we could use them for gaming. Once the 386 started getting into homes, the PC started to take off as the leading games machine that we know today. It was the 386SX that first properly put the PC in front of the Amiga and Atari ST when it came to gaming power, leading to PC exclusives such as X-Wing and Myst, and the PC has never looked back since.
13
PROCESSORS

INTEL 486
Stuart Andrews recalls the mighty CPU that
made the PC the ultimate powerhouse

The 486 went into development at an interesting time for Intel. The Intel 386 line had seen Intel snatch a victory from the jaws of a disaster, making up for the failure of Intel’s new-fangled iAPX432 architecture with a mix of strong compatibility and great performance. Its design team, led by chief architect John Crawford, had dragged the 16-bit x86 architecture into the 32-bit era and kept Intel ahead of the pack.

But other manufacturers were moving fast. Arch-rival AMD was already developing its own 386 CPUs and only Intel’s litigation was delaying their release. Cyrix was already producing Intel-compatible maths co-processors and was threatening to move into CPUs. Intel needed an awesome new product.

On the other hand, there was a lot of conflict within Intel – and within the computing community at large – over the future direction of processor architecture. Many felt that CISC (Complex Instruction Set Computer) architecture, as used in the x86 line, was a technological cul-de-sac; that performance would flatten out within a few years as Moore’s law met fundamental barriers of computing.

They saw RISC (Reduced Instruction Set Computer) architecture as the future, using a smaller number of more versatile instructions and optimising the hell out of the architecture to drive performance. While one team at Intel worked on a successor to the 386, another was working on a new RISC processor that eventually became the i860. You probably haven’t heard of the i860, which tells you a lot about how this situation played out.

In theory, the i860 should have trumped any 386 successor, but in 1985, Intel’s CEO, Andy Grove, put John Crawford and hotshot architect Pat Gelsinger in charge of the design. Crawford and Gelsinger had already worked together on the 386 and shared a strong belief in the potential of the x86 and CISC architecture. Both felt that, while RISC had its advantages, a redesigned x86 chip could keep up.

The 1st-generation 486 was twice as fast as a 386 with the equivalent clock speed. Image credit: Andrzej W K, own work, CC BY-SA 3.0

What’s more, it could do it without forcing big software publishers to redevelop their applications, rebuild operating

14
systems and optimise compilers. When you threw more transistors at the problem and increased their frequency, there was no reason why a CISC chip couldn’t compete with a RISC CPU. Apply Moore’s Law and keep increasing speeds, and a CISC chip might even crush it.

OPTIMISE THE PIPELINES!
Gelsinger and Crawford focused on delivering a processor that was fully 386-compatible and would build on the existing 32-bit architecture but would give you a massive increase in performance – at least double, clock for clock. They took inspiration from what was going on with the new RISC CPUs, paying particular attention to how instructions were loaded, organised, decoded and executed on the CPU.

The big innovation was to combine a tighter, more streamlined pipeline with an integrated L1 cache – a first in a mainstream CPU. With 8KB of high-speed SRAM as a store for recently used instructions and data on the same silicon, the instruction pipeline could be fed with a consistent flow, enabling it to execute the simplest and most commonly used instructions at a sustained rate of one per clock cycle – an achievement that RISC devotees believed was beyond a CISC processor.

The new pipeline had five stages, although the first – the Fetch stage – wasn’t strictly necessary for each instruction, as the CPU could fetch about five instructions with every 16-byte access to the cache. Once fetched, instructions went through two decoding stages, where they were organised and fed into the execution units. Here they were executed, and the results written back to registers or memory in a final write-back stage.

The cache minimised any delay in loading data and instructions, and did such an effective job that the processor only had to go to system memory on roughly 5 to 10 per cent of memory reads.
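That 5 to 10 per cent miss rate is worth a quick back-of-the-envelope check. The sketch below is purely illustrative – the one-cycle cache hit and ten-cycle trip to system RAM are assumed round numbers, not measured 486 timings – but it shows why even a small on-die cache made such a difference to how consistently the pipeline could be fed:

```python
# Illustrative sketch only: the 1-cycle hit and 10-cycle memory trip are
# assumptions for the example, not real 486 figures. The hit rates are the
# article's 'roughly 5 to 10 per cent' miss rates, inverted.

def average_access_time(hit_rate, cache_cycles, memory_cycles):
    """Expected cycles per memory access for a given L1 hit rate."""
    return hit_rate * cache_cycles + (1.0 - hit_rate) * memory_cycles

for hit_rate in (0.90, 0.95):
    cycles = average_access_time(hit_rate, cache_cycles=1, memory_cycles=10)
    print(f"{hit_rate:.0%} hit rate: {cycles:.2f} cycles per access on average")
```

Even with system RAM assumed to be ten times slower than the cache, a 90 to 95 per cent hit rate keeps the average access down to between roughly 1.5 and 2 cycles – close enough to cache speed to sustain that one-instruction-per-clock rate.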
What’s more, many 486 motherboards incorporated a secondary cache with 16KB or more of high-speed RAM, reducing latency even further. Meanwhile, the two decoder stages enabled those instructions to be pipelined and processed more efficiently – with five instructions running through the pipeline, one would normally be processed with every clock cycle.

The result was a spectacular improvement in performance. On integer instructions – very much the meat and potatoes of computing at the time – the 486 was at least twice as fast as a 386 running at the same clock speed, and sometimes 2.5 times as fast. This meant the CISC-based 486 could hit similar levels of performance to the RISC-based i860, while still being compatible with all the existing x86 software. There was no need to rebuild or recompile – code developed for the 286 and 386 just worked.

The 2nd-generation DX2 chips doubled their predecessors’ clock speed, a feat never replicated by any subsequent Intel CPU. Image credit: Henry Mühlpfordt, own work, CC BY-SA 3.0

With enough processing power to run it full-screen at a full VGA resolution, Doom became the 486-DX2’s killer app

THE RIVALS
If Intel’s processor design teams put the 486 far ahead of the pack in terms of performance, its legal teams did a cracking job of suppressing any competition. However, eventually Cyrix and AMD won their legal fights, and 486 competitors began to appear. Cyrix’s 486SLC and DLC processors, released in 1992, were particularly interesting.

Effectively a 386DX with a 486 instruction set and just 1KB of L1 cache, they still used a 32-bit bus and gave users a cheap halfway house – a 486DLC33 could run software at roughly the same speed as a 25MHz 486-SX. Not only were the processors more affordable, but they plugged into existing 386 motherboards, meaning the platform as a whole was cheaper.

I had one of these beauties in my first PC, and while it was noticeably less capable than my friend Brian’s mighty 33MHz 486-DX, it could still run X-Wing, Ultima Underworld II, Alone in the Dark and – eventually – Doom. Ultima VIII: Pagan? A bit more of a slideshow, but then it wasn’t a great Ultima, so who cares?

AMD released its own 486 chips in 1993, and while they were late to the party, AMD made up for it with a repeat of a classic 386 performance trick. AMD’s CPUs ran on a 40MHz bus, meaning that the SX-40 and Am486DX/2-80 were slightly faster than the equivalent Intel CPUs.

Meanwhile, AMD’s straight Am486 DX-25 and 33 and SX-33 gave you the same performance as Intel’s equivalents at lower prices. AMD even released what it called the AM5x86-133 in 1995, which competed with the low-end Pentium 75 but was actually a 486 running on a 4x multiplier with a 33MHz clock.
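The multiplier arithmetic behind these clock-multiplied chips can be sketched in a few lines. This is our own illustration rather than anything from Intel or AMD, using the nominal marketing bus speeds (the ‘33MHz’ crystal actually ran at 33.3MHz, which is why three times 33 was sold as 100MHz and four times 33 as 133MHz):

```python
# Back-of-the-envelope model of the clock-multiplier scheme: the decoupled
# core runs at the bus speed times a fixed multiplier. Bus speeds here are
# nominal marketing figures, not the exact 33.3MHz crystal frequency.

def core_clock(bus_mhz, multiplier):
    """Internal CPU clock derived from the motherboard bus speed."""
    return bus_mhz * multiplier

chips = [
    ("486-DX2-50", 25, 2),   # clock-doubled
    ("486-DX2-66", 33, 2),
    ("486-DX4-75", 25, 3),   # 'DX4' parts were actually clock-tripled
    ("AM5x86-133", 33, 4),   # AMD's 4x part, pitched against the Pentium 75
]

for name, bus, mult in chips:
    print(f"{name}: {bus}MHz bus x {mult} = {core_clock(bus, mult)}MHz core")
```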

15
PROCESSORS

With over 1.2 million transistors, the 0.8-micron 486-DX2 relied on Moore’s Law and higher clock speeds to trash the theoretically superior RISC competition. Image credit: Matt Britt, own work, CC BY-SA 3.0

At this point, floating point instructions weren’t so commonly used, but here the news was just as good. Previous Intel processors had worked with optional, discrete maths co-processors, which handled all the floating point logic. These were expensive and not popular outside of business, as only a few major business applications, such as dBase, Borland Quattro Pro and Lotus 1-2-3, actually used a Floating-Point Unit (FPU). The 486-DX, however, integrated one directly onto the processor die, connected to the CPU by its own dedicated local bus.

This meant there was less overhead in shifting data between CPU and FPU; this, combined with other optimisations, resulted in a significant improvement in floating point performance. Fast forward a few years, and Quake would require a CPU with a floating point unit, with the system requirements citing a 486-DX4 as the minimum. Today, it’s impossible to imagine a CPU without an FPU, and that’s thanks to the mighty 486.

Beyond this, differences from the 386 were relatively small. The 486 had a few extra ‘atomic’ instructions that sped up some basic operations, but nothing compared with the instructions added with the 80286 or 386. The 486 also didn’t mess with the 386’s memory model; it could still address 4GB of RAM across a 32-bit bus, with a protected mode that presented both real and virtual memory as one big pool. However, its improved Memory Management Unit performance meant it was much more efficient at shifting data between the system RAM, the CPU and the cache.

OVERDRIVEN
The 486 marked another shift in Intel’s tech and marketing strategy by embracing the whole idea of PC upgrades. As Intel released its clock-doubling DX2 and DX4 processors, it also released OverDrive versions designed to boost existing PCs. Some 486 OverDrive processors were simply replacement CPUs, plugging into the existing 168-pin socket and replacing, say, your 25MHz 486-SX with what was effectively a 50MHz 486-DX – albeit at an eye-watering cost of $549 to $699 US.

At this point, not every CPU could be removed from its socket, but luckily many 486 motherboards shipped with their own 169-pin upgrade socket, originally designed to fit a 487-SX maths coprocessor for 486-SX machines. Sneakily, the 487-SX was actually a fully functional 486-DX with an extra pin that told the motherboard to ignore the existing CPU, and the OverDrive chips just repeated the trick with some extra control circuitry with 50MHz and 66MHz 486-DX2 CPUs.

Doubling your speed was definitely tempting, and SX owners got a maths co-processor in the mix as well. And while Intel pushed the benefits with AutoCAD, WordPerfect and Corel Draw, the biggest sellers for OverDrive chips were undoubtedly games such as Strike Commander, Falcon 3.0 and Doom.

DOUBLE THE CLOCKS!
There was one final architectural change that was to have a major impact, even on today’s PCs. Intel CPUs from the 8086 to the 1st-generation 486 ran at the same frequency as the external bus that connected all the core components together. This meant that the initial 486-DX processors, introduced in 1989-1990, ran at the same 20, 25 and 33MHz speeds as the I/O bus. Intel pushed speeds higher, releasing

16
a 50MHz 486-DX, but the 50MHz bus speed began to cause problems for components elsewhere on the bus. Luckily, the 486 design team had an ace to play: it decoupled the CPU clock speed from the motherboard clock speed and enabled the CPU to run at double the system clock. This fired up the 486-DX2, launched in 1992, to run at internal speeds of 40MHz, 50MHz and even a staggering 66MHz, making the 66MHz 486-DX2 the RTX 3080 of its day in terms of its impact on gaming performance.

The 486-DX4, introduced two years later, went even further, running at triple the bus speed to hit 75MHz and 100MHz; a level of performance that trashed the available RISC competition. The team’s confidence in the x86 architecture no longer looked misplaced.

WALLET-WHACKING POWER
So, the 486 launched with an undeniable advantage in performance in a market where – thanks to Intel’s ace legal department – other x86 chip vendors had practically nothing. There was just one problem. While Intel had moved production down to a 1-micron process, the 486 still had over 1.2 million transistors – a big step from the 275,000 in the original 1.5-micron 386. This made it a comparatively big chip and, partly thanks to its $250 million US R&D costs, also an expensive one.

At launch, the 33MHz 486DX alone cost around $950 US (nearly $1,900 in today’s money), which was roughly three times the cost of the equivalent (and still pretty speedy) 386. A 486 PC cost users somewhere north of £2,000 (roughly £4,500 today). Intel’s response – you guessed it – was to put out a cut-down, cost-conscious alternative, and 1991’s 486-SX wasn’t actually such a bad deal.

When Intel tried the same trick with the 386, it released a hobbled version with a 16-bit data bus and slower clock speeds, but the 486-SX was basically a 486-DX with the FPU disabled. At the time, with so little software that supported the FPU, this wasn’t much of an issue, and by the time the 486-SX was released, it only cost around $250 to $300 US.

THE 486 EFFECT
The power of the 486 was transformative at a time when the CPU was the biggest star of the PC show. Sure, it was supported by a platform where VGA and SVGA graphics cards were growing more powerful, and where standardisation around the VESA local bus and, later, PCI standards was opening up the PC for more powerful add-on cards. However, the 486’s advances in integer and floating point performance arrived just at a point where advances in gaming graphics needed them most.

In the early 1990s, as prices dropped to more affordable levels, the 486 hit its peak. Just check out the games that emerged. Ultima Underworld and its sequel, Strike Commander, Wing Commander III, X-Wing, Ultima VIII: Pagan, IndyCar Racing and Alone in the Dark all launched between March 1993 and December 1994, and with their texture-mapped, Gouraud-shaded 3D graphics, these PC showcases needed all the processing grunt that they could get.

A few simulations, such as Spectrum Holobyte’s Falcon 3 and Digital Image Design’s TFX, even used the FPU. And then, of course, came Doom; a game that you could just about run on a 386 in a stamp-sized patch in the middle of the screen, but which looked amazing running full-screen at the full VGA resolution on a 66MHz 486-DX2.

If all those other games had pushed the PC as the high-end gaming platform of the early 1990s, Doom confirmed it. Even when the PlayStation and Saturn consoles launched a few years later with their fancy-pants, hardware-accelerated 3D tricks, they still struggled to run classic Doom smoothly in full screen. The 66MHz 486-DX2 could do it on its own, simply using sheer number-crunching power. People saw it, liked it and pulled out their wallets. The idea of the PC as the real gaming powerhouse was born.

Games such as Strike Commander pushed the 486 architecture to its limits with advanced 3D texture mapping and Gouraud shading

Cyrix’s low-cost 486 alternatives would work inside a 386 motherboard, making them the bargain Intel alternative of the day

17
PROCESSORS

SOCKET 7
Ben Hardwidge recalls the strange pocket of time
in the 1990s when one motherboard could support
CPUs from multiple manufacturers

Just imagine if you could pick up any one of the motherboards in this month’s Z490 Labs test (see p44), stick a Ryzen chip in it and know it would not only fit, but also work fine. In fact, imagine you’re not just limited to Intel and AMD CPUs, but you could put a CPU from all sorts of other chip manufacturers in your brand new motherboard. Not only that, but there’s a choice of chipsets all designed to work with all these CPUs as well.

We’re so used to exclusive socket and chipset designs now that the idea seems like commercial suicide, but this was the situation in the Socket 7 era of the 1990s. This period is a strange little oasis in the time between times, where CPU and chipset manufacturers just assumed their parts needed to be compatible with each other.

Until this time, Intel had completely governed the design of x86 CPUs, bringing us the 8088, 8086, 286, 386 and 486 (and others) in various guises, and drafted in third parties such as AMD and Cyrix to make clone chips to fill out the supply and meet demand. That all changed when Intel introduced the first Pentium-branded CPUs.

From this point, third-party companies weren’t allowed to reproduce Intel’s flagship desktop CPU microarchitecture, or use the Pentium brand. Instead, the old clone chip suppliers, which still had an x86 licence from the cloning days, had to design their own CPUs.

The first Pentium CPUs were launched on the 5V Socket 4. In this era, Cyrix and AMD instead focused on launching ‘5x86’ CPUs designed as upgrades for existing Socket 3 486 motherboards, as did Intel’s Pentium OverDrive CPUs. However, it was the later 3.3V Socket 5 and Socket 7 platforms that saw Intel, Cyrix and AMD targeting the same CPU socket.

Socket 7 was found on both AT and ATX motherboards, with chipsets from multiple chip makers. Photo credit: Konstantin Lanzet

The only difference between Socket 5 and 7 was that the latter upped the total pin count from 320 to 321, and

18
Socket 7 could provide dual voltage to the CPU via a split rail. The sockets are otherwise basically the same, to the point where you could put a Socket 5 CPU in a Socket 7 motherboard and it would run fine.

CHIPSET CHOICES
This was before AMD made its own chipsets, but there were still plenty of options. If you wanted the best compatibility, your best Socket 7 option was Intel’s Triton series, which peaked with the Triton 430TX in 1997. The 430TX supported either a 60MHz or 66MHz front side bus, and also gave you the option of three types of memory – fast page non-parity, EDO and SDRAM, with the latter two options coming in the brand new DIMM form factor. This led to many motherboards coming with both DIMM slots and the older 72-pin SIMM slots.

However, your choice wasn’t limited to Intel chipsets. Plenty of third-party chip makers, including VIA, ALi, SiS and OPTi, had their own Socket 7 chipset options. For the most part, they held up pretty well, and they were usually cheaper than genuine Intel boards, but there were also sometimes compatibility problems. As an example, when I worked in a computer shop in the late 1990s, we often had problems with ALi-based motherboards not working with the 32x Samsung CD-ROM drive we stocked.

IS IT A BIRD, IS IT A PLANE? NO, IT’S SUPER SOCKET 7!
Intel pulled the plug on Socket 7 after the Pentium MMX, and instead moved its Pentium II CPUs to the new Slot 1 format (see Issue 200, p107). In the meantime, it settled on ATX as the motherboard and PSU standard for Pentium II.

There were some ATX Socket 7 motherboards, but most of them used the older AT form factor, which split the main power socket into two parts and only had a keyboard output (a large DIN socket) fixed to the board as standard – the rest of the ports all connected to the motherboard with ribbon cables. If you used an AT power supply, you also had to physically switch off the PC after use, as it couldn’t be shut down with software. Again, though, this was a strange crossover period, and there were motherboards that conformed to the AT form factor, but which also had both AT and ATX power sockets.

While Intel was busying itself with ATX and Slot 1, though, AMD and its chipset partners went all out on Socket 7. The result was Super Socket 7, which maintained compatibility with older Socket 7 CPUs, but also supported AGP graphics cards and could clock the front side bus at up to 100MHz. There was also a range of Super Socket 7 motherboards in both ATX and AT form factors.

Super Socket 7 was great for cash-strapped enthusiasts, as it meant you could keep most of your old PC – the PSU, case, hard drive and even the memory in many cases; you just needed a new motherboard and CPU if you wanted a decent upgrade. It was massively cheaper than upgrading to Pentium II.

You could run a Pentium CPU in a Super Socket 7 motherboard too, or a Cyrix M-II or IDT WinChip 2, but what you really wanted was an AMD K6-II or K6-III. AMD’s last Super Socket 7 CPUs really pushed the limits of this old socket and the AT era, with the 100MHz front side bus often making these systems faster than the 1st-generation 66MHz Pentium II CPUs, while costing much less money. The K6-III even pushed the clock speed up to 550MHz, and integrated 256KB of L2 cache onto the die.

END OF AN ERA
AMD finally moved to its own Slot A platform with the first Athlon CPUs, as well as introducing the Irongate chipset under its own brand, before it discontinued the K6-III at the end of 2003, eight years after Intel first launched Socket 7.

Meanwhile, Cyrix was bought by VIA, which later produced a few CPUs for Intel’s Socket 370 platform, as well as its own embedded EPIA platform. But the days of multiple CPUs being supported by one socket are now over – the mainstream desktop PC market has since been mainly dominated by just Intel and AMD using their own dedicated CPU sockets.

Socket 7 supported CPUs from multiple manufacturers, including Intel, AMD, Cyrix and IDT. Photo credit: Konstantin Lanzet

19
PROCESSORS

AMD
ATHLON
Stuart Andrews recalls the first AMD x86
CPU that properly put the wind up Intel

The summer of 1999 wasn’t a great time for Intel, and it really should have been. In February it had launched the Pentium III, a supercharged upgrade of the P6 microarchitecture. Cyrix, whose 6x86 processors had embarrassed some 1st-generation Pentiums, was effectively finished, its tech now in the hands of VIA Technologies. That just left AMD, whose K6 line of processors had captured some of the budget PC market, but didn’t have the optimised pipelines, cache or floating point performance to give Intel any serious competition.

Shipping in 550, 600 and 650MHz versions, the original K7 Athlon took the benchmark battle to Intel – and won. Image credit: Maddmaxstar, CC BY-SA 3.0

But when AMD released its first K7 Athlon processors to reviewers in June, something unexpected happened. Sure, there was already some buzz about the new ‘K7’ CPU, thanks to intriguing early demos and briefings, but a Pentium III killer? Not likely. Yet when the final production samples hit magazine labs and website testbenches, it became clear that the new Athlon was pretty special.

AMD’s chip wasn’t just matching the Pentium III, clock speed for clock speed, but beating it. Worse, it was beating it in the kind of floating point intensive apps that Intel considered home territory, including 3D games. Athlon was kicking Intel right where it hurt, and that eye-watering discomfort wasn’t going to let up any time soon.

K7 COMES TOGETHER
How exactly did AMD manage this feat? Well, as with so many standout products in the hardware space, the answer involves several developments all coming together at the same time. On the one hand, the success of the K6-II and III had left AMD in a surprisingly strong position.

The K6 architecture had made the most of technology bought in with the company’s 1996 acquisition of NexGen and had pumped money into AMD’s war chest. It had also cemented AMD’s position as Intel’s most credible rival. What’s more, AMD also had new CPU and bus technology developed by the Digital Equipment Corporation (DEC) for its Alpha RISC processors. It had even taken on most of DEC’s RISC CPU design team, including key architects Dirk Meyer and Jim Keller.

Thanks to a patent cross-licensing deal with Motorola, AMD also had a head start on new copper-based die manufacturing technologies, not to mention a new chip fab in Dresden on its way to use them. This would become important later on. All this helped lead to a revolutionary design – the first 7th-generation x86 processor.

The original 0.25-micron (250nm) Athlon had a die with over 22 million transistors – the highest transistor count of any x86 processor to date. It also had an ingenious split cache system, with 128KB of on-chip L1 cache operating at clock speed, plus another 512KB of L2 cache included in the processor module.

This L2 cache operated at a fraction of the clock speed – half-speed on the initial models – but with breathing room to scale to higher CPU speeds, at slower cache ratios, later on. This arrangement gave Athlon a performance advantage over the

20
earlier K6 processors, even before you factored any other architectural improvements into the equation.

But these improvements were just as significant. Meyer, Keller and their team designed an architecture that was capable of decoding three x86 instructions simultaneously and – crucially – symmetrically, unlike the Pentium III. True, the Pentium III’s instruction pipeline could handle three simple instructions at once, but feed it more than one lengthy, complex instruction and it choked, as only one pipeline could manage the workload. The Athlon, by contrast, could chew through three complex instructions without any trouble. You got three instructions at a time, every time.

What’s more, the design featured a new level of optimised branch prediction, which was not only more accurate in guessing what the next operation would be, but faster to recover when it got that guess wrong.

Like the team brought in from NexGen, the team brought in from DEC had serious skills and experience in RISC chip design, and AMD put this to good use. The Athlon architecture converted x86 instructions into more efficient ‘macro ops’ and then those ‘MOPS’ into RISC operations, which the CPU’s execution units could work on, nine to a clock.

This design was incredibly efficient by the standards of the day, but it was also conducive to scaling upwards. Where the K6-III had been stuck at 500MHz, the Athlon launched at 500, 550 and 600MHz speeds, matching the 600MHz of Intel’s top-end Pentium III. As if that wasn’t enough, AMD added a 650MHz version fewer than six weeks after launch.

The final kicker was that AMD was no longer second rate on floating point operations. Not only were the Athlon’s floating point units (FPUs) much faster than the weedy FPUs of the K6 line, but AMD built on the SIMD instructions of its 3DNow! technology, with 24 new instructions on top of the original 21. Most mimicked the cache and streaming controls seen in Intel’s mighty SSE tech, but AMD also bundled in new DSP and complex maths extensions, plus MP3 and Dolby Digital decoding tools. This chip was built to game and entertain.

While it was codenamed the K7 right up until launch, AMD named its 7th-gen processor to make it clear it was a break from the K5/K6 past

There was one final way that AMD now matched Intel – the Athlon was AMD’s first chip to abandon sockets and embrace the slot. AMD’s Slot A connector harnessed DEC’s EV6 bus and bus protocol, which allowed for burst data transfers at double the rate of Intel’s equivalent GTL+, giving you a whopping 1.6GB/sec of bandwidth between the CPU and the motherboard chipset.

The Athlon’s front side bus operated at double the 100MHz speed of the memory bus, and as faster RAM became available, this gave AMD scope to up the FSB speed even further, to 266MHz or even 400MHz. What’s more, with a slot design, AMD could combine its CPU die and L2 cache in the one package, and that package was a whole lot easier to fit. And to make sure dozy upgraders didn’t try to stuff AMD CPUs into Intel slots or vice versa, it cleverly reversed the physical design.

Inside the Athlon cartridge. Check out the Slot A connector, the CPU core and the two modules of L2 cache. Image credit: Tullius, CC BY-SA 3.0

AWESOME ATHLON
Talk about architectures and specs was all very well, of course, but nothing really prepared those of us benchmarking PCs in the late 1990s for the sheer undeniable awesomeness of Athlon. The results of benchmarks wouldn’t have made comfortable reading for Intel, especially once the Athlon 650 rolled out in August. Both the Athlon 600 and Athlon 650 were faster than the Pentium III 600 in Quake III: Arena, whether paired with the hero graphics chip of the day – Nvidia’s Riva TNT2 – or with 3dfx’s still speedy Voodoo 3.

The Athlon was around 10 per cent faster in standard Windows applications, and up to 20 per cent faster in gaming benchmarks. The Athlon 600 was 10fps faster than the Pentium III 600 in the fiendishly demanding Quake II Crusher benchmark. As further tests from the likes of AnandTech proved, even a Pentium III overclocked to 650MHz couldn’t keep up.
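That 1.6GB/sec figure quoted for the EV6 bus checks out with some simple arithmetic. The sketch below is our own, assuming the bus’s 64-bit (8-byte) data path: a 100MHz clock moving data on both edges manages twice the transfers of a single-pumped bus at the same speed.

```python
# Rough arithmetic check of the EV6 bandwidth figure: clock rate times
# transfers per cycle times bus width. The 64-bit data path is an assumption
# for the example, and 1GB is treated as 10^9 bytes.

def bus_bandwidth_gb_s(clock_mhz, transfers_per_cycle, width_bytes):
    """Peak bus bandwidth in GB/sec (decimal gigabytes)."""
    return clock_mhz * 1_000_000 * transfers_per_cycle * width_bytes / 1e9

ev6 = bus_bandwidth_gb_s(100, 2, 8)      # double-pumped, 8 bytes per transfer
single = bus_bandwidth_gb_s(100, 1, 8)   # one transfer per cycle

print(f"Double-pumped 100MHz bus: {ev6:.1f}GB/sec")
print(f"Single-pumped 100MHz bus: {single:.1f}GB/sec")
```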

21
PROCESSORS

ISSUES AND OVERCLOCKS
Of course, no new CPU comes without teething troubles. Early buyers found a range of compatibility issues with specific hardware, partly because Athlon was a complex, power-hungry CPU, and partly because of AGP slot power issues affecting many motherboards and driver issues with the latest Nvidia cards. With certain VIA chipsets and less consistent power supplies, you could find yourself in a world of instability. Nvidia even released a driver update for its graphics chips that disabled the high-performance 2x mode on the AGP slot when Athlon was detected.

Some enthusiasts were also disappointed with the Athlon’s limited overclocking potential. The K6 line had been a treat for overclockers – gamers upped 450MHz CPUs to 600MHz routinely, and there was much debate in PC magazines about whether we should allow manufacturers to send in pre-overclocked systems.

The Athlon wasn’t having any of that. The only ways to overclock the original CPUs were to crack open the modules and interfere manually with the resistors, or to purchase a third-party ‘Goldfingers’ device which did it all for you. Through either method you could increase your multiplier and give your Athlon a healthy speed boost, although it meant invalidating your warranty along the way.

And this was just the beginning. In September, Intel launched the Pentium III 600B – a variant of the ‘Katmai’ Pentium III with a 133MHz front side bus. It couldn’t match the Athlon 550 in many benchmarks, let alone the 600 and 650MHz versions, and still lagged behind the Athlon when it came to gaming performance.

In October, AMD responded with a 700MHz Athlon that pulled even further ahead. AnandTech’s benchmarks of the time put it 20 per cent faster than the Pentium III 600B in the Quake II Crusher benchmark. It was nearly 27 per cent ahead in Quake III.

It was only with the launch of its Coppermine Pentium III processors in October 1999 that Intel could claw back the lead. Yet while the Pentium III 733EB was now king of the hill, an Athlon 700 could still benchmark faster in many tests than Intel’s 700MHz Coppermine Pentium III.

As the clock speeds rose, the competition just grew hotter. In November 1999, AMD launched a new series of Athlons with a 0.18-micron (180nm) K75 core, taking the top speed up to 750MHz. In January and February, these were followed with 800 and 850MHz CPUs. Then just as Intel geared up to launch a (gasp!) 1GHz Coppermine Pentium III in March 2000, AMD stole its thunder by launching the Athlon 1000. To really take the proverbial, it did it two days earlier, giving AMD the first 1000MHz x86 CPU.

Athlon was first, but it wasn’t fastest. The Pentium III 1000EB was actually ahead of the Athlon 1000 in many tests, partly due to superior SSE support in many popular benchmark games. Yet there were only a few frames per second in it, and Athlon systems often had the edge on price. What’s more, the Pentium III 1000 was only available to system builders at the time of launch. Anyone could get their hands on the 1GHz Athlon at the time.

THE AGE OF ATHLON
The Athlon set the stage for a golden age of PC CPUs. Intel struck back with Coppermine, then AMD replaced the K7’s old aluminium interconnects with copper, and ran the L2 cache at the full speed of the CPU. The 2nd-generation Athlon ‘Thunderbird’ processors could match and even beat the Coppermine Intel Pentium IIIs, causing Intel to push even further with its Coppermine T CPUs.

Before we knew it, 1GHz was starting to look like old hat. 1333MHz and 1400MHz were the new targets. Meanwhile, the K7 architecture was making waves at the budget end. Where Intel’s cheap, cache-less Celeron processors couldn’t handle Deus Ex, Half-Life or Unreal Tournament, AMD’s K7 Duron CPUs were storming through them.

There’s still a lot of affection for the Athlon in the PC enthusiast community. At a time when Intel seemed unassailable, it was the first chip to really knock it off its feet. This made Intel try a little harder, and the result was that everybody won. In fact, it’s a mark of that affection that when AMD created a new budget Zen-based processor line-up in 2018, it still used the Athlon brand, with the Athlon 3000G coming out in November 2019. It’s a sign of a classic brand when it’s still being used 20 years later.

The Athlon’s microarchitecture was a revolutionary leap from the relatively simple K6, with more advanced pipelines, faster FPUs, more cache and a three-wide instruction decoder

INTEL SLOT 1
Holograms, black boxes and mountains of cache. Ben Hardwidge recalls
the weird moment in time when Intel’s CPUs came in Slot format

Packaging your CPU inside a big box before slotting it into your motherboard seems like a recipe for a thermal catastrophe now, but for a brief period around the turn of the Millennium, Intel (and later AMD) mounted their CPUs on circuitboards inside sleek black packages. They looked great too. There was now room for proper logos and a flashy physical design. Instead of dropping a nondescript-looking ceramic square into your motherboard, you had a fancy black box with a hologram on the front.
With their slick packaging, the first Pentium II CPUs looked great in the TV adverts and promo shots. I remember wanting one just because they looked so good – a small hologram sticker clearly goes a long way towards manipulating people like me! To the uninformed, it looked like these attractive slot-based CPUs were the way of the future, but if you took a peek inside the box, they were clearly a result of technological limitations at the time.

An original SECC Pentium II – look at the size of the cache chips on either side of the CPU area

CACHE FOR QUESTIONS
To understand the need for slot processors, we need to start by going back a bit further in time. Before the Pentium II, Intel had two major CPU designs. It had the Socket 7 Pentium MMX for consumer PCs, which was the last gasp for the first Pentium design, now running at up to 233MHz. MMX stands for multimedia extensions, and it effectively enabled a lot more functions to be handled in software on the CPU, rather than in hardware. For example, an MMX CPU enabled you to properly use a software PCI modem, rather than a full hardware one, saving you some money.
For servers and workstations, Intel had also introduced the Pentium Pro, a massive chip that was heavily geared towards pure 32-bit computing. It lacked consumer frills such as MMX instructions, but you could run more than one Pentium Pro in parallel on a multi-socket board.
The Pentium Pro also had a massive L2 cache that ranged
between 256KB and 1MB, depending on the
model. At this time, there was no way to integrate
this cache directly into the CPU die, but the
Pentium Pro did incorporate its huge L2 cache
in the same Socket 8 package as the CPU die,
and it also ran the cache at the same speed
as the CPU. There was a big problem with this
approach at the time though – making these
Socket 8 packages with full-speed cache was an
expensive process, and there were low yields.
Intel wanted to combine the two ideas,
making a desktop CPU with loads of L2 cache,
as well as consumer features such as MMX.
It also needed to be better at executing 16-bit
code (which was still used by some software
at the time) than the Pentium Pro and, most
importantly, it needed to be affordable to
manufacture on a large scale.
This meant compromising, as Intel knew it
couldn’t practically equip a mainstream desktop
CPU with loads of full-speed cache in a socket


package. The answer was to manufacture the CPU package on a usual square format without the L2 cache, and to then mount that package on a circuitboard that contained the cache, resulting in the Pentium II in 1997. It had 7.5 million transistors, produced on a 350nm manufacturing process.
Like the Pentium Pro, the CPU used a separate 'back-side bus' to communicate with the cache, but unlike the Pentium Pro, the Pentium II could only run the L2 cache at half the speed of the CPU. Intel attempted to counter the performance of the cache by first doubling the amount of L1 cache, from the Pentium Pro's 16KB to 32KB on the Pentium II. The Pentium II's L2 cache also had a 16-way associativity, compared with 8-way on the Pentium Pro. A higher associativity means the CPU has a greater chance of finding the data it needs in that cache, but that it can take longer to search for it than a cache with lower associativity.
The other way Intel bumped up the Pentium II's performance was by simply equipping it with a lot of this L2 cache. All the first models of Pentium II came with a pair of large 256KB cache chips, giving you 512KB in total – more than you found on some Pentium Pro CPUs.
By the end of it, you had a circuitboard containing a full CPU package in the middle, with two large cache chips next to it. This was then encased in a box with a thermally conductive metal back. The whole package was called a single edge contact cartridge, or SECC, and you would then attach a heatsink and fan arrangement to the metal back, and slot the whole setup into your motherboard.

Inside a Pentium II die – that's 7.5 million transistors, produced on a 350nm manufacturing process, and with no integrated L2 cache

The SECC package looked good on the surface, but if you took one apart, you could see that it was a bit of a bodge job. I was working in a computer shop at the time, and we joked that the Pentium II was a 'Socket 7 on a circuitboard' – you could even see the solder points where the socket pins could have been located on the CPU package. It was still a normal square CPU package – it was just mounted on a board instead.
Performance was mixed. If you were running full 32-bit software in Windows 95, then the Pentium II was generally faster than the Pentium MMX, but the latter still had the edge in some 16-bit software, such as MS-DOS games. It also didn't help that the first Pentium II CPUs used the same 66MHz front side bus as the final Pentium MMX chips, with the first Pentium IIs running at 233MHz, 266MHz and 300MHz, and a 333MHz variant arriving later, following a die shrink to 250nm. This meant that, in some cases, the top-end 233MHz Pentium MMX was faster than the low-end 233MHz Pentium II.

Slot 1 Celerons didn't come in a fancy chassis, and the first models didn't come with any L2 cache either. Photo by Qurren

THINK OUTSIDE THE BOX
The processor's new clothes came well and truly off in 1998 when Intel introduced its budget range of Slot 1 CPUs, with the still ridiculous name of Celeron. The first generation of Celeron CPUs, codenamed Covington, removed all of the L2 cache from the circuitboard, as well as all the fancy, hologram-clad packaging. This left you with a peculiar-looking green circuitboard with a square CPU clearly soldered into the middle of it – Intel called this non-cartridge arrangement the SEPP format.
The lack of cache meant these Celerons performed poorly at the time, pushing people looking for a budget CPU towards AMD's K6 line-up, which still used the aging Socket 7 form
factor that Intel had deserted. However, the next generation of Celerons in 1999, codenamed Mendocino, overturned this part of the CPU market. They were still mounted on circuitboards at first, but Intel had now nailed a method to produce a small amount of L2 cache on the same die as the CPU, running at full speed.
These new Celerons came with 128KB of full-speed on-die cache, meaning they were quicker than the 1st-gen (and much more expensive) Pentium II CPUs in some applications. By this time, Intel's next generation of Pentium II CPUs used a 100MHz front side bus, rather than 66MHz, which provided a significant performance boost over their predecessors. Accompanied by the new Intel 440BX chipset, the new CPUs ran at 350MHz, 400MHz and 450MHz, and Intel clearly hoped that this FSB tweak would help distinguish the Pentium II line-up from the new Celeron line-up, despite the latter's faster cache.
Unfortunately for Intel, overclockers had started discovering that there was plenty of headroom for some of the Mendocino Celerons to go much faster, despite Intel locking down the multipliers in an attempt to prevent it. If you put a new Celeron in an Intel 440BX board, or a board with VIA's competing 100MHz FSB Apollo Pro chipset, you could try moving the 66MHz FSB jumper to the 100MHz setting. If you were lucky, and you had a decent heatsink and fan on your CPU, your 300MHz Celeron would suddenly be running at 450MHz, thanks to its 4.5x multiplier. Combine the clock speed with the full-speed cache and your £60 processor could potentially outperform a £400 one.
I remember this well, and bought a 333MHz Mendocino Celeron with a 5x multiplier, in the hope of running it at 500MHz. It booted, but soon fell over once you got into Windows. Thankfully, my VIA Apollo Pro board also gave me the option to run the FSB at 75MHz or 83MHz if you tweaked the jumper switches right, and the latter setting stably ran my budget CPU at 415MHz. I had no need to buy a Pentium II now.

'Slotket' adaptors enabled you to plug a Socket 370 CPU into a Slot 1 motherboard. Photo by Konstantin Lanzet

FINAL SLOTS
While the first Mendocino Celerons were still mounted on Slot 1 circuitboards in order to maintain motherboard compatibility, their integrated L2 cache design meant they no longer technically needed the rest of the circuitboard. A few months later, the first Socket 370 Celerons started appearing, with 'Slotket' adaptors required in order to plug them into Slot 1 motherboards. It was a bizarre setup that persisted for an unusual length of time.
Intel wasn't quite ready to give up Slot 1 yet. Intel started by tweaking the design of the CPU chassis, removing the metal plate at the back. The final arrangement, called SECC2, retained the plastic front cover with the hologram, but left the circuitboard and CPU die bare at the back, in order to facilitate better thermal transfer to the cooler.
Next came the Pentium III, codenamed Katmai, which added SSE instructions, but was still fundamentally based on the same P6 core as the Pentium II. It also still had an external half-speed L2 cache setup, with both the CPU and cache mounted on a circuitboard. It wasn't until the Coppermine (don't be fooled by the name – all the interconnects were aluminium, rather than copper) revision of the Pentium III, with a die shrink to 180nm, that Intel finally integrated 256KB of full-speed L2 cache into a CPU die containing 29 million transistors.

The Pentium III maintained the front with the hologram, but used the new SECC2 packaging, which left the circuitboard bare at the back

Later came a 133MHz front side bus and Intel's 820 chipset, accompanied by high-bandwidth but expensive RDRAM. However, the Slot 1 design still persisted. Even the first Pentium III to break the 1GHz barrier was based on a slot design. Intel needed to maintain compatibility, which was handy for many of us enthusiasts who had worked out that you could still run the latest CPUs on some old 440BX boards by overclocking the front side bus to 133MHz. There was also no shortage of Slotket adaptors at this time, enabling you to install Socket 370 CPUs into Slot 1 motherboards.
The final Slot 1 CPU I saw was an engineering sample of a 1.13GHz CPU that Intel sent to PC Pro magazine, but the chip was recalled due to stability problems. The slot era was now over, and motherboards based on Intel's later SDRAM-based 815 chipset only came in Socket 370 format. The Pentium III carried on in socket format, as did the later Pentium 4, and the CPU industry hasn't looked back since. Slot processors might have looked good, and a part of me misses the fancy casing with the holograms, but there's no doubt that integrating cache directly onto the die is a much faster and more efficient way of doing it.
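The maths behind these overclocks is nothing more than bus speed multiplied by the chip's locked multiplier. A minimal sketch using the figures quoted above – the function name is ours, not anything from the period:

```python
# On Slot 1-era platforms, core clock = front side bus x multiplier.
# Mendocino Celerons had locked multipliers, so overclocking meant
# raising the FSB via the motherboard jumpers instead.

def cpu_clock(fsb_mhz: float, multiplier: float) -> float:
    """Return the resulting core clock in MHz."""
    return fsb_mhz * multiplier

# Celeron 300A: roughly 66MHz x 4.5 at stock (~300MHz)...
stock = cpu_clock(66.6, 4.5)
# ...and 100MHz x 4.5 = 450MHz with the FSB jumper moved.
overclocked = cpu_clock(100, 4.5)   # 450MHz

# The 333MHz part (5x multiplier) at the 83MHz jumper setting:
tweaked = cpu_clock(83, 5)          # 415MHz
```

The same arithmetic explains why a 5x part could only reach 415MHz on an 83MHz bus, well short of the hoped-for 500MHz at 100MHz FSB.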
GRAPHICS

CGA
Ben Hardwidge delves into the workings of
the PC’s very first colour graphics adaptor

Some 16-colour ASCII art by Ben Hardwidge, aged 12

People often nostalgically reminisce about archaic technology, laughing about the frustrations and limitations of cassette tapes or floppy disks, before adding 'but they were amazing at the time!' You simply can't hide the horror of the PC's first colour graphics adaptor (CGA) behind such rose-tinted glasses (although orange-tinted glasses might help – more on that later). Nobody, absolutely nobody, thought CGA was amazing at the time.
I had a CGA PC in the 1980s, and even then you felt disappointed when you fired up a PC game to be greeted by a mess of purple and black on the screen. At the time, we joked that CGA stood for 'crap graphics adaptor'. Nobody thought of IBM computers as games machines then, of course – CGA was the product of IBM trying to make a graphics standard that could display bar charts properly. It wasn't meant to compete with the Commodore 64.
Better graphics came to the PC later, of course, but CGA was supported for a long time. The later EGA (enhanced graphics adaptor) and VGA (video graphics array) cards were very expensive at first, so CGA still had a home in cheap IBM PC compatible machines, such as Amstrad's PC1512. CGA first appeared in 1981, but new software was still supporting it well into the early 1990s – you can even run Windows 3.0 on it.

TEXT MODE
At its basic level, a standard 16KB (yes, KB) CGA card can access a palette of 16 colours, or rather eight colours at two intensities. It's basically 4-bit colour, with three bits allocated to red, green and blue (RGB), and the fourth bit enabling you to change the 'intensity' of the colour (RGBI).
At the first level of intensity, you get black, blue, green, cyan, red, magenta, brown and light grey. The second level of intensity basically gives you the same colours but with an extra level of intensity, which turns the brown into a yellow, the light grey into a white and the black into a dark grey, while creating light versions of the other colours.
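That RGBI scheme can be modelled in a few lines. This is a hedged sketch of the common IRGB encoding – note that on real hardware the brown fix-up (halving the green of dark yellow) was done inside RGBI monitors, not by the card itself:

```python
def rgbi_to_rgb(index: int) -> tuple[int, int, int]:
    """Map a 4-bit CGA RGBI colour index to 24-bit RGB."""
    intensity = bool(index & 0b1000)     # the 'I' bit
    r = 0xAA if index & 0b0100 else 0x00
    g = 0xAA if index & 0b0010 else 0x00
    b = 0xAA if index & 0b0001 else 0x00
    if index == 0b0110:                  # colour 6: dark yellow...
        g = 0x55                         # ...shown as brown on RGBI monitors
    if intensity:                        # intensity lightens every channel
        r, g, b = r + 0x55, g + 0x55, b + 0x55
    return (r, g, b)

# All 16 CGA colours: eight base colours at two intensities
palette = [rgbi_to_rgb(i) for i in range(16)]
```

Running this gives the eight low-intensity colours (including brown at index 6) and their light counterparts, with light yellow at index 14 and white at index 15.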

Now, you might think 16 colours sounds okay for 1981, but you can only display all these 16 colours on the screen at once in text mode – the mode you used to see on BIOS screens before we had fancy EFI systems. On a CGA card, the text display has an effective resolution of 640 x 200, but it can only display text characters on it, with 80 characters on the X axis, and 25 on the Y axis.
As a kid, I used to play around with this mode quite a lot, as it was the only way to get a lot of colours on the screen. If you knew your ASCII codes, you could display various lines and blocks as text characters and make a picture. You effectively have to 'type' a picture, rather than drawing it – I used to spend hours doing it. To type ASCII codes, you hold down Alt and type a three-digit number – 176, 177 and 178 give you three blocks of variable shading, and 219 gives you a solid block, for example – it still works in Windows. In this text mode, you could assign each character a foreground and a background colour.
Game developers used this mode too – I had a clone of Ms Pac-Man that used to run in text mode rather than graphics mode, as well as a clone of Breakout called Bricks. On a standard CGA card, it was the only way to get access to lots of colours. There was a trick to enable you to display all 16 colours at an effective graphical resolution of 160 x 100, by changing the number of lines of each text character to display. However, it was rarely used. If you wanted graphics rather than text, you usually either had four colours on the screen at 320 x 200, or one colour at 640 x 200.

COLOUR GRAPHICS
Let's start with the former, as that was the one that enabled you to get actual colour graphics on your PC. Generally, black was the background colour, and you then had three other colours. As standard, most games used CGA in BIOS mode 4 (the default BIOS mode for graphics), with the high-intensity version of palette 1, which gave you black, white, light cyan and light magenta. It enabled you to make clearly defined shapes with black on white, gave you cyan for skies and water, and then everything else would have to be filled in with magenta. It generally looked hideous, although it was sometimes better for space games – Captain Blood looked surprisingly good in this mode.
You could get other palettes too. Palette 0 was also available in BIOS mode 4, and gave you red, green, black and brown as standard, or light red, light green, black and yellow in high-intensity mode. The latter mode generally looked better in games to me. It meant you couldn't get blue for skies, but you could do pretty sunsets and dark dungeons well. One of my favourite games to use this palette was a fantasy barbarian game called Targhan, which genuinely did look amazing considering the technology it was using.
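As a quick reference, the mode and palette combinations described here can be captured in a lookup table. This is purely illustrative – the names and structure are ours – with entry 0 being the background colour, normally black:

```python
# Four-colour CGA graphics palettes, keyed by
# (BIOS mode, palette number, high_intensity).
# Entry 0 is the background colour, which defaults to black.
CGA_PALETTES = {
    (4, 0, False): ['black', 'green', 'red', 'brown'],
    (4, 0, True):  ['black', 'light green', 'light red', 'yellow'],
    (4, 1, False): ['black', 'cyan', 'magenta', 'white'],
    (4, 1, True):  ['black', 'light cyan', 'light magenta', 'white'],
    # Mode 5 is listed here as palette 0 for simplicity
    (5, 0, True):  ['black', 'light cyan', 'light red', 'white'],
}

def palette_for(mode: int, palette: int, high: bool) -> list[str]:
    """Return the four on-screen colours for a given setup."""
    return CGA_PALETTES[(mode, palette, high)]
```

So the infamous default most games used is `palette_for(4, 1, True)` – black, light cyan, light magenta and white.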
As a kid, I also discovered a trick while playing with the night vision filters for my Dad's binoculars. If you look at the cyan, magenta, black and white palette through an orange filter, it becomes the light yellow, light red, light green and black palette. I bought some orange acetate from the local art shop and stapled it to a cardboard frame with Blu-Tack in each corner – I could then swap between palettes at will!
The low-intensity version of this palette was also used in games occasionally. One example is Pharaoh's Tomb, an early work by George Broussard at Apogee, who later went on to work on the Duke Nukem games.
Another trick often used by game developers was to switch the CGA card to BIOS mode 5, which in high-intensity mode gave you access to a black, white, light red and light cyan palette. It had the same limitations as the default cyan, magenta, black and white palette, but to my eyes, the red looked less garish than magenta.
A few games also ventured outside these palettes with some tricks, which usually involve replacing black as the background colour. Sierra's Leisure Suit Larry in the Land of the Lounge Lizards, for example, used palette 0 at low intensity, but replaced the black background colour with blue (it looks hideous). This palette worked well in golf game World Class Leaderboard, though, with green and brown trees, red leaves, green grass and blue skies and water – colours you should be able to take for granted.
Sierra used the same trick in King's Quest IV: The Perils of Rosella, but using the BIOS mode 5 palette, again replacing the black with blue. The result was a blue, cyan, red and white palette, which worked well with blue sea against cyan sky, but meant the grass and trees looked very odd.

MONO GRAPHICS
The other main graphical option available to standard CGA cards was the 'high-resolution' 640 x 200 monochrome mode. It was used in games that had a fair amount of detail in the graphics, such as Sim City, Death Track and Xenon II: Megablast, among others. It was also used for early GUI operating systems, such as Gem and Windows 3.0.
However, only the horizontal resolution was higher than the colour graphics resolution – the vertical resolution was the same. The result was double-height, rectangular pixels, rather than square ones. This mode also produced a hideous moiré effect on lots of CGA monitors, making it difficult to look at the screen.

COMPOSITE MODE
There was one more trick to getting a standard CGA card to display more colours, and it involved cleverly using the composite output, rather than the 9-pin RGB monitor output. Irritatingly, most PAL TVs in the UK weren't able to handle this mode, as it's dependent on the NTSC chroma decoder mistakenly seeing some luminance signals as colour.
As a result, you could effectively make new colours by lining up pixels in certain patterns on an NTSC display, and again by using different intensities. By placing one colour pixel next to another one, you could make an entirely new colour, and it looked solid rather than a messy mix of pixels. The result is astonishing, enabling you to create a much wider colour palette.
The disadvantage, of course, is that the effect can only be achieved by placing pixels next to each other, which effectively reduces the horizontal resolution from 320 to 160. Some games supported this mode, though, including Sierra's original King's Quest game.

This Breakout clone, called Bricks, was effectively built in text mode so it could access all 16 colours

1. Captain Blood – BIOS mode 4, palette 1 high intensity
2. Formula 1 Grand Prix Circuit – BIOS mode 5
3. Targhan – BIOS mode 4, palette 0 high intensity
4. Ribit (a Frogger clone) – BIOS mode 4, palette 0 low intensity
5. World Class Leaderboard – BIOS mode 4, palette 0 low intensity, black background replaced with blue

Xenon II: Megablast – 640 x 200 'high-resolution' monochrome mode

King's Quest in composite CGA / King's Quest in RGB CGA

TRY CGA FOR YOURSELF
In the unlikely event that you want to try out the shocking disgrace that is CGA graphics for yourself, you can do it in DOSBox (dosbox.com). This handy software creates a virtual machine designed to recreate a high-spec PC from the 1990s. It loads a sound card and MIDI drivers automatically, and gets you set up with a mouse too. It's great if you want to play a round of Doom or X-Wing.
However, later VGA cards didn't support CGA palette-switching as standard. They could run CGA software, but usually in the default black, white, magenta and cyan palette, even if they used a different palette on a CGA machine. DOSBox runs in VGA mode by default, which results in the same problem.
To get around it, you'll need to open Options in your Start menu's DOSBox folder, which takes you into the config file. Scroll down to the '[dosbox]' section, and type 'cga' after 'machine='. After that, scroll down to the '[render]' section, and type 'true' after 'aspect='.
On some monitors you may find that you still don't get the correct 4:3 aspect ratio, even after changing the aspect setting to true. If that happens, we found that setting 'fullresolution=' to '1366x768' fixed it on our 4K monitor. We have no idea why, but it seems to work.
If you want to run a really old game, it may also only be optimised for early processors, and will run too fast on DOSBox's standard settings. If you want to emulate an XT-era 8086 PC, scroll down to [cpu] and type 'simple' after 'core=' and change the number of cycles to 530 (this isn't exact, but it was near enough in our tests).
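Put together, those tweaks leave a dosbox.conf along these lines. The placement of 'fullresolution' under '[sdl]' is our assumption, as the boxout doesn't name its section:

```ini
[sdl]
; Only needed if the aspect ratio still looks wrong on your monitor
fullresolution=1366x768

[dosbox]
; Emulate a real CGA card rather than the default VGA machine,
; so games get genuine CGA palette-switching behaviour
machine=cga

[render]
; Force the correct 4:3 aspect ratio
aspect=true

[cpu]
; Approximate an XT-era 8086 for very old, speed-sensitive games
core=simple
cycles=530
```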


EGA
Stuart Andrews recalls how 16
colours changed the PC world

Pity the poor PC of 1983-1984. It wasn't the graphics powerhouse we know today. IBM's machines and their clones might have been the talk of the business world, but they were stuck with text-only displays or low-definition bitmap graphics. The maximum colour graphics resolution was 320 x 200, with colours limited to four from a hard-wired palette of 16. Worse, three of those colours were cyan, brown and magenta, and half of them were just lighter variations of the other half.
By this point, IBM's Color Graphics Adaptor (CGA) standard was looking embarrassing. Even home computers such as the Commodore 64 could display 16-colour graphics, and Apple was about to launch the Apple IIc, which could hit 560 x 192 with 16 colours. IBM had introduced the Monochrome Display Adaptor (MDA) standard, but this couldn't dish out more pixels, only higher-resolution mono text.
Meanwhile, add-in-cards, such as the Hercules or Plantronics Colorplus, introduced higher resolutions, but did nothing for colour depth. The PC needed more, which IBM delivered with its updated 286 PC/AT system and the Enhanced Graphics Adaptor (EGA).

Using Chips and Technologies' EGA chipset, early graphics card manufacturers such as ATi could produce smaller, cheaper boards. Credit: Vlask, CC BY-SA 3.0

THE NEW STATE OF THE ART
The original Enhanced Graphics Adaptor was a hefty optional add-in-card for the IBM PC/AT, using the standard 8-bit ISA bus and with support built into the new model's motherboard. Previous IBM PCs required a ROM upgrade in order to support it.
It was massive, measuring over 13in long and containing dozens of specialist large-scale integration (LSI) chips, memory controllers, memory chips and crystal timers to keep it all running in sync. It came with 64KB of RAM on-board, but could be upgraded through a Graphics Memory Expansion Card and an additional Memory Module Kit to up to 192KB. Crucially, these first EGA cards were designed to work with IBM's 5154 Enhanced Color Display Monitor, while still being compatible with existing CGA and MDA displays. IBM managed this by using the same 9-pin D-Sub connector, and by fitting four DIP switches to the back of the card to select your monitor type.

The original IBM EGA card was a whopper, even without the additional daughtercard and memory module kit. Credit: Vlask, CC BY-SA 3.0
EGA was a significant upgrade from low-res, four-colour
CGA. With EGA, you could go up to 640 x 200 or even (gasp)
640 x 350. You could have 16 colours on the screen at once
from a palette of 64. Where once even owners of 8-bit
home computers would have laughed at the PC’s graphics
capabilities, EGA and the 286 processor put the PC/AT back
in the game.
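That '16 colours from a palette of 64' figure falls out of EGA giving each channel two bits rather than CGA's one. A sketch, assuming the standard rgbRGB palette-register layout (high bits worth roughly two-thirds of a channel, low bits one-third):

```python
def ega_to_rgb(entry: int) -> tuple[int, int, int]:
    """Expand a 6-bit EGA palette entry (rgbRGB layout) to 24-bit RGB.
    Two bits per channel gives four levels, so 4^3 = 64 colours."""
    def channel(high: int, low: int) -> int:
        return 0xAA * high + 0x55 * low   # levels: 0, 85, 170, 255
    r = channel((entry >> 2) & 1, (entry >> 5) & 1)
    g = channel((entry >> 1) & 1, (entry >> 4) & 1)
    b = channel(entry & 1, (entry >> 3) & 1)
    return (r, g, b)

# 64 possible colours, of which any 16 can be on screen at once
colours = {ega_to_rgb(i) for i in range(64)}
```

The 16 CGA colours survive as a subset of this palette, which is why EGA cards could still drive software written for the older standard.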

BIRTH OF AN INDUSTRY
However, EGA had one big problem; it was prohibitively expensive, even in an era when PCs were already astronomically expensive. The basic card cost over $500 US, and the Memory Expansion Card a further $199. Go for the full 192KB of RAM and you were looking at a total of nearly $1,000 (approximately £2,600 inc VAT in today's money), making the EGA card the RTX 3090 of its day, and only slightly more readily available. What's more, the monitor you needed to make the most of it cost a further $850 US. EGA was a rich enthusiast's toy.
However, while the initial card was big and hideously complex, the basic design and all the tricky I/O stuff were relatively easy to work out. Within a year, a smaller company, Chips and Technologies of Milpitas, California, had designed an EGA-compatible graphics chipset. It consolidated and shrunk IBM's extensive line-up of chips into a smaller number, which could fit on a smaller, cheaper board. The first C&T chipset launched in September 1985, and within a further two months, half a dozen companies had introduced EGA-compatible cards.
Other chip manufacturers developed their own clone chipsets and add-in-cards too, and by 1986, over two dozen manufacturers were selling EGA clone cards, claiming over 40 per cent of the early graphics add-in-card market. One, Array Technology Inc, would become better known as ATI, and was later swallowed up by AMD. If you're on the red team in the ongoing GPU war, that story starts here.

CHANGING GAMES
EGA also had a profound impact on PC gaming. Of course, there were PC games before EGA, but many were text-based or built to work around the severe limitations of CGA. With EGA, there was scope to create striking and even beautiful PC games.
This didn't happen overnight. The cost of 286 PCs, EGA cards and monitors meant that it was 1987 before EGA support became common, and 1990 before it hit its stride. Yet EGA helped to spur on the rise and development of the PC RPG, including the legendary SSI 'Gold Box' series of Advanced Dungeons and Dragons titles, Wizardry VI: Bane of the Cosmic Forge, Might and Magic II and Ultima II to Ultima V. It also powered a new wave of better-looking graphical adventures, such as Roberta Williams' King's Quest II and III, plus The Colonel's Bequest. EGA helped LucasArts to bring us pioneering point-and-click classics such as Maniac Mansion and Loom in 16 colours. And while most games stuck to a 320 x 200 resolution, some, such as SimCity, would make the most of the higher 640 x 350 option.
What's more, EGA made real action games on the PC a realistic proposition. The likes of the Commander Keen games proved the PC could run scrolling 2D platformers properly. You could port over Apple II games such as Prince of Persia, and they wouldn't be a hideous, four-colour mess.
And when the coder behind Commander Keen – a certain John Carmack – started work on a new 3D sequel to the Catacomb series of dungeon crawlers, he created something genuinely transformative. Catacomb 3-D and Catacomb: Abyss gave Carmack his first crack at a texture-mapped 3D engine, and arguably started the FPS genre.
Sure, EGA had its limitations – looking back, there's an awful lot of green and purple – but with care and creativity, an artist could do a lot with 16 colours and begin creating more immersive game worlds.

Forgive the blocky pixels and 16-colour palette. In Catacomb 3-D and Catacomb: Abyss lay the seeds of Wolfenstein and Doom

A SLOW DECLINE
EGA's time at the top of the graphics tech tree was short. Home computers kept evolving, and in 1985, Commodore launched the Amiga, supporting 64 colours in games and up to 4,096 in its special HAM mode. Even as it launched EGA, IBM was talking about a new, high-end board, the
Professional Graphics Controller (PGC), which could run
screens at 640 x 480 with 256 colours from a total of 4,096.
PGC was priced high and aimed at the professional
CAD market, but it helped to pave the way for the later
VGA standard, introduced with the IBM PS/2 in 1987. VGA
supported the same maximum resolution and up to 256
colours at 320 x 200. This turned out to be exactly what
was needed for a new generation of operating systems,
applications and PC games.
What extended EGA’s lifespan was the fact that VGA
remained expensive until the early 1990s, while EGA had
developed a reasonable install base. Even once VGA hit the
mainstream, many games remained playable in slightly
gruesome 16-colour EGA. Much like the 286 processor and
the Ad-Lib sound card, EGA came before the golden age of
PC gaming, but this standard paved the way for the good
stuff that came next.


VGA
Stuart Andrews looks at the tech that transformed
the PC into a gaming and graphics powerhouse,
256 colours at a time

The technology that put PC graphics firmly on the
map arrived in April 1987 as part of IBM’s PS/2 line
of PCs. IBM saw the PS/2 as the answer to its
biggest problems, putting Big Blue (as we all used to call it)
back in control of the PC architecture and one step ahead of
the clone manufacturers.
To do so, it had Intel’s latest processors, cutting-edge
connection options and the fastest floppy disk storage, not
to mention a revolutionary new high-bandwidth system
bus. But what turned out to be the PS/2’s most important

In 1987 the PC wasn’t


exactly considered a
graphics powerhouse
The VGA section of a 1988 IBM PS/55 model 5550-T. You can see the VGA chip,
feature was its new graphics hardware – the Video Graphics the INMOS RAMDAC, the two timing crystals and 256KB of video RAM. Credit:
Array, or VGA. Darklanlan, CC 4.0 custompc.co.uk/CC4
In 1987 the PC wasn’t exactly considered a graphics
powerhouse. Apple’s Mac II, launched in March the same 200 with up to 256. What’s more, those 256 colours could
year, had a graphics card that could support up to 256 be redefined at any time, from an 18-bit palette of 262,144
colours at 512 x 384. The Commodore Amiga could display colours. With VGA, you could put a photo on the screen and
full-screen animated graphics with up to 64 colours at 320 it kind of looked like a photo. Artists could create 2D images
x 240, or 4,096 colours in still images using its legendary, with sophisticated colour and shading effects. PC games
flicker-tastic HAM mode. went from looking shocking to looking seriously awesome.
The best the PC had to offer was the EGA (Enhanced VGA was literally a game changer.
Graphics Adapter) standard, covering resolutions of up to
640 x 350, but with only 16 simultaneous colours from a PC GRAPHICS GET THE WOW FACTOR
fixed palette of 64. If you wanted to create graphics or play Weirdly, VGA didn’t arrive as a new standard, or even as an
games on your PC, you needed to really like basic colours add-in graphics card. On the first PS/2 PCs, it came in the
with a strange preponderance of green and purple. Graphics form of a chip containing the display controller, along with
enthusiasts were rendering ray-traced 3D graphics on their 256KB of dedicated RAM, a pair of timing crystals and an
Amigas, albeit very slowly, but nobody sensible would even external RAMDAC.
think of doing so on a PC. This already made it a much more integrated technology
VGA didn’t put the PC at the graphics cutting edge, but than the original EGA chipsets, which contained dozens of
it did put it back in the race. The new hardware supported processors, and put it more in line with the integrated chips
resolutions of up to 640 x 480 with 16 colours, or 320 x coming from third parties. What’s more, it was the higher-

32
stuck at 320 x 200 in Mode 13h. However, programmers
ATi was one of many graphics chip and card
manufacturers to first clone VGA, then enhance it. found workarounds. A handful of games, such as the
Credit: Samuel Demeulemeester, CC 4.0 legendary horror game Dark Seed, opted to work with a
reduced 16-colour palette in order to use the full 640 x 480
resolution. Meanwhile, Michael Abrash, who would later
work with id Software on Quake, worked out an approach
that enabled programmers to use 256 colours at a slightly
higher resolution of 640 x 240, which he dubbed Mode X.
Meanwhile, Windows 2.0 moved to adopt the 640 x
480 mode with 16 colours, bringing the interface closer to
what we expect from a GUI today. However, many of the
applications and games we think of as belonging to the VGA
era stuck to Mode 13h and its 320 x 200 resolution. What’s
more, with the CPU performing most of what we’d now call
the GPU’s legwork, this was arguably for the best – until
the Intel 486 appeared in 1989, there wasn’t any really CPU
end option of two new graphics standards. The cheaper powerful enough to handle gaming at higher resolutions.
PS/2 models were stuck with the Multi Colour Graphics
Adapter (MCGA) which had the same 256-colour mode THE IMPACT OF VGA
but lacked VGA’s higher resolutions. Luckily, those colours alone had a huge impact. The ZSoft
Like IBM’s new MCA bus architecture, MCGA didn’t last Corporation’s PC Paintbrush and Electronic Arts’ Deluxe
long beyond the PS/2, but VGA developed a life of its own. Paint II revolutionised professional graphics and computer
Beyond hardware-level support for smooth scrolling, and art on the PC, thanks to 256-colour support. VGA also
a barrel shifter designed to shift incoming data from the made CorelDRAW, launched in January 1989, a realistic
CPU to the display at seven bits at a time, it didn’t actually alternative to the digital design packages appearing on
do much in the way of graphics acceleration. Apple’s computers.
However, it did set a new baseline standard for PC Meanwhile, for PC games, VGA was nothing short of
graphics, and for hardware and software support. Crucially, transformative. Sure, the 64,000 pixels on your monitor
through its RAMDAC and 15-pin D-Sub connector, it looked a little chunky; however, with 256 colours, the
established how the PC could convert digital instructions artists working at leading developers, such as LucasArts,
into a 256-colour analogue video signal, setting the stage Sierra Online, Microprose, Electronic Arts and Origin
for the 16-bit and 24-bit colour standards to come. Systems, were able to produce sprites that looked more
Instead of sending six colour signals from the graphics like recognisably human (or inhuman) characters, and
card to the monitor, like the older EGA chipsets, the VGA background scenery that could bring their game worlds to
chipset and its RAMDAC sent only three signals – red, life. Plus, while the PC couldn’t pull off the same smooth
green and blue, with a potential 64 different levels for each. scrolling, sprite-scaling tricks as the Commodore Amiga
For VGA, this resulted in an 18-bit palette of up to 262,144 or 8-bit consoles, its best games were developing a visual
Eye of the Beholder
– VGA’s larger colours, 256 of which could appear simultaneously in Mode richness of their own. As the PC moved into the 386 era, it
colour palette 13h. Once adopted, this same core technology gave scope was beginning to be taken seriously as a gaming machine.
(right) gave artists for 16-bit and 24-bit colour in later graphics chips, with up to Taken on its own, the first VGA chipset wouldn’t have
the chance to use
65,536 colours or 16.7 million colours on the screen at once. made such an impact. After all, you only got it to use it if you
more realistic
shading compared Resolution wasn’t the base level VGA spec’s strength. bought a pricey IBM PS/2 machine. Instead, it really only
with EGA (left) In fact, PC journalists of the time pondered why it was gained momentum once it began to appear in add-in cards.
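The palette arithmetic above is easy to verify. Here's a rough sketch in Python, not period code – on real hardware you'd write 6-bit values to the DAC's index and data ports – showing the 18-bit palette maths, Mode 13h's flat pixel addressing and the 6-bit-to-8-bit expansion modern emulators use:

```python
# VGA's DAC stores 6 bits per red/green/blue channel.
LEVELS = 64            # 2**6 intensity levels per channel
PALETTE = LEVELS ** 3  # full palette: 262,144 possible colours
ONSCREEN = 256         # Mode 13h: one byte per pixel indexes a 256-entry table

# In Mode 13h the frame buffer is a flat 64,000-byte array (320 x 200),
# so plotting a pixel is simple address arithmetic:
def mode13h_offset(x, y):
    return y * 320 + x

# Expanding a 6-bit DAC value to modern 8-bit colour by replicating the
# top two bits, so 63 maps to 255 rather than 252:
def dac6_to_rgb8(v):
    return (v << 2) | (v >> 4)

print(PALETTE, mode13h_offset(319, 199), dac6_to_rgb8(63))
```

The 64,000-byte frame buffer is also why the article talks about "the 64,000 pixels on your monitor" – in Mode 13h, one byte is one pixel.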

33
GRAPHICS

The Secret of Monkey Island – moving to VGA (right) enabled PC developers to create more human-like characters compared with EGA (left)

IBM was first out of the gate with its PS/2 Display Adapter, a card that gave any reasonably modern IBM-compatible PC with ISA slots a VGA chipset for the princely sum of $599 US (about £420 inc VAT then, and £1,200 inc VAT in today's money).
Yet by this point, the older EGA standard had spawned a growing industry of third-party manufacturers, adept at mimicking or reverse-engineering IBM's technology and spawning their own versions. What's more, these guys didn't stop at simply replicating IBM's latest standards; they wanted to add a little extra sauce to their cards by actively enhancing them.
As a result, October 1987 saw the launch of the first VGA-compatible third-party graphics card, the STB VGA Extra. It did everything VGA did, albeit with a few foibles here and there, and with some optimisations that made it slightly faster.

Dark Seed sacrificed colour depth for resolution, in order to do justice to H.R. Giger's artwork

From mid-1988 to 1989, the likes of Tseng Labs, Cirrus Logic, Chips and Technologies and ATi entered the fray, and not only were they driving prices down to $339 US, but they were also adding new capabilities. These enhanced VGA cards added features to accelerate video, increased the RAM to 512KB, and tinkered with the BIOS to cover more advanced resolutions, such as 800 x 600 in 16 colours or 640 x 480 with 256 colours.
This in turn put pressure on the system bus. The original VGA controllers were so undemanding that they couldn't exhaust the miserable bandwidth of the 8-bit ISA bus, but as these new chipsets emerged, they required more bandwidth and a spot on the wider 16-bit ISA bus.
As time went on and Intel's CPUs grew faster, demands would grow accordingly, resulting in the development of the Extended ISA (EISA) bus and the VESA Local Bus. However, this complicated the situation further, with the fastest enhanced VGA cards, based on Tseng Labs or Cirrus Logic tech, performing best in 16-bit versions running on the 16-bit ISA bus, although this wasn't always the case with every chipset.
By 1989, NEC would lead the early graphics chipset manufacturers in the creation of the Video Electronics Standards Association and the Super VGA BIOS, opening up support for higher resolutions and colour depths across the PC industry. Windows acceleration became the new battleground, and video acceleration became the next cutting-edge technology.
Yet all these new cards and advanced feature sets still had the VGA standard at their core. VGA became the base requirement for new PCs running later versions of Windows or IBM's OS/2. In many respects, IBM had built the foundation of PC graphics for the next ten to 15 years. In fact, you could argue that VGA is still the foundation.
If so, it probably wasn't a whole lot of comfort to IBM. While VGA was the last graphics standard IBM managed to establish, it wasn't for want of trying. Even as it launched VGA, it was preparing its 8514 graphics adapter, with fixed functions to accelerate common 2D drawing processes, such as drawing lines or filling shapes with colour. In 1990, it hoped to supersede VGA with its new 1,024 x 768, 256-colour standard, XGA.
Both these new standards floundered because they were designed to run on IBM's MCA bus, while IBM's clone-making rivals focused on getting the most out of the existing 16-bit ISA bus, before working on the proposed EISA replacement. The result? Super VGA became the new de facto standard, while IBM lost its domination of the PC industry. Bad news for Big Blue, but good news for those of us who enjoyed the more cost-conscious, game-focused machines in the years that followed.
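The arithmetic behind those 512KB RAM upgrades is simple to check. A quick sketch (packed-pixel frame buffers assumed; real cards also reserved memory for fonts and registers):

```python
# Bytes of video RAM needed for one packed-pixel frame buffer.
def vram_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

mode_13h = vram_bytes(320, 200, 8)  # 64,000 bytes
vga_16   = vram_bytes(640, 480, 4)  # 153,600 bytes - fits in 256KB
svga_256 = vram_bytes(640, 480, 8)  # 307,200 bytes - overflows 256KB
svga_16  = vram_bytes(800, 600, 4)  # 240,000 bytes

# 640 x 480 in 256 colours is why enhanced VGA cards doubled up to 512KB:
print(svga_256 > 256 * 1024, svga_256 <= 512 * 1024)
```

It also shows why these modes strained the 8-bit ISA bus: pushing a third of a megabyte per frame through a bus delivering only a few megabytes per second left little headroom.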

34
THE VERY BEST OF EARLY VGA
Coming at you in 256 glorious colours at 320 x 200

WING COMMANDER
By 1990, VGA was well established, the 386 had become the mainstream PC CPU and the 486 had just appeared. All three technologies found their perfect showcase in Wing Commander. Chris Roberts' dazzling space combat game offered stunning scaling and rotating sprite spaceships, Star Wars-inspired cinematic cutscenes and thrilling mission design. It paved the way for LucasArts' X-Wing games and the Elite revival, while showing the way forward for a new breed of Hollywood-influenced, story-driven games.

WOLFENSTEIN 3D
This pioneering Nazi-blasting FPS from id Software was originally designed to run with EGA graphics, but that became unthinkable once John Carmack and his crew were unleashed on VGA. The texture-mapped walls made the most of simple bitmapped textures, but the sprites for the Nazis, zombies and Hitler-loving hounds looked dazzling in 256 colours, as id pushed the 386 CPUs of the era to their limits. Doom would push 3D realism further still, but even that relied on the limited capabilities of the humble VGA card.

EYE OF THE BEHOLDER
Before The Elder Scrolls, Lands of Lore and Ultima Underworld redefined the RPG genre, Eye of the Beholder set a new benchmark for the expected standard of graphics. With its graphics built in Deluxe Paint II and powered by 256-colour VGA, it updated the 'Dungeon Master' tile-based dungeon crawler genre, adding customisable characters, an engaging story, and the kind of D&D lore we've come to know and love. People tend to remember the excellent Amiga port of this game, but the PC version was the original, and in many ways the best.

THE SECRET OF MONKEY ISLAND
It's a toss-up which was more influential – the first of Ron Gilbert's beloved pirate series or the awesome Indiana Jones and the Fate of Atlantis. Either way, these two titles used VGA's capabilities to full effect, with impressive sprite characters and glorious backdrops that made the most of the larger colour palette. You no longer had to use your imagination to visualise locations, because the artists had done the hard work for you. Monkey Island 2: LeChuck's Revenge went even further, with graphics that embraced a stunning, hand-painted look.

DELUXE PAINT
By 1987, PC users were already getting sick of Commodore Amiga users rubbing their faces in the dirt with Deluxe Paint. With advanced drawing tools, fills and scaling capabilities, these smug gits could create images as cool-looking as the legendary King Tut mask or Birth of Venus. When Deluxe Paint II was ported to the PC in 1988, PC users were invited to the party, helping to establish the PC as the graphics powerhouse it would become with the arrival of Adobe Photoshop, Paint Shop Pro and CorelDRAW.

DARK SEED
Combining a point-and-click adventure with psychological horror and the art of Alien maestro, H.R. Giger, Dark Seed used VGA in an unusual way, dropping down to 16 colours in order to hit the maximum 640 x 480 resolution (apparently, Giger made this a condition of the team using his art). In any case, it worked, mixing sequences set in an American town with scenes straight from one of Giger's dark sci-fi body horror netherworlds. Even now, it's one weird-looking game.

35
GRAPHICS

3DFX
VOODOO 3D
Reflective surfaces, smooth frame rates and the
pure awesomeness of GLQuake. Stuart Andrews
recalls the truly transformative effect of 3Dfx’s
Voodoo chipset on PC gaming

It's a classic case of being the right company with the right tech at the right time. 3Dfx launched its revolutionary Voodoo Graphics chipset just as fully polygonal 3D graphics hit the mainstream and PC gamers wanted an easy and accessible way to get them.
In late 1996, Quake and Tomb Raider had just been released, the Nintendo 64 was out in Japan and North America, and the Sony PlayStation and Sega Saturn were still in their first year. Reliant purely on CPU horsepower, and with no dedicated 3D hardware to back it up, the PC was beginning to lose its place as the king of gaming platforms.
Sure, it had a bunch of 2D/3D accelerator cards, but they were too damn slow to make any difference. With the Voodoo Graphics chipset, 3Dfx played a bigger role than any other graphics hardware manufacturer in turning around that situation. In doing so, it made 3D acceleration an absolute, cast-iron must-have feature.

If 3Dfx needed a killer app, GLQuake delivered. You could play id's cutting-edge 3D title at 640 x 480 in 16-bit colour at a smooth 30fps

THE BIRTH OF VOODOO
3Dfx was founded in San Jose, California in 1994 by a trio of ex-Silicon Graphics (SGI) employees: Ross Smith, Scott Sellers and Gary Tarolli. At the time, SGI was by far the biggest name in 3D graphics, with its enormously expensive workstations used to create the pioneering CGI effects in Terminator 2 and Jurassic Park.
What's more, SGI was already involved in 3D gaming hardware, developing the core components for what would eventually become the Nintendo 64. At this time, however, some of SGI's engineers were thinking that there were serious opportunities being overlooked in developing 3D hardware for PCs.
One group would eventually leave to found a company called ArtX, which would later be bought by ATI. Meanwhile, Smith, Sellers and Tarolli founded a new startup, Pellucid, in 1992, with the intention of bringing affordable 3D hardware to the PC.
In 1993, Pellucid was bought by Media Vision, a company that had grown rich from selling multimedia kits for PCs during the CD-ROM revolution. Pellucid had proposed the design and manufacture of a PC 3D gaming chip, and Media Vision wanted some of that action.
Unfortunately, Media Vision had its own (mostly legal) issues, and went out of business. However, just when the situation looked bleak, Scott Sellers met Gordon Campbell, founder of the pioneering graphics chip manufacturer Chips & Technologies. Campbell asked the trio what they wanted to do, and helped them to find the venture capital to do it.
With Smith working as vice president of sales and marketing, Sellers and Tarolli used all the know-how that they'd built up at SGI and Pellucid to design a cost-efficient 3D architecture built specifically to handle the polygonal rendering pipeline used in 3D games.

36
Tomb Raider was a 3Dfx showcase, smoothing out the blocky textures, improving frame rates and adding transparent water to the mix

With Sellers working on the hardware and Tarolli on the core algorithms, the 3Dfx team came up with the idea of an add-in card that only accelerated 3D, and left 2D graphics and Windows acceleration to a separate graphics card. At first, all they had working was a software simulation built in C and running on a Pentium 90 processor, but this evolved into a card based on two heavily optimised processors.
The first, the Frame Buffer Interface, took polygon scene data from the CPU and applied Z-buffering and Gouraud shading, tracking which polygons were visible, ensuring that only those were drawn and filled, then applying shading to provide an impression of simulated light and colour.
Each frame of the image would then be converted into scan lines from top to bottom, then sent on to the second chip. The Texture Mapping Unit, or T-Rex as it was known, applied perspective-correct textures, complete with mipmapping (the process of using smaller, less-detailed textures as an object gets further away) and bilinear or trilinear filtering (smoothing out blocky textures when displayed at their largest size, close to the viewpoint).
What's more, the T-Rex supported alpha blending, for convincing transparency effects. No other consumer-grade graphics hardware was able to handle this at the time. Each chip worked with its own frame buffer or texture memory – a bank of 2MB of high-bandwidth (for the time) EDO RAM – and the resulting scanlines were fed out to a DAC, which output to a good, old-fashioned analogue VGA output.

THE FIRST CARDS
The fact that the Voodoo Graphics chipset was 3D-only helped to keep down the price, but it did make using the card a little strange. While the card itself could talk to the CPU and system RAM through the PCI bus, it worked in tandem with an existing 2D graphics card for 2D DOS and Windows acceleration, only taking over when there were 3D graphics to be rendered.
This happened through a D-Sub pass-through cable running from the output of the 2D card to an input on the Voodoo Graphics card. While some 3Dfx cards handled the switching electronically, others actually had a mechanical switch. On these, you could literally hear when the Voodoo Graphics card kicked into action.
3Dfx never manufactured its own 1st-generation cards. Instead, the designs and chips were sold and licensed to third-party manufacturers, with Diamond and Orchid first out of the gate with the Monster 3D and Righteous 3D in late 1996. These first cards sold for approximately £300, which was a lot, but not exorbitant for a PC graphics card at the time.
What's more, these beauties could perform amazing feats with even fairly modest PC configurations. At a time when even Intel's Pentium 133 processors were struggling to deliver consistently good frame rates with the standard software renderer in some demanding games, you could slot a Monster 3D into your Pentium 90 system and see great-looking, silky-smooth visuals.
Yet 3Dfx's work went beyond designing the architecture to creating an API that enabled game developers to support the card. At the time, there were no 3D engines that supported 3D hardware and no standard APIs for developing 3D games. OpenGL was focused mainly on CAD and workstation graphics, while Intel was unwilling to release its new 3DR rendering library for use on hardware that would run DOS games. Microsoft had yet to develop what became Direct3D.
As a result, 3Dfx developed its own API, Glide. This was based on OpenGL, so it wasn't unfamiliar to experienced 3D developers, but it pared back the calls and instructions to focus on those used in real-time 3D games.

By the time Unreal hit the market, 3Dfx was established as the best tech to run it. Check out those shiny surfaces and lavish textures

To show off Glide's capabilities, 3Dfx didn't just have its own internal demos, but a range of Atari and Midway arcade games, including the racer San Francisco Rush and the beat-'em-up Mace: The Dark Age. These ably demonstrated what the new hardware could do. All that was needed were some suitably awesome PC games.
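The bilinear filtering the T-Rex performed in hardware can be sketched in a few lines. This is a simplified greyscale software version for illustration, not 3Dfx's actual implementation: blend the four texels nearest the sample point, weighted by distance:

```python
# Bilinear texture sampling: blend the four texels surrounding the
# sample point (u, v), weighted by how close the point is to each.
def bilinear(tex, u, v):
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(tex[0]) - 1)  # clamp at the texture edge
    y1 = min(y0 + 1, len(tex) - 1)
    fx, fy = u - x0, v - y0            # fractional position within the cell
    top    = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bottom = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

# A tiny 2x2 greyscale 'texture': sampling between texels gives a smooth
# ramp instead of the blocky nearest-neighbour look of software renderers.
tex = [[0.0, 100.0],
       [100.0, 200.0]]
print(bilinear(tex, 0.5, 0.5))
```

Trilinear filtering adds one more blend of the same kind, interpolating between two adjacent mipmap levels so textures don't visibly 'pop' as objects recede.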

37
GRAPHICS

The Orchid Righteous 3D was one of the first Voodoo boards to hit the market, along with Diamond's mighty Monster 3D

KILLER APPS
This was 3Dfx's one problem at launch. The technology itself was impressive, and the cards came with some decent demos, including a slick 3D combat demo, Valley of Ra, which featured amazing reflective surfaces and Gouraud-shaded characters, and a stunning dolphin sim, Grand Bleu. Orchid and Diamond took them around to show to eager PC journalists, and jaws consistently hit the floor, but there still wasn't a killer app.
At this point, the early 2D/3D graphics cards all tended to support the same games, and we'd got used to seeing the likes of Descent 2, Actua Soccer, Terminal Velocity and MechWarrior 2 with only mildly improved, filtered 3D textures, running at frame rates that barely climbed above what you could get with a software renderer. The Voodoo ran these games faster at higher resolutions, but nobody was going to pay £300 for that.
Luckily, 3Dfx soon had two absolute bangers. The first was Tomb Raider. Lara Croft's debut was already one of the most stunning-looking games around on the Sega Saturn, Sony PlayStation and PC, but the pixelated, low-resolution graphics meant that you weren't seeing it at its best.
However, just a few months after launch, the publisher, Eidos, released a patch that allowed you to run Tomb Raider under Glide. The effect was amazing, not only smoothing out the blocky textures and adding transparent water, but also allowing you to play the game at a 640 x 480 resolution at close to 30fps. You saw it and you wanted Voodoo in your life.
An even more impressive transformation awaited us with id Software's Quake. I first played Quake on a Pentium 133 laptop with 16MB of RAM, and the game was only just playable at 360 x 240. And when I say playable, I mean the right side of 20fps.
Then 3Dfx released MiniGL, a cut-down version of OpenGL designed to handle just the functions used in Quake. id responded with a port of the game, GLQuake, which could take advantage of the MiniGL wrapper. The port had its problems, including gloomy brightness levels, but the bilinear-filtered textures went from looking slightly rough to looking awesome, and you could now run the game at 640 x 480 in glorious 16-bit colour and still hit 30fps.
Serious PC gamers saw Quake running unaccelerated and then accelerated, then voted with their wallets. Sure, the new Pentium MMX CPUs released in 1997 could run the game at a decent lick, but did it look as good as Voodoo? Not even close.
GLQuake sold 3Dfx cards, and a growing user base boosted game support. True, 3Dfx had rivals. VideoLogic's PowerVR tech was affordable and efficient, but it also used an unconventional tile-based rendering pipeline and needed a faster CPU to get the best out of it. Rendition's Verite chipsets looked promising, but were too pricey and struggled with their 2D performance.
3Dfx grew to become a kind of de facto standard just as the next wave of 3D games started taking off. From Need for Speed II SE to Myth: The Fallen Lords, Shogo: Mobile Armor Division and Unreal, Voodoo Graphics made the best-looking games of the era look even better and run at what seemed incredible speeds. The PC was back on top as the most technologically advanced gaming platform of the era.
3Dfx continued through a glorious period. Its 1997 Voodoo Rush 2D/3D graphics chipset was admittedly a dud, suffering from a lack of memory bandwidth and sync issues with the on-board 2D graphics chip. However, 1998's Voodoo 2 was a worthy successor, arriving just a few months after another id showcase, Quake II.
This purple period wasn't to last, as Glide fell out of favour and ATi and Nvidia delivered high-performance all-in-one graphics chips, but we owe 3Dfx a huge amount for bringing 3D power to the PC when it needed it most – and helping to show the world the full potential of hardware-accelerated 3D graphics.

With its high-res models and reflective surfaces, 3Dfx's lead tech demo was a jaw-dropper. Nobody had seen anything like this outside of Sega's Virtua Fighter arcade games

38
POWER VR
Ben Hardwidge catches up with the PowerVR folks
from Imagination Technologies (formerly VideoLogic),
to discuss early PC 3D accelerators

Back when PCs were still in horrible beige boxes, John Major was nasally shouting over the despatch box and Nvidia was just a glint in Jensen Huang's eye, VideoLogic (now Imagination Technologies, the firm also behind PURE radios) started work on the PowerVR project. It resulted in some of the first PC 3D accelerators and, since then, PowerVR has become a mobile GPU system of choice, found in the iPhone 7 and numerous Android phones.
I headed up to Imagination Technologies' HQ in Kings Langley to chat with some of the folks who worked on the original PC PowerVR cards. I'm taken to a meeting room, where a spread of PC relics from the early 1990s to the 2000s is laid out. They include never-released products, including the Kyro 3 and various pre-release boards, as well as some classics.

VideoLogic's Apocalypse 3DX was the first mainstream PowerVR PC 3D accelerator

WHERE DID IT START?
Simon Fenny, PowerVR Research Fellow, picks up the first one – an enormous PCB with a 16-bit ISA interface. 'The whole PowerVR project started in July 1992,' he says, 'and in about early 1993, this first card came out – it would have been in a 486 PC, so not very good floating-point performance. We had a Texas Instruments DSP on there to do all the transform and lighting. This board would later do tile-based deferred rendering, with real-time shadows and proper 3D volume shadows, but it didn't have texturing, because it was hard enough to fit all that onto one chip.'
Tile-based deferred rendering is the key to PowerVR. 'Tile-based rendering and deferred rendering are two separate things,' explains Kristof Beets, Senior Director, Product Management & Technology Marketing, PowerVR. 'Most of our competitors today have some form of tile-based rendering. Fundamentally, that means you bucket your geometry, so instead of rendering triangle by triangle, you first sort your triangles and then render each tile.
'The key benefit is local processing. The further your data goes, the more power it uses. If you keep it very tight, it's much more efficient. Memory loves big transactions, so blasting a tile and loading the texture data for a tile is really effective.
'The reason why we're still so good at tiling is because of all the clever algorithms and data structures that go behind it, which Simon and those guys came up with in the 1990s – it's how you sort triangles effectively into those buckets.' Basically, the work done on the early PC 3D accelerators is still useful in smartphones today.
The next part is deferred rendering, a benefit of which is that you can identify objects that are hidden behind other objects before shading them, so you only shade the objects you can see. 'It's like painting by numbers,' says Fenny. 'Imagine you're drawing your triangles, and instead of filling in colours you say, "This is triangle 1, that's triangle 5 and that's triangle 6." You then say, "Okay, send those off and fill in all the 1s. What's the next one? 3 is the next one – do those", within each tile.
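Beets' description of bucketing geometry boils down to binning each triangle into the screen tiles it might touch. Here's a minimal Python sketch using bounding boxes – the simple, conservative method, rather than the exact triangle-tile intersection PowerVR later developed – with tile size and data layout assumed purely for illustration:

```python
TILE = 32  # tile size in pixels (an assumption for this sketch)

def bin_triangles(triangles, width, height):
    """Sort triangles into per-tile buckets by their 2D bounding box."""
    cols, rows = width // TILE, height // TILE
    bins = {(tx, ty): [] for ty in range(rows) for tx in range(cols)}
    for tri_id, verts in enumerate(triangles):
        xs = [x for x, _ in verts]
        ys = [y for _, y in verts]
        # Conservative: add the triangle to every tile its box overlaps.
        for ty in range(max(0, min(ys) // TILE), min(rows - 1, max(ys) // TILE) + 1):
            for tx in range(max(0, min(xs) // TILE), min(cols - 1, max(xs) // TILE) + 1):
                bins[(tx, ty)].append(tri_id)
    return bins

# One small triangle in the top-left corner of a 64 x 64 screen:
bins = bin_triangles([[(0, 0), (10, 0), (0, 10)]], 64, 64)
print(bins[(0, 0)], bins[(1, 1)])   # only tile (0,0) gets it
```

Rendering then touches one tile's bucket at a time, which is exactly the 'local processing' Beets describes: all the pixels and texels for a tile stay close together in memory.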

39
GRAPHICS

'If something else is behind you, don't bother even shading that. If you just have a normal tile renderer, it might be local, but you still end up drawing a car behind something else, and then a wall over the top. Why would you bother spending all that effort? Some other people will sort things so that it works properly, but it's expensive to sort things.'

An early ISA Rapier 24 card – the big gold Texas Instruments chip handles transform and lighting

THE FIRST PC CARDS
VideoLogic was initially targeting the arcade market with PowerVR, but as PC tech progressed, the team soon turned its attention to the PC. 'Thankfully, the Pentium had come along with the PCI bus,' says Fenny, 'so we were able to do the transform and lighting on the Pentium. We'd send the models over the PCI bus into the chip, which would then render it. These cards would basically mix the signal coming in from the VGA card.'
The first mainstream product based on this tech was the VideoLogic Apocalypse 3DX. This mixing of the signal was a key part of the PowerVR formula at the time. The first 3dfx Voodoo and PowerVR cards were dedicated 3D accelerators, meaning you needed a second '2D' graphics card to output a display to your monitor.
Voodoo cards needed a VGA analogue loopback cable between your 3D card and 2D card. PowerVR cards did it much more cleanly (at least from a hardware perspective), mixing the signal over the PCI bus. 'We realised that, with the PCI bus, you could not only write things in, but you could burst things right out,' says Fenny. 'Because it was tile-based rendering, if you finished your tile completely, you could do that and be really efficient on the bus.'
3dfx wasn't using tile-based rendering, and its Voodoo cards used a Z-buffer to solve the visibility problem. It's a situation that not only meant Voodoo cards had to use loopback cables, but they also had to allocate some of their frame buffer memory to the Z-buffer. That's why 1st-gen Voodoo cards are limited to 16-bit colour at 640 x 480, while PowerVR cards could go higher.
'If you turn off Z-buffering, which means a lot of messing around in software, 3dfx could get to 800 x 600 in 16-bit,' says Fenny, 'but we were streaming at 24 bits per pixel.'
One area where 3dfx had the upper hand was system requirements. You could get decent performance from a Voodoo card with a Pentium 90, but a PowerVR card needed a beefier CPU to get the most out of it.

A VideoLogic Apocalypse 3DX – still in its shrink-wrap at Imagination Technologies

THE DREAMCAST
PowerVR was on a roll, and it had caught the eye of Sega while it was developing the Dreamcast. 'I remember being in a couple of meetings, saying it does this and this, and they just looked at us thinking, "That's not possible,"' says Fenny. 'There was a great deal of excitement. We were adding texture compression. We had hardware ordering-dependent translucency, which is still difficult to do now.'
What's hardware ordering-dependent translucency? 'If you ever have to write a game where you have lots of layers of translucent objects, which are in random order on the screen, you have to make sure you do them in back-to-front order,' says Fenny.
Beets informs Fenny that these days developers write a quick-sort in a shader program to deal with it. 'No! Yuck!' he responds.
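The back-to-front rule Fenny describes is the classic painter's approach to translucency. A minimal sketch of sorting and alpha-blending layers (illustrative only, and nothing like the Dreamcast hardware's implementation), using greyscale values for simplicity:

```python
# Blend translucent layers in back-to-front order (the 'over' operator).
# Each layer is (depth, colour, alpha); greyscale colour for simplicity.
def composite(layers, background=0.0):
    result = background
    # Sort by depth, furthest from the camera first.
    for depth, colour, alpha in sorted(layers, key=lambda l: l[0], reverse=True):
        result = colour * alpha + result * (1.0 - alpha)
    return result

# The same layers given in any order produce the same image, because
# the sort restores back-to-front order before blending:
a = composite([(1.0, 0.8, 0.5), (5.0, 0.2, 1.0)])
b = composite([(5.0, 0.2, 1.0), (1.0, 0.8, 0.5)])
print(a, a == b)
```

Skip the sort and blend in arbitrary order, and the result changes – which is exactly the wrong-looking translucency Fenny saw in early Dreamcast-to-PlayStation ports, where the hardware was no longer doing the ordering for the game.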

40
'It was funny watching some people trying to port the Dreamcast games onto, say, the PlayStation. You'd see the early examples, and all the translucency would be wrong, because the games were designed with the hardware doing it all for you. It did help that we had control over the API, because DirectX was kind of limited to Z-buffer rendering.'
The next PowerVR PC product was the Neon 250, based on some of the tech in the Dreamcast, and an all-in-one 2D/3D AGP card. 'The original version of the Neon product had no fan on it, and we found it really hard to sell in 1999,' muses David Harold, VP Marketing Communications. 'People basically thought, well, it has no fan, so it must be underpowered compared to Nvidia. So the next version of the board had a fan on it, which was the cheapest fan we could find in China, because it's essentially cosmetic.'

VideoLogic's Vivid! card – based on the 1st-gen Kyro chip

KYRO
The final push for PowerVR on the desktop PC was the Kyro series. Fenny laments that the Kyro series saw hardware ordering-dependent translucency removed from hardware. The industry was moving towards standardised APIs, rather than proprietary ones, and that meant compromising on some hardware features. 'We'd say, "We're doing translucency sorting" to DirectX developers, and some would say, "What? No, that's not possible." Others said, "Yeah, it would be great to use it, but there are cards that can't possibly use it, so we're not going to develop for it."'
Kyro also saw the introduction of PowerVR's 'perfect tiling' technique. 'We figure out exactly which tiles an object is in,' explains Beets. 'What our competitors do is bounding boxes, but a box covers a lot more area than a triangle.'

A Guillemot Hercules 3D Prophet 4000XT card, based on Kyro 2

Next came the Kyro 2, with a die shrink and an increase in clock speed. I was working for PC Pro magazine at the time, and reviewed the Kyro 2. It wasn't as quick as Nvidia's top-end GeForce2 chips, but it happily beat the GeForce2 MX's performance for a similar price.
Nvidia wasn't happy, and briefed industry partners against the Kyro 2. A leaked PowerPoint presentation showed Nvidia lambasting the Kyro 2's driver support, rendering quality, Z-buffer issues and lack of hardware transform and lighting. The presentation's conclusion was damning: 'Buying Kyro 2 is a risk – and when cards and PCs get returned, it damages your finances and your reputation.' Understandably, there's not a great deal of love for Nvidia among the PowerVR folks.

LEAVING THE DESKTOP
Fenny shows me the card that would have been the Kyro 3, but it never made it to market – the reasons are kept off the record. I ask why we've never seen a PowerVR desktop product since. 'We were very nervous,' says Harold. 'We looked at the market, and thought, "There are five console makers, and Panasonic is going to be out of this business in five minutes, then there's going to be four, and some day there will be three. And in every generation you have to win a slot."

The back of a Kyro 3 card, which never made it to market

'After Dreamcast, we talked very seriously about doing a console with somebody else, and realised that every single engineering resource we had would have to go on that project. Then, if we lost that slot to whoever in the next generation, we would have no customer.
'It's the same with the PC market. When we started, there were 50+ companies making devices for PC boards, and that figure was shrinking – not yearly; it was practically shrinking weekly. We looked at the market and just thought, If we keep
going after PC and console, we’re never going to have
enough customers to make our business resilient.
‘At the time, we said that one day we’d come back to those
markets, but ultimately, you’re driven by what your customer
wants to make.’
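The ‘perfect tiling’ idea Beets described earlier – binning a triangle into exactly the tiles it touches, rather than every tile under its bounding box – can be illustrated with a toy rasteriser. This is a hypothetical Python sketch, not PowerVR’s algorithm: the 32-pixel tile size, screen size and triangle are all invented for the demonstration.

```python
TILE = 32  # tile width/height in pixels (illustrative)

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: which side of edge A->B the point P lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def tiles_touched(v0, v1, v2, width, height):
    """Return (exact_tile_count, bbox_tile_count) for one triangle."""
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    min_x, max_x = max(min(xs), 0), min(max(xs), width - 1)
    min_y, max_y = max(min(ys), 0), min(max(ys), height - 1)
    covered = set()
    for y in range(min_y, max_y + 1):
        for x in range(min_x, max_x + 1):
            w0 = edge(*v1, *v2, x, y)
            w1 = edge(*v2, *v0, x, y)
            w2 = edge(*v0, *v1, x, y)
            # Pixel is inside if it's on the same side of all three edges.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x // TILE, y // TILE))
    bbox_tiles = ((max_x // TILE - min_x // TILE + 1) *
                  (max_y // TILE - min_y // TILE + 1))
    return len(covered), bbox_tiles

# A long, thin diagonal sliver: the worst case for bounding-box binning.
exact, bbox = tiles_touched((0, 0), (255, 255), (255, 250), 256, 256)
print(exact, bbox)  # the sliver touches far fewer tiles than its box covers
```

For a thin diagonal triangle the bounding box spans every tile in its rectangle, while the triangle itself only touches the tiles along the diagonal – which is why exact binning saves a tile-based renderer so much redundant work.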

41
GRAPHICS

NVIDIA
GEFORCE
Lights, transform, action! Ben Hardwidge
recalls the very first ‘GPU’

It’s testament to Nvidia’s marketing team that one of its buzzwords has now slipped into common parlance. Not only did
Nvidia’s 1st-gen GeForce 256 introduce us to its now
famous ‘GeForce’ gaming graphics brand, but it also
brought the term ‘GPU’ to the PC with it. An initialism
that we now use as shorthand for any PC graphics
chip, or even a whole graphics card, started life as an
Nvidia marketing slogan.
To give you an idea of how long ago this was, I was
introduced to the term ‘GPU’ by a paper press release
the same week I started my first tech journalism job
in September 1999. We didn’t get press releases via
email then – they were physically posted to us, and
the editorial assistant sorted them all into a box for the
team to peruse.
A VisionTek GeForce 256 card with SDR memory

‘In an event that ushers in a new era of interactivity for the PC, Nvidia unveiled today the GeForce 256, the world’s first graphics processing unit (GPU)’, it said. At the time, I thought it seemed pompous – how could this relative newcomer to the 3D graphics scene have the nerve to think it could change the language of PC graphics? But I now see that it was a piece of marketing genius. Not only did ‘GPU’ stick for decades to come, but it also meant Nvidia was the only company with a PC ‘GPU’ at this point.

TRANSFORM AND LIGHTING
Nvidia’s first ‘GPU’ did indeed handle 3D graphics quite differently from its peers at the time, so it’s time for a little history lesson. If we want to understand what made the first GeForce GPU so special, we first have to take a look at 3D pipelines of the time.
It was October 1999, and the first 3D accelerators had only been doing the rounds for a few years. Up until the mid-1990s, 3D games such as Doom and later Quake were rendered entirely in software by the CPU, with the latter being one of the first games to require a floating point unit.
If you want to display a 3D model, it has to go through the graphics pipeline, which at this stage was all handled by the CPU. The first stage is the geometry, where the CPU works out the positioning (where polygons and vertices sit in relation to the camera) and lighting (how polygons will look under the lighting in the scene). The former involves mathematical transformations, and is usually referred to as ‘transform’, with the two processes together called ‘transform and lighting’, or T&L for short.
Once the geometry is nailed, the next step is to fill in the areas between the vertices, which is called rasterisation, along with pixel processing operations, such as depth compare and texture look-up. This is, of course, a massive oversimplification of the 3D graphics pipeline of the time, but it gives you an idea. We started with the CPU handling the whole graphics pipeline from start to finish, which resulted in low-resolution, chunky graphics and poor performance.
We then had the first 3D accelerators, such as the 3dfx Voodoo and VideoLogic PowerVR cards, which handled the last stages of the pipeline (rasterisation and pixel processing), and massively improved the way games looked and performed, while also ushering in the wide use of triangles rather than polygons for 3D rendering. With the

42
CPU no longer having to handle all these operations, and dedicated hardware doing the job, you could render 3D games at higher resolutions with more detail and at faster frame rates. At this point, the CPU was still doing a fair amount of work though. If you wanted to play 3D games, you still needed a decent CPU.
Nvidia aimed to change this situation with its first ‘GPU’, which could process the entire 3D graphics pipeline, including the initial geometry stages for transform and lighting, in hardware. The CPU’s only job then was to work out what should be rendered and where it goes.

Nvidia’s GeForce 256 was the first consumer graphics chip to handle the whole 3D graphics pipeline, including the transform and lighting stages

BATTLE OF THE PLANETS
As with any new graphics tech, of course, the industry didn’t instantly move towards Nvidia’s hardware T&L model. At this point, the only real way to see it in action in DirectX 7 was to run the helicopter test at the start of 3DMark2000, although some games using OpenGL 1.2 also supported it. The latter included Quake III Arena, but the undemanding nature of this game meant it practically ran just as well with software T&L. DirectX 7 also didn’t require hardware-accelerated T&L to run – you could still run DirectX 7 games using software T&L calculated by the CPU; it just wasn’t as quick.
The GeForce was still a formidable graphics chip whether you were using hardware T&L or not though. Unlike the 3dfx Voodoo 3, it could render in 32-bit colour as well as 16-bit (as could Nvidia’s Riva TNT2 before it), it had 32MB of memory compared to the more usual 16MB, and it also outperformed its competitors in most game tests by a substantial margin.
ATi’s response at the time was a brute-force approach, putting two of its Rage 128 Pro chips onto one PCB to make the Rage Fury Maxx, using alternate frame rendering (each graphics chip handled alternate frames in sequence – note how I’m not using the term ‘GPU’ here!) to speed up performance. I tested it shortly after the release of the GeForce 256 and it could indeed keep up.

THE GPU WINS
The Rage Fury Maxx’s limelight was cut short shortly afterwards, though, when Nvidia released the DDR version of the GeForce in December 1999, which swapped the SDRAM used on the original GeForce 256 for high-speed DDR memory. At that point, Nvidia had won the performance battle – nothing else could compete.
It also took a while for everyone else to catch up, and at this point, various people in the industry were still swearing that the ever-increasing speed of CPUs (we’d just passed the 1GHz barrier) meant that software T&L would be fine – we could just carry on with a partially accelerated 3D pipeline.
When 3dfx was building up to the launch of the Voodoo 5 in 2000, I remember it having an FAQ on its website. Asked whether the Voodoo 5 would have software T&L support, 3dfx said, ‘Voodoo4 and Voodoo5 have software T&L support.’ That’s not deliberately dishonest – every 3D graphics card could support software T&L at this time, as it was done by the CPU – but it looked as though the answer was there to sneakily suggest feature parity with the GeForce 256.
In fact, the only other graphics firm to come up with a decent competitor in reasonable time was ATi, which released the first Radeon half a year later, complete with hardware T&L support. Meanwhile, the 3dfx Voodoo and VideoLogic PowerVR lines never managed to get hardware T&L support on the PC desktop, with the Voodoo 5 and Kyro 2 chips still running T&L in software.

With no T&L hardware, 3dfx fought back with a brute-force, multi-chip approach on the Voodoo 5 5500. Photo credit: Konstantin Lanzet

But 3dfx was still taking a brute-force approach – chaining VSA-100 chips together in SLI configuration on its forthcoming Voodoo 5 range. The Voodoo 5 5500 finally came out in the summer of 2000, with two chips, slow SDRAM memory and no T&L hardware. It could keep up with the original GeForce in some tests, but by that time Nvidia had already refined its DirectX 7 hardware further and released the GeForce 2 GTS.
By the end of the year, and following a series of legal battles, 3dfx went bust and its assets were bought up by Nvidia. GeForce, and the concept of the GPU, had won.
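The transform and lighting work the GeForce pulled off the CPU is, at heart, a little vector maths per vertex. As a heavily simplified, hypothetical Python sketch – one perspective projection and one diffuse-lighting term, with every name and number invented for illustration rather than taken from DirectX 7 or the GeForce itself:

```python
import math

def transform_and_project(vertex, cam_z, fov_deg, width, height):
    """Toy 'transform' stage: a camera-space vertex becomes a screen pixel."""
    x, y, z = vertex
    vz = z - cam_z                                   # view-space depth
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)  # focal length from FOV
    ndc_x = (x * f) / vz                             # perspective divide
    ndc_y = (y * f) / vz
    sx = (ndc_x * 0.5 + 0.5) * width                 # map -1..1 to pixels
    sy = (0.5 - ndc_y * 0.5) * height                # screen y grows downwards
    return sx, sy

def lambert(normal, light_dir):
    """Toy 'lighting' stage: diffuse brightness is the cosine of the angle
    between the surface normal and the light direction (clamped at zero)."""
    nx, ny, nz = normal
    lx, ly, lz = light_dir
    return max(0.0, nx * lx + ny * ly + nz * lz)

# A vertex dead ahead of the camera lands in the screen centre.
print(transform_and_project((0.0, 0.0, 10.0), 0.0, 90.0, 640, 480))
# A surface facing the light is fully lit.
print(lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```

Multiply that arithmetic by thousands of vertices per frame and it's clear why moving it from the CPU into dedicated hardware was worth shouting about.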

43
S U B S C R I P T I O N / OFFER

ALL PRINT SUBSCRIPTIONS NOW COME WITH A

FREE DIGITAL SUB


PRINT + DIGITAL +
Free delivery of the print magazine to your door
Exclusive subscriber-only covers
Save up to 37% on the shop price of print issues
Access to the digital edition on your iOS or Android device

CHOOSE YOUR SUBSCRIPTION OFFER


£5 for 3 issues, renewing at £25 every 6 issues (UK only)
£5 rolling subscription (UK only)
£25 for 6 issues (UK only)
£45 for 12 issues (UK only)
£80 for 12 issues (EU)
£90 for 12 issues (Rest of the world)

SUBSCRIBE TODAY!
Web: custompc.co.uk/subscribe
Phone: 01293 312182
Email: [email protected]
Subscriptions, Unit 6 The Enterprise Centre, Kelvin Lane, Manor Royal, Crawley, West Sussex, RH10 9PE

Please allow 28 days for delivery.

44
SUBSCRIBE TO

GET 3
ISSUES
FOR £5

custompc.co.uk/subscribe
45
RETROGRADE

SOUND

46

THE SOUND
BLASTER STORY
Ben Hardwidge talks to Creative Technology founder and CEO, Sim Wong Hoo,
about the development of the iconic Sound Blaster brand

Now celebrating its 30th birthday, the Sound Blaster made a
massive impact when it was
launched back in 1989. It seems
bizarre now, but at that time,
gaming was still considered to be
a frivolous novelty for the PC,
which was primarily a business
machine. While the Atari ST and
Commodore Amiga had half-
decent sound capabilities, most
PCs came equipped with only a
mono PC speaker, which simply blurted out chirps and beeps like an excitable 1970s telephone. PC audio was terrible.

The first Sound Blaster, codenamed ‘Killer Card’, was launched in 1989, combining MIDI synthesis with 23KHz audio playback

If you wanted proper music in your games then you needed a MIDI card. Rather than playing back a music recording like current games, MIDI music is a bit like a Word document. In a Word document, the fonts are stored somewhere else, and the Word file just stores the formatting, meaning you can store a huge number of words and pages in a very small file size. In the same way, with MIDI, you have the sounds stored on a synthesiser card, and a game’s music file just tells it which sounds to play and when.
This started with basic FM synthesisers such as Yamaha’s OPL2, which modulated frequencies to simulate instruments, and later moved up to ‘wavetables’ of sampled instruments to create much more realistic-sounding music.
In the days before we had very powerful CPUs and masses of storage space, this meant complicated musical scores could be performed in games using tiny files, without needing masses of processing power, or a massive hard drive to store a recording. AdLib was one of the first companies to market a MIDI music expansion card for the PC, making a massive difference to games, but the Sound Blaster went one step further by combining MIDI music with basic sampling capabilities.
The result was an audio system that could give you decent music in games, as well as sampled speech and sound effects. It changed the PC’s sound forever and sold by the bucketload. It was the final part of the equation needed to transform the PC into a proper gaming machine. Thirty years after the original Sound Blaster card was launched, we caught up with founder and CEO of Creative Technology, Sim Wong Hoo, to talk about the history of the iconic Sound Blaster brand.
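The Word-document analogy translates directly into numbers. Here's a back-of-the-envelope comparison in Python – the note rate and bytes-per-event are invented round figures for illustration, not real MIDI file overheads:

```python
# Three minutes of game music, stored two ways (figures illustrative).
seconds = 180

# PCM recording at CD quality: 44,100 samples/s, 16-bit (2 bytes), stereo.
pcm_bytes = seconds * 44_100 * 2 * 2

# MIDI-style event stream: say ~10 notes per second, with each note
# costing a handful of bytes for note-on, note-off and timing - call it 8.
midi_bytes = seconds * 10 * 8

print(pcm_bytes // 1_000_000, "MB of PCM versus", midi_bytes // 1_000, "KB of events")
```

Roughly 30MB of recording against a few kilobytes of events – on a machine with a 40MB hard drive, that difference was the whole argument for synthesised music.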

47

The later Sound Blaster 2 increased the playback sampling rate to 44KHz, still on an 8-bit ISA card

CPC: Let’s start right at the beginning. What made you think there was a definite market for a discrete sound card in the 1980s?
Sim Wong Hoo: Let me go a little bit further back. I started playing with microcomputers in 1979, when there was only a handful of them around. They were either dumb or only managed some beeps. At that time, I was designing some computerised seismic data logging equipment, which my former French boss claimed to be the most advanced in the world. When the equipment was brought to operate in oil rigs, nobody believed that it was possible that a Singaporean had designed this equipment in Singapore, which had no high-tech industry at all.
With a strong background in digital and analogue technologies, plus acoustic knowledge, coupled with a deep interest in the science of music, I had a burning desire to bring sound into the computer world. In fact, my first secret microcomputer project in my French boss’ company was writing an electronic organ program in machine language that could be played on the computer keyboard, much to the chagrin of my boss. I left his company and started Creative in 1981 with a mission – to bring sound into the computer world.
It took Creative another five years, until 1986, before we developed a PC – the Cubic CT – that had sound and music capabilities. This was five years before the term ‘multimedia’ for PC was even coined. But we were too early, as there wasn’t any third-party content to support it, especially voice-capable software. Creative faced a Herculean task in marketing the Cubic CT, especially in a tiny market such as Singapore.
In 1987, after some soul-searching, we decided that the Cubic CT was too complex an animal for a tiny startup in Singapore to handle. We decided to focus our energies on just the music portion of the Cubic CT, for which we had developed some cool software, such as the Intelligent Organ in 1986, which enabled you to play orchestra-like music with just one finger tapping on the keyboard. This became our Creative Music System music card, which featured a stereo 12-voice music synthesiser. This product became quite a hit in Asia. It was our first highly lucrative product, and back then, that meant a lot for a small company such as Creative.
In 1988, I felt the time was right for me to go to the USA, which was the world’s largest PC market at the time, and my mission was to create a PC sound standard for the whole world. A lot of people felt this was an impossible mission, considering our small size and limited resources at the time.
While I was in the USA, I learned that the market for music cards had started to gain traction in the gaming industry. We quickly approached key game developers to support our music cards. As an unknown company from Asia, it was very challenging initially. However, we soon gained the respect of several key developers, because of our prowess in technology and commitment to supporting these developers.
To target this gaming market, we changed the name of our music card to Game Blaster and dropped the price by half. In the process of talking to these developers, they strongly requested a sound card that could support voice. I told them that we had already done it in 1986, but removed the feature due to lack of support for it. I told them that if they were willing to support it, we could do a joint development. The first company with which we worked closely was Broderbund, with its Carmen Sandiego series of educational games. The project name of this sound card was ‘Killer Card’. Broderbund developed its new voice-capable games with a crude prototype version of our ‘Killer Card’. This card actually consisted of two prototype boards, interlinked together with a whole bunch of wires.

The Sound Blaster Pro added an IDE interface, enabling Creative to sell multimedia packs with a sound card and CD-ROM drive

The ‘Killer Card’ became the Sound Blaster, and it was launched in November 1989 at Comdex in Las Vegas. And with the voice-capable games from Broderbund ready to ship, Sound Blaster was ready for prime time. Michael Jackson passed by and was attracted to the only booth that generated computer audio throughout the entire Comdex show – the Creative booth. I showed him the demo and presented our

48
technology, and he stayed for 30 minutes, even though his minders wanted to usher him away after five minutes. Obviously he was awed.
At Comdex, people lined up in 20-person queues in front of three cashiers in our tiny 300-square-foot booth – we sold one Sound Blaster every four minutes. This was a phenomenal success and Sound Blaster took off like a rocket after that. To date, over 400 million units of Sound Blaster have been sold. In fact, all PCs today still retain the original Sound Blaster compatibility in the OS.

CPC: What were the limitations of these old cards in comparison with later Sound Blasters?
Sim Wong Hoo: The Sound Blaster was an 8-bit sound card with a low sampling rate. The audio quality was coarse and very bad by today’s standards. But going from no sound to ‘got sound’ was a giant step for the PC at that time. Users were thrilled by this new capability, and by its highly affordable mass-market price.
The OPL2 synthesiser was a two-operator FM synthesis chip and could only generate nine-voice mono music. ‘Two-operator’ means it uses only two sine waves to modulate each other and generate different kinds of musical instrumental sounds. While it sounded slightly better than our own 12-voice stereo synthesiser, it was still rudimentary in the realm of electronic music instruments.

CPC: The first Sound Blaster used a Yamaha OPL2 FM synthesiser, making it AdLib-compatible. If AdLib hadn’t done this first, do you think the first Sound Blaster cards would have had different MIDI synthesis?
Sim Wong Hoo: The first Sound Blaster did have a different music synthesis system – that was already in our Cubic CT PC in 1986. In fact, the first generation of Sound Blaster supported both the Yamaha OPL2 FM synthesiser and our own 12-voice synthesiser, so by default, it automatically supported a wider range of software from the two standards, giving users the best of both worlds.

The Sound Blaster 16 made full use of the 16-bit ISA interface, enabling CD-quality 16-bit/44KHz sampling

CPC: The first Sound Blaster made a killer product by combining PCM audio with FM synthesis, but its sampling rate was limited to 23KHz. Why was the sample rate so low?
Sim Wong Hoo: The sampling rate was low simply due to component cost, and the performance of PCs at the time. A mass-market 8-bit analogue-to-digital converter wasn’t available at that time, so analogue-to-digital sampling was performed in software using the digital-to-analogue converter. Because it was software, it was limited by the speed of the PCs at that time. Anyway, a 23KHz sampling rate is good enough for 8-bit, as the benefit of increasing the sampling rate is drowned out by the coarse 8-bit output anyway.

CPC: PC games had very limited audio features at this time – how did you go about getting game developers to implement Sound Blaster support?
Sim Wong Hoo: After the initial success of Sound Blaster, we started to engage the entire gaming industry, and supported developers to put audio into their games. We provided them with a free Sound Blaster Developer Kit, which was the first of its kind in the industry, as well as free consultancy. We even helped game developers to certify their games as ‘fully Sound Blaster compatible’ at no cost.

CPC: What was the thinking behind adding gameports to the backplates of Sound Blaster cards?
Sim Wong Hoo: It was very simple to do, and we had the space on the backplate to include a gameport. This also saved a precious slot for users who wanted to play games with joysticks.

CPC: It took a while for Creative to make the MIDI output of Sound Blasters MPU-401-compatible. Why was this?
Sim Wong Hoo: MIDI wasn’t our focus at the time – it was a small, niche and hard-to-service market. The original Sound Blaster did have a MIDI interface hidden in the gameport. It was put there to give a positive answer to curious people who asked about MIDI but didn’t need it. This limited MIDI feature didn’t cause any loss of Sound Blaster sales. We eventually did make our MIDI interface MPU-401-compatible and, as we had expected, it made no difference to our sales. The fact is that almost all our users didn’t care much about this compatibility.

CPC: The Sound Blaster Pro came on a 16-bit ISA card, but was still only an 8-bit card really. Why did it need a 16-bit ISA interface?
Sim Wong Hoo: The Sound Blaster Pro was a stereo version of the Sound Blaster, which was a requirement of the Microsoft Multimedia PC standard. It supported additional interrupts and DMAs, which were only found on the 16-bit bus.

CPC: The Sound Blaster Pro also came with an IDE interface to control a CD-ROM drive. What was the thinking behind this?
49

Sim Wong Hoo: The CD-ROM drive that met the performance requirement specifications of the Multimedia PC initiative was originally a very expensive, Japan-made CD-ROM drive with a complicated and expensive SCSI interface, which cost over $2,000 US. This expensive drive would have immediately derailed the multimedia PC initiative.
So Creative solved this nightmarish scenario by codeveloping a new and inexpensive CD-ROM drive with MKE (Japan). Creative significantly improved the performance of this low-cost drive by developing a proprietary CD-ROM drive interface on the Sound Blaster, as well as new driver software. This innovative driver went against the conventional wisdom of needing an interrupt and DMA for high-speed data transfer. Instead, it used the CPU to access the CD-ROM drive directly and create a huge buffer of data in advance, thereby increasing performance tremendously.
Putting the CD-ROM interface on the Sound Blaster had an obvious advantage in that you also didn’t require an additional expansion slot for a CD-ROM drive controller. It also simplified the sales of our Multimedia PC Upgrade Kits, which comprised a sound card, CD-ROM drive and some CD-ROM titles.

CPC: The Sound Blaster Pro 2 introduced OPL3 synthesis – what could this do that you couldn’t do on OPL2?
Sim Wong Hoo: OPL2 had two operators and nine voices, while OPL3 had four operators, 18 voices and stereo output. FM synthesis with four operators used four sine waves to synthesise music, which provided a richer timbre and thus created better-sounding musical instruments.

CPC: Several competitors started producing cheaper ‘Sound Blaster Pro-compatible’ cards in the early 1990s – how did these affect your sales, and was there any licensing involved in claiming compatibility with your cards?
Sim Wong Hoo: These so-called compatible sound cards had negligible effects on our sales, despite selling at lower prices. In fact, they helped to create a larger awareness for sound on the PC. Many of these cards suffered high returns, as users found them not to be that Sound Blaster-compatible. After the returns, the users would usually then buy original Sound Blasters.

CPC: Take us through the development of the EMU chips for the later 16-bit Sound Blasters – what were you looking to achieve with this level of advanced synthesis?
Sim Wong Hoo: EMU was the grandfather of wavetable synthesis, earlier than Yamaha and Roland, pioneering wavetable synthesis way back in the early 1970s. EMU joined the Creative family in 1993, and we started using its wavetable chips in Sound Blasters to provide much better music synthesis than FM synthesis. It was a major breakthrough for PC sound cards at that time.
The subsequent EMU chips – for example, the EMU10K1 – besides doing wavetable synthesis, were also fully programmable acoustic digital signal processing engines that powered our game-changing Environmental Audio eXtension (EAX) system. This enabled multiple simultaneous voices to be processable in hardware.

CPC: Even though so many decent MIDI sounds were available, via the AWE32, AWE64 and various wavetable cards, OPL2/OPL3 is still considered the ‘sound’ of the era – it’s the default in DOSBox, for example. Why do you think wavetable synthesis didn’t quite catch on in the same way as FM synthesis?
Sim Wong Hoo: FM synthesis supported many old games, which is why it’s still found to be the default in DOSBox. As PCs got a lot faster, and supported larger memory, I guess it was easier for developers to stream music directly in games. Some of them used their own software audio engines.

In the heyday of MIDI gaming audio, the massive AWE32 could be expanded using 30-pin SIMMs

CPC: The AWE32 was expandable via standard 30-pin SIMMs, but the AWE64 wasn’t. What was the reason for this decision?
Sim Wong Hoo: The AWE64 was targeting a much bigger market and, to be cost-effective, we had to remove the

50
memory upgrade functions. The built-in memory was sufficient for most applications. The AWE64 subsequently became a runaway success.

CPC: Some hobbyists have found ways to clone old ISA Sound Blaster cards, ordering a pre-made PCB and soldering in the components (such as the Snark Barker). Given that Creative hasn’t made these cards for 25-odd years, do they have Creative’s blessing?
Sim Wong Hoo: We have no issues with individual hobbyists who are nostalgic about our very old Sound Blaster cards.

The last big ISA Sound Blaster launch was the AWE64, with the Gold version coming with 4MB of memory

CPC: We recently did a social media survey on how people use their spare PCI-E slots, and 19 per cent of our respondents used a dedicated sound card. What do people get from a dedicated sound card that they can’t get from integrated audio?
Sim Wong Hoo: In the first place, I think motherboard audio is horrible. Many engineers, especially digital engineers, think that PC audio is achieved by simply putting a decent DAC on a motherboard. That couldn’t be further from the truth. A good audio design requires a good analogue section.
There are many contributors of noise on any motherboard, so designing a good analogue section on a noisy motherboard is almost a defeating cause. On a powerful gaming computer, the noise from a powerful CPU is even worse. This problem is magnified by on-board Wi-Fi, Bluetooth and so on. The final nail in the coffin is the multiple GPU cards found on the most powerful computers, which to me makes motherboard audio unredeemable.
Creative has many, many years of experience and expertise in pristine audio design. This expertise spans digital, analogue and acoustic audio domains, all of which are necessary for superior audio performance. Sound Blaster was well established decades before motherboard audio became pervasive. Over the years, as motherboards became more powerful and noisier, our Sound Blaster cards, despite being plugged into the motherboard, were always a few steps ahead in being able to preserve this pristine quality.
If users are willing to spend money on an expensive high-end graphics card, it would make total sense for them to invest in a worthy sound card to complete the entertainment experience. Even our lower-end sound cards provide good audio, and retain a big following to this day.
For example, the Sound BlasterX AE-5 offers dedicated high-quality components, and proprietary technologies such as Xamp, which drives individual headphone channels, providing much better headphone audio transience. It can also drive two extreme ends of the headphone spectrum, from 600 Ohm studio monitor headphones to 16 Ohm sensitive in-ear monitors.
Then there’s the Sound Blaster audio processing technology, which can be personalised to suit individual entertainment needs, such as specific game profiles. It has features such as Creative Multi Speaker Surround 3D technology (CMSS 3D), which is able to provide 3D surround audio on just two front speakers. There’s also the Crystalizer, which helps to restore details that are otherwise lost in compressed audio, and DialogPlus, which enhances speech clarity in movies.
In fact, we’ve also moved beyond the internal sound card. To serve users who don’t have a slot for internal sound cards, we have a family of external Sound Blasters, which provide the same high-end audio performance and features.

CPC: How much of Creative’s revenue comes from Sound Blaster cards these days, and how does this compare with the past?
Sim Wong Hoo: The sound card revenue of today obviously can’t be compared with the heyday of Sound Blaster in the past, when we used to ship millions of sound cards a month. That said, Sound Blaster is still an important contributor to our revenue. Plus, with the recent launch of our Sound Blaster AE series, we are seeing renewed interest in sound cards.

CPC: What’s next for Sound Blaster?
Sim Wong Hoo: Super X-Fi is our latest revolution in headphone audio. It provides a holographic-like audio experience in headphones that’s as good as the real thing. To the headphone industry, it will be like the transition from black and white TV to colour TV. Super X-Fi will be seen in upcoming Sound Blasters, and this could well reignite the audio revolution for the world.
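The two-operator FM synthesis that came up earlier – one sine wave wobbling the phase of another, the technique behind the OPL2 – can be sketched numerically. This is a hypothetical Python fragment: the frequencies, sample rate and modulation index are illustrative values, not OPL2 register programming.

```python
import math

def fm_sample(t, carrier_hz, modulator_hz, index):
    """One sample of two-operator FM synthesis.

    The carrier sine wave's phase is modulated by a second sine wave;
    the modulation index controls how bright or metallic the timbre is
    (index 0 gives a pure sine tone).
    """
    return math.sin(2 * math.pi * carrier_hz * t +
                    index * math.sin(2 * math.pi * modulator_hz * t))

# Render a 10ms burst of a 440Hz carrier modulated at 220Hz.
rate = 22_050  # samples per second (illustrative)
tone = [fm_sample(n / rate, 440.0, 220.0, 2.0) for n in range(rate // 100)]
print(len(tone), min(tone) >= -1.0 and max(tone) <= 1.0)
```

Changing the carrier-to-modulator frequency ratio and the index is all it takes to move between organ-like, bell-like and brassy timbres – which is why two sine waves per voice went such a long way on early sound cards.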

51

THE PC
SPEAKER
K.G. Orphanides delves into the bleeps and bloops
of the PC’s original primitive sound system

Before sound cards brought us polyphonic music and CD-quality PCM (pulse-code modulation) audio recordings, PCs could make exactly one noise: a square wave, output through a dynamic speaker driven by the computer's timer chip. Launched in 1981, IBM's first Personal Computer, the model 5150, had an internal 2.25in (5.7cm) speaker, designed to produce BIOS error codes to help diagnose problems at boot.

It was driven by the Intel 8253 Programmable Interval Timer, the same piece of hardware that handled system timing. While Timer Channel 0 was used for system synchronisation, Timer Channel 2 was used to send square waves to the internal speaker, making it beep.

By the 1990s, the 8253 had been superseded by the compatible Intel 8254, and these days you'll find its functions integrated into your motherboard's chipset, alongside an Intel Advanced Programmable Interrupt Controller (APIC) variant. All of them retain the PC internal speaker functions.

The Intel 8253 chip drove the original PC speaker. Credit: Wikimedia Commons

As PC sound card adoption grew through the 1990s, fewer games used the integrated beeper and smaller piezoelectric speakers would become more commonplace. These were quieter, and lacked the versatility and subtlety of a larger dynamic speaker, making some fancier audio effects far less distinct and often too quiet.

Many modern PCs no longer come with any kind of speaker. But motherboards still have the header connector, so you can still install one and listen to audio designed for an internal beeper as it was meant to be heard.

QUEST FOR POLYPHONY
Whichever way your PC beeper sound is implemented, it's monophonic, which means it can only produce one tone at a time. But, as with other very limited early computer audio standards, that wasn't going to prevent

52
composers from doing remarkable things with it.

Beyond simple system beeps, the easiest music to persuade a PC speaker to reproduce is single-tone melodies. A series of instructions is sent to the timer via the CPU, using the programming language of your choice, telling it to produce a series of tones at a specified frequency.

Sound effects in games also started out as simple beeps, but programmers soon started getting clever, rapidly changing the tones being sent to the speaker to produce complex audio effects. Apogee Games mastered the art of creating convincing – or at least distinctive – PC speaker effects in titles including Commander Keen and Hocus Pocus.

You technically can't play polyphonic music on hardware that can only produce one voice at a time but, as it transpires, there are ways around this problem. Probably the most widely used approach is arpeggiation, where a pseudo-polyphonic effect is achieved by rapidly switching from one tone to another – anywhere up to 120 times a second – to give the impression of chords to the listener.

A number of games, including the 1990 PC version of The Bitmap Brothers' 1989 hit Xenon 2 Megablast, the PC port of Sega's Golden Axe in the same year and Magnetic Fields' Lotus III in 1992, create two or three virtual audio channels and alternate which of them is directed to the timer chip, allowing basslines to be rapidly switched into the music. The results often sound harsh and busy, but produce a rather effective impression of polyphony.

An arpeggiated pseudo-polyphonic 'chord' from The Secret of Monkey Island

A combination of these techniques was used to even better effect in LucasArts' PC speaker music, such as the remarkable beeper rendition of the main theme from The Secret of Monkey Island (1990), where the sophisticated use of fast trills and an alternating percussive channel created the impression of steel drum chords backing the main melody.

Other techniques made more direct changes to the way the PC speaker's sound output worked. Windmill Games' 1983 booter game Digger, with its iconic use of Hot Butter's Popcorn as its in-game theme, is thought to be the earliest title to use pulse width modulation (PWM) as a method of producing more sophisticated sound, with a variable volume and harmonies.

Also used in numerous ZX Spectrum games, PWM uses careful timing of the signals sent to the PC speaker to modulate its usually binary voltage levels, forcing the speaker into a range of partially on positions to produce sine waves. This can effectively turn the speaker into a 1-bit DAC (digital-to-analogue converter). Also heard in titles including Hard Drivin' and Fantasy World Dizzy, this approach can be used to play a pre-generated soundtrack, rather than using the timer chip to directly generate square wave tones. However, even at 1-bit, this sound reproduction was often CPU-intensive, and the resulting audio's low quality grates on many listeners.

A classic PC speaker square wave from the theme to Space Quest III: The Pirates of Pestulon

Later, Access Software's RealSound technology used a near-inaudible carrier wave and fine-grained control of the PC speaker's displacement amplitude to produce 6-bit digitised audio, giving us surprisingly high-quality speech and music in games including Mean Streets, World Class Leaderboard Golf and Legend Entertainment's Spellcasting series.

By 1992, even Microsoft was in on the game, releasing a driver for Windows 3.1 that allowed any PCM WAV file to be output via the internal speaker. As sound cards, CD-ROM games, and then integrated motherboard audio became ubiquitous, the need to write dedicated timer chip music or kludge samples through the internal beeper evaporated, and PC speaker audio vanished from audio selection screens.

FURTHER LISTENING
The Secret of Monkey Island custompc.co.uk/MonkeyIsland
PCM audio through the PC speaker custompc.co.uk/PCM
Album: System Beeps custompc.co.uk/SystemBeeps

BACK TO THE PRESENT
Today, PC speaker music isn't as dead as you might expect. Although less iconic than the C64 or NES audio systems, you can hear its influence in the modern chiptunes music scene.

In February 2019, Russian composer Shiru released System Beeps, an entire album written for the PC speaker and using some of the most sophisticated arrangement, arpeggiation and hearing perception tricks we've heard to create an illusion of polyphony. There is, of course, a DOS version of the album, but if you don't happen to have any classic PC hardware (or a copy of DOSBox), it's also available to buy in conventional digital formats.

Shiru used modern Digital Audio Workstation software to compose System Beeps and has made relevant plug-ins, projects and source code available for anyone else who wants to play with them.

Shiru isn't alone in working on music creation tools for your internal beeper. BaWaMI, created by Robbi-985, is a Windows MIDI synthesiser that will output via PC speaker. If you're so inclined, you can still hear and make new music for the PC's oldest audio device.
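The timer-programming and arpeggiation tricks described in this feature are easy to sketch. The model below is illustrative rather than driver code – on real hardware the divisor would be written to PIT channel 2 via I/O ports – and the 1,193,182Hz figure is the PIT's standard input clock; the chord notes and switch rate are our own example choices.

```python
# Sketch of how PC speaker tones and arpeggiated 'chords' are produced.
# The PIT (8253/8254) divides a fixed 1,193,182Hz input clock by a 16-bit
# divisor to make a square wave; arpeggiation rapidly alternates divisors.

PIT_CLOCK_HZ = 1_193_182  # standard PIT input clock frequency on PCs

def divisor_for(freq_hz: float) -> int:
    """Return the 16-bit reload value you'd program into PIT channel 2."""
    return round(PIT_CLOCK_HZ / freq_hz)

def actual_freq(divisor: int) -> float:
    """The frequency the speaker really produces for a given divisor."""
    return PIT_CLOCK_HZ / divisor

def arpeggio_schedule(freqs, switches_per_sec=120, duration_s=1.0):
    """Cycle through the chord's notes at the switch rate, yielding the
    divisor to program for each time slice (pseudo-polyphony)."""
    slices = int(duration_s * switches_per_sec)
    return [divisor_for(freqs[i % len(freqs)]) for i in range(slices)]

# A 440Hz concert A needs a divisor of 2712, giving roughly 439.96Hz out.
print(divisor_for(440.0))  # 2712
# A C major triad arpeggiated 120 times a second for one second:
schedule = arpeggio_schedule([261.63, 329.63, 392.0])
print(len(schedule), schedule[:3])
```

Because the divisor is an integer, the output frequency is only ever an approximation of the requested pitch – one reason beeper music has its characteristic slightly-off intonation.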

53

ROLAND MT-32
K.G. Orphanides looks back at Roland’s external MIDI synth
that revolutionised early PC gaming music

Released in 1987 as a musician's tool, the MT-32 would revolutionise PC gaming audio

A glossy black box with a green LCD invites you to 'Insert Buckazoid' on its screen. A stirring 1980s sci-fi theme blasts glossy-textured synth tones through the speakers connected to it, as you're brought up to speed on the continuing exploits of space janitor Roger Wilco. In 1989, Space Quest III leaned into the highest-quality music available on home computer platforms, an external MIDI audio device that was as prohibitively expensive as it was revolutionary.

When it was released in 1987, the original Roland MT-32 MIDI synthesiser cost £450 in the UK – equivalent to over £1,200 in today's money – and it didn't even come with the MIDI interface card you'd need to connect it to your PC.

Roland primarily marketed its MIDI expander module at amateur electronic musicians: a multi-timbral synth-in-a-box that could be controlled by any MIDI keyboard. It proved popular by being significantly cheaper than most rivals, and by supporting 32-note polyphony across up to eight simultaneous voices.

But the MT-32 would become best known as the pinnacle of IBM PC-compatible gaming audio from the late 1980s to the mid-1990s, and it helped to popularise the fully orchestrated game soundtracks we take for granted today.

WHAT'S IN THE BOX?
In 1987, digital synthesis was still a relatively new technology, developed in the 1970s and popularised in 1983 by Yamaha's DX7 synthesiser. It would become the archetypal sound of the 1980s, with a very different feel to analogue synthesisers' use of control voltages to determine pitch, gate and trigger signals.

The MT-32 used Roland's new Linear Arithmetic (LA) synthesis (see custompc.co.uk/LASynth) technique, first seen a few months earlier in Roland's 61-key D-50 keyboard synthesiser. LA synthesis relies on Partials: fundamental sounds to which it then adds effects in order to produce voices. These Partials are either stored as pulse code modulation (PCM) sound samples (as used by audio CDs, WAV files and so on) or fully simulated combinations of oscillators, creating the tone. Filters then determine the brightness of the sound by fixing its cutoff frequency, and an amplifier then determines its loudness. The LA chip's pitch and amplitude envelopes act on the PCM sounds, determining the note produced and its attack, decay, sustain and release. This technique enabled the synth to produce a realistic (for the time) reproduction of genuine instruments.

The innards of an early MT-32

Alongside the LA chip, you'll find a dedicated gate array, a reverb chip, a Burr-Brown PCM54 DAC, a clutch of op-amps, and EEPROMs that hold the MT-32's firmware and PCM sample banks. You can even send custom patches to the MT-32 – specific configurations of effects for the LA synthesis chip to render on a voice from the PCM bank, so you can effectively make new instruments.

Uniquely, the MT-32 could be sent SysEx messages to display short text strings – a feature that many games used

GETTING INTO PC GAMING
The first IBM PC-compatible game with an MT-32 soundtrack was Sierra's King's Quest IV: The Perils of Rosella. Scored for the MT-32 by film and TV composer William Goldstein, the game also supported other audio hardware on release, notably the Yamaha OPL2-based AdLib.

Sierra would carry the flag for the MT-32, recruiting Supertramp drummer Bob Siebenberg to create the soundtrack for Space Quest III, and even selling the MT-32 and required MPU-401 ISA MIDI interface card for $550 US (equivalent to around £950 today), with MIDI composition software and two Sierra games of your choice included. The MT-32's original US retail price was $695 (around £1,200 today). It wasn't cheap, particularly compared with the AdLib and CMS Game Blaster cards Sierra also sold, but it was the best way to get what the company's 1989 catalogue describes as 'a symphony orchestra playing in your living room'.

Other companies took up the challenge, some more enthusiastically than others. Origin Systems supported the MT-32 with some excellent soundtracks from 1990's Ultima VI and Bad Blood, through to Pacific Strike in 1994. LucasArts/Lucasfilm Games put most of its MT-32 support into its Star Wars titles, such as X-Wing, although some adventure games, including Sam & Max and the Monkey Island titles, received MT-32 MIDI soundtracks. Legend Entertainment, New World Computing and Microprose were also enthusiastic adopters.

UK game development support for the MT-32 included the Bitmap Brothers' Gods, Adventuresoft/Horrorsoft's Elvira and Simon the Sorcerer games, Team 17's Alien Breed, Gremlin Graphics' Litil Divil and Plan 9 From Outer Space, as well as Ocean Software's Elf.

Sierra aggressively promoted and supported the MT-32 until the General MIDI standard was published in 1991, which standardised the voice types and program numbers, ensuring that the right instrument sounds were playing the right parts on all compatible devices, although the quality of the voices still depended on your synth.

The music for Laura Bow II: The Dagger of Amon Ra (1992) was composed on the MT-32, but released with full support for new General MIDI audio devices such as the Roland SCC-1. Other studios supported the MT-32 as late as 1997, with the cover disk demo of Bethesda's The Elder Scrolls: Daggerfall (custompc.co.uk/Daggerfall) being among the last.

VERSIONS AND RELATIONS
The MT-32 spawned a host of versions and successors, and became a de facto MIDI standard for other sound card producers before General MIDI was established.

The first 'old' version of the MT-32 is easy to spot, based on its port configuration – it had just a stereo pair of 1/4in TRS outputs. If you connect it to a MIDI interface card on any PC faster than a typical 286, it can produce buffer overflow errors due to an insufficient delay between SysEx messages sent to the device. This could be resolved using the turbo button on 386 and 486 PCs, or slowdown utilities on later PCs. This doesn't affect modern PCs using good-quality USB-to-MIDI connectors though – delaySysEx switches are also implemented in a number of popular emulators.

The second 'new' version of the MT-32 introduced a functionally undetectable control CPU switch, along with an additional rear TRS stereo headphone port and reduced noise levels. It also added a ROM playback demo mode and introduced some changes to the gate array and ROM chips. It fixed the buffer overflow error affecting faster computers, but it also rectified some firmware bugs on which some game composers had relied, breaking some soundtracks.

The MT-32's appeal to computer music composers didn't go unnoticed by Roland, and the company followed it with the screenless Computer Music (CM) range of MIDI devices, based on the LA chip. This included, in 1990, Roland's first internal ISA sound card, the LAPC-I, which integrated an MPU-401 interface and MT-32-compatible CM-32L synth.

By 1991, General MIDI was standardised and Roland launched its Sound Canvas range with the SC-55, which used Roland's own GS (General Standard) extension to provide even more voices. A year later an internal version, the Roland SCC-1, was released. Both provided reasonable MT-32 backwards compatibility, but lacked support for custom MT-32 instrument patches.

These MIDI devices, and many to follow, would be popular with musicians for years, but MIDI music in games was on the wane. Full CD audio was clumsy at first, but as disk capacity and audio compression improved, it would be digitally recorded audio that led game music into the new millennium.

LISTEN TO THE ROLAND MT-32
Space Quest III custompc.co.uk/SQ3
Frederik Pohl's Gateway custompc.co.uk/FPG
Ultima VI custompc.co.uk/UltimaVI
The Bard's Tale III custompc.co.uk/BT3
Dune custompc.co.uk/dune
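As noted above, the MT-32 can be sent SysEx messages that put short text strings on its LCD. The sketch below builds such a message; the display address (0x20 0x00 0x00), the DT1 command byte and the Roland checksum rule follow the MT-32's published MIDI implementation, but treat the details as assumptions to verify against the manual before sending anything to real hardware.

```python
# Sketch: build the Roland SysEx (DT1) message that writes a short text
# string to the MT-32's LCD. Layout and checksum follow the MT-32's
# published MIDI implementation; verify against the manual before use.

def roland_checksum(payload: bytes) -> int:
    """Roland's checksum: low 7 bits of address + data + checksum sum to 0."""
    return (128 - sum(payload) % 128) % 128

def mt32_display_message(text: str, device_id: int = 0x10) -> bytes:
    data = text.encode("ascii")[:20]         # the LCD shows up to 20 characters
    address = bytes([0x20, 0x00, 0x00])      # display area in the MT-32 memory map
    body = address + data
    return bytes(
        [0xF0, 0x41, device_id, 0x16, 0x12]  # SysEx start, Roland ID, device, MT-32, DT1
        + list(body)
        + [roland_checksum(body), 0xF7]      # checksum, SysEx end
    )

msg = mt32_display_message("HELLO!")
print(msg.hex(" "))
```

On a first-revision unit you'd also need to pace messages like this one – the buffer overflow issue described above is exactly what happens when SysEx packets arrive back to back too quickly.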

55
RETROGRADE

STORAGE

56

FLOPPY DISKS
Ben Hardwidge takes you through the workings of various types
of floppy disk, which were once the PC’s main storage medium

The classic 3.5in HD floppy disk has become a bit of an icon now, and in more ways than one. Plenty of people will tell you that their kids think of a floppy disk as the Save icon in Word. The 3.5in floppy is also what most people imagine when you say 'floppy disk' – a plastic case with a spring-loaded metal protector and a 1.44MB storage capacity. The history of floppy disks goes back well beyond these neat little storage packs though.

My first experience of multiple types of floppy disk came when my dad bought me a game (Targhan, in case you're interested) for our PC XT clone in 1988. I opened the box, and inside it were two 5.25in disks. We had to send it back to get the version with 3.5in disks, which took ages because, at the time, hardly anybody used 3.5in disks for PCs. All the major models, from the IBM PC to the Amstrad PC1512 and PC1640, had one or two 5.25in floppy drives instead.

Floppy disks were the main form of storage for the first decade of the PC's history, in many cases the only form of storage. But the history of the floppy disk goes back even further than the first PCs. For the purpose of this feature, I got hold of one of the very first types of floppy disk, an 8in single-sided single-density disk. It's huge. You can put a 50p piece in the central spindle hole with space to spare. It holds a formatted capacity of just 248KB.

A single-sided, single-density 8in floppy disk, with a 50p coin for scale. It has a total formatted capacity of 248KB

The floppy disk was one of the first solutions to the problem of transferring data from one place to another. We'd used punch cards, punched tape and magnetic tape, which worked, but were laughably awful in terms of reliability, convenience, space and the length of time taken to load and save data. In 1971, IBM's first read-only 'Type 1 diskette' was an attempt to solve these problems in a neat 8in package, capable of storing 81KB. In 1973 it became commercially available with read/write abilities, and a larger capacity of 248KB.

MAGNETS, HOW DO THEY WORK?
Floppy disks work on the basic principle of magnetic binary storage. As you probably know, all computer data can be broken down to simple on-off switches called bits at its most basic level – if the switch is off, it's a zero; if it's on, it's a 1. There are eight bits in a byte, 1,024 bytes in a kilobyte, 1,024 kilobytes in a megabyte and so on.

Inside the package of a floppy disk is a circular piece of magnetically coated material with a hole in the middle, and a piece of protective fabric on either side of this material to protect it. In the case of 8in and 5.25in disks, the hole is left blank for the drive's spindle to go through it. In the case of 3.5in disks, there's a metal plate in the middle with holes in it, onto which the floppy drive can lock.

The drive then spins the disk and a stepper motor brings the magnetic read/write heads into contact with the disk. With 8in and 5.25in disks, where the disk is exposed in a hole at the front, the heads make contact with the disk once you insert the disk and flip down the physical lever at the front of the drive to lock it in place. With 3.5in disks, the heads make contact with the disk once it has been fully inserted in the drive, meaning the protective metal plate at the top has been fully moved to expose the disk, and it's all locked in place.

Once the heads make contact with the disk, and the disk is spinning, the drive can then read or write data – a magnetic transition denotes an on (1) switch, while no magnetic transition means an off (0) switch. All the ones and zeroes are encoded/decoded in a bitstream, and in the case of 5.25in and 3.5in floppy disks, this is generally MFM (modified frequency modulation), although there were other encoding methods in the early days of floppy disks.
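The clock-and-data idea behind FM and MFM encoding can be sketched in a few lines. This is a model of the bit-cell logic only, not of real drive electronics: FM writes a clock transition before every data bit, while MFM inserts a clock transition only between two consecutive zero bits.

```python
# Sketch of FM and MFM bit-cell encoding. Each data bit becomes a
# (clock, data) pair of flux-transition slots: 1 = transition, 0 = none.

def fm_encode(bits):
    """FM: a clock transition before every data bit (0 -> 10, 1 -> 11)."""
    out = []
    for b in bits:
        out += [1, b]
    return out

def mfm_encode(bits, prev=0):
    """MFM: a clock transition only between two consecutive 0 bits,
    so the same data produces far fewer transitions than FM."""
    out = []
    for b in bits:
        clock = 1 if (prev == 0 and b == 0) else 0
        out += [clock, b]
        prev = b
    return out

data = [1, 0, 0, 1]
print(fm_encode(data))   # [1, 1, 1, 0, 1, 0, 1, 1]
print(mfm_encode(data))  # [0, 1, 0, 0, 1, 0, 0, 1]
```

Counting the 1s shows why MFM enabled double density: the same data needs fewer flux transitions, so bit cells can be packed closer together on the same physical medium while the clock still keeps runs of identical bits unambiguous.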

57

Going into the complete workings of MFM would take a feature in itself, but the basic gist is that it introduces a clock to separate the bits in the bitstream. After all, a computer would have a tough time reading a long line of zeroes in a row, with no magnetic transitions between them to tell it whether this was one 'off' bit or several of them. The idea behind using FM and MFM data encoding was to enable a non-return-to-zero (NRZ) system, so there was never a state where there was neither an on nor off signal.

An FM bitstream can encode a 0 as 10 and a 1 as 11, for example. MFM is more complicated than just using 10s and 11s, but the principle of using a clock to separate the bits is basically the same. Incidentally, MFM was also the standard used in early hard drives, including the Amstrad PCs in the late 1980s, before IDE was introduced.

MAKING TRACKS
Towards the latter days of the floppy disk's reign, you could buy disks pre-formatted for your type of computer, but you originally bought them unformatted. In this unformatted state, there's nothing on the disk. It's a blank, circular piece of magnetically coated material. You would then have to tell your computer to format it for your system. In my case, that meant typing 'format a: /w' at the DOS prompt.

The '/w' means 'wait' – like many people at the time, I couldn't afford a hard drive, so I had to boot DOS from a floppy disk called a system disk each time I started my PC. This system disk also contained all the DOS commands, so you would have to type the format command with the system disk in the drive, then swap over to the unformatted disk when prompted – you really didn't want to accidentally format your system disk!

The formatting process would then prepare your disk for reading and writing. Unlike the spiral of data used on most CDs and DVDs, floppy disks organise data in 'tracks' – concentric circles that are separated by small areas containing no data. These tracks are then, in turn, separated into sectors containing a certain number of bytes, with unused bytes on either side of the sector and a header to mark the start of the sector. This header also contains a cyclic redundancy check (CRC), which was also placed at the end of the data used in each sector, for error checking. There are many blank spaces, and bytes to denote start and end points for tracks and sectors, which is part of the reason why the formatted capacity is always lower than the theoretical maximum capacity of a disk.

Inside a 3.5in disk's plastic shell is a small floppy disk with a piece of protective fabric on either side

SIDES AND DENSITIES
In the case of the classic 3.5in floppy disk, you had 512 bytes per sector on a PC. Depending on your disk (and your hardware), you could also use both sides of the disk, doubling the storage capacity. The other way to increase the capacity was with the 'density', the number of tracks and sectors per side.

As an example, a 'double-density' (DD) 3.5in floppy disk has 80 tracks per side, each containing nine sectors with 512 bytes each. Each track therefore has 4,608 bytes – multiply that figure by 80 and you get 368,640 bytes, or 360KB per side. So, a single-sided, double-density (SS/DD) disk has 360KB – add another side and you get a double-sided, double-density (DS/DD) 720KB disk.

The next step, once you've got faster controllers and better physical media, is to double the number of bits (and hence sectors) that fit on one track, taking the number of sectors per track from nine to 18 and doubling the capacity from 720KB to the classic 1.4MB high-density (HD) floppy disk.

I'm referring to PC standards here, of course, but other computers, such as the Amiga and Mac, had more efficient ways of formatting disks that resulted in higher capacities. With a continuous motor speed, every sector held the same amount of data, regardless of whether it was on the inner or outer part of the disk area. As you move from the centre of the disk outwards, however, the sectors become physically bigger, which means space is wasted on the outside area of the disk, where there should be more room for data storage.

Apple got around this issue by varying the speed of the motor when the head was on the outside of the disk vs the inside of the disk, enabling it to add more storage capacity on the outside of the disk and get 800KB from a double-sided double-density disk, rather than the 720KB on a PC.

From left to right, 8in SS/SD, 5.25in DS/DD, 3.5in HD
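The capacity sums above are easy to replay. A minimal sketch, using the PC-format geometry quoted in this feature (512-byte sectors, 80 tracks per side):

```python
# Recompute the PC floppy capacities quoted in the text from their
# geometry: bytes/sector x sectors/track x tracks/side x sides.

def capacity_bytes(sides, tracks, sectors, bytes_per_sector=512):
    return sides * tracks * sectors * bytes_per_sector

# Double density: 80 tracks of nine 512-byte sectors = 360KB per side.
assert capacity_bytes(1, 80, 9) == 368_640      # SS/DD '360KB'
assert capacity_bytes(2, 80, 9) == 737_280      # DS/DD '720KB'
# High density doubles the sectors per track from nine to 18.
print(capacity_bytes(2, 80, 18) // 1024, "KB")  # the classic HD disk
```

The HD result is 1,474,560 bytes, which is where the slightly loose '1.44MB' marketing figure comes from: 1,440KB of 1,024 bytes each.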

58
You also got wildly different amounts of formatted storage space from the same physical size of disk on different systems in the early years of the floppy, as there were so many different software standards, all with different sector sizes and formatting systems.

It all resulted in a bit of a confusing mess of different standards, across all sizes of floppy disk. The 8in floppy started off being single-sided, single-density (SS/SD) in 1973, but ended up with a 1.2MB (unformatted) capacity in 1977 thanks to doubling the density and the number of sides used. Likewise, 5.25in floppy disks also had an 80-track high-density flavour that gave you up to 1.2MB of data on one disk.

KNOW YOUR FLOPPIES

DATE    SIZE     SIDES/DENSITY    FORMATTED CAPACITY
1973    8in      SS/SD            248KB
1976    8in      DS/SD            563KB
1977    8in      DS/DD            985KB
1978    5.25in   DS/DD            360KB
1982    5.25in   DS/HD            1.2MB
1983    3.5in    SS/DD            360KB
1984    3.5in    DS/DD            720KB
1986    3.5in    DS/HD            1.4MB

THE COST OF STORAGE
In these times when solid state drives are racking up huge speeds and capacities, it's difficult to imagine a hard drive being a luxury item, but hard drives were extremely expensive for a long time – just 10MB could cost you thousands of pounds.

It's for this reason that many PCs came with two floppy drives in the early days – you'd generally use the A drive for your OS and programs, and the B drive for data. There's still a bit of this residual DNA in PCs today – we may not have A and B drives any more, but our primary storage devices are still C drives, with A and B traditionally reserved by Windows for floppy drives.

In my case, I only had one floppy drive in my first PC, but at least it was a 3.5in double-sided double-density (DS/DD) one, which meant the disks had a reasonable amount of storage space compared with 360KB 5.25in floppy disks (the other main PC standard at the time). The big adventure games would arrive in large boxes containing multiple floppy disks, and with no hard drive, you would have to swap over disks regularly. But floppy disks were cheap. I could buy a couple of disks with my pocket money, and you got disks full of demos free with PC magazines.

The 5.25in floppy disk was once the standard for PC storage

DECLINE
It took a long time for the floppy disk to completely die out. Even when DVD writers and USB thumb drives were mainstream, floppy drives could still be useful for low-level computing tasks, such as flashing your motherboard BIOS or installing RAID drivers before a Windows installation.

There were also attempts to keep the format alive with the 'SuperDisk' drive in the late 1990s, with capacities of 120MB and later 240MB, along with backwards compatibility with older 3.5in drives. But eventually, capacious and speedy flash drives, as well as cloud storage, killed them off. Floppy disks were revolutionary compared with the storage methods that preceded them, but they were also notoriously unreliable, slow and noisy. You risked losing all your data if you got them near a magnet, or accidentally creased a 5.25in disk's packaging.

The last motherboard I remember coming with an MFM floppy drive controller was ASRock's Z77 Extreme6 in 2012, although you can still buy USB 3.5in floppy drives today, as well as adaptors to control old 5.25in floppy drives via a USB port. I still have a box of my old DS/DD floppy disks, a reminder of where so much of my pocket money went when that was my only storage option.

59
RETROGRADE

SOFTWARE

61

WINDOWS 1.0
35 years ago Microsoft finally launched the first version of
Windows. Stuart Andrews looks back to where Windows
started, and tries using Windows 1 again for himself

It's now more than 35 years since Windows launched in November 1985, 18 months behind schedule and almost three years after Apple's Lisa had introduced the first commercial GUI. It wasn't exactly a hit; it flopped commercially, while reviewers criticised its performance and wondered whether some of its most powerful features were really that useful. Yet less than five years later, Windows dominated the operating system market, running on over 70 per cent of all personal computers sold. You can see Windows 1 as the ugly duckling that was to transform into the, well, still gruesome but enormously successful swan.

Even in the first release, there were options to personalise Windows, although good luck finding a colour scheme that didn't look horrific

MAKING WINDOWS
Windows began its journey in the autumn of 1982. Microsoft's CEO Bill Gates was already aware of research into mouse-driven, graphical user interfaces at the legendary Xerox PARC, and of Apple's continuing work on the same principles. However, the story goes that Gates attended the autumn 1982 Comdex trade show in Las Vegas, where he saw VisiCorp demonstrate Visi On: a GUI for the IBM PC. Gates is said to have watched the demo several times, back-to-back, before suggesting that other Microsoft personnel needed to come out to Comdex and take a look. If GUIs were the future, Microsoft wanted a piece of the action.

At this point Microsoft wasn't the huge tech monolith we know today. It was still a small company that had grown successful on the back of Microsoft BASIC and MS-DOS. Gates saw an appetite for a new and easier way to work with the personal computer, and that rival systems were either too expensive – an Apple Lisa cost around $10,000 US, while you could buy a PC for under $3,000 – or too demanding in their system requirements. If it wasn't bad enough that Visi On needed a staggering 512KB of RAM and a hard disk, its applications needed to be coded in a specific version of C using Unix tools. This left space for an alternative.

Gates hired Scott McGregor, one of the key developers at Xerox PARC, and set a team to work on a project codenamed Interface Manager. Crucially, it wasn't seen as a complete OS, but as a graphical environment that ran on top of MS-DOS. In November 1983, Gates announced Windows and set its release date for April 1984.

The hype said Windows would bring a new way to use PCs. It wouldn't require a hard drive – just two floppy disk drives – and it would run with just 192KB of RAM. By December 1983, an early version was previewed for an article in Byte magazine, with its writer, Phil Lemmon, arguing that 'Microsoft Windows seems to offer remarkable openness, reconfigurability and transportability, as well as modest requirements and pricing'. The result,

62
Cutting, copying and pasting were revolutionary new ideas, enabling you to move information from one app to another

Lemmon thought, could bring computing to a new, non-technical audience.

Why, then, did it take another two years to get finished? For a start, there were some major technical challenges. When development started, standard CGA screen resolutions were limited to 640 x 200 in monochrome, and it was only with the development of EGA graphics boards in late 1984 that you had enough pixels to make Windows effective. The slow speeds and limited capacity of floppy disks had an impact, while the Intel 8088 CPUs used in most PCs weren't exactly bursting with firepower.

Perhaps worst of all, there was a challenge in building industry support. As Gates said in 1983, 'the primary focus of the company and the speeches I gave, the promotion I did, was to get people to believe in the graphics interface whether it was Macintosh or Windows, and that was a tough thing because people like WordPerfect and Lotus refused to put the resources into doing applications'.

Some believe that other factors were in play. By 1984 Microsoft was working with Apple on Macintosh software, and had signed licensing agreements for specific UI elements, but not others, including overlapping windows and the Recycle Bin. It's possible that Microsoft reworked Windows to avoid including these elements and triggering future litigation. If so, Microsoft wouldn't admit it. A November 1983 article in the US computing mag, InfoWorld, suggested that Microsoft's Steve Ballmer saw tiled windows as delivering a neater desktop.

A DEVELOPMENT DISASTER
Whatever the case, the development of Windows was definitely troubled. Tandy Trower came in as the product manager in autumn 1984, at a point where Windows was seen externally as vapourware and internally as an embarrassment. Trower even saw being put in charge of the project as a step towards getting fired. By this point Scott McGregor had resigned, and while the core components were in place, elements of the design and the look weren't working. More seriously, there weren't any applications.

'Even at Microsoft, getting developers to write Windows software was a challenge,' said Trower in a 2010 interview. 'I couldn't even get my former team to build a version of BASIC.' However, there was a prototype of a simple bitmap drawing program, while Trower persuaded Gates and Ballmer that Windows needed a set of simple applets, including a word processor, calendar and business card database.

What's more, Trower made it a requirement that Windows could run existing DOS applications. This in itself proved awkward – many DOS apps exploited tricks or workarounds that caused problems for Windows memory management – but it was a major boost to Windows in the future.

You could run three or four applications at once, provided you could tolerate painful slowdowns and a lack of screen real estate

63

Lack of app support was such a problem that the Windows team developed its own paint programs, utilities and games

By the early summer of 1985 Windows still wasn't finished, but Ballmer decided to release a 'Premiere Edition' to application developers and members of the press. The team went into crunch, to the extent that one young program manager, Gabe Newell (yes, that one) started sleeping in the office. Even at the last stages, new defects were found in the memory management code, delaying the release even further. It was only in November that testing was finished, and Windows was released at Comdex 1985 with a comedy roast where Microsoft poked fun at its own product's lateness.

Even selecting from a pull-down menu is different, involving a click, button-hold, select and release process

MALIGNED AND MISUNDERSTOOD
You might have expected the response to be rapturous, but – as with so many Microsoft products – there was disappointment and bemusement. InfoWorld ran its review with the headline 'Windows Requires Too Much Power' and gave it 4.5 out of 10. A piece by Erik Sandberg-Diment for The New York Times called Windows extremely memory-hungry. 'Running Windows on a PC with 512K of memory,' he noted, 'is akin to pouring molasses in the Arctic. Also, the more windows you activate, the more sluggishly the program makes its moves.'
Most of all, pundits weren't convinced that Windows solved any genuine problems. Some didn't see the point of the mouse or the GUI. Sandberg-Diment had his doubts about dialogue boxes, suspecting that most people would prefer 'a more direct means of executing commands.' He also felt that multi-tasking was a waste of effort. 'Most people use but one program most of the time, if not all the time,' he suggested. That's aged well.

USING WINDOWS
So how successfully did Windows 1 lay down the foundations for the Windows we know and sort of love today? Well, it has to be said that it's a very different experience. There's no desktop and the management of windows is incredibly primitive. While it is mouse-driven, icons don't play a starring role. Instead, you launch applications by double clicking on a list in the MS-DOS Executive – a simple file manager that lists not just the programs, but all your MS-DOS files.
The first application you launch occupies the whole screen, and subsequent applications split the screen into two, three or four. Once windows are in place you can close, maximise or resize them, or move them from one half or corner of the

screen to another. But with no overlapping, space gets tight pretty quickly, and the size of the fonts and the blocky graphics mean you don't always get enough room per application to make head or tail of what's going on.
There's no taskbar, but icons for the MS-DOS Executive and any minimised applications appear in a space at the bottom of the screen, where a double click will bring them back into view. There's also a Control Panel where you can set the time and date, adjust your cursor preferences, add fonts and set up your colour scheme. Of course, the EGA standard only supported a maximum of 16 colours from a gamut of 64, while only seven fonts were available on release.
Even some Windows fundamentals don't work like we expect these days. Drag and drop is as non-existent as the old Recycle Bin. Today, we also forget how Windows was so keen to demand double clicks when a single click would do. The menu bar is in place, with a button in the top-left corner where you can resize, move, close, maximise and minimise the window, but the latter two options are called Icon and Zoom. Even selecting from a pull-down menu is different, involving a click, button-hold, select and release process that feels utterly alien now.
The next shock is the primitive built-in applications. Calculator is a simple calculator with only the most basic functions. Calendar has a single field where you can add appointments on the hour or add alarms, but nothing else. The Notepad is your classic no-frills text editor – and we mean no-frills – while Terminal is the kind of baffling, text-driven comms program that only ever looked good in WarGames. Seriously, people used this stuff?
The highlights are Paint, Windows Write and Reversi, not because they're any good but because they bear some vague resemblance to modern applications. Paint has a palette of tools, plus drop-down menus to handle fonts and options. It also has virtually no room to actually do anything with the tools, unless you're keen on drawing in a low-resolution, widescreen aspect ratio, or your favourite drawing subjects are sausage dogs and snakes. Windows Write is recognisably a word processor, but there's no spell check or anything beyond basic formatting features, much like the Windows Write we all carried on not using before Windows 95. And as for Reversi, well it's a variant of the classic black and white disc strategy game Othello, but – let's face it – it's no Minesweeper or Solitaire.

LOOKING TO THE FUTURE
What's most striking about using Windows 1 now is that it feels less like an operating system than a fancy front-end for MS-DOS. It still runs from an MS-DOS command prompt, it still works with the MS-DOS file and directory structures and it was still partly designed to run MS-DOS applications, principally because Microsoft had little faith in anyone developing native Windows ones. They were right, as until Windows 3.0 took off, barely any of the major software vendors made Windows software.
Yet there are aspects of Windows that show its potential. We'll be kind and say that Microsoft 'borrowed' Apple's concept of a clipboard, allowing you to cut and paste text or pictures from one application to another. Microsoft's early Windows adverts go big on copying contact details from a database and pasting them into a letter in Windows Write, then adding a graph from Microsoft Chart or Lotus 1-2-3 which, at that point, was the T-Rex of business applications.
Microsoft designed Windows to be compatible with a range of applications – not just its own – and to promote interoperability, so that you didn't have to work with just one application, or even one specific suite. Windows wanted you to mix and match. Microsoft also designed Windows as a GUI that could work across PCs with different hardware, and this in turn helped to make the PC market more competitive. Even at that point, Bill Gates' ambition was 'to create the software that puts a computer on every desk and in every home'.
Sure, Microsoft wanted to build the system software and the most important applications, but it also understood the necessity of bringing other software-makers on board. As Gates said in 1993, 'Our vision, we shared; we didn't view that as some competitive edge. We just wanted to talk about it and get other people to share the same ideas so that they would help make it all come true.'
Of course, Microsoft has never been shy about monopolising, but Windows has always been stronger when Microsoft opened up and led the way. You can see Windows 1 as the start of that process, even if it's not an OS that you'd want to use today.

If you didn't have the cash for a new-fangled EGA card, you were stuck with the even lower-res, black and white CGA version


WINDOWS 3.1
Screensavers, colourful icons and proper fonts. 30
years after its release, Stuart Andrews looks back at
the version of Windows that finally put it on the map

Windows 3.1 is arguably the most crucial Windows ever – the Windows that defined how PC computing looked just as it was beginning to take off. Before version 3.1, Windows was a successful operating system, but one that looked and felt like a GUI shell perched precariously on DOS.
With the launch of Windows 3.1 in April 1992, Windows finally looked and felt like the real deal. What's more, it was a sales phenomenon, shipping over 3 million copies in its first six weeks on the market and 25 million within the first year. Windows was already big, but 3.1 put Windows in the lead.
How did Windows 3.1 do this? That's not something you can nail down to any one factor. It was partly a question of stability, partly features and partly look and feel. Believe us – Windows 3.1 looks rough by today's slick standards, but not half as rough as what came before.

LOOK AND FEEL
Look and feel certainly played an important part in Windows 3.1's success. Windows 3.0 had already done some of the hard work of introducing a proper GUI, replacing the horrible, text-based MS-DOS Executive of Windows 1.x and 2.x with the new Program Manager and File Manager components. Instead of clicking on a program or a file in a list, you could double click on an icon to launch it. Yet Windows 3.1 went further, taking advantage of the VGA and SVGA graphics standards to introduce a revamped UI with more colourful icons.
What's more, those icons could now do more than just get clicked on, as Windows 3.1 introduced drag and drop. You could explore your PC's file system visually, copying files from one folder to another by clicking on the file, dragging it over and releasing the mouse button. You could drag a file onto the Print Manager icon to print it out, or onto the application's icon in Program Manager to open it and start work.
Yet perhaps the most vital enhancement over Windows 3.0 was the introduction of TrueType fonts. At this point, Windows still involved a lot of text and, up until Windows 3.1, this text didn't look good. It was pixelated, primitive and ugly, with no real provision to vary horizontal or vertical spacing. While developing Windows 3.1, Microsoft put a team together to fix this problem, and that team worked with

Cue a sigh of relief when this splash screen showed up. Launching Windows from MS-DOS could be s…l…o…w

The Program Manager was the heart of Windows 3.1. Double clicking icons launched the applications, or you could drag and drop files onto the icons or open windows

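Part of what made scalable fonts matter: a bitmap font ships at a few fixed pixel sizes, while an outline font can be rasterised at whatever pixel size a given point size and screen density demand. The conversion itself is standard (72 points to the inch); the 96dpi value below is just an example figure, not something from the article:

```python
def points_to_pixels(point_size: float, dpi: float) -> float:
    """Convert a type size in points to device pixels (72 points per inch)."""
    return point_size * dpi / 72

# The same 12pt text needs a different pixel size on each display,
# which is why fixed-size bitmap fonts looked so rough when scaled.
print(points_to_pixels(12, 96))   # 16.0 pixels
print(points_to_pixels(12, 72))   # 12.0 pixels
```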

one of the two leading typesetting companies of the era, Monotype, to design a new set of core fonts. Meanwhile, Microsoft worked on the technology to render those fonts on-screen, so they could be scaled upwards and downwards, rotated and respaced, and still look pretty good.
Monotype came up with the Times New Roman, Arial and Courier New fonts that Windows still incorporates today, while Microsoft licensed and adapted Apple's TrueType technology, adapting the font hinting tech that made these fonts clear and legible even on a VGA resolution (640 x 480) screen. This not only made Windows look a whole lot better, but made it a viable platform for desktop publishing and design. Suddenly, the Mac had competition.

TrueType fonts were a revelation to Windows users, making the OS look significantly better and opening up more sophisticated WYSIWYG DTP and design applications

ARCHITECTURAL IMPROVEMENTS
Yet the most important features that Windows 3.1 introduced were those you couldn't see. Windows 3.0 had introduced protected memory – a way of using the protected mode of the 80286 CPU to allow Windows and Windows apps to use up to 16MB of RAM rather than just the first 640KB. Coded by ex-physicists David Weise and Murray Sargent, this feature had been crucial, making Windows a viable alternative for Microsoft to working with IBM on what would become OS/2. Running in protected mode gave Windows programs more stability, and enabled MS-DOS applications to run under Windows and still access all the available RAM. This in turn meant that Windows spent less time crashing, which made it a lot more attractive to people trying to get some actual work done.
Windows 3.1 built on this foundation by taking the new memory management features built into the newer 386 processors and using them in a 386 Enhanced mode. Where Windows 3.0 was limited to a maximum of 16MB, Windows 3.1 upped that limit to 256MB (or, in theory, up to 4GB) and enabled programs to use virtual memory above and beyond the physical memory installed. It also enabled most DOS programs to be run inside a window with mouse support, and multiple DOS programs to be run simultaneously. What's more, all these enhancements meant Windows 3.1 only worked on an Intel 80286 CPU or later. Rocking an old-school 8086? Tough.
These changes improved not just Windows' overall stability, but its multi-tasking capabilities as well. Applications

Windows 3.1 gave us new ways to customise our desktops, although not much of any value with which to customise them

FUN AND GAMES
Other aspects of Windows 3.1 revealed a more playful side to Microsoft. Windows 1.0 included one game – Reversi – while Windows 3.0 introduced Solitaire, a patience card game originally developed by a Microsoft intern, Wes Cherry, and responsible for so much lost productivity that Microsoft banned its 'boss key' feature, which switched from the game to a mocked-up Excel spreadsheet, before release.
Windows 3.1 added Minesweeper, the classic game of grid-based bomb discovery so addictive that, legend has it, Bill Gates had it uninstalled from his PC. Not that this stopped him or anyone else playing it. While Gates was known to sneak onto a colleague's computer after hours to play it, the rest of the company joked that Minesweeper was the most carefully tested of all Windows 3.1's new features.
This was also the first version of Windows to include a built-in screensaver. As with so many new features, this wasn't all that new – screen burn-in was an issue for CRT-based VGA monitors, and After Dark's fish and flying toasters had already appeared on Windows 3.0 and the Mac. However, Windows 3.1 made screensavers a standard component, introducing long-time favourites, such as the classic flying Windows logo, the Star Trek-style Starfield, and the psychedelic Mystify and Swirl. Seriously. After a few too many shandies, they blew our primitive, PC-loving minds.

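The memory ceilings quoted in this section fall straight out of CPU address-bus widths: 20 bits on the 8086/8088, 24 on the 80286 and 32 on the 80386. A quick sketch of that arithmetic (the 256MB figure, by contrast, was a Windows 3.1 software limit rather than a hardware one):

```python
MB = 1024 * 1024

def addressable(address_bits: int) -> int:
    """Bytes reachable with a given physical address-bus width."""
    return 2 ** address_bits

print(addressable(20) // MB)   # 1    - 8086/8088: 1MB, with 640KB left for DOS
print(addressable(24) // MB)   # 16   - 80286 protected mode, Windows 3.0's ceiling
print(addressable(32) // MB)   # 4096 - 80386: the 'in theory, up to 4GB' figure
```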
mostly got the resources they needed, and a central messaging system alerted them to hand over resources as and when they were needed, although not all Windows programs behaved as well as others. A Task List enabled you to see all the currently running programs and halt any that were gumming up the system, although the more likely outcome was that they would crash Windows first.
What's more, all of this went hand in hand with another major Windows feature. Windows already had the Dynamic Data Exchange (DDE) protocol, which allowed you to take messages and/or data from one Windows program to another. Windows 3.1 went one better with Object Linking and Embedding (OLE), which enabled you to embed an object created by one application into a document created by another, with both apps updating seamlessly when you made any changes.
Suddenly, you could create a chart in Microsoft Excel and stick it in your Microsoft Word report, then update the data in Excel and see the changes rolled out in Word. I know. It doesn't sound that thrilling, but at the time, this rocked the computing world.
Last, but not least, Windows 3.1 gave the world the Windows registry. At the time, this central database of settings wasn't all that well known or understood, and we never felt the need to edit it directly as we would in the Windows 95 years. Still, it showed a willingness to gather vital system information and preferences in one place, rather than in a horde of SYS, INF and INI files, as had been the Windows way until this point.

This was as exciting and intuitive as file management got in Windows 3.1. Notice those old-school 8.3 character filenames

CONFOUNDING ISSUES
Let's not heap too much praise on Windows 3.1; it still had its fair share of issues. One was that Windows still didn't support long filenames, so both files and directories were limited to names eight characters long, followed by a three-character suffix that told the OS what kind of file it was. This meant users became ingenious at truncating filenames, which in turn made looking through a folder full of documents or save games feel like decoding some esoteric text.
What's more, while Windows 3.1 had support for multimedia hardware, which was just about becoming affordable and available, ease of installation wasn't on Microsoft's list of priorities. Restrictive hardware didn't help – these were the days when solving hardware conflicts involved moving jumpers from pin to pin to swap Direct Memory Access channels. However, Windows 3.1 made the whole process of installing drivers for a CD-ROM drive and sound card as challenging as possible – it might take hours to get the whole setup running.
Networking wasn't any better either, because Windows 3.1 didn't have any built-in networking support. Instead, it piggybacked on networking clients for the underlying MS-DOS operating system. If you hadn't already mastered Novell Netware or Microsoft LAN Manager, you were still going to have to get to grips with them here.
Nor was the Windows shell ideal. Simply finding a program in Program Manager could be daunting, especially if you weren't sure which folder or group held it. With screen space at a premium, you would have to constantly minimise and restore windows while you looked. Don't even ask about finding files in File Manager.
Most of all, Windows wasn't a great platform for games. Dodgy drivers and the massive overheads involved in just running Windows itself made it much, much easier to run Wolfenstein 3D or The Secret of Monkey Island 2 in DOS, which Windows needed to run anyway and for which all Windows users had to pay. This also meant that getting games running still required tinkering at text editor level with a range of crucial system files, to the point that most PC gamers were on intimate terms with config.sys, himem.sys and autoexec.bat. Windows 3.1 didn't change this one bit.
With time, there was some movement. In 1994, Microsoft released a new API, WinG, which was designed to deliver faster graphics performance under Windows and encourage more developers to port their DOS games. WinG worked on a technical level, as proven by a WinG port of id Software's Doom. Yet it didn't work so well on the commercial level, with developers looking at the work involved and the existing DOS user base, then shrugging their shoulders until Windows 95 and DirectX came along.
Still, for all these faults, Windows 3.1 was a major leap in the right direction, paving the way not just for Windows 95, but for the switch from IBM and OS/2 towards Windows NT. Without that we might never have had the PC boom of the mid-1990s, Windows XP and everything beyond. And where would we all be without that?

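For a flavour of the tinkering the article describes, this is the general shape of a period CONFIG.SYS and AUTOEXEC.BAT pairing. It's a hedged sketch: HIMEM.SYS, EMM386.EXE, MSCDEX.EXE and the BLASTER variable are real DOS-era fixtures, but the CD-ROM driver filename and all the paths here are purely illustrative.

```text
REM --- CONFIG.SYS: memory managers and device drivers ---
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
FILES=30
BUFFERS=20
REM Illustrative CD-ROM driver name; real ones shipped with the drive
DEVICE=C:\CDROM\CDROMDRV.SYS /D:MSCD001

REM --- AUTOEXEC.BAT: CD-ROM extensions and sound card settings ---
C:\DOS\MSCDEX.EXE /D:MSCD001
SET BLASTER=A220 I5 D1
```

Getting lines like these right, and in the right order, was often the difference between a game loading and a cryptic failure at boot.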

RETROGRADE

THE FIRST PC


IBM PC 5150
Ben Hardwidge travels back to August 1981, when IBM
released its Personal Computer 5150 and the PC was born

A big ape had only just started lobbing barrels at a pixelated Mario in Donkey Kong arcade machines, Duran Duran's very first album had just rolled off the vinyl presses and Roger Federer was just four days old. Back then, the UK was even capable of winning Eurovision with Bucks Fizz. It's August 1981, and IBM has just released the foundation for the PCs we know and love today, the PC 5150.
'By the late 1970s the personal computer market was maturing rapidly from the many build-it-yourself hobbyist kits to more serious players like Apple, Commodore and Tandy,' retired IBM veteran Peter Short tells us. 'As people realised the greater potential for personal computers in business as well as at home, pressure grew on IBM to enter the market with their own PC.'
Short is now a volunteer at IBM's computer museum in Hursley (slx-online.biz/hursley), which holds a huge archive of the company's computing machines and documentation, from Victorian punch card machines to the company's personal computers. We ask him if it felt like the beginning of a new era when the PC was first launched 40 years ago. 'Yes,' he says, 'but probably not the beginning of something so huge that its legacy lives on today.'
At this time, the home computer market was really starting to take off, with primitive 8-bit computers, such as the Sinclair ZX80 and Commodore VIC-20, enabling people at home to get a basic computer that plugged into their TV. At the other end of the scale, large businesses had huge mainframe machines that took up entire rooms, connected to dumb terminals.
There was clearly room for a middle ground. IBM was going to continue producing mainframes and terminals for many years yet, but it also wanted to create a powerful, independent machine that didn't need a mainframe behind it, and that didn't cost an exorbitant amount of money.
The PC 5150's launch price of $1,565 US (around £885 ex VAT) for the base spec in 1981 equates to around £3,469 ex VAT in today's money. That's still very far from what we'd

IBM's System/23 Datamaster, pictured here at the IBM Hursley Museum, cost $9,000 US

An AMD 4.77MHz 8088 DIP CPU sits in the bottom socket, with an optional IBM 8087 coprocessor sitting above it for floating point operations

call cheap, but it was a colossal price drop compared with IBM's System/23 Datamaster, an all-in-one computer (including screen) that had launched earlier the same year for $9,000 US – six times the price. And even that was massively cheaper than some of IBM's previous microcomputer designs, such as the 5100, which cost up to $20,000 US in 1975.
IBM needed to act quickly. Commodore had already got a foothold in this market several years earlier with the PET, for example, and IBM realised that it couldn't spend its usual long development time on the project. The race was on, with the project given a one-year time frame for completion.
'At the time, IBM was more geared up to its traditional, longer-term development processes,' explains Short. 'But it eventually realised that, with a solid reputation in the marketplace, it was time to look for a way to do fast-track development that would not produce a machine three, four or five years behind its competitors.'

PROCESSORS AND COPROCESSORS
We opened up a PC 5150 for this feature, so we could have a good look at the insides and see how it compares with PCs today. It's hugely different from the gaming rigs we see now, but there are still some similarities. For starters, the floppy drive connects to the PSU with a 4-pin Molex connector, still seen on PC PSU cables today. The PC was also clearly geared towards expansion from the start.
The ticking heart of the box is a 4.77MHz 8088 CPU made by AMD – Intel had given the company a licence to produce clones of its chips so that supply could keep up with demand. It's for this reason that AMD still has its x86 licence and can produce CPUs for PCs today, but at this point, the two companies weren't really competitors in the way they are now. To all intents and purposes, an AMD 8088 was exactly the same as an Intel one, and PCs generally came with whichever one was in best supply at the time of the machine's manufacture.
The CPU itself is an interesting choice. It's a cut-down version of Intel's 8086 CPU that it had launched in 1978. The 8088 has the same execution unit design as the 8086, but has an 8-bit external data bus, compared with the 8086's 16-bit one. As with today's PCs, the CPU is also removable and replaceable, but in the case of the PC 5150, it's in a long dual in-line package (DIP) with silver pins, rather than a square socket.
Immediately above the CPU sits another DIP socket for an optional coprocessor. At this point in time, the CPU was only an integer unit with no floating point processor. This


The floppy drive connects to a 4-pin Molex connector on the PSU – a plug that's still sometimes used in today's PCs

The IBM PC 5150 had five 8-bit ISA slots for expansion cards

A raw ISA card at the Hursley museum, designed for hobbyists to make their own expansion cards

was generally fine in an era when most software didn't overly deal with decimal points, but you had the option to add an 8087 coprocessor underneath it. This worked as an extension of the 8088 CPU. 'Adding the 8087 allowed numeric calculations to run faster for those users who needed this feature,' explains Short.
The decision to use a CPU based on Intel's x86 instruction set laid the machine code foundation for future PCs, and hasn't changed since. Comparatively, Apple's Mac line-up has had a variety of instruction sets, including PowerPC, x86 and now Arm. Nvidia might be making big noises about the future of Arm in the PC, but the x86 instruction set has stood its ground on the PC for 40 years now.
IBM itself has also dabbled with different instruction sets, including its own 801 RISC processor. Why did it go with Intel's CISC 8088 CPU for the first PC? The answer, according to Short, is mainly down to time and a need to maintain compatibility with industry standards at the time.
'The first prototype IBM computer using RISC architecture only arrived in 1980 and required a compatible processor,' he explains. 'In order to complete the 5150 development in the assigned one-year time frame, IBM had already decided to go with industry-standard components, and there was existing experience with the 8088 from development by GSD (General Systems Division) of the System/23. RISC required the IBM 801 processor, but the decision was made to go with industry standard components.'

EXPANSION SLOTS
In addition to the ability to add a coprocessor, the IBM PC 5150's motherboard also contains five expansion slots, with backplate mounts at the back of the case, just like today's PCs. Three of the slots in our sample were also filled.
One card is actually two PCBs sandwiched together – it's a dual-monitor video card with the ability to output to both an MDA screen and a CGA screen simultaneously (more on these standards later) – each standard required a separate PCB on this card – there's a composite TV output in addition to the pair of 9-pin monitor outputs as well. Bizarrely, this card also doubles as a parallel port controller, with a ribbon cable providing a 25-pin port. It's typical of the Wacky Races vibe seen on cards at the time, with multiple features shoehorned into one expansion slot.
Similarly, there's also a 384KB memory expansion card, which also doubles as a serial I/O card, with a 25-pin port on the backplate. The final card is an MFM storage controller for the 5.25in floppy drive at the front of the machine.
Although the PC was clearly built with expansion in mind, Short points out that 'IBM was not the first to introduce expansion slots. As far back as 1976, Altair produced the 8800b with an 18-slot backplane, the Apple II also featured slots from 1977 and there was also an expansion bus on the BBC Micro from 1981. No doubt market research and competitive analysis showed that this approach would provide additional flexibility and options without having to redesign the motherboard'.
Interestingly, though, Short also says IBM was keeping an 'eye on the hobby market. A standard bus with expansion slots would allow users to create their own peripherals. IBM even announced a Prototyping Card, with an area for standard bus

A 5.25in floppy drive was the standard storage system for the 5150, with no hard drive option at launch

The memory is organised in four banks in the bottom right corner of the motherboard – in this case there are four 64KB banks, adding up to a total of 256KB

interface components and a larger area for building your own design'. It's a far cry from the heavily populated PCI-E cards with complex machine soldering that we see today.

MEMORY
That 384KB memory card shows a very different approach to memory expansion than the tidy modules we have today. Believe it or not, at launch, the PC 5150 base spec came with just 16KB of memory (a millionth of the amount of memory in today's 16GB machines), which was supplied in the form of DRAM chips on the bottom right corner of the motherboard. The top spec at launch increased that amount to 64KB, although you could theoretically also install the DRAM chips yourself if you could get hold of exactly the right spec of chips and set it up properly. The chips on the motherboard are split into four banks, each with nine chips (eight bits and one parity bit). In the original spec, the 16KB configuration filled one bank, while the 64KB configuration filled all four banks with 16KB of memory each.
A later revision of the motherboard expanded this to 64KB as the base spec with one bank filled, and 256KB with all four banks filled (this is the spec in our sample). If you then added a 384KB memory card, such as the one in our sample, you ended up with 640KB of memory – the maximum base memory addressable by PCs at this time.

GRAPHICS AND DISPLAYS
As we previously mentioned, our PC 5150 sample has a dual-monitor card, which supports both the display standards available to the IBM PC at launch. A Monochrome Display Adaptor (MDA) card could only output text with no graphics, while a Color Graphics Adaptor (CGA) card could output up to four colours (from a palette of 16) at 320 x 200, or output monochrome graphics at 640 x 200.
However, as Short notes, 'the PC was announced with the mono 5151 display in 1981. The CGA 5153 was not released until 1983'. Even if you had a CGA card in your PC 5150, if you used the original monitor, you wouldn't be able to see your graphics in colour. Seeing colour graphics either required you to use the composite output or a third-party monitor.
'Once the colour monitor became available,' says Short, 'it could either be attached as the sole display with its own adaptor card, or equipped with both a mono and colour adaptor card, and could be attached together with a mono screen. Now you could run your spreadsheet on the mono monitor and display output graphics in colour.'
There's an interesting connection with the first PC monitors and the legacy of IBM's computing history too. When we interviewed the Hursley Museum's curator Terry Muldoon (who has now sadly passed away) in 2011, he told us the reason why the first PC monitors had 80 columns. 'It's because it's the same as punch cards,' he said. 'All green-screen terminals had 80 columns, because they were basically emulating a punch card.'

IBM's colour 5153 monitor didn't come out until 1983, shown here with an IBM PC XT at Hursley, with Alley Cat in full CGA glory

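The nine-chips-per-bank arrangement and the 640KB ceiling described above are easy to sanity-check in code. A sketch, with the parity polarity chosen for illustration rather than taken from IBM's schematics:

```python
def parity_bit(byte: int) -> int:
    """One check bit per byte, stored in the bank's ninth DRAM chip."""
    return bin(byte & 0xFF).count("1") % 2   # illustrative (even) parity

print(parity_bit(0b10110010))   # 0 - four 1-bits, nothing to balance

# The article's memory arithmetic, in KB:
original_top   = 4 * 16         # four banks of 16KB each = 64KB top spec
revised_board  = 4 * 64         # later revision: four 64KB banks = 256KB
with_expansion = revised_board + 384   # plus the 384KB expansion card
print(with_expansion)           # 640 - the DOS base-memory ceiling
```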

DOS running on an IBM PC


5150 with a monochrome
green screen at Hursley

STORAGE
Storage is another area where the PC is at a crossroads between old and new tech. As standard, the PC 5150 came with a single 5.25in double-density floppy drive, with 360KB of storage space on each disk. There was the option to add a second floppy drive in the empty drive bay, but there was no hard drive at launch.

'The first hard drive for microcomputers did not arrive until 1980 – the Seagate ST506 with a capacity of 5MB,' explains Short. 'By that time, the PC specifications had already been agreed and the hardware development team in Boca Raton was in full swing. The requirement was for a single machine developed within a one-year time frame.

'A small company called Microsoft was also developing the first version of DOS under sub-contract. The 5150 BIOS therefore had no hard disk support – DOS 1.0 and 1.1 are the same. The power supply selected for the 5150 wasn't beefy enough at 63W to power the 5150 and a hard drive.'

Later versions of the 5150, such as our sample, came with a 165W PSU, and future DOS versions enabled you to run a hard drive, but it wasn't until the IBM PC 5160 XT in 1983 that there was a hard drive option with an IBM PC as standard.

The PSU also connects to a massive red power switch on the side, which is very different from the delicate touch-buttons we have today. You had to literally flip a switch to power on the first PCs. This was another legacy of IBM's past – a time when, if a machine needed to be shut down drastically, you would 'BRS it' – BRS stands for big red switch.

You flip the big red switch (BRS) on the side to power the PC up or down

The back of the PC 5150 also alludes to another form of storage. There are two DIN sockets on the back, one of which is labelled for the keyboard – the other is labelled 'cassette'.

'It was common at the time to provide software on cassette tapes, which could also be used to store user written programmes,' says Short. 'My own Radio Shack TRS80 in 1979 used this method. A standard cassette tape machine such as the Philips could be connected through this socket.'

SOFTWARE SUPPORT
This brings us neatly to the subject of software support. We're now used to graphical user interfaces such as Windows as standard, but in 1981 Microsoft was a small company, which had developed a popular version of the BASIC programming language.

'Microsoft Basic was already very much an industry standard by 1980,' says Short. 'It was Microsoft's first product. This fitted with the concept of using industry standard components. IBM chose to sub-contract its 5150 operating system development to Microsoft, perhaps for this reason. Again, the compressed development schedule influenced these decisions.'

Terry Muldoon gave us some more insight into the development of the PC's first operating system, IBM PC DOS 1.0, when we spoke to him in 2011. 'The story I heard is that basically IBM needed an operating system,' he said, 'and IBM didn't have time to write one – that's the story. So they went out to various people, including Digital

Research for CP/M, but Digital Research didn't return the call. Bill Gates did, but he didn't have an operating system, so he went down the street and bought QDOS.

'The original DOS was a tarted-up QDOS, supplied to IBM as IBM Personal Computer DOS, and Gates was allowed to sell Microsoft DOS (MS-DOS). And they carried on for many years with exactly the same numbers, so 1.1 was DOS 1 but with support for us foreigners, then we went to DOS 2 with support for hard disks, DOS 2.1 for the Junior, DOS 3 for the PC80 and so on.'

You can have a play with DOS 1.0 on an emulated PC 5150 at custompc.co.uk/5150, and it's a very basic affair. Even if you've used later versions of DOS, there are some notable absences, such as the inability to add '/w' to 'dir' to spread out the directory of your A drive across the screen, rather than list all the files in a single column.

What's also striking is the number of BASIC files supplied as standard, which can be run on the supplied Microsoft BASIC. One example is DONKEY.BAS, a primitive top-down game programmed by Bill Gates and Neil Konzen, where you move a car from left to right to avoid donkeys in the road (really). What's more, this game specifically requires your PC to have a CGA card and to run BASIC in advanced mode – you couldn't run it on the base spec.

The IBM Personal Computer laid the foundation for the PCs we know and love today

A FUTURE STANDARD
With its keen pricing compared with previous business computers, the IBM PC 5150 was well received in the USA, paving the way for a launch in the UK in 1983, along with DOS 1.1 and the option for a colour CGA monitor. Clone machines from companies such as Compaq soon followed, claiming (usually, but not always, rightly) to be 'IBM PC compatible', and the PC started to become the widespread open standard that it is today. Was this intentional on IBM's part?

'Industry standard components, an expansion bus and a prototyping card would naturally lead to an open standard,' says Short. 'Not publishing the hardware circuitry would make it difficult to capture the imagination of "home" developers. Open architecture was part of the original plan.'

Muldoon wasn't so sure when we asked him back in 2011. 'Now where did IBM make the mistake with DOS?' he asked. 'This is personal opinion, but IBM allowed Bill Gates to retain the intellectual property. So we've now got an Intel processor – the bus was tied to Intel – and another guy owns the operating system, so you've already lost control of all of your machine in about 1981. The rest is history.

'The only bit that IBM owned in the IBM PC was the BIOS, which was copyright. So, to make a computer 100 per cent IBM compatible, you had to have a BIOS. There were loads of software interrupts in that BIOS that people used, such as the timer tick, which were really useful. You get that timer tick and you can get things to happen, so you have to be able to produce something that hits the timer tick, because the software needs it.'

Rival computer makers could circumvent the copyright of the BIOS by examining what it did and attempting to reverse-engineer it. Muldoon explained the process to us.

'The way people did it is: with one group of people, say: "this is what it does", and another group of people take that specification, don't talk to them, and then write some code to make it do that – that's called "clean room". So one person documents what it does, and another person now writes code to do it – in other words, nobody has copied IBM code, and there's a Chinese wall between these two people.

'What some of the clone manufacturers did is, because we published the BIOS, they just copied it. Now, the BIOS had bugs in it, and we knew they'd copied our BIOS because they'd copied the bugs as well. This was only the small companies that came and went. Phoenix produced a clean room BIOS, so if you used a Phoenix chip in your clones, you were clean.'

Of course, any self-contained personal computer can technically be called a PC. Peter Short describes a PC as a machine that 'can be operated directly by an end user, from beginning to end, and is general enough in its capabilities'. It doesn't require an x86 CPU or a Microsoft OS. In fact, there was and still is a variety of operating systems available to x86 PCs, from Gem and OS/2 in the early days, through to the many Linux distributions available now.

However, the PC as we generally know it, with its x86 instruction set and Microsoft OS, started with the PC 5150 in 1981. Storage and memory capacities have hugely increased, as have CPU clock frequencies, but the basic idea of a self-contained box with a proper CPU, enough memory for software to run, its own storage and a display output, as well as room to expand with extra cards, started here. Thank you, IBM.

THANKS
Custom PC would like to thank Tim Beattie for the loan of his PC 5150 for this feature, and the team at IBM's Hursley Museum. RIP Terry Muldoon – you're very much missed.


HOW TO


BUILD A DOS PC WITH A MODERN TWIST

Ben Hardwidge shows you how to build a pure retro PC gaming rig, without scraping your fingers on horrible off-white cases, or faffing with ancient hard drives

Tech nostalgia can be a perilous and well-lubricated slope if you don't tread carefully, a bit like a staircase in an old Sierra adventure game (we're looking at you, King's Quest III). It starts innocently, perhaps by browsing screenshots of LucasArts adventures on mobygames.com, or wistfully searching for old hardware on Google Images. You might even download Doom and The Secret of Monkey Island on Steam or GOG, or have a play around with DOSBox.

You can stop there, of course. You can, honestly. There's no reason to scratch this reminiscing itch any further, but there's always another step to a hobby that starts to venture into the realms of silliness, and we've decided to take it. We're going to build a PC based on original hardware from the 1990s, and show you how to do it.

WHY? JUST, WHY?
There's no doubt that DOSBox is a seriously powerful emulator that works brilliantly. However, it doesn't quite tick all the boxes. If you want to put an actual hardware synthesiser card in your PC, and hear it play music in games in real time, with the option to expand it with a wavetable daughterboard, then you need the original hardware.

If you want to boot to a DOS prompt and know that your computer is running the software natively, without emulation, then you'll need a real 1990s PC. In all honesty, it might make little difference to the end result compared with an emulator, but it's a fun project, and it also gives you an understanding of how old PC hardware works, which you don't always get from an on-screen DOS emulator.
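If you do stick with DOSBox, much of the hardware this feature recreates can be approximated in its config file. A rough dosbox.conf fragment as an illustration (the values here are examples, not a recommendation; check the documentation for your DOSBox version):

```ini
[sblaster]
; Emulate a Sound Blaster 16 at the classic resources
sbtype=sb16
sbbase=220
irq=7
dma=1
hdma=5

[cpu]
; Pin emulation speed to something like a mid-1990s CPU
cycles=fixed 60000
```

The point of the real build, of course, is that none of this configuration is needed: the card on the ISA bus simply is the hardware.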


If you've ever wondered what it was like to use a PC from the old days, and fancy having a dabble with old hardware, but don't know your AT from your AT-AT, then this Retro tech special is for you. In fact, even if you don't want to buy a load of overpriced ancient hardware in order to construct an obsolete gaming machine (and we won't judge you for that, much), we'll give you a grounding in how the PC has changed in some ways, but not in others, and help you understand the foundation on which today's PCs are built.

MISSION BRIEFING
The idea behind our retro rig is to combine the best of the old world with the perks of the new world. While there are parts of the legacy PC hardware era we miss, there are others that we're very glad have been consigned to the great silicon scrapheap in the sky. There was no way we were going to use a 1990s case, for example. As well as having that horrible off-white colour, which yellowed over time, early PC cases were often badly designed and built. There was nearly always sharp metal on which you could easily scrape your knuckles, very little consideration given to cable routing, drive bays everywhere and the PSU would often be sat at the top.

We also wanted to avoid using an old hard drive. Mechanical hard drives can become unreliable after five years, let alone 25, plus they're slow and noisy, so we wanted to use solid state storage. However, we also wanted the flexibility to run any old software from charity shops and eBay, not to mention disks from the loft, and that means our system needs a CD-ROM drive and a floppy drive too.

Finally, PSUs have come an enormously long way since the 1990s. We have modular and semi-modular designs, wrapped/sleeved cables as standard, and the 80 Plus initiative has weeded out the flaky and inefficient PSU designs that were commonplace 25 years ago. So we're using solid state storage, a modern case and a new PSU. The rest of the core spec, however, is contemporary 1990s DOS hardware.

SLOTS OF FUN
Your first priority when building a DOS gaming system is the motherboard. You want one with 16-bit ISA slots (long and usually black), so you can get the sound working properly. PCI sound cards were largely designed for Windows, rather than DOS, and while some of them have DOS drivers, it's a faff trying to get them to work in all your games.

PCI sound cards also tend to make heavy use of the CPU, and don't have all the required audio hardware on them, relying on the CPU to do some of the work. That's
fine if you have a Pentium III and Windows 98, but it’s
rubbish for DOS gaming – an ISA card will have all the
synthesiser hardware you need on it. However, it’s worth
having a PCI slot for your graphics card.
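Part of the reason an ISA sound card 'just works' is convention rather than a driver stack: DOS games find the card through the BLASTER environment variable, which describes its resources. A small parser sketch (the variable format is the long-standing Sound Blaster convention; the parsing code itself is our own illustration):

```python
# DOS games read settings such as SET BLASTER=A220 I5 D1 H5 T6
# A = I/O address (hex), I = IRQ, D = 8-bit DMA channel,
# H = 16-bit DMA channel, T = card type (6 means Sound Blaster 16).
def parse_blaster(value):
    settings = {}
    for token in value.split():
        key, rest = token[0], token[1:]
        # Only the address is hexadecimal; the rest are decimal
        settings[key] = int(rest, 16) if key == "A" else int(rest)
    return settings

cfg = parse_blaster("A220 I5 D1 H5 T6")
assert cfg == {"A": 0x220, "I": 5, "D": 1, "H": 5, "T": 6}
```

Get those jumpers or software settings wrong on the card and games simply won't hear it, which is why the variable and the hardware must agree.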
Some motherboards from the 1990s will also have
VESA local bus slots, which look a bit like an ISA slot
with a brown PCI slot on the end. You can install an ISA
graphics card in these slots, but actual VESA local bus
cards are generally expensive and hard to find these
days. Ideally, look for a motherboard with a mix of both
ISA and PCI slots – the latter are short, with a thin socket
in the middle, and they’re usually white.
You also want a replaceable CMOS battery.
These silver discs are a standard feature of today’s

The MSI MS-5158 motherboard used in our DOS PC. The ISA slots are the black ones at the bottom, and the PCI slots are the white ones above them

Choosing a motherboard with a replaceable CMOS battery will save you having to replace a dead battery with a soldering iron

Molex-to-floppy adaptors are easy to find on eBay. These black ones are made by Corsair

motherboards, but in the 1980s and 1990s, the CMOS battery was often soldered to the motherboard. This wasn't a major issue at the time, as the batteries lasted for years, but decades later, these batteries have run dry, and you'll now need a careful touch with a soldering iron to replace them.

The other crucial considerations for motherboards are their form factor and power socket. For starters, avoid any proprietary motherboard designs. These were often found in the PCs from big brands at the time, such as Compaq, Hewlett-Packard and so on. These PCs were fine in themselves, but they severely lacked flexibility, as they often had proprietary power connectors and their I/O ports would be positioned in non-standard locations, to fit with the custom case designs.

Meanwhile, in the world of DIY PCs and independent system builders, there were two main standards of motherboard – AT and ATX, with the latter coming later. AT was a firm favourite among enthusiasts, as it maintained compatibility with older cases. If you bought an AT 386 system, you could replace the motherboard with a Super Socket 7 AT motherboard many years later.

The only port soldered to a standard AT motherboard was a large 5-pin DIN socket for connecting a keyboard. Any other ports would be connected via ribbon cables, using expansion slot backplates or cut-outs on the back of the case.

The power supply socket was also a bit of an oddity, with the PSU having two plugs that needed lining up together. There was no way to turn off an AT PC with software either, so there would be a hard push-in/push-out switch trailing off the PSU.

ATX introduced the motherboard power socket that we still use today, albeit with 20 pins rather than 24, as well as the ability to turn off the system with software, so you could tap a simple power button rather than push in a hard switch. The ATX standard also introduced the standard motherboard length and I/O backplate design you see on motherboards now.

You can avoid most of the pitfalls of incompatibility by simply opting for a motherboard with an ATX form factor and power socket. In our case, we've gone for an MSI MS-5158, an ATX Socket 7 motherboard based on Intel's Triton TX chipset – it has PCI and ISA slots, and a replaceable CMOS battery, plus it supports a wide range of CPUs, from early Intel Pentiums to the later MMX chips, as well as AMD's K6 processors.

We're using a 166MHz Intel Pentium MMX CPU

PLUGGING AWAY
Your power supply and its connectors are also a consideration. You should be able to use a new one, rather than scouting around for an old 1990s one – just make sure that the main ATX connector (24-pin on the latest designs) can be separated into two parts – the large 20-pin part will connect straight into the socket on an old ATX motherboard.

Secondly, you'll need plenty of Molex connectors, plus a couple of floppy power connectors. Most PSUs have the former in some form, but the latter can be hard to find sometimes. Thankfully, you can get adaptor cables on eBay and Amazon, which go straight from a Molex connector to a floppy plug, and also ones that go from SATA power to Molex power. We bought a pair of black Corsair Molex-to-floppy adaptors on eBay.

CHEAP AS CHIPS
The CPU is your next consideration, and if you read our piece on Socket 7 (see Issue 203, p108), you'll know there's a huge range of options here. We suggest avoiding Cyrix CPUs, as their poor floating-point performance will make them struggle in games such as Quake. An AMD K6 or Intel Pentium will do the job fine. We're using a 166MHz


Pentium MMX, which is overkill for most of the games we'll be running on our system, but that hardly matters when you can pick them up for £15 now.

You could also use a Slot 1 Pentium II or Celeron system, as long as it has ISA slots for your sound card. These systems were much more geared towards 32-bit computing than their Socket 7 predecessors, and they won't give you any advantage over Socket 7 CPUs in DOS games, but they will work.

Likewise, an AGP graphics card will boot into DOS, but there's no advantage to using one over a PCI card, as there will be no DOS 3D drivers. There are a few considerations when buying a graphics card for DOS though. Firstly, the minimum you want for DOS games is a 256KB VGA card, which will enable you to run games at 320 x 240 (or 320 x 200) with 256 colours, and you can step up to 640 x 480 with 16 colours. Having a 512KB card will enable you to get 256 colours at 640 x 480, and stepping up to 1MB will even enable 16-bit colour (over 64,000 colours) at 640 x 480, or 256 colours at 800 x 600.

Virtually no DOS games support the latter two modes, but they're handy if you want to have a play with Windows 3.1. If you do want to run Windows 3.1 on your retro system (which will open you up to some other games, such as Civilization II), make sure there's a driver available for your graphics card. A PCI graphics card will also be quicker in Windows than an ISA one, making for a more responsive experience. We've chosen a 1MB PCI Cirrus Logic 5446, which covers all our bases – you can still pick them up cheaply on eBay.

A 1MB PCI graphics card will cover all your bases. Cards based on the Cirrus Logic 5446 were common and easy to pick up cheaply on eBay

You'll also need a supporting monitor with an analogue 15-pin VGA input, or an HDMI or DVI input with an active adaptor – a passive cable won't work here, as you need to convert the analogue signal to digital, so do your research before purchasing. If you want the full retro experience, you could even pick up an old CRT monitor, but we're just plugging our machine into a 4K iiyama monitor in the lab, which has a 15-pin VGA input, as well as the ability to change the aspect ratio to 4:3 in the OSD menu. You'll want the latter feature if you don't want your games to look weirdly stretched.

MEMORY LANE
Next up is memory, and you may have two options here – 72-pin SIMMs or 168-pin DIMMs. Single inline memory modules (SIMMs) need to be installed in pairs on Pentium systems, although you could install them singularly on some earlier 486 PCs. You'll have a choice of fast page non-parity or EDO memory – either should work fine, but EDO will be slightly quicker.

Meanwhile, 168-pin dual-inline memory modules (DIMMs) come with either EDO or SDRAM chips (the latter is quicker), and it's fine to install them singularly rather than in pairs, although you won't be able to mix DIMMs and SIMMs together. SDRAM DIMMs also come in a variety of clock speeds to match the front side bus. As with today's memory, the fastest DIMMs can slow down to lower clock speeds, so you may as well buy 100MHz (or even 133MHz) memory if you have the option. This will match the bus speed of later AMD K6-II and Pentium II motherboards, and slow down to 66MHz on older systems.

For our DOS and Windows 3.1 system, 32MB is plenty, and our CPU uses a 66MHz front side bus, so we're just going with a pair of 16MB EDO 72-pin SIMMs.

72-pin SIMMs need to be installed in pairs on Pentium systems – we're using two 16MB EDO sticks

FLASH, AAAAHHHHH!
The idea of running your whole PC on solid state storage was a mere fantasy back in the 1990s, but you can easily use flash memory for your main storage system on an old PC now. There are various methods, but we're going to use a CompactFlash adaptor that plugs into an IDE socket and requires a floppy power connector – it cost us just £5.99 on eBay. There are also options that plug into ISA

slots, but the IDE method makes for a system that's easy to set up for booting in the BIOS.

CompactFlash is readily available in a variety of capacities, and its removable nature means you can easily have a few flash cards to boot your system with different options. What's more, you can easily plug a CompactFlash card into a USB card reader, and transfer files from another PC to it, which is much easier than mucking about with slow and unreliable floppy disks.

We're using a 512MB card, but you can go higher. DOS runs on a FAT16 (not FAT32) file system, which means you're limited to using no more than 2GB for a single drive, although you can also partition a larger flash card into multiple 2GB drives with different letters.

Our other two storage devices are an IDE CD-ROM drive and a 3.5in floppy drive, which will connect directly to the motherboard's IDE and floppy controller ports. You'll need cables to attach both of these devices, which are commonly available in ribbon format, but in the early 2000s, some manufacturers started bunching the wires all together to make 'rounded' IDE and floppy cables, so they take up less space. You can still buy these new, and we're using some blue ones here.

An IDE CompactFlash adaptor makes for easily swappable storage that's also comparatively fast and reliable

ON THE CASE
As we mentioned earlier, you should be able to install an old ATX motherboard into a new ATX case, but if you want to run software from the original media, it will also need front-facing drive bays. You'll need a 5.25in bay for a CD-ROM drive, and a bay for a 3.5in floppy drive if you want that too. A dedicated 3.5in bay can be used for the latter, or you can get a 3.5-to-5.25in adaptor to put a floppy drive in a 5.25in drive bay.

We're using a Fractal Design Define R5, which we had spare in the lab and has two 5.25in bays (we're using an Akasa adaptor to install the floppy drive in a 5.25in bay), but there are other new cases that will do the job. Fractal's latest Define 7 has one 5.25in bay, for example, and the larger XL model has two 5.25in bays. The Pure Base 600 from be quiet! also has two 5.25in front-facing drive bays.

PERIPHERAL VISION
If you have an old ATX motherboard, the rear I/O panel will likely have a pair of 9-pin serial ports (usually used for mice and external modems), a 25-pin parallel port (usually used for printers, but also some scanners and storage devices) and a pair of PS/2 ports (small 6-pin mini-DIN sockets) – one for a keyboard and one for a mouse. You may even have USB ports, but these are useless for DOS.

We recommend using the PS/2 ports for your keyboard and mouse. PS/2 is quicker than serial, and there's a decent range of PS/2 kit available, including optical mice (no one wants to return to using analogue ball mice again, however nostalgic they are!).

You can even get some modern USB keyboards and mice working with old PS/2 ports via adaptors, but you'll need to do your research. A USB peripheral will need to internally support the PS/2 protocol in order for it to work over a USB adaptor, and many of them don't. With a bit of help from Google, you should be able to find out if you can use your USB keyboard or mouse with a PS/2 adaptor. If not, you can buy second-hand PS/2 peripherals cheaply on eBay – it's not as if you're going to need a 4,000dpi sensor to play The Secret of Monkey Island, or even Doom for that matter.

SOUND BYTES
Finally, we come to the sound card. As we mentioned earlier, you want a 16-bit ISA sound card for DOS. The basic standard for DOS games is the Creative Sound Blaster Pro, which combines FM synthesis for music with the ability to play back 8-bit sampled sound – it's great for a game such as Doom, so you get music and demon growls, gunshots and explosions. It's also compatible with the first Ad-Lib products, which provided FM synthesis (but no sampled sounds) and are supported in quite a few DOS games, particularly early ones.

There are plenty of non-Creative 16-bit ISA cards that have Sound Blaster Pro compatibility and can be picked up quite cheaply – just make sure you can get the drivers


for them. We're using a Creative Sound Blaster 16, which has full compatibility with the Sound Blaster Pro, and can also play and record 16-bit sound.

Creative's Sound Blaster 16 will happily provide FM synthesiser music in games, as well as sampled sound

There are advantages to buying a better sound card, though, as you can massively improve the quality of the synthesiser music in some games. Creative's AWE32 and AWE64 cards have much better synth sounds than the Yamaha OPL2/OPL3 FM synthesiser sounds used by most cards at the time. Again, Doom is a great example of a game that sounds much better with one of these cards.

Another alternative is to use the MIDI interface on the Sound Blaster 16. The early Sound Blaster 16 and AWE32 cards (but not the later ones) had a wavetable daughterboard connector, to which you can attach a secondary synthesiser card. If you can find a Yamaha DB50XG, the sounds are amazing, and there are plenty of other decent-sounding daughterboards too. Once it's plugged into the wavetable connector, your daughterboard will then just output its synthesiser sounds through the line out. This setup will only work on games that can play music data through the MPU-401 interface, but quite a few do, and they'll sound better for it.

You can also use the 15-pin joystick port on the back of the Sound Blaster 16 for MIDI, via a 2 x MIDI (5-pin DIN) to 15-pin cable, which you can buy on eBay. A popular external MIDI box at the time was the Roland MT-32, which is supported in some DOS games, including King's Quest IV.

BUILDING TIPS
So you've got all your bits and pieces, and because we're using an ATX case, PSU and motherboard, much of the build process is similar to making a custom PC now. However, there are a few key differences.

JUMPER ROUND
The first warning is that your motherboard's BIOS will be very different from today's user-friendly EFI systems, and even the BIOSes found on boards ten years ago. In fact, for the moment, forget about the BIOS, and instead look at the various jumpers and switches on your motherboard.

Firstly, there may be some DIP switches – plastic blocks featuring several little numbered on/off switches. Secondly, look for jumpers. Jumpers are small sets of pins with movable, conducting tops, which can be swapped around to connect pairs of pins, acting as switches.

Before you change anything, look for any tables printed on your motherboard that outline the position of the switches and what they mean. If you can't find them, try to find your motherboard manual online, so you can see what all the switch settings mean. Getting this wrong can genuinely result in you accidentally overvolting or overclocking your hardware and cooking it. These could be perilous times for PC building!

Now you need to set the switches and/or jumpers to meet the voltage, bus speed and multiplier for your CPU. In the case of our 166MHz Pentium MMX, that means a voltage of 2.8V, a 66MHz front side bus and a 2.5x multiplier. You'll also need to check any jumpers or switches for the memory – some motherboards require a switch to be set to use SIMMs or DIMMs, or to set the memory speed. Triple-check all your switches and jumpers before you install your CPU.

CPU INSTALLATION
Physically fitting the CPU is one area that hasn't really changed over the past 25 years. In the early PC days, CPUs were sometimes soldered into motherboards, rather than using a socket, and it wasn't until the 486 era

Your motherboard may have tables printed on it, showing how to set your bus speed and CPU multiplier

DIP switches are simple on/off switches that tell the motherboard what to do. Our ones are currently set up for a 200MHz CPU, so we'll need to flip up switch 1 to avoid inadvertently overclocking our 166MHz chip
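Those jumper settings combine in a simple way: core clock = front side bus x multiplier. A quick sanity check of the figures in this guide (our own arithmetic; note that the nominal '66MHz' bus is really 66.67MHz):

```python
# Socket 7 core clock = front side bus (FSB) x multiplier.
FSB_MHZ = 200 / 3  # the nominal '66MHz' bus actually runs at 66.67MHz

def core_clock(fsb_mhz, multiplier):
    """Resulting CPU clock in MHz for a given FSB and multiplier."""
    return fsb_mhz * multiplier

# Our Pentium MMX: 66.67MHz x 2.5, marketed as 166MHz
assert round(core_clock(FSB_MHZ, 2.5)) == 167
# The 200MHz setting the DIP switches were left on: 66.67MHz x 3
assert round(core_clock(FSB_MHZ, 3)) == 200
```

This is also why flipping one multiplier switch is enough to turn a 166MHz setup into a 200MHz overclock, and why it pays to triple-check.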

that zero-insertion force (ZIF) sockets really became a standard feature, but we've never looked back since. As such, installing our Pentium CPU is very similar to fitting a modern-day Ryzen chip. You lift up the lever by the socket, and line up pin 1 on the CPU with pin 1 on the socket (denoted by a triangle shape). You can then slot the CPU into its socket and push down the lever to hold it in place.

Installing a Socket 7 CPU is very similar to fitting an AM4 chip today

Next comes the CPU cooler, and this is an area that's changed hugely over the past couple of decades. With a thermal design power (TDP) of just 13W, there's no need for a massive CPU cooler on top of our Pentium MMX CPU, and sub-100MHz Pentiums technically don't even need a fan.

Some low-powered Socket 7 CPUs don't even need a fan, but you'll need one for a Pentium MMX. Ours screws directly into the heatsink fins

We've picked up a basic Socket 7 heatsink and fan on eBay. Apply a small blob of thermal paste in the middle of the CPU, then put the heatsink on top of it. You then simply clip the heatsink to the hooks on either side of the socket. You can then fit the fan (if it isn't fitted already) and plug in its power cable – the power header is usually next to the CPU socket, but check your motherboard manual.

The heatsink will grip the socket with a retention clip, which exerts enough force on the CPU to maintain thermal contact. You'll need to apply thermal paste first

As a side note, you may notice that you can't see the clips in the later pictures of our PC, which is because we got a bit carried away, and wanted to see if we could water-cool it for a laugh. It was all fun and games until the force of our custom retention clip for the waterblock ended up pulling a lug off one side of the socket. We ended up having to stick our heatsink to the CPU using thermal adhesive as a result. Don't be like us – be sensible here.

RAM IT HOME
Next comes the memory. If you're using a DIMM, the installation method is the same as today. Push back the clips, put the memory in the slot (it will only go in one way around) and push it down until the clips flip up to grip the memory. SIMMs are a little different, but still simple. To install a SIMM, insert it in the slot at an angle, as shown, then push it back to secure it in place.

Install a SIMM in the slot at an angle, and then push it back to secure it in place

Now is a good time to check your hardware works, so plug in your power supply's 20-pin ATX connector (you may need to uncouple a block of four pins next to it) to the motherboard's socket, install a graphics card and hook up a monitor. You'll then need to find the header for the power switch, which will be detailed in the motherboard


[Caption: Unless you want to attach two drives to a single cable, set your drive's jumper to Master]

manual, or may (if you're lucky) be labelled on the board. Turn on the PSU, and short out the two pins of this header with a screwdriver – you should then get a display on the monitor showing your CPU and clock speed, plus the amount of memory. If not, double-check your components, jumpers and switches.

STANDOFFISH
Next, check the standoffs on your case's motherboard tray. Most of today's motherboards have a standard screw layout, but this wasn't the case in the 1990s, and you don't want to accidentally short out the traces on the bottom of your motherboard. Check that the layout of the standoffs in your case matches the screw hole layout on your motherboard. If it doesn't, move the standoffs to the right places, and remove any that aren't going to be under a screw hole.

After that, put your motherboard's I/O shield in the slot at the back of your case. If you don't have one, you may well be able to get a replacement on eBay, as the standard layout (2 x PS/2, 2 x USB, 2 x serial, 1 x parallel) was pretty universal at this time.

Next, use your cable-routing holes to pass your PSU's 20-pin ATX cable through to the right area of your motherboard. Today's motherboards nearly always have the ATX power connector on the right edge, but it could be practically anywhere on 1990s boards. In our case, it's at the top, so we're passing the cable through one of the top cable-routing holes.

Now is also a good time to route the power cable for your case's exhaust fan through a top cable-routing hole. You can then connect it to one of your PSU's Molex power connectors round the back of the motherboard using a 3-pin fan to Molex adaptor. If you want to keep it quiet, you can also put a resistor cable between the two adaptors, which will cut the voltage from 12V to 7V – they cost a couple of quid on eBay.

Next, locate your case's front panel connectors. Many of them will be redundant on this system, but the power switch, power LED and reset switches can all be connected – check your motherboard manual to find the location of the headers, and plug them in now. This area is also where you'll be able to attach a PC speaker, usually with a 4-pin header (with two wires). Even if you have a sound card, it's worth having a PC speaker to identify error beeps, and for sound in some older games – if your case doesn't have a speaker, you can pick up a small one for £1.95 from amazon.co.uk, which will do the job fine.

[Caption: Our pair of 32MB EDO 72-pin SIMMs is now ready for action]

Now offer up the motherboard to the case, plug in the 20-pin ATX power connector, gently push the motherboard into its I/O backplate at a slight diagonal (being careful not to scrape it on the standoffs below), then gently lower it down and screw it into the standoffs.

DEEP DRIVE
Next comes storage, which sadly isn't as simple as just plugging in the cables and screwing the drives into the case. IDE cables usually have two connectors, for connecting two drives to one IDE channel, in what was then (politically incorrectly) called 'master and slave' configuration – you wanted your faster drive (such as your boot hard drive) to be the master, and the slower drive (such as a CD-ROM) to be the slave.

[Caption: Our CompactFlash adaptor, CD-ROM and floppy drive are all connected to the right sockets, with power and rounded cables routed via the cable-routing holes]

In our case, we're using two separate IDE channels for each of our IDE drives, rather than connecting more than one drive to the same channel, but we still need to set up our drives properly. On the back of an IDE CD-ROM or hard

[Caption: An Akasa 3.5-to-5.25in bay adaptor enables us to fit a floppy drive in our Fractal Design Define R5. Yay, Lemmings!]

drive (and on our CompactFlash PCB) will be a jumper, which can be switched to M, S or CS, with the latter standing for 'cable select', although we recommend using the 'master' or 'slave' options for certainty.

Set this to 'M', unless you're running two drives on one cable, in which case set the faster drive to 'M' and the slower drive to 'S'. If you don't do this properly – for example, by putting two 'master' drives on one cable – the system may not boot. You can then connect your IDE cables. There will be two plugs with a short length of cable between them, and a longer cable going down to a third plug. The latter plugs into your motherboard's IDE socket (a notch means it can only be fitted the right way around), and the top plug goes into your drive. If you're connecting two drives, the 'top' plug goes to the 'master' drive and the second plug goes to the 'slave' drive.

You can route your IDE cables around the back using your cable-routing holes, and you'll also need to route power cables to your drives from your PSU, using Molex or 4-pin floppy connectors. Next comes the CompactFlash adaptor, if you're using one. You just need to make sure its jumper is set to 'master', hook up a floppy power connector and plug it into your motherboard's primary IDE channel socket.

Meanwhile, your floppy drive needs to be connected to your motherboard's floppy controller socket, which looks like a slightly smaller IDE socket. Sadly, most floppy drives don't have a notch to make sure you can only install the cable the right way around, but it's not disastrous if you get it wrong. If you turn on your system, and the floppy drive light is on permanently, and the floppy drive isn't detected, then the cable is the wrong way around – you just need to turn it round the right way.

CARD GAMES
Finally, slot your graphics card and sound card into place – put the graphics card in the top PCI slot, and the sound card in one of the bottom ISA slots. ISA cards have the PCB on the other side of the backplate from PCI cards, with their chips facing the top of the case, but you fit them in the same way. Slot them in place, and secure their backplates with your case's screws.

PLUG IT IN, PLUG IT IN!
All your hardware is now basically installed – the final step is to plug in your keyboard, mouse, mains cable and VGA cable and start it up. Your BIOS should be set to boot from the floppy drive (A) by default, but you'll be able to change the boot priorities in the BIOS by pressing Del when your system starts up.

The next step is to insert your DOS boot disk and boot up your system. We'll give you some tips and tricks for getting your software set up over the page.

[Caption: One 1990s DOS machine ready for action and, unlike PCs from the time, the interior is nice and tidy too]

INSTALL FREEDOS ON VINTAGE HARDWARE
Following our vintage PC building guide, K.G. Orphanides shows you how to get a retro PC up and running with FreeDOS

Because MS-DOS 6.22 is increasingly hard to obtain legitimately – your options are old floppies on eBay or an annual Microsoft Visual Studio subscription that costs over a grand – you're better off with a modern open source DOS. We're using FreeDOS, an actively developed MS-DOS-compatible operating system that's sufficiently close to the original that neither you nor your software are likely to notice the difference. You can grab a copy from freedos.org.

It comes with quality-of-life features, such as PS/2 and USB mouse drivers, Tab command completion, file decompression tools and support for FAT32 file systems – it can handle soft reboot and shutdown commands too. Connect your FreeDOS PC to the Internet, and there's even a package manager, FDNPKG, to help you install and update your system utilities.

PARTITION YOUR DISK AND INSTALL DOS
FreeDOS's current stable release is version 1.2, but we recommend using the near-final live CD release candidate of FreeDOS 1.3. From the FreeDOS website, follow the release candidate link to custompc.co.uk/FreeDOS and download FD13-LiveCD.zip. Burn it to a CD-ROM, make sure the BIOS on your DOS PC is set to boot from CD before hard disk, insert the disc and boot the machine. If your DOS system doesn't have a CD-ROM drive, there's also an FD13-Floppy image.

At the FreeDOS menu, select Install to harddisk – this works fine on a CompactFlash card, as used in our hardware guide too. If your drive is blank, you'll be asked if you want to partition it. Select Y to automatically partition drive C – the maximum available partition size will be used.

Reboot when prompted and select Install to harddisk again. Erase and format drive C when prompted. Select your keyboard layout and then choose 'Full installation including applications and games'. Confirm your choice, wait for installation to complete, eject the CD and reboot. If you need more control of your disk partitioning, instead select Use FreeDOS 1.3 in Live Environment Mode and type FDISK at the command prompt.

Unlike MS-DOS, FreeDOS supports FAT32, which means you can have hard disk partitions bigger than 2,047MB. If your disk is 2GB or larger, you'll be asked if you want to enable FAT32 support. Click Yes here, unless you specifically want to create multiple smaller partitions that are backwards compatible with older versions of MS-DOS.
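That 2,047MB ceiling falls straight out of FAT16's limits: at most 65,524 usable clusters, each at most 32KB under DOS. A quick sanity check in the shell (the cluster figures are the common DOS-era FAT16 limits, not something FreeDOS reports):

```shell
# FAT16's ceiling: 65,524 clusters x 32KB per cluster, expressed in MB.
clusters=65524
cluster_kb=32
max_mb=$(( clusters * cluster_kb / 1024 ))
echo "${max_mb} MB"   # prints: 2047 MB
```

Anything larger than this needs FAT32, which is why the installer only asks the question on bigger disks.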

USING DOS
DOS is a command line operating system, and if you've ever used Windows' cmd, it will feel familiar. It's case-insensitive: commands, paths and file names don't have to be typed in UPPER CASE, but are often styled that way. To run an executable file – which will typically have a .COM, .EXE or .BAT extension – just type its name without the extension. File and directory names are limited to eight characters and extensions to three, with longer names curtailed with a tilde (~). When you're finished with DOS, you just turn off the computer. Some older programs don't even have the option of quitting back to the command line.

[Caption: Unlike standalone versions of MS-DOS, FreeDOS supports the FAT32 file system]

[Caption: The FreeDOS 1.3 RC3 live disk makes testing, formatting and installation a convenient menu-driven affair]

EDITING DOS CONFIG FILES
As it loads, DOS looks for specific user instructions in files traditionally known as AUTOEXEC.BAT and CONFIG.SYS, in the root of your boot drive, whether that's a floppy or your C:\ partition. As we're using FreeDOS, these are actually called FDCONFIG.SYS and FDAUTO.BAT. FreeDOS includes a selection of useful drivers, such as ones for mice and CD-ROM drives, and these are already called in its boot files.

The easiest way to create or modify these files, assuming you're using a CompactFlash or SD card for your hard disk, is to simply mount your drive on your usual PC with a card reader and copy in the lines you need using a GUI editor, such as Notepad in Windows. If you prefer to write or edit config files under DOS, just use FreeDOS' EDIT command for a very capable MS-DOS editor with mouse support. If you want to comment out a line, put the word 'rem' in front of it. This is handy for troubleshooting and working out exactly what lines you need in your boot files.

DRIVERS
Although FreeDOS has some integrated drivers, you'll still need the manufacturers' drivers for your sound card, possibly your graphics card, and any non-standard interfaces or unusual input devices, such as specialist joysticks and Zip drives.

Your first stop for driver sourcing should be Vogons Drivers (vogonsdrivers.com), a spin-off of the popular and infinitely helpful Vogons retro gaming message board. The drivers generally come with full instructions and examples of the lines you'll need to insert in boot-time config files. Another useful collection of hardware drivers, this time with a focus on storage devices, can be found at Hiren & Pankaj's Homepage (hiren.info/downloads/dos-files).

FreeDOS' default FDAUTO.BAT file includes the most common SET BLASTER address line for Sound Blaster compatible cards. This will be enough in many cases, but you may still have to add the path to the actual driver yourself, as well as assigning your own MIDI settings. For example:

SET BLASTER=A220 I5 D1 H5 P330
SET MIDI=SYNTH:1 MAP:E
SET SOUND=C:\DRIVERS\SB16

[Caption: The OS comes with FreeDoom, but real Doom works well too]
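If you take the mount-the-card-on-your-usual-PC route, these boot-file tweaks can also be scripted from a Linux shell. A minimal sketch: the mount point, the existing driver line and the Sound Blaster values are all assumptions (here we build a stand-in FDAUTO.BAT in a temp directory so the sketch runs anywhere; on a real card you'd point DOSROOT at the mounted drive, e.g. /mnt/dos):

```shell
# Sketch: batch-edit FreeDOS boot files from Linux with the card mounted.
# DOSROOT would normally be the card's mount point; we use a temp
# directory with a stand-in FDAUTO.BAT so this runs on any machine.
DOSROOT=$(mktemp -d)
printf 'DEVLOAD /H CTMOUSE.EXE\r\n' > "$DOSROOT/FDAUTO.BAT"

# Append Sound Blaster settings; DOS expects CRLF line endings
printf '%s\r\n' \
  'SET BLASTER=A220 I5 D1 H5 P330' \
  'SET MIDI=SYNTH:1 MAP:E' \
  'SET SOUND=C:\DRIVERS\SB16' >> "$DOSROOT/FDAUTO.BAT"

# Putting 'rem' in front of a line comments it out, as described above
sed -i 's/^DEVLOAD/rem DEVLOAD/' "$DOSROOT/FDAUTO.BAT"

cat "$DOSROOT/FDAUTO.BAT"
```

Note the CRLF (`\r\n`) endings: DOS tools are far happier with them than with bare Unix newlines.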

Under DOS, you'll generally have an easier time of configuration if you stick with ISA cards, although we got the PCI Sound Blaster Live! 5.1 from 2000 working with some tweaking of its driver's CTSYN.INI file. If you run into IRQ or DMA conflicts, check your motherboard's BIOS settings – if in doubt, disable on-board components such as unused parallel and serial ports, and – especially if you're using PCI components – disable Plug and Play and enable Legacy Mode.

[Caption: Pre-defined startup menus provide commonly required memory configurations, but you can add your own too]

Graphics drivers were far less important in the DOS era than now: if your card supports the VGA display mode, you can select VGA from your game's installation options and it will work. In some cases, such as with the 3dfx Voodoo range of 3D graphics cards, DOS games that supported them would come bundled with the relevant driver – Glide in the case of Voodoo. However, you may need to copy your own more recent copy of the driver file into the game's directory – we copied the glide2x and glide3x DLL and OVL files from our Voodoo 3 3500 TV's Windows driver disk and it worked fine.

TRANSFERRING DATA
If you're using a CompactFlash or SD card-based DOS drive and have a reader connected to your PC, you can just mount your entire DOS drive under your normal Windows, Linux or macOS operating system and copy any files you want to it. This convenient approach makes it easy to get retro games you've bought on gog.com or Steam onto your DOS drive – we tried this with the Steam versions of Quake and Ultimate Doom, and both games worked fine on our retro machine.

Alternatively, you can burn a load of DOS software to a data CD and transfer it the old-fashioned way. However, if you're using standard IDE hard disks, or you don't want to routinely open your DOS PC to load up its hard disk, you might want to add USB mass storage support to FreeDOS.

If your motherboard has the common UHCI-compliant host controller, then you're in luck, as FreeDOS includes Bret Johnson's USBDOS drivers (bretjohnson.us). We recommend just invoking them as needed to keep memory consumption down, rather than loading them in FDAUTO.BAT.

If your vintage system only has an OHCI controller, or if you're using a newer motherboard with an EHCI USB chipset, then you'll need Panasonic's multi-chipset USBASPI driver (custompc.co.uk/USBASPI) and the Motto Hairu USB Mass Storage driver (custompc.co.uk/Hairu) to mount your disks. To add an OHCI controller in FDCONFIG.SYS, add the following lines, modifying the driver paths as appropriate:

DEVICE=C:\DRIVERS\USBASPI1.SYS /V /O
DEVICE=C:\DRIVERS\di1000dd.sys

USB drives must be plugged in at boot time to be accessible.

CPU THROTTLING
If you're using a 500MHz PC from 2000 to run games from 1991, your processor will make older clock-cycle-fixed software run impossibly fast. FreeDOS includes the SLOWDOWN tool to counter this problem.

For Origin's Martian Dreams, for example, with an executable called MARTIAN.EXE, we just typed SLOWDOWN MARTIAN in its directory. You can then reduce speeds by pressing Ctrl and Alt together until you get the speed with which you're happy.

KNOW YOUR FREEDOS COMMANDS
DIR
List everything in the current directory. FreeDOS by default applies the /P command extension to pause when the screen is filled. Press space to see more.
DIR /W
Show filenames and extensions only, in a columnated list.
X:
Change to the specified drive letter, swapping 'X' for the letter of the drive you want to access.
CD PATH
Change Directory to the specified directory name or path, replacing PATH with the name of the directory you want to access.
CD..
Move back to the previous directory.
CD \
Move to the top-level directory.
MD NAME
Make a Directory called NAME.
COPY X:\PATH\ X:\NEW\PATH\
Copies files and directories from one place to another.
MOVE X:\PATH\ X:\NEW\PATH\
Moves files and directories to a new location.
EDIT
The friendliest DOS text editor.
RESET
You don't have to type reset to reboot your PC, but FreeDOS gives you the option.
SHUTDOWN
Another optional FreeDOS command for the comfort of modern computer users.
FDISK
DOS partitioning tool.
FORMAT X
Formats the specified drive (replace X with the appropriate drive letter). This will erase its contents and ready it for use with DOS.

MEMORY MANAGEMENT
The classic DOS games came from a time when only 640KB of conventional (or base) memory could be directly used in MS-DOS 'real mode'. Even then, that was a tiny amount of

RAM with which to play, so methods of increasing available memory were rapidly introduced. These included a 64KB high memory area, expanded memory of up to 32MB (EMS) and extended memory of up to 4GB (XMS).

In FreeDOS, these memory areas are controlled by HIMEMX, JEMMEX and JEMM386, which are invoked in FDCONFIG.SYS. To free up extra memory, DOS users traditionally have to juggle extended memory management tools, load drivers into the high memory area, and winnow out unnecessary drivers until there's enough memory available to load your desired application.

As an alternative to using the old-school boot floppies that most gamers had at the time, we're going to use FreeDOS' integrated startup menu system. FreeDOS has already done a lot of the work for us here, creating high memory and JEMM386 expanded memory startup options.

If you need the maximum amount of conventional memory available, select option 1, Load FreeDOS with JEMMEX, no EMS (most UMBs) and max RAM free, which nets us 643KB of available conventional memory. If you're running one of the many 1990s games that require EMM386 expanded memory (their manuals will tell you if they do), you want option 2.

[Caption: You can usually install straight from the CD without any fuss, but you can run FDISK from the live OS if you need more control over drive partitioning]

MAKE YOUR OWN BOOT MENU
In FDCONFIG.SYS, a MENUDEFAULT section defines four numbered startup menus. We can add an extra option 5 like this:

MENU 5 - SB LIVE (JEMM386, HIMEM, NO USB)

In the same file, you can add specific lines to a chosen menu number by putting the number(s) and a question mark at the start of the line. For example, putting '125?' before a line means it will be included in boot options 1, 2 and 5 – we've added '5?' to lines that call HIMEMX, JEMM386 and FDAUTO.BAT to include those features in our new menu option.

In FDAUTO.BAT, a quick way to load drivers that only apply to your new menu option is to insert an 'if not' block just before :FINAL at the bottom. For example, the following lines enable a PCI Sound Blaster Live! if we select menu option 5, but skip straight past it if we select any other menu option:

IF NOT "%CONFIG%"=="5" GOTO FINAL
:SBLIVE
SET MIDI=SYNTH:1 MAP:E MODE:0
SET BLASTER=A220 I5 D1 H5 P330 T6
SET CTSYN=C:\DRIVERS\SBLIVE\DOSDRV
C:\DRIVERS\SBLIVE\DOSDRV\SBEINIT.COM

INSTALL A GAME
Software installation is usually blissfully simple under DOS. From the late 1980s onwards, most games included installers, so you just need to insert your install CD or floppy, go to its drive letter (for example, type 'a:' at the C prompt to go to your floppy drive) and run the installer, usually called INSTALL or SETUP.

You'll probably be asked to select your graphics mode and sound card, and choose an install location – this should be drive C. The installer will copy over its files and tell you what you need to run to play the game. You may need to do some disk swapping during this process, and games with CD audio will also require the disc to be in the drive while you're playing the game. Some games don't have installers, but if you copy all their files into a directory on your hard disk, you can usually run them from that location.

[Caption: If you're using an IDE CompactFlash reader for your retro machine, you can plug it into a card reader on a Windows PC to easily copy and edit files for your OS]
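The 'N?' menu-prefix trick described earlier can also be applied in bulk from a Linux PC with the card mounted. A hedged sketch: the mount point and the exact DEVICE lines are assumptions (we build a stand-in FDCONFIG.SYS in a temp directory so it runs anywhere; FreeDOS reads 'N?LINE' as 'use LINE when menu option N is chosen'):

```shell
# Sketch: tag FDCONFIG.SYS device lines so they load only under a
# chosen boot menu option. DOSROOT stands in for your card's mount
# point (e.g. /mnt/dos); the driver paths are illustrative.
DOSROOT=$(mktemp -d)
printf '%s\r\n' \
  'DEVICE=C:\FDOS\BIN\HIMEMX.EXE' \
  'DEVICE=C:\FDOS\BIN\JEMM386.EXE' > "$DOSROOT/FDCONFIG.SYS"

# Prefix each DEVICE line with '5?' so it runs when option 5 is picked
sed -i 's/^DEVICE=/5?DEVICE=/' "$DOSROOT/FDCONFIG.SYS"

cat "$DOSROOT/FDCONFIG.SYS"
```

On a real card you'd be more selective with the sed pattern, since lines without any prefix apply to every menu option.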


EMULATE DOS ON RASPBERRY PI
K.G. Orphanides shows you how to use the powerful DOSBox-X emulator to boot Raspberry Pi to DOS, and run anything from Windows 3.11 to classic games

Graphical user interface? Pah, luxury! When us PC gamers were young, we had to type in text at the DOS prompt. Of course, you can already buy some classic DOS games readily from GOG and Steam that have been tweaked to run from Windows 10. However, if you want to get the authentic DOS experience, where you have complete control over your system, you have to either run an emulator or build a machine based on old hardware.

We're going to take you through the latter next month, where we'll show you how to build a machine that natively runs DOS games. Another alternative, though, is to use an emulator, and Raspberry Pi makes a great platform for this if you want to make a dedicated machine that boots straight into DOS, particularly because of its low cost.

The extra oomph of the 4GB or 8GB edition of Raspberry Pi 4 provides plenty of power for emulating classics of the past in DOS, and that even goes as far as installing and running early versions of Windows.

In this tutorial, we'll show you how to emulate PC software from the DOS era using DOSBox-X. If you don't need DOSBox-X's menus or extra features, though, the standard version of DOSBox 0.74-3 available in the package repository is a handy alternative. Just type sudo apt install dosbox. You'll find its config file in /home/yourusername/.dosbox

Forked from the original DOSBox emulator (dosbox.com), DOSBox-X has more precise hardware emulation, supports a wider range of software, and can effectively run more DOS-related operating systems (up to Windows ME). It also has a sophisticated graphical interface to help you manage tasks such as configuration and virtual disk-swapping. In this guide, we'll show you how to make a Raspberry Pi system that boots straight into DOS.

COPYRIGHT
DOSBox is an emulator, and we use it with open-source FreeDOS code. Be mindful of copyright when downloading files for DOS software, and only use proprietary software that you own, in accordance with the licence terms. custompc.co.uk/dosboxlegal

1 / CREATE YOUR DOS DIRECTORIES
Let's create the directory structure to house the software we're going to run through DOSBox-X:

mkdir -p dos/{floppy,cd,games}

The floppy and cd directories will house disk images, and we'll be able to switch between them in DOSBox-X. This tutorial and our template config files presume you'll keep all your DOS files in a /home/pi/dos/ directory, so be sure to change any paths if you're using a different username or DOS directory names.

While our generic config file should handle most DOS software well on a Raspberry Pi, you can also create separate .conf files for specific programs, in order to better match their requirements and automatically run commands.

2 / TWEAK YOUR GRAPHICS
Assuming you're using a standard 1,920 x 1,080 display with your Raspberry Pi, you'll find some more demanding DOS software struggles at full resolution, particularly if you have DOSBox-X configured to use OpenGL and aspect ratio correction.

On the desktop, open the main menu, go to Preferences and select Screen Configuration. Right-click on your display – most likely marked HDMI-1 – and select 1,280 x 720 from the Resolution menu. Running your entire GUI

at a lower resolution will lighten the load of rendering and upscaling for the emulation, and have no adverse effect on games from an era when 640 x 480 (or often 320 x 240) was the norm.

[Caption: The menu system allows you to easily make changes to your emulated system. You can radically change the appearance and performance of your software by switching scalers on the fly]

3 / INSTALL DOSBOX-X
In a Terminal, enter the following:

sudo apt install automake libncurses-dev nasm libsdl-net1.2-dev libpcap-dev libfluidsynth-dev ffmpeg libavdevice58 libavformat-* libswscale-* libavcodec-*
git clone https://github.com/joncampbell123/dosbox-x.git
cd dosbox-x
./build
sudo make install
dosbox-x

DOSBox-X should open at its Z: prompt. You can't paste commands into it from the clipboard, but there are some modern convenience features. Pressing Tab will auto-complete, you can scroll through your command history using the Up arrow, and you can add startup commands to a config file. Type exit to quit and ensure that the config directory, which we'll need in the next step, is created properly.

4 / EXPORT A CONFIG FILE
Restart DOSBox-X and tell it to generate a config file that we can later modify in a text editor, based on the program's default settings, and then exit.

CONFIG.COM -all -wcd
exit

The file we've just made can be found in /home/pi/.config/dosbox-x and, at the time of writing, is named dosbox-x-0.83.3.conf. As well as being human-readable and conveniently editable in a text editor, you can modify this long and extensively commented file from within DOSBox-X using the configuration GUI in the main menu. This is handy, as DOSBox-X's configuration has more options than that of vanilla DOSBox.

5 / CUSTOMISE YOUR CONFIG
For this tutorial, we've created some config files that you can download from custompc.co.uk/github. The code box will run most DOS software. As well as editing your main DOSBox-X config, you can launch DOSBox-X with a specific config file – useful if you wish to easily switch between different OS setups – using the following command-line switch:

dosbox-x -conf yourfile.conf

We'll take advantage of that later to help install Windows 3.11. Note that your custom config files only need to include lines that vary from the defaults. In the following steps, we'll create a config file optimised for playing late-era DOS games on Raspberry Pi 4 with 4GB or 8GB RAM.

6 / GRAPHICS, SCALERS AND PERFORMANCE
The default config is already well optimised to run DOS software on most systems, but we need to make a few adjustments to improve performance on Raspberry Pi's hardware.

Leave the fullscreen setting as false, as you can enable and disable fullscreen mode using DOSBox-X's menus, or the F12+F keyboard shortcut; fullresolution should be left as 'desktop'.

To get proper aspect ratio correction and reasonable graphical fidelity at 1,280 x 720, set the output to 'opengl', aspect to 'true' and select a scaler for interpolating low-res graphics. Your scaler choice is largely a matter of personal taste, so use the Video menu options to try a few. If your sound becomes choppy, you're pushing Raspberry Pi's capabilities too far.

[Caption: One of DOSBox-X's key advantages is a graphical interface that covers each element of your emulated PC's configuration, from CPU emulation to scaler]

7 / AUTOEXEC.BAT
At the end of the config file is autoexec, where we'll put all our MOUNT and IMGMOUNT lines to assign drive letters


to directories and floppy or CD images, as well as any commands to run at boot.

In our sample config, we've used MOUNT to set /home/pi/dos as the C drive in DOS. We'll copy and install all our software to this location. If you use the IMGMOUNT command with multiple file names of CD or floppy images, you'll be able to swap between those images in order to swap between media. To swap floppies, use F12+LEFT-CTRL+D. To swap CDs or DVDs, use F12+LEFT-CTRL+C.

[Caption: To improve performance, change Raspberry Pi's desktop resolution to 1,280 x 720]

pi-dos.conf (Language: DOSBox-X config file)
DOWNLOAD THE FULL CODE: custompc.co.uk/PiDOS

# Basic DOSBox-X config for 90s DOS software on Raspberry Pi.
# See default config file and https://github.com/joncampbell123/dosbox-x/wiki for further documentation

[sdl]
# set fullscreen true if you want to boot to an authentic-feeling DOS environment
fullscreen = false

# Don't forget to set Raspberry Pi's desktop resolution to 1280x720
fullresolution = desktop

# opengl allows aspect ratio correction
output = opengl

[render]
# set frameskip to 1 or 2 for resource-hungry titles
frameskip = 0

# aspect ratio correction
aspect = true

# choose your favourite. Don't use scalers on games that already have high resolutions. Set scaler to none to improve performance.
scaler = advmame3x

[cpu]
# use normal core for multitasking OSes such as Win95
core = dynamic

# some software benefits from emulating a specific CPU, which can be specified here
cputype = auto

# if you experience lag or juddering audio, set CPU cycles to max.
cycles = auto

[autoexec]
# Your DOS autoexec.bat file. These commands will be run at startup, making it easy to mount lots of floppies or CDs at once, as well as your working directories.

mount c /home/pi/dos/

# uncomment and customise these lines to mount floppy and CD images. Remember that DOS isn't case sensitive, but Linux is.

# imgmount a "/home/pi/dos/floppy/disk1.img" "/home/pi/dos/floppy/disk2.img" "/home/pi/dos/floppy/disk3.img" -t floppy
# imgmount e "/home/pi/dos/cd/a directory with spaces in/sherlock.iso" /home/pi/dos/cd/quake/QUAKE101.cue -t iso -fs iso

c:

8 / USING DOSBOX-X
Like DOSBox, DOSBox-X uses the open-source FreeDOS operating system, rather than Microsoft's proprietary
MS-DOS, although you can install and run MS-DOS from a disk image if you own a copy.

Navigation through DOS directories isn't too different to using a Bash terminal, particularly as a number of Bash commands have been included, such as LS as an alternative to DIR in DOS. To run a .com, .exe or .bat file, just type its name without the extension.

To capture and release your mouse, use the LEFT-CTRL+F10 shortcut. The autolock entry under SDL config enables capture-on-click.

[Caption: RPG classic Worlds of Ultima: Martian Dreams is legally available for free from GOG.com, but you'll have to use innoextract 1.8 (constexpr.org/innoextract) to pull the files out of it]

9 / WINDOWS 3.11
Now we're going to install Windows for Workgroups 3.11, released in December 1993. The biggest challenge is finding a copy of Windows 3.11 to install – that usually means aging floppy disks, or disk images if you had the foresight to make backups. We're working from a set of disk images.

If you don't already have one, and don't fancy the second-hand market, you can, surprisingly, find it included in Microsoft Visual Studio Subscriptions (formerly MSDN Subscriptions), currently priced at £33.54 per month, for the benefit of developers working on backwards compatibility.

[Caption: Windows 3.11 will cheerfully run either on top of DOSBox-X's default FreeDOS operating system, or installed with DOS 6.22 on a dedicated hard disk image]

10 / INSTALL WINDOWS
Copy the contents of each installation disk or image to a /win311 subdirectory of the dos directory tree we made earlier; you can do this as you normally would on the desktop or at the command line, or by using DOSBox-X's IMGMOUNT to mount them and using the DOS COPY command while switching disks. At the command line, start DOSBox-X with a Windows-suitable config file – download ours from custompc.co.uk/PiWin

dosbox-x -conf win311.conf

CD WIN311
SETUP

Windows 3.11 will install itself. Reboot. You can then start it from the DOS prompt:

CD WINDOWS
WIN

11 / USING WINDOWS 3.X
If you've only ever used Windows 95 or later, the interface of Windows 3.x may feel rather alien. There's no Start button, and if you want to quit back to the DOS prompt, you have to open Program Manager's File menu and select Exit Windows.

The default Program Manager folders, each of which is full of shortcuts to helpful software and settings, are clearly labelled. To explore your mounted DOS drives, open Main and then File Manager. Accessories include the MS Paint precursor Paintbrush, a Sound Recorder and even a Media Player. A line at the top left of each opened window allows you to move and close it, and you'll find minimise and maximise buttons on the top right of each window.

12 / BOOT RASPBERRY PI TO DOS
Once you've configured DOSBox-X – and any relevant window managers – to your satisfaction, you can complete your pitch-perfect 1990s PC simulation by booting straight to DOS. Open a Terminal window and type:

mkdir /home/pi/.config/autostart
mousepad /home/pi/.config/autostart/dosbox.desktop

Add the following to the new text file:

[Desktop Entry]
Type=Application
Name=DOSBox
Exec=/usr/bin/dosbox-x

This will use DOSBox-X's default config file. You'll need to enable fullscreen in your DOSBox-X config for this to launch correctly, and we strongly advise enabling opengl-dependent aspect ratio correction.
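The two commands in step 12 can be collapsed into a single non-interactive script that writes the same desktop entry shown above. A sketch (we use $HOME rather than the article's hard-coded /home/pi so it works for any username):

```shell
# Create the desktop autostart entry for DOSBox-X non-interactively.
# Writes the same [Desktop Entry] file as the mousepad step above.
AUTOSTART="$HOME/.config/autostart"
mkdir -p "$AUTOSTART"
cat > "$AUTOSTART/dosbox.desktop" <<'EOF'
[Desktop Entry]
Type=Application
Name=DOSBox
Exec=/usr/bin/dosbox-x
EOF
cat "$AUTOSTART/dosbox.desktop"
```

To undo the boot-to-DOS behaviour later, just delete the dosbox.desktop file again.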
