Custom PC Retrograde
RETROGRADE
THE ULTIMATE GUIDE TO PRE-MILLENNIAL PC HARDWARE
WELCOME
BEYOND NOSTALGIA
On the face of it, the first PC we had in my family home
CONTENTS
PROCESSORS
6 Intel 286
11 Intel 386
14 Intel 486
18 Socket 7
20 AMD Athlon
23 Intel Slot 1
GRAPHICS
27 CGA
30 EGA
32 VGA
36 3dfx Voodoo
39 PowerVR
42 Nvidia GeForce
SOUND
47 The Sound Blaster Story
52 The PC speaker
54 Roland MT-32
STORAGE
57 Floppy disks
SOFTWARE
62 Windows 1.0
67 Windows 3.1
THE FIRST PC
72 IBM PC 5150
HOW TO
79 Build a DOS PC with a modern twist
88 Install FreeDOS on vintage hardware
92 Emulate DOS on Raspberry Pi
PROCESSORS
INTEL 286
K.G. Orphanides takes a technical look at Intel’s 16-bit swansong
We’re now so used to tiny transistors that the 7nm process used to fabricate AMD’s latest Zen 3 CPUs hardly seems worth mentioning now – it’s hard to keep track of the numbers of transistors when they get into billions. However, you only have to look at early PC CPUs to see just how far silicon manufacturing has come. Intel’s 80286 processor was released in 1982, and fabricated on a 1.5µ (1,500nm) manufacturing process, compared to the 3µ (3,000nm) process used by its predecessor, the 8086. It packed in 134,000 transistors: 4.6 times as many as the 8086. By comparison, AMD’s 7nm Zen 2 processors contain up to 9.8 billion transistors.
The 80286 was introduced with an entry-level-model clock speed of just 6MHz. This figure would go as high as 12.5MHz for the popular Intel 80286-12, and up to 25MHz for late-era takes on the CPU by other manufacturers, such as AMD and Harris. It would be the last, fastest 16-bit PC processor Intel made.
Its successor, the 80386, was a true 32-bit processor, with a 32-bit data bus and memory addressing to match. But even as its technology was superseded, the 286 was just hitting its stride in the home PC market, which it would dominate until 386 and 486-based PCs started to become vaguely affordable in the early 1990s.

A VISION FOR THE FUTURE
When development began on the 80286 in 1979, Intel’s product requirements document envisioned that the powerful new processor would be primarily used in industrial applications, from telecoms to manufacturing automation and medical instruments. It was explicitly designed to be compatible with the 8086, ensuring that software for the older processor would run without modification on the new device. But unlike the 80186 (see opposite), PCs weren’t on the 286’s original roadmap.
THE 80287 COPROCESSOR
Since the 8086, floating-point coprocessor chips – popularly known as maths coprocessors – had been made available as optional additions via a motherboard socket. They allow addition, subtraction, multiplication, division and square root calculations on numbers with decimal points to be carried out more quickly than on a standard integer unit, improving performance in arithmetic-intensive applications.
Originally, that was mostly accounting and computer-aided design (CAD) software, but later games were also able to take advantage of the hardware, notably including 1989’s SimCity and flight sims such as Falcon 3 in 1991. The 486SX series was the last range of Intel CPUs to be released without a built-in maths coprocessor – its sibling 486DX integrated a floating point unit into the CPU.

An optional 80287 coprocessor provided the 286 with a floating point unit

In Intel’s 1984 annual report, which details the 286’s development, release and nascent domination of the industry, the company admits that in hundreds of pages of planning materials ‘the personal computer – which would eventually become its biggest user – wasn’t mentioned once’.
The 80286 was announced in February 1982, and the designers had a working prototype to show industry partners that spring, promising ‘about three times the performance of any other 16-bit microprocessor’. However, after initial testing of the first 286 wafers, ‘progress just seemed to drop to a snail’s pace’, according to logic design supervisor Jim Slager, again quoted in Intel’s 1984 annual report. The processor wasn’t yet running fast enough, and the testing programme for CPUs that would come off the manufacturing line was running late.
But in June 1982, IBM – then the world’s largest maker of computers – came calling. IBM had been using Intel’s 8088 since 1979 and it was looking to give a power boost to its next generation of PCs: the IBM model 5170, better known as the IBM PC/AT.
THE OBSCURE, WILDLY SUCCESSFUL 80186
Released at around the same time as the 286, the 80186 was fully software-compatible with the 8086, with an emphasis on increased performance at the lowest possible cost. It was an instant success, and Intel produced 30 times as many 80186s as 8086s in the new processor’s first year of release.
Although Intel at one point envisioned the 186 being used in workstations, word processors and PCs, it was the 286 that ultimately came to dominate the desktop market. Unlike the 286, the 186 had its clock generator, timer and interrupt controller – previously motherboard components – built into the CPU.
However, these integrated components weren’t compatible with the hardware used in the IBM PC, leading IBM to select the 286 for its PC/AT range of computers.
The 186 was nonetheless massively successful, due to its speed and ease of integration into other systems, appearing in coprocessors, communications controllers, flight management computers and general-purpose microcontrollers.
It did appear as the main CPU of a few PCs, including the 1986 Sega AI in Japan, the Tandy 2000 in the USA and the frankly inexplicable RM Nimbus schools PC in the UK. Intel ended production of the 186 in 2007, although fully compatible third-party clones are still available.

The 186 was hugely successful outside of the desktop PC world. Image credit: Konstantin Lanzet

Intel pulled together a cross-disciplinary task force to complete the testing tools, address bugs and complete the parallel development of motherboard components. Marketing focused on a new public presentation of the 80286, highlighting its superiority to Motorola’s popular 68000 processor and emphasising that it was far more than a minor update to the 8086.
Intel emphasised the 286’s multi-user and multi-tasking capabilities, including variable privilege levels to restrict access to specific parts of memory, as well as an instruction set designed to rapidly switch between programs, providing support for Unix as well as DOS.

‘Unlike the 80186, PCs weren’t on the 286’s original roadmap’

The marketing push – and especially IBM’s adoption of the processor – worked. Chip samples were delivered to customers later the same year and, in 1983, volume production of the 80286 began. The IBM PC/AT launched in August 1984, prompting a wave of AT-compatible computers from companies including Compaq and NEC. By the end of 1988, Intel estimates, there were around 15 million 286-based PCs in use worldwide.
Sierra’s King’s Quest II: Romancing the Throne explicitly supported the 286-based IBM PC/AT
The 286’s machine status word is used to indicate the presence of features such as an 80287 maths coprocessor (see p106), and whether the CPU is supposed to be running in protected or real mode. The introduction of instructions to efficiently end the execution of a task, save its state and switch to another, loading its last state, significantly improved multitasking performance.
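Those machine status word flags can be pictured with a short decoding sketch. The bit positions follow Intel’s documented MSW layout for the 286; the helper function itself is purely illustrative:

```python
# Decode the low four bits of the 286's machine status word (MSW).
# Bit layout per Intel's iAPX 286 documentation:
#   bit 0 PE - protected mode enabled (real mode when clear)
#   bit 1 MP - monitor processor extension (a coprocessor is present)
#   bit 2 EM - emulate processor extension (trap FPU instructions)
#   bit 3 TS - task switched (used to lazily save FPU state)
def decode_msw(msw: int) -> dict:
    return {
        "protected_mode": bool(msw & 0x1),
        "coprocessor_present": bool(msw & 0x2),
        "emulate_fpu": bool(msw & 0x4),
        "task_switched": bool(msw & 0x8),
    }

# A 286 running in protected mode with an 80287 fitted: PE and MP set.
flags = decode_msw(0b0011)
print(flags["protected_mode"], flags["coprocessor_present"])  # True True
```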
PERFORMANCE
The 286 provided a marked performance boost over the
8086 and 8088. This was in part down to faster clock speeds,
particularly when 12.5MHz, 16MHz and even faster 286 CPUs
became popular. The CPU also benefited from significant
architectural redesigns, enabling a 10MHz 286 to execute
programs up to six times faster than a 5MHz 8086, according
to Intel’s Introduction to the iAPX 286 document.
A 12MHz 286 can calculate between 1.28 and 2.66 million
instructions per second (MIPS), compared to 0.330 MIPS
for a 5MHz 8086 and 0.750 MIPS for a 10MHz 8088. The
286’s instructions per clock (IPC) count works out at 0.21
MIPS per megahertz. To help achieve this, the 80286 CPU
comprises four independent processing units: address unit,
bus unit, instruction unit and execution unit, compared with
the two-unit execution and bus organisation of the 8086. It
has demultiplexed address and data buses to improve bus
efficiency, particularly in protected mode.
The instruction unit can decode and hold a queue of three
prefetched instructions, which it sequentially feeds to the
execution unit. Meanwhile, the presence of a dedicated
address unit, which calculated the physical addresses in
memory of the instruction and data being called upon, offered
a key performance improvement over previous systems.
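Those MIPS figures can be sanity-checked with a few lines of arithmetic – the values are the ones quoted in the text, and the IPC is simply throughput divided by clock speed (a back-of-envelope sketch, not a benchmark):

```python
# Throughput figures quoted in the text, as (clock in MHz, peak MIPS).
chips = {
    "80286 @ 12MHz": (12.0, 2.66),   # best-case figure for a 12MHz 286
    "8086 @ 5MHz":   (5.0, 0.330),
    "8088 @ 10MHz":  (10.0, 0.750),
}

for name, (mhz, mips) in chips.items():
    ipc = mips / mhz  # instructions per clock = MIPS per megahertz
    print(f"{name}: {mips} MIPS, ~{ipc:.2f} instructions per clock")

# Best-case 286 throughput relative to the 5MHz 8086:
print(round(2.66 / 0.330, 1))  # roughly 8x on these peak figures
```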
GAMING
The 286’s extra power meant that more was possible for game developers. New instructions for moving data between stacks and registers benefited those working in high-level languages such as C. Although the increasing multimedia capabilities of PC systems through the 1980s also played a significant role, the PC’s processor power was becoming apparent. That said, in the 1980s, 286 systems were still prohibitively expensive compared with more family-orientated microcomputers, as well as low-end 8086-based PC-compatible machines.
Despite this, the second instalment in Sierra’s King’s Quest series, 1985’s Romancing the Throne, explicitly supported the 286-based IBM PC/AT, booting directly from a floppy disk. By 1990, popular series, such as Ultima and Wizardry, which had once been developed for rival systems, such as the Apple II and IIGS, were receiving MS-DOS first releases.
It wasn’t all positive. Some older games whose performance was fixed to clock cycles became unplayably fast, which led to the widespread use of ‘Turbo buttons’, which would slow the system down to clock speeds comparable with 8086 and 8088 CPUs. Other 286 PCs had a BIOS option to do the same, and utilities such as Mo’Slo were developed in the 1990s to slow down overspeed games.

By 1990, memory-hungry games, such as the US release of Sorcerian, advertised their need for ‘AT-compatible’ PCs

The 80286 has a dedicated address unit, bus unit, instruction unit and execution unit
INTEL 386
Ben Hardwidge looks back at the PC’s first 32-bit CPU
We often complain about the over-inflated price of graphics cards these days, but the prices of today’s PC components are extraordinarily generous in comparison with the early days. If you want the latest top-end Threadripper CPU, the fastest gaming GPU and an enormous amount of storage, a machine such as Chillblast’s Fusion Conqueror (see p32) will deliver all of it in a well-built machine for £5,999 inc VAT.
Now, I’m not going to pretend that’s a small amount of money – it’s unaffordable for most of us. But, to get some perspective, let’s take the TARDIS back to September 1986, when Compaq released the Deskpro 386, marketed as the first ‘true’ 32-bit computer. This was a good 11 months after Intel first launched the first 12MHz 386 CPUs, but seven months before IBM’s first 386 machine got out of the doors, marking a new era where ‘clone’ PCs were becoming dominant.
Just like the prices, the numbers involved with the manufacturing process of the 386 are staggering compared with today’s CPUs. The first 386 chips contained 275,000 transistors, which made them a marvel of miniaturisation at the time, but that’s a piddly number compared with the over 9 billion transistors you’ll find in the Ryzen 9 3950X across all its dies. In terms of raw transistor numbers, a Ryzen 9 3950X is like 35,000 386 CPUs.

Inside a 386 die, with 275,000 transistors

Those transistors were massively bigger as well, produced on a 1,000-1,500nm node, compared to 7nm in AMD’s latest CPU dies. The very first CPUs off the production
line were clocked at 12MHz, then 16MHz, with 20MHz, 25MHz and 33MHz flavours launching later – even the latter is around 1 per cent of the clock speed we see on today’s CPUs. Pin-compatible CPUs were also made by AMD, as well as other manufacturers, including Cyrix.

MEMORY MANAGEMENT
The first 32-bit x86 CPU was big news in the computing world though. While Motorola’s 68000 (used in the Atari ST and Commodore Amiga, among others) had introduced us to an internal 32-bit CISC CPU architecture back in 1979, it also used a 16-bit external data bus and a 24-bit address bus. Intel’s first 80386 CPUs were 32-bit internally and across external buses, offering a huge advance over the previous 16-bit 8088, 8086 and 80286 processors.
In theory, this meant a PC could now address 4GB of RAM (a limit that would only become seriously challenged 20 years later), although realistically the limits of technology at the time meant that most 386 PCs could only address up to 32MB, and even that was considered overkill. For reference, my 386 PC in the 1990s came with 4MB of RAM, but I upgraded it to 8MB using 30-pin SIMMs and it felt decadent.
More importantly for the time, the 386’s memory system was designed to be easily extended well beyond the 640KB base memory limit of MS-DOS. The ins and outs of archaic memory systems are well beyond the scope of a two-page nostalgia piece, but the basic gist is that a 16-bit x86 CPU could only address 64KB of memory, so any memory on top of this figure had to be divided into ‘segments’ that it could address separately.
In order to maintain backwards compatibility, the 386 still retained this segmenting approach in ‘real mode’, but it also offered a new form of ‘protected mode’. This mode was first introduced with the 286 to allow the use of virtual memory (effectively paging to a hard drive). However, the 386 added an on-board paging translation unit to mediate between the segments and the physical address bus, which effectively enabled the computer to present all these segments as one big sea of memory, even though it was technically still segmented. It made for a much friendlier memory system for software developers, particularly for memory-hungry graphical user interfaces, and it paved the way for PCs with ever larger memory allocations.

An Intel marketing shot for the 386 shows a 16MHz 386 CPU, as well as an 80387 CPU and some 1.2MB 5.25in floppy disks

THE JOY OF SX
The ability to address so much memory was overkill for the home market, though, and the prices of original 386 machines put them well out of the reach of this market anyway. To get the 386 into home machines, Intel introduced a cut-down version called the 386SX, with the original design now getting the ‘DX’ suffix.
This isn’t to be confused with the ‘SX’ and ‘DX’ suffixes used on the later 486 chips though. When it came to 486 CPUs, the DX versions had a built-in floating-point unit, called a math coprocessor at the time, while the SX chips only had an integer unit, although you could add an 80487 math coprocessor to most 486SX machines separately.
Conversely, neither the 386SX nor DX had a built-in floating point unit – you needed a separate 80387 coprocessor if you wanted that. The difference between the 386SX and DX was that the former had a 16-bit data bus, although it kept the CPU’s internal 32-bit architecture. The idea was that having a 16-bit data bus would cut down on the need for highly intricate PCBs with loads of traces, reducing the cost of manufacturing. The other knock-on effect of fewer connections was that a 386SX could only address up to 16MB of RAM. However, as we’ve already covered, this was still way more than enough for the home market at the time.

A special 386 version of Links gave you gorgeous SVGA graphics for the time

SOFTWARE
The big problem for the 386 for most of its useful lifespan was mainstream software support. An executable file called
Master II, Myst, The Elder Scrolls: Arena, SimCity 2000 and UFO: Enemy Unknown (otherwise known as X-COM) all required a 386 CPU as the bare minimum. There was also a special 386 version of the golf game Links, giving you superior graphics at 800 x 600.
That said, I ran many of these games on my 20MHz 386SX in
the early 1990s, and while they technically worked, I usually had to
run them at extremely low detail, and even then the frame rate
would have been unacceptable by today’s standards. Running
Doom required me to have big bars around a tiny screen in order
to make the game playable.
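The segment arithmetic described in the Memory Management section is simple enough to sketch. In 16-bit real mode, a physical address is the segment value shifted left four bits plus a 16-bit offset – the function below just illustrates that rule (the VGA framebuffer example is a well-known real-mode landmark, not something taken from the article):

```python
# Real-mode x86 address translation: each segment register selects a
# 64KB window, and physical = segment * 16 + offset, which covers the
# classic 1MB address space (plus a small overflow at the very top).
def real_mode_address(segment: int, offset: int) -> int:
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return (segment << 4) + offset

# A single segment can only span 64KB (the 16-bit offset limit)...
print(hex(real_mode_address(0x0000, 0xFFFF)))  # 0xffff
# ...but moving the segment register reaches further into the first 1MB.
print(hex(real_mode_address(0xA000, 0x0000)))  # 0xa0000, the VGA framebuffer
```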
INTEL 486
Stuart Andrews recalls the mighty CPU that
made the PC the ultimate powerhouse
The 486 went into development at an interesting time for Intel. The Intel 386 line had seen Intel snatch a victory from the jaws of a disaster, making up for the failure of Intel’s new-fangled iAPX432 architecture with a mix of strong compatibility and great performance. Its design team, led by chief architect John Crawford, had dragged the 16-bit x86 architecture into the 32-bit era and kept Intel ahead of the pack.
But other manufacturers were moving fast. Arch-rival AMD was already developing its own 386 CPUs and only Intel’s litigation was delaying their release. Cyrix was already producing Intel-compatible maths co-processors and was threatening to move into CPUs. Intel needed an awesome new product.

The 1st-generation 486 was twice as fast as a 386 with the equivalent clock speed. Image credit: Andrzej W K, own work, CC BY-SA 3.0

In theory, the i860 should have trumped any 386 successor, but in 1985, Intel’s CEO, Andy Grove, put John Crawford and hotshot architect Pat Gelsinger in charge of the design. Crawford and Gelsinger had already worked together on the 386 and shared a strong belief in the potential of the x86 and CISC architecture. Both felt that, while RISC had its advantages, a redesigned x86 chip could keep up.
What’s more, it could do it without forcing big software publishers to redevelop their applications, rebuild operating
systems and optimise compilers. When you threw more transistors at the problem and increased their frequency, there was no reason why a CISC chip couldn’t compete with a RISC CPU. Apply Moore’s Law and keep increasing speeds, and a CISC chip might even crush it.

OPTIMISE THE PIPELINES!
Gelsinger and Crawford focused on delivering a processor that was fully 386-compatible and would build on the existing 32-bit architecture but would give you a massive increase in performance – at least double, clock for clock. They took inspiration from what was going on with the new RISC CPUs, paying particular attention to how instructions were loaded, organised, decoded and executed on the CPU.
The big innovation was to combine a tighter, more streamlined pipeline with an integrated L1 cache – a first in a mainstream CPU. With 8KB of high-speed SRAM as a store for recently used instructions and data on the same silicon, the instruction pipeline could be fed with a consistent flow, enabling it to execute the simplest and most commonly used instructions at a sustained rate of one per clock cycle – an achievement that RISC devotees believed was beyond a CISC processor.
The new pipeline had five stages, although the first – the Fetch stage – wasn’t strictly necessary for each instruction, as the CPU could fetch about five instructions with every 16-byte access to the cache. Once fetched, instructions went through two decoding stages, where they were organised and fed into the execution units. Here they were executed, and the results written back to registers or memory in a final write back stage.
The cache minimised any delay in loading data and
instructions, and did such an effective job of caching data
and instructions that the processor only had to go to system
memory on roughly 5 to 10 per cent of memory reads.
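To see why those hit rates mattered, here’s a back-of-envelope model of effective access time. Only the 90-95 per cent hit rates come from the text – the cycle counts are illustrative assumptions, not Intel’s published timings:

```python
# Average cycles per memory access with an L1 cache in front of DRAM.
# The hit rates match the 90-95% figure quoted for the 486; the latency
# numbers (1 cycle for a cache hit, 5 for a trip to system memory) are
# purely illustrative assumptions.
def effective_cycles(hit_rate: float, cache_cycles: float = 1.0,
                     memory_cycles: float = 5.0) -> float:
    return hit_rate * cache_cycles + (1.0 - hit_rate) * memory_cycles

for hit_rate in (0.90, 0.95):
    print(f"{hit_rate:.0%} hits -> {effective_cycles(hit_rate):.2f} cycles per access")
```

Even with these made-up latencies, the point stands: raising the hit rate from 90 to 95 per cent removes half of the remaining slow accesses.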
What’s more, many 486 motherboards incorporated a secondary cache with 16KB or more of high-speed RAM, reducing latency even further. Meanwhile the two decoder stages enabled those instructions to be pipelined and processed more efficiently – with five instructions running through the pipeline, one would normally be processed with every clock cycle.

The 2nd-generation DX2 chips doubled their predecessors’ clock speed, a feat never replicated by any subsequent Intel CPU. Image credit: Henry Mühlpfordt, own work, CC BY-SA 3.0

The result was a spectacular improvement in performance. On integer instructions – very much the meat and potatoes of computing at the time – the 486 was at least twice as fast as a 386 running at the same clock speed, and sometimes 2.5 times as fast. This meant the CISC-based 486 could hit similar levels of performance to the RISC-based i860, while still being compatible with all the existing x86 software. There was no need to rebuild or recompile – code developed for the 286 and 386 just worked.

With enough processing power to run it full-screen at a full VGA resolution, Doom became the 486-DX2’s killer app

THE RIVALS
If Intel’s processor design teams put the 486 far ahead of the pack in terms of performance, its legal teams did a cracking job of suppressing any competition. However, eventually Cyrix and AMD won their legal fights, and 486 competitors began to appear. Cyrix’s 486SLC and DLC processors, released in 1992, were particularly interesting.
Effectively a 386DX with a 486 instruction set and just 1KB of L1 cache, they still used a 32-bit bus and gave users a cheap halfway house – a 486DLC33 could run software at roughly the same speed as a 25MHz 486-SX. Not only were the processors more affordable, but they plugged into existing 386 motherboards, meaning the platform as a whole was cheaper.
I had one of these beauties in my first PC, and while it was noticeably less capable than my friend Brian’s mighty 33MHz 486-DX, it could still run X-Wing, Ultima Underworld II, Alone in the Dark and – eventually – Doom. Ultima VIII: Pagan? A bit more of a slideshow, but then it wasn’t a great Ultima, so who cares?
AMD released its own 486 chips in 1993, and while they were late to the party, AMD made up for it with a repeat of a classic 386 performance trick. AMD’s CPUs ran on a 40MHz bus, meaning that the DX-40, SX-40 and Am486DX/2-80 were slightly faster than the equivalent Intel CPUs.
Meanwhile, AMD’s straight Am486 DX-25 and 33 and SX-33 gave you the same performance as Intel’s equivalents at lower prices. AMD even released what it called the AM5x86-133 in 1995, which competed with the low-end Pentium 75 but was actually a 486 running on a 4x multiplier with a 33MHz clock.
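That multiplier trick is plain arithmetic: core clock equals front side bus clock times the internal multiplier. A quick sketch (the DX4’s 3x multiplier, despite the name, is a well-known quirk; the pairings here are illustrative):

```python
# Core clock = front side bus clock x internal multiplier. The DX2
# doubled the bus clock; the DX4, despite its name, tripled it.
def core_clock(bus_mhz: float, multiplier: float) -> float:
    return bus_mhz * multiplier

print(core_clock(33, 2))      # 486-DX2/66: 66MHz core on a 33MHz bus
print(core_clock(33, 3))      # 486-DX4/100: 99MHz, marketed as 100MHz
print(core_clock(33.3, 4))    # AM5x86-133: 4 x 33.3MHz, sold as a 133MHz part
```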
At this point, floating point instructions weren’t so commonly used, but here the news was just as good. Previous Intel processors had worked with optional, discrete maths co-processors, which handled all the floating point logic. These were expensive and not popular outside of business, as only a few major business applications, such as dBase, Borland Quattro Pro and Lotus 1-2-3, actually used a Floating-Point Unit (FPU). The 486-DX, however, integrated one directly onto the processor die, connected to the CPU by its own dedicated local bus.
This meant there was less overhead in shifting data between CPU and FPU; this, combined with other optimisations, resulted in a significant improvement in floating point performance. Fast forward a few years, and Quake would require a CPU with a floating point unit, with the system requirements citing a 486-DX4 as the minimum. Today, it’s impossible to imagine a CPU without an FPU, and that’s thanks to the mighty 486.
Beyond this, differences from the 386 were relatively small. The 486 had a few extra ‘atomic’ instructions that sped up some basic operations, but nothing compared with the instructions added with the 80286 or 386. The 486 also didn’t mess with the 386’s memory model; it could still address 4GB of RAM across a 32-bit bus, with a protected mode that presented both real and virtual memory as one big pool. However, its improved Memory Management Unit performance meant it was much more efficient at shifting data between the system RAM, the CPU and the cache.

OVERDRIVEN
The 486 marked another shift in Intel’s tech and marketing strategy by embracing the whole idea of PC upgrades. As Intel released its clock-doubling DX2 and DX4 processors, it also released Overdrive versions designed to boost existing PCs. Some 486 Overdrive processors were simply replacement CPUs, plugging into the existing 168-pin socket and replacing, say, your 25MHz 486-SX with what was effectively a 50MHz 486-DX – albeit at an eye-watering cost of $549 to $699 US.
At this point, not every CPU could be removed from its socket, but luckily many 486 motherboards shipped with their own 169-pin upgrade socket, originally designed to fit a 487-SX maths coprocessor for 486-SX machines. Sneakily, the 487-SX was actually a fully functional 486-DX with an extra pin that told the motherboard to ignore the existing CPU, and the OverDrive chips just repeated the trick with some extra control circuitry with 50MHz and 66MHz 486-DX2 CPUs.
Doubling your speed was definitely tempting, and SX owners got a maths co-processor in the mix as well. And while Intel pushed the benefits with AutoCAD, WordPerfect and Corel Draw, the biggest sellers for OverDrive chips were undoubtedly games such as Strike Commander, Falcon 3.0 and Doom.

DOUBLE THE CLOCKS!
There was one final architectural change that was to have a major impact, even on today’s PCs. Intel CPUs from the 8086 to the 1st-generation 486 ran at the same frequency as the external bus that connected all the core components together. This meant that the initial 486-DX processors ran at the same speeds as the I/O bus. Intel pushed speeds higher, releasing
Games such as Strike Commander pushed the 486 architecture to
its limits with advanced 3D texture mapping and Gouraud shading
When Intel tried the same trick with the 386, it released
a hobbled version with a 16-bit data bus and slower clock
speeds, but the 486-SX was basically a 486-DX with the FPU
disabled. At the time, with so little software that supported the
FPU, this wasn’t much of an issue, and by the time the 486-SX
was released, it only cost around $250 to $300 US.
SOCKET 7
Ben Hardwidge recalls the strange pocket of time
in the 1990s when one motherboard could support
CPUs from multiple manufacturers
Just imagine if you could pick up any one of the motherboards in this month’s Z490 Labs test (see p44), stick a Ryzen chip in it and know it would not only fit, but also work fine. In fact, imagine you’re not just limited to Intel and AMD CPUs, but you could put a CPU from all sorts of other chip manufacturers in your brand new motherboard. Not only that, but there’s a choice of chipsets all designed to work with all these CPUs as well.
We’re so used to exclusive socket and chipset designs now that the idea seems like commercial suicide, but this was the situation in the Socket 7 era of the 1990s. This period is a strange little oasis in the time between times, where CPU and chipset manufacturers just assumed their parts needed to be compatible with each other.
Until this time, Intel had completely governed the design of x86 CPUs, bringing us the 8088, 8086, 286, 386 and 486 (and others) in various guises, and drafted in third parties such as AMD and Cyrix to make clone chips to fill out the supply and meet demand. That all changed when Intel introduced the first Pentium-branded CPUs.
From this point, third-party companies weren’t allowed to reproduce Intel’s flagship desktop CPU microarchitecture, or use the Pentium brand. Instead, the old clone chip suppliers, which still had an x86 licence from the cloning days, had to design their own CPUs.

Socket 7 was found on both AT and ATX motherboards, with chipsets from multiple chip makers. Photo credit: Konstantin Lanzet

The first Pentium CPUs were launched on the 5V Socket 4. In this era, Cyrix and AMD instead focused on launching ‘5x86’ CPUs designed as upgrades for existing Socket 3 486 motherboards, as did Intel’s Pentium Overdrive CPUs. However, it was the later 3.3V Socket 5 and Socket 7 platforms that saw Intel, Cyrix and AMD targeting the same CPU socket.
The only difference between Socket 5 and 7 was that the latter upped the total pin count from 320 to 321, and
Socket 7 could provide dual voltage to the CPU via a split rail. The sockets are otherwise basically the same, to the point where you could put a Socket 5 CPU in a Socket 7 motherboard and it would run fine.

Socket 7 supported CPUs from multiple manufacturers, including Intel, AMD, Cyrix and IDT. Photo credit: Konstantin Lanzet

CHIPSET CHOICES
This was before AMD made its own chipsets, but there were still plenty of options. If you wanted the best compatibility, your best Socket 7 option was Intel’s Triton series, which peaked with the Triton 430TX in 1997. The 430TX supported either a 60MHz or 66MHz front side bus, and also gave you the option of three types of memory – fast page non-parity, EDO and SDRAM, with the latter two options coming in the brand new DIMM form factor. This led to many motherboards coming with both DIMM slots and the older 72-pin SIMM slots.
However, your choice wasn’t limited to Intel chipsets. Plenty of third-party chip makers, including VIA, ALi, SiS and Opti, had their own Socket 7 chipset options. For the most part, they held up pretty well, and they were usually cheaper than genuine Intel boards, but there were also sometimes compatibility problems. As an example, when I worked in a computer shop in the late 1990s, we often had problems with ALi-based motherboards not working with the 32x Samsung CD-ROM drive we stocked.

IS IT A BIRD, IS IT A PLANE? NO, IT’S SUPER SOCKET 7!
Intel pulled the plug on Socket 7 after the Pentium MMX, and instead moved its Pentium II CPUs to the new Slot 1 format (see Issue 200, p107). In the meantime, it settled on ATX as the motherboard and PSU standard for Pentium II.
There were some ATX Socket 7 motherboards, but most of them used the older AT form factor, which split the main power socket into two parts and only had a keyboard output (on a large DIN socket) fixed to the board as standard – the rest of the ports all connected to the motherboard with ribbon cables. If you used an AT power supply, you also had to physically switch off the PC after use, as it couldn’t be shut down with software. Again, though, this was a strange crossover period, and there were motherboards that conformed to the AT form factor, but which also had both AT and ATX power sockets.
While Intel was busying itself with ATX and Slot 1, though, AMD and its chipset partners went all out on Socket 7. The result was Super Socket 7, which maintained compatibility with older Socket 7 CPUs, but also supported AGP graphics cards and could clock the front side bus at up to 100MHz. There was also a range of Super Socket 7 motherboards in both ATX and AT form factors.
Super Socket 7 was great for cash-strapped enthusiasts, as it meant you could keep most of your old PC – the PSU, case, hard drive and even the memory in many cases; you just needed a new motherboard and CPU if you wanted a decent upgrade. It was massively cheaper than upgrading to Pentium II.
You could run a Pentium CPU in a Super Socket 7 motherboard too, or a Cyrix M-II or IDT WinChip 2, but what you really wanted was an AMD K6-II or K6-III. AMD’s last Super Socket 7 CPUs really pushed the limits of this old socket and the AT era, with the 100MHz front side bus often making these systems faster than the 1st-generation 66MHz Pentium II CPUs, while costing much less money. The K6-III even pushed the clock speed up to 550MHz, and integrated 256KB of L2 cache onto the die.

END OF AN ERA
AMD finally moved to its own Slot A platform with the first Athlon CPUs, as well as introducing the Ironbridge chipset under its own brand, before it discontinued the K6-III at the end of 2003, eight years after Intel first launched Socket 7. Meanwhile, Cyrix was bought by VIA, which later produced a few CPUs for Intel’s Socket 370 platform, as well as its own embedded EPIA platform. But the days of multiple CPUs being supported by one socket are now over – the mainstream desktop PC market has since been dominated by just Intel and AMD, each using their own dedicated CPU sockets.
19
PROCESSORS

AMD ATHLON
Stuart Andrews recalls the first AMD x86 CPU that properly put the wind up Intel
The summer of 1999 wasn't a great time for Intel, and it really should have been. In February it had launched the Pentium III, a supercharged upgrade of the P6 microarchitecture. Cyrix, whose 6x86 processors had embarrassed some 1st-generation Pentiums, was effectively finished, its tech now in the hands of VIA Technologies. That just left AMD, whose K6 line of processors had captured some of the budget PC market, but didn't have the optimised pipelines, cache or floating point performance to give Intel any serious competition.

But when AMD released its first K7 Athlon processors to reviewers in June, something unexpected happened. Sure, there was already some buzz about the new 'K7' CPU, thanks to intriguing early demos and briefings, but a Pentium III killer? Not likely. Yet when the final production samples hit magazine labs and website testbenches, it became clear that the new Athlon was pretty special.

AMD's chip wasn't just matching the Pentium III, clock speed for clock speed, but beating it. Worse, it was beating it in the kind of floating point intensive apps that Intel considered home territory, including 3D games. Athlon was kicking Intel right where it hurt, and that eye-watering discomfort wasn't going to let up any time soon.

[Image: Shipping in 550, 600 and 650MHz versions, the original K7 Athlon took the benchmark battle to Intel – and won. Image credit: Maddmaxstar CC BY-SA 3.0]

K7 COMES TOGETHER
How exactly did AMD manage this feat? Well, as with so many standout products in the hardware space, the answer involves several developments all coming together at the same time. On the one hand, the success of the K6-II and III had left AMD in a surprisingly strong position.

The K6 architecture had made the most of technology bought in with the company's 1996 acquisition of NexGen and had pumped money into AMD's war chest. It had also cemented AMD's position as Intel's most credible rival. What's more, AMD also had new CPU and bus technology developed by the Digital Equipment Corporation (DEC) for its Alpha RISC processors. It had even taken on most of DEC's RISC CPU design team, including key architects Dirk Meyer and Jim Keller.

Thanks to a patent cross-licensing deal with Motorola, AMD also had a head start on new copper-based die manufacturing technologies, not to mention a new chip fab in Dresden on its way to use them. This would become important later on.

All this helped lead to a revolutionary design – the first 7th-generation x86 processor. The original 0.25-micron (250nm) Athlon had a die with over 22 million transistors – the highest transistor count of any x86 processor to date. It also had an ingenious split cache system, with 128KB of on-chip L1 cache operating at clock speed, plus another 512KB of L2 cache included in the processor module.

This L2 cache operated at a fraction of the clock speed – half-speed on the initial models – but with breathing room to cover higher clock speeds at slower cache ratios later on. This arrangement gave Athlon a performance advantage over the
earlier K6 processors, even before you factored any other architectural improvements into the equation.

But these improvements were just as significant. Meyer, Keller and their team designed an architecture that was capable of decoding three x86 instructions simultaneously and – crucially – symmetrically, unlike the Pentium III. True, the Pentium III's instruction pipeline could handle three simple instructions at once, but feed it more than one lengthy, complex instruction and it choked, as only one pipeline could manage the workload. The Athlon, by contrast, could chew through three complex instructions without any trouble. You got three instructions at a time, every time.

What's more, the design featured a new level of optimised branch prediction, which was not only more accurate in guessing what the next operation would be, but faster to recover when it got that guess wrong.

Like the team brought in from NexGen, the team brought in from DEC had serious skills and experience in RISC chip design, and AMD put this to good use. The Athlon architecture converted x86 instructions into more efficient 'macro ops' and then those 'MOPs' into RISC operations, which the CPU's execution units could work on, nine to a clock.

This design was incredibly efficient by the standards of the day, but it was also conducive to scaling upwards. Where the K6-III had been stuck at 500MHz, the Athlon launched at 500, 550 and 600MHz speeds, matching the 600MHz of Intel's top-end Pentium III. As if that wasn't enough, AMD added a 650MHz version fewer than six weeks after launch.

The final kicker was that AMD was no longer second rate on floating point operations. Not only were the Athlon's floating point units (FPUs) much faster than the weedy FPUs of the K6 line, but AMD built on the SIMD instructions of its 3DNow! technology, with 24 new instructions on top of the original 21. Most mimicked the cache and streaming controls seen in Intel's mighty SSE tech, but AMD also bundled in new DSP and complex maths extensions, plus MP3 and Dolby Digital decoding tools. This chip was built to game and entertain.

There was one final way that AMD now matched Intel – the Athlon was AMD's first chip to abandon sockets and embrace the slot. AMD's Slot A connector harnessed DEC's EV6 bus and bus protocol, which allowed for burst data transfers at double the rate of Intel's equivalent GTL+, giving you a whopping 1.6GB/sec of bandwidth between the CPU and the motherboard chipset.

The Athlon's front side bus operated at double the 100MHz speed of the memory bus, and as faster RAM became available, this gave AMD scope to up the FSB speed even further, to 266MHz or even 400MHz. What's more, with a slot design, AMD could combine its CPU die and L2 cache in the one package, and that package was a whole lot easier to fit. And to make sure dozy upgraders didn't try to stuff AMD CPUs into Intel slots or vice versa, it cleverly reversed the physical design.

[Image: While it was codenamed the K7 right up until launch, AMD named its 7th-gen processor the Athlon to make it clear it was a break from the K5/K6 past]

AWESOME ATHLON
Talk about architectures and specs was all very well, of course, but nothing really prepared those of us benchmarking PCs in the late 1990s for the sheer undeniable awesomeness of Athlon. The benchmark results wouldn't have made comfortable reading for Intel, especially once the Athlon 650 rolled out in August. Both the Athlon 600 and Athlon 650 were faster than the Pentium III 600 in Quake III: Arena, whether paired with the hero graphics chip of the day – Nvidia's Riva TNT2 – or with 3dfx's still speedy Voodoo 3.

The Athlon was around 10 per cent faster in standard Windows applications, and up to 20 per cent faster in gaming benchmarks. The Athlon 600 was 10fps faster than the Pentium III 600 in the fiendishly demanding Quake II Crusher benchmark. As further tests from the likes of AnandTech proved, even a Pentium III overclocked to 650MHz couldn't keep up.

[Image: Inside the Athlon cartridge. Check out the Slot A connector, the CPU core and the two modules of L2 cache. Image credit: Tullius CC BY-SA 3.0]
INTEL SLOT 1
Holograms, black boxes and mountains of cache. Ben Hardwidge recalls the weird moment in time when Intel's CPUs came in slot format
[Image: Inside a Pentium II die – that's 7.5 million transistors produced on a 350nm manufacturing process, and with no integrated L2 cache]

…package. The answer was to manufacture the CPU package in the usual square format without the L2 cache, and to then mount that package on a circuitboard that contained the cache, resulting in the Pentium II in 1997. It had 7.5 million transistors, produced on a 350nm manufacturing process.

Like the Pentium Pro, the CPU used a separate 'back-side bus' to communicate with the cache, but unlike the Pentium Pro, the Pentium II could only run the L2 cache at half the speed of the CPU. Intel attempted to counter the performance hit of this half-speed cache by first doubling the amount of L1 cache, from the Pentium Pro's 16KB to 32KB on the Pentium II. The Pentium II's L2 cache also had 16-way associativity, compared with 8-way on the Pentium Pro. A higher associativity means the CPU has a greater chance of finding the data it needs in that cache, but that it can take longer to search for it than a cache with lower associativity.

The other way Intel bumped up the Pentium II's performance was by simply equipping it with a lot of this L2 cache. All the first models of Pentium II came with a pair of large 256KB cache chips, giving you 512KB in total – more than you found on some Pentium Pro CPUs.

By the end of it, you had a circuitboard containing a full CPU package in the middle, with two large cache chips next to it. This was then encased in a box with a thermally conductive metal back. The whole package was called a single edge contact cartridge, or SECC, and you would then attach a heatsink and fan arrangement to the metal back, and slot the whole setup into your motherboard.

The SECC package looked good on the surface, but if you took one apart, you could see that it was a bit of a bodge job. I was working in a computer shop at the time, and we joked that the Pentium II was a 'Socket 7 on a circuitboard' – you could even see the solder points where the socket pins could have been located on the CPU package. It was still a normal square CPU package – it was just mounted on a board instead.

Performance was mixed. If you were running full 32-bit software in Windows 95, then the Pentium II was generally faster than the Pentium MMX, but the latter still had the edge in some 16-bit software, such as MS-DOS games. It also didn't help that the first Pentium II CPUs used the same 66MHz front side bus as the final Pentium MMX chips, with the first Pentium IIs running at 233MHz, 266MHz and 300MHz, and a 333MHz variant arriving later, following a die shrink to 250nm. This meant that, in some cases, the top-end 233MHz Pentium MMX was faster than the low-end 233MHz Pentium II.

THINK OUTSIDE THE BOX
The processor's new clothes came well and truly off in 1998, when Intel introduced its budget range of Slot 1 CPUs, with the still ridiculous name of Celeron. The first generation of Celeron CPUs, codenamed Covington, removed all of the L2 cache from the circuitboard, as well as all the fancy, hologram-clad packaging. This left you with a peculiar-looking green circuitboard with a square CPU clearly soldered into the middle of it – Intel called this non-cartridge arrangement the SEPP format.

[Image: Slot 1 Celerons didn't come in a fancy chassis, and the first models didn't come with any L2 cache either. Photo by Qurren]

The lack of cache meant these Celerons performed poorly at the time, pushing people looking for a budget CPU towards AMD's K6 line-up, which still used the aging Socket 7 form
factor that Intel had deserted. However, the next generation of Celerons in 1999, codenamed Mendocino, overturned this part of the CPU market. They were still mounted on circuitboards at first, but Intel had now nailed a method to produce a small amount of L2 cache on the same die as the CPU, running at full speed.

These new Celerons came with 128KB of full-speed on-die cache, meaning they were quicker than the 1st-gen (and much more expensive) Pentium II CPUs in some applications. By this time, Intel's next generation of Pentium II CPUs used a 100MHz front side bus, rather than 66MHz, which provided a significant performance boost over their predecessors. Accompanied by the new Intel 440BX chipset, the new CPUs ran at 350MHz, 400MHz and 450MHz, and Intel clearly hoped that this FSB tweak would help distinguish the Pentium II line-up from the new Celeron line-up, despite the latter's faster cache.

Unfortunately for Intel, overclockers had started discovering that there was plenty of headroom for some of the Mendocino Celerons to go much faster, despite Intel locking down the multipliers in an attempt to prevent it. If you put a new Celeron in an Intel 440BX board, or a board with VIA's competing 100MHz FSB Apollo Pro chipset, you could try moving the 66MHz FSB jumper to the 100MHz setting. If you were lucky, and you had a decent heatsink and fan on your CPU, your 300MHz Celeron would suddenly be running at 450MHz, thanks to its 4.5x multiplier. Combine the clock speed with the full-speed cache and your £60 processor could potentially outperform a £400 one.

I remember this well, and bought a 333MHz Mendocino Celeron with a 5x multiplier, in the hope of running it at 500MHz. It booted, but soon fell over once you got into Windows. Thankfully, my VIA Apollo Pro board also gave me the option to run the FSB at 75MHz or 83MHz if you tweaked the jumper switches right, and the latter setting stably ran my budget CPU at 415MHz. I had no need to buy a Pentium II now.

[Image: 'Slotket' adaptors enabled you to plug a Socket 370 CPU into a Slot 1 motherboard. Photo by Konstantin Lanzet]

FINAL SLOTS
While the first Mendocino Celerons were still mounted on Slot 1 circuitboards in order to maintain motherboard compatibility, their integrated L2 cache design meant they no longer technically needed the rest of the circuitboard. A few months later, the first Socket 370 Celerons started appearing, with 'Slotket' adaptors required in order to plug them into Slot 1 motherboards. It was a bizarre setup that persisted for an unusual length of time.

Intel wasn't quite ready to give up Slot 1 yet, though. It started by tweaking the design of the CPU chassis, removing the metal plate at the back. The final arrangement, called SECC2, retained the plastic front cover with the hologram, but left the circuitboard and CPU die bare at the back, in order to facilitate better thermal transfer to the cooler.

[Image: The Pentium III maintained the front with the hologram, but used the new SECC2 packaging, which left the circuitboard bare at the back]

Next came the Pentium III, codenamed Katmai, which added SSE instructions, but was still fundamentally based on the same P6 core as the Pentium II. It also still had an external half-speed L2 cache setup, with both the CPU and cache mounted on a circuitboard. It wasn't until the Coppermine revision of the Pentium III (don't be fooled by the name – all the interconnects were aluminium, rather than copper), with a die shrink to 180nm, that Intel finally integrated 256KB of full-speed L2 cache into a CPU die containing 29 million transistors.

Later came a 133MHz front side bus and Intel's 820 chipset, accompanied by high-bandwidth but expensive RDRAM. However, the Slot 1 design still persisted. Even the first Pentium III to break the 1GHz barrier was based on a slot design. Intel needed to maintain compatibility, which was handy for many of us enthusiasts who had worked out that you could still run the latest CPUs on some old 440BX boards by overclocking the front side bus to 133MHz. There was also no shortage of Slotket adaptors at this time, enabling you to install Socket 370 CPUs into Slot 1 motherboards.

The final Slot 1 CPU I saw was an engineering sample of a 1.13GHz CPU that Intel sent to PC Pro magazine, but the chip was recalled due to stability problems. The slot era was now over, and motherboards based on Intel's later SDRAM-based 815 chipset only came in Socket 370 format. The Pentium III carried on in socket format, as did the later Pentium 4, and the CPU industry hasn't looked back since. Slot processors might have looked good, and a part of me misses the fancy casing with the holograms, but there's no doubt that integrating cache directly onto the die is a much faster and more efficient way of doing it.
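The overclocking arithmetic in the stories above is simple: the core clock is just the front side bus speed multiplied by the (locked) multiplier. A quick sketch, assuming nominal FSB values (the helper function is mine, not from the article):

```python
# Celeron clock-speed arithmetic from the overclocking stories above:
# core clock = front side bus (FSB) x multiplier.

def core_clock(fsb_mhz: float, multiplier: float) -> float:
    """Return the resulting core clock in MHz."""
    return fsb_mhz * multiplier

# A 300MHz Celeron (66MHz FSB, locked 4.5x multiplier) jumpered
# to a 100MHz FSB:
print(core_clock(100, 4.5))  # 450.0

# The author's 333MHz part (5x multiplier) at the 83MHz FSB setting,
# which lands at the roughly 415MHz quoted in the text:
print(core_clock(83, 5))     # 415.0
```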
GRAPHICS

CGA
Ben Hardwidge delves into the workings of the PC's very first colour graphics adaptor

[Image: Some 16-colour ASCII art by Ben Hardwidge, aged 12]
People often nostalgically reminisce about archaic technology, laughing about the frustrations and limitations of cassette tapes or floppy disks, before adding 'but they were amazing at the time!' You simply can't hide the horror of the PC's first colour graphics adaptor (CGA) behind such rose-tinted glasses (although orange-tinted glasses might help – more on that later). Nobody, absolutely nobody, thought CGA was amazing at the time.

I had a CGA PC in the 1980s, and even then you felt disappointed when you fired up a PC game to be greeted by a mess of purple and black on the screen. At the time, we joked that CGA stood for 'crap graphics adaptor'. Nobody thought of IBM computers as games machines then, of course – CGA was the product of IBM trying to make a graphics standard that could display bar charts properly. It wasn't meant to compete with the Commodore 64.

Better graphics came to the PC later, of course, but CGA was supported for a long time. The later EGA (enhanced graphics adaptor) and VGA (video graphics array) cards were very expensive at first, so CGA still had a home in cheap IBM PC-compatible machines, such as Amstrad's PC1512. CGA first appeared in 1981, but new software was still supporting it well into the early 1990s – you can even run Windows 3.0 on it.

TEXT MODE
At its basic level, a standard 16KB (yes, KB) CGA card can access a palette of 16 colours, or rather eight colours at two intensities. It's basically 4-bit colour, with three bits allocated to red, green and blue (RGB), and the fourth bit enabling you to change the 'intensity' of the colour (RGBI).

At the first level of intensity, you get black, blue, green, cyan, red, magenta, brown and light grey. The second level of intensity basically gives you the same colours but with an extra level of intensity, which turns the brown into a yellow, the light grey into a white and the black into a dark grey, while creating light versions of the other colours.
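As an illustration of that RGBI scheme (my own sketch, not from the article – the 0xAA/0x55 channel values are the weights commonly used when converting CGA output to modern RGB), the whole palette can be generated in a few lines:

```python
# Sketch of the CGA RGBI colour scheme described above: a 4-bit
# index (intensity, red, green, blue) mapped to 8-bit RGB values.

def rgbi_to_rgb(index: int) -> tuple[int, int, int]:
    """Convert a 4-bit CGA colour index (0-15) to an (r, g, b) tuple."""
    i = (index >> 3) & 1  # intensity bit
    r = (index >> 2) & 1
    g = (index >> 1) & 1
    b = index & 1
    # Each lit channel contributes 0xAA; the intensity bit adds 0x55.
    rgb = tuple(0xAA * c + 0x55 * i for c in (r, g, b))
    # Hardware quirk: low-intensity yellow has its green component
    # halved, which is why the first-level palette contains brown.
    if index == 6:
        rgb = (0xAA, 0x55, 0x00)
    return rgb

print(rgbi_to_rgb(6))   # brown
print(rgbi_to_rgb(14))  # yellow
print(rgbi_to_rgb(7))   # light grey
```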
Now, you might think 16 colours sounds okay for 1981, but you can only display all these 16 colours on the screen at once in text mode – the mode you used to see on BIOS screens before we had fancy EFI systems. On a CGA card, the text display has an effective resolution of 640 x 200, but it can only display text characters on it, with 80 characters on the X axis, and 25 on the Y axis.

As a kid, I used to play around with this mode quite a lot, as it was the only way to get a lot of colours on the screen. If you knew your ASCII codes, you could display various lines and blocks as text characters and make a picture. You effectively have to 'type' a picture, rather than drawing it – I used to spend hours doing it. To type ASCII codes, you hold down Alt and type a three-digit number – 176, 177, 178 and 219 give you three blocks of variable shading and a solid block, for example – it still works in Windows. In this text mode, you could assign each character a foreground and a background colour.

Game developers used this mode too – I had a clone of Ms Pac-Man that used to run in text mode rather than graphics mode, as well as a clone of Breakout called Bricks. On a standard CGA card, it was the only way to get access to lots of colours. There was a trick to enable you to display all 16 colours at an effective graphical resolution of 160 x 100, by changing the number of lines of each text character to display. However, it was rarely used. If you wanted graphics rather than text, you usually either had four colours on the screen at 320 x 200, or one colour at 640 x 200.

[Image: This Breakout clone, called Bricks, was effectively built in text mode so it could access all 16 colours]

COLOUR GRAPHICS
Let's start with the former, as that was the one that enabled you to get actual colour graphics on your PC. Generally, black was the background colour, and you then had three other colours. As standard, most games used CGA in BIOS mode 4 (the default BIOS mode for graphics), with the high-intensity version of palette 1, which gave you black, white, light cyan and light magenta. It enabled you to make clearly defined shapes with black on white, gave you cyan for skies and water, and then everything else would have to be filled in with magenta. It generally looked hideous, although it was sometimes better for space games – Captain Blood looked surprisingly good in this mode.

You could get other palettes too. Palette 0 was also available in BIOS mode 4, and gave you red, green, black and brown as standard, or light red, light green, black and yellow in high-intensity mode. The latter mode generally looked better in games to me. It meant you couldn't get blue for skies, but you could do pretty sunsets and dark dungeons well. One of my favourite games to use this palette was a fantasy barbarian game called Targhan, which genuinely did look amazing considering the technology it was using.

As a kid, I also discovered a trick while playing with the night vision filters for my Dad's binoculars. If you look at the cyan, magenta, black and white palette through an orange filter, it becomes the light yellow, light red, light green and black palette. I bought some orange acetate from the local art shop and stapled it to a cardboard frame with Blu-Tack in each corner – I could then swap between palettes at will!

The low-intensity version of this palette was also used in games occasionally. One example is Pharaoh's Tomb, an early work by George Broussard at Apogee, who later went on to work on the Duke Nukem games.

TRY CGA FOR YOURSELF
In the unlikely event that you want to try out the shocking disgrace that is CGA graphics for yourself, you can do it in DOSBox (dosbox.com). This handy software creates a virtual machine designed to recreate a high-spec PC from the 1990s. It loads a sound card and MIDI drivers automatically, and gets you set up with a mouse too. It's great if you want to play a round of Doom or X-Wing.

However, later VGA cards didn't support CGA palette-switching as standard. They could run CGA software, but usually in the default black, white, magenta and cyan palette, even if they used a different palette on a CGA machine. DOSBox runs in VGA mode by default, which results in the same problem.

To get around it, you'll need to open Options in your Start menu's DOSBox folder, which takes you into the config file. Scroll down to the '[dosbox]' section, and type 'cga' after 'machine='. After that, scroll down to the '[render]' section, and type 'true' after 'aspect='.

On some monitors you may find that you still don't get the correct 4:3 aspect ratio, even after changing the aspect setting to true. If that happens, we found that setting 'fullresolution=' to '1366x768' fixed it on our 4K monitor. We have no idea why, but it seems to work.

If you want to run a really old game, it may also only be optimised for early processors, and will run too fast on DOSBox's standard settings. If you want to emulate an XT-era 8086 PC, scroll down to the '[cpu]' section and type 'simple' after 'core=' and change the number of cycles to 530 (this isn't exact, but it was near enough in our tests).

Another trick often used by game developers was to switch the CGA card to BIOS mode 5, which in high-intensity mode gave you access to a black, white, light red and light cyan palette. It had the same limitations as the default cyan, magenta, black and white palette, but to my eyes, the red looked less garish than magenta.

A few games also ventured outside these palettes with some tricks, which usually involved replacing black as the background colour. Sierra's Leisure Suit Larry in the Land of the Lounge Lizards, for example, used palette 0 at low intensity, but replaced the black background colour with blue (it looks hideous). This palette worked well in golf game World Class Leaderboard, though, with green and brown trees, red leaves, green grass, and blue skies and water – colours you should be able to take for granted. Sierra used the same trick in King's Quest IV: The Perils of Rosella, but using the BIOS mode 5 palette, again replacing the black with blue. The result was a blue, cyan, red and white palette, which worked well with blue sea against cyan sky, but meant the grass and trees looked very odd.

[Image: 1. Captain Blood – BIOS mode 4, palette 1, high intensity / 2. Formula 1 Grand Prix Circuit – BIOS mode 5 / 3. Targhan – BIOS mode 4, palette 0, high intensity / 4. Ribit (a Frogger clone) – BIOS mode 4, palette 0, low intensity / 5. World Class Leaderboard – BIOS mode 4, palette 0, low intensity, black background replaced with blue]

MONO GRAPHICS
The other main graphical option available to standard CGA cards was the 'high-resolution' 640 x 200 monochrome mode. It was used in games that had a fair amount of detail in the graphics, such as SimCity, Death Track and Xenon II: Megablast, among others. It was also used for early GUI operating systems, such as GEM and Windows 3.0. However, only the horizontal resolution was higher than the colour graphics resolution – the vertical resolution was the same. The result was double-height, rectangular pixels, rather than square ones. This mode also produced a hideous moiré effect on lots of CGA monitors, making it difficult to look at the screen.

[Image: Xenon II: Megablast in the 640 x 200 'high-resolution' monochrome mode]

COMPOSITE MODE
There was one more trick to getting a standard CGA card to display more colours, and it involved cleverly using the composite output, rather than the 9-pin RGB monitor output. Irritatingly, most PAL TVs in the UK weren't able to handle this mode, as it's dependent on the NTSC chroma decoder mistakenly seeing some luminance signals as colour.

As a result, you could effectively make new colours by lining up pixels in certain patterns on an NTSC display, and again by using different intensities. By placing one colour pixel next to another one, you could make an entirely new colour, and it looked solid rather than a messy mix of pixels. The result is astonishing, enabling you to create a much wider colour palette.

The disadvantage, of course, is that the effect can only be achieved by placing pixels next to each other, which effectively reduces the horizontal resolution from 320 to 160. Some games supported this mode, though, including Sierra's original King's Quest game.

[Image: King's Quest in composite CGA / King's Quest in RGB CGA]
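The DOSBox tweaks described in the 'Try CGA for yourself' box boil down to just a few lines of the config file. A sketch of the relevant sections (every other setting keeps its default):

```ini
# Excerpt of a DOSBox config file with the CGA settings described
# in the 'Try CGA for yourself' box.

[dosbox]
machine=cga

[render]
aspect=true
# Only if the aspect ratio still looks wrong on your monitor:
# fullresolution=1366x768

[cpu]
# Only for games that run too fast - roughly an XT-era 8086:
core=simple
cycles=530
```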
EGA
Stuart Andrews recalls how 16 colours changed the PC world

Pity the poor PC of 1983-1984. It wasn't the graphics powerhouse we know today. IBM's machines and their clones might have been the talk of the business world, but they were stuck with text-only displays or low-definition bitmap graphics. The maximum colour graphics resolution was 320 x 200, with colours limited to four from a hard-wired palette of 16. Worse, three of those colours were cyan, brown and magenta, and half of them were just lighter variations of the other half.

By this point, IBM's Color Graphics Adaptor (CGA) standard was looking embarrassing. Even home computers such as the Commodore 64 could display 16-colour graphics, and Apple was about to launch the Apple IIc, which could hit 560 x 192 with 16 colours. IBM had introduced the Monochrome Display Adaptor (MDA) standard, but this couldn't dish out more pixels, only higher-resolution mono text. Meanwhile, add-in-cards, such as the Hercules or Plantronics ColorPlus, introduced higher resolutions, but did nothing for colour depth. The PC needed more, which IBM delivered with its updated 286 PC/AT system and the Enhanced Graphics Adaptor (EGA).

[Image: Using Chips and Technologies' EGA chipset, early graphics card manufacturers such as ATi could produce smaller, cheaper boards. Credit: Vlask, CC BY-SA 3.0]

THE NEW STATE OF THE ART
The original Enhanced Graphics Adaptor was a hefty optional add-in-card for the IBM PC/AT, using the standard 8-bit ISA bus and with support built into the new model's motherboard. Previous IBM PCs required a ROM upgrade in order to support it.

It was massive, measuring over 13in long and containing dozens of specialist large scale integration (LSI) chips, memory controllers, memory chips and crystal timers to keep it all running in sync. It came with 64KB of RAM on-board, but could be upgraded through a Graphics Memory Expansion Card and an additional Memory Module Kit to up to 192KB. Crucially, these first EGA cards were designed to work with IBM's 5154 Enhanced Color Display monitor, while still being compatible with existing CGA and MDA displays. IBM managed this by using the same 9-pin D-sub connector, and by fitting four DIP switches to the back of the card to select your monitor type.

[Image: The original IBM EGA card was a whopper, even without the additional daughtercard and memory module kit. Credit: Vlask, CC BY-SA 3.0]

EGA was a significant upgrade from low-res, four-colour CGA. With EGA, you could go up to 640 x 200 or even (gasp) 640 x 350. You could have 16 colours on the screen at once from a palette of 64. Where once even owners of 8-bit home computers would have laughed at the PC's graphics capabilities, EGA and the 286 processor put the PC/AT back in the game.
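Those memory tiers make sense once you work out the framebuffer sizes involved. A back-of-the-envelope sketch (my own arithmetic, ignoring EGA's bit-plane memory layout):

```python
# Rough framebuffer sizes for the EGA modes mentioned above.
# 16 colours = 4 bits per pixel.

def framebuffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Return the raw framebuffer size in bytes."""
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(320, 200, 4))  # 32000 - fits easily in 64KB
print(framebuffer_bytes(640, 350, 4))  # 112000 - why the top 16-colour
                                       # mode needed the memory upgrades
```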
BIRTH OF AN INDUSTRY
However, EGA had one big problem: it was prohibitively expensive, even in an era when PCs were already astronomically expensive. The basic card cost over $500 US, and the Memory Expansion Card a further $199. Go for the full 192KB of RAM and you were looking at a total of nearly $1,000 (approximately £2,600 inc VAT in today's money), making the EGA card the RTX 3090 of its day, and only slightly more readily available. What's more, the monitor you needed to make the most of it cost a further $850 US. EGA was a rich enthusiast's toy.

However, while the initial card was big and hideously complex, the basic design and all the tricky I/O stuff were relatively easy to work out. Within a year, a smaller company, Chips and Technologies of Milpitas, California, had designed an EGA-compatible graphics chipset. It consolidated and shrunk IBM's extensive line-up of chips into a smaller number, which could fit on a smaller, cheaper board. The first C&T chipset launched in September 1985, and within a further two months, half a dozen companies had introduced EGA-compatible cards.

Other chip manufacturers developed their own clone chipsets and add-in-cards too, and by 1986, over two dozen manufacturers were selling EGA clone cards, claiming over 40 per cent of the early graphics add-in-card market. One, Array Technology Inc, would become better known as ATI, and was later swallowed up by AMD. If you're on the red team in the ongoing GPU war, that story starts here.

CHANGING GAMES
EGA also had a profound impact on PC gaming. Of course, there were PC games before EGA, but many were text-based or built to work around the severe limitations of CGA. With EGA, there was scope to create striking and even beautiful PC games.

This didn't happen overnight. The cost of 286 PCs, EGA cards and monitors meant that it was 1987 before EGA support became common, and 1990 before it hit its stride. Yet EGA helped to spur on the rise and development of the PC RPG, including the legendary SSI 'Gold Box' series of Advanced Dungeons and Dragons titles, Wizardry VI: Bane of the Cosmic Forge, Might and Magic II and Ultima II to Ultima V.

It also powered a new wave of better-looking graphical adventures, such as Roberta Williams' King's Quest II and III, plus The Colonel's Bequest. EGA helped LucasArts to bring us pioneering point-and-click classics such as Maniac Mansion and Loom in 16 colours. And while most games stuck to a 320 x 200 resolution, some, such as SimCity, would make the most of the higher 640 x 350 option.

What's more, EGA made real action games on the PC a realistic proposition. The likes of the Commander Keen games proved the PC could run scrolling 2D platformers properly. You could port over Apple II games such as Prince of Persia, and they wouldn't be a hideous, four-colour mess.

And when the coder behind Commander Keen – a certain John Carmack – started work on a new 3D sequel to the Catacomb series of dungeon crawlers, he created something genuinely transformative. Catacomb 3-D and Catacomb Abyss gave Carmack his first crack at a texture-mapped 3D engine, and arguably started the FPS genre.

Sure, EGA had its limitations – looking back, there's an awful lot of green and purple – but with care and creativity, an artist could do a lot with 16 colours and begin creating more immersive game worlds.

Forgive the blocky pixels and 16-colour palette. In Catacomb 3-D and Catacomb Abyss lay the seeds of Wolfenstein and Doom

A SLOW DECLINE
EGA's time at the top of the graphics tech tree was short. Home computers kept evolving, and in 1985, Commodore launched the Amiga, supporting 64 colours in games and up to 4,096 in its special HAM mode. Even as it launched EGA, IBM was talking about a new, high-end board, the
Professional Graphics Controller (PGC), which could run
screens at 640 x 480 with 256 colours from a total of 4,096.
PGC was priced high and aimed at the professional
CAD market, but it helped to pave the way for the later
VGA standard, introduced with the IBM PS/2 in 1987. VGA
supported the same maximum resolution and up to 256
colours at 320 x 200. This turned out to be exactly what
was needed for a new generation of operating systems,
applications and PC games.
What extended EGA’s lifespan was the fact that VGA
remained expensive until the early 1990s, while EGA had
developed a reasonable install base. Even once VGA hit the
mainstream, many games remained playable in slightly
gruesome 16-colour EGA. Much like the 286 processor and
the Ad-Lib sound card, EGA came before the golden age of
PC gaming, but this standard paved the way for the good
stuff that came next.
GRAPHICS
VGA
Stuart Andrews looks at the tech that transformed
the PC into a gaming and graphics powerhouse,
256 colours at a time
The technology that put PC graphics firmly on the
map arrived in April 1987 as part of IBM’s PS/2 line
of PCs. IBM saw the PS/2 as the answer to its
biggest problems, putting Big Blue (as we all used to call it)
back in control of the PC architecture and one step ahead of
the clone manufacturers.
To do so, it had Intel’s latest processors, cutting-edge
connection options and the fastest floppy disk storage, not
to mention a revolutionary new high-bandwidth system
bus. But what turned out to be the PS/2's most important feature was the higher-end option of its two new graphics standards. The cheaper PS/2 models were stuck with the Multi Colour Graphics Adapter (MCGA), which had the same 256-colour mode but lacked VGA's higher resolutions.

Like IBM's new MCA bus architecture, MCGA didn't last long beyond the PS/2, but VGA developed a life of its own. Beyond hardware-level support for smooth scrolling, and a barrel shifter designed to shift incoming data from the CPU to the display at seven bits at a time, it didn't actually do much in the way of graphics acceleration.

However, it did set a new baseline standard for PC graphics, and for hardware and software support. Crucially, through its RAMDAC and 15-pin D-Sub connector, it established how the PC could convert digital instructions into a 256-colour analogue video signal, setting the stage for the 16-bit and 24-bit colour standards to come.

Instead of sending six colour signals from the graphics card to the monitor, like the older EGA chipsets, the VGA chipset and its RAMDAC sent only three signals – red, green and blue, with a potential 64 different levels for each. For VGA, this resulted in an 18-bit palette of up to 262,144 colours, 256 of which could appear simultaneously in Mode 13h. Once adopted, this same core technology gave scope for 16-bit and 24-bit colour in later graphics chips, with up to 65,536 colours or 16.7 million colours on the screen at once.

Resolution wasn't the base level VGA spec's strength. In fact, PC journalists of the time pondered why it was stuck at 320 x 200 in Mode 13h. However, programmers found workarounds. A handful of games, such as the legendary horror game Dark Seed, opted to work with a reduced 16-colour palette in order to use the full 640 x 480 resolution. Meanwhile, Michael Abrash, who would later work with id Software on Quake, worked out an approach that enabled programmers to use 256 colours at a slightly higher resolution of 320 x 240, which he dubbed Mode X.

Meanwhile, Windows 2.0 moved to adopt the 640 x 480 mode with 16 colours, bringing the interface closer to what we expect from a GUI today. However, many of the applications and games we think of as belonging to the VGA era stuck to Mode 13h and its 320 x 200 resolution. What's more, with the CPU performing most of what we'd now call the GPU's legwork, this was arguably for the best – until the Intel 486 appeared in 1989, there wasn't really any CPU powerful enough to handle gaming at higher resolutions.

ATi was one of many graphics chip and card manufacturers to first clone VGA, then enhance it. Credit: Samuel Demeulemeester, CC 4.0

Eye of the Beholder – VGA's larger colour palette (right) gave artists the chance to use more realistic shading compared with EGA (left)

THE IMPACT OF VGA
Luckily, those colours alone had a huge impact. The ZSoft Corporation's PC Paintbrush and Electronic Arts' Deluxe Paint II revolutionised professional graphics and computer art on the PC, thanks to 256-colour support. VGA also made CorelDRAW, launched in January 1989, a realistic alternative to the digital design packages appearing on Apple's computers.

Meanwhile, for PC games, VGA was nothing short of transformative. Sure, the 64,000 pixels on your monitor looked a little chunky; however, with 256 colours, the artists working at leading developers, such as LucasArts, Sierra Online, Microprose, Electronic Arts and Origin Systems, were able to produce sprites that looked more like recognisably human (or inhuman) characters, and background scenery that could bring their game worlds to life. Plus, while the PC couldn't pull off the same smooth scrolling, sprite-scaling tricks as the Commodore Amiga or 8-bit consoles, its best games were developing a visual richness of their own. As the PC moved into the 386 era, it was beginning to be taken seriously as a gaming machine.

Taken on its own, the first VGA chipset wouldn't have made such an impact. After all, you only got to use it if you bought a pricey IBM PS/2 machine. Instead, it really only gained momentum once it began to appear in add-in cards.
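The numbers quoted above are easy to verify with a little back-of-envelope arithmetic (the function names here are ours, not part of any real VGA programming interface):

```c
#include <assert.h>

/* The VGA DAC holds 6 bits per gun, so 64 levels per channel -
   64 * 64 * 64 = 262,144 colours, an 18-bit palette. */
unsigned long vga_palette_size(void)
{
    return 64UL * 64UL * 64UL;
}

/* Scale an 8-bit colour component down to a 6-bit DAC level (0-63). */
unsigned char dac_level(unsigned char c8)
{
    return (unsigned char)(c8 >> 2);
}

/* Mode 13h: 320 x 200 pixels, one byte per pixel. */
unsigned long mode13h_frame_bytes(void)
{
    return 320UL * 200UL;   /* 64,000 bytes */
}
```

A 64,000-byte frame fits comfortably inside one 65,536-byte real-mode segment at 0xA000:0000, which is much of the reason Mode 13h was so pleasant to program compared with every planar mode around it.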
IBM was first out of the gate with its PS/2 Display Adapter, a card that gave any reasonably modern IBM-compatible PC with ISA slots a VGA chipset for the princely sum of $599 US (about £420 inc VAT then and £1,200 inc VAT in today's money).

The Secret of Monkey Island – moving to VGA (right) enabled PC developers to create more human-like characters compared with EGA (left)

Yet by this point, the older EGA standard had spawned a growing industry of third-party manufacturers, adept at mimicking or reverse-engineering IBM's technology and spawning their own versions. What's more, these guys didn't stop at simply replicating IBM's latest standards; they wanted to add a little extra sauce to their cards by actively enhancing them.

As a result, October 1987 saw the launch of the first VGA-compatible third-party graphics card, the STB VGA Extra. It did everything VGA did, albeit with a few foibles here and there, with some optimisations that made it slightly faster. By mid-1988 to 1989, the likes of Tseng Labs, Cirrus Logic, Chips and Technologies and ATi were entering the fray, and not only were they driving prices down to $339 US, but they were also adding new capabilities. These enhanced VGA cards added features to accelerate video, or increased the RAM to 512KB, and tinkered with the BIOS to cover more advanced resolutions, such as 800 x 600 in 16 colours or 640 x 480 with 256 colours.

Dark Seed sacrificed colour depth for resolution, in order to do justice to H.R. Giger's artwork

This in turn put pressure on the system bus. The original VGA controllers were so undemanding that they couldn't exhaust the miserable bandwidth of the 8-bit ISA bus, but as these new chipsets emerged, they required more bandwidth and a spot on the wider 16-bit ISA bus.

As time went on and Intel's CPUs grew faster, demands would grow accordingly, resulting in the development of the Extended ISA (EISA) bus and VESA Local Bus. However, this complicated the situation further, with the fastest enhanced VGA cards, based on Tseng Labs or Cirrus Logic tech, performing best in 16-bit versions running on the 16-bit ISA bus, although this wasn't always the case with every chipset.

By 1989, NEC would lead the early graphics chipset manufacturers in the creation of the Video Electronics Standards Association and the Super VGA BIOS, opening up support for higher resolutions and colour depths across the PC industry. Windows acceleration became the new battleground and video acceleration became the next cutting-edge technology.

Yet all these new cards and advanced feature sets still had the VGA standard at their core. VGA became the base requirement for new PCs running later versions of Windows or IBM's OS/2. In many respects, IBM had built the foundation of PC graphics for the next ten to 15 years. In fact, you could argue that VGA is still the foundation.

If so, it probably wasn't a whole lot of comfort to IBM. While VGA was the last graphics standard IBM managed to
establish, it wasn’t for the want of trying. Even as it launched
VGA, it was preparing its 8514 graphics adaptor, with fixed
functions to accelerate common 2D drawing processes,
such as drawing lines or filling shapes with colour. In
1990, it hoped to supersede VGA with its new 1,024 x 768,
256-colour standard, XGA.
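Line drawing is a good example of what those fixed functions replaced: in software, every line meant the CPU grinding through something like the classic integer Bresenham loop below. This is an illustrative sketch of the workload, not 8514 code:

```c
#include <stdlib.h>

#define W 16
#define H 16
unsigned char fb[W * H];   /* tiny stand-in frame buffer */

/* Classic all-octant integer Bresenham line - the kind of per-pixel
   inner loop that 2D accelerators like the 8514 took off the CPU. */
void draw_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;

    for (;;) {
        fb[y0 * W + x0] = 1;           /* plot the current pixel */
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```

Calling `draw_line(0, 0, 15, 15)` marks the full diagonal of the little buffer; a fixed-function card accepted the two endpoints and did the rest itself.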
Both these new standards floundered because they
were designed to run on IBM’s MCA bus, while IBM’s
clone-making rivals focused on getting the most out of the
existing 16-bit ISA bus, before working on the proposed
EISA replacement. The result? Super VGA became the new
de facto standard, while IBM lost its domination of the PC
industry. Bad news for Big Blue, but good news for those of
us who enjoyed the more cost-conscious, game-focused
machines in the years that followed.
THE VERY BEST OF EARLY VGA
Coming at you in 256 glorious colours at 320 x 200
GRAPHICS
3DFX
VOODOO 3D
Reflective surfaces, smooth frame rates and the
pure awesomeness of GLQuake. Stuart Andrews
recalls the truly transformative effect of 3Dfx’s
Voodoo chipset on PC gaming
It's a classic case of being the right company with the right tech at the right time. 3Dfx launched its revolutionary Voodoo Graphics chipset just as fully polygonal 3D graphics hit the mainstream and PC gamers wanted an easy and accessible way to get them.

In late 1996, Quake and Tomb Raider had just been released, the Nintendo 64 was out in Japan and North America, and the Sony PlayStation and Sega Saturn were still in their first year. Reliant purely on CPU horsepower, and with no dedicated 3D hardware to back it up, the PC was beginning to lose its place as the king of gaming platforms. Sure, it had a bunch of 2D/3D accelerator cards, but they were too damn slow to make any difference. With the Voodoo Graphics chipset, 3Dfx played a bigger role than any other graphics hardware manufacturer in turning around that situation. In doing so, it made 3D acceleration an absolute, cast-iron must-have feature.

In 1993, Pellucid was bought by Media Vision, a company that had grown rich from selling multimedia kits for PCs during the CD-ROM revolution. Pellucid had proposed the design and manufacture of a PC 3D gaming chip, and Media Vision wanted some of that action.

Unfortunately, Media Vision had its own (mostly legal) issues, and went out of business. However, just when the situation looked bleak, Scott Sellers met Gordon Campbell, founder of the pioneering graphics chip manufacturer, Chips & Technologies. Campbell asked the trio what they wanted to do, and helped them to find the venture capital to do it.

With Smith working as vice president of sales and marketing, Sellers and Tarolli used all the know-how that they'd built up at SGI and Pellucid to design a cost-efficient 3D architecture built specifically to handle the polygonal rendering pipeline used in 3D games.

If 3Dfx needed a killer app, GLQuake delivered. You could play id's cutting-edge 3D title at 640 x 480 in 16-bit colour at a smooth 30fps
– a bank of 2MB of high-bandwidth (for the time) EDO RAM,
and the resulting scanlines were fed out to a DAC, which
output to a good, old-fashioned analogue VGA output.
GRAPHICS
POWER VR
Ben Hardwidge catches up with the PowerVR folks
from Imagination Technologies (formerly VideoLogic),
to discuss early PC 3D accelerators
Back when PCs were still in horrible beige boxes, John Major was nasally shouting over the despatch box and Nvidia was just a glint in Jensen Huang's eye, VideoLogic (now Imagination Technologies, the firm also behind PURE radios) started work on the PowerVR project. It resulted in some of the first PC 3D accelerators and, since then, PowerVR has become a mobile GPU system of choice, found in the iPhone 7 and numerous Android phones.

I headed up to Imagination Technologies' HQ in Kings Langley to chat with some of the folks who worked on the original PC PowerVR cards. I'm taken to a meeting room, where a spread of PC relics from the early 1990s to the 2000s is laid out. They include never-released products, including the Kyro 3 and various pre-release boards, as well as some classics.

VideoLogic's Apocalypse 3DX was the first mainstream PowerVR PC 3D accelerator

WHERE DID IT START?
Simon Fenny, PowerVR Research Fellow, picks up the first one – an enormous PCB with a 16-bit ISA interface. 'The whole PowerVR project started in July 1992,' he says, 'and in about early 1993, this first card first came out – it would have been in a 486 PC, so not very good floating point performance. We had a Texas Instruments DSP on there to do all the transform and lighting. This board would later do tile-based deferred rendering, with real-time shadows, and proper 3D volume shadows, but it didn't have texturing, because it was hard enough to fit that all onto one chip.'

Tile-based deferred rendering is the key to PowerVR. 'Tile-based rendering and deferred rendering are two separate things,' explains Kristof Beets, Senior Director, Product Management & Technology Marketing, PowerVR. 'Most of our competitors today have some form of tile-based rendering. Fundamentally, that means you bucket your geometry, so instead of rendering triangle by triangle, you first sort your triangles and then render each tile.

'The key benefit is local processing. The further your data goes, the more power it uses. If you keep it very tight, it's much more efficient. Memory loves big transactions, so blasting a tile and loading the texture data for a tile is really effective.

'The reason why we're still so good at tiling is because of all the clever algorithms and data structures that go behind it, which Simon and those guys came up with in the 1990s – it's how you sort triangles effectively into those buckets.' Basically, the work done on the early PC 3D accelerators is still useful in smartphones today.

The next part is deferred rendering, a benefit of which is that you can identify objects that are hidden behind other objects before shading them, so you only shade the objects you can see. 'It's like painting by numbers,' says Fenny. 'Imagine you're drawing your triangles, and instead of filling in colours you say, "This is triangle 1, that's triangle 5 and that's triangle 6." You then say, "Okay, send those off and fill in all the 1s. Oh what's the next one? 3 is the next one – do those", within each tile.
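Beets' 'bucketing' can be sketched in a few lines of C. The version below bins a triangle into screen tiles by its bounding box – the simple scheme he contrasts later with PowerVR's perfect tiling – and the 32-pixel tile size is our assumption for illustration, not a PowerVR figure:

```c
#define TILE 32   /* illustrative tile size, not PowerVR's actual value */

/* Bin a triangle into screen tiles using its axis-aligned bounding box.
   Returns the number of tiles the box touches; a real binner would also
   append the triangle to each of those tiles' lists. */
int bin_triangle(const int x[3], const int y[3])
{
    int minx = x[0], maxx = x[0], miny = y[0], maxy = y[0];
    for (int i = 1; i < 3; i++) {
        if (x[i] < minx) minx = x[i];
        if (x[i] > maxx) maxx = x[i];
        if (y[i] < miny) miny = y[i];
        if (y[i] > maxy) maxy = y[i];
    }
    int tx0 = minx / TILE, tx1 = maxx / TILE;
    int ty0 = miny / TILE, ty1 = maxy / TILE;
    return (tx1 - tx0 + 1) * (ty1 - ty0 + 1);
}
```

Once every triangle is binned, the chip can render one tile at a time entirely on-chip – the 'local processing' Beets describes – before moving on to the next bucket.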
'If something else is behind you, don't bother even shading that. If you just have a normal tile renderer, it might be local, but you still end up drawing a car behind something else, and then a wall over the top. Why would you bother spending all that effort? Some other people will sort things so that it works properly, but it's expensive to sort things.'

An early ISA Rapier 24 card – the big gold Texas Instruments chip handles transform and lighting

THE FIRST PC CARDS
VideoLogic was initially targeting the arcade market with PowerVR, but as PC tech progressed, the team soon turned to looking at the PC. 'Thankfully, the Pentium had come along with the PCI bus,' says Fenny, 'so we were able to do the transform and lighting on the Pentium. We'd send the models over the PCI bus into the chip, which would then render it. These cards would basically mix the signal coming in from the VGA card.'

The first mainstream product based on this tech was the VideoLogic Apocalypse 3DX. This mixing of the signal was a key part of the PowerVR formula at the time. The first 3dfx Voodoo and PowerVR cards were dedicated 3D accelerators, meaning you needed a second '2D' graphics card to output a display to your monitor.

Voodoo cards needed a VGA analogue loopback cable between your 3D card and 2D card. PowerVR cards did it much more cleanly (at least from a hardware perspective), mixing the signal over the PCI bus. 'We realised that, with the PCI bus, you could not only write things in, but you could burst things right out,' says Fenny. 'Because it was tile-based rendering, if you finished your tile completely you could do that and be really efficient on the bus.'

3dfx wasn't using tile-based rendering, and its Voodoo cards used a Z-buffer to solve the visibility problem. It's a situation that not only meant Voodoo cards had to use loop-back cables, but they also had to allocate some of their frame buffer memory to the Z-buffer. That's why 1st-gen Voodoo cards are limited to 16-bit colour at 640 x 480, while PowerVR cards could go higher.

'If you turn off Z-buffering, which means a lot of messing around in software, 3dfx could get at 800 x 600 in 16-bit,' says Fenny, 'but we were streaming at 24 bits per pixel.'

One area where 3dfx had the upper hand was system requirements. You could get decent performance from a Voodoo card with a Pentium 90, but a PowerVR card needed a beefier CPU to get the most out of it.

THE DREAMCAST
PowerVR was on a roll, and it had caught the eye of Sega while it was developing the Dreamcast. 'I remember being in a couple of meetings, saying it does this and this, and they just looked at us thinking, "That's not possible,"' says Fenny. 'There was a great deal of excitement. We were adding texture compression. We had hardware ordering-dependent translucency, which is still difficult to do now.'

What's hardware ordering-dependent translucency? 'If you ever have to write a game where you have lots of layers of translucent objects, which are in random order on the screen, you have to make sure you do them in back-to-front order,' says Fenny.

Beets informs Fenny that these days developers write a quick-sort in a shader program to deal with it. 'No! Yuck!' he

A VideoLogic Apocalypse 3DX – still in its shrink-wrap at Imagination Technologies
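That 640 x 480 ceiling follows from simple arithmetic against the Voodoo's 2MB of frame buffer RAM: double-buffered 16-bit colour plus a 16-bit Z-buffer fits at 640 x 480 but not at 800 x 600, while dropping the Z-buffer squeezes 800 x 600 in. A back-of-envelope sketch (our figures and function name, not 3dfx's actual memory map):

```c
#define VOODOO_VRAM (2UL * 1024UL * 1024UL)   /* the 2MB frame buffer bank */

/* Bytes needed for front and back colour buffers at 16 bits per pixel,
   plus an optional 16-bit Z-buffer. */
unsigned long bytes_needed(unsigned long w, unsigned long h, int zbuffer)
{
    unsigned long per_surface = w * h * 2UL;              /* 16bpp surface */
    unsigned long surfaces = 2UL + (zbuffer ? 1UL : 0UL); /* front, back, Z */
    return per_surface * surfaces;
}
```

At 640 x 480 with Z the total is 1,843,200 bytes, just under 2MB; at 800 x 600 with Z it's 2,880,000 bytes, well over – but 1,920,000 bytes without the Z-buffer, which is exactly the trade-off Fenny describes.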
responds. 'It was funny watching some people trying to port the Dreamcast games onto, say, the PlayStation. You'd see the early examples, and all the translucency would be wrong, because the games were designed with the hardware doing it all for you. It did help that we had control over the API, because DirectX was kind of limited to Z-buffer rendering.'

The next PowerVR PC product was the Neon 250, based on some of the tech in the Dreamcast, and an all-in-one 2D/3D AGP card. 'The original version of the Neon product had no fan on it, and we found it really hard to sell in 1999,' muses David Harold, VP Marketing Communications. 'People basically thought, well it has no fan so it must be underpowered compared to Nvidia. So the next version of the board had a fan on it, which was the cheapest fan we could find in China, because it's essentially cosmetic.'

VideoLogic's Vivid! card – based on the 1st-gen Kyro chip

KYRO
The final push for PowerVR on the desktop PC was the Kyro series. Fenny laments that the Kyro series saw hardware ordering-dependent translucency removed from hardware. The industry was moving towards standardised APIs, rather than proprietary ones, and that meant compromising on some hardware features. 'We'd say, "We're doing translucency sorting" to DirectX developers and some would say, "What? No, that's not possible." Others said, "Yeah, it would be great to use it, but there are cards that can't possibly use it, so we're not going to develop for it."'

Kyro also saw the introduction of PowerVR's 'perfect tiling' technique. 'We figure out exactly which tiles an object is in,' explains Beets. 'What our competitors do is bounding boxes, but a box covers a lot more area than a triangle.'

A Guillemot Hercules 3D Prophet 4000XT card, based on Kyro 2

Next came the Kyro 2, with a die shrink and an increase in clock speed. I was working for PC Pro magazine at the time, and reviewed the Kyro 2. It wasn't as quick as Nvidia's top-end GeForce2 chips, but it happily beat the GeForce2 MX's performance for a similar price.

Nvidia wasn't happy, and briefed industry partners against the Kyro 2. A leaked PowerPoint presentation showed Nvidia lambasting Kyro 2's driver support, rendering quality, Z-buffer issues and lack of hardware transform and lighting. The presentation's conclusion was damning: 'Buying Kyro 2 is a risk – and when cards and PCs get returned, it damages your finances and your reputation.' Understandably, there's not a great deal of love for Nvidia among the PowerVR folks.

The back of a Kyro 3 card, which never made it to market

LEAVING THE DESKTOP
Fenny shows me the card that would have been the Kyro 3, but it never made it to market – the reasons are kept off the record. I ask why we've never seen a PowerVR desktop product since. 'We were very nervous,' says Harold. 'We looked at the market, and thought, "There are five console makers, and Panasonic is going to be out of this business in five minutes, then there's going to be four and some day there will be three. And in every generation you have to win a slot."

'After Dreamcast, we talked very seriously about doing a console with somebody else, and realised that every single engineering resource we had would have to go on that project. Then, if we lost that slot to whoever in the next generation, we would have no customer.

'It's the same with the PC market. When we started, there were 50+ companies making devices for PC boards, and that figure was shrinking – not yearly; it was practically shrinking weekly. We looked at the market and just thought, if we keep going after PC and console, we're never going to have enough customers to make our business resilient.

'At the time, we said that one day we'd come back to those markets, but ultimately, you're driven by what your customer wants to make.'
GRAPHICS
NVIDIA
GEFORCE
Lights, transform, action! Ben Hardwidge
recalls the very first ‘GPU’
It's testament to Nvidia's marketing
team that one of its buzzwords has now
slipped into common parlance. Not only did
Nvidia’s 1st-gen GeForce 256 introduce us to its now
famous ‘GeForce’ gaming graphics brand, but it also
brought the term ‘GPU’ to the PC with it. An initialism
that we now use as shorthand for any PC graphics
chip, or even a whole graphics card, started life as an
Nvidia marketing slogan.
To give you an idea of how long ago this was, I was
introduced to the term ‘GPU’ by a paper press release
the same week I started my first tech journalism job
in September 1999. We didn’t get press releases via
email then – they were physically posted to us, and
the editorial assistant sorted them all into a box for the
team to peruse.
‘In an event that ushers in a new era of interactivity A VisionTek GeForce 256 card with SDR memory
for the PC, Nvidia unveiled today the GeForce 256, the
world’s first graphics processing unit (GPU)’, it said. At the by the CPU. The first stage is the geometry, where the CPU
time, I thought it seemed pompous – how could this relative works out the positioning (where polygons and vertices sit
newcomer to the 3D graphics scene have the nerve to think in relation to the camera) and lighting (how polygons will
it could change the language of PC graphics? But I now see look under the lighting in the scene). The former involves
that it was a piece of marketing genius. Not only did ‘GPU’ mathematical transformations, and is usually referred
stick for decades to come, but it also meant Nvidia was the to as ‘transform’, with the two processes together called
only company with a PC ‘GPU’ at this point. ‘transform and lighting’ or T&L for short.
Once the geometry is nailed, the next step is to fill in the
TRANSFORM AND LIGHTING areas between the vertices, which is called rasterisation,
Nvidia’s first ‘GPU’ did indeed handle 3D graphics quite and pixel processing operations, such as depth compare
differently from its peers at the time, so it’s time for a little and texture look-up. This is, of course, a massive
history lesson. If we want to understand what made the first oversimplification of the 3D graphics pipeline of the time, but
GeForce GPU so special, we first have to take a look at 3D it gives you an idea. We started with the CPU handling the
pipelines of the time. whole graphics pipeline from start to finish, which resulted in
It was October 1999, and the first 3D accelerators had low-resolution, chunky graphics and poor performance.
only been doing the rounds for a few years. Up until the We then had the first 3D accelerators, such as the 3dfx
mid-1990s, 3D games such as Doom and later Quake were Voodoo and VideoLogic PowerVR cards, which handled
rendered entirely in software by the CPU, with the latter the last stages of the pipeline (rasterisation and pixel
being one of the first games to require a floating point unit. processing), and massively improved the way games
If you want to display a 3D model, it has to go through looked and performed, while also ushering in the wide use
the graphics pipeline, which at this stage was all handled of triangles rather than polygons for 3D rendering. With the
42
just as well with software T&L. DirectX 7 also didn’t require
hardware-accelerated T&L to run – you could still run
DirectX 7 games using software T&L calculated by the CPU,
it just wasn’t as quick.
The GeForce was still a formidable graphics chip whether
you were using hardware T&L or not though. Unlike the 3dfx
Voodoo 3, it could render in 32-bit colour as well as 16-bit (as
could Nvidia’s Riva TNT2 before it), it had 32MB of memory
compared to the more usual 16MB, and it also outperformed
its competitors in most game tests by a substantial margin.
ATi’s response at the time was a brute-force approach,
putting two of its Rage 128 Pro chips onto one PCB to make
the Rage Fury Maxx, using alternate frame rendering
(each graphics chip handled alternate frames in sequence
– note how I’m not using the term ‘GPU’ here!) to speed
up performance. I tested it shortly after the release of the
GeForce 256 and it could indeed keep up.
Nvidia’s GeForce CPU no longer having to handle all these operations, and THE GPU WINS
256 was the first dedicated hardware doing the job, you could render 3D The Rage Fury Maxx’s limelight was cut shortly afterwards,
consumer graphics
games at higher resolutions with more detail and faster though, when Nvidia released the DDR version of the
chip to handle the
whole 3D graphics frame rates. At this point, the CPU was still doing a fair GeForce in December 1999, which swapped the SDRAM
pipeline, including amount of work though. If you wanted to play 3D games, used on the original GeForce 256 with high-speed DDR
the transform and you still needed a decent CPU. memory. At that point, Nvidia had won the performance
lighting stages
Nvidia aimed to change this situation with its first ‘GPU’, battle – nothing else could compete.
which could process the entire 3D graphics pipeline, including the initial geometry stages for transform and lighting, in hardware. The CPU's only job then was to work out what should be rendered and where it goes.

It also took a while for everyone else to catch up, and at this point, various people in the industry were still swearing that the ever-increasing speed of CPUs (we'd just passed the 1GHz barrier) meant that software T&L would be fine – we could just carry on with a partially accelerated 3D pipeline.

At that point Nvidia had won the performance battle – nothing else could compete

BATTLE OF THE PLANETS
As with any new graphics tech, of course, the industry didn't instantly move towards Nvidia's hardware T&L model. At this point, the only real way to see it in action in DirectX 7 was to run the helicopter test at the start of 3DMark2000, although some games using OpenGL 1.2 also supported it. The latter included Quake III Arena, but the undemanding nature of this game meant it practically ran well without it.

When 3dfx was building up to the launch of the Voodoo 5 in 2000, I remember it having an FAQ on the website. Asked whether the Voodoo 5 would have software T&L support, 3dfx said, 'Voodoo4 and Voodoo5 have software T&L support.' It's not deliberately dishonest, as every 3D graphics card could support software T&L at this time – it was done by the CPU – but it looked as though the answer was there to sneakily suggest feature parity with the GeForce 256.

In fact, the only other graphics firm to come up with a decent competitor in reasonable time was ATi, which released the first Radeon half a year later, complete with hardware T&L support. Meanwhile, the 3dfx Voodoo and VideoLogic PowerVR lines never managed to get hardware T&L support on the PC desktop, with the Voodoo 5 and Kyro II chips still running T&L in software.

But 3dfx was still taking a brute-force approach – chaining VSA-100 chips together in SLI configuration on its forthcoming Voodoo 5 range. The Voodoo 5 5500 finally came out in the summer of 2000, with two chips, slow SDRAM memory and no T&L hardware. It could keep up with the original GeForce in some tests, but by that time Nvidia had already refined its DirectX 7 hardware further and released the GeForce 2 GTS.

With no T&L hardware, 3dfx fought back with a brute-force, multi-chip approach on the Voodoo 5 5500. Photo credit: Konstantin Lanzet

By the end of the year, and following a series of legal battles, 3dfx went bust and its assets were bought up by Nvidia. GeForce, and the concept of the GPU, had won.
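For a sense of the work the GeForce 256 pulled off the CPU, here's a toy sketch of the two geometry stages it accelerated – transforming a vertex by a matrix and computing simple per-vertex diffuse lighting. The function names and matrix layout are illustrative only, not any real API.

```python
# A toy version of the per-vertex work a hardware T&L unit performs:
# transform each vertex by a 4x4 matrix, then compute simple diffuse
# (Lambert) lighting from a unit normal and light direction.

def transform(vertex, matrix):
    """Treat the 3D vertex as a homogeneous row (x, y, z, 1) and
    multiply it by a 4x4 matrix, returning the transformed (x, y, z)."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    return tuple(sum(v[i] * matrix[i][j] for i in range(4)) for j in range(3))

def lambert(normal, light_dir):
    """Diffuse intensity: dot product of unit normal and light
    direction, clamped so surfaces facing away get zero light."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

# A matrix that translates 2 units along X (translation in the bottom row).
translate_x = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [2, 0, 0, 1],
]

moved = transform((1.0, 0.0, 0.0), translate_x)  # -> (3.0, 0.0, 0.0)
lit = lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))  # light head-on -> 1.0
```

Before the GeForce 256, this arithmetic ran on the CPU for every vertex in every frame; moving it into fixed-function silicon is what freed the CPU to do nothing but decide what to draw.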
SUBSCRIPTION OFFER
SUBSCRIBE TODAY! GET 3 ISSUES FOR £5
Web: custompc.co.uk/subscribe
Phone: 01293 312182
Email: [email protected]
Post: Subscriptions, Unit 6 The Enterprise Centre, Kelvin Lane, Manor Royal, Crawley, West Sussex, RH10 9PE
RETROGRADE
SOUND
THE SOUND
BLASTER STORY
Ben Hardwidge talks to Creative Technology founder and CEO, Sim Wong Hoo,
about the development of the iconic Sound Blaster brand
Now celebrating its 30th birthday, the Sound Blaster made a massive impact when it was launched back in 1989. It seems bizarre now, but at that time, gaming was still considered to be a frivolous novelty for the PC, which was primarily a business machine. While the Atari ST and Commodore Amiga had half-decent sound capabilities, most PCs came equipped with only a mono PC speaker, which simply blurted out chirps and beeps like an excitable 1970s telephone. PC audio was terrible.

The first Sound Blaster, codenamed 'Killer Card', was launched in 1989, combining MIDI synthesis with 23KHz audio playback

If you wanted proper music in your games then you needed a MIDI card. Rather than playing back a music recording like current games, MIDI music is a bit like a Word document. In a Word document, the fonts are stored somewhere else, and the Word file just stores the formatting, meaning you can store a huge number of words and pages in a very small file size. In the same way, with MIDI, you have the sounds stored on a synthesiser card, and a game's music file just tells it which sounds to play and when.

This started with basic FM synthesisers such as Yamaha's OPL2, which modulated frequencies to simulate instruments, and then later went up to 'wavetables' of sampled instruments to create much more realistic-sounding music.

In the days before we had very powerful CPUs and masses of storage space, this meant complicated musical scores could be performed in games using tiny files, without needing masses of processing power, or a massive hard drive to store a recording. AdLib was one of the first companies to market a MIDI music expansion card for the PC, making a massive difference to games, but the Sound Blaster went one step further by combining MIDI music with basic sampling capabilities.

The result was an audio system that could give you decent music in games, as well as sampled speech and sound effects. It changed the PC's sound forever and sold by the bucketload. It was the final part of the equation needed to transform the PC into a proper gaming machine. Thirty years after the original Sound Blaster card was launched, we caught up with founder and CEO of Creative Technology, Sim Wong Hoo, to talk about the history of the iconic Sound Blaster brand.
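The 'Word document' analogy comes down to file size: note events are tiny next to recorded audio. A rough back-of-the-envelope comparison (the per-event byte count below is an approximation, not taken from any particular game):

```python
# One minute of CD-quality recorded stereo audio vs the same minute
# described as note events for a synthesiser. Figures are rough.

SAMPLE_RATE = 44_100   # samples per second
BYTES_PER_SAMPLE = 2   # 16-bit samples
CHANNELS = 2           # stereo

# Raw PCM: every sample of every channel must be stored.
pcm_bytes = 60 * SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS

# A busy score might play ~10 notes per second; a note-on/note-off
# pair in a MIDI stream is roughly 6 bytes in total.
events_bytes = 60 * 10 * 6

print(pcm_bytes)     # 10,584,000 bytes - roughly 10MB
print(events_bytes)  # 3,600 bytes - a few KB
```

Three orders of magnitude is why MIDI music was the only practical option on machines with floppy disks and single-digit megabytes of RAM.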
synthesiser, so by default, it automatically supported a wider
range of software from the two standards, giving users the best
of both worlds.
Sim Wong Hoo: The CD-ROM drive that met the performance requirement specifications of the Multimedia PC initiative was originally a very expensive, Japan-made CD-ROM drive with a complicated and expensive SCSI interface, which cost over $2,000 US. This expensive drive would have immediately derailed the multimedia PC initiative.

So Creative solved this nightmarish scenario by codeveloping a new and inexpensive CD-ROM drive with MKE (Japan). Creative significantly improved the performance of this low-cost drive by developing a proprietary CD-ROM drive interface on the Sound Blaster, as well as new driver software. This innovative driver went against conventional wisdom of needing an Interrupt and DMA for high-speed data transfer. Instead, it used the CPU to access the CD-ROM drive directly and create a huge buffer of data in advance, thereby increasing performance tremendously.

Putting the CD-ROM interface on the Sound Blaster was an obvious advantage in that you also didn't require an additional expansion slot for a CD-ROM drive controller. It also simplified the sales of our Multimedia PC Upgrade Kits, which comprised a sound card, CD-ROM drive and some CD-ROM titles.

CPC: The Sound Blaster Pro 2 introduced OPL3 synthesis – what could this do that you couldn't do on OPL2?

Sim Wong Hoo: OPL2 had two operators and nine voices, while OPL3 had four operators, 18 voices and stereo output. FM synthesis with four operators used four sine waves to synthesise music, which provided a richer timbre and thus created better-sounding musical instruments.

CPC: Several competitors started producing cheaper 'Sound Blaster Pro-compatible' cards in the early 1990s – how did these affect your sales, and was there any licensing involved in claiming compatibility with your cards?

Sim Wong Hoo: These so-called compatible sound cards had negligible effects on our sales, despite selling at lower prices. In fact, they helped to create a larger awareness for sound on the PC. Many of these cards suffered high returns as users found them not to be that Sound Blaster-compatible. After the returns, the users would usually then buy original Sound Blasters.

CPC: Take us through the development of the EMU chips for the later 16-bit Sound Blasters – what were you looking to achieve with this level of advanced synthesis?

Sim Wong Hoo: The EMU was the grandfather of wavetable synthesis, earlier than Yamaha and Roland, pioneering wavetable synthesis way back in the early 1970s. EMU joined the Creative family in 1993, and we started using its wavetable chips in Sound Blasters to provide much better music synthesis and FM synthesis. It was a major breakthrough for PC sound cards at that time.

The subsequent EMU chips – for example, EMU10K1 – besides doing wavetable synthesis, were also fully programmable acoustic digital signal processing engines that powered our game-changing Environmental Audio eXtension (EAX) system. This enabled multiple simultaneous voices to be processable in hardware.

CPC: Even though so many decent MIDI sounds were available, via the AWE32, AWE64 and various wavetable cards, OPL2/OPL3 is still considered the 'sound' of the era – it's the default in DOSBox, for example. Why do you think wavetable synthesis didn't quite catch on in the same way as FM synthesis?

Sim Wong Hoo: FM synthesis supported many old games, which is why it's still found to be the default in DOSBox. As PCs got a lot faster, and supported larger memory, I guess it was easier for developers to stream music directly in games. Some of them used their own software audio engines.

In the heyday of MIDI gaming audio, the massive AWE32 could be expanded using 30-pin SIMMs

CPC: The AWE32 was expandable via standard 30-pin SIMMs, but the AWE64 wasn't. What was the reason for this decision?

Sim Wong Hoo: The AWE64 was targeting a much bigger market and, to be cost-effective, we had to remove the
memory upgrade functions. The built-in memory was sufficient for most applications. The AWE64 subsequently became a runaway success.
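Sim's comparison of OPL2 and OPL3 – two operators against four, each operator a sine wave bending another – can be sketched as a toy two-operator FM voice. This illustrates the principle only; the frequency ratio and modulation index below are arbitrary, not the OPL chips' actual register values.

```python
import math

def fm_sample(t, carrier_hz=440.0, mod_ratio=2.0, mod_index=1.5):
    """One sample of two-operator FM synthesis: the modulator's output
    is added to the phase of the carrier, which is what pushes energy
    into extra harmonics and makes the timbre richer than a plain
    sine wave."""
    modulator = math.sin(2 * math.pi * carrier_hz * mod_ratio * t)
    return math.sin(2 * math.pi * carrier_hz * t + mod_index * modulator)

# Render a 10ms burst of the voice at 44.1kHz.
samples = [fm_sample(n / 44_100) for n in range(441)]
```

An OPL2 voice is essentially this pair of operators with hardware envelopes; OPL3's four-operator mode chains twice as many, which is where the 'richer timbre' comes from.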
CPC: We recently did a social media survey on how people use their spare PCI-E slots, and 19 per cent of our respondents used a dedicated sound card. What do people get from a dedicated sound card that they can't get from integrated audio?

Sim Wong Hoo: In the first place, I think motherboard audio is horrible. Many engineers, especially digital engineers, think that PC audio is achieved by simply putting a decent DAC on a motherboard. That couldn't be further from the truth. A good audio design requires a good analogue section. There are many contributors of noise on any motherboard, so designing a good analogue section on a noisy motherboard is almost a defeating cause.

For example, Sound BlasterX AE-5 offers dedicated high-quality components, and proprietary technologies such as Xamp, which drives individual headphone channels, providing much better headphone audio transience. It can also drive two extreme ends of the headphone spectrum, from 600 Ohm studio monitor headphones to 16 Ohm sensitive in-ear monitors.

Then there's the Sound Blaster audio processing technology, which can be personalised to suit individual entertainment needs such as specific game profiles. It has features such as Creative Multi Speaker Surround 3D technology (CMSS 3D), which is able to provide 3D surround audio on just two front speakers. There's also the Crystalizer, which helps to restore details that are otherwise lost in compressed audio, and DialogPlus, which enhances speech clarity in movies.
THE PC
SPEAKER
K.G. Orphanides delves into the bleeps and bloops
of the PC’s original primitive sound system
Before sound cards brought us polyphonic music and CD-quality PCM (pulse-code modulation) audio recordings, PCs could make exactly one noise: a square wave, output through a dynamic speaker driven by the computer's timer chip. Launched in 1981, IBM's first model 5150 Personal Computer had an internal 2.25in (5.7cm) speaker, designed to produce BIOS error codes to help diagnose problems at boot.

The Intel 8253 chip drove the original PC speaker. Credit: Wikimedia Commons

It was driven by the Intel 8253 Programmable Interval Timer, the same piece of hardware that handled system timing. While Timer Channel 0 was used for system synchronisation, Timer Channel 2 was used to send square waves to the internal speaker, making it beep.

By the 1990s, the 8253 had been superseded by the Intel 8259 Programmable Interrupt Controller (PIC), and these days, you'll find a modern hardware equivalent on your motherboard's southbridge in the form of an Intel Advanced Programmable Interrupt Controller (APIC) variant. All of them retain PC internal speaker functions.

As PC sound card adoption grew through the 1990s, fewer games used the integrated beeper and smaller piezoelectric speakers would become more commonplace. These were quieter, and lacked the versatility and subtlety of a larger dynamic speaker, making some fancier audio effects far less distinct and often too quiet.

Many modern PCs no longer come with any kind of speaker. But motherboards still have the header connector, so you can still install one and listen to audio designed for an internal beeper as it was meant to be heard.

QUEST FOR POLYPHONY
Whichever way your PC beeper sound is implemented, it's monophonic, which means it can only produce one tone at a time. But, as with other very limited early computer audio standards, that wasn't going to prevent
composers from doing remarkable things with it.

Beyond simple system beeps, the easiest music to persuade a PC speaker to reproduce is single-tone melodies. A series of instructions is sent to the timer via the CPU, using the programming language of your choice, telling it to produce a series of tones at a specified frequency.

FURTHER LISTENING
The Secret of Monkey Island: custompc.co.uk/MonkeyIsland
PCM audio through the PC speaker: custompc.co.uk/PCM
Album: System Beeps: custompc.co.uk/SystemBeeps

Sound effects in games also started out as simple beeps, but programmers soon started getting clever, rapidly changing the tones being sent to the speaker to produce complex audio effects. Apogee Games mastered the art of creating convincing – or at least distinctive – PC speaker effects in titles including Commander Keen and Hocus Pocus.

You technically can't play polyphonic music on hardware that can only produce one voice at a time but, as it transpires, there are ways around this problem. Probably the most widely used approach to this is arpeggiation, where a pseudo-polyphonic effect is achieved by rapidly switching from one tone to another – anywhere up to 120 times a second – to give the impression of chords to the listener.

An arpeggiated pseudo-polyphonic 'chord' from The Secret of Monkey Island

A number of games, including the 1990 PC version of The Bitmap Brothers' Xenon 2 Megablast, the PC port of Sega's Golden Axe in the same year and Magnetic Fields' Lotus III in 1992, create two or three virtual audio channels and alternate which of them is directed to the timer chip, allowing basslines to be rapidly switched into the music. The results often sound harsh and busy, but produce a rather effective impression of polyphony.

A combination of these techniques was used to even better effect in LucasArts' PC speaker music, such as the remarkable beeper rendition of the main theme from The Secret of Monkey Island (1990), where the sophisticated use of fast trills and an alternating percussive channel created the impression of steel drum chords backing the main melody.

Other techniques made more direct changes to the way the PC speaker's sound output worked. Windmill Games' 1983 booter game Digger and its iconic use of Hot Butter's Popcorn as its in-game theme is thought to be the earliest title to use pulse width modulation (PWM) as a method of producing more sophisticated sound, with variable volume and harmonies. Also used in numerous ZX Spectrum games, PWM uses careful timing of the signals sent to the PC speaker to modulate its usually binary voltage levels, forcing the speaker into a range of partially on positions to produce sine waves. This can effectively turn the speaker into a 1-bit DAC (digital-to-analogue converter).

A classic PC speaker square wave from the theme to Space Quest III: The Pirates of Pestulon

Also heard in titles including Hard Drivin' and Fantasy World Dizzy, this approach can be used to play a pre-generated soundtrack, rather than using the timer chip to directly generate square wave tones. However, even at 1-bit, this sound reproduction was often CPU-intensive and the resulting audio's low quality grates on many listeners.

Later, Access Software's RealSound technology used a near-inaudible carrier wave and fine-grained control of the PC speaker's displacement amplitude to produce 6-bit digitised audio, giving us surprisingly high-quality speech and music in games including Mean Streets, World Class Leaderboard Golf and Legend Entertainment's Spellcasting series.

By 1992, even Microsoft was in on the game, releasing a driver for Windows 3.1 that allowed any PCM WAV file to be output via the internal speaker. As sound cards, CD-ROM games, and then integrated motherboard audio became ubiquitous, the need to write dedicated timer chip music or kludge samples through the internal beeper evaporated, and PC speaker audio vanished from audio selection screens.

BACK TO THE PRESENT
Today, PC speaker music isn't as dead as you might expect. Although less iconic than the C64 or NES audio systems, you can hear its influence in the modern chiptunes music scene.

In February 2019, Russian composer Shiru released System Beeps, an entire album written for the PC speaker and using some of the most sophisticated arrangement, arpeggiation and hearing perception tricks we've heard to create an illusion of polyphony. There is, of course, a DOS version of the album, but if you don't happen to have any classic PC hardware (or a copy of DOSBox), it's also available to buy in conventional digital formats.

Shiru used modern Digital Audio Workstation software to compose System Beeps and has made relevant plug-ins, projects and source code available for anyone else who wants to play with them.

Shiru isn't alone in working on music creation tools for your internal beeper. BaWaMI, created by Robbi-985, is a Windows MIDI synthesiser that will output via the PC speaker. If you're so inclined, you can still hear and make new music for the PC's oldest audio device.
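The timer programming described earlier comes down to a divisor: the 8253's channels count down from a value you load, derived from its fixed 1,193,182Hz input clock. A sketch of the arithmetic follows; the port assignments in the comment are the standard PC ones, and a real DOS program would do the writes with `out` instructions rather than Python.

```python
# The PC timer runs from a fixed 1,193,182Hz input clock. To get a
# tone, you load channel 2 with a divisor, and the output toggles at
# clock / divisor. A DOS-era program would send a mode byte to port
# 0x43, write the divisor to port 0x42 (low byte, then high byte),
# and set bits 0-1 of port 0x61 to gate the signal to the speaker.

PIT_CLOCK = 1_193_182  # Hz

def pit_divisor(freq_hz):
    """Divisor to load into timer channel 2 for a given tone."""
    return round(PIT_CLOCK / freq_hz)

def actual_freq(divisor):
    """The tone you really get - quantised by the integer divisor."""
    return PIT_CLOCK / divisor

d = pit_divisor(440)   # concert A -> divisor 2712
hz = actual_freq(d)    # ~439.96Hz - close enough for a beep
```

Arpeggiation, in these terms, is just reloading that divisor dozens of times a second, fast enough that the ear hears the alternating tones as a chord.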
ROLAND MT-32
K.G. Orphanides looks back at Roland’s external MIDI synth
that revolutionised early PC gaming music
Released in 1987 as a musician's tool, the MT-32 would revolutionise PC gaming audio
A glossy black box with a green LCD invites you to 'Insert Buckazoid' on its screen. A stirring 1980s sci-fi theme blasts glossy-textured synth tones through the speakers connected to it, as you're brought up to speed on the continuing exploits of space janitor Roger Wilco. In 1989, Space Quest III leaned into the highest-quality music available on home computer platforms, an external MIDI audio device that was as prohibitively expensive as it was revolutionary.

When it was released in 1987, the original Roland MT-32 MIDI synthesiser cost £450 in the UK – equivalent to over £1,200 in today's money – and it didn't even come with the MIDI interface card you'd need to connect it to your PC.

Roland primarily marketed its MIDI expander module at amateur electronic musicians: a multi-timbral synth-in-a-box that could be controlled by any MIDI keyboard. It proved popular by being significantly cheaper than most rivals, and by supporting 32-note polyphony across up to eight simultaneous voices.

But the MT-32 would become best known as the pinnacle of IBM PC-compatible gaming audio from the late 1980s to the mid-1990s, and it helped to popularise the fully orchestrated game soundtracks we take for granted today.

MIDI itself arrived in the 1980s, with a very different feel to analogue synthesisers' use of control voltages to determine pitch, gate and trigger signals.

The MT-32 used Roland's new Linear Arithmetic (LA) synthesis (see custompc.co.uk/LASynth) technique, first seen a few months earlier in Roland's 61-key D-50 keyboard synthesiser. LA synthesis relies on Partials: fundamental sounds to which it then adds effects in order to produce voices. These Partials are either stored as pulse code modulation (PCM) sound samples (as used by audio CDs, WAV files and so on) or fully simulated combinations of oscillators, creating the tone. Filters then determine the brightness of the sound by fixing its cutoff frequency, and an amplifier then determines its loudness. The LA chip's pitch and amplitude envelopes act on the PCM sounds, determining the note produced and
its attack, decay, sustain and release. This technique enabled the synth to produce a realistic (for the time) reproduction of genuine instruments.

Uniquely, the MT-32 could be sent SysEx messages to display short text strings – a feature that many games used

Alongside the LA chip, you'll find a dedicated gate array, a reverb chip, a Burr-Brown PCM54 DAC, a clutch of op-amps, and EEPROMs that hold the MT-32's firmware and PCM sample banks. You can even send custom patches to the MT-32 – specific configurations of effects for the LA synthesis chip to render on a voice from the PCM bank, so you can effectively make new instruments.

Eventually, game developers moved on to writing music for new General MIDI audio devices such as the Roland SCC-1. Other studios supported the MT-32 as late as 1997, with the cover disk demo of Bethesda's The Elder Scrolls: Daggerfall (custompc.co.uk/Daggerfall) being among the last.
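The text-display trick works through Roland's SysEx format: a message addressed to the display area of the MT-32's memory map, protected by Roland's simple checksum. The sketch below follows the commonly documented byte layout (manufacturer ID 0x41, MT-32 model ID 0x16, 'set data' command 0x12, display at address 0x20 0x00 0x00); treat the details as indicative rather than authoritative.

```python
def roland_checksum(payload):
    """Roland SysEx checksum: the address and data bytes plus the
    checksum must sum to a multiple of 128."""
    return (128 - sum(payload) % 128) % 128

def mt32_display(text):
    """Build a SysEx message that puts up to 20 characters on the
    MT-32's LCD."""
    address = [0x20, 0x00, 0x00]          # display area of the memory map
    data = [ord(c) for c in text[:20]]    # plain ASCII characters
    body = address + data
    return bytes(
        [0xF0, 0x41, 0x10, 0x16, 0x12]    # start, Roland, device, MT-32, set data
        + body
        + [roland_checksum(body), 0xF7]   # checksum, SysEx end
    )

msg = mt32_display("Insert Buckazoid")
```

Sent down the MIDI cable, a message like this is how Space Quest III got its 'Insert Buckazoid' gag onto the synth's green LCD.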
RETROGRADE
STORAGE
FLOPPY DISKS
Ben Hardwidge takes you through the workings of various types
of floppy disk, which were once the PC’s main storage medium
The classic 3.5in HD floppy disk has become a bit of an icon now, and in more ways than one. Plenty of people will tell you that their kids think of a floppy disk as the Save icon in Word. The 3.5in floppy is also what most people imagine when you say 'floppy disk' – a plastic case with a spring-loaded metal protector and a 1.44MB storage capacity. The history of floppy disks goes back well beyond these neat little storage packs though.

My first experience of multiple types of floppy disk came when my dad bought me a game (Targhan, in case you're interested) for our PC XT clone in 1988. I opened the box, and inside it were two 5.25in disks. We had to send it back to get the version with 3.5in disks, which took ages because, at the time, hardly anybody used 3.5in disks for PCs. All the major models, from the IBM PC to the Amstrad PC1512 and PC1640, had one or two 5.25in floppy drives instead.

A single-sided, single-density 8in floppy disk, with a 50p coin for scale. It has a total formatted capacity of 248KB

Floppy disks were the main form of storage for the first decade of the PC's history, in many cases the only form of storage. But the history of the floppy disk goes back even further than the first PCs. For the purpose of this feature, I got hold of one of the very first types of floppy disk, an 8in single-sided single-density disk. It's huge. You can put a 50p piece in the central spindle hole with space to spare. It holds a formatted capacity of just 248KB.

The floppy disk was one of the first solutions to the problem of transferring data from one place to another. We'd used punch cards, punched tape and magnetic tape, which worked, but were laughably awful in terms of reliability, convenience, space, and the length of time taken to load and save data. In 1971, IBM's first read-only 'Type 1 diskette' was an attempt to solve these problems in a neat 8in package, capable of storing 81KB. In 1973 it became commercially available with read/write abilities, and a larger capacity of 248KB.

MAGNETS, HOW DO THEY WORK?
Floppy disks work on the basic principle of magnetic binary storage. As you probably know, all computer data can be broken down to simple on-off switches called bits at its most basic level – if the switch is off, it's a zero; if it's on, it's a 1. There are eight bits in a byte, 1,024 bytes in a kilobyte, 1,024 kilobytes in a megabyte and so on.

Inside the package of a floppy disk is a circular piece of magnetically coated material with a hole in the middle, and a piece of protective fabric on either side of this material to protect it. In the case of 8in and 5.25in disks, the hole is left blank for the drive's spindle to go through it. In the case of 3.5in disks, there's a metal plate in the middle with holes in it, onto which the floppy drive can lock.

The drive then spins the disk and a stepper motor brings the magnetic read/write heads into contact with the disk. With 8in and 5.25in disks, where the disk is exposed in a hole at the front, the heads make contact with the disk once you insert the disk and flip down the physical lever at the front of the drive to lock it in place. With 3.5in disks, the heads make contact with the disk once the disk has been fully inserted in the drive, meaning the protective metal plate at the top has been fully moved to expose the disk, and it's all locked in place.

Once the heads make contact with the disk, and the disk is spinning, the drive can then read or write data – a magnetic transition denotes an on (1) switch, while no magnetic transition means an off (0) switch. All the ones and zeroes are encoded/decoded in a bitstream, and in the case of 5.25in and 3.5in floppy disks, this is generally MFM (modified frequency modulation), although there were other encoding methods in the early days of floppy disks.
Going into the complete workings of MFM would take a feature in itself, but the basic gist is that it introduces a clock to separate the bits in the bitstream. After all, a computer would have a tough time reading a long line of zeroes in a row, with no magnetic transitions between them to tell it whether this was one 'off' bit or several of them. The idea behind using FM and MFM data encoding was to enable a non-return-to-zero (NRZ) system, so there was never a state where there was neither an on nor off signal.

An FM bitstream can encode a 0 as 10 and a 1 as 11, for example. MFM is more complicated than just using 10s and 11s, but the principle of using a clock to separate the bits is basically the same. Incidentally, MFM was also the standard used in early hard drives, including the Amstrad PCs in the late 1980s, before IDE was introduced.

MAKING TRACKS
Towards the latter days of the floppy disk's reign, you could buy disks pre-formatted for your type of computer, but you originally bought them unformatted. In this unformatted state, there's nothing on the disk. It's a blank, circular piece of magnetically coated material. You would then have to tell your computer to format it for your system. In my case, that meant typing 'format a: /w' at the DOS prompt.

The '/w' means 'wait' – like many people at the time, I couldn't afford a hard drive, so I had to boot DOS from a floppy disk called a system disk each time I started my PC. This system disk also contained all the DOS commands, so you would have to type the format command with the system disk in the drive, then swap over to the unformatted disk when prompted – you really didn't want to accidentally format your system disk!

The formatting process would then prepare your disk for reading and writing. Unlike the spiral of data used on most CDs and DVDs, floppy disks organise data in 'tracks' – concentric circles that are separated by small areas containing no data. These tracks are then, in turn, separated into sectors containing a certain number of bytes, with unused bytes on either side of the sector and a header to mark the start of the sector. This header also contains a cyclic redundancy check (CRC), which was also placed at the end of the data used in each sector, for error checking. There are many blank spaces, and bytes to denote start and end points for tracks and sectors, which is part of the reason why the formatted capacity is always lower than the theoretical maximum capacity of a disk.

Inside a 3.5in disk's plastic shell is a small floppy disk with a piece of protective fabric on either side

SIDES AND DENSITIES
In the case of the classic 3.5in floppy disk, you had 512 bytes per sector on a PC. Depending on your disk (and your hardware), you could also use both sides of the disk, doubling the storage capacity. The other way to increase the capacity was with the 'density', the number of tracks and sectors per side.

As an example, a 'double-density' (DD) 3.5in floppy disk has 80 tracks per side, each containing nine sectors with 512 bytes each. Each track therefore has 4,608 bytes – multiply that figure by 80 and you get 368,640 bytes, or 360KB per side. So, a single-sided, double-density (SS/DD) disk has 360KB – add another side and you get a double-sided, double-density (DS/DD) 720KB disk.

The next step, once you've got faster controllers and better physical media, is to double the number of bits (and hence sectors) that fit on one track, taking the number of sectors per track from nine to 18 and doubling the capacity from 720KB to the classic 1.44MB high-density (HD) floppy disk.

I'm referring to PC standards here, of course, but other computers, such as the Amiga and Mac, had more efficient ways of formatting disks that resulted in higher capacities. With a continuous motor speed, every sector held the same amount of data, regardless of whether it was on the inner or outer part of the disk area. As you move from the centre of the disk outwards, however, the sectors become physically bigger, which means space is wasted on the outside area of the disk, where there should be more room for data storage.

Apple got around this issue by varying the speed of the motor when the head was on the outside of the disk vs the inside of the disk, enabling it to add more storage capacity on the outside of the disk and get 800KB from a double-sided, double-density disk, rather than the 720KB on a PC.

From left to right: 8in SS/SD, 5.25in DS/DD and 3.5in HD disks
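The capacity arithmetic above is easy to check with a few lines:

```python
# Formatted floppy capacity: tracks x sectors x bytes-per-sector,
# per side, using the PC standards described in the text.

def capacity_kb(tracks, sectors, bytes_per_sector=512, sides=1):
    """Formatted capacity in KB (1KB = 1,024 bytes)."""
    return tracks * sectors * bytes_per_sector * sides // 1024

dd_single = capacity_kb(80, 9)           # SS/DD: 360KB per side
dd_double = capacity_kb(80, 9, sides=2)  # DS/DD: 720KB
hd_double = capacity_kb(80, 18, sides=2) # HD: 1,440KB - the 1.44MB disk
```

Note the marketing quirk the numbers expose: '1.44MB' is really 1,440KB, which is neither 1.44 million bytes nor 1.44 binary megabytes.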
You also got wildly different amounts of formatted storage space from the same physical size of disk on different systems in the early years of the floppy, as there were so many different software standards, all with different sector sizes and formatting systems. It all resulted in a bit of

KNOW YOUR FLOPPIES
DATE    SIZE    SIDES / DENSITY    FORMATTED CAPACITY
1973    8in     SS / SD            248KB
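The FM scheme described earlier – a 0 encoded as 10, a 1 as 11 – can be sketched in a few lines. Real controllers deal in flux transitions rather than abstract bits, so this shows only the clock-bit idea, not a drive-accurate encoder:

```python
def fm_encode(bits):
    """Single-density FM encoding: every data bit is preceded by a
    clock bit of 1, so even a long run of zeroes still produces
    regular transitions the drive can lock on to."""
    out = []
    for b in bits:
        out += [1, b]  # clock bit, then data bit
    return out

# A run of zeroes no longer looks like silence on the media:
encoded = fm_encode([0, 0, 0, 1])  # -> [1, 0, 1, 0, 1, 0, 1, 1]
```

MFM improves on this by only inserting a clock bit between two consecutive zeroes, halving the transitions needed and doubling the data density for the same media – which is exactly where 'double density' comes from.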
RETROGRADE
SOFTWARE
WINDOWS 1.0
35 years ago Microsoft finally launched the first version of
Windows. Stuart Andrews looks back to where Windows
started, and tries using Windows 1 again for himself
It's now more than 35 years since Windows launched in November 1985, 18 months behind schedule and almost three years after Apple's Lisa had introduced the first commercial GUI. It wasn't exactly a hit; it flopped commercially, while reviewers criticised its performance and wondered whether some of its most powerful features were really that useful. Yet less than five years later Windows dominated the operating system market, running on over 70 per cent of all personal computers sold. You can see Windows 1 as the ugly duckling that was to transform into the, well, still gruesome but enormously successful swan.

MAKING WINDOWS
Windows began its journey in the autumn of 1982. Microsoft's CEO Bill Gates was already aware of research into mouse-driven, graphical user interfaces at the legendary Xerox PARC, and of Apple's continuing work on the same principles. However, the story goes that Gates attended the autumn 1982 Comdex trade show in Las Vegas, where he saw VisiCorp demonstrate Visi On: a GUI for the IBM PC. Gates is said to have watched the demo several times, back-to-back, before suggesting that other Microsoft personnel needed to come out to Comdex and take a look. If GUIs were the future, Microsoft wanted a piece of the action.

At this point Microsoft wasn't the huge tech monolith we know today. It was still a small company that had grown successful on the back of Microsoft BASIC and MS-DOS. Gates saw an appetite for a new and easier way to work with the personal computer, and that rival systems were either too expensive – an Apple Lisa cost around $10,000 US, while you could buy a PC for under $3,000 – or too demanding in their system requirements. If it wasn't bad enough that Visi On needed a staggering 512KB of RAM and a hard disk, its applications needed to be coded in a specific version of C using Unix tools. This left space for an alternative.

Gates hired Scott McGregor, one of the key developers at Xerox PARC, and set a team to work on a project codenamed Interface Manager. Crucially, it wasn't seen as a complete OS, but as a graphical environment that ran on top of MS-DOS. In November 1983, Gates announced Windows and set its release date for April 1984.

It wouldn't require a hard drive and it would run with just 192KB of RAM

The hype said Windows would bring a new way to use PCs. It wouldn't require a hard drive – just two floppy disk drives – and it would run with just 192KB of RAM. By December 1983, an early version was previewed for an article in Byte magazine, with its writer, Phil Lemmon, arguing that 'Microsoft Windows seems to offer remarkable openness, reconfigurability and transportability, as well as modest requirements and pricing'.

Even in the first release, there were options to personalise Windows, although good luck finding a colour scheme that didn't look horrific

The result,
Cutting, copying
and pasting were
revolutionary new
ideas, enabling
you to move
information from
one app to another
Lemmon thought, could bring computing to a new, non-technical audience.

Why, then, did it take another two years to get finished? For a start, there were some major technical challenges. When development started, standard CGA screen resolutions were limited to 640 x 200 in monochrome, and it was only with the development of EGA graphics boards in late 1984 that you had enough pixels to make Windows effective. The slow speeds and limited capacity of floppy disks had an impact, while the Intel 8088 CPUs used in most PCs weren't exactly bursting with firepower.

Perhaps worst of all, there was a challenge in building industry support. As Gates said in 1983, 'the primary focus of the company and the speeches I gave, the promotion I did, was to get people to believe in the graphics interface, whether it was Macintosh or Windows, and that was a tough thing because people like WordPerfect and Lotus refused to put the resources into doing applications'.

Some believe that other factors were in play. By 1984 Microsoft was working with Apple on Macintosh software, and had signed licensing agreements for specific UI elements, but not others, including overlapping windows and the trash can. It's possible that Microsoft reworked Windows to avoid including these elements and triggering future litigation. If so, Microsoft wouldn't admit it. A November 1983 article in the US computing mag, InfoWorld, suggested that Microsoft's Steve Ballmer saw tiled windows as delivering a neater desktop.

A DEVELOPMENT DISASTER
Whatever the case, the development of Windows was definitely troubled. Tandy Trower came in as the product manager in autumn 1984, at a point where Windows was seen externally as vapourware and internally as an embarrassment. Trower even saw being put in charge of the project as a step towards getting fired. By this point Scott McGregor had resigned, and while the core components were in place, elements of the design and the look weren't working. More seriously, there weren't any applications. 'Even at Microsoft, getting developers to write Windows software was a challenge,' said Trower in a 2010 interview. 'I couldn't even get my former team to build a version of BASIC.' However, there was a prototype of a simple bitmap drawing program, while Trower persuaded Gates and Ballmer that Windows needed a set of simple applets, including a word processor, calendar and business card database.

What's more, Trower made it a requirement that Windows could run existing DOS applications. This in itself proved awkward – many DOS apps exploited tricks or workarounds that caused problems for Windows memory management – but it was a major boost to Windows in the future.

You could run three or four applications at once, provided you could tolerate painful slowdowns and a lack of screen real estate
SOFTWARE
Lack of app support was such a problem that the Windows team developed its own paint programs, utilities and games

By the early summer of 1985 Windows still wasn't finished, but Ballmer decided to release a 'Premiere Edition' to application developers and members of the press. The team went into crunch, to the extent that one young program manager, Gabe Newell (yes, that one) started sleeping in the office. Even at the last stages, new defects were found in the memory management code, delaying the release even further. It was only in November that testing was finished, and Windows was released at Comdex 1985 with a comedy roast where Microsoft poked fun at its own product's lateness.

Even selecting from a pull-down menu is different, involving a click, button-hold, select and release process

MALIGNED AND MISUNDERSTOOD
You might have expected the response to be rapturous, but – as with so many Microsoft products – there was disappointment and bemusement. InfoWorld ran its review with the headline 'Windows Requires Too Much Power' and gave it 4.5 out of 10. A piece by Erik Sandberg-Diment for The New York Times called Windows extremely memory-hungry. 'Running Windows on a PC with 512K of memory', he noted, 'is akin to pouring molasses in the Arctic. Also, the more windows you activate, the more sluggishly the program makes its moves'.

Most of all, pundits weren't convinced that Windows solved any genuine problems. Some didn't see the point of the mouse or the GUI. Sandberg-Diment had his doubts about dialogue boxes, suspecting that most people would prefer 'a more direct means of executing commands.' He also felt that multi-tasking was a waste of effort. 'Most people use but one program most of the time, if not all the time,' he suggested. That's aged well.

USING WINDOWS
So how successfully did Windows 1 lay down the foundations for the Windows we know and sort of love today? Well, it has to be said that it's a very different experience. There's no desktop and the management of windows is incredibly primitive. While it is mouse-driven, icons don't play a starring role. Instead, you launch applications by double clicking on a list in the MS-DOS Executive – a simple file manager that lists not just the programs, but all your MS-DOS files.

The first application you launch occupies the whole screen, and subsequent applications split the screen into two, three or four. Once windows are in place you can close, maximise or resize them, or move them from one half or corner of the
widescreen aspect ratio, or your
favourite drawing subjects are
sausage dogs and snakes.
Windows Write is recognisably a
word processor, but there’s no spell
check or anything beyond basic
formatting features, much like the
Windows Write we all carried on not
using before Windows 95. And as
for Reversi, well it’s a variant of the
classic black and white disc strategy
game Othello, but – let’s face it – it’s
no Minesweeper or Solitaire.
WINDOWS 3.1
Screensavers, colourful icons and proper fonts. 30
years after its release, Stuart Andrews looks back at
the version of Windows that finally put it on the map
Cue a sigh of relief when this splash screen showed up. Launching Windows from MS-DOS could be s…l…o…w

Windows 3.1 is arguably the most crucial Windows ever – the Windows that defined how PC computing looked just as it was beginning to take off. Before version 3.1, Windows was a successful operating system, but one that looked and felt like a GUI shell perched precariously on DOS.

With the launch of Windows 3.1 in April 1992, Windows finally looked and felt like the real deal. What's more, it was a sales phenomenon, shipping over 3 million copies in its first six weeks on the market and 25 million within the first year. Windows was already big, but 3.1 put Windows in the lead.

How did Windows 3.1 do this? That's not something you can nail down to any one factor. It was partly a question of stability, partly features and partly look and feel. Believe us – Windows 3.1 looks rough by today's slick standards, but not half as rough as what came before.

LOOK AND FEEL
Look and feel certainly played an important part in Windows 3.1's success. Windows 3.0 had already done some of the hard work of introducing a proper GUI, replacing the horrible, text-based MS-DOS Executive of Windows 1.x and 2.x with the new Program Manager and File Manager components. Instead of clicking on a program or a file in a list, you could double click on an icon to launch it. Yet Windows 3.1 went further, taking advantage of the VGA and SVGA graphics standards to introduce a revamped UI with more colourful icons.

What's more, those icons could now do more than just get clicked on, as Windows 3.1 introduced drag and drop. You could explore your PC's file system visually, copying files from one folder to another by clicking on the file, dragging it over and releasing the mouse button. You could drag a file onto the Print Manager icon to print it out, or onto the application's icon in Program Manager to open it and start work.

Yet perhaps the most vital enhancement over Windows 3.0 was the introduction of TrueType fonts. At this point, Windows still involved a lot of text and, up until Windows 3.1, this text didn't look good. It was pixelated, primitive and ugly, with no real provision to vary horizontal or vertical spacing. While developing Windows 3.1, Microsoft put a team together to fix this problem, and that team worked with
The Program Manager was the heart of Windows 3.1. Double clicking
icons launched the applications, or you could drag and drop files onto
the icons or open windows
one of the two leading typesetting companies of the era, Monotype, to design a new set of core fonts. Meanwhile, Microsoft worked on the technology to render those fonts on-screen, so they could be scaled upwards and downwards, rotated and respaced, and still look pretty good.

Monotype came up with the Times New Roman, Arial and Courier New fonts that Windows still incorporates today, while Microsoft licensed and adapted Apple's TrueType technology, adapting the font hinting tech that made these fonts clear and legible even on a VGA resolution (640 x 480) screen. This not only made Windows look a whole lot better, but made it a viable platform for desktop publishing and design. Suddenly, the Mac had competition.

TrueType fonts were a revelation to Windows users, making the OS look significantly better and opening up more sophisticated WYSIWYG DTP and design applications

This was also the first version of Windows to include a built-in screensaver. Screensavers had been around for a while to save wear on CRT monitors, and After Dark's fish and flying toasters had already appeared on Windows 3.0 and macOS. However, Windows 3.1 made screensavers a standard component, introducing long-time favourites, such as the classic flying Windows logo, the Star Trek-style Starfield, and the psychedelic Mystify and Swirl. Seriously. After a few too many shandies, they blew our primitive, PC-loving minds.

ARCHITECTURAL IMPROVEMENTS
Yet the most important features that Windows 3.1 introduced were those you couldn't see. Windows 3.0 had introduced protected memory – a way of using the protected mode of the 80286 CPU to allow Windows and Windows apps to use up to 16MB of RAM rather than just the first 640KB. Coded by ex-physicists David Weise and Murray Sargent, this feature had been crucial, making Windows a viable alternative for Microsoft to working with IBM on what would become OS/2. Running in protected mode gave Windows programs more stability, and enabled MS-DOS applications to run under Windows and still access all the available RAM. This in turn meant that Windows spent less time crashing, which made it a lot more attractive to people trying to get some actual work done.

Windows 3.1 built on this foundation by taking the new memory management features built into the newer 386 processors and using them in a 386 Enhanced mode. Where Windows 3.0 was limited to a maximum of 16MB, Windows 3.1 upped that limit to 256MB (or, in theory, up to 4GB) and enabled programs to use virtual memory above and beyond the physical memory installed. It also enabled most DOS programs to be run inside a window with mouse support, and multiple DOS programs to be run simultaneously. What's more, all these enhancements meant Windows 3.1 only worked on an Intel 80286 CPU or later. Rocking an old-school 8086? Tough.

These changes improved not just Windows' overall stability, but its multi-tasking capabilities as well. Applications

Windows 3.1 gave us new ways to customise our desktops, although not much of any value with which to customise them
mostly got the resources they needed, and a central messaging system alerted them to hand over resources as and when they were needed, although not all Windows programs behaved as well as others. A Task List enabled you to see all the currently running programs and halt any that were gumming up the system, although the more likely outcome was that they would crash Windows first.

What's more, all of this went hand in hand with another major Windows feature. Windows already had the Dynamic Data Exchange (DDE) protocol, which allowed you to take messages and/or data from one Windows program to another. Windows 3.1 went one better with Object Linking and Embedding (OLE), which enabled you to embed an object created by one application into a document created by another, with both apps updating seamlessly when you made any changes.

Suddenly, you could create a chart in Microsoft Excel and stick it in your Microsoft Word report, then update the data in Excel and see the changes rolled out in Word. I know. It doesn't sound that thrilling, but at the time, this rocked the computing world.

Last, but not least, Windows 3.1 gave the world the Windows registry. At the time, this central database of settings wasn't all that well known or understood, and we never felt the need to edit it directly as we would in the Windows 95 years. Still, it showed a willingness to gather vital system information and preferences in one place, rather than in a horde of SYS, INF and INI files, as had been the Windows way until this point.

This was as exciting and intuitive as file management got in Windows 3.1. Notice those old-school 8.3 character filenames

CONFOUNDING ISSUES
Let's not heap too much praise on Windows 3.1; it still had its fair share of issues. One was that Windows still didn't support long filenames, so both files and directories were limited to names eight characters long, followed by a three-character suffix that told the OS what kind of file it was. This meant users became ingenious at truncating filenames, which in turn made looking through a folder full of documents or save games feel like decoding some esoteric text.

What's more, while Windows 3.1 had support for multimedia hardware, which was just about becoming affordable and available, ease of installation wasn't on Microsoft's list of priorities. Restrictive hardware didn't help – these were the days when solving hardware conflicts involved moving jumpers from pin to pin to swap Direct Memory Access channels. However, Windows 3.1 made the whole process of installing drivers for a CD-ROM drive and sound card as challenging as possible – it might take hours to get the whole setup running.

Networking wasn't any better either, because Windows 3.1 didn't have any built-in networking support. Instead, it piggy-backed on networking clients for the underlying MS-DOS operating system. If you hadn't already mastered Novell NetWare or Microsoft LAN Manager, you were still going to have to get to grips with them here.

Nor was the Windows shell ideal. Simply finding a program in Program Manager could be daunting, especially if you weren't sure which folder or group held it. With screen space at a premium, you would have to constantly minimise and restore windows while you looked. Don't even ask about finding files in File Manager.

Most of all, Windows wasn't a great platform for games. Dodgy drivers and the massive overheads involved in just running Windows itself made it much, much easier to run Wolfenstein 3D or Monkey Island 2 in DOS, which Windows needed to run anyway and for which all Windows users had to pay. This also meant that getting games running still required tinkering at text editor level with a range of crucial system files, to the point that most PC gamers were on intimate terms with config.sys, himem.sys and autoexec.bat. Windows 3.1 didn't change this one bit.

With time, there was some movement. In 1994, Microsoft released a new API, WinG, which was designed to deliver faster graphics performance under Windows and encourage more developers to port their DOS games. WinG worked on a technical level, as proven by a WinG port of id Software's Doom. Yet it didn't work so well on the commercial level, with developers looking at the work involved and the existing DOS user base, then shrugging their shoulders until Windows 95 and DirectX came along.

Still, for all these faults, Windows 3.1 was a major leap in the right direction, paving the way not just for Windows 95, but for the switch from IBM and OS/2 towards Windows NT. Without that we might never have had the PC boom of the mid-1990s, Windows XP and everything beyond. And where would we all be without that?
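As a footnote, the memory ceilings that keep cropping up in this story – 640KB under DOS, 16MB for Windows 3.0 on a 286, and 4GB in theory for 386 Enhanced mode – fall straight out of the address widths of the CPUs involved. A quick back-of-the-envelope check (our own illustrative Python, nothing from the era):

```python
# Addressable memory = 2^(address width) bytes.
KB, MB, GB = 1024, 1024**2, 1024**3

addr_bits = {
    '8086/8088': 20,  # 1MB total; the PC design reserved the top 384KB, leaving 640KB
    '80286': 24,      # 16MB in protected mode - Windows 3.0's ceiling
    '80386': 32,      # 4GB in theory, exploited by 386 Enhanced mode
}

limits = {cpu: 2 ** bits for cpu, bits in addr_bits.items()}

assert limits['8086/8088'] == 1 * MB
assert limits['80286'] == 16 * MB
assert limits['80386'] == 4 * GB
assert limits['8086/8088'] - 384 * KB == 640 * KB  # the famous 640KB of base memory
```

Windows 3.1's practical 256MB limit sat well below the 386's theoretical 4GB, which is why the article hedges with 'in theory'.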
THE FIRST PC
IBM PC 5150
Ben Hardwidge travels back to August 1981, when IBM
released its Personal Computer 5150 and the PC was born
A big ape had only just started lobbing barrels at a pixelated Mario in Donkey Kong arcade machines, Duran Duran's very first album had just rolled off the vinyl presses and Roger Federer was just four days old. In this time, the UK was even capable of winning Eurovision with Bucks Fizz. It's August 1981, and IBM has just released the foundation for the PCs we know and love today, the PC 5150.

'By the late 1970s the personal computer market was maturing rapidly from the many build-it-yourself hobbyist kits to more serious players like Apple, Commodore and Tandy,' retired IBM veteran Peter Short tells us. 'As people realised the greater potential for personal computers in business as well as at home, pressure grew on IBM to enter the market with their own PC.'

Short is now a volunteer at IBM's computer museum in Hursley (slx-online.biz/hursley), which holds a huge archive of the company's computing machines and documentation, from Victorian punch card machines to the company's personal computers. We ask him if it felt like the beginning of a new era when the PC was first launched 40 years ago. 'Yes,' he says, 'but probably not the beginning of something so huge that its legacy lives on today.'

At this time, the home computer market was really starting to take off, with primitive 8-bit computers, such as the Sinclair ZX80 and Commodore VIC-20, enabling people at home to get a basic computer that plugged into their TV. At the other end of the scale, large businesses had huge mainframe machines that took up entire rooms, connected to dumb terminals.

There was clearly room for a middle ground. IBM was going to continue producing mainframes and terminals for many years yet, but it also wanted to create a powerful, independent machine that didn't need a mainframe behind it, and that didn't cost an exorbitant amount of money.

The PC 5150's launch price of $1,565 US (around £885 ex VAT) for the base spec in 1981 equates to around £3,469 ex VAT in today's money. That's still very far from what we'd
IBM’s System 23 Datamaster,
pictured here at the IBM Hursley
Museum, cost $9,000 US
call cheap, but it was a colossal price drop compared with IBM's System/23 Datamaster, an all-in-one computer (including screen) that had launched earlier the same year for $9,000 US – six times the price. And even that was massively cheaper than some of IBM's previous microcomputer designs, such as the 5100, which cost up to $20,000 US in 1975.

IBM needed to act quickly. Commodore had already got a foothold in this market several years earlier with the PET, for example, and IBM realised that it couldn't spend its usual long development time on the project. The race was on, with the project given a one-year time frame for completion.

'At the time, IBM was more geared up to its traditional, longer-term development processes,' explains Short. 'But it eventually realised that, with a solid reputation in the marketplace, it was time to look for a way to do fast-track development that would not produce a machine three, four or five years behind its competitors.'

PROCESSORS AND COPROCESSORS
We opened up a PC 5150 for this feature, so we could have a good look at the insides and see how it compares with PCs today. It's hugely different from the gaming rigs we see now, but there are still some similarities. For starters, the floppy drive connects to the PSU with a 4-pin Molex connector, still seen on PC PSU cables today. The PC was also clearly geared towards expansion from the start.

The ticking heart of the box is a 4.77MHz 8088 CPU made by AMD – Intel had given the company a licence to produce clones of its chips so that supply could keep up with demand. It's for this reason that AMD still has its x86 licence and can produce CPUs for PCs today, but at this point, the two companies weren't really competitors in the way they are now. To all intents and purposes, an AMD 8088 was exactly the same as an Intel one, and PCs generally came with whichever one was in best supply at the time of the machine's manufacture.

The CPU itself is an interesting choice. It's a cut-down version of Intel's 8086 CPU that it had launched in 1978. The 8088 has the same execution unit design as the 8086, but has an 8-bit external data bus, compared with the 8086's 16-bit one. As with today's PCs, the CPU is also removable and replaceable, but in the case of the PC 5150, it's in a long dual in-line package (DIP) with silver pins, rather than a square socket.

Immediately above the CPU sits another DIP socket for an optional coprocessor. At this point in time, the CPU was only an integer unit with no floating point processor. This
interface components and a larger area for building your own design'. It's a far cry from the heavily populated PCI-E cards with complex machine soldering that we see today.

A 5.25in floppy drive was the standard storage system for the 5150, with no hard drive option at launch

The memory is organised in four banks in the bottom right corner of the motherboard – in this case there are four 64KB banks, adding up to a total of 256KB

MEMORY
That 384KB memory card shows a very different approach to memory expansion than the tidy modules we have today. Believe it or not, at launch, the PC 5150 base spec came with just 16KB of memory (a millionth of the amount of memory in today's 16GB machines), which was supplied in the form of DRAM chips on the bottom right corner of the motherboard. The top spec at launch increased that amount to 64KB, although you could theoretically also install the DRAM chips yourself if you could get hold of exactly the right spec of chips and set it up properly. The chips on the motherboard are split into four banks, each with nine chips (eight bits and one parity bit). In the original spec, the 16KB configuration filled one bank, while the 64KB configuration filled all four banks with 16KB of memory each.

A later revision of the motherboard expanded this to 64KB as the base spec with one bank filled, and 256KB with all four banks filled (this is the spec in our sample). If you then added a 384KB memory card, such as the one in our sample, you ended up with 640KB of memory – the maximum base memory addressable by PCs at this time.

GRAPHICS AND DISPLAYS
IBM's colour 5153 monitor didn't come out until 1983, shown here with an IBM PC XT at Hursley, with Alley Cat in full CGA glory

As we previously mentioned, our PC 5150 sample has a dual-monitor card, which supports both the display standards available to the IBM PC at launch. A Mono Display Adaptor (MDA) card could only output text with no graphics, while a Color Graphics Adaptor (CGA) card could output up to four colours (from a palette of 16) at 320 x 200, or output monochrome graphics at 640 x 200.
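It's no accident that CGA offered exactly those two modes: the card carried 16KB of video RAM, and each mode fills almost all of it. A rough illustration of the arithmetic in Python (our sketch, not anything from the magazine):

```python
# Bytes needed for a packed-pixel framebuffer.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

four_colour = framebuffer_bytes(320, 200, 2)  # 4 colours -> 2 bits per pixel
mono        = framebuffer_bytes(640, 200, 1)  # monochrome -> 1 bit per pixel

# Both modes come to 16,000 bytes, just inside CGA's 16KB (16,384 bytes) of VRAM.
assert four_colour == mono == 16_000
```

Double the resolution, halve the colour depth: the trade-off falls straight out of the fixed memory budget.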
However, as Short notes, ‘the PC was announced with the
mono 5151 display in 1981. The CGA 5153 was not released
until 1983’. Even if you had a CGA card in your PC 5150, if you
used the original monitor, you wouldn’t be able to see your
graphics in colour. Seeing colour graphics either required you
to use the composite output or a third-party monitor.
‘Once the colour monitor became available,’ says Short,
‘it could either be attached as the sole display with its own
adaptor card, or equipped with both a mono and colour
adaptor card, and could be attached together with a mono
screen. Now you could run your spreadsheet on the mono
monitor and display output graphics in colour.’
There’s an interesting connection with the first PC
monitors and the legacy of IBM’s computing history too.
When we interviewed the Hursley Museum’s curator Terry
Muldoon (who has now sadly passed away) in 2011, he told
us the reason why the first PC monitors had 80 columns. ‘It’s
because it’s the same as punch cards,’ he said. ‘All green-
screen terminals had 80 columns, because they were
basically emulating a punch card.’
Research for CP/M, but Digital Research didn't return the call. Bill Gates did, but he didn't have an operating system, so he went down the street and bought QDOS.

'The original DOS was a tarted-up QDOS, supplied to IBM as IBM Personal Computer DOS, and Gates was allowed to sell Microsoft DOS (MS-DOS). And they carried on for many years with exactly the same numbers, so 1.1 was DOS 1 but with support for us foreigners, then we went to DOS 2 with support for hard disks, DOS 2.1 for the Junior, DOS 3 for the PC80 and so on.'

You can have a play with DOS 1.0 on an emulated PC 5150 at custompc.co.uk/5150, and it's a very basic affair. Even if you've used later versions of DOS, there are some notable absences, such as the inability to add '/w' to 'dir' to spread out the directory of your A drive across the screen, rather than list all the files in a single column.

What's also striking is the number of BASIC files supplied as standard, which can be run on the supplied Microsoft BASIC. One example is DONKEY.BAS, a primitive top-down game programmed by Bill Gates and Neil Konzen, where you move a car from left to right to avoid donkeys in the road (really). What's more, this game specifically requires your PC to have a CGA card and to run BASIC in advanced mode – you couldn't run it on the base spec.

The IBM Personal Computer laid the foundation for the PCs we know and love today

A FUTURE STANDARD
With its keen pricing compared with previous business computers, the IBM PC 5150 was well received in the USA, paving the way for a launch in the UK in 1983, along with DOS 1.1 and the option for a colour CGA monitor. Clone machines from companies such as Compaq soon followed, claiming (usually, but not always, rightly) to be 'IBM PC compatible', and the PC started to become the widespread open standard that it is today. Was this intentional on IBM's part?

The power supply wasn't beefy enough to power the 5150 and a hard drive

'Industry standard components, an expansion bus and a prototyping card would naturally lead to an open standard,' says Short. 'Not publishing the hardware circuitry would make it difficult to capture the imagination of "home" developers. Open architecture was part of the original plan.'

Muldoon wasn't so sure when we asked him back in 2011. 'Now where did IBM make the mistake with DOS?' he asked. 'This is personal opinion, but IBM allowed Bill Gates to retain the intellectual property. So we've now got an Intel processor – the bus was tied to Intel – and another guy owns the operating system, so you've already lost control of all of your machine in about 1981. The rest is history.

'The only bit that IBM owned in the IBM PC was the BIOS, which was copyright. So, to make a computer 100 per cent IBM compatible, you had to have a BIOS. There were loads of software interrupts in that BIOS that people used, such as the timer tick, which were really useful. You get that timer tick and you can get things to happen, so you have to be able to produce something that hits the timer tick, because the software needs it.'

Rival computer makers could circumvent the copyright of the BIOS by examining what it did and attempting to reverse-engineer it. Muldoon explained the process to us.

'The way people did it is: with one group of people, say: "this is what it does", and another group of people take that specification, don't talk to them, and then write some code to make it do that – that's called "clean room". So one person documents what it does, and another person now writes code to do it – in other words, nobody has copied IBM code, and there's a Chinese wall between these two people.

'What some of the clone manufacturers did is, because we published the BIOS, they just copied it. Now, the BIOS had bugs in it, and we knew they'd copied our BIOS because they'd copied the bugs as well. This was only the small companies that came and went. Phoenix produced a clean room BIOS, so if you used a Phoenix chip in your clones, you were clean.'

Of course, any self-contained personal computer can technically be called a PC. Peter Short describes a PC as a machine that 'can be operated directly by an end user, from beginning to end, and is general enough in its capabilities'. It doesn't require an x86 CPU or a Microsoft OS. In fact, there was and still is a variety of operating systems available to x86 PCs, from GEM and OS/2 in the early days, through to the many Linux distributions available now.

However, the PC as we generally know it, with its x86 instruction set and Microsoft OS, started with the PC 5150 in 1981. Storage and memory capacities have hugely increased, as have CPU clock frequencies, but the basic idea of a self-contained box with a proper CPU, enough memory for software to run, its own storage and a display output, as well as room to expand with extra cards, started here. Thank you, IBM.

THANKS
Custom PC would like to thank Tim Beattie for the loan of his PC 5150 for this feature, and the team at IBM's Hursley Museum. RIP Terry Muldoon – you're very much missed.
HOW TO
If you’ve ever wondered what it was like to use a PC We also wanted to avoid using an old hard drive.
from the old days, and fancy having a dabble with old Mechanical hard drives can become unreliable after
hardware, but don’t know your AT from your AT-AT, then five years, let alone 25, plus they’re slow and noisy, so
this Retro tech special is for you. In fact, even if you don’t we wanted to use solid state storage. However, we
want to buy a load of overpriced ancient hardware in order also wanted the flexibility to run any old software from
to construct an obsolete gaming machine (and we won’t charity shops and eBay, not to mention disks from the
judge you for that, much), we’ll give you a grounding in loft, and that means our system needs a CD-ROM drive
how the PC has changed in some ways, but not in others, and a floppy drive too.
and help you understand the foundation on which today’s Finally, PSUs have come an enormously long way
PCs are built. since the 1990s. We have modular and semi-modular
designs, wrapped/sleeved cables as standard, and
MISSION BRIEFING the 80 Plus initiative has weeded out the flaky and
The idea behind our retro rig is to combine the best of the inefficient PSU designs that were commonplace 25
old world with the perks of the new world. While there years ago. So we’re using solid state storage, a modern
are parts of the legacy PC hardware era we miss, there case and a new PSU. The rest of the core spec, however,
are others that we’re is contemporary 1990s DOS hardware.
The idea is to combine the very glad have been
consigned to the great SLOTS OF FUN
best of the old world with silicon scrapheap in Your first priority when building a DOS gaming system
the sky. There was no is the motherboard. You want one with 16-bit ISA slots
the perks of the new world way we were going to (long and usually black), so you can get the sound
use a 1990s case, for working properly. PCI sound cards were largely designed
example. As well as having that horrible off-white colour, for Windows, rather than DOS, and while some of them
which yellowed over time, early PC cases were often have DOS drivers, it’s a faff trying to get them to work in
badly designed and built. There was nearly always sharp all your games.
metal on which you could easily scrape your knuckles, PCI sound cards also tend to make heavy use of the
very little consideration given to cable routing, drive bays CPU, and don’t have all the required audio hardware on
everywhere and the PSU would often be sat at the top. them, relying on the CPU to do some of the work. That’s
fine if you have a Pentium III and Windows 98, but it’s
rubbish for DOS gaming – an ISA card will have all the
synthesiser hardware you need on it. However, it’s worth
having a PCI slot for your graphics card.
Some motherboards from the 1990s will also have
VESA local bus slots, which look a bit like an ISA slot
with a brown PCI slot on the end. You can install an ISA
graphics card in these slots, but actual VESA local bus
cards are generally expensive and hard to find these
days. Ideally, look for a motherboard with a mix of both
ISA and PCI slots – the latter are short, with a thin socket
in the middle, and they’re usually white.
You also want a replaceable CMOS battery.
These silver discs are a standard feature of today’s
Molex-to-floppy adaptors are easy to find on eBay. These black ones are made by Corsair

HOW TO
ON THE CASE
As we mentioned earlier, you should be able to install
an old ATX motherboard into a new ATX case, but if you
want to run software from the original media, it will also
need front-facing drive bays. You’ll need a 5.25in bay
for a CD-ROM drive, and a bay for a 3.5in floppy drive if
you want that too. A dedicated 3.5in bay can be used for
the latter, or you can get a 3.5-to-5.25in adaptor to put a
floppy drive in a 5.25in drive bay.
We’re using a Fractal Design Define R5, which we had
spare in the lab and has two 5.25in bays (we’re using an
Akasa adaptor to install the floppy drive in a 5.25in bay),
but there are other new cases that will do the job. Fractal’s
latest Define 7 has one 5.25in bay, for example, and the
larger XL model has two 5.25in bays. The Pure Base 600 from be quiet! also has two 5.25in front-facing drive bays.

slots, but the IDE method makes for a system that's easy to set up for booting in the BIOS. CompactFlash is readily available in a variety of capacities, and its removable nature means you can easily have a few flash cards to boot your system with different options. What's more, you can easily plug a CompactFlash card into a USB card reader, and transfer files from another PC to it, which is much easier than mucking about with slow and unreliable floppy disks.
We're using a 512MB card, but you can go higher. DOS runs on a FAT16 (not FAT32) file system, which means you're limited to using no more than 2GB for a single drive, although you can also partition a larger flash card into multiple 2GB drives with different letters.
Our other two storage devices are an IDE CD-ROM drive and a 3.5in floppy drive, which will connect directly to the motherboard's IDE and floppy controller ports. You'll need cables to attach both of these devices, which are commonly available in ribbon format, but in the early 2000s, some manufacturers started bunching the wires all together to make 'rounded' IDE and floppy cables, so they take up less space. You can still buy these new, and we're using some blue ones here.

PERIPHERAL VISION
If you have an old ATX motherboard, the rear I/O panel will likely have a pair of 9-pin serial ports (usually used for mice and external modems), a 25-pin parallel port (usually used for printers, but also some scanners and storage devices) and a pair of PS/2 ports (small 6-pin mini-DIN sockets) – one for a keyboard and one for a mouse. You may even have USB ports, but these are useless for DOS.
We recommend using the PS/2 ports for your keyboard and mouse. PS/2 is quicker than serial, and there's a decent range of PS/2 kit available, including optical mice (no one wants to return to using analogue ball mice again, however nostalgic they are!)
You can even get some modern USB keyboards and mice working with old PS/2 ports via adaptors, but you'll need to do your research. A USB peripheral will need to internally support the PS/2 protocol in order for it to work over a USB adaptor, and many of them don't. With a bit of help from Google, you should be able to find out if you can use your USB keyboard or mouse with a PS/2 adaptor. If not, you can buy second-hand PS/2 peripherals cheaply on eBay – it's not as if you're going to need a 4,000dpi sensor to play The Secret of Monkey Island, or even Doom for that matter.
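The 2GB-per-drive FAT16 limit mentioned in the storage section is easy to work around with DOS's own tools. As a rough sketch – FDISK is menu-driven and its screens vary between DOS versions, so the sequence below is illustrative rather than exact:

```
rem In FDISK, create a 2GB primary DOS partition for drive C:, then an
rem extended partition filling the rest of the card, containing 2GB
rem logical drives that become D: and so on.
FDISK
rem Reboot, then format each new drive. /S copies the system files to C:
rem so the card remains bootable.
FORMAT C: /S
FORMAT D:
```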
SOUND BYTES
Finally, we come to the sound card. As we mentioned
earlier, you want a 16-bit ISA sound card for DOS. The
basic standard for DOS games is the Creative Sound
Blaster Pro, which combines FM synthesis for music with
the ability to play back 8-bit sampled sound – it’s great
for a game such as Doom, so you get music and demon
growls, gunshots and explosions. It’s also compatible with
the first Ad-Lib products, which provided FM synthesis
(but no sampled sounds) and are supported in quite a few
DOS games, particularly early ones.
There are plenty of non-Creative 16-bit ISA cards that have Sound Blaster Pro compatibility and can be picked up quite cheaply – just make sure you can get the drivers for them. We're using a Creative Sound Blaster 16, which has full compatibility with the Sound Blaster Pro, and can also play and record 16-bit sound.
There are advantages to buying a better sound card, though, as you can massively improve the quality of the synthesiser music in some games. Creative's AWE32 and AWE64 cards have much better synth sounds than the Yamaha OPL2/OPL3 FM synthesiser sounds used by most cards at the time. Again, Doom is a great example of a game that sounds much better with one of these cards.
Another alternative is to use the MIDI interface on the Sound Blaster 16. The early Sound Blaster 16 and AWE32 cards (but not the later ones) had a wavetable daughterboard connector, to which you can attach a secondary synthesiser card. If you can find a Yamaha DB50XG, the sounds are amazing, and there are plenty of other decent-sounding daughterboards too. Once it's plugged into the wavetable connector, your daughterboard will then just output its synthesiser sounds through the line out. This setup will only work on games that can play music data through the MPU-401 interface, but quite a few do, and they'll sound better for it.
You can also use the 15-pin joystick port on the back of the Sound Blaster 16 for MIDI, via a 2 x MIDI (5-pin DIN) to 15-pin cable, which you can buy on eBay. A popular external MIDI box at the time was the Roland MT-32, which is supported in some DOS games, including King's Quest IV.

An IDE CompactFlash adaptor makes for easily swappable storage that's also comparatively fast and reliable
Creative's Sound Blaster 16 will happily provide FM synthesiser music in games, as well as sampled sound

BUILDING TIPS
So you've got all your bits and pieces, and because we're using an ATX case, PSU and motherboard, much of the build process is similar to making a custom PC now. However, there are a few key differences.

JUMPER ROUND
The first warning is that your motherboard's BIOS will be very different from today's user-friendly EFI systems, and even the BIOSes found on boards ten years ago. In fact, for the moment, forget about the BIOS, and instead look at the various jumpers and switches on your motherboard.
Firstly, there may be some DIP switches – plastic blocks featuring several little numbered on/off switches. Secondly, look for jumpers. Jumpers are small sets of pins with movable, conducting tops, which can be swapped around to connect pairs of pins, acting as switches.
Before you change anything, look for any tables printed on your motherboard that outline the position of the switches and what they mean. If you can't find them, try to find your motherboard manual online, so you can see what all the switch settings mean. Getting this wrong can genuinely result in you accidentally overvolting or overclocking your hardware and cooking it. These could be perilous times for PC building!
Now you need to set the switches and/or jumpers to meet the voltage, bus speed and multiplier for your CPU. In the case of our 166MHz Pentium MMX, that means a voltage of 2.8V, a 66MHz front side bus and a 2.5x multiplier. You'll also need to check any jumpers or switches for the memory – some motherboards require a switch to be set to use SIMMs or DIMMs, or to set the memory speed. Triple-check all your switches and jumpers before you install your CPU.

CPU INSTALLATION
Physically fitting the CPU is one area that hasn't really changed over the past 25 years. In the early PC days, CPUs were sometimes soldered into motherboards, rather than using sockets, and it wasn't until the 486 era
Installing a Socket 7 CPU is very similar to fitting an AM4 chip today
Some low-powered Socket 7 CPUs don't even need a fan, but you'll need one for a Pentium MMX. Ours screws directly into the heatsink fins
An Akasa 3.5-to-5.25in bay adaptor enables us to fit a floppy drive in our Fractal Design Define R5. Yay, Lemmings!

drive (and on our CompactFlash PCB) will be a jumper, which can be switched to M, S or CS, with the latter standing for 'cable select', although we recommend using the 'master' or 'slave' options for certainty.
Set this to 'M', unless you're running two drives on one cable, in which case set the faster drive to 'M' and the slower drive to 'S'. If you don't do this properly – for example, by putting two 'master' drives on one cable – the system may not boot. You can then connect your IDE cables. There will be two plugs with a short length of cable between them, and a longer cable going down to a third plug.

CARD GAMES
Finally, slot your graphics card and sound card into place – put the graphics card in the top PCI slot, and the sound card in one of the bottom ISA slots. ISA cards have the PCB on the other side of the backplate from PCI cards, with their chips facing the top of the case, but you fit them in the same way. Slot them in place, and secure their backplates with your case's screws.

PLUG IT IN, PLUG IT IN!
All your hardware is now basically installed – the final step is to plug in your keyboard, mouse, mains cable and VGA cable and start it up. Your BIOS should be set to boot from
INSTALL FREEDOS ON VINTAGE HARDWARE
Following our vintage PC building guide, K.G. Orphanides shows you how to get a retro PC up and running with FreeDOS
USING DOS
DOS is a command line operating system, and if you've ever used Windows' cmd, it will feel familiar. It's case-insensitive: commands, paths and file names don't have to be typed in UPPER CASE but are often styled that way. To run an executable file – which will typically have a .COM, .EXE or .BAT extension – just type its name without the extension. File and directory names are limited to eight characters and extensions to three, with longer names curtailed with a tilde (~). When you're finished with DOS, you just turn off the computer. Some older programs don't even have the option of quitting back to the command line.

Unlike standalone versions of MS-DOS, FreeDOS supports the FAT32 file system
The FreeDOS 1.3 RC3 live disk makes testing, formatting and installation a convenient menu-driven affair

EDITING DOS CONFIG FILES
As it loads, DOS looks for specific user instructions in files traditionally known as AUTOEXEC.BAT and CONFIG.SYS, in the root of your boot drive, whether that's a floppy or your C:\ partition. As we're using FreeDOS, these are actually called FDCONFIG.SYS and FDAUTO.BAT. FreeDOS includes a selection of useful drivers, such as ones for mice and CD-ROM drives, and these are already called in its boot files.
The easiest way to create or modify these files, assuming you're using a CompactFlash or SD card for your hard disk, is to simply mount your drive on your usual PC with a card reader and copy in the lines you need using a GUI editor, such as Notepad in Windows. If you prefer to write or edit config files under DOS, just use FreeDOS' EDIT command for a very capable MS-DOS editor with mouse support. If you want to comment out a line, put the word 'rem' in front of it. This is handy for troubleshooting and working out exactly what lines you need in your boot files.

DRIVERS
Although FreeDOS has some integrated drivers, you'll still need the manufacturers' drivers for your sound card, possibly your graphics card, and any non-standard interfaces or unusual input devices, such as specialist joysticks and Zip drives.
Your first stop for driver sourcing should be Vogons Drivers (vogonsdrivers.com), a spin-off of the popular and infinitely helpful Vogons retro gaming message board. The drivers generally come with full instructions and examples of the lines you'll need to insert in boot-time config files. Another useful collection of hardware drivers, this time with a focus on storage devices, can be found at Hiren & Pankaj's Homepage (hiren.info/downloads/dos-files).

The OS comes with FreeDoom, but real Doom works well too

FreeDOS' default FDAUTO.BAT file includes the most common SET BLASTER address line for Sound Blaster compatible cards. This will be enough in many cases, but you may still have to add the path to the actual driver yourself, as well as assigning your own MIDI settings. For example:

SET BLASTER=A220 I5 D1 H5 P330
SET MIDI=SYNTH:1 MAP:E
SET SOUND=C:\DRIVERS\SB16
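For reference, each field in that BLASTER line names one hardware resource. The breakdown below matches the example values above; the addresses, IRQ and DMA channels on your own card may differ:

```
rem A220 - base I/O address 220h
rem I5   - IRQ 5
rem D1   - 8-bit (low) DMA channel 1
rem H5   - 16-bit (high) DMA channel 5
rem P330 - MPU-401 MIDI port address 330h
SET BLASTER=A220 I5 D1 H5 P330
```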
TRANSFERRING DATA
If you’re using a CompactFlash or SD card-based DOS drive
and have a reader connected to your PC, you can just mount
your entire DOS drive under your normal Windows, Linux or
macOS operating system and copy any files you want to it. This
convenient approach makes it easy to get retro games you’ve
bought on gog.com or Steam onto your DOS drive – we tried this with the Steam versions of Quake and Ultimate Doom, and both games worked fine on our retro machine.
Alternatively, you can burn a load of DOS software to a data CD and transfer it the old-fashioned way. However, if you're using standard IDE hard disks, or you don't want to routinely open your DOS PC to load up its hard disk, you might want to add USB mass storage support to FreeDOS.
If your motherboard has the common UHCI-compliant host controller, then you're in luck, as FreeDOS includes Bret Johnson's USBDOS drivers (bretjohnson.us). We recommend just invoking them as needed to keep memory consumption down, rather than loading them in FDAUTO.BAT.
If your vintage system only has an OHCI controller, or if you're using a newer motherboard with an EHCI USB chipset, then you'll need Panasonic's multi-chipset USBASPI driver (custompc.co.uk/USBASPI) and the Motto Hairu USB Mass Storage driver (custompc.co.uk/Hairu) to mount your disks. To add an OHCI controller in FDCONFIG.SYS, add the following lines, modifying the driver paths as appropriate:

DEVICE=C:\DRIVERS\USBASPI1.SYS /V /O
DEVICE=C:\DRIVERS\di1000dd.sys

USB drives must be plugged in at boot time to be accessible.
Under DOS, you'll generally have an easier time of configuration if you stick with ISA cards, although we got the PCI Sound Blaster Live! 5.1 from 2000 working with some tweaking of its driver's CTSYN.INI file. If you run into IRQ or DMA conflicts, check your motherboard's BIOS settings – if in doubt, disable on-board components such as unused parallel and serial ports, and – especially if you're using PCI components – disable Plug and Play and enable Legacy Mode. Graphics drivers were far less important in the DOS era than now: if your card supports the VGA display mode, you

Pre-defined startup menus provide commonly required memory configurations, but you can add your own too

KNOW YOUR FREEDOS COMMANDS
DIR – List everything in the current directory. FreeDOS by default applies the /P command extension to pause when the screen is filled. Press space to see more.
DIR /W – Show filenames and extensions only, in a columnated list.
X: – Change to specified drive letter, swapping 'X' for the letter of the drive you want to access.
CD PATH – Change Directory to the specified directory name or path, replacing PATH with the name of the directory you want to access.
CD.. – Move back to previous directory
CD \ – Move to top level directory
MD NAME – Make a Directory called NAME
COPY X:\PATH\ X:\NEW\PATH\ – Copies files and directories from one place to another
MOVE X:\PATH\ X:\NEW\PATH\ – Moves files and directories to a new location
EDIT – The friendliest DOS text editor
RESET – You don't have to type reset to reboot your PC, but FreeDOS gives you the option
SHUTDOWN – Another optional FreeDOS command for the comfort of modern computer users
FDISK – DOS partitioning tool
FORMAT X – Formats the specified drive (replace X with the appropriate drive letter). This will erase its contents and ready it for use with DOS

CPU THROTTLING
If you're using a 500MHz PC from 2000 to run games from 1991, your processor will make older clock-cycle-fixed software run impossibly fast. FreeDOS includes the SLOWDOWN tool to counter this problem.
For Origin's Martian Dreams, for example, with an executable called MARTIAN.EXE, we just typed SLOWDOWN MARTIAN in its directory. You can then reduce speeds by pressing Ctrl and Alt together until you get the speed with which you're happy.

MEMORY MANAGEMENT
The classic DOS games came from a time when only 640KB of conventional (or base) memory could be directly used in MS-DOS 'real mode'. Even then, that was a tiny amount of
RAM with which to play, so methods of increasing available memory were rapidly introduced. These included a 64KB high memory area, expanded memory of up to 32MB (EMS) and extended memory of up to 4GB (XMS).
In FreeDOS, these memory areas are controlled by HIMEMX, JEMMEX and JEMM386, which are invoked in FDCONFIG.SYS. To free up extra memory, DOS users traditionally have to juggle extended memory management tools, load drivers into the high memory area, and winnow out unnecessary drivers until there's enough memory available to load your desired application.
As an alternative to using the old-school boot floppies that most gamers had at the time, we're going to use FreeDOS' integrated startup menu system. FreeDOS has already done a lot of the work for us here, creating high memory and JEMM386 expanded memory startup options.
If you need the maximum amount of conventional memory available, select option 1, Load FreeDOS with JEMMEX, no EMS (most UMBs) and max RAM free, which nets us 643KB of available conventional memory. If you're running one of the many 1990s games that require EMM386 expanded memory (their manuals will tell you if they do), you want option 2.

You can usually install straight from the CD without any fuss, but you can run FDISK from the live OS if you need more control over drive partitioning

MAKE YOUR OWN BOOT MENU
In FDCONFIG.SYS, a MENUDEFAULT section defines four numbered startup menus. We can add an extra option 5 like this:

MENU 5 - SB LIVE (JEMM386, HIMEM, NO USB)

In the same file, you can add specific lines to a chosen menu number by putting the number(s) and a question mark at the start of the line. For example, putting '125?' before a line means it will be included in boot options 1, 2 and 5 – we've added '5?' to lines that call HIMEMX, JEMM386 and FDAUTO.BAT to include those features in our new menu option.
In FDAUTO.BAT, a quick way to load drivers that only apply to your new menu option is to insert an 'if not' block just before :FINAL at the bottom. For example, the following lines enable a PCI Sound Blaster Live! if we select menu option 5, but skip straight past it if we select any other menu option:
IF NOT "%CONFIG%"=="5" GOTO FINAL
:SBLIVE
SET MIDI=SYNTH:1 MAP:E MODE:0
SET BLASTER=A220 I5 D1 H5 P330 T6
SET CTSYN=C:\DRIVERS\SBLIVE\DOSDRV
C:\DRIVERS\SBLIVE\DOSDRV\SBEINIT.COM
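Putting that together, the 'if not' block sits immediately above the :FINAL label that closes FDAUTO.BAT. A minimal sketch of the file's tail – the rest of your FDAUTO.BAT will differ:

```
rem Skip the Sound Blaster Live! setup unless menu option 5 was chosen
IF NOT "%CONFIG%"=="5" GOTO FINAL
:SBLIVE
SET BLASTER=A220 I5 D1 H5 P330 T6
C:\DRIVERS\SBLIVE\DOSDRV\SBEINIT.COM
:FINAL
```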
INSTALL A GAME
Software installation is usually blissfully simple under
DOS. From the late 1980s onwards, most games included
installers, so you just need to insert your install CD or floppy,
go to its drive letter (for example, type ‘a:’ at the C prompt to
go to your floppy drive) and run the installer, usually called
INSTALL or SETUP.
You’ll probably be asked to select your graphics mode,
sound card and choose an install location – this should be
drive C. The installer will copy over its files and tell you what
you need to run to play the game. You may need to do some
disk swapping during this process, and games with CD audio
will also require the disc to be in the drive while you’re playing
the game. Some games don’t have installers, but if you copy
If you’re using an IDE CompactFlash reader for your retro machine, you can plug all their files into a directory on your hard disk, you can usually
it into a card reader on a Windows PC to easily copy and edit files for your OS run them from that location.
91
EMULATE DOS ON
RASPBERRY Pi
K.G. Orphanides shows you how to use the powerful DOSBox-X emulator to
boot Raspberry Pi to DOS, and run anything from Windows 3.11 to classic games
Graphical user interface? Pah, luxury! When us PC gamers were young, we had to type in text at the DOS prompt. Of course, you can already buy some classic DOS games readily from GOG and Steam that have been tweaked to run from Windows 10. However, if you want to get the authentic DOS experience, where you have complete control over your system, you have to either run an emulator or build a machine based on old hardware.
We're going to take you through the latter next month, where we'll show you how to build a machine that natively runs DOS games. Another alternative, though, is to use an emulator, and Raspberry Pi makes a great platform for this if you want to make a dedicated machine that boots straight into DOS, particularly because of its low cost.
The extra oomph of the 4GB or 8GB edition of Raspberry Pi 4 provides plenty of power for emulating classics of the past in DOS, and that even goes as far as installing and running early versions of Windows.
In this tutorial, we'll show you how to emulate PC software from the DOS era using DOSBox-X. If you don't need DOSBox-X's menus or extra features, though, the standard version of DOSBox 0.74-3 available in the package repository is a handy alternative. Just type sudo apt install dosbox. You'll find its config file in /home/yourusername/.dosbox
Forked from the original DOSBox emulator (dosbox.com), DOSBox-X has more precise hardware emulation, supports a wider range of software, and can effectively run more DOS-related operating systems (up to Windows ME). It also has a sophisticated graphical interface to help you manage tasks such as configuration and virtual disk-swapping. In this guide, we'll show you how to make a Raspberry Pi system that boots straight into DOS.

1 / CREATE YOUR DOS DIRECTORIES
Let's create the directory structure to house the software we're going to run through DOSBox-X:

mkdir -p dos/{floppy,cd,games}

The floppy and cd directories will house disk images, and we'll be able to switch between them in DOSBox-X. This tutorial and our template config files presume you'll keep all your DOS files in a /home/pi/dos/ directory, so be sure to change any paths if you're using a different username or DOS directory names.
While our generic config file should handle most DOS software well on a Raspberry Pi, you can also create separate .conf files for specific programs, in order to better match their requirements and automatically run commands.

2 / TWEAK YOUR GRAPHICS
Assuming you're using a standard 1,920 x 1,080 display with your Raspberry Pi, you'll find some more demanding DOS software struggles at full resolution, particularly if you have DOSBox-X configured to use OpenGL and aspect ratio correction.
On the desktop, open the main menu, go to Preferences and select Screen Configuration. Right click on your display – most likely marked HDMI-1 – and select 1,280 x 720 from the Resolution menu. Running your entire GUI

COPYRIGHT
DOSBox is an emulator and we use it with open-source FreeDOS code. Be mindful of copyright when downloading files for DOS software, and only use proprietary software that you own and in accordance with the licence terms. custompc.co.uk/dosboxlegal
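As an example of the separate per-program .conf files mentioned in step 1, here's a hedged sketch for a hypothetical game install – the directory name, executable and settings are illustrative, not from the article:

```
# doom.conf - per-game DOSBox-X config (illustrative example)
[cpu]
# raise this to max if you experience lag or juddering audio
cycles = auto

[autoexec]
# mount the DOS tree, then launch the game automatically
mount c /home/pi/dos/
c:
cd games\doom
doom
```

You'd then start it with dosbox-x -conf doom.conf.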
4 / EXPORT A CONFIG FILE
Restart DOSBox-X and tell it to generate a config file
that we can later modify in a text editor, based on the
program’s default settings, and then exit.
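The article doesn't reproduce the exact command at this point, but in DOSBox-X a config file is typically written out with the internal CONFIG command at the emulator's Z:\> prompt – check config /? in your build, as the supported switches vary between versions:

```
rem Write the current settings, including all options, out to a file
CONFIG -ALL -WRITECONF DOSBOX-X.CONF
rem Quit back to the desktop
EXIT
```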
pi-dos.conf
> Language: DOSBox-X config file
DOWNLOAD THE FULL CODE: custompc.co.uk/PiDOS

# Basic DOSBox-X config for 90s DOS software on Raspberry Pi.
# See default config file and https://github.com/joncampbell123/dosbox-x/wiki for further documentation

[sdl]
# set fullscreen true if you want to boot to an authentic-feeling DOS environment
fullscreen = false

# Don't forget to set Raspberry Pi's desktop resolution to 1280x720
fullresolution = desktop

# opengl allows aspect ratio correction
output = opengl

[render]
# set frameskip to 1 or 2 for resource-hungry titles
frameskip = 0

# aspect ratio correction
aspect = true

# choose your favourite. Don't use scalers on games that already have
# high resolutions. Set scaler to none to improve performance.
scaler = advmame3x

[cpu]
# use normal core for multitasking OSes such as Win95
core = dynamic

# some software benefits from emulating a specific CPU, which can be specified here
cputype = auto

# if you experience lag or juddering audio, set CPU cycles to max.
cycles = auto

[autoexec]
# Your DOS autoexec.bat file. These commands will be run at startup,
# making it easy to mount lots of floppies or CDs at once, as well as
# your working directories.

mount c /home/pi/dos/

# uncomment and customise these lines to mount floppy and CD images.
# Remember that DOS isn't case sensitive, but Linux is.

# imgmount a "/home/pi/dos/floppy/disk1.img" "/home/pi/dos/floppy/disk2.img" "/home/pi/dos/floppy/disk3.img" -t floppy
# imgmount e "/home/pi/dos/cd/a directory with spaces in/sherlock.iso" /home/pi/dos/cd/quake/QUAKE101.cue -t iso -fs iso

c:
You may find that the interface of Windows 3.x feels rather alien

MS-DOS, although you can install and run MS-DOS from a disk image if you own a copy.
Navigation through DOS directories isn't too different to using a Bash terminal, particularly as a number of Bash commands have been included, such as LS as an alternative to DIR in DOS. To run a .com, .exe or .bat file, just type its name without the extension.
To capture and release your mouse, use the LEFT-CTRL+F10 shortcut. The autolock entry under SDL config enables capture-on-click.

RPG classic Worlds of Ultima: Martian Dreams is legally available for free from GOG.com, but you'll have to use innoextract 1.8 (constexpr.org/innoextract) to pull the files out of it

9 / WINDOWS 3.11
Now we're going to install Windows for Workgroups 3.11, released in December 1993. The biggest challenge is finding a copy of Windows 3.11 to install – that usually means aging floppy disks, or disk images if you had the foresight to make backups. We're working from a set of disk images.
If you don't already have one, and don't fancy the second-hand market, you can, surprisingly, find it included in Microsoft Visual Studio Subscriptions (formerly MSDN Subscriptions), currently priced at £33.54 per month, for the benefit of developers working on backwards compatibility.

Windows 3.11 will cheerfully run either on top of DOSBox-X's default FreeDOS operating system, or installed with DOS 6.22 on a dedicated hard disk image

10 / INSTALL WINDOWS
Copy the contents of each installation disk or image to a /win311 subdirectory of the dos directory tree we made earlier; you can do this as you normally would on the desktop or at the command line, or by using DOSBox-X's IMGMOUNT to mount them and using the DOS COPY command while switching disks. At the command line, start DOSBox-X with a Windows-suitable config file – download ours from custompc.co.uk/PiWin

dosbox-x -conf win311.conf

CD WIN311
SETUP

Windows 3.11 will install itself. Reboot. Then, to start Windows:

CD WINDOWS
WIN

11 / USING WINDOWS 3.X
If you've only ever used Windows 95 or later, the interface of Windows 3.x may feel rather alien. There's no Start button, and if you want to quit back to the DOS prompt, you have to open Program Manager's File menu and select Exit Windows.
The default Program Manager folders, each of which is full of shortcuts to helpful software and settings, are clearly labelled. To explore your mounted DOS drives, open Main and then File Manager. Accessories include Paintbrush (the precursor to MS Paint), a Sound Recorder and even a Media Player. A line at the top left of each opened window allows you to move and close it, and you'll find minimise and maximise buttons on the top right of each window.

12 / BOOT RASPBERRY PI TO DOS
Once you've configured DOSBox-X – and any relevant window managers – to your satisfaction, you can complete your pitch-perfect 1990s PC simulation by booting straight to DOS. Open a Terminal window and type:

mkdir /home/pi/.config/autostart
mousepad /home/pi/.config/autostart/dosbox.desktop

Add the following to the new text file:

[Desktop Entry]
Type=Application
Name=DOSBox
Exec=/usr/bin/dosbox-x

This will use DOSBox-X's default config file. You'll need to enable fullscreen in your DOSBox-X config for this to launch correctly, and we strongly advise enabling opengl-dependent aspect ratio correction.
THE COMPUTERS THAT MADE BRITAIN – OUT NOW
"The Computers that Made Britain is one of the best things I've read this year. It's an incredible story of eccentrics and oddballs, geniuses and madmen, and one that will have you pining for a future that could have been. It's utterly astonishing!"
– Stuart Turton, bestselling author and journalist
Available on