“Digital archaeologists” excavate the microprocessor that ushered in the home computing revolution
In 1974, a group of eight design engineers left their jobs at the semiconductor company Motorola to create a low-cost computer microprocessor at a competing company, MOS Technology. Within a year, the team had built the MOS 6502, a tiny wafer of silicon and metal smaller than a person’s pinky fingernail. The new central processing unit (CPU), which is essentially the brain of a computer, would revolutionize its industry by enabling computers to come into the home. The 6502 was inexpensive and easy to program—two features that ultimately helped it sell tens of millions of units.
Those chips (or minor variations of them) eventually found their way into a number of classic computers and game consoles, many of which were the first to appear in homes in both the U.S. and the U.K. in the late 1970s and early 1980s. They could be found in Apple Is and IIs, Commodore PETs and 64s, BBC Micros, Atari 2600s, and Nintendo Entertainment Systems. The chip’s influence also enabled the mobile computing of today—the British company ARM makes microprocessors inspired by the simple elegance of the 6502 for devices such as the iPhone, BlackBerry, and Android smartphones.
Back in 1974, the original schematic for the 6502 was sketched out by hand on a drafting board. (In contrast, today’s microprocessors are designed by hundreds of engineers collaborating across hundreds of computers, leaving behind archived digital files of their work.) The creator of the 6502’s schematic doesn’t know where that document is today, and very little information on how the chip was created survives. Further, in the more than 35 years since its design, the understanding of how this remarkable chip performed its functions was lost.
“The 6502 is the last of that generation where processor manufacturing was a work of art,” says Barry Silverman, a Toronto-based software consultant and part of a three-person team that reverse-engineered the 6502 to determine how it worked and to preserve it for posterity. “In artifact terms, you might have a lot of examples of a particular piece of pottery, but the way it was created is gone. Even though it hasn’t been that long, it’s quite rare to find someone who remembers exactly what they did more than 30 years ago.”
The team behind the conservation of the 6502 was Silverman, his brother Brian, who is president of a Montreal company that designs digital education experiences for children, and Greg James, a graphics software engineer based in San Francisco. To accomplish its task, the trio treated the chip almost as if it were a dig site. They “excavated” the 4-by-3.5-millimeter chip, took high-resolution photographs of its layers, and mapped its circuitry. Their historical preservation work culminated in a website called Visual 6502 (www.visual6502.org), which hosts a simple simulation of the chip at work, allowing visitors to understand how electrical signals flow through the chip to accomplish the mathematical computations that drive a computer’s function.
The members of the Visual 6502 team refer to themselves as “digital archaeologists,” a term that Christopher Witmore, an archaeologist at Texas Tech University, agrees is accurate. “Even to say ‘excavation’ is quite appropriate here because you have to dig down through the components, you have to unpack it and take it apart,” he explains. “So much of it is lost, meaning it’s wide open for archaeologists to engage.”
Bill Mensch refers to himself as a “tall, thin man,” a term that among the computer engineering set refers to a person who understands how a microprocessor works from the silicon level to the system level. Mensch was one of the primary designers of the 6502 and was part of the cadre of former Motorola employees who defected to the Pennsylvania-based MOS Technology in late summer 1974, led by Chuck Peddle, whose idea for a low-cost CPU was rejected by Motorola top brass. In particular, Mensch was responsible for the design of the chip’s circuitry.
The CPU is essentially a maze of circuits mounted on a silicon wafer. Dotting the circuits are transistors, junctions of wires that act as switches, which can open or close off a particular pathway. The microprocessor reads an input from a particular program (anything from an operating system to a game), performs calculations as required, and then writes its output to the computer’s memory. Essentially, it’s the master of ceremonies, deciding what to focus on, making sure each step is followed, and presenting various results—sending them to memory, a monitor, or a printer.
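That read-compute-write loop can be sketched in a few lines of Python. This is not 6502 machine code—the instruction format and addresses below are invented for illustration—just a toy model of a processor fetching an instruction from memory, performing the calculation, and writing the result back.

```python
# A toy model of the CPU cycle described above: fetch an instruction
# from memory, perform the calculation, write the result to memory.
# The instruction format here is invented for illustration only.

memory = [0] * 16                  # a tiny 16-cell memory
memory[0] = ("ADD", 1, 2, 3)       # instruction: mem[3] = mem[1] + mem[2]
memory[1], memory[2] = 40, 2       # input data

def step(mem, pc):
    """Execute the single instruction stored at address pc."""
    op, src1, src2, dst = mem[pc]
    if op == "ADD":
        mem[dst] = mem[src1] + mem[src2]
    return pc + 1                  # advance to the next instruction

step(memory, 0)
print(memory[3])                   # -> 42
```

A real 6502 does the same thing with binary opcodes and a handful of registers, but the rhythm—read, compute, write—is identical.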
Mensch drew the entire layout of the chip on a single sheet of paper that he says was likely about 3.5 by 4 feet in size. Designers at companies such as Xerox created sprawling schematics of up to hundreds of pages with different sections of a chip on each. His method, he says, guaranteed that the logic flow (specifically, how steps of process control and arithmetic are performed by the chip and then passed along) matched with the wiring of different transistors and circuits on the microprocessor. It’s a “what you see is what you get” approach that means, despite the original diagram being lost, the excavation of the chip by the Visual 6502 team would be able to clearly demonstrate how it functioned. “If anybody really studies Visual 6502 in detail,” Mensch explains, “what they’ll find is that everything was strategically located at its best position on the chip.”
When it debuted in 1975 at the Western Electronic Show and Convention at San Francisco’s St. Francis Hotel, MOS Technology’s 6502 was four times faster and two to four times smaller than competing chips offered by Motorola and Intel. It was also roughly a tenth of the cost, being sold for $25 apiece out of “a big old Mason jar.”
Soon, the 6502 would become ubiquitous. Apple Computer cofounder Stephen Wozniak was among those who picked up a couple of chips. “I would credit Apple and Wozniak for popularizing the 6502,” says Mensch, adding that personal computing took off thanks to the Apple II’s expansion slots that allowed consumers to add memory or install an extra floppy disk drive. Though Apple was among the first to incorporate the 6502, it wasn’t the best-selling brand to use the chip. In the mid-1980s, casual computer consumers favored the Commodore family of home computers, which also ran on a version of the 6502. But the Nintendo Entertainment System outsold every other device that the 6502 appeared in, combined, moving close to 62 million units.
The 6502’s profile extended to pop culture, where it apparently powered two well-known fictional robots: the Terminator and Bender from the animated series Futurama. In the 1984 film The Terminator, scenes shown from the perspective of the title character, played by Arnold Schwarzenegger, include 6502 programming code on the left side of the screen. In a 1999 episode of Futurama, it’s revealed that Bender’s brain is powered by a 6502. Executive producer David X. Cohen has said that his fondness for the chip came from programming video games on his Apple II Plus in high school.
According to Mensch, through the mid-1980s, beginning computer engineers learned the craft of microprocessor design by studying the 6502. Today, while chip designers may appreciate the simplicity of the 6502, they design only discrete parts of the CPUs. The era of the tall, thin man is over, says Mensch.
In 2009, while browsing a retro computer parts website, Greg James saw two 6502s on sale for $10 each. He bought them both. He’d recently cleaned out his garage and stumbled upon an Atari 2600 and an Apple II, two machines that had “played a big part in my childhood.” He credits the former with teaching him that computers were fun and the latter with introducing him to programming. When he realized that both ran on essentially the same chip—the Atari contains an MOS Technology 6507, a 6502 in different plastic packaging—he started to research the microprocessor, eventually tracking down an incomplete schematic that he thought he could improve on to determine how the chip worked.
To analyze and then preserve the 6502, James treated it like the site of an excavation. First, he needed to expose the actual chip by removing its packaging of essentially “billiard-ball plastic.” He eroded the casing by squirting it with very hot, concentrated sulfuric acid. After cleaning the chip with an ultrasonic cleaner—much like what’s used for dentures or contact lenses—he could see its top layer.
The 6502 has three basic layers. The bottom layer is a wafer of silicon known as the “substrate.” Above it is a thin layer of polysilicon wires that form transistors and build circuits around the chip. The top layer is thick metal wiring primarily for supplying power. Its bulky structures obscure the polysilicon’s complex maze of wiring. Wires in a single layer can’t cross over one another, so connections can be made between layers to clear the cobweb of polysilicon and pack circuits closer together.
After photographing the chip’s topmost layer, James removed the metal using phosphoric acid mixed with acetic acid and nitric acid heated to 120 degrees Fahrenheit. Once the metal was gone, he took another photograph. “That was the money photograph,” says James of the moment when, in a real-world excavation, archaeologists can observe a landscape of artifacts, like canals or foundations of homes. James went one step further, removing the polysilicon layer with hydrofluoric acid, so that he could capture an image of the bare substrate.
Once he had all three photographs, he enlarged them to thousands of times their actual size and aligned them. He then traced them, creating a complex network of lines like a dense map of roadways drawn by Google or MapQuest. The virtual map includes the precise position and shape of each component in each layer of the chip, clearly identifying components like metal wires, transistors, and vias (holes in layers that allow wires to pass through and connect two levels).
James sent these full circuit extraction drawings to Barry and Brian Silverman. The brothers translated James’ circuit model into an inventory of the 6502’s components and connectivity (spelling out which component is connected to which other ones). This detailed list, called a “netlist,” is essentially the 6502.
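A netlist can be pictured as a simple table of components and the wires they touch. The miniature example below is entirely hypothetical—the names are invented, and the real 6502 netlist records thousands of transistors—but it shows the kind of connectivity bookkeeping involved: each transistor is listed with the wire that controls it (its gate) and the two ends of the switch it forms.

```python
# A hypothetical, two-transistor "netlist" for illustration only.
# Each entry records which wires (nets) a transistor is attached to.

transistors = {
    "t1": {"gate": "clk",    "c1": "vcc",    "c2": "node_a"},
    "t2": {"gate": "node_a", "c1": "node_b", "c2": "gnd"},
}

def on_net(net):
    """Connectivity query: which transistors does this wire touch?"""
    return sorted(name for name, t in transistors.items()
                  if net in t.values())

print(on_net("node_a"))            # -> ['t1', 't2']
```

Answering such queries for every wire is, in essence, what the Silvermans’ netlist makes possible.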
The Silvermans then created a simple web-based simulation in which the virtual chip is turned on and allowed to run. A signal sent to a single input of the virtual chip causes certain transistors to flip on and off, which is shown in the simulation by changing the transistor’s color. These switches trigger other transistors to flip, causing a cascade as information steps through the chip. Eventually, the switches settle and the signal dies out. Then a new signal starts and runs a different course. How each cascade proceeds reveals how different parts of the chip are connected, and the state the chip settles into afterward shows how a particular computation was carried out.
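The cascade-and-settle behavior can be sketched with a toy circuit. The example below is loosely modeled on the simulator’s approach, not taken from it: the wire names are invented, and each “stage” is simply an inverter whose output is the opposite of its input. Flipping the input causes each stage to flip in turn until no wire changes—the cascade dies out.

```python
# A toy cascade: a chain of inverters. Flipping the input ripples
# through the chain until the circuit settles, mimicking (in miniature)
# how signals step through the simulated 6502. Names are illustrative.

chain = ["in", "n1", "n2", "out"]
state = {"in": 0, "n1": 1, "n2": 0, "out": 1}   # a settled starting state

def settle(state):
    """Propagate changes until no wire flips (the cascade dies out)."""
    changed = True
    while changed:
        changed = False
        for src, dst in zip(chain, chain[1:]):
            want = 1 - state[src]          # inverter: output = NOT input
            if state[dst] != want:
                state[dst] = want
                changed = True
    return state

state["in"] = 1                            # send a new signal into the chip
settle(state)
print(state["out"])                        # -> 0
```

The Visual 6502 simulator does the same thing at the scale of thousands of transistors, recoloring each one as it flips.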
One Bay Area 6502 fan who saw the simulation obtained the netlist from the Visual 6502 team and fed the description into a “chameleon chip” called a field-programmable gate array, which consists of many transistors that can be programmed to connect in different ways. By lending the chameleon the characteristics of a 6502, he was able to hook it up to an old Atari 2600 and run games. “That means that we don’t need actual 6502 chips to drive old hardware or to study how old hardware works,” explains James. “We’re not crippled by the fact that the original 6502 is no longer being made.”
The pace at which the computer industry moves causes new technology to become obsolete within a matter of years. The more than 35 years since the release of the 6502 has seen a complete shift in the way people interact with technology. “Arguably every new technology transforms our rapport with our world,” says Witmore, the Texas Tech archaeologist. “They’re really prosthetics of humanity.” Think about a movie like Back to the Future. Marty McFly may have been overwhelmed by what he saw in 2015, but had his son from the future been suddenly transported back to 1985, he would have been just as befuddled when placed in front of a Commodore.
“The only thing that comes close to replicating the rate of growth in the electronics and computing fields is bacteria,” says Dag Spicer, a senior curator at the Computer History Museum in Mountain View, California. Indeed, since the release of the 6502, which contained 3,500 transistors, the sophistication of microprocessors has advanced by many orders of magnitude. In 1965, Gordon Moore, the cofounder of Intel, predicted that the number of transistors that chip designers could stuff onto a “single silicon chip” would double every two years at least until 1975. His prediction was accurate far beyond that point. Intel’s current top-of-the-line desktop computer microprocessor, the Intel Core i7, has more than 700 million transistors—right in the neighborhood of what Moore’s Law would predict. “Modern chips have something like 10 layers of metal all stacked up on each other,” James says, allowing for more transistors and more computing power.
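The comparison checks out as back-of-the-envelope arithmetic. Starting from the 6502’s 3,500 transistors in 1975 and doubling every two years gives roughly 17 doublings by 2010:

```python
# Back-of-the-envelope Moore's Law check: start from the 6502's 3,500
# transistors (1975) and double once every two years through 2010.

transistors = 3_500
doublings = (2010 - 1975) // 2     # 17 doublings over 35 years
transistors *= 2 ** doublings

print(f"{transistors:,}")          # -> 458,752,000
```

That lands at roughly 459 million—within a factor of two of the Core i7’s 700 million, which is about as close as an exponential prediction spanning 35 years can be expected to come.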
As these advances keep coming, the devices of the present quickly become relics of the past. “Digital media will not survive by accident,” explains Witmore. “If you leave a 3.5-inch floppy disk in a tomb next to a rolled-up papyrus, you can unroll that papyrus and engage with it in a way that you can’t with a floppy, which requires you to bring other materials to bear,” like a particular computer or knowledge of a chip capable of reading the data on the disk.
While there is no formal protocol for preserving our digital technologies, the Visual 6502 team is expanding its work to other chips, such as the Motorola 6800, which the 6502 undercut with its lower price point. James has also excavated and photographed the other two chips in his Atari 2600—one drove the graphics display and the other handled joystick inputs. One of the team’s future projects is to preserve an entire Commodore 64 system, which means not only excavating its chips, but also characterizing its motherboard, the circuit board that connects the CPU with the chips that control sound, inputs/outputs, and the disk drives.
“People take for granted that our digital artifacts are going to be preserved,” says Visual 6502’s Barry Silverman. “To preserve an exact copy is not that easy. It’s got to be an active process.”
Nikhil Swaminathan is a senior editor at ARCHAEOLOGY.