Keywords: PC gaming, frame rate, Origin Systems, id Software, computer performance, maintenance and repair

Chips of Theseus

Performance, Maintenance, and the PC Game, 1992-1997

Daniel Volmar (Independent Researcher)


Unlike other video-game systems, the PC has no standard unit, and games can behave very differently depending on how the individual machine is built and configured. When a game fails, it is not always obvious who is responsible: the designer, the user, or the producer of any one among many system components. Such failures are especially difficult to mediate in cases of poor performance, where the game does operate, but in a manner that disrupts the player’s subjective sense of interactivity. PC users have learned to rely on informal benchmarks that measure the frame rate, or graphical performance, of high-end games to troubleshoot problems, understand system capabilities, and agitate for fixes from developers and manufacturers.

This habit emerged in the early-to-mid 1990s, a time of exceptional instability in the PC market. Waves of new multimedia products inspired games that made over-ambitious use of them. Players expressed great aggravation at developers like Origin Systems, whose titles were difficult to get running even on cutting-edge hardware. But the relatively judicious games of id Software, which were furthermore transparent about their performance, found use as benchmarks and diagnostics, establishing, in themselves, what it meant for a game to run well.


Maintenance supposes a notion of proper functioning. In order to fix something, we must know if it is broken. What it means to be broken in the first place, however, can be subtle and contestable. A car that will not start is a problem for anyone, but a nominally undertuned air–fuel mixture is not observable—let alone meaningful—except to enthusiasts and professionals. The question of maintaining a video game’s performance is similarly idiosyncratic. Glitches, crashes, and slowdowns that one player tolerates may completely spoil the fun for another.

But where uncertainty exists, so does the urge to measure. Although many aspects of computer performance affect the player–game interaction, graphical frame rate, as measured in frames per second (fps), dominates discussion among players, as well as the concerns of developers, manufacturers, and, indeed, the entire industry. This number simply represents how quickly the game program is able to update the display. High frame rates give a greater impression of smooth motion, while low ones may appear rough and choppy, to the point of interfering with the player’s ability to respond to the game. When a title fails to deliver an acceptable frame rate under reasonable conditions, some may deem it broken or unplayable.1

Reductive as it may be, a fast and steady frame rate means a game performs well. Journalistic venues exist for communicating graphical performance data, while marketing sells customers on the promise of fluid 30, 60, 120—or even “uncapped”—fps experiences.2 This is certainly an aesthetic value but also a moral one: a means for ascribing blame for a perceived defect, and thereby, the responsibility for fixing it.3

Until the 1990s, however, graphical performance was a recondite point of video game programming. Players themselves rarely discussed it. This paper will show how the contemporary discourse grew from the concerns of PC users in that period. Unlike other gaming systems, the PC was not a standard unit, with no single point of technical sponsorship or control. Built and configured in chaotic varieties, PCs could make it very difficult to get a game running, and when the game did run, further difficult to decide whether the game was running as it should. Measurements of graphical performance, in the form of PC game benchmarks, helped players understand what to expect from their systems and who to blame when expectations were not met—game developers, hardware manufacturers, software vendors, or themselves.

The first section of this article considers the unusual place of the 1990s multimedia PC as a device that needed frequent upgrading to maintain compatibility with mainstream games. The aggressiveness with which developers made games for top-of-the-line hardware, and the enthusiasm with which customers anticipated them, created a high-pressure market for hype-worthy products that left many players confused, anxious, or excluded. The splendid but inhospitable RealSpace-engine games of Origin Systems are the subject of the second section, which examines a case of player pushback. The third section discusses the influence of id Software’s Doom (1993) and Quake (1996), two instances where players largely accepted the burden of keeping up as the fair price of progress. These games were, additionally, transparent about their behavior in a way that no others had been, inspiring communities to benchmark their performance in the interest of shopping, troubleshooting, and sheer amusement, with unfortunate consequences for budget chipmakers like Cyrix and AMD, whose products struggled to measure up.

Keeping Up a Multimedia PC

It is not obvious that the problem of achieving high performance in a complex system should be considered as a question of maintenance and repair. Where maintaining seems to concern the dichotomy between “broken” and “fixed,” high performance suggests a continuum of partially degraded states that may be satisfactory in some contexts and not others. According to Steven Graham and Nigel Thrift, however, maintaining opposes the Heideggerian insistence that a tool’s nature becomes “present” only when it ceases to serve the purpose for which it is otherwise held “ready at hand”—apt and uninterrogated.4 Their development leads David J. Denis to conclude that “studying maintenance and repair practices precisely consists in paying attention to all the overlooked situations that take place in the interstices of routine and breakdown, situations in which technologies are never completely functional and never completely broken.”5

Indeed, breakdown is only one among numerous forms of presencing of which the modern computer is capable due to its astonishing flexibility as a multifunction device: a business device, a communications device, an entertainment device, and so on—roles it can perform simultaneously and sometimes in conflict with one another. Speaking generally, these functions arise from software programs, each of which imposes a unique workload on the system, with many possible computational behaviors, even when the computed output is the same.6 A majority of games, moreover, belong to the particularly exacting class of real-time software, which, in addition to unplanned failures, continually fixes the user’s attention on system artifacts like loading screens, error screens, hitches, glitches, time outs, latency, and many kinds of visual anomalies. While players express varying sensitivities to these phenomena, they are inherent in the design of the digital electronic computer and are perceived, in their worst cases, as defects.7 In some sense, then, to play a game is to be always reminded of the machine playing it.

Such reminders spur some people to seek self-made means of improving the experience. Rather than label this enterprise with a bulky title like “maintaining high performance,” we can turn to scholarly precedent for a better one already on offer: hot rodding. In his 2008 monograph, The Business of Speed, David N. Lucsko characterizes hot rodding—the practice of modifying ordinary automobiles for racing—as the domain of enthusiasts: “end users for whom the car is something more than a way to ferry themselves to work, shuttle their kids to soccer practice, haul groceries, or escape to the mountains on weekends.”8 For enthusiasts, the vehicle is an end in itself, an object that is regularly modified, rebuilt, and customized as a hobby, as an artwork, and as a competition. While other forms of automotive enthusiasm exist, what distinguishes hot rodding is its own benchmark principle: speed. From the robust aftermarket for Ford accessories and replacement parts arose, on Southern California lakebeds in the 1920s, a community of “enthusiasts [who] raced not in an oval but in a straight line: their only object was to find out just how fast their daily rides could go.”9

It is the occasional fortunes of the dual-use hot rod—adapted for racing, but still practical as transportation—that most interests Lucsko, as does the dual character of the industry that has supported this activity. Because hot rodding is naturally concerned with models that are common and affordable, the high-performance market is not materially segregable from the network of garages, scrapyards, and small manufacturers that comprises the general automotive-maintenance base.10 Thus, what appears to most drivers as a hazard and a fetish is, contrarily, a spontaneous subsidy on the upkeep of their own unflashy vehicles. “Almost without exception, early speed equipment manufacturers produced, advertised, and sold equipment intended for street use from the outset,” observes Lucsko of the era of the Model T. “In fact, on-road products were always as important to these companies as their racing lines, if not more so.”11

Many of the voices presented in this paper are those of PC enthusiasts, or power users, expressing an identical means–ends inversion. If playing a game always reminds them of the machine playing it, then it is better to be reminded by the improvements they have made with their own time and diligence. Likewise, the PC market has evolved a similarly dual character, with enthusiast products contributing to the development, sale, resale, and refurbishment of less expensive variants.12 Unlike an automobile, however—even one used for competitive purposes—a computer’s workload can and does change drastically over time, by virtue of the continuous complexification of software programs. Here, the hot-rod analogy begins to break down because roads do not have their speed limits regularly doubled, nor do they bank and bend themselves in increasingly severe angles.

In the world of PC gaming, today’s enthusiasm is almost always tomorrow’s mediocrity. Mainstream users must keep up with power users, whose consumption patterns become, in themselves, justification for more and more demanding software, with ever-steepening system requirements. Thus, in exercising their own agency over the machine, a privileged class of users has limited the choices available to most. It will therefore bear observing, throughout the pages that follow, the noxious, spiteful, techno-masculinist reasons for maintaining a PC, the act of which gives meaning to a community that has ironically styled itself, after the epithet of video-game critic Ben Croshaw, as the “Glorious PC Gaming Master Race.”13 If maintenance and repair pertains also to cultures of consumer enthusiasm, then its literature must stay mindful of these decidedly negative associations as well.

Gaming and the Mutable Machine

A PC is not quite the same kind of system as most other video game systems, including other types of personal computers.14 A game console is usually built as an integrated system wherein a single firm designs, tests, markets, and supports a whole product as a vertically coordinated industrial partnership. By long-standing practice, the primary manufacturer controls the rights, tools, and data needed to develop games or accessories through secrecy and restrictive license agreements. Moreover, the system unit is conventionally designed to limit unauthorized modification by either the customer or a potential business rival. This creates a powerful technical–legal relationship called a closed platform, where participants must pay economic rent to a proprietor who, in turn, enforces the capability, quality, and uniformity of the enabling product.15

The PC, by contrast, is a comparatively open platform, with few formal restrictions on who can build, sell, or modify a conforming product, and no single party in a dominant enough position to define the integrated whole. Consistency is achieved not by top-down design, but rather through the use of interoperable components—hardware and software intended to work as part of a complete system, but without strict reference to one. While industry working groups and business-to-business relationships do bridge certain gaps, their members remain in a state of fluid “co-opetition,” jockeying for priority among one another, yet mindful of their interdependence in promoting the overall health of the market.16

To understand this market’s structure, it will be convenient to review the architecture of the microcomputer kits that inspired the design of IBM’s original Personal Computer model 5150, released in August 1981. In its most basic form, such a kit consisted of little more than a metal chassis with screw mounts for a power supply and an inert circuit board called a backplane. The backplane was similarly rudimentary: its purpose was to provide a common electrical connection, or bus, between a number of plastic edge-connectors embedded with conductive leads. To assemble a working computer, the builder slotted the backplane with a selection of printed circuit boards, which separately included the system’s microprocessor, main memory, and interfaces for various input–output devices. There was, moreover, no software, which was left to the user to load by hand or other means.17

The success of these kits popularized the physical and electrical layout of a system bus, the usage of which supported dozens of small manufacturers offering hundreds of different types of expansion cards and modules. A market also emerged for software to control a microcomputer, including Microsoft’s BASIC and the CP/M operating system, sold by Digital Research. CP/M also introduced a key solution to the so-called bootstrap problem—where a program is needed for the purpose of loading other programs into memory—by way of the Basic Input–Output System (BIOS), a small program read from a ROM chip every time the microprocessor was powered on.

The vitality of this secondary industry for microcomputer parts and software ultimately convinced IBM to design the Personal Computer from many of the same components, and in much the same fashion as a hobby kit. IBM did socket the processor, memory, and BIOS directly into the backplane, but it retained the expansion slots to interface with floppy drives, display monitors, and other devices desired by the customer. (Microsoft DOS largely supplanted CP/M as the standard operating system.) This remains more or less the same basic layout used by the PC’s many descendants today, IBM’s implementation having been cloned multiple times by competitors as early as 1983. By the early nineties, there existed hundreds of PC manufacturers ranging from global conglomerates and national superstores to mail-order boutiques and bespoke local dealers, with hobby builds remaining a popular option as well.18

But for most, “buying a PC is unfortunately a complicated business,” according to Christina Erskine of PC Review. “No sooner do you settle down with a bunch of brochures and a general idea of what you want, than you are assaulted by the need to consider the precise specifications you need for each component, with differing standards for all of them, and then try to match these to the hundreds of configurations available without breaking your delicate budget.”19 With a priority on gaming, moreover, “the process becomes even more fraught, because, as you’re probably sick of hearing, the PC wasn’t designed with games in mind.” Indeed, it had not been. The original IBM PC had a notoriously limited set of features with respect to graphics, sound, and inputs, and while these capabilities had greatly improved since 1981, progress had been achieved largely through the bewilderingly fast-paced market for upgrade chips and third-party expansion cards.

And keeping pace was expensive. “When deciding what to go for,” PC Review advised its readers, “try to buy the most high-powered PC you can possibly afford. It will last longer before software begins to outgrow its capabilities, and run all programs faster.” But merely considering what a new computer could do at the time was insufficient; even then, Erskine urged leaving some flexibility to upgrade:

You’ll need two expansion slots free for a soundboard and joystick, even if you don’t plan to buy them immediately, so think ahead also to the time when you might want to add a CD-ROM drive. Is there room in the casing to add a second hard disk? The standard 40Mb model might seem vast now, but I assure you it will fill up quickly and it is a lot less hassle to add a second drive than to back up the first one and then replace it. How much more memory can be fitted on the board inside? There are more and more games coming out that require extended memory, so make sure you can add it later if you’re not buying it to start with.

The only simplifying advice was to avoid the PC’s “false economies,” such as forgoing a VGA monitor—a saving up front that would only add cost later.20

Once obtained, however, these supposedly interoperable parts were not all easy to keep working together. “Why is it that the PC is plagued with problems of the software outdating hardware and vice-versa?” user Yorkie moaned on the newsgroup in April 1993. “I have been using a good 386 for quite a while now and get really f****d off [sic] when I can’t use a particular mouse driver or I have to use EMM [expanded memory], and you can’t use this with that ’cos it bombs out!!!!!! THAT’S why PCs are awful.”21 This ubiquitous complaint referenced the notorious “640-kilobyte barrier,” a vestige of the original IBM PC that limited the memory available to DOS-based programs, which, in addition to requiring meticulous reconfiguration in order to play specific games, greatly complicated the task of programming, and thereby, the prevalence and severity of bugs.22 “I currently own an A500 and an A1200”—two models in Commodore’s Amiga series, a competing non-PC—“no problems with compatibility there. The A1200 pisses all over a 486 for speed and the CPU is only 14MHz, doesn’t it make you PC users sick to know that your machine is slooooowed by all the extra bits glued to the motherboard?????” While Yorkie’s factually flimsy bit of trolling actually found approval from aggrieved PC users, respondents noted that Commodore itself was commercially moribund, with few developers still publishing games for its systems. The company did indeed file for bankruptcy thirteen months later.23

While other video game platforms rode the fortunes of these vicious–virtuous cycles between player uptake and developer interest as well, the less-integrated nature of the PC subjected even its basic functions, like graphics and sound, to uncertain—and sometimes perverse—market outcomes. “In the early days of IBM gaming,” for instance, “one may as well have been hearing impaired,” as per Mike Weksler, Joe McGee, and the October 1993 issue of Computer Gaming World. “The chirps and beeps of the internal speaker added almost nothing to the gaming experience. It was the silent movie era for computer games.”24 Then, in 1987, “the introduction of AdLib’s FM synthesis was as dramatic a leap as from prehistoric cave paintings to the glorious age of portraiture epitomized by Gainsborough and van Dyke.” Mixed metaphors aside, the roundup covered twenty new sound-related expansion products from eleven manufacturers, with “more combinations of digital audio and synthesis than there are combinations of word processors and spreadsheets,” and prices ranging from $200 for a Gravis UltraSound to the $600 premium on the Roland RAP-10. “The success or failure of many of the products discussed in this survey will depend upon the marketing angles used by the manufacturer,” the authors concluded. “We are, quite frankly, amazed at all of the available products and the unique angles with which consumers are being lured,” including being “bundled excessively [emphasis in original] with multimedia software products which are superfluous for games, software which, if anything, raises the product price.”25 Thus, despite the myriad of new capabilities on offer, Computer Gaming World ultimately recommended compatibility with the aging Sound Blaster as the one indispensable feature in 1993’s crop of new sound cards.

* * *

Costliness, complications, and uncertainty typified the experience of maintaining a system as industrially disaggregated as the PC, with an application as temperamental as gaming in mind. “Keeping your computer up to date is always a challenge,” Charles Brannon wrote for PC Gamer in July 1995, “it’s like trying to hit a moving target,” because “the cost of upgrades—indeed, entire computer systems—always goes down, even as the demands for top-of-the-line games go up.”26 The advice that these periodicals gave their readers, and which players gave one another online, stressed the consumerist virtues of knowledge, patience, and above all, shrewdness.27 In Brannon’s case, the responsibility lay with the user to “take stock of your current setup and decide how much you need to upgrade … and upgrading a piece at a time may be the best way to go if you only have so much money to spend, at any given time. Deciding which upgrades make the most sense for you—and will make the most of your money—depends on you, on your system and the amount of money and effort you’re willing to invest.”

The consumer’s dilemma therefore depended on two personal assessments, one technical, and the other aesthetic and moral. In the first instance, ambiguity arose from the inherent complexity of computer performance and the difficulty of gauging how much spending would lead to a proportional improvement in game-related functionality. A machine conforming to the game’s printed requirements was already no guarantee, and while some requirements could be very strict, others were less conclusive.28 Knowing the difference was, in itself, a matter of judging whether a game would run well. And hence the aesthetic, or moral, questions: What did it mean for a game to run well? And who was at fault when it did not? For both parameters, the trend of the 1990s was to quantify graphical performance with greater thoroughness and precision.

The Problem of Good Performance

With a fully frame-buffered graphics mode, the PC differed significantly from contemporary video game systems, which, in general, could not wait arbitrarily long for a program to render the entire screen. These systems instead relied on rigorously timed repetitions of small memory elements, writing them out line-by-line as the television beam swept across the screen.29 The PC, by contrast, synchronized the display monitor to the output of a digital-to-analog converter (DAC) located on an expansion board, called the video card, or graphics adapter. This same board also carried a relatively large block of memory corresponding to the system’s frame buffer. When the screen needed to refresh—typically, once every 14.3 milliseconds—the DAC dutifully read out whatever data happened to be in the frame buffer. Thus, the programmer generated images by instructing the CPU to write the results of graphics calculations onto the bus and into the video card’s frame buffer.30 Though costly to implement at the time, the frame-buffering method is more flexible and better suited to the complex displays needed for graphical applications.

But video games are, programmatically, loops. So while frame buffering allows program execution to be timed somewhat loosely, games must still be designed to rewrite the frame buffer at a pace appropriate to the intended gameplay. If the loop has not completed by the time the screen must refresh, the same frame buffer will be read out again, duplicating the previous image. The longer the CPU takes to calculate a new frame, the more times the image will repeat. When such repetitions accumulate, they give the impression of slow or inconsistent motion, which, in addition to illusion-breaking, can hinder the player’s ability to respond. Predicting the flow of a multibranching loop is difficult in itself, and all the more so when, as in the case of the PC, players may attempt to run the game on machines with radically different capabilities, or even the same machine, but configured differently at runtime.31
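The relationship between loop time and perceived frame rate described above can be put in numbers. The following is an illustrative sketch, not code from any period source; it assumes the ~14.3-millisecond (70 Hz) VGA refresh mentioned earlier and a loop that takes a fixed time to render each frame.

```python
import math

REFRESH_HZ = 70.0              # VGA-era refresh: the DAC reads the frame
REFRESH_S = 1.0 / REFRESH_HZ   # buffer once every ~14.3 ms, regardless

def effective_fps(render_time_s):
    """Frames per second the player actually sees when the game loop
    needs render_time_s seconds to compute each new image."""
    # A new image can only replace the old one at the first refresh after
    # rendering finishes, so each image is scanned out for
    # ceil(render / refresh) consecutive refresh periods.
    repeats = max(1, math.ceil(render_time_s / REFRESH_S))
    return REFRESH_HZ / repeats

print(effective_fps(0.010))  # loop keeps up with every refresh: 70.0
print(effective_fps(0.030))  # each image is scanned out 3 times: ~23.3
```

The step function in `repeats` is why small slowdowns could produce disproportionately choppy motion: a loop that slips just past one refresh period halves the visible frame rate.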

Slow Graphics, Fast Fashion

Although it is beyond the scope of this paper to profile the performance of vintage hardware, recordings show that premier titles like Wing Commander (Origin Systems, 1990), Falcon 3.0 (Spectrum Holobyte, 1991), and Links 386 Pro (Access, 1992) ran much slower than the 24-fps standard for motion-picture projection, or the effective 25 or 30 fps of broadcast television (PAL and NTSC formats, respectively).32 Attempting to run them faster would, in many cases, break them. “Help!” William John Baker exclaimed on Usenet in June 1992. “I just upgraded my computer to a 486DX/33Mhz machine and my favorite games just run TOO DAMN FAST!”33 Clock-sensitive software was a common PC problem that most designs mitigated with a button or switch on the front panel, which toggled the system clock between a high state and a low one. It was a crude solution. “My board has a turbo switch,” continued Baker, “but when I slow down the board this way, everything is way too slow. So I’m stuck with way too fast or way too slow. Any suggestions?”

If Baker had reviewed the postings three weeks prior, he would have found that at least one of the games that vexed him, Wing Commander II (Origin Systems, 1991), could be restrained by a keyboard combination. “I believe it is ALT-+ or ALT-− on the numeric keypad which will increase or decrease the frames/second rate,” advised Norm MacNeil. “I played WCII on a 486-33 at 5fr/secs and finished ok.”34 (This setting actually corresponds to a frame time of 5/60ths of a second, or 12 fps.) Programmers gradually took greater care to ensure their games did not run impossibly fast on new hardware, but for older titles, players often had to work around it. In 1990, a Los Angeles–based graphic-art professional named David Perrell began distributing MOSLO.EXE, a DOS utility program that periodically halted the CPU, thereby simulating an older one. This solution was suggested widely online and in magazines, recommended officially by technical support staff, and sometimes packaged with the game itself.35
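The parenthetical conversion above can be made explicit. As a hypothetical reading (the article gives only the 5/60ths-of-a-second frame time; interpreting the setting as “ticks of a 60 Hz timer per frame” is an assumption consistent with that figure):

```python
TICK_HZ = 60  # assumed 60 Hz timer base underlying the setting

def fps_from_ticks(ticks_per_frame):
    # Frame time = ticks / 60 seconds, so the rate is 60 / ticks
    # frames per second.
    return TICK_HZ / ticks_per_frame

print(fps_from_ticks(5))  # 12.0, matching the 12 fps stated in the text
```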

As the capabilities of PC systems diversified, however, developers became more ambitious about pushing the latest hardware with demanding graphics, even when this inconvenienced—or outright disqualified—a large proportion of the player base. This trend was countered initially by some reverence for the platform’s identity as a niche for games of greater complexity and potential than the tile-based, controller-bound worlds of video game consoles and cabinets. In 1991, Chris Crawford, the eminent auteur, tried to make this case against a twenty-three-year-old Chris Roberts, lead designer of the Wing Commander series, at a session of the Game Developers Conference. “Resolved,” read the debate’s provocation, “the creation of good graphics should receive higher priority in the development process than the creation of good game play.”36 While Crawford hesitated to “emphasize the sizzle instead of the steak,” his gestures toward the “interactive fiction” of out-of-business Infocom brought heckles from the crowd. Roberts, on the other hand, seemed to deny that advanced graphics were necessarily at odds with quality gameplay, claiming, moreover, that graphics were what sold.

Roberts’s claims pointed more broadly to the mentality of the microcomputer industry, and the place of gaming within it. From the beginning, the language of the PC was that of “killer apps” and “early adopters,” the latter term taken from a 1962 publication, Diffusion of Innovations, by the sociologist Everett M. Rogers. Within the Silicon Valley power collective, a web of consultancies—that of legendary hype-master Regis McKenna, most of all—taught Rogers’s diffusion model as a unified theory of marketing. New products should be designed to win the attention of well-connected “opinion leaders,” to appeal to their respectability and self-styled good taste, or, in other words, to push computing like fashion.37

And modern fashion evolves quickly. While the Personal Computer line had initially distinguished itself as a business brand, once IBM lost control of the platform, the pace of business would no longer satisfy the PC’s leading marketeers.38 “There’s a new phenomenon in the corporate computing world that is becoming more common by the day,” Ed Foster wrote in a 1995 issue of InfoWorld, “the domestic power-user.”39 By this, Foster meant “the folks with multimedia Pentium systems with CD-ROM and scads of memory in their living rooms who turn up their noses at the 486SX you [i.e., their employers] offer them” at work. Meanwhile, the typical home PC filled many roles, and was likely shared by multiple members of a household. Despite their lowly reputations, flashy games put new hardware purchases to better use than spreadsheets did. Said Foster, “the only killer apps that have emerged for cutting-edge PCs are games.”

Origin Systems: “Punching the Envelope”

After more than a year of delays, Origin Systems finally released Strike Commander, Chris Roberts’s follow-up to 1990’s Wing Commander, in April 1993. A notional flight simulator, Strike Commander’s verbose plot followed a fighter squadron for hire through a near-future world filled with overpopulation and end-of-history tropes. After blowing past an initial deadline of Christmas 1991, the project grew in technical ambition to the point where memory and performance problems defeated a Christmas 1992 deadline as well. Despite extensive optimization, the team was simply unable to get the game’s texture-mapped polygonal graphics running smoothly, even on top-of-the-line hardware. One programmer, Frank Savage, recalled in an interview that: “with the best machines … we could get ahold of, with all the graphics turned on as high as they could go … we were lucky to get 15 [frames per second]. We used to joke that the only thing the game really was missing right now is a small bottle of shampoo. And everybody would look at us weirdly. What, what? Shampoo? So, yeah, what you do is you take the shampoo and you drop it in your eye, and then you can’t open your eye for very long. But, boy, when you’re blinking, those 15 frames per second—it looks amazing.”40 When Strike Commander finally did ship, however, reviewers had their eyes open, and some players pushed back.

Word began spreading online about Strike Commander’s performance issues a few days before its official release date. “I don’t know about everyone else,” Michael Kloss wrote, “but I wish that these people posting these vague things like ‘it sucks, it’s too slow’ without any sort of qualifiers would stop. I want to hear the facts about the game, THEN someone’s impression based on that.”41 Wild attempts at quantification had been made: “Moondawg said it ran about 20% faster than Falcon 3.0 did on [a] good rig.”42 Some offered dueling claims about Strike Commander running “smooth as glass with ALL the detail turned up full blast,” or “essentially unplayable for over 90% of [its] audience.”43 Jude Greer complained, “I got it this weekend, and I have a 486-33 Local Bus and at full detail, it SUCKS!!! I’m taking it back Monday,” and continued: “To be more specific, the frame rate is BAAAD … Everything is so incredibly jerky that it’s maddening to try to fly the plane when the smallest sidewards motion on the stick produces ~30 degree bank. This makes it impossible to bomb anything on the ground, and even dogfighting is terrible, and I found it to be the most enjoyable part of the whole game.”44 Prospective purchasers asked what to expect from their systems, while others made varied suggestions for getting the game to run faster. Before long, the conversation collapsed in on itself, with the game’s admirers digging themselves out from under the dogpile. “Most of all give it up guys. If you find it too slow on a damn 486/66 with VLB [VESA Local Bus] well I guess you just have too high expectations for anything.”45

The reaction to Strike Commander bears all the now-familiar markings of online toxicity, which does call its authenticity into question. Despite all the noise, and some tepid print reviews, the game sold well.46 “To say that people’s opinions on this game are widely divergent is an understatement,” observed Dave Chaloux. “Some think it is a 10. Others think it is unplayable on a 486/66 DX2 with Local Bus graphics and a 3DBench rating over 40. Another guy is playing and enjoying it on a 16 MHz 386.”47 The discourse reveals the extent to which, for some players, keeping up a well-tuned system was a pleasure in itself, one they expected to see gratified with exclusivist gameplay experiences. Admiring how the game made use of the player’s hardware—and the money and the time that had gone into it—could not be completely disentangled from enjoyment of the game itself. “Just from my stand point, if you are going to buy a game worth 60+ dollars, at least enjoy it at full detail level,” Xiangxin Shen said, contemplating the prospect of lowering graphics settings to improve performance. “Makes me feel I should have gotten a 486/66.”48

Elsewhere at Origin Systems, other teams struggled to build on the tools and libraries developed for Strike Commander, which the company called the RealSpace engine. These were smaller-scope projects intended to generate additional returns on prior engineering investments. But Pacific Strike, a World War II–themed turn at the Wing Commander formula, overran two deadlines—the second unexpectedly—resulting in an embarrassing situation where the game’s accessory speech pack appeared months before the game itself did. (“Heh, spent about 25 mins today wandering around at Best Buy looking for [Pacific Strike] to go with the speech pack.”)49 When the game finally did ship in May 1994, the reception was not warm. For Computer Gaming World, Tom Basham wrote that while “there are a few bugs … they are not the primary reason for discontent. The main problem is that the game is extremely slow.”50 Even on a very powerful system, “the program has more herks and jerks than a fuel deprived engine,” the severity of which often led Basham to wonder whether the game was loading in data or had simply crashed.

Though undocumented by the game’s manual, players discovered that Pacific Strike’s developers had exposed an internal frame-rate counter in the final release code. Basham reported: “Using -F to display frame rate inside the program, the testbed machine topped out at an underwhelming 13 frames-per-second with all detail options disabled, and dropped to an unplayable four frames-per-second during combat with detail levels maximized. Anything less than a Pentium simply will not provide a decent frame rate with any significant detail level, and a fast 486 is required to play even with detail options minimized.”51 The author found this unacceptable and called for Origin to fix it with postrelease patches. “Unless deep and fundamental changes are made to address these problems,” he concluded, “this is one game that could end up with the U.S. warships at the bottom of Pearl Harbor.” For Game Bytes, a digital disk magazine, Greg Cisko also needled, “I don’t believe the machine has been built that can run this game smoothly … How they could have tested Pacific Strike on a 486DX/33 and thought that it was okay is beyond me.”52 Cisko then proceeded to detail an exasperated call he had made to Origin’s product support line, portraying his hostility as journalism.

Forum users similarly realized that the frame-rate counter could put the lie to Origin’s marketing claims of 15-fps performance on target systems. On a top-shelf machine, David Masten estimated “4–5 fps in the crowded missions. This with sea and sky texture off. This I consider very POOR.”53 Players, like reviewers, demanded fixes from the developer. The backlash was such that, within a month of release, Origin arranged to have an “open letter” from producer Eric Hyman placed in several magazines: “Some players have expressed problems with Pacific Strike’s frame rate. Origin has always developed games to exploit the full capabilities of high-end computers, but we also understand that anything less than a perfectly fluid frame rate can lead to frustration … To that end, we are working on patches that will be available for downloading through online services where Origin provides product support.”54 But though one small update had already appeared, the promised improvements never did. “After much debate and research, we found that it would take a minimum of 6 months to make any significant patches for the game,” read a July 15 statement from Origin Product Support.55 Instead, customers were offered refunds or exchanges upon request, while Pacific Strike’s expansion disk and CD-ROM edition were both quietly dropped from Origin’s release calendar.

Damage to the company’s reputation was only momentary, though. That December, Origin debuted the RealSpace-powered Wing Commander III to high praise and prodigious sales. While still a demanding program, it fared better technically than previous RealSpace titles, owing in no small part to a deep-space setting that skirted one of the engine’s sore points: rendering ground terrain. Fans were nonetheless insistent in their concern over low frame rates, causing the development team to reveal how fast the game was running on test systems several months before launch.56 A CD-ROM demo was also produced in October, allowing potential purchasers to evaluate the game’s performance beforehand. “We have our problems at Origin, but we’re struggling to eliminate them as best we can,” read a letter from an anonymous employee published in the disk magazine Game Bytes. “We do listen to our customers and we know we’re always teetering on the bleeding edge of hardware, but in order to break new ground as everyone expects us to, we have to punch the envelope [sic]. We’re really sorry that means leaving a few of our fans in the dust.”57

Intentions aside, Origin’s culture of compulsive “envelope punching” had its costs. The company also broke new ground in imposing crunch time on its employees, worsened by frequent delays and postrelease fixes.58 The planned successor to Pacific Strike, a World War I–era dogfighter called Wings of Glory, had to be substantially reprogrammed after the Pacific Strike fiasco. “What it meant was that we were almost done with the game,” said producer Warren Spector, “and then we just pulled all the way back to Square One.”59 Converting the RealSpace engine from 16-bit memory addressing to true 32-bit protected mode took the Wings of Glory team more than six months to accomplish, but the release code, which shipped in February 1995, was judged to perform very well.60 In online forums, however, bitter fans continued to impugn the competence of Origin’s staff, with comments that bordered on public harassment. “Why should I have to buy a Pentium just because Origin programs like dead sloths???? Those idiots …” To this, Bill Armintrout, who had labored on Pacific Strike, evidently boiled over, retorting that “dead sloths get more respect.” The reply: “they deserve it more.”61

* * *

The games of Origin Systems were among the most highly regarded PC titles of their era. The care and inventiveness with which they were made is plain to see, but retrospect also reveals something deeply problematic in their conception. While software developers of all kinds have sought to maximize computer hardware, Origin leaned on its players’ commitment to computer ownership as well. “I guess it really just boils down to that we like to build the games we want to play, and we want to play the games that are as cool as they possibly can be,” producer Chris Roberts said. “I think we’re kind of guilty of pushing the hardware specs, but we reach for higher standards, I guess.”62

Such talk of “higher standards” may have sounded virtuous in the midst of the nineties multimedia-computer boom, but the consequences of falling short had to be absorbed somewhere. At Origin, they flowed downstream from star figures like Roberts to team members, product-support staff, and to players who had to decide how much time, money, and technical frustration these games were worth.63 Late in 1993, Johnny L. Wilson, editor of Computer Gaming World, characterized the recurring problem with Origin’s releases: “Aren’t we ever going to have computer games that we can just load and play?”64

The Benchmark of Leisure

Only with time has it become possible to see id Software’s Doom (1993), reputed as it was for violent excess, as a work of technical restraint. “Our game design starts by selecting a speed for the game on our target platform,” wrote lead programmer John Carmack, “then trying to get as many capabilities and as high fidelity as possible.”65 This approach was rather the opposite of what Frank Savage had described of Origin’s Strike Commander, whose visuals had needed to be urgently pared back in order to run acceptably even on very high-end machines. Doom, on the other hand, had been built with the goal of rendering at least 10 fps on a middling 386-based system. Within the bespoke world of PC gaming, Doom’s minimalist gunplay was sometimes criticized as a regression to the arcade hall, but its graphical prowess had been achieved precisely because of Carmack’s search for clever compromises that would make the most of the average CPU.

In short, Doom ran well, and it could prove it. As part of his bedroom marketing effort, Jay Wilbur, id’s business manager, had begun hyping the game’s performance in online forums more than a year before its release—indeed, before most of the team had even gone to work on it: “And the frame rate (the rate at which the screen is updated) is high, so you move smoothly from place to place, turning and acting as you wish, unhampered by the slow jerky motion of most 3-D games. The game plays well on a 386SX, and on a 486/33, the normal mode frame rate is faster than movies or television.”66 The claim would turn out to be inflated. On its release, no system could consistently run Doom at its designed limit of 35 fps—not without greatly sacrificing the image’s size and detail. Here, though, the precise numbers mattered less than the fact that they existed at all. While a handful of earlier games had simple (usually undocumented) frame-rate counters, Doom could be benchmarked, which is to say that its performance was both measurable and comparable across systems.

Beyond 3DBench

Benchmarking Doom was possible due to the same architecture that enabled the game’s multiplayer feature. The engine could redirect control inputs from the renderer to the network—for broadcast to other clients—and also to a file on the disk, called a demo file. During normal playback, the engine would fast forward simulated game states as necessary to compensate for the variable time the system might take to render each frame. As such, a game session could be recorded, copied, and replicated on another machine.67 Doom shipped with three such demos, which looped automatically whenever the game was started. Launching the program with an additional “timedemo” parameter caused the playback to reproduce every state, then print a duration to the console. Though not documented at release time, players discovered this functionality by examining the game’s executable file. Then, in May 1994, Carmack himself explained exactly what the function did, and how to interpret the output. “This is a real world benchmark,” he stressed, “because it executes EVERYTHING, not just rendering code.”68
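The arithmetic players used to turn a timedemo report into a frame-rate figure can be sketched as follows. This is an illustrative reconstruction, not id Software’s code; it assumes only the well-documented facts that Doom simulates at a fixed 35 tics per second and that timedemo mode reports how many recorded tics were played back and how long the playback took, measured in the same 1/35-second units (“realtics”):

```python
# Illustrative sketch of the timedemo arithmetic (not id's actual code).
# Doom's simulation runs at a fixed 35 tics per second; "-timedemo"
# renders every recorded tic as fast as possible and reports the
# wall-clock duration in "realtics" (1/35-second units).

TICRATE = 35  # Doom's fixed simulation rate, in tics per second

def timedemo_fps(gametics: int, realtics: int) -> float:
    """Average frames per second implied by a timedemo report.

    gametics: number of recorded simulation tics played back
    realtics: wall-clock duration of the playback, in 1/35 s units
    """
    return gametics * TICRATE / realtics

# A hypothetical demo of 2,134 tics that takes 2,100 realtics
# (60 seconds) to play back averaged about 35.6 fps.
print(round(timedemo_fps(2134, 2100), 1))
```

Because the playback reproduces every simulated state rather than skipping frames, a machine that renders faster than it simulates finishes the demo in fewer realtics, pushing the computed figure above the game’s nominal 35 fps cap.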

Benchmarking software was not a novelty on the PC, nor had Doom’s timedemo feature been intended as more than a simple way to test and debug the program during development. Evaluating computer performance is a notoriously difficult task, however, as it can be extremely precarious to relate the speed of simple operations, like integer addition or memory access, to the execution of complex programs, which are, in reality, subject to stalling, bottlenecking, cache misses, and a host of other phenomena that cannot be reliably demonstrated with a short code sequence. More than a delicacy for engineers, the subtleties of these so-called synthetic benchmarks are thoroughly exploitable for less-than-honest purposes, such as marketing. A further, “major drawback of synthetic benchmarks,” according to David A. Patterson and John L. Hennessy, two prominent computer architects, “is that no user would ever run a synthetic benchmark as an application, because these programs don’t compute anything a user would find remotely interesting.”69

By the mid-nineties, however, a number of microcomputer magazines—PC World and PC Magazine, among others—had developed their own real-world benchmark suites for use in product testing. But these tests relied on typical office-productivity software that gave little insight into gaming performance.70 The few efforts at benchmarking in gaming publications amounted to little more than a stopwatch and some manually repeated inputs.71 On Usenet, PC owners had been sharing scores generated by a freeware program called 3DBench. Released by a Berlin-based VR developer, the program circulated prolifically online and via magazine disks. This simple, untextured 3-D animation was seen as a reasonable proxy for the hardcore flight simulators and pilot-themed Star Wars send-ups that dominated the Usenet-based community’s interest.

Good use of 3DBench required some operationalization. Steven Gershenfeld explained that “with the faster computers (486 and up), [a 3DBench score] doesn’t seem to have much meaning in an absolute sense, but it is a good diagnostic to see if your system is set up optimally”:

You see, if you get a score that is lower than everyone else that has the same CPU (Pentium 90, for example), then you know that one of the following is true:

1. Your video card is crap.

2. Your BIOS settings are not set optimally.

3. You have a really bad motherboard (VLB/PCI combo being the worst possibility).

4. You are using an ISA card.72

Submitted scores showed that “on most Pentium 90 CPUs, the 3DBench program will say that you get 83.3 fps. If on your system, you get less, then you know one of the above is true.” So while this number could serve a useful function, its meaning was rather difficult to relate to the experience of running a state-of-the-art game. One of 3DBench’s creators revealed that the code actually inflated the final score by an arbitrary 30 percent factor, a likely placebo.73

For more real-world comparisons, Carmack had suggested favoring Doom’s timedemo results over toy programs like 3DBench. The fully capable shareware version was, after all, free. Results from semiorganized surveys gathered via email, newsgroups, and other online services appeared within a few months, along with casual postings.74 Many postings were plainly inspired by boasting, or by excitement over the speed of a newly purchased machine. And while the tone of the discourse can scarcely be called courteous, there was a relative lack of complaining directed at id Software, which was, by contrast to the Strike Commander team, revered. Doom’s graphics engine was seen as such an immodest step forward—“a high-altitude, wind-aided, Carl Lewis of a leap ahead,” according to even one of the milder reviews—that players mostly absorbed the blame for perceived deficiencies in performance, the solution to which, in general, was to upgrade their systems.75

The prospect of measuring Doom’s performance across platforms captured some quasi-academic attention as well. For development systems, id Software had used professional-grade UNIX workstations, rather than PCs. The commercial MS-DOS release was technically, therefore, a concurrent port of the in-house development version. Since the code had been written with portability in mind, it was relatively straightforward for id, with some trusted partners, to get Doom running on a small fleet of alternate operating systems and microarchitectures.76 Some of these platforms, like Silicon Graphics’ IRIX and Sun Microsystems’ Solaris, were practically unknown outside of industry and academia.

A hot game was a novel point of comparison for these high-end workstations, which otherwise differed significantly in their design and compatibility with various software. The professional community that evaluated these systems was aware that many performance measures, such as the ubiquitous Whetstone and Dhrystone tests, had been compromised by their own popularity. System vendors had discovered too many score-boosting tricks that did not necessarily improve performance in meaningful ways. In 1988, the high-performance computing industry created the Standard Performance Evaluation Corporation, or SPEC, a nonprofit, ostensibly neutral organization that curated a suite of real-world benchmarks with science and engineering usage in mind.77 It was, however, proprietary; SPEC data cost money, and users could not, in any case, obtain results from their own systems. According to one computational chemist, “I simply don’t trust results that I can’t get from a list to which random people with actual production machines can submit results. Vendors, by themselves, will cheat and do cheat.”78 For one Intel engineer, “the benchmark the world really needs is Doommarks.”79

To this end, in August 1995, Anton Ertl, a compiler specialist at the Vienna University of Technology, began soliciting submissions for Doom benchmark figures from his professional network, which he collected on his website. “Let’s face it,” he quipped, “the only thing we need fast PCs for are games :-).”80 The smiley gestured to the seemingly sordid fact that even academics, researchers, and other well-trained technicians wanted to play Doom—especially at work. The game’s vaunted deathmatch feature, whereby up to four players could engage in competitive free-for-alls, relied on an IPX network of the kind that was common in high-technology workplaces, and almost nowhere else. The Intel engineer again: “My wife bought a Pentium-processor based system because she liked running Doom so much on mine, felt guilty about hogging my machine, and wanted to try two-player mode!!”81

The Doom benchmark list began as a passing fancy for Ertl, but he went on to compile hundreds of responses over two decades, mostly on PC, though spread across DOS, Windows 95, Windows NT, OS/2, and Linux, as well as for a handful of specialty products like a $25,000 SGI Indigo2.82 For several years, the game was still seen as a good indicator of integer performance and often factored into online discussions of operating systems and microarchitectures. When Dan Hildebrand had an exciting revelation to make on behalf of QNX, a UNIX-derived industrial system, he did so, cheekily, as such: “A benchmark that has recently become increasingly significant is the Doomstone”—a pun on the aging Dhrystone metric. “In order to facilitate the use of this important benchmark during performance evaluations of QNX, we are pleased to announce that a native port of Doom to QNX is now available.”83

Cyrix, Quake, and the Floating-Point Unit

Because of the program’s complexity, it was infeasible to “cheat” a Doom timedemo. Circuit designers could optimize for it, but doing so would also improve the performance of general integer-based calculations, and thus benefit many other types of application as well. In a bid to undercut Intel with budget-conscious, upgrade-friendly microprocessors, Dallas-based Cyrix Corporation adopted a design strategy that relied on heavily optimized integer arithmetic. Cyrix had first gained favor in 1992 with a popular conversion chip: a 486-like microprocessor that fit into the same socket as the older 386, thereby saving the cost of a new mainboard, if not an entirely new machine. In 1995, the feat was repeated by the Cyrix 5x86, which could similarly replace an existing 486 chip with a new one offering Pentium-class performance in some cases.84 At Doom, it screamed: Ertl’s data show the 5x86-120 (suggested retail: $160) keeping pace with Intel parts that sold for more than twice as much money.

Manufacturing problems delayed Cyrix’s 6x86 line of microprocessors until 1996. Rather than a stopgap, the new chip was positioned directly against the name-brand Pentium, with an aggressive value pitch.85 Conventional testing largely validated this, with the seemingly minor caveat that “computer-aided design and some 3-D applications, however, will fall short of Pentium performance … a weakness shared by all the Pentium alternatives.”86 Intel’s pricing could not be challenged without sacrifices, and Cyrix had reasonably chosen to economize on the floating-point unit, or FPU, the circuit that handled arithmetic on numerical approximations of real values. Consumer software did not much need it, after all. Not even Doom.87

Indeed, for most PC players, floating-point performance was an esoteric topic and a cause of much confusion. “I’m on the verge of buying a new motherboard/CPU combination,” queried Willis Yonker: “My current dilemma is that I don’t know how important FP operations are to me. Mainly I use my computer for light business apps (Excel, MS Word, etc.) and games. It’s the games I’m really worried about. I like sims like MechWarrior and Wing Commander. Do such games use a lot of FP operations? If they do, I’ll guess I’ll have to fork over the extra $100 for the Intel.”88 The answers given in response to questions about floating-point units were often contradictory or wrong, but their equivocation spoke to an undeniable anxiety among PC owners: concepts that were hazy now could always turn out to be important later. And so it went with the Cyrix 6x86 microprocessor when, in late February 1996, id Software launched QTest, a stripped-down version of the unreleased Quake, intended to test the game’s TCP/IP networking code.

In development since early 1994, Quake radically exceeded the earlier Doom engine, with a true 3-D perspective that allowed for vastly more complex environments. With improved tools, id’s now-experienced designers created a beguiling, albeit weird, pastiche of sci-fi, fantasy, and cosmic-horror elements. Quake had, in fact, been something of a rudderless project, riven by interpersonal turmoil, and less than inspired in its retreading of the first-person shooter formula.89 Compared to Doom, however, Quake’s peripatetic spaces rewarded momentum over rhythmic shots and dodges, showcasing the power of the latest PC hardware. Players eager to deduce from QTest how the final game would run on their systems discovered the built-in “timerefresh” command, which could substitute for a proper timedemo.90

Amid the obligatory trash talk (“Pentiums suck 486s suck Cyrix 6x86 P120+ rules!”), some argued the existence of a genuine anomaly with the Cyrix chip.91 According to Vareck Bostrom, “running timerefresh as prescribed by the Quake Stomping Grounds benchmark list … I get these scores”:

6x86-100: 21 fps

6x86-120: 26 fps (overclocked, 60x2)

Pentium-133: 46 fps

The gap was huge: a factor of two—the difference between a game that ran well and one that might not have been behaving as intended. “I’ve been in some discussion with various people on the net, and the closest we can figure is that Quake must use the FPU for geometry calculations, and the 6x86 FPU is not terribly good, it seems.”92 Indeed, when Quake’s commercial release followed in June 1996, it listed an Intel Pentium processor as an official requirement.

“Why?” asked Achmed Rahhiim. “Is it possible that id purposely added instructions to Quake which would botch on processors other than Intel Pentium?”93 For a time, such speculation was rampant. Professional reviews—not to mention Cyrix’s marketing material—had shown the 6x86 besting rival products at common benchmarks like Winstone, Landmark, even Doom. But “benchmarks don’t lie,” replied David Matiskella. “It’s just that people don’t know how to interpret them.”94 None of these benchmarks had stressed the chip’s floating-point unit.

Rendering a true three-dimensional perspective generally requires the precision of floating-point numbers; with less precise arithmetic, rounding errors would result in visual artifacts. But to get Quake’s graphics running smoothly, id had brought in Michael Abrash, the well-published optimization guru, to hand-tune a small number of performance-critical subroutines. His solutions made clever use of the Pentium’s FPU pipelining, that is, its ability to do other work while waiting on the result of a lengthy floating-point calculation. In April, Abrash had given a talk at the 1996 Computer Game Developers Conference, explaining exactly how Quake had exploited the Pentium’s instruction pipeline in ways that could not be replicated on other microarchitectures.95 “Of course they will heavily optimize their routines for the most likely system to run the game,” Kenneth Hwang cautioned on Usenet. “Let’s not get carried away with inane talk of sabotage and conspiracy.”96
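The payoff of the overlap Abrash described can be illustrated with a toy cycle-count model. The figures below are assumptions chosen for illustration (a long-latency floating-point divide of roughly the magnitude of the Pentium’s FDIV, and a block of independent integer work per span of pixels), not datasheet timings; the point is only the structural difference between stalling on the divide and working in its shadow:

```python
# Toy latency model of overlapping a long floating-point divide with
# independent integer work, in the spirit of Quake's hand-tuned inner
# loop. The cycle counts are illustrative assumptions, not measured
# Pentium timings.

FDIV_LATENCY = 39   # assumed cycles for one floating-point divide
INT_WORK = 32       # assumed cycles of integer work per pixel span

def naive_cycles(spans: int) -> int:
    # Stall until the divide retires, then do the integer work.
    return spans * (FDIV_LATENCY + INT_WORK)

def overlapped_cycles(spans: int) -> int:
    # Issue the divide, do the integer work while the FPU churns;
    # each span costs only the longer of the two latencies.
    return spans * max(FDIV_LATENCY, INT_WORK)

print(naive_cycles(1000))       # 71000 cycles
print(overlapped_cycles(1000))  # 39000 cycles
```

Under these assumed numbers the overlapped schedule nearly halves the cost, which is consistent in spirit with the roughly two-to-one gap players measured between the pipelined Pentium FPU and the Cyrix 6x86, whose FPU could not hide its latency this way.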

But the consumer experience is ultimately an emotional one, and bitterness lingered among those who felt cheated by their choice of microprocessor. “ID YOU SUCK!!!!” proclaimed someone posting under the name of Beavis. “I keep hearing others spam about Quake being Pentium optimized,” which they took as an excuse for “programmers [who] write crappy loose code.”97 Beavis, proving that some wounds never heal, wished that id would “fall on their face with this one much like Origin did with Strike Commander”—more than three years prior. By and large, however, it was Cyrix’s reputation that suffered, not id’s. “How many more messages do we got to read that just flames Cyrix and the 6x86?? I’m tired of it. I and many other people have learned that the 6x86 is a flawed CPU and Cyrix rushed it just to compete with the Pentium and we have seen all the shit quality they did by doing this.”98 Allan Cole likewise believed that “saying that Quake is poorly done is [emphasis in original] an ill-informed opinion … Some Cyrix users need to face-up to the fact that the Cyrix 686 is weaker in FPU performance than the Pentium and there is no excuse for it other than it is so. It is not a flaw in a game or application for using the FPU, and this is a trade off Cyrix 686 owners make.”99 Critically, for Cole, as for many others, “Quake is where it is at for multiplayer gaming, and I would never consider a Cyrix because of it.”

The point of multiplayer competition bears underscoring. Where Doom had been bound mostly to local networks, Quake could be played across the exponentially growing internet. The locus of gaming discourse, moreover, had shifted from a fragmentary constellation of newsgroups, BBSes, and online services to the World Wide Web, where popular hosts like Planet Quake amplified a handful of celebrity “Quakers.”100 For technical insight, one might consult DaKoTa’s Quake Newbie Guide, Zanshin’s GLQuake Dojo, or any number of gutter-mouthed clan pages, instead of writing to a magazine.101 Although crowdsourced FAQ sheets had been passed around pre-web venues for years, this new, more personified genre offered the chance to emulate a role model, while also heightening the sense of a system-performance arms race.

Cyrix’s ironically Doom-optimized products had clearly fallen behind. Robert Osorio, a.k.a. the Flying Penguin, considered “the minimum frame-rate for Quake solo play to be 20 fps, and the minimum for netplay to be 25 fps,” as of December 1997. These numbers are astounding compared to what was considered acceptable at the time of the game’s release. “This is only a minimum, however. Netplay, in particular, improves dramatically with a faster frame-rate, and for serious competitive netplay you need at least 35 fps.”102 The path to achieving such performance was rather narrow, with players steering one another toward fast Pentiums. The 6x86MX, Cyrix’s answer to the Pentium II, likewise found its weak points emphasized by Thomas Pabst, the German physician-turned-editor of Tom’s Hardware, an ultraenthusiast web outlet that pioneered the exhausting 3-D benchmark roundup as a review style.103 For AnandTech, and the similarly pathbreaking Anand Lal Shimpi—who was, at the time, fourteen years old—the terms were even starker. “Ugh!!!” concluded his review. “That’s all I need to say about the 6x86MX’s Quake performance.”104

* * *

It is implausible, as industry legend has it, that Quake single-handedly put Cyrix out of business.105 Fellow Intel rival AMD managed to survive the weak showing of its K6 family of microprocessors on gaming benchmark sites. But Quake would be the last major PC release to catch hardware vendors completely at a loss; by the late nineties, the financial stakes simply could not be overlooked. Chipmakers would thenceforth become active participants in the game-development process, partnering with key studios to ensure the latest titles did not embarrass their products—or, even better, outshine their competition’s. With 1997’s Quake II, for instance, AMD was able to mitigate its FPU liability—at least in marketing terms—by developing its own patch, under license from id Software.106

As for players, it was a time for learning to cope with new frustrations. The tedious boot disks, recondite memory management, and crude pin jumpers of the MS-DOS era had mostly vanished. In their place came a parade of software packages clamoring to be kept up to date and out of conflict: game patches, display drivers, chipset drivers, runtime redistributables, service packs, and all the delights of Microsoft Windows.107 Taking the slack from these industrial misfires, however, was always the ability to launch a game, look up benchmarks online, and ask, “Is my system performing as it should?”


On a Wednesday morning in October 1996, Jon Nisbett took delivery of a package he had had shipped overnight. It contained an Intergraph Reactor, an early 3-D accelerator card equipped with the Rendition Vérité V1000 chip. After finishing breakfast, he immediately set to trying it out on new games like IndyCar Racing II (Papyrus Design Group, 1995). “WOW!!!!” he shouted. “That’s the only way I can describe this to you guys who haven’t seen the Rendition-enhanced IndyCar2 in action.” As he went on to prove, however, there were, in fact, many ways to describe it: “Now, I’ve played the original IndyCar2, and I thought it was rather plain and unattractive. Having said that, the Rendition-enhanced version is the most beautiful game I have EVER seen! It even blew me away more than Quake! In SVGA, with all textures on and everything else set in order to slow the game down the most, the framerate counter consistently stayed above 30 fps! On my P90!!! It’s SO smooth and so beautiful … It almost brings a tear to my eye that my puny P90 is capable of such things.”108 The online forums of old are filled with these little exclamations. Although this paper has focused on the more inflammatory points of friction—as did the discourse itself—PC enthusiasts were not without their sense of the sublime. It was a sublimity of sound and images, for certain, but also a satisfaction at the abstract virtuosity of a well-ordered machine.109

The first section of this article suggested an ethnographic analogy between this form of enthusiasm and hot rodding in the automotive case. In both communities, members do not merely fix their possessions when they break but adopt new terms of operation that enlarge, reinforce, and even subvert the designs of the manufacturer. The analogy became strained, however, especially in the second section, by its assimilation into a bracingly extravagant mode of consumption, which leveraged enthusiasm to push cutting-edge products that sometimes cut very deeply indeed. What emerged in the third section was the benchmark as a moral center for PC gaming’s obtuse power network—its intersubjectivity of faults and credits. By and large, good behavior is good frame rates for players, vendors, and game makers alike.

Beyond good behavior, the intersubjectivity of frame rate was key to this community’s evolution of good taste. Doom, in itself, did not transform the PC from a varsity hobby into a bloody riot. The transformation lay also in the way Doom activated the values of owning a PC, a tricky, recalcitrant device, which could nonetheless reward the owner’s investment with benchmarkable feats of computation. “Taste communities coalesce around practices like that,” notes Steven Shapin, “practices that refer to mutually accessible external properties as the causes of internal states.”110 Within a matter of months, even the 30-fps figure that so excited Nisbett in 1996 would look entirely pedestrian compared to the triple-digit results pumped through high-end graphics chips. “I have hardly ever tested any computer hardware product that impressed me as much as this [Voodoo2-powered] graphics board,” wrote Thomas Pabst, of Tom’s Hardware, in February 1998. “I can even go as far as saying that everyone who wants to play 3D games and doesn’t get a Voodoo2 board has to be considered as seriously misled.”111

A taste for high performance remains one of the PC’s marks of distinction to this day. It is a part of its gamer habitus—its “feel for the game”—and a weapon of symbolic violence.112 Everyone knows that a PC game should run at 60 fps, including those who otherwise reject the elitism of the hot-rodding hardcore. Even so, neither game publishers nor hardware vendors seem any more averse to debacles today than they were thirty years ago, a stubbornness that suggests great stability in the equilibrium between player enthusiasm and executive mismanagement. Fine taste can be flaunted to get a game patched or a product replaced, but it also enables a cycle that burdens the best-intentioned, and the least praised, among us. This cycle has since grown to aggrieve other video game systems, while on the PC, it is still a barrier to new players and a source of frustration for even the most committed ones.113

All this would seem a grave price for a splendid moment, the strange pleasure of obstreperous systems coming fitfully together in our hands. And yet, whole communities gather in admiration of machine performance—strive for it, afflict themselves and others over it, and engage with forces very much stronger than they are in shaping, exploiting, and defending it. We may lament this complicity in rank consumerism, but at the same time, the urge to care for our possessions—to remake these alienating technological products as best we can—does not always spring from goodness, necessity, or the rational optimum. Whether for cars, stereos, or computer games, the connoisseurship of high performance reaches beyond purpose, content, and even cultural politics and into what we might dare to call a genuine love for the machine.


1. ^ Graphics are one of multiple interdependent subsystems that contribute to a game’s sense of interactivity, the perception of which is ultimately many-factored and psychological; see Lori Landay, “Interactivity,” in The Routledge Companion to Video Game Studies, ed. Mark J. P. Wolf and Bernard Perron (London: Routledge, 2014), 173–84. Nevertheless, simple cybernetic models are central to how designers and many players understand games to work. See, for example, Steve Swink, Game Feel: A Game Designer’s Guide to Virtual Sensation (San Francisco: Morgan Kaufmann, 2009), chap. 3.

2. ^ Marketing claims to the contrary, cognitive science is equivocal on the effects of high frame rates. Displays faster than 60 Hz can likely be perceived, but the brain’s ability to process visual information is probably much slower. Alex Wiltshire, “How Many Frames Per Second Can the Human Eye Really See?,” PC Gamer, May 6, 2022,

3. ^ While hardly a form of activism, these backlashes do function as a sort of market device, intervening in the valuation of a commodity to force corrections in producer behavior. Sophie Dubuisson-Quellier, “From Moral Concerns to Market Values: How Political Consumerism Shapes Markets,” in The Oxford Handbook of Political Consumerism, ed. Magnus Boström, Michele Micheletti, and Peter Oosterveer (Oxford: Oxford University Press, 2019), 813–32.

4. ^ Stephen Graham and Nigel Thrift, “Out of Order: Understanding Repair and Maintenance,” Theory, Culture & Society 24, no. 3 (May 2007): 1–25,

5. ^ Jérôme Denis and David Pontille, “Why Do Maintenance and Repair Matter?,” in The Routledge Companion to Actor-Network Theory, ed. Anders Blok, Ignacio Farías, and Celia Roberts (London: Routledge, 2019), 285.

6. ^ For a mostly readable discussion on computer performance, see David A. Patterson and John L. Hennessy, Computer Organization and Design: The Hardware/Software Interface (San Mateo, CA: Morgan Kaufmann, 1994), chap. 2. Subsequent editions divide up this material differently.

7. ^ Nathan Wainstein has argued that glitches resist critical appreciation because they lack any apparent intentionality and thus strike us as moments of naked failure. Wainstein, “Bugs and Features: On Video Game Glitches and Interpretation,” Los Angeles Review of Books, March 27, 2021,

8. ^ David N. Lucsko, The Business of Speed: The Hot Rod Industry in America, 1910–1990 (Baltimore, MD: Johns Hopkins University Press, 2008), 5. Other literature on this topic includes Robert C. Post, High Performance: The Culture and Technology of Drag Racing, 1950–2000 (Baltimore, MD: Johns Hopkins University Press, 2001); and H. F. Moorhouse, Driving Ambitions: A Social Analysis of American Hot Rod Enthusiasm (Manchester, UK: Manchester University Press, 1991).

9. ^ Lucsko, The Business of Speed, 42.

10. ^ For further elaboration, see also Lucsko’s subsequent Junkyards, Gearheads, and Rust: Salvaging the Automotive Past (Baltimore, MD: Johns Hopkins University Press, 2016).

11. ^ Lucsko, The Business of Speed, 23.

12. ^ Indeed, as mobile devices have subsumed many of the PC’s traditional roles, the enthusiast segment has grown to claim much of the remaining retail market. For example, see Dylan Martin, “As PC Decline Continues, Systems Builders Find Bright Spot in Growing Enthusiast Market,” CRN Magazine, April 12, 2018,

13. ^ Ben “Yahtzee” Croshaw, “The Glorious PC Gaming Master Race,” Escapist, May 28, 2013, The term now serves, for instance, as the name of the community’s home on Reddit: Tyler Wilde, “Let’s Stop Calling Ourselves the ‘PC Master Race,’” PC Gamer, September 26, 2021,

14. ^ Darold Higa, “Walled Gardens Versus the Wild West,” Computer 41, no. 10 (October 2008): 102–5. This paper will use PC to refer specifically to the technical lineage of the IBM Personal Computer, thus excluding other families of personal computer, such as the Commodore Amiga or Apple Macintosh, but compare: James Sumner, “What Makes a PC? Thoughts on Computing Platforms, Standards, and Compatibility,” IEEE Annals of the History of Computing 29, no. 2 (April–June 2007): 87–88,

15. ^ For a discussion of related business strategy, see Thomas R. Eisenmann, Geoffrey Parker, and Marshall Van Alstyne, “Opening Platforms: How, When and Why?,” in Platforms, Markets, and Innovation, ed. Annabelle Gawer (Cheltenham, UK: Edward Elgar, 2009), 131–63. Dominic Arsenault, Super Power, Spoony Bards, and Silverware: The Super Nintendo Entertainment System (Cambridge, MA: MIT Press, 2017), critiques Nintendo’s platform strategy, which is arguably the archetype for the entire industry.

16. ^ Jason Dedrick and Kenneth L. Kraemer, “Market Making in the Personal Computer Industry,” in The Market Makers: How Retailers Are Reshaping the Global Economy, ed. Gary G. Hamilton, Misha Petrovic, and Benjamin Senauer (Oxford: Oxford University Press, 2011), 291–310; John Hagedoorn, Elias Carayannis, and Jeffrey Alexander, “Strange Bedfellows in the Personal Computer Industry: Technology Alliances between IBM and Apple,” Research Policy 30, no. 5 (May 2001): 837–49,; and Peter Grindley, Standards, Strategy, and Policy: Cases and Stories (Oxford: Oxford University Press, 1995), esp. chap. 6.

17. ^ On microcomputer kits, the design of the IBM PC, and the rise of PC compatibles, see the relevant passages in Thomas Haigh and Paul E. Ceruzzi, A New History of Modern Computing (Cambridge, MA: MIT Press, 2021), chaps. 7–8, 10, and their associated references.

18. ^ Scott Mueller, Upgrading and Repairing PCs (Carmel, IN: Que, 1988), and its subsequent editions, offers detailed descriptions of how the IBM PC and its contemporaries were assembled, serviced, and configured. Ben Hardwidge, ed., Retrograde: The Ultimate Guide to Pre-Millennial PC Hardware (Cambridge, UK: Raspberry Pi Foundation, 2022), a special publication of Custom PC magazine, contains illustrated, essay-style discussions of many components referenced in this paper, particularly microprocessors.

19. ^ Christina Erskine, “The No-Nonsense Guide to Buying a Games PC,” PC Review, no. 3 (January 1992): 22, Internet Archive,

20. ^ Erskine, “The No-Nonsense Guide to Buying a Games PC,” 24.

21. ^ Yorkie [pseud.] ([email protected]), “PCs are awful (sometimes),” Usenet:, April 14, 1993, 13:13:03 UTC. As of this writing, all Usenet sources in this paper can be retrieved by searching the Google Groups archive for the specified newsgroup. Over time, however, Google has proven itself to be an unreliable steward of Usenet data, and more future-minded backups, such as those by the Internet Archive, are currently very far from comprehensive. One tranche containing many of the sources cited here can be found at, albeit in a raw form unusable except with special tools.

22. ^ Tim Victor, “DOS for Gamers: Memory Management Made Easy,” PC Gamer 2, no. 1 (January 1995): 64–70, illustrates how system memory had to be configured under MS-DOS. This abstruse necessity persisted due to PCs’ long legacy of backward compatibility; see Ross Nelson, “The IBM PC Programming Architecture,” in Extending DOS: Programming MS-DOS for the 1990s, ed. Ray Duncan (Reading, MA: Addison–Wesley, 1990), 1–30.

23. ^ Compare Jimmy Maher, The Future Was Here: The Commodore Amiga (Cambridge, MA: MIT Press, 2012), esp. chap. 9. It bears remarking that the IBM PC arrived four years after the first wave of home computers, including Commodore’s earlier PET, and did not begin to outsell its less expensive rivals until the late 1980s. Jeremy Reimer, “Total Share: 30 Years of Personal Computer Market Share Figures,” Ars Technica, December 15, 2005,

24. ^ Mike Weksler and Joe McGee, “CGW Sound Card Survey,” Computer Gaming World, no. 111 (October 1993): 76, Internet Archive, Computer_Gaming_World_Issue_111/.

25. ^ Weksler and McGee, “CGW Sound Card Survey,” 83.

26. ^ Charles Brannon, “Making the Upgrade: How to Boost Your Game System,” PC Gamer 2, no. 7 (July 1995): 66, Internet Archive,

27. ^ It has been proposed by numerous commentators, from Karl Marx to modern marketing theory, that consumer goods possess powerful symbolic meanings that must in some way be reconciled with the owner’s values and social identity. Roberta Sassatelli, Consumer Culture: History, Theory, and Politics (London: Sage, 2007), chaps. 3, 5. Of particular relevance is Grant McCracken’s theory of grooming rituals, in which acts such as maintenance are performed as a means of taking emotional possession of the object. McCracken, Culture and Consumption: New Approaches to the Symbolic Character of Consumer Goods and Activities (Bloomington: Indiana University Press, 1988), 83–88.

28. ^ While a PC game box usually came printed or stickered with official system requirements, these listings were nonstandard, confusing, and notoriously untrustworthy, e.g., LGR, “System Requirements Stickers,” May 3, 2013, YouTube video, 19:28,

29. ^ Some esoteric uses aside, the 8- and 16-bit generations of home consoles ordinarily did not have enough video memory to buffer a full frame. Instead, most games relied on sprites that refreshed during the screen’s blanking periods, when the beam shut off to reposition. Nathan Altice, I Am Error: The Nintendo Family Computer (Cambridge, MA: MIT Press, 2015), 27–51. As such, these games effectively ran locked at 50 or 60 Hz, depending on the territory.

30. ^ The curious may consult the preliminary chapters of Richard F. Ferraro, Programmer’s Guide to the EGA, VGA, and Super VGA Graphics Cards, 3rd ed. (Reading, MA: Addison–Wesley, 1994); or see Fabien Sanglard, Game Engine Black Book: Wolfenstein 3D, rev. 2.2 (self-pub., 2018), as a complete example of how PC games of this era controlled the display adapter.

31. ^ Computer-game developers did not reflect profusely on their methods, protecting them as secrets of their craft. One critic of this taciturnity was Michael Abrash, who contributed a series of game-programming articles to Dr. Dobb’s Journal for more than a decade. Much of Abrash’s writing was later collected in Graphics Programming Black Book (Albany, NY: Coriolis, 1997). André LaMothe, et al., Tricks of the Game-Programming Gurus (Indianapolis: Sams, 1994), was another well-read resource of the time; see chap. 12 esp. on timing and loops.

32. ^ The performance of these and similar titles is a frequent topic of discussion on the Very Old Games on New Systems (VOGONS) web forum, which also serves as the public face of the DOSBox emulation project.

33. ^ William John Baker, Jr. ([email protected]), “Too Damn Fast! (WC2, WC, Falcon),” Usenet:, June 8, 1992, 01:39:31 UTC.

34. ^ Norm MacNeil ([email protected]), “Re: Slowing Down Wing Commander,” Usenet:, May 21, 1992, 17:15:35 UTC.

35. ^ The author still maintains some background information on his website: David Perrell, “Mo’Slo Computer Slowdown Software,” Hearn Perrell Art Associates, last updated 2015.

36. ^ “Highlights from the 1991 Computer Game Developers’ Conference,” Computer Gaming World, no. 83 (June 1991): 45, Internet Archive,

37. ^ Everett M. Rogers, Diffusion of Innovations (Glencoe, IL: Free Press, 1962). Though developed initially to explain the uptake of seed cultivars by rural farmers, Rogers’s model was applied liberally, albeit often unattributed, in the Silicon Valley–connected business literature; see, e.g., Geoffrey A. Moore, Crossing the Chasm: Marketing and Selling Technology Products to Mainstream Customers (New York: Harper, 1991). On McKenna’s consulting activities, see Margaret O’Mara, The Code: Silicon Valley and the Remaking of America (New York: Penguin, 2019).

38. ^ On IBM’s declining fortunes in the PC market, see James W. Cortada, IBM: The Rise and Fall and Reinvention of a Global Icon (Cambridge, MA: MIT Press, 2019), chap. 14.

39. ^ Ed Foster, “Users May Be Setting Your Operating System Standard For You—at Home,” InfoWorld, May 8, 1995, 52, Google Books,

40. ^ Transcribed from Frank Savage, “Strike Commander,” interview by Fabien Sanglard, December 3, 2019, Fabien Sanglard’s Website, MP3 audio, 1:11:32,

41. ^ Michael Kloss ([email protected]), “Strike Commander ... why I’m angry,” Usenet:, April 19, 1993, 15:09:58 UTC.

42. ^ Daniel Starr ([email protected]), “Strike Commander—here's some frame rate data &c,” Usenet:, April 13, 1993, 17:16:46 UTC.

43. ^ Ross Erickson ([email protected]), “Strike Commander—I’m playing it!,” Usenet:, April 13, 1993, 22:04:42 UTC; and Richard M. Hartman (hartmann@ulogic.UUCP), “Re: SC? I think it’s worth it,” Usenet:, April 27, 1993, 21:07:15 UTC.

44. ^ Jude M. Greer ([email protected]), “Re: Strike Commander? Is it worth it?,” Usenet:, April 26, 1993, 03:28:43 UTC.

45. ^ Anthony ([email protected]), “Strike Commander and all the bitching,” Usenet:, April 26, 1993, 17:21:37 UTC.

46. ^ Jimmy Maher, “Origin Sells Out,” The Digital Antiquarian (blog), September 6, 2019,

47. ^ Dave Chaloux ([email protected]), “Strike Commander—another view,” Usenet:, April 29, 1993, 21:20:37 UTC.

48. ^ Xiangxin Shen ([email protected]), “Re: Strike Commander on 486/25,” Usenet:, April 25, 1993, 19:02:14 UTC.

49. ^ Corey Turner, “Re: Pacific Strike Speech Pak—Wow!,” Usenet:, April 4, 1994, 21:20:28 UTC.

50. ^ Tom “KC” Basham, “Engine Failure? Origin’s Pacific Strike Sputters after Takeoff. Can It Recover?,” Computer Gaming World, no. 121 (August 1994): 96, Internet Archive,

51. ^ Basham, “Engine Failure?,” 96.

52. ^ Greg Cisko, “Pacific Strike and Pacific Strike Speech Pack,” Game Bytes, no. 20 (Summer 1994), MS-DOS computer program, Internet Archive,

53. ^ David A. Masten ([email protected]), “Re: Pacific Strike—Speech Pause Solution,” Usenet:, May 16, 1994, 02:47:48 UTC.

54. ^ Dave Franta ([email protected]), “Origin apology letter here,” Usenet:, June 10, 1994, 21:18:56 UTC. For more discussion, see Bandit LOAF, post to “All about Origin’s CD-ROM editions,” Wing Commander Combat Information Center, February 2, 2019,

55. ^ Bernard Dy, “Pacific Strike: Grounded!,” PC Combat Simulations 2, no. 5 (November–December 1994): 6, Internet Archive,

56. ^ Aubrey Chen ([email protected]), “Wing Commander III: Frame Rates?,” Usenet:, October 16, 1994, 14:45:32 UTC.

57. ^ Mike C. [pseud.], letter to the editor, Game Bytes, no. 21 (Fall 1994), MS-DOS computer program, Internet Archive,

58. ^ See any number of Jimmy Maher’s writings on Origin projects for the Digital Antiquarian blog:, accessed November 15, 2022. Compare Brad King and John Borland, Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic (Emeryville, CA: McGraw–Hill, 2003), where crunch is taken as evidence of the company’s passion.

59. ^ Warren Spector, interviewed in Origin Systems, Official Guide to “Wings of Glory” (Indianapolis: Brady, 1995), 139, Internet Archive,

60. ^ The game’s performance was reviewed quite favorably by Martin E. Cirulis, “Courage and Canvas: Relive the Youth of Air Combat in Origin’s Wings of Glory,” Computer Gaming World, no. 143 (April 1995): 142–46, Internet Archive, As noted previously, the 16-bit legacy of MS-DOS created numerous difficulties for 32-bit memory addressing on 80386 and later microprocessors. This gap was eventually bridged by so-called DOS extenders, but game programmers still preferred the familiarity of 16-bit real mode well into the 1990s. M. Steven Baker and Andrew Schulman, “80386-Based Protected Mode DOS Extenders,” in Duncan, Extending DOS, 193–257.

61. ^ Sean Connery [pseud.?] (bo…, “Re: Pacific Strike works under OS/2? Or Origin still bites?” Usenet:, May 13, 1994, 16:10:24 UTC; with replies by Bill Armintrout ([email protected]), May 16, 1994, 12:34:37 UTC; and Gary Cooper ([email protected]), May 16, 1994, 18:25:25 UTC.

62. ^ “Game Commander: The Hyper Interview with Chris Roberts,” Hyper, no. 19 (June 1995): 24, Internet Archive. It is worth noting the connection to Roberts’s later reputation for outright hucksterism. Matt Perez and Nathan Vardi, “The Saga of ‘Star Citizen,’ a Video Game That Raised $300 Million—but May Never Be Ready to Play,” Forbes, May 1, 2019,

63. ^ Origin inflicted this pattern largely on itself, though it is worth noting further amplification by Electronic Arts, Origin’s corporate parent from 1992 to its closing in 2004, where the process has become industrial practice. Jason Schreier, Blood, Sweat, and Pixels: The Triumphant, Turbulent Stories Behind How Video Games Are Made (New York: Harper, 2017), chap. 6.

64. ^ Johnny L. Wilson, editorial, “The Sub-Standard in Computer Software,” Computer Gaming World, no. 113 (December 1993): 10, Internet Archive,

65. ^ Jonathan Mendoza, The Official Doom Survivor’s Strategies and Secrets (San Francisco: Sybex, 1994), 244. While Carmack’s engineering sense was not altogether revolutionary, the way he structured Doom’s code would itself reorganize video game production. Henry Lowood, “Game Engine,” in Debugging Game History: A Critical Lexicon, ed. Henry Lowood and Raiford Guins (Cambridge, MA: MIT Press, 2016), 203–9.

66. ^ Jay Wilbur ([email protected]), “Official Doom information,” Usenet:, November 5, 1992, 22:11:37 UTC.

67. ^ Following Fabien Sanglard, Game Engine Black Book: Doom, rev. 1.2 (self-pub., 2018), 248–50.

68. ^ John Carmack ([email protected]), “3DBench? Why not Doom?,” Usenet:, May 27, 1994, 11:34:07 UTC.

69. ^ Patterson and Hennessy, Computer Organization and Design, 73–74.

70. ^ Mark L. Van Name and Bill Catchings, “Understanding PC Labs’ New Benchmark Tests,” PC Magazine 12, no. 21 (December 7, 1993): 391–406, Google Books, is one of many descriptions of consumer-testing methodology.

71. ^ Paul C. Schuytema, “In Search of the Ultimate Game Machine: CGW Explores the Mystery of Hardware and Compares Two ‘GameFrames,’” Computer Gaming World, no. 113 (December 1993): 83–85, Internet Archive,

72. ^ Stephen Gershenfeld ([email protected]), “3D bench. Is it reliable???,” Usenet:, August 16, 1995.

73. ^ Michael Adrian ([email protected]), “Re: 3DBench replacement: Needs Support!,” Usenet:, April 10, 1996.

74. ^ Eike Mueller ([email protected]), “Doom benchmark list,” Usenet:, March 12, 1995, 15:06:12 UTC; and Scott Coleman ([email protected]), “Doom timedemo benchmark scores,” Usenet:, September 7, 1994, 09:56:36 UTC.

75. ^ Chris Lombardi, “They’re Going to Hell for This One! The Bad Boys at ID Software Take Their 3-D Engine to New Depths,” Computer Gaming World, no. 108 (July 1993): 104, Internet Archive,

76. ^ Sanglard, Game Engine Black Book: Doom, chap. 3. These versions are not to be confused with Doom’s commercial ports to video game consoles, such as the Atari Jaguar, which did require significant reworking.

77. ^ Kaivalya M. Dixit, “Overview of the SPEC Benchmarks,” in The Benchmark Handbook for Database and Transaction Systems, ed. Jim Gray, 2nd ed. (San Mateo, CA: Morgan Kaufmann, 1993), 489–521.

78. ^ J. D. McDonald ([email protected]), “Re: Summary: Alpha Performance,” Usenet: comp.benchmarks, June 30, 1993, 14:12:17 UTC.

79. ^ Dennis O’Connor ([email protected]), “Re: PRESS RELEASE->Power Macintosh Outperforms Fastest Pentium Systems,” Usenet: comp.benchmarks, August 1, 1994, 11:00:13 UTC.

80. ^ Anton Ertl ([email protected]), “System benchmarking with Doom. Please participate!,” Usenet: comp.benchmarks, August 11, 1995.

81. ^ Dennis O’Connor ([email protected]), “Re: PRESS RELEASE->Power Macintosh Outperforms Fastest Pentium Systems,” Usenet: comp.benchmarks, August 2, 1994, 06:12:13 UTC.

82. ^ This data was still being updated as recently as 2013. Anton Ertl, “Doom benchmark results,” Vienna University of Technology, Institute for Information Systems Engineering, Compilers and Languages Working Group, accessed November 15, 2022.

83. ^ Dan Hildebrand ([email protected]), “Doom release for QNX,” Usenet: comp.benchmarks, September 13, 1994, 09:39:51 UTC.

84. ^ Tom R. Halfhill, “New 486 Chips Deliver Inexpensive Power,” Byte 20, no. 9 (September 1995): 30, Internet Archive,

85. ^ Dick Pountain and Tom R. Halfhill, “CPU Scorecards: Cyrix,” Byte 20, no. 11 (November 1995): 182, Internet Archive,

86. ^ Michael Slater, “Beyond the Pentium,” PC Magazine 15, no. 10 (May 28, 1996): 106, Google Books,; see also Tom R. Halfhill, “The x86 Gets Faster with Age,” Byte 21, no. 11 (November 1996): 89–104, Internet Archive,

87. ^ See Paul Hsieh, “x86 6th Generation CPU Comparisons,” last updated July 10, 1999.

88. ^ Willis Yonker ([email protected]), “Re: Cyrix 686 floating point performance,” Usenet: comp.benchmarks, February 10, 1996.

89. ^ While former team members have made numerous statements over the years, none have substantially challenged the account in David Kushner, Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture (New York: Random House, 2003), chaps. 11–12.

90. ^ The first compiler of these results was Michael Verstichelen, “The Quake Frame Rate Benchmark List,” last updated December 14, 1996, Wayback Machine,

91. ^ Mats Otterskog (mg1…, “Re: id is in fear of Duke Nukem 3D? I don’t think so!,” Usenet:, March 22, 1996.

92. ^ Vareck Bostrom ([email protected]), “Re: id is in fear of Duke Nukem 3D? I don’t think so!” Usenet:, March 27, 1996.

93. ^ Achmed Rahhiim ([email protected]), “READ—The mistruth about Pentiums and fast 486s,” Usenet:, May 21, 1996.

94. ^ David Matiskella, ([email protected]), “READ—The mistruth about Pentiums and fast 486s,” Usenet:, May 25, 1996.

95. ^ The speaking notes were transcribed as Michael Abrash, “CGDC Quake Talk,” The Difference Engine, June 8, 1996. This material was later republished in Abrash, Graphics Programming Black Book, 1167–70.

96. ^ Kenneth Hwang ([email protected]), “Re: READ—The mistruth about Pentiums and fast 486s,” Usenet:, May 25, 1996.

97. ^ Beavis [pseud.?] ([email protected]), “486s suck for Quake and it’s ID’s fault,” Usenet:, March 28, 1996.

98. ^ ZeNTRoN [pseud.] ([email protected]), “Re: WARNING: DON’T BUY CYRIX. READ THIS NEWS!!!,” Usenet:, August 26, 1996. Note that the full subject line contains an immoderate 105 exclamation marks.

99. ^ Allan Cole ([email protected]), “Re: WARNING: DON’T BUY CYRIX. READ THIS NEWS!!!,” Usenet:, September 7, 1996.

100. ^ Compare Kevin Driscoll, The Modem World: A Prehistory of Social Media (New Haven, CT: Yale University Press, 2022).

101. ^ A mirror of the Quake Newbie Guide exists at Darren Tabor and Andrew Wu, “The Gamer’s Guide,” Blue’s News, last updated October 23, 1998. Many links are still retrievable via the Wayback Machine.

102. ^ Robert Osorio, “Comparison of Frame-Rates in GLQuake Using Voodoo and Voodoo2 3D Cards,” Flying Penguin’s Quake Coop, last updated January 25, 1998.

103. ^ Thomas Pabst, “The Return of the Jedi: Cyrix and IBM’s 6x86MX CPU,” Tom’s Hardware, May 30, 1997,,26.html.

104. ^ Anand Lal Shimpi, “Cyrix 6x86 MX,” AnandTech, April 8, 1997,

105. ^ As a retrospective, Adrian Potoroaca, “Cyrix: Gone but Not Forgotten,” TechSpot, November 3, 2022,

106. ^ Andrew Sanchez, “R U 4 AMD-K6-2 3D CPU?,” Boot, no. 23 (July–August 1998): 41–45, Internet Archive,

107. ^ See Stephanie Dick and Daniel Volmar, “DLL Hell: Software Dependencies, Failure, and the Maintenance of Microsoft Windows,” IEEE Annals of the History of Computing 40, no. 4 (October 2018): 28–51,

108. ^ Condor (Jon Nisbett) ([email protected]), “Intergraph Reactor (Vérité card) REVIEW!! *long* Reactor REVIEW Reactor REVIEW Reactor REVIEW,” Usenet:, October 16, 1996.

109. ^ Bart Simon, “Geek Chic: Machine Aesthetics, Digital Gaming, and the Cultural Politics of the Case Mod,” Games and Culture 2, no. 3 (July 2007): 175–93, also suggests a parallel between PC gaming and hot rodding, in addition to audiophilia, which is another connoisseurship relating to measurable quantities. Marc Perlman, “Golden Ears and Meter Readers: The Contest for Epistemic Authority in Audiophilia,” Social Studies of Science 34, no. 5 (October 2004): 783–807,

110. ^ Steven Shapin, “The Sciences of Subjectivity,” Social Studies of Science 42, no. 2 (April 2012): 178. Shapin’s programmatic on taste communities has had follow-ups, e.g., Christopher J. Phillips, “The Taste Machine: Sense, Subjectivity, and Statistics in the California Wine World,” Social Studies of Science 46, no. 3 (June 2016): 461–81,

111. ^ Thomas Pabst, “Diamond Monster 3D II—More Than a Worthy Successor,” Tom’s Hardware, February 1, 1998,,52.html. Though pathbreaking in his instrumentalist review style, Pabst in fact held a more nuanced understanding of graphical performance: “NVIDIA vs. 3Dfx—TNT vs. the Voodoos,” Tom’s Hardware, October 7, 1998,,87.html.

112. ^ I owe this phrasing to Graeme Kirkpatrick, The Formation of Gaming Culture: UK Gaming Magazines, 1981–1985 (London: Palgrave, 2015), and its application of Pierre Bourdieu, Distinction: A Social Critique of the Judgment of Taste, trans. Richard Nice (Cambridge, MA: Harvard University Press, 1984).

113. ^ The PC’s hostility has been a frequent subject of editorializing over the years, e.g., Emanuel Maiberg, “PC Gaming Is Still Way Too Hard,” Motherboard, Vice, July 9, 2016,; and Jacob Ridley, “PC Gaming Is Becoming Way Harder to Get Into, and That Sucks,” PC Gamer, December 30, 2021, Some of these hostilities are no longer exclusive to the platform; see William Antonelli, “I Wish Games Didn’t Update So Much,” Kotaku, September 19, 2019,