The ten original titles he has worked on have generated over a billion dollars in sales.
What did you do before getting involved in the game business?
I went to school at UC Berkeley in the mid 70s. I was originally going to become a biochemist, but it just got way too complicated. Essentially you're playing God's game with God's rules. The chemistry of life is a tremendous challenge, and I realized I could spend my life studying the pubic hair on a rat, and although I would be the world's foremost expert on the subject, I might still not be able to explain why it is curly.
I got interested in computers and how they could be enslaved to the megalomaniac impulses of a teenager. Actually my first experience was a one-day course at IBM in San Jose, writing FORTRAN on punch cards in high school. Punch cards were a drag because if you made a single mistake on an eighty-character line, you had to throw it out and start over again. And if you dropped your several-thousand-card deck and the cards got out of order, it was game over.
At Berkeley I started on FORTRAN programming the CDC 6400 mainframe. It was a 64 bit machine, very kick-ass for its day. It was probably about as powerful as one of today's microwave oven controllers. Around this time the Intel 4004 microprocessor hit the scene. I got really excited programming micros. It was clear that a new era of ubiquitous computing had arrived, and the days of the glass room, software priesthood were ending.
My first exposure to video games was playing the original "Space War" on an old IBM 710 machine in the basement of the physics lab. The display was an old oscilloscope and it was awesome two-player combat. We'd boot up the machine at midnight and before you knew it, it was dawn. Every now and then the old pile would crash and you'd have to go jiggle a couple of circuit boards in the CPU cabinet. A single PC board held one bit of memory.
I got very interested in compiler design as a step toward natural language understanding. My goal was to eventually create a computer program to pass the Turing Test. So when I graduated, I got a job at Hewlett Packard in compiler design. Regrettably, my first job was on a six year project to write a COBOL compiler. I lasted about three days. I didn't want to end up like my colleagues at HP, bored shitless and sitting around talking about their lawn sprinkling systems, as they toughed out another coffee break on the road to oblivion.
So how did you move toward games?
Coincidentally, the day I quit, I got a call from a little company called Atari which I had previously interviewed with at school. They were so screwed up that they took three months to call me back. Anyway programming games sounded like a gas, even if the salary was only half what HP was paying.
So I went to Atari, and after a week my boss quit, and then a week later they fired his boss, so I was pretty much on my own programming one of the first microprocessor pinball games. Atari was a wild place to work in the late 70s. Half the employees were partying stoners who did virtually nothing, and the rest were amazing geeks who worked like banshees. At the time, Jobs and Wozniak were trying to get "Breakout" working, and eight-player "Tank" was the ultimate multiplayer experience. Every couple of weeks, Nolan Bushnell and his buddy Gene Lipkin would come down to engineering and critique your game with often bizarre and thought-provoking, but totally irrelevant, suggestions. When they were gone, you'd just do what you had in mind originally, since in another two weeks' time they would forget their prior recommendations and have new and totally unrelated bizarre suggestions.
After a couple years at Atari, the pinball division went belly up, so I headed to Chicago to program pinball games for Williams. I was totally psyched to work with Steve Ritchie, perhaps the greatest pinball designer of our time; he also started at Atari. At Williams I programmed sounds using a software synthesizer consisting of a 6800 microprocessor hooked to an 8-bit DAC. I also programmed a couple of pinball games. At about this time–1978-9–the first microprocessor video games came out. "Space Invaders," "Space War," and "Asteroids" were the rage, and I could see the incredible universe now open to video game designers. This led to the "Defender" project at Williams.
What was the atmosphere like at Williams in those days?
Williams was a very small place engineering-wise. The company had nearly gone bankrupt a few years before, so the staff was very lean. There were two or three programmers, two or three hardware guys, and a handful of mechanical designers. The sales and marketing department was an old guy with a phone. It was completely sink or swim, as no one in management really knew the technology. Since there was nobody to tell us what to do, we had complete creative freedom.
There was a huge, grimy old World War II era factory with a thousand employees making everything from transformers to switches to metal stampings to circuit boards to finished games. If the latest game stunk, everyone was laid off until the next model was ready. Basically the fate of the company rested in the hands of a couple of 19-year-olds.
What did you think of the arcade games of the day, circa 1979-80?
They were totally awesome. To go from a blank screen to "Pong" to something like "Space Invaders" was mind-blowing. There were entire arcades that consisted solely of "Space Invader" games. The introduction of progressive difficulty drone AI, along with interactive sound-track tension building, whipped players into a frenzy. Many of today's games still rely on the multiple life, progressively difficult level paradigm of Space Invaders.
How did "Stargate" come about? Were there parts of "Defender" you thought you could improve?
After "Defender," Williams' first video game, came out and was such a huge hit, management decided that if one team could design a huge hit, 100 teams could design 100 hits. It was basically the monkey and typewriter approach to game design. So a huge new facility was constructed–Kedzie–with a beehive of offices. Since the edict went out to staff up quickly, an army of warm bodies were hired. Whereas I designed "Defender" in an abandoned factory in complete isolation, now I was subjected to twenty stereos playing "Disco Duck," all the while gagging on Cannabis fumes from the turned-on masses.
To escape this insanity, Larry DeMar and I formed the Vid Kidz design firm. "Stargate" was the brainchild of Larry DeMar. Soon after we left Williams, they became rather desperate for a game. Being a purist, I felt a totally original game should be designed; however, Williams needed a game in four months. DeMar proposed an enhanced "Defender" game. It couldn't be just a regular cosmetic con-job, but a really cool enhancement. We got real excited about tweaking the code and programming gobs of new and cool enemies and getting better real-time performance so more stuff could be packed on screen without blowing the silky smooth sixty frames-per-second performance. The "Stargate" warping feature was just icing on the cake.
We programmed the game on a dual 8" floppy drive, 1 MHz 6809 Motorola EXORciser development system. Since development systems were very expensive in those days–about $30,000–we worked on one system in Larry's spare bedroom. I programmed the system in the day, and he worked at night. And in four months it was done.
Where did the "Robotron" idea come from?
The basic play of "Robotron" was programmed in three days. The game was inspired by the immortal arcade robot game "Berzerk" and the game "Chase" on the Commodore PET. The prototype was a "Defender" game with a "Stargate" board and a couple of Atari 2600 joysticks screwed to the control panel. Originally, "Robotron" was going to be a passive game with no firing. You killed the robots by making them walk into the electrodes.
The grunt robotrons were the first enemy designed. Actually the electrodes were first, but they don't do anything except kill everything that touches them. The grunt AI was extremely basic: plot the shortest path to the player and seek out the player until either the player or the robot is dead.
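The grunt behavior described above can be sketched in a few lines. This is an illustrative reconstruction, not the original 6809 assembly; the grid model and function name are assumptions for the example.

```python
# Hypothetical sketch of the grunt AI: each tick, a grunt steps one unit
# along whichever axis closes the gap to the player. The grid coordinates
# and names here are illustrative, not from the original Williams code.

def grunt_step(gx, gy, px, py):
    """Return the grunt's next position: one step toward the player."""
    dx = (px > gx) - (px < gx)   # -1, 0, or +1 toward the player's x
    dy = (py > gy) - (py < gy)   # -1, 0, or +1 toward the player's y
    return gx + dx, gy + dy

# Example: a grunt at (10, 3) chasing a player at (4, 8)
print(grunt_step(10, 3, 4, 8))  # steps diagonally toward the player: (9, 4)
```

Simple as it is, hundreds of grunts running this chase rule simultaneously produce the converging-swarm pressure the game is famous for.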
It was fun for about fifteen minutes, running the robots into the electrodes. But pacifism has its limits. Gandhi, the video game, would have to wait; it was time for some killing action. We wired up the "fire" joystick and the chaos was unbelievable. Next we dialed up the Robot count on the terminal. 10 was fun. How about 20? 30, 60, 90, 120! The tension of having the world converge on you from all sides simultaneously and the incredible body count created an unparalleled adrenalin rush. Add to it the mental overload of a truly ambidextrous control, and it was insanity at its best.
The basic magic of "Robotron" is the independent movement and firing controls. I was a great fan of the game "Berzerk," and the frustration of that and all other single joystick games, was that you have to move toward an enemy in order to fire in that direction. "Berzerk" had a mode that alleviated that somewhat in that if you held the fire button down, the character would stand still and then a bullet could be fired with the joystick in any direction. So essentially in that mode the joystick fired the bullet. I just put on a separate joystick to fire bullets.
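The decoupling of movement and firing described above amounts to reading two sticks independently per frame. A minimal sketch, with all names and stick conventions assumed for illustration:

```python
# Minimal sketch of the dual-stick idea: movement and firing come from two
# independent joysticks, so the player can retreat one way while shooting
# the other. Stick values are (dx, dy) deflections; names are illustrative.

def update(player_x, player_y, move_stick, fire_stick, speed=2):
    """Apply the move stick to position; fire along the fire stick if deflected."""
    mx, my = move_stick
    new_pos = (player_x + mx * speed, player_y + my * speed)
    shot = None
    if fire_stick != (0, 0):      # any deflection fires in that direction
        shot = fire_stick         # bullet direction is independent of motion
    return new_pos, shot

# Retreat left while firing right:
print(update(100, 50, (-1, 0), (1, 0)))  # ((98, 50), (1, 0))
```

In a single-stick game, `shot` would have to be derived from `move_stick`, forcing the player to advance toward whatever he shoots at; splitting the inputs removes that constraint entirely.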
How did you choose the enemies in "Robotron"? The balance is perfect.
The philosophy of enemy design was to create a handful of AI opponents as unique as possible from one another, with unique properties of creation, motion, projectile firing, and interaction with the player. The enemies would be deployed in a wave-related fashion, with distinct themes for each wave. Some of the most interesting and deadly aspects of the enemies were bugs caused by improperly terminated boundary conditions in the algorithms. Often these bugs produced behavior far more interesting and psychotic than anything I conceived of. An interesting bug causes the enforcers to drift into corners occasionally for a deadly rain of terror.
The recipe for individual wave mixes of enemies was guided by the theme idea, featuring a certain enemy in each wave, as well as the presence of a basic core element of grunts, enforcers, and electrodes.
What was the "Robotron" hardware like? Why was it so fast?
The "Robotron" hardware was a 1 MHz 8-bit 6809 processor, with a custom image coprocessor also running at 1 MHz. The amazing thing is this slow circuit had more image processing power than PCs until the early 90s. What made it so powerful was that the image coprocessor was one of the first examples of what later became known as a bit blitter–popularized by the Amiga computer. In fact, several of the future designers of the Amiga, including RJ Mical, worked at Williams at this time [1982]. The coprocessor wrote two-dimensional objects to the bitmap with transparency, color paletting, color substitution, and other special effects.
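The core blitter operation described above–copying a 2-D object into the frame buffer while skipping transparent pixels–can be sketched in software. This is a toy model for illustration; the pixel format, names, and transparent-color convention are assumptions, not the Williams hardware's actual design.

```python
# Illustrative sketch of what a bit blitter does: copy a 2-D sprite into a
# flat bitmap, skipping a designated transparent color so the background
# shows through. All names and the 8-bit pixel model are assumptions.

TRANSPARENT = 0  # pixel value treated as "don't draw"

def blit(bitmap, width, sprite, sw, sh, x, y):
    """Copy an sw-by-sh sprite into the bitmap at (x, y) with transparency."""
    for row in range(sh):
        for col in range(sw):
            pixel = sprite[row * sw + col]
            if pixel != TRANSPARENT:
                bitmap[(y + row) * width + (x + col)] = pixel

# Example: a 2x2 sprite with one transparent pixel blitted into a 4x4 screen
screen = [0] * 16
blit(screen, 4, [5, 0, 7, 9], 2, 2, 1, 1)
print(screen)  # the transparent pixel leaves the underlying screen untouched
```

Doing this loop in dedicated hardware, rather than burning 6809 cycles on it, is what let the game move dozens of objects per frame at a rock-steady 60 Hz.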
When did you first realize that "Robotron" was going to be something special?
It was really on the third day of the project when we got the grunts and firing going. I've never seen something go together that quickly and be that fun. The game was finished after six months of crash effort.
Are you disappointed that "Blaster" has remained virtually unknown?
It was a bummer that "Blaster"'s primitive 3-D hardware system was too expensive to be widely distributed. Bottom line: the fun in the business is seeing your game get played. If nobody plays a game you busted ass on for two years, you feel like your life's work is an abandoned piece of garbage.
What happened to the Williams video game division after the crash? There was quite a dry spell without any new games.
Williams suffered greatly in the crash. A flood of cheap video games were dumped on the market, and the much heralded laser disc revolution of prerecorded track-switching games bombed. They were precursors of today's bloated, boring, non-interactive Full-Motion-Video CD-ROM titles. A huge laser-disc motorcycle simulator, "Star Rider," was a major dog, and $50 million in losses were tallied. Essentially about 90% of the staff quit or was fired. The Kedzie Ave. video facility was closed. The company reverted back to the ancient 1940s vintage California Ave. factory in Chicago making pinball machines, and although a few video flops were produced, like "Escort" and "Turkey Shoot," nothing of significance was released until 1989's "NARC."
At the time of the crash, I was in graduate school at Stanford on the West Coast. Basically, everyone thought that video games were permanently dead, the Hula-Hoop of the 80s, a disco fever flame-out scenario. Except Nintendo.
For a while, you were supposedly thinking a lot about virtual reality? Did anything ever come of that?
I have been interested in VR since my school days at Berkeley, where I studied the research by Ivan Sutherland from the 60s. I gave a course on the subject at the '84 Siggraph; this was before the term "VR" was coined.
A group of us at Midway broke off and set out to develop a VR game in 1990, however we eventually became disenchanted with the poor resolution, brightness, color saturation, and update rate of the current generation of headset displays. After a period of excitement and hype, you were left with a headache. Binocular stereo displays can also be fatiguing to the eyes, because you have the depth parallax without the proper focusing depth of field. What we really need are real-time electronically controlled holographic displays.
We also realized that for public performance, the fear of cooties, of putting your head in something 27 million other sweaty people used, would be a deterrent. And all the early VR games were really lame, which turned a lot of people off.
Because of the problems of immersive stereo technology, we decided to focus on cockpit window style simulation which is very comfortable, non-intrusive, and totally ubiquitous–everyone has a monitor. The rest of the gaming world coincidentally came to the same decision. We created a proprietary texture mapped 3-D hardware system and used it to create the driving simulation games "Cruis'n USA" and "Cruis'n World." The texture mapping hardware system, known as the V-unit, was purchased by Nintendo for use in the Nintendo 64 project.
Was/is originality in game design important to you?
To me, if you have nothing new and cool to bring to the table, then there is no sense in designing a game. Regrettably, about 80% of the video game business involves clone products and cheesy licensed titles. These are the too-numerous-to-mention titles that no one remembers once the ad budget runs out. Life is too short to waste on me-too efforts. If you are just doing it for the money, and you can't even get yourself psyched about your project, then it's time to move on to something fresh. Why waste irreplaceable time in life just making money, when the alternative is having some fun exploring the unknown? Money can be made later, but time is lost forever.
Excitement can be a new character, a new game genre, a new camera angle, a novel playfield, a better display engine: color vs. black & white, texture mapping vs. flat shaded polys, 3-D vs. 2-D, better frame rate, awesome AI, etc. The huge megahits never are strictly a graphics upgrade, but involve fundamental gameplay paradigm shifts: "Mario," "Doom," "Space Invaders," "Tetris," "Pac Man," etc. The latter three involved no real improvement in graphics, they were just awesome games.
What do you think of the modernized versions of your games: "Defender 2000" and "Robotron X"?
It's a very natural tendency for gamers to recall the rush they had playing the classics years ago, and then try to think up ways to freshen these concepts to bring their excitement into the current scene. The retrogaming gene is found in all of us. This task often proves difficult if not impossible, because the magic of the games themselves was often due to the limitations they imposed upon the player. This is a common mistake all gamers make.
We want total freedom in our fantasy game world to do anything, but we forget that the essence of a game is the limitations of its world. The game of chess is a classic example of a severely limited playfield, but with mind-blowing intrigue. We face the freedom of the real world every day, but find it more interesting to play on our computer screens, because it holds the promise of a less infinite world that can be conquered.
It is especially difficult to translate the 2-D classics into 3-D. The 2-D perspective, while not as graphically exciting as 3-D, provides the player with complete information of his position, all current threats to his survival, all escape routes, projectile trajectories, etc. Once you move to a first person 3-D perspective you get the "I am there" adrenalin rush, but there is much less information and more ambiguity. The 3-D game requires the player to memorize the playfield since he can't see it very well. Enemy locations can only be guessed at, and death is often a surprise shot in the back. Because of these problems, most 3-D games in the "Doom"/"Quake" genre involve only a handful of active player threats at any time. It becomes very difficult in 3-D to duplicate the hundred-plus threats of "Robotron" and have the player survive for more than a few milliseconds. Also, today's 3-D systems are limited to displaying at most a few dozen rendered objects, so very crowded scenes can bog down easily.
"Robotron" has always been frustrating in non-arcade versions because of the lack of the dual joystick control. Because of the intensity of play the game is very athletic, and it is very nice to have a 300 pound arcade cabinet stabilizing your joysticks. Without true dual fixed joysticks, the game can be quite frustrating in console and PC versions.
Do you see retrogaming as just a fad or do you think it will in some way have an effect on the types of games being written?
For me the retrogaming movement is more than just nostalgia of misty-eyed Gen X'ers. It's a reaction to the current graphical overkill, the simulation-obsessed gaming environment of the late 90s. In our quest for absolute graphical realism, we have forgotten the basics of gaming. Look at "Virtua Fighter 3" vs. "Virtua Fighter 2." Unless you are a proctologist, you can't find a dime's worth of difference in the gameplay. It is clear that the design team focused on the beautiful water effects, facial expressions, awesome backdrops, and 400 polygon, fully rendered loin-cloth animations. Have we as game designers become mere interior decorators, spending months on the reflection mapping of candlelight, or loin-cloth motion capture? Have we forgotten the essence of gaming, which is to present the player with novel and original challenges? Once you've seen the interior decoration, there's no need to come back. You need a game in there.
Williams kept writing games in 100% assembly language, even through the "Smash TV" period and beyond, and even though the hardware was much more powerful. Any insight into why this was?
In fact, most Williams and Midway games today are written in assembly, including the "Mortal Kombat" and "Cruis'n" series of games.
It's really a matter of philosophy. What is the objective of writing a game? I write a video game for the sole purpose of providing the player with the most awesome experience possible. The best interaction with the highest possible frame rate and quality. Regrettably, high level languages require a compromise in efficiency on the order of 20-50% versus a highly skilled and motivated assembly programmer. It is true that assembly code is more difficult to write, but am I designing a game for my convenience, or to deliver the best possible experience to the player? It is no coincidence that programmers who design with their own convenience as their top priority often end up with crappy games.
It could very well be true that the assembly language era is coming to a close, and that the advent of more efficient processors and compilers will render it obsolete. However, I must admit I was thinking the same thing in 1980.
What's your take on the network game craze?
The network craze is about gaming with human opponents. Human opponents were what the first video games were about–e.g. "Pong"–since there was no AI in that era. The microprocessor revolution created a golden age of AI driven computer opponents, but even the best AI can eventually get stale, boring and predictable. The Artificial Intelligence in many games is often little more than Artificial Stupidity. And then there's the biggest problem with a computer opponent. They have no ego. We not only want to beat an opponent, but to inflict real psychological damage. We fantasize that a particularly crushing defeat will condemn an arch rival to years of therapy to overcome the trauma. A computer opponent is an egoless wimp that can be done away with in a flick of the power switch.
Networking has brought accessibility to a whole universe of human opponents, creating an almost unlimited challenge and interactions to gamers. Networking also adds the human element of socialization to the gaming experience. We are clearly at the threshold of a new frontier here.
What do you think is the most overrated game of the last ten years?
I'd have to say it's a tie between "Myst" and "Wing Commander."
Re: "Myst"

I'm sorry, but I just don't get it. Non-interactive graphical wallpaper can be very beautiful, but after a few seconds I get bored. I want action. Contrived challenges like puzzles make me feel very puny and stupid. I don't play games to be anally retentive; I want conflict and action.

Re: "Wing Commander"

Mind-numbing sequences of pre-rendered storyline which have nothing to do with the game action are bogus. First you watch a bad movie, then you play a game that's even worse. You should play the story, not watch it! It's obvious that the FMV sequences had ten times the budget that the game did. Why didn't they put the money into the interactive game design? The whole thing reeks of a cheap bait and switch.