By Michael Ahlf
Video games have had a remarkable evolution. The first computer games were little more than glorified "choose your own adventure" books, using text to draw the reader into the storyline. Classics like the famous Zork series, Star Trek simulations, and other number-crunching or choice-driven games led in popularity, though their audience was originally limited (mostly) to academics with access to huge mainframes. Games went on, however, and "computers" as such improved. Outside the labs, the one thing everyone could go after was the pinball table (admittedly not the high-tech versions we see today) and the earliest arcade machines, which again were little more than analog playthings. Life went on, and things improved: take a look at how things have progressed, and how the classics have been built.
The '70s were a time of simple yet fun gameplay. 1972 introduced us to PONG, a game that nobody who plays games should need to ask about. Pinball table makers began producing their own clones of it, most of which never caught on. Atari moved on, producing wonderful ideas such as Asteroids, Breakout, and more. Midway also had its hands in the mix, kicking out Galaxian to try to topple Space Invaders. Home game consoles appeared in the later '70s as well, with Home Pong kicking off the action in 1975. The '80s, however, saw a big explosion in video gaming thanks to home machines. They also gave us increasing (for the times) technological marvels such as BattleZone (now seen in TWO Activision follow-up remakes), the Donkey Kong series, and, in the latter half, home computers themselves. I admit this only scratches the surface of what was seen; video games were everywhere, and it seemed every company, right down to Quaker Oats, wanted in on some of the action.
At the height of their original reign, home video game machines were numerous, and all sported more or less (unfortunately) the same library, since game makers mostly refused outright to give up any part of the market just because different machines were available. Consoles from Atari, Coleco, Sears, and others all shared the shelves and the games, with admittedly some differences between ports from machine to machine. Later machines gave more faithful arcade ports, but the basic premise was the same -- get your game into the arcades, and kids will play it there. If the kid can BUY your game, they'll do that too, and have fun in both places. It was in this era that the first battles over emulation really came to pass: when various companies made adapters so their machines could play games made for other systems, fights broke out. But the games continued, up to a point. Though prices were high, up until 1984 every company that could was making video games. Even companies like Kool-Aid got into the act, with games based on Kool-Aid Man made for both the Intellivision and the Atari 2600. Then, disaster. In 1984, people began to realize they could own actual COMPUTERS for the same price as a console system. Systems like Coleco's ADAM, Mattel's Aquarius, Magnavox's Odyssey, and more gave people other options, and on top of that the market (thanks to overzealous game makers) was completely glutted. Systems went down the toilet as people refused to buy in again... for a while.
In 1986, things came back: the market more or less stabilized around three systems: the Atari 7800 (the big idea here being that old Atari games sitting around still worked on it, as did new games with quality near that of the original Nintendo titles), the Nintendo NES, and Sega's Master System. The Master System and the 7800 faded out, though each kept a loyal following, in the face of Nintendo's marketing lead and the fact that newer, now-classic games such as Super Mario Bros., Excitebike, and more were only available on Nintendo's system. NEC tried to get in in 1989 with the TurboGrafx-16, but had the same luck as Sega and Atari in the face of Nintendo. As things went on, the NES reigned supreme right up until the SNES came out in 1991, not because of technological superiority (NEC kept trying, and SNK's NEO-GEO certainly had the technology, being the same hardware as the arcade machines) but thanks to a library that kept growing and kept its audience with games the other systems just couldn't get. With the SNES, Sega stepped up its marketing, and though Nintendo held the market, the gaming public recognized the Genesis (which had arrived first) as an equal machine, prompting independent game makers to program for both systems. Arcade games went on as well, with "quarter suckers" such as Teenage Mutant Ninja Turtles getting players to dump in large amounts of money, all while having fun.
Home computers had grown in this time as well. We had gone from the inauspicious though widely remembered Apple systems (Apple IIe, IIgs) and other fun machines like the Trash-80, through the C64, C128, and other goodies, to IBM PS/2s and actual home computers built around the 286, 386, and later lines of processors. We got to see major classics like Wolfenstein 3D and Commander Keen, and watched many of our old favorites from the era when consoles were king get ported up. Hardware since then has gone through the roof: we've reached the point where major online gaming takes location-specific damage into account in shooting games, sports-style tournaments are feasible, and even the console systems compare their hardware to home computers in order to brag. The lines between console (game-oriented machine) and computer (ostensibly more generalized) have once again begun to blur.
So where are we going? Or, as the title of the article asks, WHERE ARE GAMES GOING? Old video games going back to the '70s still hold appeal with many gamers. It seems, however, that sometimes the game programmers -- and yes, even we reviewers -- don't give enough credence to GAMEPLAY. Games like Unreal are technological (visually, anyway) marvels when they come out. There are even neat features involving "bots", AI opponents that can use tricks and react faster than humans to give a high degree of FPS challenge. However, the industry now, as then, has its problems. Just as back then, for every Asteroids or Mario or Moon Patrol there would be a game like Kool-Aid Man showing that some ideas don't hold up well; today, for every Quake or Doom there will be a game like Tomb Raider, which just won't seem to go away. Thousands of games are made each year, and the vast majority get no attention even when they give us great gameplay. The computer equivalent of Street Fighter II or Mortal Kombat, a little game called One Must Fall 2097, is arguably just as good graphically. It also sports a multi-tiered fatality system (called Scrap/Destruction), a high number of special moves, and an AI that will REALLY give gamers a challenge, but it got almost no attention from the majority of gamers. In the place where high-gameplay, high-replay-value games once stood have come creations like Unreal Tournament, which rely on human opponents for gameplay and replayability. Gameplay has proven not to be linked to graphics -- look at the success of CIVILIZATION. Graphics alone won't make a game a success either, as can be seen from the myriad games that sport great screenshots but bore people after five minutes. With ever-improving hardware, there are three things that can happen in the market, and one which hopefully will.
First, games could get worse. Graphics might improve, but gameplay could go down the toilet. Instead of new games turning into classics, gamers might turn BACK to the classics. While I doubt there will be another huge crash like 1984's, a smaller crash could indeed happen, especially with the PC competing with the N64 competing with the Playstation 2 competing with the X-Box competing with the Dolphin competing with whoever else wants into the market, all of whom have a significant number of game makers in their corner. There's an arguable relationship at work: the more graphic goodness a game has, the more load is put on the processor and the less is left for the AI (look at the speed difference running BattleChess in 2D versus 3D on an old 286 or 386 if you don't believe me). In real-time games like Starcraft or first-person shooters, this translates into dumber AI. If too much emphasis is placed on graphic goodness, a challenging AI slows down into target practice.
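The graphics-versus-AI tradeoff above is really just frame-budget arithmetic. As an illustrative sketch (the function and numbers here are mine, not from any particular engine): every frame has a fixed time budget, and each millisecond spent rendering is a millisecond taken away from the AI.

```python
# Hypothetical frame-budget sketch: not a real engine's accounting, just
# the arithmetic behind "more rendering load means less CPU for AI".

def ai_budget_ms(target_fps: float, render_ms: float) -> float:
    """Milliseconds per frame left over for AI after rendering."""
    frame_budget = 1000.0 / target_fps  # total ms available per frame
    return max(0.0, frame_budget - render_ms)

# At 30 fps a frame lasts about 33.3 ms. Cheap 2D rendering leaves the
# AI most of the frame; heavy 3D rendering can nearly starve it.
print(ai_budget_ms(30, 5))   # roughly 28.3 ms left for AI
print(ai_budget_ms(30, 30))  # roughly 3.3 ms left for AI
```

The same arithmetic explains the BattleChess anecdote: on a 286, the 3D renderer ate so much of each frame that little time remained for the chess engine to think.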
Second, graphics might stagnate. Just as in sound (there's really only so much that can be done with a small-scale two-speaker system), there comes a point where increased visual realism won't make much difference. 64k colors arguably come close already -- there's no real point going to 128-bit color rendering when the human eye can't tell the difference. And when monitors refresh at only 100 or 120 Hz, a video card putting out 200 frames per second is meaningless. If graphics stagnate, however, this leaves more room for AI. Computer opponents might get smarter and harder, might eventually become as hard to predict as real humans, while graphics stay much the same. To a certain extent this already happens on console systems: evolution there is not a function of better hardware, but of programmers getting smarter with what they have to work with. New systems open the boundaries somewhat, but eventually things start looking the same, as is currently happening on Sony's Playstation right now. If console systems start being where most PC games come from (porting), especially since Dreamcast and X-Box games are ALREADY designed to be ridiculously easy to port to PC, it could very well be that gameplay improves while graphics hold at a more constant level.
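The refresh-rate point is worth spelling out, since it's a hard ceiling, not an opinion. A minimal sketch (the function name is mine, purely for illustration): the monitor can only display as many frames per second as it refreshes, so anything rendered beyond that rate is wasted work.

```python
# Illustrative sketch: frames shown to the viewer are capped by the
# monitor's refresh rate, no matter how fast the video card renders.

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second the viewer can actually see."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(200, 120))  # a 200 fps card on a 120 Hz monitor shows 120
print(displayed_fps(60, 120))   # below the refresh rate, every frame is shown
```

In the first case, 80 of every 200 rendered frames never reach the screen; that rendering time could have gone to AI instead.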
Most likely, however, gameplay and graphics will continue to be in competition; programmers will continue to ask when their gameplay is sufficient, when the graphics need improvement, or when both are still too far behind. Various designers and programmers address these issues in their .plan files and similar writings, proving that they at least CARE where games are headed today. At some point, a new game becomes a classic -- it is acknowledged that Doom and Wolfenstein 3D have made their mark, as has a lesser-known game called Populous, which started the god-game genre. Simulator games make strides in graphics because realism has been their forte and their claim to fame; arcade-style games make their mark in gameplay, sometimes deliberately being unrealistic (space combat games using decidedly un-Newtonian physics, Mario being able to steer himself while falling) in order to further the fun. Games today are heading the way they have always gone, treading the fine line (and many times falling off into the void) of balancing graphics and gameplay to draw in players and keep their interest as long as possible. PC programmers won't, unless a corporate bigwig says to, restrict their programming just to make a game easier to port to consoles; this is keenly evident in the Quake and Duke Nukem ports to console systems. Console ports of PC games tend to lose things PC gamers like, such as save-anywhere features, and also face controller differences, though this is diminishing as PC controllers advance, including clones of the more popular console pads. More than likely, players a year from now will be calling some game from last year a "classic". The debate won't die, not because debate is a bad thing but because it's a NECESSARY thing in the community.
Great gameplay doesn't require more than an 8-way joystick and a fire button, but that doesn't mean great gameplay can't be achieved with full flight-simulator controls. Far more important than graphical superiority, technological wizardry, or even high floating-point power is what the programmers DO with them, and hopefully the programmers will keep listening and thinking about what they want to create. Good luck, guys.
All respective copyrights named or used in this article, and images, are copyright of their respective copyright owners. All opinions expressed in this article are the opinions of the writer, and may or may not express the view of Glide Underground, Inc. Please don't hurt me.