Gaming for Every Budget

With the holidays coming and the deluge of games upon us, there's a good chance you've either picked up a new console, have one on the way, or have been eyeing one for the past few months. All of the major console platforms offer some flexibility, but much of it comes at an extra price that isn't figured into the original cost of the hardware. Needless to say, it can add up pretty quickly, so we've put together a guide that should give you a general idea of what you're getting into financially, depending on what you want out of your gaming system of choice.

Microsoft Xbox 360



Minimum

If you're looking to grab an Xbox 360, one of the easiest ways to save money is to head straight for the 4GB stand-alone system. The hard drive is small, but you can always purchase a bigger drive in the future if storage space becomes an issue. As with the PlayStation 3, it's also a good idea to invest in video cables that take advantage of HD resolutions, especially because HDMI cables are usually inexpensive (avoid purchasing the official cable from Microsoft). It's also worth picking up an extra controller for local multiplayer in case an Xbox Live Gold membership is too cost-prohibitive. And while it won't allow you to play games online, a free Xbox Live Silver membership is worthwhile for connecting with Xbox 360-owning friends and downloading Xbox Live Arcade games and demos.
• Xbox 360 4GB - $199.99
• Xbox 360 Controller - $49.99
• HDMI Cable - $3
• Xbox Live Silver* - Free
Total - $252.98

Recommended

Spending some extra money on an Xbox 360 bundle that includes a bigger hard drive opens up more possibilities than the smaller 4GB unit. For starters, you have more space to install games on the system, as well as more room for demos, Xbox Live Arcade games, and other content that takes up generous amounts of memory. The Xbox 360 Kinect bundle is worth looking at because it comes with a 250GB system and the Kinect sensor for $100 more than the non-Kinect version of the package. There's also a similarly priced holiday bundle that doesn't come with Kinect, but it does include two games--Alan Wake and Forza Motorsport 3. An Xbox Live Gold membership gives you access to online multiplayer gaming, as well as additional features such as Facebook, Twitter, Last.fm, and Netflix (which requires a separate subscription).
• Xbox 360 250GB Bundle (Kinect or Holiday) - $299.99
• Xbox 360 Controller - $49.99
• HDMI Cable - $3
• Xbox Live Gold* - $59.99/year
• Netflix* - $8.99/month
Total - $421.96

Deluxe

There's really no exorbitant way to spend money beyond our recommended setup unless you're going for the special edition Halo: Reach console for $399. But if you have some extra dough, consider spending it on a Zune Pass. This not only gives you access to a streaming music service (not unlike Last.fm), but it also lets you download and keep 10 songs per month. Various types of video content can be viewed and purchased through Zune as well. Rechargeable batteries and a charging station are also a good option, because it's easy to plow through a supply of AA batteries in short order.
• Xbox 360 250GB Bundle (Kinect or Holiday) - $299.99
• Xbox 360 Controller and Play and Charge Bundle - $64.99
• HDMI Cable - $3
• Xbox Live Gold* - $59.99/year
• Netflix* - $8.99/month
• Zune Pass* - $14.99/month
Total - $451.95
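If you want to play with these numbers yourself, the arithmetic is simple enough to script. The sketch below is just an illustration using the prices quoted in this guide; it is not an official calculator, and the hypothetical first-year figure assumes you keep every subscription for twelve months.

```python
# Rough cost sketch for the "Deluxe" tier above (prices as quoted in this guide).
ONE_TIME = {
    "Xbox 360 250GB bundle (Kinect or Holiday)": 299.99,
    "Controller + Play and Charge bundle": 64.99,
    "HDMI cable": 3.00,
}
YEARLY = {"Xbox Live Gold": 59.99}
MONTHLY = {"Netflix": 8.99, "Zune Pass": 14.99}

def out_the_door():
    """Hardware, a year of Gold, and the first month of each monthly service."""
    return sum(ONE_TIME.values()) + sum(YEARLY.values()) + sum(MONTHLY.values())

def first_year():
    """Same hardware and Gold, but a full twelve months of the monthly services."""
    return sum(ONE_TIME.values()) + sum(YEARLY.values()) + 12 * sum(MONTHLY.values())

print(f"Out the door: ${out_the_door():.2f}")   # matches the $451.95 total above
print(f"First year:   ${first_year():.2f}")
```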

The Future of A.I. in Games

Artificial intelligence in games has matured significantly in the past decade. Creating effective AI systems has now become as important for game developers as creating solid gameplay and striking visuals. Studios have begun to assign dedicated programming teams to AI development from the onset of a game's design cycle, spending more time and resources on trying to build varied, capable, and consistent non-player characters (NPCs). More developers are also using advances in AI to help their games stand out in what has already become a very crowded marketplace, spawning a slowly growing discussion in the industry about redefining game genres. Think tanks and roundtables on advances in game AI have become prominent at the annual Game Developers Conference (GDC), while smaller AI-dedicated conferences such as the annual Paris Game AI Conference and developer-run online hubs such as AiGameDev.com are garnering a big industry and community following. As industry awareness of the significance of AI in games continues to grow, GameSpot asked Matthew Titelbaum from Monolith Productions, Remco Straatman from Guerrilla Games, and AiGameDev.com founder Alex J. Champandard to share their thoughts on the future and growth of game AI.

The Halo franchise is recognised as a leader in the field of game AI.

Unlocking new possibilities

While faulty AI is easily recognised, an AI system that is doing its job often goes unnoticed. No one stops halfway through a level to admire the idiosyncrasies displayed by NPCs unless they are doing something completely out of character--the more unremarkable, the better the AI system. While achieving this result is still a priority for game developers, making games with an AI system that stands out for being good is a relatively new concept: few studios want to dedicate costly man-hours to chasing innovation in a highly technical field that, for the most part, is likely to go unnoticed. However, there are some exceptions. In 2007, AiGameDev.com launched its annual game AI awards, nominated and voted on by the site's community. The purpose of the awards was to spotlight games that showed promise in the field of AI, either by trying something different or by exhibiting technical proficiency. In 2009, the Best Combat AI and the overall Best Game AI awards were won by the same studio--Guerrilla Games, for Killzone 2. Remco Straatman, lead AI programmer at Guerrilla, says a lot has changed in game AI in the last five to 10 years, with more developers trading low-level scripting for more advanced NPC decision systems.
"In general, I think game AI has gone from the stage where it was an achievement if it did not stand out negatively to the point where AI in most big games is solid, and some titles are using innovative new ideas," Straatman says. "More development teams have also moved from simple state machines to behaviour trees and using planners in NPC AI systems describing knowledge of the world around the NPCs have improved with better knowledge for navigation over changing terrain, and more knowledge about strategic properties of the world such as cover. I also think advances in animation systems with better ways to combine various animations and physics have become available, which now allows for more realistic movement and responses to being hit [in combat AI]. Most of these systems were not around 10 years ago or simply could not run on the hardware available."
Creating a solid game AI system involves successfully networking smaller systems together. For example, a system that deals with the problem-solving capabilities of individual NPCs goes hand in hand with a system that makes sense of the gameworld and its parameters and helps NPCs make relevant decisions. Thankfully, developers don't have to build these systems from scratch: they use specific planners that generate increasingly complex networks.
"At the moment [Guerrilla Games] is using a specific type of planner for our NPCs called Hierarchical Task Network (HTN)," Straatman says. "This is capable of generating more complex plans than what we had before Killzone 2. We also keep on improving things like the CPU performance, which means we can support more NPCs in Killzone 3 than we could in Killzone 2. The terrain-reasoning systems we generate have also evolved over our various titles. We are now able to deal with much more dynamic terrain (like obstacles moving around or changing shape) than ever before. Our data on where there is cover has also become more detailed, something that allows NPCs to deal with more complex environments such as multistory buildings, etc."
Killzone 2's lead AI programmer Remco Straatman believes the industry is still struggling to make NPCs as human as possible.
Back when Straatman and Guerrilla began work on Killzone and Shellshock, the team's goal was to make the AI system as capable of making its own decisions as possible, realising this would make things all the more fun for players. However, doing this in a consistent way proved to be a lot more work than the team anticipated, particularly when dealing with combat AI. While the goal of normal AI is to emulate the real-life behaviour of a particular kind of character (for example, a doctor, civilian, or shopkeeper), combat AI works very differently. Firstly, its main objective is to be as entertaining as possible. In some cases this means being efficient at killing players; in other cases, it's more about making intentional mistakes and "overacting" by way of signalling to players what is about to happen.
"Where normal AI tries to emulate an expert medical specialist or world champion chess player, game combat AI is more like emulating an actor," Straatman says. "At the end of Killzone 2 we found ourselves looking at the NPCs doing things that we did not expect, and this surprised us positively. Reviews and forum feedback confirmed we had at least partly achieved the vision we had so many years back, and people playing the game recognised and appreciated it."
One of Killzone 2's most commended features in the field of AI was the game's skirmish mode. Because this mode is more team-based and tactical than the single-player campaign, the AI bots in this part of the game need to do more than simply run around and kill one another. Guerrilla based the skirmish AI in Killzone 2 on the real-time strategy model, building two levels of AI on each individual bot. The first is a commander AI, which controls overall strategic decisions; the second is a squad AI, which translates the commander AI's orders into orders for the individual bots. The team then taught the bots how to use the in-game badges as part of the order given to them by the squad. For example, if an engineer bot is ordered to defend an area, he will first build a turret at a tactical position before starting to patrol. While some might argue that AI bots no longer play as important a role in multiplayer games--given that most gamers now play online--Straatman says bots improve gameplay and give players a chance to test out multiplayer strategies before going up against other human players.
"They give people a testing ground for real multiplayer--getting to know the maps and the game modes in a game against human players can be too much to start with."
According to Straatman, the area that needs most improvement in the game AI field is buddy AI. Because buddy AI systems often have contradictory constraints, getting this system right is often a big challenge: the buddies should be visible and close to the player but not get in his line of fire; they should stay close and respond to the player movement but not move around all the time; and so on. Buddy AI is also much closer in view to players than enemy AI, making any errors easier to spot.
"Enemy NPCs know what other NPCs of the same faction are going to do because they are all computer-controlled and can tell each other what they will do next. However, players are much harder to predict--if you would look at movement patterns of players, you will see they are quite strange at times. This is made worse by the fact that player turn rates, movement speeds, and acceleration are very high. The last point is the expectation of the player: enemies are only supposed to shoot at you, whereas buddies are supposed to fight and interact with you in a sensible way. We are working hard to make the buddies work better, because we feel that they can add a lot to the player experience when done right."
The AI director in the Left 4 Dead games is an example of how developers can use AI to reach beyond traditional individual NPC behaviour.
Straatman believes the struggle to make NPCs as human as possible is still very much at the top of the list for many AI programmers, with the future set to change the way we think about in-game interaction.
"The ideal is always to immerse the player in the game: the NPCs should feel like they are living and breathing creatures, and this illusion should not be spoiled anywhere. Within the relatively limited interaction you have in a game, it may be achievable to make the distinction very small. I think human behaviour is so interesting, and yet subtle interactions such as conversations are still out of reach of autonomous AI; games rely on clever scripting or cutscenes to get that across. If we as a field will master these types of interactions, more parts of the game can be interactive, and possibly whole new game genres may become feasible."
"I think this will make games more approachable and immersive. If we are able to maintain the immersion by having realistic behaviour in the interactive parts of the game, you will get a seamless experience from cutscenes to combat. I also think we are ready to use AI for more than just individual NPCs--the director system in Left 4 Dead is one interesting first step in that direction. We probably will see more combinations of AI systems that before were limited to one type of game: RTS games will have unit AI that will come closer to what you now see in first-person shooter games. MMOs could also start using more elaborate AI, potentially even to command hordes of NPCs. I hope we will see some brave studios try to create these new systems that are now becoming possible."

Getting the most out of multiplayer

Alex J. Champandard, the brains behind AiGameDev.com, has his finger in every game AI pie in town. Following his work as a senior AI programmer at Rockstar Games developing RAGE (Rockstar Advanced Game Engine), Champandard moved on to contract for Guerrilla Games' Killzone 2, where he developed the strategic AI for the multiplayer bots, before starting up AiGameDev.com, the online hub of the game AI community. In between running the site, Champandard continues his AI consulting work with studios like 2K Czech and Crytek, as well as co-organising the aforementioned Paris Game AI Conference.
Citing Left 4 Dead, the Killzone games, the Halo franchise, Grand Theft Auto, Assassin's Creed, and Far Cry 2 as exemplary candidates in the game AI field, Champandard says the last decade has brought a deeper understanding on the part of game developers about how to best fix age-old AI problems.
"We've realised that just borrowing techniques from traditional artificial intelligence doesn't work, and it requires a significant know-how to get something fun and believable out of those algorithms," Champandard says. "A large part of figuring this out has been to think about the impact on the player and the in-game results. Thinking about creating NPCs as 'computational behaviour' instead of 'game AI' sums it up perfectly: it's not about the intelligence under the hood, it's about what the behaviour turns out like: adaptive, credible, entertaining."
Champandard lists the Assassin's Creed franchise as an exemplary candidate in the game AI field.
This recent progress means game AI is no longer the weakest link in game development, as was the case 10 years ago. While some studios see AI as nothing more than a necessity, others are trying to innovate in the field. Champandard says the best example of this is AI Directors in sandbox games.
"The entire concept of a sandbox game is impossible without AI. The idea that you can do anything in the world and its inhabitants will react to you would not be possible without AI to power those NPCs. I think the industry has already discovered that you need AI Directors to make sandbox games really fun. Otherwise you may end up with situations that emerge out of the simulation that are just plain boring. Using AI programs that 'direct' the game helps make sure you're seeing the best of what the game has to offer, as planned by the designer. This kind of technology opens the doors to new types of games, where the story is generated as you play. The progress is slow, however, which means it may take a few years before this becomes mainstream."
Champandard believes the future of game AI lies in more solid multiplayer experiences, as AI systems slowly improve.
"I see more and more games providing bots as a way to improve the multiplayer experience. Playing online can be very unpredictable if you're not with your friends, and statistically players tend to prefer playing against bots than random people. On the technology side, AI is now helping out in all the other disciplines of game development, slowly revolutionising software engineering, and improving techniques and algorithms that are applied. It's an exciting time to be in the field."

Expanding the AI domain

Like Straatman, Monolith Productions' Matthew Titelbaum believes future AI systems will become more immersive and allow for a new kind of gaming experience. However, citing his experience working on the F.E.A.R. franchise, Titelbaum does not believe that giving NPCs more human-like behaviour is the way to achieve this. To him, it is not humanity that will advance game AI but rather more unpredictable behaviour.
"Most games take the player on a journey from point A to point B. Along that journey, the game presents the player with a series of puzzles to solve. Some may be navigational (i.e. how do I get across that gap?), some may be organisational (i.e. what order should I build things in?), but most of them rely on some sort of other characters' intent on destroying or rescuing the player. Without a series of interesting interactions with these characters, the journey can become fairly tedious," Titelbaum says. "It used to be acceptable for AI to have perfect knowledge of the environment. Now we have stimulus and sensor systems to more accurately model what an AI can conceivably know about. We’re also using planners, hierarchical state machines, and behaviour trees to map this out. The bar keeps getting raised as new concepts from the academic world steadily find their way into commercial releases."
F.E.A.R. 2: Project Origin made use of Monolith's goal-based action planner AI, which helped produce context-sensitive behaviours in the game.
Like Guerrilla, Monolith employs several subsystems for both normal and combat AI. The common link between the two is the goal-based action planner that shipped in F.E.A.R. and F.E.A.R. 2: Project Origin and helped produce the context-sensitive behaviours in those games. The first step came from the team's level designers, who annotated each level with information the AI could understand--for example, marking a table as a good spot to take cover and noting that it must be flipped over before being used. Once this marking is completed, it is up to the AI to interpret the annotations in its own context: when the AI decides to go for cover and chooses that table, the annotation tells it to flip the table over first. These systems work together to give the AI more knowledge about the world and give players the impression that NPCs see their movements and correctly judge their intentions.
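In very reduced form, the annotation workflow Titelbaum describes, designers tagging objects and the AI interpreting those tags in context, might look something like the sketch below. The data layout and action strings are invented for illustration and are not Monolith's actual system.

```python
# Simplified sketch of annotation-driven cover selection (invented data and actions).

# Designer-authored annotations: each cover spot may require a setup step first.
LEVEL_ANNOTATIONS = [
    {"object": "wooden table",    "position": (4, 2), "cover": True,  "setup": "flip over"},
    {"object": "concrete pillar", "position": (9, 7), "cover": True,  "setup": None},
    {"object": "office chair",    "position": (5, 3), "cover": False, "setup": None},
]

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def plan_take_cover(npc_position):
    """Pick the nearest annotated cover spot and emit any required setup action."""
    spots = [a for a in LEVEL_ANNOTATIONS if a["cover"]]
    if not spots:
        return ["no cover available: fall back"]
    best = min(spots, key=lambda a: distance(npc_position, a["position"]))
    actions = [f"move to {best['object']} at {best['position']}"]
    if best["setup"]:
        actions.append(f"{best['setup']} the {best['object']}")  # e.g. flip the table first
    actions.append("crouch behind cover")
    return actions

print(plan_take_cover(npc_position=(3, 3)))
```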
So if the goal of NPCs is to fool players into believing they think and act just like other players, why does Titelbaum think that striving to make NPCs correctly copy human behaviour in future AI systems is not a good idea?
"In general, I think human-like AI in traditional narrative player versus environment (PvE) games doesn't really make a lot of sense. The AI characters are there to play certain roles and be part of the puzzles along the way. They can be challenging, they can be unpredictable, they can even adapt to the player, but, above all, they have to be fun to solve. I don't see humanness and fun being directly correlated. I suspect Miyamoto-san doesn't spend a whole lot of time thinking how to make Goombas more human-like.
"That said, I think there are some game genres and styles where it may make sense to have more human-like AI. In a purely player versus player (PvP) experience, or in the real-time strategy genre where all player and non-player entities have the same set of choices, I can totally see the desire to have AI that behaves like humans. But, even then, do we want human-like AI or do we want challenging, engaging, and unpredictable AI?"
Titelbaum's answer is to expand the domain to which AI is applied in games. For example, if AI can stand in for humans playing the game, developers should be able to find ways to let AI stand in for humans making the game.
"A good portion of level designers' time on F.E.A.R. was spent annotating all the places AI could hide; it's possible, using terrain analysis techniques, that much of that nitty-gritty work can be automated. Using automation removes the need to redo hand-done work when there are major changes to the level layout. The more parts of level creation we can automate, the more time level designers have to create and polish rich experiences."
Games like Far Cry 2 push the boundaries of game AI.
Titelbaum believes that as games mature and players' objectives begin to push past the 'kill everything in sight' trajectory, developers will create AI that is able to engage in richer, more immersive behaviour that will allow NPCs to express motivation and emotion not just through dialogue, but also through interactions with the player, other NPCs, and the environment.
"I don't think it's about AI acting just like humans. Possible or impossible, I just don't think that's really the desired outcome. Should AI engineers really be working on human-like behaviours and button-mashing behaviours? Is that really what players want to play against?
"I think it's more about AI always acting in plausible ways, given the role within the game they fulfil. There can be a lot of variation in what is plausible at any given time, and this is what makes for compelling interactions. They may have the capability to do what a human could do, but they don't necessarily need to do it when and how a human would."

PC Processors

Games use the CPU for physics, artificial intelligence, sound, and calculating world information. With increasing CPU power, developers can start designing better characters with smarter artificial intelligence, as well as incorporate more complex physics into games.
Processors for the most part aren’t the bottleneck in PC game performance. You’ll find that if you already have a multi-core processor that offers good performance, upgrading to a CPU with higher clock speeds or additional processing cores won’t improve game performance as much as upgrading your video card.

Upgrading your CPU will improve game performance as well as overall system performance.
However, that doesn’t mean you can go cheap on the CPU. You still need to have a good processor that’s powerful enough to run the game. Upgrading from an Intel Pentium 4 to an AMD Phenom or an Intel Core 2 (or the newer Core i7/i5/i3) processor will certainly increase frame-rate performance. You don’t have to get the top-of-the-line $1,000 CPU if you’re upgrading your processor or configuring a new system. A $200-$300 processor will perform almost as well in gaming applications.
Game system requirements will specify baseline CPU speeds and model types.
We can’t forget that processors are useful for other applications besides gaming. Faster processors can accelerate computationally intensive tasks such as video encoding, and can improve the overall system feel by speeding up boot times and reducing how long we have to wait for applications to launch. Having multiple cores also increases system resources to improve multitasking performance.

PC Video Cards

The video card you have in your system will determine what games you can play, how good they'll look, and how well they'll run. We won't call out specific video cards to buy in this section because cards stay relevant only for a year or so before they're replaced by newer models. Instead, we'll talk about how the video card can affect the quality of your gameplay experience and give you some tips on understanding game system requirements.
You can expect to upgrade your video card every two or three years to keep up performance levels and to maintain support for all the latest graphics-quality advancements. New games tend to take advantage of the features and extra processing power introduced by the latest video cards. Older cards that aren't as powerful and have fewer features won't perform as well or, in the case of extremely old cards, won't even work with some new games. For example, many of today's games now require video cards compatible with DirectX 10 or higher.

Choose a powerful video card to get the best graphics experience.
Having a powerful video card lets you increase screen resolution and graphics-quality settings while still maintaining playable frame rates (how quickly the game updates onscreen). Faster frame rates help games run smoothly, without any choppiness. With a less powerful card, you'll often have to compromise, reducing the screen resolution and disabling some graphics effects to keep frame rates at an acceptable level.
Games offer video card guidelines in their minimum and recommended system specifications. These guidelines can specify video memory sizes, DirectX compatibility levels, and even GPU type. The video card requirements can vary widely from game to game, given that some games can be much more graphically demanding than others.
Always check the system requirements to make sure the game will work with your PC.
In general, you can expect video cards that barely meet the minimum specifications to run games at the lowest graphical settings, whereas cards that meet the recommended specifications are good for medium graphics-quality settings. You'll need a much more powerful card if you're the type of person who plays on a big widescreen display with the highest graphics settings.

ATI RADEON HD 5870 Eyefinity with 6 Monitors

Gaming Experiences


Sticker shock aside, six monitors is a doozy of an experience. We don't get nauseous when playing games, but when that much of your view is in motion you might want to keep a paper bag around. Playing Tom Clancy's HAWX was an absolute trip. You actually feel like you're looking out of the cockpit of a fighter plane, sans the g-forces. DiRT 2 also provided a fantastic experience. Playing in hood mode or even inside of the car gave us an immediate connection to the rallying world, mostly because the width of three monitors is pretty darn close to the width of a real car--we're talking over 60 inches wide here. A real Subaru Impreza WRX is roughly 68 inches wide.
Supreme Commander 2 also looked amazing, although we have to add that you need to increase the mouse sensitivity if you hope to move across the screen with any speed. And DEFCON: Everybody Dies--think of the movie WarGames brought to life--is something else on a screen this large: you might as well be in NORAD slinging nukes across the world.

First-Person Shooters


First-person shooters across six monitors simply do not work. The crosshairs on just about every game will line up perfectly with the bezel, which means you're constantly looking at a black bar and trying to aim with it. The effect is even worse when you turn on bezel compensation, which means you'll never even see your crosshair. Of course, there's nothing wrong with switching down to three monitors with games like these. The other three can still be used to run Web browsers, videos, and whatever else you want.

Desktop Usage

Gaming is only one use for the setup. Daily computing is absolutely mind-boggling and probably the chief reason you'd want to get the entire setup. Why minimize windows when you can leave everything up and running? Use two monitors for Photoshop, another one for Web browsing and IM clients, another for videos, and do whatever else you want on the other two you still have empty.

Greatest Gaming Rig

This week, it's the moment we've all been waiting for on Greatest Gaming Rig as we benchmark our system for the first time. On the motherboard, we've got all our amazing components plugged in and wired up to our two Corsair power supplies, and we've borrowed a watt-measuring multimeter from our chums at ZDNet UK to measure power consumption. For testing, we used a number of synthetic tests, including 3DMark Vantage, Cinebench, and Unigine Heaven, as well as more real-world results in games like Crysis, Batman: Arkham Asylum, and Modern Warfare 2. All tests were run at stock speeds and at a 1920x1080 resolution.
The rig was set up as follows:
2 x Intel Xeon X5680 @ 3.33GHz
3 x GeForce GTX 480 1.5GB in 3-way SLI
1 x GeForce GTX 460 for PhysX
24GB of DDR3 RAM
Corsair Force 120GB SSD
1 x Corsair AX1200 PSU
1 x Corsair AX850 PSU
First up are 3DMark Vantage's CPU tests. The first runs an AI test, which features a number of high-intensity cooperative manoeuvring and pathfinding artificial intelligence calculations. The second runs a physics test, which features smoke collision and various cloth and soft-body obstacles. The Xeon X5680 is the fastest CPU that Intel currently produces, and with its six cores and workstation credentials, we expected some fantastic results. As the graph shows, our system easily came out on top, almost doubling the score of the Core i7-980X.
Next we fired up Cinebench. It's based on Maxon's Cinema 4D, a piece of software used by production houses to create 3D content for movies. The benchmark renders a 3D scene with approximately 2,000 objects, more than 300,000 polygons, reflections, area lights, shadows, procedural shaders, and antialiasing. Cinebench is a great test for the X5680 because it's fully multi-threaded: it uses all 24 threads our two six-core, Hyper-Threaded CPUs provide, taxing them to the extreme. With all those cores, our CPUs easily bested the i7s and made an absolute mockery of the aging Core 2 Duo.
Kicking off the graphics benchmarks is Unigine's Heaven. It uses a number of graphical techniques designed to tax the most hardened of GPUs, including dynamic lighting, physics calculations, and hardware tessellation, a feature unique to DirectX 11. Scores were taken with all settings maxed out and hardware tessellation enabled. Interestingly, though the three GTX 480s easily beat the single cards, the margin isn't as large as we predicted. With some tweaking and overclocking, we should be able to eke out much better performance from them.
Synthetic benchmarks are one thing, but what's really important is how the system handles actual games. We tested Crysis, Batman: Arkham Asylum, and Modern Warfare 2 all on maximum settings, getting great results. Of note was Batman: Arkham Asylum, which didn't run all that well in the video. After we restarted the machine, though, the game ran flawlessly, and we haven't been able to replicate that slowdown since.
During our tests we monitored the power consumption of the rig; the results were not exactly eco-friendly. Power usage peaked at a massive 950 watts during the 3DMark Vantage tests and averaged around 650-850 watts when playing games. This is likely to go much higher once we start overclocking the system, meaning a higher electricity bill for us and sadness for baby seals the world over.

Call of Duty Modern Warfare 2

Even if you've been living under a rock in the desert, chances are you've heard of Call of Duty: Modern Warfare 2. The game has sold millions of copies worldwide on the PlayStation 3, Xbox 360, and PC. Like its predecessor, Call of Duty 4, Modern Warfare 2 runs well and looks spectacular. In fact, the minimum requirements have hardly budged at all--not that the bare minimum grants you a remotely enjoyable experience, but systems a few steps up from the bottom still run the game nicely. If you're looking for a better experience, we've gone through the hardware to see what nets you more frames and eye candy.
Call of Duty Modern Warfare 2 Hardware Performance Guide
We used Fraps and the opening sequence of the Act II mission, Hornet's Nest, to measure frame rates in our tests. Hornet's Nest drops you off in a heavily wooded area somewhere in Rio de Janeiro and has you run up a small path leading to a shanty town full of enemies, rampaging trucks, and the ever-present explosive barrels. The level has a nice mix of the environments you're likely to encounter in the game. We ran each test three times and then averaged the results.
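If you want to reproduce this kind of testing, Fraps can log frame-rate samples to a file for each benchmark run, and averaging three runs per configuration is a few lines of scripting. The file names and the one-value-per-line format below are assumptions for illustration; adjust the parsing to match however your own logs are laid out.

```python
# Averaging several benchmark runs of per-second FPS logs.
# Assumes each run was saved to a text/CSV file with one FPS value per line;
# tweak the parsing if your logs include a header row or extra columns.
from statistics import mean

RUN_FILES = ["hornets_nest_run1.csv",   # hypothetical file names
             "hornets_nest_run2.csv",
             "hornets_nest_run3.csv"]

def average_fps(path):
    with open(path) as f:
        samples = [float(line.split(",")[0])
                   for line in f
                   if line.strip() and not line[0].isalpha()]  # skip any header row
    return mean(samples)

if __name__ == "__main__":
    per_run = [average_fps(p) for p in RUN_FILES]
    for path, fps in zip(RUN_FILES, per_run):
        print(f"{path}: {fps:.1f} fps")
    print(f"reported result: {mean(per_run):.1f} fps (average of {len(per_run)} runs)")
```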
Game Settings
Modern Warfare 2 doesn't have too many settings you need to adjust to get the game to run well. You can leave most of them on, but two in particular will take a chomp out of your computer's performance.
Graphics
We tested Modern Warfare 2 with everything from the now-ancient GeForce 6800 up to AMD's current flagship GPU: the Radeon HD 5970. The game runs well on most video cards, but you're likely to find your biggest gains here if you want to upgrade.
CPU
You could grab an Intel Core i7-960 and be done with it, but you can get away with much less. We went through both quad-core and dual-core CPUs to see how much is enough.
Memory
Modern Warfare 2 needs 1GB of RAM to run, but we already know that that's just the starting point. We checked out what upgrading to 2GB, or even 3GB of RAM, can do for performance.

Systems

We put together a few sample systems to show how the game performed using real-world computers. Our slowest machine, the absolute minimum required to run the game--a 3.2GHz Pentium 4 paired with a GeForce 6800--struggled to churn out a barely playable experience at rock-bottom image-quality settings. Our mid-range system, outfitted with a 2.66GHz Core 2 Duo and a Radeon HD 5770, pumped out a more than playable experience with every setting on and a resolution of 1920x1200. The behemoth system, a Core i7-960 paired with the Radeon HD 5970, doubled the performance of our mid-range system and is entirely overkill for Modern Warfare 2.

System Performance

(Higher numbers indicate better performance)

640x480, Low Quality
• Minimum System - 21 fps

1920x1200, 4xAA/8xAF, High Quality
• Mid-Range System - 70 fps
• High-End System - 141 fps
System Setup:
High-End System: Intel Core i7-960, Intel DX58SO, 3GB DDR3, 750GB Seagate 7200.11 SATA hard drive, Windows 7 32-bit. Graphics card: Radeon HD 5970, beta ATI Catalyst driver.
Mid-Range System: Intel Core 2 Duo E8600, 2GB Corsair XMS memory (1GB x 2), 750GB Seagate 7200.11 SATA hard drive, Windows 7 32-bit. Graphics card: Radeon HD 5770, ATI Catalyst 9.11.
Minimum Requirements System: Intel Pentium 4 3.2GHz, Asus P4C800, 1GB Corsair XMS memory (512MB x 2), 160GB Seagate 7200.7 SATA hard drive, Windows XP Professional SP3. Graphics card: GeForce 6800 128MB, Nvidia ForceWare 191.07.
