Though there are still some hotly anticipated titles due in 2016, we might already have the game of the year on our hands. Overwatch has been on our radar for a while now but it recently blew up in a big way with an open beta that attracted 9.7 million players earlier this month. That's a phenomenal figure, double Destiny's beta turnout and a third more than The Division's.
When it comes to making video games, few do it as well as Blizzard. Although the studio isn't known for first-person shooters, it seemed like everyone walked away loving the beta and simply couldn't wait to get back in. I personally spent a number of hours playing and found the game to be flawless; the online multiplayer was satisfying and lag-free.
Tapping its vast experience with competitive real-time strategy games, Blizzard has made something of a first-person strategy game with 21 heroes split between offense, defense, tank and support classes. Naturally, certain match-ups work better than others depending on the situation.
In typical Blizzard fashion, Overwatch is impressive looking yet runs well on a wide range of hardware. It scales down to work on low-end hardware but can also be cranked up to take advantage of high-end gear, especially at the 4K resolution.
Little to nothing is known about the game engine except that Blizzard developed it specifically for Overwatch, which goes a long way toward explaining why the game flows so well.
Between Blizzard’s $40 base price and the fact that class-based shooters have been relatively stagnant since the arrival of Team Fortress 2 nearly nine years ago, Overwatch is poised to be hugely popular among PC gamers. Of course, one question remains: can your hardware handle it?
Benchmarking Overwatch accurately is a real problem because it’s exclusively an online multiplayer game. Getting more than a dozen co-operative players on a map at the same time to carry out a series of benchmarks over two days isn’t realistic. Therefore, we decided to test GPU performance using a custom map with no other players.
This saw me walk around an empty map for 60 seconds, taking the exact same path each time with the same character. The Hollywood map was used, though it wasn't selected for any particular reason.
For CPU testing this method wouldn't be particularly useful, so we had to get a little more creative. The solution was again a custom match, this time including bots. AI-controlled bots place much more load on the CPU, but because I can't control where the bots go and what they do, it's difficult to gather accurate results over our 60-second test period.
The solution was to run the test for six minutes and take the average of three runs, which meant playing for at least 18 minutes per CPU test. I spent around 10 hours dominating medium-difficulty bots, so it's fair to say I'm a lot better now than when I started. The results from the three runs were surprisingly consistent, varying by no more than 10fps and often much less.
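The run-averaging described above boils down to simple arithmetic: average the per-run FPS figures and check that the spread stays within the roughly 10fps variance we observed. A minimal sketch (the function name and sample figures are illustrative, not our actual data):

```python
# Average FPS across repeated benchmark runs and flag excessive variance.
# The run figures below are made-up examples, not measured results.
def summarize_runs(run_averages, max_spread=10.0):
    mean = sum(run_averages) / len(run_averages)
    spread = max(run_averages) - min(run_averages)  # best run minus worst run
    return {
        "average_fps": round(mean, 1),
        "spread_fps": round(spread, 1),
        "within_tolerance": spread <= max_spread,
    }

print(summarize_runs([118.0, 122.5, 120.2]))
# {'average_fps': 120.2, 'spread_fps': 4.5, 'within_tolerance': True}
```

If a run falls outside the tolerance, the sensible move is to discard it and record a replacement run rather than let one outlier skew the average.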
As you would expect from a Blizzard title, Overwatch offers a huge number of tweakable options, which enables it to run on a wide range of systems. While the game can be played on relatively basic hardware, we tested using the more demanding settings as we so often do. The 'Ultra' quality preset was used, though be aware you can go one step higher with 'Epic'.
Something else to be aware of is that by default the game uses an automatic render scale setting, which means you are unlikely to be playing at the actual resolution that has been set. All presets put the render scale option to auto. The game detects the graphics card being used and, if it's determined to be too slow, scales the resolution down, creating a blurrier image than you would otherwise expect. Blizzard has obviously done this to make the game more hardware friendly, but I think this setting should be locked at 100% for the ultra and epic presets.
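To see why a reduced render scale looks blurry, it helps to do the pixel math. Assuming the scale percentage is applied to each axis independently (my reading of how the setting behaves), a quick sketch shows how far the internal resolution can drift from what the display is set to:

```python
# Illustrative math only: compute the internal render resolution for a
# given display resolution and render scale percentage, assuming the
# percentage applies per axis.
def effective_resolution(width, height, render_scale_pct):
    scale = render_scale_pct / 100.0
    return int(width * scale), int(height * scale)

print(effective_resolution(2560, 1440, 75))   # (1920, 1080)
print(effective_resolution(3840, 2160, 100))  # (3840, 2160)
```

In other words, a 1440p monitor with the scale auto-set to 75% is really being fed a 1080p image stretched to fit, which is exactly the blur the auto setting can silently introduce.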
For every graphics card tested I had to first enable the ultra preset and then force the render scale to 100% to ensure we were running at the target resolution. As usual, we tested at 1080p, 1440p and 4K while Fraps was used to record the frame rate data. Radeon graphics cards were tested with the Crimson Edition 16.5.3 Hotfix driver and GeForce cards were paired with the Game Ready 368.22 driver.
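Fraps can log per-frame timestamps to a CSV during a benchmark run; the exact file layout assumed below (a header row followed by one millisecond timestamp per line) is an assumption for illustration. Turning such a log into an average frame rate is just frame count over elapsed time:

```python
# Hedged sketch: derive average FPS from a frame-timestamp log.
# The CSV layout is assumed, not taken from Fraps documentation.
import csv
import io

def average_fps_from_frametimes(csv_text):
    rows = list(csv.reader(io.StringIO(csv_text)))
    times_ms = [float(r[0]) for r in rows[1:]]        # skip the header row
    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0  # total span in seconds
    return (len(times_ms) - 1) / elapsed_s             # intervals per second

sample = "Frame Time (ms)\n0\n16.7\n33.3\n50.0\n"
print(round(average_fps_from_frametimes(sample), 1))   # 60.0
```

Working from raw timestamps rather than a reported average also makes it easy to compute minimums or percentiles later if you want more than a single headline figure.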
- Intel Core i7-6700K (4.00GHz)
- 4GBx2 Kingston Predator DDR4-2400
- Asrock Z170 Extreme7+ (Intel Z170)
- Silverstone Strider 700W PSU
- Crucial MX200 1TB
- Microsoft Windows 10 Pro 64-bit
- Nvidia GeForce Game Ready 368.22
- AMD Crimson Edition 16.5.3 Hotfix
- Radeon R9 Fury X (4096MB)
- Radeon R9 Fury (4096MB)
- Radeon R9 Nano (4096MB)
- Radeon R9 390X (8192MB)
- Radeon R9 390 (8192MB)
- Radeon R9 380X (4096MB)
- Radeon R9 380 (2048MB)
- Radeon R9 290X (4096MB)
- Radeon R9 290 (4096MB)
- Radeon R9 285 (2048MB)
- Radeon R9 280X (3072MB)
- Radeon R9 280 (3072MB)
- Radeon R9 270X (2048MB)
- Radeon R9 270 (2048MB)
- Radeon HD 7970 GHz (3072MB)
- Radeon HD 7970 (3072MB)
- Radeon HD 7950 Boost (3072MB)
- Radeon HD 7950 (3072MB)
- Radeon HD 7870 (2048MB)
- GeForce GTX Titan X (12288MB)
- GeForce GTX Titan (6144MB)
- GeForce GTX 980 Ti (6144MB)
- GeForce GTX 980 (4096MB)
- GeForce GTX 970 (4096MB)
- GeForce GTX 960 (2048MB)
- GeForce GTX 950 (2048MB)
- GeForce GTX 780 Ti (3072MB)
- GeForce GTX 780 (3072MB)
- GeForce GTX 770 (2048MB)
- GeForce GTX 760 (2048MB)
- GeForce GTX 750 Ti (2048MB)
- GeForce GTX 680 (2048MB)
- GeForce GTX 660 Ti (2048MB)
As you can see, even on the ultra quality preset Overwatch plays well on a wide range of hardware at 1080p. Incredibly, even the lowly R7 260X and GTX 660 averaged around 60fps, while older GPUs such as the HD 7950 and GTX 680 rendered around 90fps, as did the more modern R9 380 and GTX 960.
At 1440p the R7 260X and GTX 660 averaged around 40fps, and while this isn't ideal performance, it is playable. For the much more desirable 60fps, something like an old HD 7950 or GTX 680 will get the job done nicely, as will the R9 380 and GTX 960.
Keep in mind that Overwatch is a competitive online first-person shooter, and we found a significant difference between playing at 60fps and, say, 90fps. That being the case, serious gamers will want at least a GTX 970 or R9 390 for competitive performance at 1440p.
As always, 4K poses a more serious challenge for GPUs. Here the R9 380 and GTX 960 averaged around 30fps, which I would call unplayable in this title. Even the GTX 970 delivered a somewhat rough experience at 45fps, while the R9 390 was slightly better at 50fps.
The cheapest way for gamers to reach that magic 60fps is the R9 Nano. The new $600 GTX 1080 (from board partners) performed admirably with an average of 91fps, which means the card should still deliver over 60fps using the epic quality preset.
Republished with permission from TechSpot.
Steven Walton is a writer at TechSpot. TechSpot is a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998.