Two years ago we published an in-depth performance review of World at War, the fifth major installment in the Call of Duty series. Call of Duty: Modern Warfare 2 followed almost exactly a year ago. Now the latest in a string of very successful yearly releases, Call of Duty: Black Ops, has arrived, paving the way for even more games in the series.
Call of Duty: Black Ops debuts a new theme built around the Cold War, whereas previous titles dealt with either World War II or modern warfare. Developed by Treyarch, Black Ops seems to have a lot more in common with the older World at War than with Modern Warfare 2, though.
Besides coming from the same development team, Black Ops essentially uses the older Call of Duty 4 game engine. To be precise, the game runs on an enhanced version of the World at War engine, which is itself an improved version of what Call of Duty 4 used back in 2007. This new revision of the engine features streaming texture technology, also supported in Modern Warfare 2, that allows for larger levels, for example the "Payback" level where the player controls a helicopter. Additionally, lighting effects have been improved and the game supports stereoscopic 3D rendering on compatible hardware.
Before we get busy with the benchmarking to see how various hardware configurations handle this game, here is a quick summary of what's going on (courtesy of Wikipedia). As pointed out before, Black Ops takes place during the Cold War. The player mainly controls two characters: special forces operative Alex Mason and CIA agent Jason Hudson. The single-player campaign revolves around an experimental Soviet chemical weapon codenamed "Nova-6". The game includes locations such as the Ural Mountains in central Russia, Cuba, Laos, and Vietnam. Viktor Reznov, a key character from the Soviet campaign in World at War, has been confirmed to return for Black Ops, joining Mason and the Studies and Observations Group (SOG) in Vietnam. Dimitri Petrenko, the Russian protagonist from World at War, also makes an appearance.
The online multiplayer mode of Black Ops retains the experience points and unlockable reward system the series has maintained since Call of Duty 4. "Create-a-Class 2.0" allows enhanced personalization with appearance items as well as upgradable perks, while weapons are extensively customizable with clan tag engraving, emblems, attachments and camouflage paint.
For the first time in the series, clips of online gameplay can be recorded. Some PC-specific features that were dropped from Infinity Ward's Modern Warfare 2 return, such as lean, mod tools, the developer console and dedicated servers, though dedicated servers are exclusively provided by GameServers.com. Steam is the exclusive platform for Black Ops on the PC, so the game is protected by Valve Anti-Cheat.
While testing different processors with Call of Duty: Black Ops, it became apparent that quad-core processors offered considerably better performance. Before we show you the results, here's a quick CPU utilization screenshot.
Call of Duty: Black Ops can take advantage of four cores and is undoubtedly optimized for the current crop of chips. In fact, the game relies so heavily on all four cores that we found it almost unplayable (OK, that may be exaggerating a bit) on even the fastest dual-core CPU. That said, AMD's triple-core Athlon II and Phenom II processors did provide lag-free performance.
Although it is a CPU-bound game, Black Ops doesn't rely on a processor's L3 cache to perform at its best. The quad-core Athlon II X4 645 was just 6fps slower than the hexa-core Phenom II X6. Moreover, six cores are wasted on this game, and the Phenom II X4 970 proved that extra frequency is more valuable.
Something that's immediately apparent when looking at the above graph is that the Core i5 and i7 processors are far superior to anything else we tested. For example, when paired with the Radeon HD 5970, the Core i5 750 was a staggering 29% faster than the Phenom II X4 970, which, we might add, is clocked 32% more aggressively.
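For those curious where the 32% figure comes from, it is simply the ratio of the two CPUs' stock clock speeds, a minimal sketch assuming the published specifications of 2.66GHz for the Core i5 750 and 3.5GHz for the Phenom II X4 970:

```python
# Stock clock speeds in GHz (assumed from each CPU's published specifications)
i5_750 = 2.66   # Intel Core i5 750
x4_970 = 3.50   # AMD Phenom II X4 970

# Relative clock advantage of the Phenom II X4 970 over the Core i5 750
clock_advantage = (x4_970 / i5_750 - 1) * 100
print(f"Phenom II X4 970 clock advantage: {clock_advantage:.0f}%")  # prints 32%
```

That the lower-clocked Core i5 still came out 29% ahead underlines just how much more work Intel's architecture extracts per clock in this game.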
The Core i3 540, on the other hand, crashed and burned with just 42fps on average, making it surprisingly slower than the Core 2 Duo E8500. Clearly, Hyper-Threading technology was of little help here.
1680x1050 - Gaming Performance
At 1680x1050 we found that the Radeon HD 5770 was able to average 61fps with a minimum of 41fps, which we considered playable. Below that sat half a dozen slower graphics cards: the GeForce GTS 450, for example, delivered playable frame rates, but sudden drops in performance were occasionally noticeable.
Surprisingly, the older Radeon HD 4890 performed poorly. Averaging just 50fps, it was only 2fps faster than the Radeon HD 4850, and both cards occasionally dropped to just 30fps with quite noticeable lag. The GeForce 9800 GT averaged 42fps and really struggled in intense scenes, while the current-generation Radeon HD 5670 was, as always, a disappointment.
Looking beyond the Radeon HD 5770 it was all smooth sailing. The old GeForce GTX 260 managed an average of 70fps and a minimum of 52fps, while the GeForce GTX 275 was significantly faster with an average of 84fps. The Radeon HD 5850 also performed well, averaging 89fps, though it occasionally dipped to a minimum of just 52fps.
At the top of the scale we have the brand new GeForce GTX 580 with an average of 123fps, and we suspect that at this resolution it ran into the limits of our overclocked Core i7 processor. The GeForce GTX 480 was on average just 2fps slower, though its minimum recorded frame rate was 17fps lower. The dual-GPU Radeon HD 5970 is rarely seen performing well at lower resolutions, but with an average of 117fps we have nothing to complain about this time.
Republished with permission from TechSpot.com.
Steven Walton is the chief hardware editor at TechSpot; he also runs his own review site Legion Hardware.
TechSpot is a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998.