I Bought An Expensive-Ass PC Gaming Monitor And It’s Really Good

Look, sometimes you spend $709 on a gaming monitor and wind up really happy with your purchase. I’m not saying it happens every time, but it happened this time.


After months of hemming, hawing, and talking myself out of it, last week I went ahead and dropped a bunch of bucks on a newfangled G-Sync monitor. This was a luxury purchase, and in many ways a ridiculous one: The monitor I bought cost twice as much as my PC’s graphics card, which itself cost as much as a PS4 or Xbox One. Whatever, though. Thirty minutes after I plugged the thing in, it was already clear: This shit was worth it.

As a single upgrade, this monitor has had a bigger and more immediate impact than upgrading to a 4GB graphics card; more than moving my Windows installation and games to a solid-state drive; certainly more than upgrading to Windows 10 or overclocking my GPU and CPU.

G-Sync is a proprietary monitor technology that the graphics-card manufacturer Nvidia introduced a couple of years ago. The idea sounded good: A chip that’s built into a monitor allows the monitor to talk directly with your PC’s graphics card and change its refresh rate on the fly, smoothly matching whatever is being output by the card.

If you’ve seen all the talk about frame-rate and 60fps over the last couple of years, that’s all tied to refresh rate, too. The higher (and more stable) the frame-rate on a game, the smoother it looks. The closer the game’s frame-rate is to the screen’s refresh rate, the less chance of tearing or hitching.
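If you want the back-of-the-napkin math behind that, here's a quick sketch in Python. This is just illustrative arithmetic, not tied to any real display API:

```python
# Toy arithmetic: why frame rate and refresh rate have to line up.

def frame_time_ms(fps: float) -> float:
    """How long each frame is on screen, in milliseconds."""
    return 1000.0 / fps

# A 60Hz monitor redraws every ~16.7 ms. A game running at 60fps
# delivers a new frame at exactly that cadence, so the two line up.
print(round(frame_time_ms(60), 1))  # 16.7

# At 50fps the game takes 20 ms per frame, so the monitor's refresh
# lands mid-frame -- that mismatch is what shows up as tearing or
# hitching on a fixed-refresh display.
print(round(frame_time_ms(50), 1))  # 20.0
```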


Historically, PC gamers have been stuck with two options for syncing frames with their monitor refresh-rates: They can use vertical sync (Vsync), which artificially locks a game’s frame-rate to a target number, or they can simply run the game with an unlocked frame-rate. Both options have downsides, and both options can leave you feeling like you’re not getting the most out of your expensive graphics card. Everything I’d heard about G-Sync suggested that this technology is a for-real, actual, bona-fide way to sync your PC and your screen, and that it makes games run noticeably more smoothly.
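To make that trade-off concrete, here's a toy model in Python of what classic double-buffered Vsync does on a 60Hz screen versus a variable-refresh display. It's a deliberately simplified sketch (real drivers are messier), but it shows why a 50fps game feels worse than the number suggests:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per scan-out

def vsync_fps(render_ms: float) -> float:
    """Effective fps under classic double-buffered Vsync: a frame
    that misses a refresh waits for the next one, so presentation
    snaps to whole multiples of the refresh interval."""
    refreshes = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (refreshes * REFRESH_MS)

def adaptive_fps(render_ms: float) -> float:
    """With variable refresh (G-Sync style), the display waits for
    the frame instead, so you simply get the render rate back."""
    return 1000.0 / render_ms

# A game rendering at 50fps (20 ms/frame) on a 60Hz screen:
print(round(vsync_fps(20.0)))     # 30 -- Vsync snaps down to 60/2
print(round(adaptive_fps(20.0)))  # 50 -- adaptive sync keeps 50fps
```

That snap from 60 down to 30 (rather than a gentle slide to 50) is the stutter Vsync users know well; running unlocked avoids it but tears instead.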


Last week I decided, fuck it, I’m going for it. Here’s the monitor I bought. It’s a 27-inch, 2560x1440 Acer with a 144Hz refresh rate and built-in G-Sync support. There are plenty of other G-Sync monitors out there; this one had some good reviews, so I decided to go with it. I got mine from Amazon for a little below list price, but lots of stores carry them.


Thoughts informing my decision:

  1. It seems like a safe bet to get a 1440p monitor, given that it’s become a more reasonable resolution for stable PC gaming. 4K resolution just doesn’t seem practical or even necessary for a monitor-sized screen.
  2. 144Hz is more than double the 60Hz refresh rate of the other screens in my apartment, but I’ve seen enough PC gamers swear by higher frame-rates that I wanted to see what the deal was.
  3. Between the resolution and refresh rate, this monitor seems like it’ll be future-resistant, at least for a few years. For better or for worse, I’ve already committed to Nvidia’s whole deal by buying my latest graphics card from them, so I don’t really see myself switching to AMD anytime soon.
  4. It’s getting dark at like 4PM in Portland this time of year, and buying myself something cool will make me temporarily forget about that and feel happy.

Thoughts giving me pause:


  1. It only has a single DisplayPort input, so I won’t be able to have it double as an aux monitor for my game consoles without buying an expensive adapter and manually swapping inputs. Apparently this is always true for this kind of G-Sync monitor, and it’s a bummer.
  2. It costs $709, which is an insane amount of money to spend on a gaming peripheral, and enough to buy a whole lot of donuts and pastrami sandwiches.
  3. I don’t really care for some of the ways Nvidia does business. (More on that in a bit.)

The pros outweighed the cons, so I ordered the thing. A few days later, it arrived. I plugged it in, and yow. It is a damn good monitor.


It’s tricky to write about this kind of technology, because I can’t just show it to you. You’ll have to take my word for it. So: G-Sync works as advertised, and it’s noticeably changed how I experience PC games. I no longer sweat frame-rate fluctuations at all—I just turn on a game, turn off Vsync, and let it run. I have a GTX 970 graphics card, which can run most games at 1080p or better at north of 60fps. Since my monitor now goes all the way up to 144Hz, it has plenty of headroom to let games exist in the 60-80fps range, and thanks to G-Sync, it runs all of those frame-rates smoothly, with no tearing.

(Some purists may need their games to run at 144fps—I’m not there yet. I can usually detect when a game drops below 60fps, but in the midst of gameplay, I can’t really tell the difference between, say, 78fps and 94fps. When I run a game at a locked 144fps I can detect that it’s unusually smooth, but anything north of 60 is fine by me.)


I wasn’t aware of just how thoroughly screen-sync issues had invaded my PC gaming consciousness until I no longer had to deal with them. Time was, I’d start playing a new PC game with one eye on the FPS counter in the corner of the screen. If I saw a hitch or a slow point, both eyes would dart to the edge of the screen, like I was trying to catch the performance dip in the act. “Oh, nicked down to 54fps that time,” I’d think. “That’s not good.” Eventually I would have to turn off the FPS counter just so I would stop fixating on it and enjoy the game.

Now, every game just runs. GTA V is capable of maintaining 70-90fps on near-ultra settings in 1440p, and you should see it. It looks perfect. Even in the rare event that the frame-rate dips below 60fps, I barely notice, because there’s no hitching or stutter. Other games look just as good: Shadow of Mordor, Mad Max, The Witcher 3, Black Ops III, and on, and on. Dying Light looks bananas. Fallout 4 and Just Cause 3 have some real problems running at a consistent frame-rate, but even those games’ dips aren’t a big deal with G-Sync running.


The monitor can be a little funky sometimes: Assassin’s Creed Syndicate, for instance, drops its FPS to zero every time I leave a menu, though it returns to 60+ a few moments after that. Divinity: Original Sin will sometimes start freezing and unfreezing periodically, though restarting the game fixes that, and I honestly can’t say whether it’s G-Sync-related or not. Regardless, a few hiccups don’t do much to mar the overall experience for me. It’s PC gaming, after all. There’ll always be some funkiness.

There are a few other things I don’t like about the monitor, however, chiefly the fact that it’s required me to make a substantial financial commitment to Nvidia’s hardware ecosystem. Ugh. Just typing the phrase “hardware ecosystem” makes me feel compromised. It’s one thing to take the initial step of buying an Nvidia or AMD graphics card. That’s a first step, and you can always change your mind next time around and go the other way. Buying a second piece of hardware is a much more substantial investment; it effectively removes any chance that I’ll switch to AMD for the duration of this monitor’s lifespan.


The technology is so good that I wish all monitors had it and that it could work with any graphics card. So, it’s too bad that G-Sync is proprietary to Nvidia and requires such a financial commitment to get it. AMD has a competing technology called FreeSync, which sounds like it works similarly, in that it requires a monitor to be equipped with the technology before your AMD card can work with it. There are some technical differences in how the two operate, but a primary difference is that Nvidia has controlled who can and can’t have a G-Sync module for use in their monitors, while AMD has made FreeSync freely available to any company that might want to support it.


AMD’s more open approach is doubtless fueled in part by the fact that they’re the less popular brand and need to cut into Nvidia’s lead, but the dichotomy between the two still reinforces my distaste for the way Nvidia operates. Nvidia is all about injecting their proprietary tech anywhere they can, meaning that most big-budget games ship with GameWorks features that can (and often do) hobble performance on non-Nvidia GPUs. I get how competitive the PC gaming market is, and I don’t entirely begrudge Nvidia their attempts to succeed and make money, but as we’ve seen over the last couple of years, that kind of cutthroat maneuvering hurts games as often as it helps them and causes headaches for gamers who don’t get in line.

It feels like it’ll only be a matter of time before all gaming-oriented monitors and TVs can do something similar to G-Sync. I may dislike that I’m forced to pick a brand and stick with it for the foreseeable future, but for the time being, I’m ultimately fine throwing in with Nvidia. I trust that they’re not going anywhere and that their cards will generally do a good job of running the games I want to play. It’ll do for now.


I understand that this article might get some of you considering spending too much money on a piece of hardware that you really don’t need. I sympathize! Most PC gamers have that one piece of equipment that they don’t own, but that they’d like to own. Your PC can always be a little bit better, after all; it’s both the blessing and the curse of PC gaming. You could always have that slightly faster CPU or that slightly improved graphics card; that clackier keyboard, or that mouse with all the buttons.

All of us have that one thing—the next thing—that we’re considering getting. Sometimes that next thing is more trouble than it’s worth; sometimes, that next thing is a disappointment. But sometimes, you spend a bunch of money on a thing and it winds up being totally worth it. Hooray, a new piece of technology does exactly what it promised it would.


To contact the author of this post, write to kirk@kotaku.com.
