While people are still grappling with the technical ramifications of Google’s Stadia platform, gamers have begun asking deeper, more troubling questions. What do mods look like in a world of game streaming? What happens to game preservation? What happens if Google dwarfs gaming the same way it has with search, browsers and advertising? And most worryingly of all, what happens if Google decides to walk away from the industry later on?
In the immediate aftermath of the Google Stadia announcement, the public discourse largely centered on the technicalities. That was the part Google had provided the most detail on, so it was natural for people to focus on broadband connections, latency, and what is possible now versus a few years from now.
There was a little bit of excitement mixed in with all of that. What’s the gaming experience like when your connection is in the same room as the dedicated servers that you’re playing on? What’s the potential level of fidelity like when games aren’t limited to the hardware in a single console, or a single PC? What experiences can you have when it’s possible to develop a game that takes players across multiple screen formats?
That’s exciting to think about. But there’s no such thing as a free lunch, especially with a company that wants to carve up a sizeable chunk of the gaming pie for itself.
The biggest complaints and concerns about Stadia can be grouped into three broad categories. The first is a backlash against Google itself. Not Google the search engine, or the presence of a company the size of Google (or its parent company Alphabet), but rather concern over how Google specifically operates as a business.
Google has a history of launching and then abandoning products, even ones that users really love. There’s Google+, the company’s alternative to a Facebook-style social offering that never really took off. There’s offerings like Google Reader, which fans of RSS readers still miss today. Google Health, a service to broaden access to health and wellness information, was shut down in 2012 after “not having the broad impact that we hoped it would”. Google’s Orkut social networking service found some popularity overseas, but it didn’t gain traction in the United States, so that was killed off in 2014. Google’s Allo messaging app was shut down this month.
It’s not just virtual products that Google has a history of walking away from. The most damning indictment of the company’s attitude brought up in the past week was the rollout of Google Fiber in Louisville, Kentucky. Louisville became the 12th city added to the fiber project back in 2017, and the internet conglomerate quickly set about rearranging the city’s infrastructure to offer gigabit speeds to residents.
But Google vastly underestimated the technical scope of the project. The plan was to roll out fiber using a series of shallow trenches: cables were laid two inches beneath the sides of roads and then covered over with asphalt. The process caused massive disruption, since the city’s roads had to be torn up. Worse still, the trenches and asphalt sealant were too shallow, so the patching material worked loose and, in some cases, left the cables and wiring exposed.
Google had to re-cover affected areas with hot asphalt a second time, but that wasn’t the only problem it faced. AT&T and Spectrum sued the conglomerate to block a city ordinance granting Google access to electricity poles in the city. AT&T owns most of the poles in the area, but the lawsuit was really just an attempt to stall Google’s rollout, as evidenced by AT&T’s decision not to challenge the judge’s ruling against it.
But the technical challenges proved too much, and after all the disruption Google announced it was shutting down the Louisville project entirely, less than two years after signups began. The experiment hasn’t been a total failure - Google’s presence forced AT&T to roll out gigabit services faster than they would have ordinarily. But for residents who watched their city pass all the laws Google wanted, and then watched as Google tore up their streets and laid hot asphalt over everything to fix it, only to abandon the project and shut down services altogether, it’s a galling lack of respect.
Rightly so, people have questioned what would happen if Google took the same approach with games. Which feeds into the second major concern.
Part of the reason why emulators are so revered is that they’re often the only way some older titles can be played at all. Video games are built on a long history of quirks and differences: different versions for different regions, titles censored or banned outright in some nations, and the changes a game undergoes during the localization process.
In the modern era, that preservation problem has been less about functioning hardware and more about compatibility. There are plenty of modders and gamers who have found ways to get titles that used to run on Windows 95 or Windows 98 running just fine in 2019; GOG and Night Dive Studios have built businesses doing precisely this.
But have you ever tried to get a game that only ran on Windows 3.11 going? And that’s just the compatibility side. Archivists also have to deal with the degradation of physical media: cartridges that no longer work after 15 or 20 years, magnetic media that loses its charge over time, essential data stored on EPROMs that eventually becomes unreadable.
Preserving these games is only possible because gamers have access to the original files, either through physical means or by way of being able to download them locally in the first place.
Cloud gaming does away with that process entirely. It’s part of why cloud gaming has any appeal at all - by not having to download and install tens of gigs worth of assets, you’re cutting out all kinds of loading and downtime that gets in the way of actually playing a video game.
But it also means you’re entirely reliant on servers for that game, or the platform holders that offer them, being online forever. And that’s never, ever the case. Even when communities have tried to keep older games online, they can run afoul of license holders and copyright issues. But at least fans can try to keep a game alive.
With cloud gaming, that’s not possible.
Now, that might not matter a great deal for games offered via traditional, local storage. For the interim, the next Assassin’s Creed, the next Fallout, Battlefield 6 or whatever the next AAA game is will be available that way. You’ll be able to buy them digitally or on a disc, like always.
But what happens when games are designed solely around the idea of a cloud service, like the platform exclusives Google is funding?
And what happens to the future of mods? Some of the greatest games today exist as a result of mods: Team Fortress 2, which went on to inspire Overwatch; Counter-Strike, born out of a Half-Life mod, on which the foundations of Western esports were built; and countless games improved or overhauled through the tireless work of fans, as seen in the Fallout and Skyrim communities.
Do developers have to build new systems and models to make existing mods playable in a cloud gaming context? Do new editors have to be made for people to access the files? Or does that functionality just disappear altogether?
Part of Google’s Stadia pitch wasn’t just to eliminate frustrations for gamers, but also the technical limitations of existing hardware that frustrates developers.
Take the idea of elastic compute. Instead of relying on the power of a single console, developers building for Stadia could design around combining multiple machines in a data center, allowing games to run at even higher resolutions, with even more fidelity, able to populate in-game worlds with more people, more things to do, and just more stuff.
That’s enticing because existing hardware will only take you so far before you run into a litany of performance problems. It might be the lower-powered CPUs in consoles that make it difficult to calculate the movement of too many NPCs at once. Or memory limitations that affect how much data a client can buffer and stream at any given moment.
But how do you keep a game alive that was never designed to exist outside of a data center in the first place?
Nobody can answer that. To be fair, it’s not a new problem. It’s a question people have asked repeatedly with the rise of digital platforms like Steam, and the online-only nature of gaming services in 2019 more generally. Even without cloud gaming, the push towards subscription-based services means there will be a segment of gamers who - in all likelihood - spend hundreds of dollars a year on a hobby without having anything tangible to show for it.
You’re paying for access, not a product. Should that company decide your money is no longer worthwhile, there’s bugger all you can do about it. The same applies to pricing and access more generally. Australians might have access to a wealth of gaming platforms, and there’s competition on the horizon for cloud gaming too.
But in emerging countries and continents, where modern gaming has failed to penetrate due to a myriad of issues (socioeconomic conditions, internet infrastructure, shipping and supplier problems in getting hardware into some countries), that choice might not be available.
What happens in those places when there’s nobody to stop Google from upping prices?
The third and most immediate backlash to Stadia was over technical feasibility: whether Stadia would function at all. A lot of that conversation was dominated by the here and now. Some Australians have rightly pointed out that the spotty, broken rollout of the NBN makes a service like Stadia vastly less enticing than it should be. But the majority of criticism actually came from Americans. Google might have all the data centers, cloud platforms and internal infrastructure it needs throughout the continental US, but the quality of internet service from state to state is so unreliable that it’s not unreasonable to argue Australia has better internet - on the whole - than the continental US.
Google Stadia’s chief Phil Harrison told Kotaku that only 30Mbps is required for streaming 4K content, with the 1080p/60fps stream for Assassin’s Creed: Odyssey needing 15Mbps (although 25Mbps was recommended). If you consider that most Australians tend to stream content at 720p or on smaller devices, where the trade-off of lower resolutions is more acceptable, it’s not unreasonable to think that a solid chunk of the Australian population would be capable of enjoying a smooth Google Stadia stream right now.
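To put those bitrates in perspective, a quick back-of-the-envelope conversion shows how much data a streaming session would actually chew through. The only inputs below are the bitrates quoted above; real-world usage would vary with the encoder and scene complexity, so treat this as a rough sketch, not a measurement:

```python
# Rough data usage for a constant-bitrate game stream.
# 8 bits per byte, 3600 seconds per hour, decimal gigabytes.

def gb_per_hour(mbps: float) -> float:
    """Convert a stream bitrate in megabits/sec to gigabytes/hour."""
    return mbps * 1_000_000 / 8 * 3600 / 1_000_000_000

for label, mbps in [("1080p minimum", 15), ("1080p recommended", 25), ("4K", 30)]:
    print(f"{label}: {mbps}Mbps is roughly {gb_per_hour(mbps):.1f}GB per hour")
```

At the recommended 1080p bitrate, a few hours of play a night adds up to hundreds of gigabytes a month - which matters a great deal on capped NBN and mobile plans.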
There’s the rollout of the 5G network to consider as well, the advancement of the NBN, and what happens with future compression technologies and next-generation video codecs like H.265/HEVC and AV1. Newer codecs offer better quality at lower bitrates, meaning users don’t have to stream as much data to get the same picture quality.
But even if we make some concessions for the practical bandwidth requirements, there’s still the latency problem.
John Carmack’s quip this week about gamers playing on unoptimized TVs serves as a useful reminder. Gaming is the world’s largest entertainment medium, and while there is a huge subsection that cares extraordinarily deeply about the smoothness and technical precision of some games, there are plenty of people out there who really, truly don’t give a shit.
There is a point where “some lag” becomes “unplayable”, and that window varies enormously between games. Narrative adventures or episodic titles like Life is Strange should run without issue on any such service. As long as the video quality is sufficient and the delay isn’t tectonic, most people will be happy.
But the whole Stadia project isn’t designed just to bring singleplayer games to the world. It’s an extension of the largest source of content creation on YouTube - gaming - and the community that exists within that. So the real test of whether Stadia works depends on how much Google can minimize the latency in multiplayer games. And some of those games have very, very small margins for error.
Fighting games are a great example. A lot of these games have extremely tiny response windows. Take the parry, a technique introduced in Street Fighter 3 that requires pinpoint timing. It’s not just a neat feature, but a measure of skill - one that happens to be central to one of the greatest and most iconic moments in gaming’s history, Daigo Umehara’s full parry at Evo 2004.
Parrying a super like Daigo did requires 15 correctly timed taps forward or down on the stick. The window for just one successful parry is only between six and ten frames; at 60 frames per second, six frames amounts to about one-tenth of a second, or 100 milliseconds, to respond.
The average human reaction time is between 210 and 250 milliseconds for a visual prompt, around 170 milliseconds for an audio cue, and a little less than that for physical stimuli (being touched, for example).
When you factor in the time someone has to respond against the lag between a button press and that action being recorded, along with display lag and any other associated delay from the connection itself, it’s a bloody small window.
Initial tests from Eurogamer found that Google Stadia had around 166 milliseconds of lag, with display and Wi-Fi connection delay included. That’s more than double what you’d get from a PC game running at 60 frames per second. It’s also far more than players would consider acceptable for a lot of esports titles - Counter-Strike, League of Legends, Rainbow Six: Siege and so on - and certainly enough to interfere with the experience of twitch-based shooters like Apex Legends, Fortnite or Battlefield.
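To see why 166 milliseconds is so damaging, it helps to run the arithmetic on the parry window above. A minimal sketch - the six-frame window and the 166ms streaming figure come from this article, while the local controller and display lag numbers are illustrative assumptions only:

```python
# Back-of-the-envelope latency budget for a six-frame parry window.
# The frame window and streaming lag are from the article; the
# controller (20ms) and display (30ms) figures are assumed examples.

FRAME_MS = 1000 / 60  # one frame at 60fps, about 16.7ms

def remaining_window(window_frames: int, *delays_ms: float) -> float:
    """Reaction time left once all fixed delays are subtracted."""
    return window_frames * FRAME_MS - sum(delays_ms)

# With assumed 20ms controller lag and 30ms display lag, a six-frame
# window leaves roughly 50ms to react - already brutal for humans.
local = remaining_window(6, 20, 30)
# Stack ~166ms of streaming lag on top and the budget goes negative:
# the action is over before the player can even see it.
streamed = remaining_window(6, 20, 30, 166)
print(f"local: {local:.0f}ms, streamed: {streamed:.0f}ms")
```

Whatever the exact local lag figures, the point stands: a window already tighter than average human reaction time simply cannot absorb another 166 milliseconds.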
Of course, if anyone can make it work it’s Google (or Microsoft). The biggest downfall for cloud gaming services in the past has always been infrastructure, which is the largest component in making a service like this work; the streaming element is a problem that’s already been solved. Some gamers say input lag is the biggest problem facing Stadia, and while it’s certainly a huge challenge, it’s worth remembering that reducing lag is a problem developers and game programmers were finding ways to solve in the ‘80s and ‘90s as well.
As more devs shift their focus or start investigating the cloud gaming experience for themselves - which a company the size of Google generally encourages - more solutions will be found to reduce response times and input lag across multiple devices. The Stadia controller connecting directly to data centers, rather than a Chromecast or another device, is one way of tackling this.
It’s also worth remembering that Stadia doesn’t have to solve all these problems. Companies are excited for cloud gaming precisely for its potential to expand the current gaming market - not necessarily its potential to subsume the existing audience. There are plenty of emerging markets that can’t enjoy gaming today due to the cost of consoles, TVs, gaming PCs and associated peripherals, and for those markets the ability to stream something through a low or mid-range phone, relying exclusively on their mobile connection, opens up a whole new world. There are hundreds of millions, if not billions, of people in situations like those, and a lot of the discussion around Stadia has left them out of the loop entirely.
But that doesn’t mean Stadia is a service that should be welcomed with open arms. Google doesn’t just need to convince people that Stadia can work - it needs to convince gamers that it will stick around for the long haul. Google’s handling of shifting trends on YouTube certainly hasn’t engendered a lot of faith, and it’s natural for people to be concerned about what the gaming market looks like after a conglomerate the size of Google starts throwing its weight around. Google hasn’t allayed those fears just yet, and until it does, expect the backlash to continue.
This story originally appeared on Kotaku Australia.