YouTube has blown up this week over revelations that child predators have been using a combination of seemingly harmless videos and the site’s comments section to engage in predatory behaviour. In response, major advertisers like McDonald’s and Nestlé have begun pulling commercials off the platform, along with Fortnite maker Epic.
Having learned that some Fortnite ads were playing before the kind of videos explained here by Matt Watson, Epic has pulled all of its pre-roll commercials (the ads that play before a video starts) from YouTube.
“We have paused all pre-roll advertising,” a spokesperson for Epic told The Verge. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service.”
YouTube has pledged to take action, but the revelation once again raises two big issues for creators on the platform. Firstly, if more major advertisers like Epic and McDonald’s begin removing their commercials, then everyone’s ad revenue is going to suffer, adversely affecting people who have had nothing to do with the controversy.
And secondly, this discovery of actual predatory behaviour comes in the same week that some harmless Pokemon videos were mistakenly barred from YouTube. These kinds of errors and oversights are inexcusable for a company the size of YouTube, and show that it’s too reliant on algorithms to moderate its site when more direct human intervention could have better addressed the problem.