YouTube is the most popular video service on the web today, but a series of unannounced changes to its advertising over the past month has significantly hurt those who make a living by uploading to the service. Users have taken to calling the fallout the Adpocalypse.
The changes involve an automated system that flags content advertisers may consider inappropriate. It was introduced in the wake of racist slurs used by Felix “PewDiePie” Kjellberg, the platform’s most-watched content creator. The motivation is understandable: outside of Google’s targeted ad system, the ads placed in front of any particular video are more or less random, and advertisers don’t want their products displayed in front of a potentially offensive or off-putting video. On a platform where anyone can upload content and some channels are dedicated to hate speech and violence, that is neither an unlikely possibility nor an unreasonable concern.
The issue lies in the implementation. With the exception of its YouTube Red subscription service, YouTube’s entire content base comes from outside users, yet the new system was not announced until it had already been in place for roughly a month. The response was at first concern, as many users saw their revenue cut by a significant margin. That concern turned to outrage when they discovered the reason.
Popular games journalist Jim Sterling, who uses YouTube to publish the majority of his work, said in a video: “For my part, the content I produce that’s still ad-supported saw its revenue slashed by about half. Roughly 50 per cent of the money I was making vanished in a month. No idea why, outside of assumption and the nervous murmurings of other content creators.”
If the system deems a video unsuitable for most advertisers, it restricts which ads can be played against it; since the video no longer gets the full range of ads, the uploader’s revenue suffers. Once a video is flagged, users can submit it for manual review, in which a real person watches the video and decides whether it is actually unsuitable in some way. But because the system had been running for a month without notice, users didn’t realize they had that option.
“We were getting flagged, not knowing about it, having no option to question it, and wondering where the money went,” Sterling said.
Sterling then asked to have his content reviewed manually. After a day, all but two of the videos submitted were deemed suitable for advertisers and had their original ads reinstated. The system is making mistakes.
“Every time I upload a new video, almost without fail, it gets falsely flagged and I have to request a review, which could take anywhere from a day to several days,” Sterling said. “Oh, and they only review videos that got 1,000 views in the last seven days, which means a lot of my older content isn’t worth appealing anymore, while those who run really small channels are just screwed.”
The automated system has been flagging nearly everything, from video game content to tech support and how-to videos. For most creators, that means taking a significant hit to ad revenue in the first day or two after upload, which is typically when a video gets the most views.
With ad revenue falling, content creators have had to look elsewhere to make a living, most commonly Patreon, a service where people can pay creators for their work on either a monthly or per-video basis. But YouTube, citing abuse, decided to disallow links to Patreon and to merchandise sites on video end cards for users who aren’t part of the YouTube Partner Program.
Sterling said, “To put it bluntly, if it weren’t for Patreon, I’d be financially devastated by this.”
Users are still able to provide links to Patreon and merchandise in their description boxes and on their channel pages, but until YouTube’s Adpocalypse is over, it will be difficult to make up the lost revenue.