YouTube recently underwent a serious shift in what videos it considered appropriate to be monetised. Monetisation means that a video is approved to have ads show ahead of it, and the creator of the video gets a percentage of that advertising revenue. Previously, channels were considered for monetisation on the basis of their content, and if they were approved, then it was rare for a particular video from a channel to be demonetised – that is, to be considered inappropriate for advertising, and thus to make its creators no money from YouTube.
However, last month a number of advertisers in the UK were shocked to find their ads being played alongside videos with “extremist content” – including those which promoted anti-Semitism and terrorism. Big brands pulled their advertising from YouTube and Google went into a panic. Almost overnight a huge swath of videos was demonetised – including plenty which were completely innocuous.
Content which Google decided was “not advertiser friendly” includes videos that are sexually suggestive, contain foul language, or – somewhat problematically – deal with topics like war, political conflict, or natural disasters.
This essentially means a video about the Christchurch Earthquake is considered to be as toxic to advertisers as one which contains racial slurs.
Alongside this, the category of “sexually suggestive” appears to encompass all LGBT-related content, even if sex itself is not mentioned in the video at all. This has led creators to call the demonetisation policies discriminatory, and means a lot of good content is likely to disappear, as its creators can no longer make any money from their videos.
This isn’t just a problem for YouTube. As Facebook brings in more video content, brands are going to find their ads sandwiched between videos of cute puppies and live streams by your racist uncle. This is less of an image problem for Facebook advertisers as users understand their racist uncle isn’t actually getting any money from that particular advertisement. However, static ads can still appear alongside pages and groups full of misogynistic vitriol and racist speech.
And this problem of appearing to endorse objectionable content will pretty much always be a problem for advertising on social media (which is what YouTube still is, for all that it’s trying to become Netflix). Because of the vast amount of content continually being uploaded to these sites, what counts as objectionable is always, at least at first, going to be decided by algorithm. These algorithms may be too lax, or too strict, but until robots can analyse content the way humans can, they’re probably going to continue to get it wrong.
This means creators will struggle to make money off good content and brands will appear beside things they’d boycott if it were television. Which happens more often depends on how the algorithm is tuned.
Advertisers, of course, would prefer to err on the side of their brands not appearing to endorse terrorism. But it also means they may struggle to effectively target audiences they wish to reach. If swearing automatically demonetises a video, what happens when your desired demographic is one which doesn’t mind the odd f-bomb?
Social media advertising can feel like a highly targetable medium – every platform comes with the ability to narrow down the target audience to an almost frighteningly specific degree. But again, this requires placing trust in algorithms that might not work for you. So what’s to be done?
The answer may be to look beyond advertising on a platform to advertising within it. In the case of YouTube, this means cutting out Google and engaging with creators directly – instead of the ad running before a video, it runs within it, often plugged by the creator themselves. This leverages both the creator’s existing, targeted audience and their YouTube star power.
Companies have long collaborated with “influencers” on social platforms – particularly Instagram and Snapchat. This could be the way to get around the whims of Google or Facebook – to the point where small businesses may have to become influencers themselves.
This is undoubtedly more complex than the algorithmic targeting of age, sex, and location. It involves creative thinking, predicting trends, and getting on board with new platforms. But in a world where people are employing ad blockers and using the skip button ever more liberally, it may be the only way to really get your product in front of a plethora of interested eyeballs.