The ‘demonetized’: YouTube’s brand-safety crackdown has collateral damage

This past May in New York City, YouTube held a summit for roughly 100 of its top video creators, both individual personalities and networks. It was a chance for these creators to get face time with senior YouTube executives and discuss what was happening on the video platform. One issue that some attendees felt YouTube needed to address: declining ad revenue.

A month earlier, YouTube had started using more of its artificial intelligence tools to flag offensive videos on the platform — a move driven by the so-called “YouTube adpocalypse,” during which many advertisers threatened to curb ad spending on the platform after discovering ads were being shown next to offensive, hateful and violent content.

Four months later, eight creators who spoke with Digiday said they’re still feeling the impact of “demonetized” videos. Some of these creators’ videos might contain colorful language but nothing unsavory, and yet under YouTube’s new ad policies they’re being swept up with the real muck.

Creators are still hurting
When YouTube’s disabling of ads started in March, some top creators saw a significant hit to their ad revenue. Phil DeFranco, who has about 5.6 million subscribers on YouTube, saw ad revenue drop by 30 percent within the first month. H3h3Productions, which has about 4.7 million subscribers, said it was only making 15 percent of what it typically makes month to month on YouTube. (Tubefilter has a breakdown on how widespread and significant the issue was early on in YouTube’s crackdown.)

“We expect the third quarter to generally start lower due to seasonal trends, but this year, the drop was larger than in past years,” said Evan Bregman, director of programming at Rooster Teeth, which operates a network of owned and partner channels on YouTube including Rooster Teeth and The Slow Mo Guys. Collectively, the network reaches 38 million subscribers.

By May, revenue for most top creators was getting back to pre-adpocalypse levels. Of the 100 or so attendees at the creator summit, only four raised their hands when a YouTube exec asked if anyone’s revenue numbers were still taking a noticeable hit, according to a YouTube spokesperson.

But that doesn’t mean the issue has gone away for all creators, especially those who don’t have several million subscribers and invitations to private summits with top YouTube brass.

Take Hannah Rutherford, a gaming creator within the Yogscast network. Rutherford consistently gets 1 million views per month on her YouTube channel, which has 1.3 million subscribers, according to Tubular Labs. In a tweet, she said she’s now making twice as much revenue on Twitch as she does on YouTube, primarily because her YouTube ad revenue has been “tanking” even though viewership has remained flat.

The issue is even more troublesome for news and social issues creators on YouTube. Real Women Real Stories, for instance, is a small YouTube channel that promotes women’s rights by producing testimonials in which women share stories of trauma. The charity project relies entirely on YouTube advertising to fund its productions. But since its content focuses on sensitive issues such as rape, sexual abuse and sex trafficking, the Real Women Real Stories channel made $10 in YouTube ad revenue in June, down from $2,000 the previous June, according to channel owner Matan Uziel.

“We want to make sure we bring as many women’s stories forward as we can, but our charitable project is nearly dead because we can no longer pay for productions,” Uziel said.

Flagging sensitive content
The situation has exposed flaws in how YouTube’s AI disables ads on videos. The general understanding among creators is that YouTube will automatically flag videos, titles and thumbnails that feature graphic, edgy or sensitive imagery and text. This is problematic for gaming creators, who frequently play games with graphic visuals and strong language. It’s also problematic for creators who want to talk about news and social issues. Just because a video covers topics such as terrorism, white supremacy or rape doesn’t mean it’s endorsing those ideas.

“When I reached out to YouTube to see why our videos were being demonetized, they said it’s because the videos are not advertiser friendly,” said Uziel. “They came to the conclusion that videos that cover topics such as sexual abuse, rape and women’s issues, in general, could not be monetized.”

This is a reality of platforms, which try to use technology to police matters that often need a human hand. YouTube’s not alone in this area: Facebook faced criticism when it removed an iconic Vietnam War photo that depicted a naked child running away from napalm bombs. Artificial intelligence doesn’t understand nuance yet, which means videos that cover sensitive topics are going to be swept up with the muck advertisers want to avoid.

The problem with YouTube is that the AI it’s using is also inconsistent. Rutherford has published 12 gameplay videos of the game “Hellblade,” and YouTube cleared 11 videos while disabling ads on one, with no clear explanation as to why. Uziel, meanwhile, found that while YouTube disabled ads on a Real Women Real Stories video titled “The Harmful Consequences of Objectifying Women,” a video from The Young Turks titled “Kidnapped and Sold: One Houseless Woman’s Tale” featured ads.

Uziel also found a workaround: reposting videos with the title and subtitles in different languages. For instance, YouTube disabled ads on a video titled “I Was a Sex Slave to Europe’s Elite at Age 6,” but the exact same video with the title in French, “J’étais une esclave sexuelle pour l’élite mondiale à partir de l’âge de 6 ans,” has ads running.

“I’m struggling to understand because they say they can’t monetize topics which relate to women’s issues, but there are videos of those topics that are being monetized,” Uziel said.

One YouTube rep proposed softening titles and language as a potential solution, Uziel added.

“Since some of the wording could be an automatic trigger for our system which is primarily text-driven, so you might like to try maneuvering the algorithm a little bit. However, we can’t guarantee that your videos will monetize with this change since the content within the videos are still of a sensitive nature. We also recommend that you use caution when describing assault situations in your videos,” said a YouTube channel rep in an email to Uziel.
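
The rep’s description of a system that is “primarily text-driven” also helps explain why Uziel’s title-translation workaround succeeds. Below is a rough sketch, in Python, of what such a trigger might look like; the term list and matching logic are illustrative assumptions, not YouTube’s actual classifier.

```python
# Illustrative only: a naive English-keyword trigger of the sort the rep's
# "primarily text-driven" description suggests. The term list and matching
# logic are assumptions for demonstration, not YouTube's actual classifier.

SENSITIVE_TERMS = {"rape", "sex slave", "abuse", "trafficking", "assault"}

def title_trips_filter(title: str) -> bool:
    """Flag a video when its title contains a listed sensitive term."""
    lowered = title.lower()
    return any(term in lowered for term in SENSITIVE_TERMS)

# An English title matches; the same title in French slips past a purely
# English term list, consistent with the workaround Uziel describes above.
print(title_trips_filter("I Was a Sex Slave to Europe's Elite at Age 6"))        # True
print(title_trips_filter("J'étais une esclave sexuelle pour l'élite mondiale"))  # False
```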

YouTube’s appeals process is lacking
With 400 hours of video uploaded to YouTube every minute, according to YouTube, it’s easy to see why most of the vetting is left to algorithms. But creators complain that YouTube has set up a slow and inefficient appeals system.

A recent 58-minute gameplay video Rutherford had scheduled to post was flagged by YouTube before it was even published. “There’s not going to be some guy [at YouTube] who goes, ‘Oh, I’ll watch it now and scan a 58-minute video and get back to you’; it’s going to take a while,” Rutherford said in a video.

Though Rutherford has appealed more than 200 videos flagged by YouTube, only two of the videos have been approved by the system so far.

“Back in March we rolled out new controls for advertisers to help them better choose where their ads are placed, and we rely on machine learning to evaluate the millions of videos on our platform to implement those choices,” a YouTube spokesperson said in a statement. “But no system is perfect, so we encourage creators to appeal for a human review when they feel we got it wrong, and every appeal helps our advertising systems get smarter over time.”

Ad-disabled videos on YouTube must get 1,000 views within seven days to qualify for a review, which YouTube says is done so the AI is not slowed down by videos with only a handful of views. This approach hurts smaller YouTube channels because it removes creators’ ability to make money during the most important stage of a YouTube video’s life cycle: the first seven days. Typically, videos receive 70 percent or more of their views in that first week, according to multiple creators.
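
To make the stakes concrete, here is a minimal back-of-envelope sketch of that first-week blackout; the view counts, CPM and fill rate below are hypothetical assumptions, not figures from YouTube or the creators quoted in this story.

```python
# Rough, illustrative estimate of ad revenue forgone while a video sits
# demonetized through its first week. All figures are hypothetical assumptions,
# not data from YouTube or the creators in this story.

def forgone_revenue(lifetime_views, first_week_share=0.70,
                    cpm_usd=2.00, monetized_rate=0.5):
    """Estimate revenue lost if a video earns nothing in its first seven days.

    lifetime_views   -- total views the video is expected to accumulate
    first_week_share -- fraction of views arriving in the first week
                        (creators cite roughly 70 percent)
    cpm_usd          -- assumed revenue per 1,000 monetized views
    monetized_rate   -- assumed fraction of views that actually serve an ad
    """
    first_week_views = lifetime_views * first_week_share
    return first_week_views * monetized_rate * cpm_usd / 1000


# A 100,000-view video that stays demonetized through week one would forgo
# about $70 under these assumptions -- most of what it would ever earn.
print(f"${forgone_revenue(100_000):.2f}")
```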

Unfortunately, there isn’t one easy fix to this problem. YouTube could update its algorithm to factor in a video’s like-to-dislike ratio, which may improve the AI’s ability to distinguish between an offensive video and a piece that explores sensitive topics, said Ashkan Karbasfrooshan, CEO of WatchMojo. (Facebook, meanwhile, is hiring up to 3,000 human curators this year to scan for and eliminate extremist content.)
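
A minimal sketch of the signal Karbasfrooshan describes might look like the following, with the audience’s like-to-dislike ratio consulted only after a text-based flag fires; the thresholds and decision structure are illustrative assumptions, not YouTube’s system.

```python
# Sketch of the like-to-dislike signal Karbasfrooshan suggests. Thresholds and
# decision structure are illustrative assumptions, not YouTube's actual system.

def should_keep_ads(text_flagged: bool, likes: int, dislikes: int,
                    min_ratio: float = 4.0, min_votes: int = 100) -> bool:
    """Return True if ads should stay on despite a text-based flag.

    text_flagged -- True when title/description keywords tripped the filter
    min_ratio    -- likes per dislike needed to read audience reception as a
                    sign the video covers a topic rather than endorses it
    min_votes    -- minimum engagement before the ratio is trusted at all
    """
    if not text_flagged:
        return True
    if likes + dislikes < min_votes:
        return False  # too little data; fall back to the conservative call
    return likes / max(dislikes, 1) >= min_ratio


# A flagged testimonial rated 4,800 likes to 90 dislikes keeps its ads under
# this heuristic; a flagged video with a mostly negative reception does not.
print(should_keep_ads(True, likes=4_800, dislikes=90))   # True
print(should_keep_ads(True, likes=300, dislikes=250))    # False
```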

Ultimately, the creators’ situation points to the risk of publishing on someone else’s platform, whose business goals may not be aligned with those of the people who supply the content. YouTube’s goal is to get every advertiser comfortable with advertising on the platform, said a YouTube source. That would certainly benefit most creators on the platform, but not all — at least under the platform’s current ad guidelines.

“YouTube is trying to balance the needs of advertisers with the needs of creators. Do they always get it right? Of course not,” said Steven Oh, COO of The Young Turks Network. “It’s a very tough balance to strike, and just as advertisers push for their interests to be protected, creators should do the same.”
