What Can’t You Say on YouTube? Its Content Creators Aren’t Sure
Read more of this story at Slashdot.
“Recently, on a YouTube channel, I said something terrible,” confesses a staff writer for the Atlantic. “But I don’t know what it was.”
The writer continues: "Whatever it was, it was enough to get the interview demonetized, meaning no ads could be placed against it, and my host received no revenue from it."
“It does start to drive you mad,” says Andrew Gold, whose channel, On the Edge, was the place where I committed my unknowable offense. Like many full-time YouTubers, he relies on the Google-owned site’s AdSense program, which gives him a cut of revenues from the advertisements inserted before and during his interviews. When launching a new episode, Gold explained to me, “you get a green dollar sign when it’s monetizable, and it goes yellow if it’s not.” Creators can contest these rulings, but that takes time — and most videos receive the majority of their views in the first hours after launch. So it’s better to avoid the yellow dollar sign in the first place. If you want to make money off of YouTube, you need to watch what you say….
YouTube operates a three-strike policy for infractions: The first strike is a warning; the second prevents creators from making new posts for a week; and the third (if received within 90 days of the second) gets the channel banned…. Although many types of content may never run afoul of the guidelines…political discussions are subject to the whims of algorithms. Absent enough human moderators to deal with the estimated 500 hours of video uploaded every minute, YouTube uses artificial intelligence to enforce its guidelines. Bots scan auto-generated transcripts and flag individual words and phrases as problematic, hence the problem with saying "heroin." Even though "educational" references to drug use are allowed, the word might snag the AI trip wire, forcing a creator to request a time-consuming review….
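The strike escalation described above can be sketched as a small state function. This is purely an illustrative model of the policy as the article summarizes it, not YouTube's actual systems or API; the function name and return strings are invented for the example.

```python
from datetime import date, timedelta

def strike_outcome(strike_dates):
    """Hypothetical model of the three-strike policy as described:
    strike 1 = warning, strike 2 = one-week posting freeze, and a
    third strike within 90 days of the second bans the channel.
    `strike_dates` is a chronologically sorted list of dates."""
    n = len(strike_dates)
    if n == 0:
        return "ok"
    if n == 1:
        return "warning"
    if n == 2:
        return "posting frozen for 1 week"
    # Third strike bans the channel only if it lands within 90 days
    # of the second; what happens after the window lapses is not
    # specified in the article, so we assume no escalation.
    if strike_dates[2] - strike_dates[1] <= timedelta(days=90):
        return "channel banned"
    return "posting frozen for 1 week"

print(strike_outcome([date(2023, 1, 1)]))  # warning
print(strike_outcome([date(2023, 1, 1), date(2023, 2, 1), date(2023, 3, 1)]))  # channel banned
```

The 90-day window is the only temporal condition the article mentions; a real moderation system would presumably also expire old strikes, which this sketch deliberately omits.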
[T]alk with everyday creators, and they are more than willing to work inside the rules, which they acknowledge are designed to make YouTube safer and more accurate. They just want to know what those rules are, and to see them applied consistently. As it stands, Gold compared his experience of being impersonally notified of unspecified infractions to working for HAL 9000, the computer overlord from 2001: A Space Odyssey. ["They don't tell me if it's Nazis, heroin, or anything," Gold says later. "You're just left wondering what it was."]
The article notes that YouTube's algorithm seems to flag videos debunking misinformation as though they were misinformation themselves. (One study found that purveyors of controversial content simply stop worrying about YouTube demonetizing their videos, using them instead to direct viewers to "affiliate" links offering commissions, or to their content on other, still-monetized platforms.)
In just the last three months of 2022, YouTube made almost $8 billion in advertising revenue, the article concludes. “There’s a very good reason journalism is not as profitable as that: Imagine if YouTube edited its content as diligently as a legacy newspaper or television channel — even quite a sloppy one. Its great river of videos would slow to a trickle.”