Instead, the first viral video of this year was a vlog by YouTube star Logan Paul in which he, among other potentially offensive things, visited a “suicide forest” in Japan and filmed a human corpse. Although Paul ultimately apologized and removed the video (after it attracted several million views), the backlash was fierce enough that YouTube itself released an “open letter” on Twitter condemning its content.
All of which is a lot of throat-clearing for the question: Could all this have been prevented? On the latest episode of Too Embarrassed to Ask, Daily Beast reporter Taylor Lorenz said that YouTube doesn’t always speak out when videos cross the “guidelines” it has set up.
“Their community guidelines are just so arbitrarily regulated, and it’s so all over the place,” Lorenz said. “They would probably need a million more people if they wanted to truly moderate the platform.”
Instead, YouTube has said it will use artificial intelligence to more efficiently police the enormous quantity of videos on its platform. However, Lorenz said she’s skeptical of that, and proposed a “tiered system” for catching the next “suicide forest”-type video before it goes viral.
“I’m by no means a content moderation expert, but I do think they should have some kind of system where things are being reviewed on a regular basis if you reach a certain threshold on your channels,” she said. “Yes, they should be moderating tons of stuff, but if somebody has an audience of 16 million, they should keep a closer eye.”
“And they theoretically do, but they’ve given a lot of — even the biggest creators — a lot of free rein,” Lorenz added. “It only becomes a problem when things are getting negative attention in the media.”
On the new podcast, Lorenz also explained why YouTube has been reluctant to moderate its video creators’ content in the past. Among other reasons, the site hasn’t officially recognized that it is a media company with a consistent set of values that should be upheld all the time.
“Right now, they’ve alienated so many creators because they’re randomly punishing people if they get in trouble with press, but they’re not being proactive about changing the culture or changing what content performs well,” Lorenz said. “They don’t want to exert that much control, but I think they need to take a little more of a proactive approach with some of the biggest people on their platform.”
She also talked about why, in spite of the negative attention YouTubers like Paul may attract in the short term, YouTube isn’t at risk of losing its hold on a generation of younger media consumers.
“It’s a better experience to watch stuff on YouTube than traditional TV,” Lorenz said. “You can watch it whenever you want, you can subscribe to certain channels and they’re pumping stuff out 24/7 that’s interesting; it’s not like waiting for a TV show. Also, it’s mobile, short-form [and] easy to digest. I don’t see it going away at all.”
Have questions about YouTube that we didn’t get to in this episode? Tweet them to @Recode with the hashtag #TooEmbarrassed, or email them to TooEmbarrassed@recode.net.
If you like this show, you should also check out our other podcasts:
- Recode Decode, hosted by Kara Swisher, is a weekly show featuring in-depth interviews with the movers and shakers in tech and media every Monday. You can subscribe on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
- Recode Media with Peter Kafka features no-nonsense conversations with the smartest and most interesting people in the media world, with new episodes every Thursday. Use these links to subscribe on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
- And finally, Recode Replay has all the audio from our live events, such as the Code Conference, Code Media and the Code Commerce Series. Subscribe today on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
This article originally appeared on Recode.net.