Photo via YouTube screencap

If YouTube won’t monitor offensive content, viewers must

By Aisha Sajid, January 23, 2018 —

If you’ve been on social media recently, you’ve probably heard about Logan Paul’s YouTube video showing a suicide victim in Japan’s Aokigahara forest. Paul, a Vine star turned YouTube celebrity, posted the video on Dec. 31, 2017. He faced widespread outrage and criticism for failing to recognize how reprehensible it is to joke about suicide.

The creators of Vine have subsequently banned him from re-joining their community. YouTube finally took action 10 days after the incident by removing Paul from their “Google Preferred” program, but they didn’t suspend his account or channels. The preferred program lets advertisers buy ad space on the most popular creators’ content, including Paul’s, whose video itself generated $90,000 from the 6.3 million views it accumulated in 24 hours.

YouTube’s community guidelines dictate that content containing visually graphic or violent material is not permitted unless it’s for educational purposes. Content is removed if it’s noticed by YouTube, or if viewers flag it for review. Only after “three strikes,” or three instances of inappropriate content, is the channel deleted.

But YouTube is unlikely to level this punishment against channels with a large following, because popular creators make the website a staggering amount of money from advertising revenue and sponsorships. The site rewards those who amass views, so creators compete for clicks, which drives them to find new ways to catch the eye of viewers. This can include shocking audiences with jarring thumbnails and captions to garner more views and possibly be featured on the coveted trending page.

In Paul’s case, not only did he intentionally post inappropriate content — with a blatant disregard for the victim and their family — but he demonstrated that he didn’t understand why laughing at suicide victims is inherently wrong. The video was also featured on YouTube’s trending page for 24 hours before Paul deleted it. He gave an apology, but this didn’t stop the outrage from viewers and other YouTube creators, mostly because Paul was not reprimanded in the same way that less popular creators would have been.

The young, idolizing teens who watch YouTube have put big creators like Paul on a pedestal, protecting them from criticism and other backlash for their mistakes. Since viewers are stakeholders that YouTube must please, they ultimately have a responsibility to reject content and creators that violate the boundaries the community has set. This is not to say all content on YouTube is bad — there are some amazing people and channels that deserve to be recognized. Next time you come across a purposely shocking video, send a message to content creators by skipping it, because YouTube won’t do it on its own.

Articles published in the Gauntlet‘s opinion section do not necessarily reflect the views of the Gauntlet editorial board.
