Professionalism/YouTube Censorship

YouTube allows users to upload and share videos. With over 1 billion active users in 2013[1], YouTube has become the primary source of income for content creators and a secondary platform for mainstream media companies[2][3]. In contrast to mainstream television, YouTube supports a disproportionate number of minority personalities[4]. With projects like It Gets Better, YouTube is also a vehicle for social change. Despite its widespread use, various reports have concluded that YouTube's ad revenue does not allow it to profit due to the high cost of bandwidth[5][6]. This article focuses on how YouTube censors content through demonetization and filtering in order to retain advertisers.

YouTube Monetization Process

As of 2019, in order to make money from YouTube videos, content creators have to be accepted into the YouTube Partner Program (YPP) and have AdSense linked and enabled[7]. Applying to join the YPP is similar to applying for a job: content creators must meet subscriber and view-count requirements, much as a resume shows that an applicant has enough experience. Content creators who have not monetized their accounts through this process will not receive any money from YouTube, regardless of the number of views on their videos.

YouTube Functionalities


YouTube relies on users to report inappropriate content[8]. Reported videos are not immediately taken down. Instead, they are reviewed to see if the videos violate YouTube's community guidelines. Videos that violate the guidelines are taken down.


If a video is found to violate YouTube's community guidelines, the video is removed and the channel that uploaded it receives a strike. A channel that receives three strikes in a 90-day period will be permanently removed from YouTube[9]. There is a process to appeal the strikes. After appealing a strike, three outcomes are possible:

  • If YouTube finds that the content follows Community Guidelines, the strike will be removed and the content will be reinstated. If the user appeals a warning and the appeal is granted, the next offense will be a warning.
  • If YouTube finds that the content follows Community Guidelines, but is not appropriate for all audiences, YouTube will apply an age restriction. If the content is a video, it will be hidden from users who are logged out, under the age of 18, or have Restricted Mode activated. If it is a custom thumbnail, it will be removed.
  • If YouTube finds that the content was in violation of its Community Guidelines, the strike will stay and the video will remain off the site. There is no additional penalty for rejected appeals.

Users may appeal each strike only once[10].
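The three-strike policy described above can be sketched as a simple sliding-window check. This is an illustrative toy model built only from the rules stated in the text (three strikes within 90 days leads to termination), not YouTube's actual implementation; the class and method names are hypothetical.

```python
from datetime import date, timedelta

class Channel:
    """Toy model of the three-strike policy described in the text.

    The 90-day window and three-strike termination rule come from
    YouTube's stated policy; everything else here is illustrative.
    """
    WINDOW = timedelta(days=90)
    MAX_STRIKES = 3

    def __init__(self):
        self.strikes: list[date] = []  # dates of strikes that survived appeal

    def add_strike(self, when: date) -> None:
        self.strikes.append(when)

    def is_terminated(self, today: date) -> bool:
        # Only strikes issued within the trailing 90-day window count
        # toward termination.
        recent = [d for d in self.strikes if today - d <= self.WINDOW]
        return len(recent) >= self.MAX_STRIKES

ch = Channel()
for offset in (0, 10, 20):
    ch.add_strike(date(2019, 1, 1) + timedelta(days=offset))
print(ch.is_terminated(date(2019, 1, 25)))  # True: three strikes within 90 days
```

Note that in this sketch a strike whose appeal succeeds would simply never be added, matching the appeal outcomes listed above.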

History of YouTube Monetization Policies

2007: Launch of original YouTube Partner Program (YPP) for channels to earn ad revenue

2011: YouTube invests $100 million in content creators

2012: YouTube opens the YPP[11] to everyone, with the sole requirement that people monetize a single video (not entire channels)

2014: YouTube launches "Google Preferred", a program that allows advertisers to pay more for their ads to appear specifically on high-performing creators' videos

2015: Launch of YouTube RED, a paid content subscription service

2017: The Beginning of the "Adpocalypse[12][13]." Policy changes require a channel to have 10,000 views in order to be monetized[14]

2018: YouTube changes YPP requirements to 4,000 watch-hours and over 1,000 subscribers

2019: YouTube requires YPP in addition to the account being linked to an AdSense account for monetization

Types of Censorship


Demonetization, with respect to YouTube, is when a video that would normally have ads played with it (from which the creator would receive revenue) does not have ads. Video demonetization is determined by an algorithm based on advertiser-friendly guidelines. Many YouTubers have claimed that their content has been unjustifiably demonetized, which is detrimental to their careers. Nasim Aghdam, one such YouTuber, attacked YouTube's San Bruno, California headquarters in April 2018, presumably motivated by her displeasure at YouTube's monetization policies[15]. Her channel, which featured content ranging from animal rights and veganism to exercise demonstrations, was affected by changes related to age-restricted video criteria and monetization status. She posted a video complaining about one of her videos being demonetized “after new close-minded YouTube employees, got control of [her] Farsi YouTube channel last year, 2016, and began filtering [her] videos to reduce views & suppress & discourage [her] from making videos[16]." Her ultimate decision to act violently on the basis of this perceived injustice necessitates closer consideration of YouTube's monetization policies and their implications for content creators and the community.

Adpocalypse and PewDiePie

YouTube's "Adpocalypse" refers to the phenomenon of widespread demonetization of content under new advertiser-friendly guidelines. It allegedly began with controversy over an anti-Semitic video posted by the popular YouTuber Felix Arvid Ulf Kjellberg, better known as PewDiePie, whose channel was the most subscribed on YouTube in 2017. On January 11, 2017, PewDiePie became the center of controversy due to a video he posted in which he hired people on Fiverr to hold up a sign that read "Death to all Jews". The video was heavily criticized in a series of Wall Street Journal articles which emphasized the consequences advertisers might face from unwitting association with similar content. Following these articles, advertisers realized their commercials were being played alongside terrorist and other hate-inciting videos. Many large advertisers such as Coca-Cola and Amazon[17] pulled their ads off of YouTube, forcing the company to make drastic changes in order to re-establish the revenue it received from these advertisers[18]. Changes included sweeping reforms to automatic-censorship policies and the expansion of the "Not Advertiser Friendly" (NAF) tag on videos[19]. After the Adpocalypse, PewDiePie was dropped by Maker Studios, YouTube ended his YouTube RED series in its second season, and he was removed from the Google Preferred list of content creators. As of 2019, he is still making gaming and satire videos on YouTube through standard AdSense revenue[20].

Logan Paul

Logan Paul, originally famous on Vine, successfully transitioned to YouTube after Vine shut down. He faced backlash for a video he posted in December 2017 in which he walked through Aokigahara, a notorious "suicide forest" in Japan. He was criticized for showing the corpse of a suicide victim and making insensitive comments about it. The video quickly rose to Trending status. Many channels posted reaction videos, generally expressing outrage and disgust at Paul's video. While these reaction videos were swiftly demonetized, the original video was neither demonetized nor taken down. Eventually Paul removed the video himself and issued apologies[21][22]. Much like PewDiePie, he faced repercussions from YouTube itself: he lost his Google Preferred status and was cut from a YouTube RED series, but retained base AdSense revenue from videos on his channel. Shortly after the fiasco, YouTube changed the requirements for the YouTube Partner Program in an effort to "prevent bad actors from harming the inspiring and original creators[23]." In order to monetize videos, a YouTuber must have 4,000 watch-hours and 1,000 subscribers, and must connect an AdSense account to collect the revenue[24].
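The post-2018 eligibility thresholds can be summarized as a simple check. The thresholds below come from the requirements stated in the text; the function name and signature are illustrative, not any actual YouTube API.

```python
def is_ypp_eligible(watch_hours: float, subscribers: int, has_adsense: bool) -> bool:
    """Sketch of the post-2018 YouTube Partner Program threshold check.

    Threshold values are those described in the text; the function
    itself is a hypothetical illustration, not YouTube's code.
    """
    MIN_WATCH_HOURS = 4_000   # public watch-hours required
    MIN_SUBSCRIBERS = 1_000
    return (watch_hours >= MIN_WATCH_HOURS
            and subscribers >= MIN_SUBSCRIBERS
            and has_adsense)

# Example: a channel with 5,200 watch-hours and 1,350 subscribers
print(is_ypp_eligible(5_200, 1_350, True))   # True
print(is_ypp_eligible(5_200, 1_350, False))  # False: no AdSense account linked
```

All three conditions must hold at once, which is why, as noted above, even a high-view channel earns nothing without an AdSense link.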

Content Creators

Some YouTubers aspire to make YouTube their full-time career, even quitting their current jobs to expand their channels[25]. However, the race to earn subscribers and quick views encourages low-quality content: creators turn to making clickbait videos for shock value, following recent trends, or both[26]. Additionally, YouTube prevents videos about “controversial issues and sensitive events” from being monetized[27]. However, some content creators have noted that this policy has been applied inconsistently. Casey Neistat posted a video on the Las Vegas mass shooting, pledging to donate all proceeds from the video to relevant charities[27]. YouTube demonetized his video, claiming that “no matter the intent, our policy is to not run ads on videos about tragedies[27].” Philip DeFranco, another content creator who made a video about the tragedy, pointed out YouTube's hypocrisy: while YouTube demonetized Neistat's video and his own, it continued to run commercials on Jimmy Kimmel's videos about the tragedy[27].


User Flagging

Another technique YouTube uses to identify extremist or inappropriate content is user flagging. By clicking "flag" under a video, a user can report it. To combat offensive content, YouTube took flagging a step further by introducing the YouTube Trusted Flagger Program[28]. Under this program, individuals, government agencies, and non-governmental organizations who “flag frequently and with a high rate of accuracy” can use advanced tools to flag videos on a larger scale[29]. If a flagged video does not violate policy but still contains what YouTube considers “inappropriate” content, it is put into a limited state[30]. Often, when YouTube removes a viral video, it is re-uploaded or posted on another website, drawing the video even more attention[30]. In a limited state, users can still view the video, but it is demonetized and users cannot comment on it. With this method, YouTube can quiet offensive content without fully censoring it[30].
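The review flow for flagged videos described above has three outcomes, which can be sketched as a small decision function. This is a hypothetical illustration of the stated rules, not YouTube's actual moderation pipeline; the function and flag names are assumptions.

```python
from enum import Enum

class Outcome(Enum):
    REMOVED = "removed"        # violates Community Guidelines; channel gets a strike
    LIMITED = "limited state"  # "inappropriate" but not a policy violation
    NO_ACTION = "no action"    # flag reviewed, video left as-is

def review_flagged_video(violates_guidelines: bool, inappropriate: bool) -> Outcome:
    """Sketch of the flag-review flow described in the text.

    The two boolean inputs stand in for a human reviewer's judgment;
    this is an illustrative model, not real YouTube code.
    """
    if violates_guidelines:
        return Outcome.REMOVED   # taken down, strike issued
    if inappropriate:
        return Outcome.LIMITED   # still viewable, but demonetized and comments disabled
    return Outcome.NO_ACTION

print(review_flagged_video(False, True).value)  # limited state
```

The limited state corresponds to the middle branch: the video stays up (avoiding the re-upload problem noted above) but loses ads and comments.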

Automatic Filtering

In response to the growing number of users on YouTube Kids, the platform's kid-friendly offshoot, YouTube has introduced new filtering protocols intended to prevent young children from viewing age-inappropriate content. However, as with automatic demonetization, the algorithm sometimes misclassifies content. These misclassifications have led to much criticism and outrage from parents, a controversy that has been termed Elsagate. Many concerned parents have posted on Facebook warning others of the types of content that bypass the YouTube Kids auto-filtering algorithm[31]. One concerned parent claimed that “The system is complicit in the abuse[32]." Others have voiced concerns that content creators who make deceptively innocuous-looking parody videos are taking advantage of the fact that children cannot distinguish between legitimate content and spoofs created to generate ad revenue[33]. Experts have taken issue with the betrayal of trust in beloved characters that results from children watching these videos. YouTube's response to its algorithm's fallibility has been to provide a disclaimer in the YouTube Kids Parental Guide stating “While our automated filters try to keep out content that is not appropriate for kids, … it's possible your kid may find something you don't want them to watch” and “no automated system is perfect and your kid may come across content with nudity, highly offensive language, and extreme violence[34]." YouTube also gives parents an alternative to automatic filtering by letting them restrict their kids to manually approved content or block specific content. However, given the extensive collection of videos on the platform, Malik Ducard, YouTube's global head of family and learning content, claims that these disturbing parodies are “the extreme needle in the haystack[35]."

Professionalism and Ethics

YouTube has a large impact on the world. It supports the livelihoods of many content creators and is one of the most visited sites on the internet. YouTube has justified its decisions and policies around monetization and acceptable content by claiming its priority is to protect advertisers and viewers from inappropriate content. Nevertheless, its policies have received backlash from critics.

This case supports a couple of solid conclusions. One is that algorithms will always make mistakes. Illustrated above are several cases where automated systems did not work as intended: in the case of Logan Paul, a video that should have been demonetized and removed was not; with Casey Neistat, the stated rules were not applied fairly. YouTube recognizes this fallibility and provides an appeals process so that people can examine its decisions. Another conclusion is that accountability is a key component of responsiveness. YouTube responds most effectively and clearly to vital threats to its interests, namely advertisers pulling out: in the case of the Adpocalypse, YouTube responded very quickly. Complaints from individual content creators, however, often go unaddressed.

There are many open questions in this case. Whose responsibility is it to police content? What content should be acceptable on a large, public platform like YouTube? There are many definitions of acceptable, especially in a global context. Another question is what commitment YouTube needs to make to freedom of speech. Freedom of speech is an often-lauded principle, especially in an American context, but many of the discussions around extremism and misinformation online call that principle (or at least the necessity of private companies' commitment to it) into question.
