The Cost of Free Speech: Using Cultivation Theory to Understand YouTube Controversies
Zoe Cramoysan is an English and Creative Writing student from the UK, who aspires to become a novelist. She is due to start her second year at Royal Holloway, University of London this fall, and is particularly interested in gothic literature, queer theory and writing fiction. Her other passions include theatre, overanalysing YouTube videos, and her dog Luna.
Literature, television, and film all have academic disciplines devoted to their study, with investigation into authorial and directorial choices being common practice. The potential consequences of these more traditional media forms may never be fully understood but are widely explored. Cultivation theory is particularly useful as a means to investigate the impact of media on consumers: it states that television can distort a viewer’s ideology and perception of reality. But the influence of YouTube has been largely overlooked. With over one billion users, and reaching more 18- to 34-year-olds than any US cable network, the platform has been influencing purchasing decisions, electoral outcomes, and pop culture since its creation in 2005 (YouTube.com, 2018; Lewis, 2018). The breadth of the site’s audience has made it increasingly attractive to businesses, which either run adverts on the site or create their own YouTube channels. This has a huge influence over the types of videos created, as creators are incentivised to make shocking or divisive content in order to stay relevant, compete with corporate channels, and gain views. This is problematic, especially given that YouTube videos are commonly perceived to be representations of reality. Many of YouTube’s star ‘vloggers’ (video bloggers) and ‘YouTubers’ share intimate details of their personal lives, blurring the boundaries between fiction and reality. This makes the YouTube video particularly powerful, in contrast with television and film, where the line is usually clearer (i.e. the actor is clearly cast in a role). This belief in the realism of YouTube only strengthens what I will refer to in this article as ‘the cultivation effect’, as the messages subconsciously taken in by the viewer are deemed to be more ‘true’.
Though a certain level of the cultivation effect is inevitable, the competitive nature of YouTube has created an environment in which increasingly dangerous messages are being spread, putting the viewer at risk. All parties involved need to modify their behaviour to increase the quality of the site’s content, whilst balancing this need with the necessity of free speech. This includes YouTubers, corporate channels, advertisers, viewers, journalists, and YouTube itself. As Robert Gehl summarises in YouTube as Archive, “This tension between democratic storage and display for profit is the most troubling aspect of YouTube” (Gehl, 2009, 48).
The increase in the number of corporate channels has led to YouTube becoming a more competitive environment. This is not inherently negative but has contributed to an increase in the amount of provocative content on the site, as some smaller creators find this to be an easier way to compete. Access to resources is a major factor in a creator’s success. As Michael Wesch points out in YouTube and You: Experiences of Self-awareness in the Context Collapse of the Recording Webcam, the ability to create videos is “predicated on affluence that affords webcams, personal computers, and privatised spaces” (Wesch, 2009, 31). The necessity of access to resources is problematic on two levels. Firstly, it ensures that wealthier individuals have a greater opportunity to use YouTube as a means of political or creative expression. Secondly, and more relevantly for this essay, it gives companies a significant advantage, as their larger income grants them access to more, and higher-quality, resources than an individual creator could obtain.
One important resource that Wesch fails to mention is employees. A business with many employees is likely to have a greater video output than an individual can achieve, because the tasks required to create a video can be divided up—there can be a separate actor, camera operator, producer, director, and editor, as well as researchers and writers. This saves time, and multiple videos can be in production simultaneously. A typical YouTuber will either perform these roles alone or have only a few employees. This is in dramatic contrast with companies such as Buzzfeed, which has 1,300 employees globally (Buzzfeed, 2018). Channels that produce a higher volume of videos are favoured by the YouTube algorithm, which, according to Paul Covington, Jay Adams, and Emre Sargin in Deep Neural Networks for YouTube Recommendations, places a high value on the ‘freshness’ of a video. These software engineers state that “recommending this recently uploaded (‘fresh’) content is extremely important for YouTube as a product” (Covington, Adams, Sargin, 2016, 3). As larger companies can produce a higher quantity of ‘fresh’ content daily than an individual could, they have a higher chance of being recommended to viewers (provided the content is relevant), and therefore will attract a greater number of views. This can lead to the video being added to the trending page, which increases the audience even further. The larger audience means that the company will earn more advert revenue. More advert revenue means more access to resources, perpetuating a cycle in which small creators will struggle to compete. A wider audience also means that more people are likely to be impacted by the cultivation effect, gradually adopting the reality presented by these corporate channels.
The ability to post multiple videos daily means that an individual viewer is likely to spend more watch time viewing a corporate channel’s content, in contrast with smaller creators, who may struggle to upload videos more than once a week.
Like the corporate channels, advertisers are not directly creating disturbing content. However, their money funds the production of future, similar videos. The appearance of a reputable company’s advert before a video may even legitimise the video’s content, as to a viewer this can appear to be an endorsement. Advertising on a controversial video also has the potential to benefit the advertiser, as it means their advert is likely to reach a wider audience. They too can cultivate new ideas in a viewer’s mind.
YouTube’s algorithm has also contributed to this situation. Creating shocking and divisive content is one of the most successful ways for small channels to compete with larger video-production companies for views. These higher view counts mean that the YouTuber will receive more adverts on their channel and will make more ad revenue. In his article for The Guardian, “‘Fiction Is Outperforming Reality’: How YouTube’s Algorithm Distorts Truth,” Paul Lewis provides useful examples: “The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children’s character Peppa Pig eats her father or drinks bleach” (Lewis, 2018). The reason these videos appear in the recommendation system is that the algorithm makes no distinction between ‘popular’ content and ‘scandalous’ content. Essentially, it assumes that if a video is widely discussed and shared, it must be of a high quality and should be promoted to more viewers. In “Crying on YouTube,” Rachel Berryman and Misha Kavka propose that “The value of attention in turn drives the monetization potential not only of social media platforms but of individual posters” (Berryman, Kavka, 2018, 86). Scandalous or controversial videos receive more of this attention. This boosts the revenue of the creator and increases traffic to the website more broadly, which is profitable for YouTube as a company. If one accepts cultivation theory, then this recommendation of disturbing content to viewers could lead to the normalisation of extreme violence, with videos depicting suicide being particularly problematic.
Berryman and Kavka explored the impact of ‘negative affect vlogs’ and found that “the more negative the personal material exposed, the more ‘real’ it is taken to be” (Berryman, Kavka, 2018, 90). Emotional displays make the viewer more likely to become invested in the YouTuber, “in turn increasing the likelihood they will return to watch [the YouTuber’s] future videos” (Berryman, Kavka, 2018, 95). The ‘negative affect vlogs’ they focus on cover sensitive emotional subjects, and usually are not offensive or controversial. However, a similar principle can be applied to scandalous content. The more negative and shocking the content is, the more ‘real’ it is taken to be, as gruesome details are deemed a sign of authenticity. This in turn causes viewers to become more invested in the creator of the content, even if they disagree with the creator’s morals. Unlike with negative affect vlogs, this isn’t emotional investment. Instead it is investment in a storyline. The audience continues to watch, because they want to know what the creator will do next. This explains why YouTubers have often gained subscribers in the wake of scandals.
Because of this investment from audiences, reprimands from YouTube have failed to have a lasting impact. One of the most popular YouTubers on the platform, Logan Paul (of Logan Paul Vlogs), currently has over 17 million subscribers, despite being the centre of a huge controversy in January 2018. On the 31st of December 2017, Paul uploaded a video of himself and his friends exploring Aokigahara forest in Japan and discovering the body of a suicide victim. For much of January, he featured heavily in news headlines, and the ethics of his actions were debated by other YouTubers. In response, YouTube took several actions to punish Paul. BBC News reported that “Paul’s channels were removed from YouTube’s Google Preferred programme, where brands sell ads on the platform’s top 5% of content creators” and that YouTube “put on hold original projects with the US vlogger” (these projects being for YouTube Red) (BBC News, 2018). Paul himself removed the video in response to criticism, and he had never monetized it in the first place, so taking the video down or demonetizing it were not available as punishments.
Despite the action taken against him, Paul’s channel has not only survived, but he has profited from the scandal. Business Insider UK reported that Paul “gained 300,000 subscribers” within a week of the video’s upload (Kaplan, 2018). These subscribers, initially attracted by the controversy, would create views on Paul’s remaining monetized videos and would go on to watch his future uploads. Though his earnings per view would have been reduced by exclusion from the Google Preferred Programme, his number of views would have increased, as the BBC noted that “his apology video alone has racked up nearly 40 million views” (BBC News, 2018). These new viewers and subscribers could also be potential customers for his merchandise line. The Telegraph stated that “Paul also hinted that the scandal had not dented his multi-million-dollar commercial empire too badly. When asked how he was making money now his YouTube revenue had been reduced he pointed to the hoodie he was wearing, which is from his own-brand Maverick Apparel range” (Wright, 2018). A Childwise study found that Paul was more popular with children than Zoe Sugg (of Zoella), who is known for her family-friendly content. Jane Wakefield quotes Simon Leggett, the research director for Childwise, as saying, “Zoella losing her top YouTuber slot to Logan Paul shows that we could be moving into a new era with a change in the kind of vloggers that are popular with children” (Wakefield, 2018). This rise in his popularity amongst children occurred after the release of his suicide video. Leggett explained that “Prior to this year, Logan had not been chosen by children at all…His growing audience, which starts as young as age nine, were potentially exposed to shocking content over new year after Logan’s ill-considered decision to upload a widely criticised video of his visit to Aokigahara, Japan’s ‘suicide forest’” (Wakefield, 2018).
Given the sheer volume of disturbing content on the site, which has been recommended to children on the YouTube Kids app, it is likely that these young subscribers are already desensitised, perhaps explaining why the scandal did not impact Paul’s popularity.
Due to the cultivation effect, videos such as Paul’s have the potential to be hugely detrimental by normalising suicide. Exposure to content depicting suicide, especially repeatedly, can increase a viewer’s likelihood of self-harm. Paul’s video violates several of the suicide prevention guidelines for media professionals that are set by the World Health Organisation. They explain that “reporting the method [of suicide] may trigger other people to use this means”—yet in Paul’s video, it is clear the man hanged himself (World Health Organisation, 2008, 8). The organisation also recommends that “particular care should be taken by media professionals not to promote locations as suicide sites,” yet Paul names the forest, films the site, and even describes the distance between the car park and the location where he found the body (World Health Organisation, 2008, 9). Most significantly, the organisation advises that “Photographs or video footage of the scene of a given suicide should not be used, particularly if doing so makes the location or method clear to the reader or viewer. In addition, pictures of an individual who has died by suicide should not be used. If visual images are used, explicit permission must be given by family members” (World Health Organisation, 2008, 9). Paul filmed both the site and the body and did not have permission from the man’s family. The World Health Organisation will not release any suicide statistics for 2018 until the end of the year, and when they do, it will be almost impossible to prove a correlation or causation between suicides and the video. However, any video that does not conform to these guidelines risks encouraging suicide. Paul’s video in particular risks promoting the method of hanging, and the use of the Aokigahara forest as a suicide site. It is worth noting that although Paul removed the video from YouTube, it is available elsewhere online, and it was viewed over 5 million times before it was taken down (BBC News, 2018).
Many vulnerable people were likely to have seen the video, and the continued circulation on other websites has the potential to create further damage.
The promotion of shocking and disturbing videos often comes at the expense of videos that address important, if somewhat sensitive, topics of discussion. Since the Logan Paul scandal, YouTube has been accused of demonetizing videos that address LGBT+ issues, body positivity, and mental health. Sam Levin reported for The Guardian that some vloggers “said it seemed that YouTube’s system was automatically deeming them unsuitable to advertisers simply because of their identities and placing the burden on them to appeal their case” (Levin, 2018). An example of this came from creator Gaby Dunn (of Gaby Dunn and Just Between Us), who claimed that “sketches in which she makes out with men and appears in her underwear were deemed ad-friendly while videos where she kisses women or discusses queer dating were restricted” (Levin, 2018). Though the evidence for this is only anecdotal, many YouTubers have made similar allegations. This is concerning, especially for a platform that claims, “Our mission is to give everyone a voice and to show them the world” and lists “Freedom of expression”, “Freedom of information”, “Freedom of opportunity”, and “Freedom to belong” as their core values (YouTube.com, 2018). If these accusations are true, then it will become harder to produce LGBT+ content for the website, impacting the reality cultivated by viewers.
This is not the only consequence of Paul’s video. In its wake, YouTube has introduced a new policy that makes it more difficult for channels to qualify for monetization. To qualify, a channel must now have “4,000 hours of views within the past year and 1,000 subscribers” (Levin, 2018). This could drive niche, small, and minority creators off the platform, and it would not have prevented Logan Paul from uploading the suicide forest vlog.
The increase of extreme content on YouTube is self-perpetuating. Ultimately, individual YouTubers are responsible for the content they choose to create, and the messages they wish to send to their fans. But it is difficult to compete with corporate channels without resorting to creating controversy. This is further incentivised by the potential gains of ad revenue, subscribers, and publicity. The algorithm itself works in favour of shocking content, as viewers are shown these videos through the recommendation system or through news websites. YouTubers themselves are not solely responsible for this situation.
YouTube has little reason to stop recommending disturbing videos to viewers, or to seriously punish creators for inappropriate content, as the greater traffic to their website generates a larger profit. Businesses are motivated to advertise on YouTube by the increased website traffic and are often unaware, or unconcerned, about the types of videos their advertisements might be appearing next to. Recent ad boycotts have only taken place after journalistic investigations have raised awareness of this practice and had the potential to cause PR damage. Ad boycotts are not an ideal solution, as this could have the effect of discouraging free speech on YouTube. Boycotts of the entire platform would also punish innocent creators. Demonetization of individual videos is clearly not very effective either, doing little to damage the controversial creators, who are likely to gain subscribers as well as views to their remaining monetized videos. However, creators of offensive videos should not be banned from the site (unless they are directly inciting violence) as free speech must be protected. But to raise the quality of videos on the site, and protect audiences, each party involved needs to change their practices.
YouTube needs to be more transparent about its algorithm and advertising practice, in line with its value of freedom of information. The company should offer advertisers more specific options (for example, individualised whitelists and blacklists) to best match videos to their brand. It may also be useful to introduce a means by which a creator could approve or disapprove of an advertiser, so that their brand is also protected. There needs to be a means by which the algorithm can distinguish between offensive videos (i.e. Logan Paul’s suicide vlog) and those which discuss loaded issues sensitively and maturely (i.e. suicide prevention videos, which follow guidelines for reporting on suicide). Whether a video violates community guidelines should be given as much consideration as ad-friendliness when determining if content should be monetized.
Advertisers need to work to better understand how YouTube operates, and need to be more selective about the content on which they advertise. Though it is unlikely that businesses that have run campaigns on YouTube are intentionally contributing towards offensive content, ignorance is not an acceptable excuse, and they must take responsibility for their brands.
Journalists have played and should continue to play an important role in holding creators, advertisers and YouTube itself accountable. They should continue to use any future scandals as an opportunity to raise public awareness of how YouTube operates, so that audiences can be more aware of the kinds of content they choose to consume.
Creators should be more aware of the potential harm that they could do to their subscribers, and should encourage their audiences to be mindful of what they watch. Increasing the number of collaborations between corporate channels and small creators could be mutually beneficial for both parties, helping to diversify YouTube’s content and giving smaller creators a greater opportunity.
Finally, as I mentioned in my opening paragraph, YouTube videos have largely been neglected in academic discourse. This is despite the fact that they have an influence over the population comparable to that of books, films, and television—all of which have entire disciplines dedicated to studying them. Due to the perceived ‘reality’ of YouTube, a YouTube video’s power to cultivate a viewer’s ideology may even be greater than the power more traditional media exerts. To understand how viewers are impacted, further awareness and research surrounding this issue is needed.
BBC News. (2018). Logan Paul's brother: He made a mistake. [online] Available at: https://www.bbc.co.uk/news/newsbeat-42786609?intlink_from_url=https://www.bbc.co.uk/news/topics/cwm19wg3l15t/logan-paul&link_location=live-reporting-story [Accessed 10 Jun. 2018].
BBC News. (2018). YouTube punishes star over suicide video. [online] Available at: https://www.bbc.co.uk/news/world-asia-42644321 [Accessed 7 Jun. 2018].
Berryman, R. and Kavka, M. (2018). Crying on YouTube. Convergence: The International Journal of Research into New Media Technologies, 24(1), pp.85-98.
BuzzFeed. (2018). About BuzzFeed. [online] Available at: https://www.buzzfeed.com/about [Accessed 9 Jun. 2018].
Covington, P., Adams, J. and Sargin, E. (2016). Deep Neural Networks for YouTube Recommendations. In: Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16). New York: ACM.
Gehl, R. (2009). YouTube as archive. International Journal of Cultural Studies, 12(1), pp.43-60.
Kaplan, J. (2018). Logan Paul actually gained 300,000 more subscribers following his controversial video showing a dead body in Japan's 'Suicide Forest'. [online] Business Insider UK. Available at: http://uk.businessinsider.com/logan-paul-gained-subscribers-after-japan-video-2018-1 [Accessed 10 Jun. 2018].
Lange, P. (2007). Commenting on comments: Investigating responses to antagonism on YouTube. Annenberg Center for Communication.
Levin, S. (2018). YouTube's small creators pay price of policy changes after Logan Paul scandal. The Guardian. [online] Available at: https://www.theguardian.com/technology/2018/jan/18/youtube-creators-vloggers-ads-logan-paul [Accessed 10 Jun. 2018].
Lewis, P. (2018). 'Fiction is outperforming reality': How YouTube's algorithm distorts truth. The Guardian. [online] Available at: https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth [Accessed 9 Jun. 2018].
Moylan, B. (2015). A Decade of YouTube Has Changed the Future of Television. Time. [online] Available at: http://time.com/3828217/youtube-decade/ [Accessed 6 Jun. 2018].
Wakefield, J. (2018). Logan Paul 'more popular' than Zoella. [online] BBC News. Available at: https://www.bbc.co.uk/news/technology-42872606?intlink_from_url=https://www.bbc.co.uk/news/topics/cwm19wg3l15t/logan-paul&link_location=live-reporting-story [Accessed 10 Jun. 2018].
Wesch, M. (2009). YouTube and you: Experiences of self-awareness in the context collapse of the recording webcam. Explorations in Media Ecology, [online] 8(2), pp.19-34. Available at: http://hdl.handle.net/2097/6302 [Accessed 8 Jun. 2018].
World Health Organisation (2008). Preventing Suicide, A Resource for Media Professionals. Geneva: WHO Press, pp.8-9.
Wright, M. (2018). Logan Paul says ‘everyone deserves second chances’ in first public comments since ‘suicide forest’ video apology. The Telegraph. [online] Available at: https://www.telegraph.co.uk/news/2018/01/16/logan-paul-says-everyone-deserves-second-chances-first-public/ [Accessed 10 Jun. 2018].
YouTube.com. (2018). About YouTube - YouTube. [online] Available at: https://www.youtube.com/intl/en-GB/yt/about/ [Accessed 9 Jun. 2018].
YouTube.com. (2018). Press - YouTube. [online] Available at: https://www.youtube.com/intl/en-GB/yt/about/press/ [Accessed 6 Jun. 2018].