The first episode of Rabbit Hole, Kevin Roose’s new audio series, featured Caleb Cain, a college dropout who lives in West Virginia. Cain found himself watching a lot of YouTube videos on increasingly extreme topics. Eventually, Cain began to believe the things he absorbed from watching, including misogyny, racism, and conspiracy theories.
The reasons people come to believe such ideas are complex. Roose argues that YouTube deserves part of the blame because it recommends one video after another on the same topics a viewer has already watched. This can pull people from mainstream videos into ever more consuming content, which can lead them toward dangerous ideas.
YouTube’s Brief Background
YouTube is the second largest search engine, behind only its parent company Google, the most used search engine in the world. It started as a video-sharing website created by three former PayPal employees, Chad Hurley, Steve Chen, and Jawed Karim, and launched as YouTube on the 14th of February 2005.
Before its launch, YouTube was an angel-funded startup working out of a makeshift office in a garage. In November 2005, the venture firm Sequoia Capital made an initial $3.5 million investment in the platform, and Roelof Botha, a Sequoia partner and former CFO of PayPal, joined YouTube’s board of directors.
Then, on the 9th of October 2006, Google announced that it would acquire YouTube.
Ever since its launch, YouTube has continued its efforts to meet the demands of its users. On the 31st of March 2010, it rolled out a redesign that simplified its interface to make the site more convenient for users.
YouTube and People’s Ideas
While most people on YouTube watch lighter fare such as cooking videos, gaming videos, and the now-popular video blogs, or vlogs, it cannot be denied that some watch more disturbing and dangerous content. No one can know exactly how many frightening videos have been uploaded to the video-sharing platform, but it is no secret that many YouTube channels post videos that are not appropriate for everyone.
As mentioned, YouTube bears some blame when people like Cain develop extreme views; what is difficult to know is how much blame it deserves. The reasons a person is drawn into what Roose calls an “extremist rabbit hole” on YouTube include economic conditions, loneliness, and the alternative influence network of people who spread these ideas simply by being good at YouTube.
Still, YouTube has a responsibility. Part of what makes the platform so seductive, and so successful as a business, is its automated recommendations and its autoplay function, which starts the next video as soon as the current one finishes. These features play a very significant role in determining what YouTube users watch.
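To make the dynamic concrete, here is a minimal, purely illustrative sketch (not YouTube’s actual system) of how a similarity-based recommender combined with autoplay can narrow what a viewer sees; all titles and tags below are hypothetical:

```python
# Toy model: recommend the unwatched video most similar to the last one,
# then let "autoplay" feed each pick back in as the new starting point.
# All titles and tags are invented for illustration.

def similarity(a, b):
    # Similarity here is simply the number of shared tags.
    return len(set(a["tags"]) & set(b["tags"]))

def next_video(current, catalog, watched):
    # Pick the unwatched video most similar to the one just finished.
    candidates = [v for v in catalog if v["title"] not in watched]
    return max(candidates, key=lambda v: similarity(current, v))

catalog = [
    {"title": "Cooking pasta", "tags": ["food", "howto"]},
    {"title": "Politics rant #1", "tags": ["politics", "opinion"]},
    {"title": "Politics rant #2", "tags": ["politics", "opinion", "fringe"]},
    {"title": "Fringe deep dive", "tags": ["fringe", "conspiracy"]},
]

# Autoplay loop: each pick becomes the seed for the next, so one
# political video can pull the queue steadily toward fringe content.
current = catalog[1]            # viewer starts on "Politics rant #1"
watched = {current["title"]}
picks = []
for _ in range(2):
    current = next_video(current, catalog, watched)
    watched.add(current["title"])
    picks.append(current["title"])
```

Even this crude model drifts from an ordinary political video to fringe content in two steps, because each recommendation optimizes only for similarity to the last thing watched, never for balance.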
Consider a library analogy: if someone asks for Mein Kampf, reads it, and becomes a neo-Nazi, that is hardly the library’s fault. But a librarian who greets every visitor curious about German history and steers them straight toward Mein Kampf is another matter. YouTube’s recommendations can behave like that librarian, so simply opening the app may become a significant influence on someone’s path toward extremism.
What People Should Do
Furthermore, Roose said that to avoid the dangers YouTube brings, people need to reduce the influence YouTube and other platforms have over them. For him, turning off YouTube’s autoplay function and disabling some of its automated features will help decrease the platform’s authority over its users. Likewise, curating one’s own Spotify playlists and setting up Amazon’s Alexa in a more personalized way can help people feel in control of these platforms, rather than controlled by them.
Is the Internet Warping Our Minds?
Roose was asked in a Rabbit Hole episode whether people who commit crimes after being radicalized online are making their own choices, and whether letting Alexa lead them to buy a particular brand of, say, pet food is their choice. His answer was that it is both the person’s choice and the internet’s influence. He cited Camille Roth, a French researcher who wrote that the algorithms powering platforms like YouTube and Facebook come in two flavors: those that read our minds and those that change our minds. If people are aware that machines are working on them, and can feel when those machines are steering their choices, they can decide whether to follow a recommendation or make a different decision.
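Roth’s two flavors can be sketched with a toy example; the scoring rules and data below are hypothetical simplifications, not a description of any real platform:

```python
# Toy contrast between Roth's two flavors of recommendation algorithm.
# All viewing history and engagement numbers are invented.

history = ["history documentary", "history documentary", "cooking show"]

catalog = {
    "history documentary": {"predicted_watch_minutes": 20},
    "cooking show": {"predicted_watch_minutes": 15},
    "outrage clip": {"predicted_watch_minutes": 45},  # gripping, but off-interest
}

def read_our_minds(history, catalog):
    # "Read" flavor: mirror the viewer's demonstrated interests.
    return max(catalog, key=lambda title: history.count(title))

def change_our_minds(history, catalog):
    # "Change" flavor: maximize predicted engagement, even if that
    # pushes the viewer somewhere they never chose to go.
    return max(catalog, key=lambda t: catalog[t]["predicted_watch_minutes"])

mirror_pick = read_our_minds(history, catalog)      # reflects past taste
steering_pick = change_our_minds(history, catalog)  # pursues engagement
```

The first function only reflects what the viewer already likes; the second optimizes a business metric and, in doing so, can steer the viewer toward content they never sought out.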
The Dangers of YouTube to Children
Everyone knows that YouTube hosts plenty of material that is inappropriate for children. And now that children are exposed to technology at a very young age, it is not surprising that they access such material out of curiosity. For this reason, YouTube decided to create a platform dedicated solely to children: YouTube Kids, where children can safely browse child-friendly and entertaining content.
However, YouTube Kids turned out not to be safe at all. Because the content of this child-friendly version comes from YouTube itself, the filtering is imperfect. At one point, a clip that dubbed Bert and Ernie from Sesame Street with cursing dialogue from the movie Casino appeared on the service, as did cartoons containing graphic sexual language.
While YouTube had changed its moderation strategy before these incidents, it was still not enough. Even with stricter moderation, dangerous problems arose again: YouTube Kids surfaced many videos depicting school shootings, suicide, and abuse, which Dr. Free Hess, a Florida pediatrician and mother, documented on her blog.
That being said, viewers need to be cautious about what they watch on YouTube.