“Social media’s influence is tough to manage, but in this age [the platforms] have enough data to be aware of possible negative side-effects to control its content,” said John Rowe, a marketing teacher at Carlmont.

Juliana Castro / Medium.com / CC BY-SA 4.0

Algorithms: social media’s role in modern radicalization

October 28, 2022

Dangerous and radical ideas have always existed, but only recently have they begun to spread to an alarming degree on social media. The platforms could change their algorithms to stop it, but financial incentives get in the way.

Such ideas often originate from the radical right, a political position aligned with extreme conservatism and, often, white supremacy. Similar ideals have surfaced throughout history and have at times dominated the world’s geopolitical landscape.

Dangers of these viewpoints

In the United States, such beliefs reemerged in an online phenomenon known as the alternative right (alt-right) movement. Although the movement has existed since the early 2000s, its popularity spiked when Donald Trump launched his 2016 presidential campaign.

The alt-right movement is made up predominantly of white males, according to an analysis by George Hawley published by the Institute for Family Studies. Hawley argues that this demographic is drawn to the movement by disillusionment with left-wing identity politics and by the movement’s insistence on traditional masculinity. In their attempt to counter ideologies like feminism, the LGBTQ+ movement, and Black Lives Matter, alt-right proponents espouse white nationalist, anti-Semitic, homophobic, and sexist ideas.

The alt-right’s online presence is most prominent on social networks like 4chan and Reddit. Members of these communities use internet memes to mask radical ideas, relying on subliminal messaging layered into memes and other imagery. Like an inside joke, the full meaning is understood only by a highly radicalized minority. Another tactic is to mock and parody leftist opponents, often making identity politics the punchline of the humor.

To reach an ordinary, naive internet user, right-wing members establish a need to take the “red pill,” a reference to the 1999 movie “The Matrix.” In the film, Neo takes the pill to achieve a greater awareness of the world around him. The right sees the world as we know it as the “Matrix,” a simulated reality, and its members believe they have “red-pilled” themselves and can see through it all.

This idea of the “red pill” rests on contrarianism: the conviction that most people’s worldviews are wrong while one’s own is correct. The right’s aim is for average users to seek to “take the red pill” themselves and dive deeper into a rabbit hole of radical beliefs. These ideas reach the average internet user through social media posts, mostly on YouTube and TikTok.

Spread of radical ideas

According to a Creator Insider video, the YouTube recommendation algorithm works toward two goals: finding the right video to earn engagement, and keeping the user watching. The content that attains such engagement is often highly controversial and entices public debate, which YouTube then promotes because of the structure of its recommendation system.
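As a rough illustration of how such a ranker might behave, consider the sketch below. It is hypothetical: YouTube’s internals are not public, so the fields, names, and scoring rule are assumptions, not the platform’s actual method.

```python
# Hypothetical sketch of an engagement-driven ranker. YouTube's real
# system is not public; the fields and weights here are illustrative.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_click_rate: float    # estimated chance the user clicks (0 to 1)
    expected_watch_minutes: float  # estimated time the user keeps watching

def engagement_score(video: Video) -> float:
    # The two stated goals: get the click, then keep the user watching.
    # Ranking by their product rewards whatever maximizes both at once.
    return video.predicted_click_rate * video.expected_watch_minutes

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    return sorted(candidates, key=engagement_score, reverse=True)[:k]
```

Nothing in a score like this distinguishes controversial from informative content; if heated videos reliably earn clicks and watch time, they rise to the top.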

TikTok has recently seen a similar effect. Several influencers gained popularity on the platform over the summer of 2022, most notably Andrew Tate. Viewers often start with slightly controversial political videos, from which the algorithm escalates exposure to misogynistic, homophobic, and racist content, among other ideas associated with the alt-right.

Winston Singh, a senior at Carlmont, frequently uses social media. Although he primarily uses these platforms to engage in discourse about his favorite sports, he still encountered a significant amount of Tate’s content. 

“[Tate’s] videos have been all over my feed, not just on TikTok,” Singh said. “I have seen reposts of him all over Twitter, Instagram, and YouTube. I just couldn’t escape it this past summer.”

Although Singh found some of the content entertaining at first, his opinion quickly changed once he began seeing Tate’s more radical takes.

“I found [Tate] quite interesting when everyone was talking about him at first. My problem started once I heard him say stuff like ‘I wouldn’t give CPR to a fat dude because it’s gay’ and all his nasty stuff towards women,” Singh said. “I quickly started ignoring his content after that.”

On TikTok, videos of Tate began to emerge in May 2022 on several accounts run by his supporters, which reposted clips from podcasts on which Tate was a guest. Initially, users saw motivational quotes about hard work and stories about his lifestyle, driving a rapid rise in Tate’s popularity.

While some of his videos are relatively harmless, Tate’s misogynistic and homophobic content began to spread on TikTok. He sought to promote the lifestyle of the “alpha male” to a young and impressionable audience through his channels, which ended up being breeding grounds for toxic masculinity.

To him, achieving the status of an alpha male coincided with his treatment of women and his bigoted views. Common conceptions of an alpha male include holding a position of power, wielding great influence, and being highly intelligent and successful. Some young viewers believe Tate embodies all of these characteristics and therefore admire his lifestyle. Like other right-wing proponents, Tate uses the idea of the “red pill” in his content and considers himself to be outside the “Matrix” created by the left wing.

Even before his TikTok fame, Tate had a troubling history in his treatment of women. He was removed from the 2016 season of the UK show “Big Brother” after a video surfaced of him beating a woman with a belt and then threatening her with further violence. Tate told “The Sun” that the video was an act of role-play, and the controversy was swept under the rug until after his rise to fame in 2022.

TikTok initially intended to keep Tate on its platform, likely in the interest of monetizing the attention his content garnered, but swiftly banned him after Meta removed him from Facebook and Instagram in late August.

“Tate’s message is truly evil,” Singh said. “But it doesn’t surprise me that this happened. Misogyny has become very common and casual online, and it’s quite concerning. I think Tate was just the tip of the iceberg.”

Another popular right-wing influencer is Ben Shapiro, who spreads ideas of libertarianism and conservatism. He hosts “The Ben Shapiro Show,” an opinion podcast from his media company, The Daily Wire. Shapiro’s presence is primarily on YouTube, where he shares clips of his podcast. While not part of the alt-right community, he provides one of the first steps in the radicalization process: like others, Shapiro has made a name for himself through critiques of social justice warriors, feminists, and liberal identity politics.

Steven Crowder is another internet personality who became popular around the same time as Shapiro, largely through his “Change My Mind” series on YouTube. In these videos, Crowder goes to college campuses, sets up a table with a sign displaying one of his opinions, and proceeds to debate students over it.

His most viral “Change My Mind” episodes have centered on his positions that there are only two genders, his pro-gun stance, and his belief that male privilege is a myth. These videos gained attention for their dramatic nature and for the frequent humiliation of college students whose debate abilities did not match Crowder’s.

The most significant danger Crowder and Shapiro brought to YouTube was their choice of guests on their shows. Such guests often provided far more extreme takes and represented a further step into radicalization.

While the radicalization process goes far deeper, the starting point is often YouTube and TikTok personalities discussing popular political subjects in a manner that provokes public debate and interaction. As the algorithm detects attention toward these creators, it feeds more of their content to the user, creating a cycle of consumption followed by recommendations of more, and often increasingly extreme, content.
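A minimal simulation makes the cycle concrete. Everything below (the “extremeness” scale, the engagement model, and the preference update) is invented for illustration, not measured from any real platform:

```python
import random

def engagement(extremeness: float) -> float:
    # Assumption for illustration: engagement rises with extremeness,
    # plus a little noise.
    return extremeness + random.uniform(-0.1, 0.1)

def simulate_feed(steps: int = 20) -> float:
    preference = 0.1  # the user starts with mildly political content
    for _ in range(steps):
        # Candidate videos cluster around the user's current taste...
        candidates = [min(1.0, max(0.0, preference + random.uniform(-0.2, 0.2)))
                      for _ in range(10)]
        # ...but the one shown is whichever is predicted to engage most.
        shown = max(candidates, key=engagement)
        # Watching it nudges the modeled preference toward what was shown.
        preference += 0.5 * (shown - preference)
    return preference

random.seed(0)
print(f"Modeled preference after 20 recommendations: {simulate_feed():.2f}")
```

Because the most engaging candidate is usually the most extreme one on offer, the modeled preference ratchets upward even though no single recommendation looks dramatic.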

Role of YouTube and social media

In 2018, the nonprofit research institute Data & Society published a report identifying YouTube’s algorithm as a culprit in promoting a broad range of political positions, including extremist ones. The report describes the “Alternative Influence Network,” an online community of internet celebrities, media pundits, and scholars who use YouTube to promote political ideals ranging from libertarianism and conservatism to white nationalism.

“There’s no law mandating maximum shareholder profit, but many companies still do as it’s a very easy thing to set as a goal. People invest in companies hoping to see the stock price go up. There is also a question of ethics, where sometimes companies chase profit unethically,” said John Rowe, a marketing teacher at Carlmont.

Social media companies aim to maximize shareholder profit, which is generated by monetizing user engagement through advertising. The longer YouTube keeps a user watching, the more advertisements it can run and the more it earns. Such engagement is usually driven by sensationalist content, a pattern that recommendation systems recognize and exploit.
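The arithmetic behind that incentive is simple. In the back-of-the-envelope sketch below, the figures are invented for illustration (not real YouTube numbers), but they show how revenue scales directly with total watch time:

```python
# Invented illustrative figures, not real platform numbers.
ADS_PER_MINUTE_WATCHED = 0.2  # roughly one ad every five minutes
REVENUE_PER_AD = 0.01         # dollars earned per ad impression

def daily_ad_revenue(users: int, minutes_per_user: float) -> float:
    return users * minutes_per_user * ADS_PER_MINUTE_WATCHED * REVENUE_PER_AD

# Raising average watch time from 40 to 60 minutes lifts revenue by 50%.
print(daily_ad_revenue(1_000_000, 40))  # 80000.0
print(daily_ad_revenue(1_000_000, 60))  # 120000.0
```

Under any such model, the surest way to grow revenue is to grow watch time, which is exactly what engagement-optimized recommendations do.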

“As social media grows, we are able to become more aware of some of its negative impacts, so I think we should be less forgiving of these things. I don’t think they’ve gone out of the realm of being reasonable for where they are with control right now, but I think we need to be aware that something like the Capitol riots can happen and we need to step up some protections,” Rowe said.

Data & Society suggests that Google, which bought YouTube in 2006, uses the brand-building tools it excels at to coordinate viewers to right-wing ideologies, even while they claim they’re concerned about misinformation on YouTube. The platform is inherently built to incentivize this behavior.

The report argues that users would benefit from transparency about how social media platforms’ algorithms behave, which could be crucial in holding the platforms accountable. To prevent unwanted content from reaching viewers, platforms could also give users more control over their feeds and let them choose the information they engage with. Another proposal is to continue de-platforming harmful content to halt its spread, as nearly every major platform recently did with Tate.

“We need to be careful, but I definitely think we should preserve free speech as long as it’s not causing problems. We need to recognize that free speech also comes with other things that may need to be controlled,” Rowe said. 

The rabbit hole created by the recent alt-right phenomenon has proven dangerous, and its spread can be attributed largely to the recommendation systems behind many popular social media apps. Prioritizing monetary gain by recommending content based on engagement has led some users down a path of trying to “red pill” themselves, but in reality stumbling into a harmful radicalization process.

“We appreciate all that technology has to offer, but in almost all cases there are drawbacks. We need to make sure we control social media to make sure things don’t get out of control,” Rowe said.

About the Contributor
Alexander Menchtchikov, Scot Scoop Editor
Alexander Menchtchikov is a senior editor who is in his third year in the journalism program. He enjoys playing soccer and following several sports in his time outside of writing. Find him on X @amenchtchikov.
