
Davis, California

Wednesday, April 24, 2024

Does YouTube want you to move to the right?

Extremism is now easier to fall into than ever before

YouTube is consistently criticized as the platform where, more often than not, young white men are radicalized by far-right figures who want nothing more than to spew conspiracies and hate. But how responsible is YouTube, really?

The Google-owned service is second only to Netflix as the most-preferred platform on television among young people. With so many of the younger generations moving toward streaming, the content on these platforms matters now more than ever.

There are countless stories of young, lonely white men being drawn in by the allure of right-wing influencers and creators online. And with 81% of young people in the United States using YouTube, the platform is more impactful than any other. Granted, not all of them become raging neo-Nazis, but many end up rooted in the extremes.

Extremes on the internet can be enticing because, amid all the desensitization we get from media, believing that the world might work in a different way or that insane conspiracies could be true is a tantalizing idea. For many who seek alternatives to traditional education, media and ways of thought, YouTube is often the place to turn.

Across social media, far-right influencers create a type of alternative influencer network, enveloping themselves in a web of connections to other right-wing influencers that functions like an echo chamber. That web can open dangerous rabbit holes when the average viewing session on mobile devices lasts more than 60 minutes. And with the rise of right-wing terrorism since 2016, these extreme places online have become breeding grounds for domestic terrorists.

I know from my own experience that if I watch even one Joe Rogan podcast or one Jordan Peterson lecture, I get bombarded with all kinds of recommended videos from the right web. We should all challenge and test our beliefs by listening to and understanding different viewpoints. But algorithms should not hijack our attempts to challenge our perspectives by feeding us videos that will dictate that perspective. The algorithms that control what videos YouTube recommends are responsible for more than 70% of all time spent on the site.

In June, YouTube announced it would ban channels that promoted supremacist ideologies from any group. And it largely followed up on that announcement, removing more than 100,000 videos and 17,000 channels in the three months that followed — five times the usual amount for a fiscal quarter. But more than 500 hours of video are uploaded to YouTube every minute, and with YouTube expected to take in more than $5.5 billion this year, it is in its interest to keep users on the platform. Often, the content that keeps people watching is the most extreme.

This problem is exacerbated when those in power who share supremacist and extremist points of view attempt to force tech companies to keep conservative extremists on their platforms. In August, the White House proposed that the FCC and FTC act as speech police in an effort to protect Alex Jones and others like him.

Radicalization may not be the intent of YouTube's algorithms, but the site nevertheless provides a platform for extreme views that are more accessible than ever. Unlike mainstream media and education, YouTube has almost no barriers to entry: an education costs thousands, and cable charges add up. It's an easy way to monetize thought, and its video medium is far more accessible than text.

The audience for these views has always been there, but algorithms now make it easier to push people to one extreme or the other. It's not that moderate people are being radicalized; it's that malleable audience members, who are often young, isolated and frustrated, find it easier to fall in and drink the Kool-Aid.

YouTube may not be directly angling users toward the extremes, but its platform has become the place where extremism begins for many. Confront your political and personal views, talk with people with whom you disagree and gain a new perspective on why your views differ. But don't just question your beliefs — question the beliefs of others. Think critically about how you get the information that fuels your opinions, and interrogate it. Challenge others; by doing so, our perspectives will become collectively stronger.

Written by: Calvin Coffee — cscoffee@ucdavis.edu

Disclaimer: The views and opinions expressed by individual columnists belong to the columnists alone and do not necessarily indicate the views and opinions held by The California Aggie.

1 COMMENT

  1. Bad headline title: the problem isn’t moving to the right, it’s moving to the far right (as the article itself correctly identifies). Bigger problem is that it’s one-sided: radical left echo chambers like Chapo Trap House are just as widely available, deluded, and pernicious as anything you’ll find in the alt-right. And, given that your audience are mainly college students in California, the problems posed by the radical left are almost certainly more relevant. (Case in point, the sometimes shockingly low-brow articles from The Aggie itself.)

    The overall point is quite right, though. The Internet and social media have a bubble problem in general. Even without the “guidance” of algorithms, it’s too easy to indulge in bias, become an ideologue divorced from reality, and demonize anyone who dares deviate, thereby stagnating within the comfort of a post-truth fantasy instead of acknowledging the uncomfortable truth that there are always people smarter, better educated and better intentioned who disagree.
