Davis, California

Saturday, January 10, 2026

Just say what you mean

The rise of “algospeak” has created a new era of self-censorship

By SABRINA FIGUEROA — sfigueroaavila@ucdavis.edu

I’ve spent a lot of my time at The California Aggie writing columns about language, and so have many others. But my writing comes from a place of good intention: literacy and critical thinking are important, especially in an age when the Internet has been democratized to the point where misinformation and disinformation run rampant. One thing I have yet to see discussed, however, is how social media shapes the language we use and how that might affect us in the process.

Social media — particularly TikTok, Instagram and X — has been credited with changing the way we communicate, a phenomenon termed “algospeak” by Adam Aleksic, a linguist and content creator. Algospeak includes the use of emojis, the creation of new expressions and, of course, new words.

This is, in part, because some social media apps, like TikTok, can ban users for provocative words that violate their community guidelines. Perhaps you’ve seen or said “grape” instead of “rape,” “unalive” in place of “committing suicide” or “pew pew” in reference to guns; these are all examples of algospeak. At first, this was the Internet community’s not-so-sneaky way of getting around the guidelines, filters and algorithms that could restrict it, but now it has seeped into our everyday vernacular. If we need to talk about something graphic or sensitive, we can opt for one of these words to make the subject more palatable.

On the surface, using these words might seem harmless. But in reality, it is censorship; on social media, app developers and algorithms are quite literally censoring users’ language. This is permitted under Section 230, which allows platforms to moderate their own services by removing content that violates their standards. Those guidelines, however, do not apply to real life. Once we start using algospeak in person, where we can talk freely without a tech company breathing down our necks, it becomes self-censorship.

Additionally, the more we beat around the bush, the more desensitized we become to the issues these words actually refer to. Rape, suicide, guns, death, porn and sex can all be shocking to discuss out loud, but some of them — like rape and suicide — should be regarded as heavy, serious topics; they should not be normalized. Using the actual words instead of their “lighter” versions gives your point more impact, and people who don’t use social media might not even understand what’s being said when they hear algospeak.

These words can dehumanize the people who have lived through harsh events. They trivialize both these experiences and the victims themselves, both of which deserve to be talked about seriously, without being sugarcoated or diminished. Many victims already feel small, overlooked or silenced, and watering down the words we use to discuss their experiences only harms them more.

From social media to the press to the government, the language we use matters. Many people are already desensitized by continuous exposure to the violent, graphic content and dehumanizing language that pop up in the media; we must be intentional and careful with our choice of words. Our voices are truly ours and can be weapons against our oppressors. Don’t censor yourself.


Disclaimer: The views and opinions expressed by individual columnists belong to the columnists alone and do not necessarily indicate the views and opinions held by The California Aggie.