For all their value, platforms like Instagram are toxic, leaving users to self-regulate social media usage
Waking up on Monday morning, many students might have panicked when they tried to open Instagram and found themselves unable to refresh their feeds. Maybe they restarted their phones, logged out of their apps or disconnected from the campus's dismal "eduroam" Wi-Fi, but these desperate attempts to view Instagram were to no avail. Others might have tried to log on to WhatsApp to connect with family, or opened Facebook to check for a post about the day's club meeting or to read their carefully curated news feeds, with similarly disappointing results.
This outage sent some people into a spiral: refreshing their apps every 10 minutes in the hope that this time their feeds would load, or worrying about who had liked their posts from the night before. The outage was loud, clear proof of what Facebook's leaked internal reports showed last week: Instagram is "toxic," and it dominates many of our lives.
When news initially emerged about Facebook's internal investigation into its platforms' effects on mental health, especially that of teens, the Editorial Board was not surprised. We know that Instagram is damaging, but seeing the figures (one in three teen girls say Instagram has worsened their body image, and 6% of American and 13% of British users trace suicidal thoughts to the app) and learning that Facebook tried to bury this information was distressing. Since 2019, the massive company has known that its apps were hurting users, and instead of taking steps to reverse course and correct these wrongs, it doubled down in March 2021 by introducing the concept of an Instagram platform exclusively for children.
While in theory it’s easy to say that the Instagram app and Facebook as a company are “toxic” and we should all just delete our accounts, lower our screen time, spend time in nature or read a book to improve mental health, it’s not that simple. As became obvious during the company-wide outage on Monday, people rely on Facebook’s apps for many reasons. Some use WhatsApp to communicate, many student-run organizations make announcements and plan events via Facebook groups and frankly, Instagram is hard to quit cold turkey.
It is not our fault as consumers that we have become reliant on and obsessed with these apps; they were designed to be addictive. But because the company has engineered its algorithms to be so irresistible and its platforms to be so universal, it should be held responsible for the consequences, called on both to be transparent about the data showing its apps are harmful and to act on those findings.
Instead of ignoring the damage done by its apps and chasing profits by designing a social media platform for children, the very group that is hurt most by its products, Facebook must look for ways to relieve the pressures Instagram promotes: the constant comparison, the focus on aesthetics and the obsession over follower counts, likes and comments. It has a responsibility to prioritize the health of its users and to use the research it voluntarily conducted to address the damage its platforms have done and continue to do.
If Facebook won't hold itself accountable for protecting its users, it falls to us to keep in mind the impact these apps have on our lives. While we don't expect users to delete Instagram and Facebook, we encourage everyone to be conscious of how they use these platforms and to prioritize mental health above likes and followers.
Written by: The Editorial Board