Facebook fuels terrorism, lynchings, conspiracies under guise of connecting people
“So we connect more people. That can be bad if they make it negative […] Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good.”
This comes from a leaked company memo penned by Facebook’s vice president Andrew Bosworth in June of 2016 amidst mounting allegations regarding the platform’s gross mishandling of Russian interference in the 2016 election. The core message of the memo is that connecting people — Facebook’s self-proclaimed objective — outweighs the site’s contribution to violence on a global scale. As Bosworth later stated in the memo, “That isn’t something we are doing for ourselves. Or for our stock price (ha!). It is literally just what we do. We connect people. Period.”
The memo is patently intended to comfort employees while absolving Facebook of responsibility for its role in the 2016 election interference, which might be justified if Facebook were simply a platform, but it isn’t. While Facebook does not commit violence directly, it feeds and encourages people’s basest emotional impulses regardless of whether those impulses are racist, violent or irrational. Based on the memo, Facebook only seems to care when it’s in damage control mode.
The algorithms employed by Facebook and other platforms like YouTube and Twitter filter what each person sees based on what will keep them on the site longer — and most often, the posts that make us linger are those that rile us. When angry and hateful people go on Facebook, the recommendation algorithms that drive Facebook’s growth supply those people with fresh provocations, feeding and exacerbating their rage. But it’s not just anger; Facebook trades in the conspiratorial and irrational. As Mark Zuckerberg’s former mentor Roger McNamee put it, go on Facebook and investigate vaccines, and in a year you’ll be protesting in the street over a mythical government conspiracy.
Facebook does more than connect people — it enrages them. The ethnic cleansing of Rohingya Muslims in Myanmar was incited and amplified by Facebook. Right-wing terrorists have drawn much of their ideological ammunition from incendiary and false posts on the platform. At least two dozen people were murdered in mob lynchings in India in 2018 — killings ignited by rumors spread on WhatsApp, a Facebook-owned messaging service. In Brazil, citizens avoided a government-mandated yellow-fever vaccine because messages on WhatsApp falsely claimed the vaccine was dangerous.
Facebook also sells its users’ data to third parties like Cambridge Analytica, a consulting firm that harvested data from 87 million users, and Geofeedia, a social media monitoring company that obtained special access to Facebook’s user data and sold it to police departments targeting Black Lives Matter protestors.
Connecting people is good; making the world “unified” is good. But when Facebook and other platforms purportedly “connect people” by herding them into enraged echo chambers, or by selling their data so law enforcement can target minorities, that is not connecting people — that is exploiting people for profit regardless of the consequences. Bosworth’s logic holds that because Facebook supposedly connects people, it’s acceptable if there is a terrorist attack or if an innocent person is ripped apart by an angry mob. Following an admission of its role in terrorism with “we connect people” exposes the fallacy of Facebook’s proclaimed objective.
With a continuous stream of scandals and apologies followed by more scandals, it’s painfully obvious that Facebook prioritizes growth over the safety of its users. For Facebook to say this is because it’s solely driven by connecting people is not only false but absurd. Facebook isn’t trying to grow by connecting people as much as possible because it’s ultimately a “de facto good” for society — if that were the case, it might strive harder to regulate the spread of false information.
Facebook grows by connecting people because it’s profitable — profit is why Russian operatives were able to spend at least $100,000 on ads to undermine American democracy. Profit is why Facebook has failed to make a genuine, concerted effort to regulate the spread of false information. Profit is why Facebook’s user data isn’t private, or safe. Profit is why Bosworth wants his employees to feel that what they do is a “*de facto* good” in the face of all the harm that has been done. And when that profit comes not from selling a product or service that improves lives but from inspiring anger, irrationality and injury, that’s exploitation.
With a market cap of $509 billion and some of the most highly educated and accomplished people in the world on its payroll, does anyone believe Facebook is exploiting people by accident?
Written by: Hanadi Jordan — firstname.lastname@example.org
Disclaimer: The views and opinions expressed by individual columnists belong to the columnists alone and do not necessarily indicate the views and opinions held by The California Aggie.