Why Do Women Face So Much Abuse Online?
When Thorlaug Agustsdottir, a woman from Iceland, came across a “Men are better than women” Facebook group in 2012, she was horrified. She posted some of what she found on her own Facebook page, and that made her the group’s next target. Members photoshopped her face onto an image of a beaten and chained woman and posted it on their page. One commented, “You just need to be raped.”
Agustsdottir contacted Facebook to have the photo removed, but moderators labeled it “Controversial Humor” and left it up. Other Facebook pages that have carried the humor label include “I kill bitches like you” and “Domestic Violence: Don’t Make Me Tell You Twice.” Agustsdottir said the doctored photo of her was eventually taken down, but only after she got the local press involved.
Catherine Buni and Soraya Chemaly feature Agustsdottir’s story in their recent Atlantic article, which explores how social media companies’ standards on hate speech end up suppressing female users. Facebook has since retired the “Controversial Humor” label, and both it and Twitter have taken steps to improve their policies. But the struggle to stop violence against women on social media continues.
Buni and Chemaly write:
"If, as the communications philosopher Marshall McLuhan famously said, television brought the brutality of war into people’s living rooms, the Internet today is bringing violence against women out of it. Once largely hidden from view, this brutality is now being exposed in unprecedented ways. … On the one hand, these online images and words are bringing awareness to a longstanding problem. On the other hand, the amplification of these ideas over social media networks is validating and spreading pathology."
Their Atlantic article cites Demos’ report “Misogyny on Twitter,” which found more than 6 million instances of the words “slut” or “whore” on Twitter; researchers deemed 20 percent of these threatening.
Buni and Chemaly’s piece was published on the eve of the recent death threats against Anita Sarkeesian, a feminist critic of video games. Sarkeesian canceled a lecture at Utah State University after the school refused to bar weapons from the event, and she has been subjected to online harassment and threats so violent that she was forced to leave her home.
Online harassment of women comes in many forms, Buni and Chemaly explain. Some women receive threats through tweets or posts directed at them. (Gamers who take part in harassing women often use the Twitter hashtag #GamerGate.) Female journalists are harassed in the comment sections of their articles. In cases of “revenge porn,” women’s photos and videos are posted online without their consent. And in a growing number of countries, rapists film their assaults and threaten to upload the videos to the Internet if their victims don’t stay silent.
Companies like Facebook, Twitter and YouTube frequently host this harassment, but often argue that they aren’t “in the business of policing free speech.” Buni and Chemaly show the hypocrisy of that assertion, pointing out that Twitter has worked with the government to block certain user accounts in the past. And when one woman created a Facebook page to try to expose the misogyny she found on the site, the company deleted her page because she was posting others’ content without written permission. Facebook took no action against the users posting the offensive material.
In addition, the Atlantic article explains that when companies fail to properly address harassment of women, they effectively curtail the free speech of their female users.
“We have the expressive interests of the harassers to threaten, to post photos, to spread defamation, rape threats, lies on the one hand,” Danielle Keats Citron, a law professor at the University of Maryland and author of the recently released book Hate Crimes in Cyberspace, told the Atlantic. “And on the other hand you have the free speech interests, among others, of the victims, who are silenced and are driven offline.”
The authors also write about several women who faced harassment and were ultimately silenced on social media. In one case, a woman started a Facebook page where people could submit offensive content the company had refused to remove; harassers on Facebook tracked her down and posted her address, phone number and children’s names. She eventually suspended her account.
There has been some success in exposing this content. One campaign targeted companies whose ads appeared on Facebook pages with misogynistic content, pressing Facebook to recognize that expressions of violence against women are a form of hate speech. Within a day, 160 organizations and corporations had signed a letter, and more than 15 companies pulled their ad money from Facebook altogether.
But there is still a lot of work to be done. As recently as last year, Twitter had no way for users to report abuse. It now has a report button, but the report form offers no space for comments and requires each tweet to be reported separately, making it hard to document a pattern of ongoing harassment.
The Atlantic reports that “Facebook is doing more than most companies to address online aggression against women,” pointing out that members of the National Network to End Domestic Violence have served on Facebook’s Safety Advisory Board since 2010. The company has also teamed up with Yale’s Center for Emotional Intelligence on a bullying-prevention project.
The piece also paints a picture of what it’s like to be a content moderator, often swamped with cases. But how that moderation actually works remains in question. Most companies outsource these jobs to private firms that won’t disclose their methods, citing “proprietary reasons.” And while a Facebook spokesperson said the company has “objective standards” for moderation, critics say decisions ultimately come down to human judgment; in an industry dominated by men, gender can shape what content is deemed hate speech.
The Atlantic reports that experts recommend platforms explain when and why content is removed, in order to provide greater transparency. They also suggest hiring more moderation staff, including more women. But the article notes that “social media is more symptom than disease”: violence against women is an epidemic, one that “is thriving in the petri dish of social media.”
Ultimately, it’s our job as a society to end this epidemic. In the meantime, social media companies need to do a better job of making sure the voices of women, who play a key role in ending it, are heard.
Alyssa Figueroa is an associate editor at AlterNet.