By Lisa A. Mallett and Liz Waterhouse @ www.listening2lesbians.com
For years many of us have repeatedly reported revenge porn, child exploitation material, harassment and misogyny, rape and death threats, and been told that what we have reported, “doesn’t contravene Facebook’s community standards.”
We are now seeing an ever-increasing number of women, lesbians and our allies, having posts deleted and being banned for using the word dyke.
While it has been wonderful to see women clustering informally around this issue in defense of lesbians, the Facebook control mechanisms, as illuminated by this wave of removal and banning, are alarming.
“Dyke” a banned word?
I started hearing about women banned for using the word dyke in early 2017.
Prompted by an irate post in a lesbian group in June 2017, I went to look at the San Francisco Dyke March page, having been advised the event page included anti-lesbian content. This was far from the first such occurrence: the 2017 Chicago Dyke March had explicitly told female-only, same-sex attracted lesbians to keep their bigoted selves away.
In response to the comments I had read on the SF Dyke March, I made this comment:
(It should not be controversial that women’s sexual boundaries should be respected and that our sexual orientation should be supported within the broader LGBT community.)
Then a woman in my group told me about this happening:
It had been posted on the very same page.
These two experiences, as well as reports I had been hearing since early 2017, prompted me to make a public post calling for screen caps of the word dyke being banned. Sometime later, I could not access the SF Dyke March thread or my comment, even through notifications, and assumed it had been deleted. Furious, I posted a screen cap of my comment to my page saying:
“The San Francisco dyke march deleted this comment from their wall. The lesbophobia is staggering.”
I didn’t tag the event, or broadcast it beyond my friends list, but soon afterwards discovered that the post had been deleted.
I’m not sure if the post was reported or auto-removed by Facebook.
And then it turned out that I was banned from Facebook, for calling a dyke march lesbophobic.
I took from this that it was socially acceptable to:
- tell lesbians that lesbian includes anyone who identifies as lesbian;
- tell lesbians that dyke includes anyone who identifies as a dyke;
- tell lesbians to stay away from lesbian originating events;
- prioritise absolutely everyone over lesbians, even at nominally lesbian events.
As my 24-hour ban began, I started receiving screen caps and stories from women everywhere:
And these are just a small sample. Please scroll to the bottom to see the Hall of Shame for more images and stories of banned, blocked and deleted dykes.
“This is only words, and who cares about the identities of others?” I hear you say.
Why do we care?
Well, words matter. If every single term used to describe us (female, woman, female-only same-sex attracted, lesbian, dyke) is redefined to either include others or explicitly exclude us, how do we describe ourselves, and how do we analyse what happens to us as a group of women whose primary focus is women? In a male-centered society, the (attempted) removal of that capacity has strong political meaning.
Even if you don’t agree with where we draw the lines, it shouldn’t be forbidden for lesbians to defend the language we need to discuss ourselves. We certainly shouldn’t be told we are not allowed to reclaim our own words and declare them with pride. In an age of endlessly touted freedom of speech, it is telling who is told to shut up and who does the telling. It is becoming increasingly obvious that dykes are on the losing end and are experiencing systematic erasure from public spaces.
So what IS Facebook’s problem with dykes?
Facebook has always had logistical, social and ethical issues with censorship. It has managed to dig itself deeper with every new algorithm, AI, program, corporate/NGO cooperative, and office it creates to deal with the more than one billion users on its platform. As many have noted over the years, transparency at Facebook has been lacking with regard to many of the company’s functions, but perhaps most importantly with how it decides what content we are allowed to post and see. This lack of transparency makes it extremely difficult to determine how and why dyke content is being censored, or why there has been what we believe is a very recent increase in the number of women experiencing post deletion and bans or blocks by Facebook. What we are left with is reporting on what we do know and asking questions about the rest. At Listening2Lesbians.com, we believe we may be witnessing a perfect storm brewing: a convergence of programs, politics, social discord, hate speech, censorship and Dyke Pride that Facebook management understands very little and shows very little regard for, allowing the unchecked erasure of lesbian content, interaction, movement and cooperation.
Here are the old and new storms heading towards a dyke post near you.
The Community Operations Team
The Community Operations Team is actually several teams, located in California and Texas in the USA, Dublin, Ireland, and Hyderabad, India, that use Facebook’s Community Standards to evaluate posts for, among other things, terrorism and hate speech. Time and again, Facebook has declared that the team relies mostly on users reporting questionable posts (Sherr, 2016) and that every post that is reported is looked at, for content and context, and acted upon by a member of the team (Green, 2015). Julie de Bailliencourt, Facebook’s Safety Policy Manager for Europe, the Middle East and Africa, has stated that it is a myth that the more a post is reported, the more likely it is to be deleted, and that “one report is enough” (Green, 2015). Monika Bickert, Facebook’s Head of Global Policy Management, has also confirmed that they rely on user reports and that all reports are viewed by an actual human being, but added in 2015 that Facebook had no plans to automatically scan for and remove content using algorithms (Goel, 2015). However, we now know that Facebook is indeed using algorithmic tools to scan news content and, now, user posts. It’s just unclear to what extent they are being used, and how. More on that later.
There have been few insights into the inner workings of the Community Operations Team, but what we have been able to learn is truly disturbing and has potentially huge consequences for dykes on Facebook and all of Facebook’s other platforms. That interview with Julie de Bailliencourt took place at Facebook’s largest headquarters in Dublin, where it was reported that the team is under immense pressure and often has heated arguments about what content meets community standards. “We don’t hire people to just press the same button X amount of times per hour,” says de Bailliencourt. “We hire people with very different backgrounds, and they sometimes disagree. It feels almost like the UN sometimes” (Green, 2015).
However, there are hints that the troubles run much deeper. In 2016, NPR was given rare access to employees working on the Community Operations Team and found that many feel they are in way over their heads. Sources told NPR that back in 2010 Facebook found it needed more workers, fast, to carry the immense load the team was under. At first it tried crowdsourcing solutions like CrowdFlower, but eventually turned to Accenture, which assembled a team of several thousand subcontractors in offices located in the Philippines and Poland (Shahani, 2016). If these locations scare you, they should. Poland was ranked the third worst country in Europe to be LGBTI, according to a 2016 report (Sheftalovich, 2016), and although the Philippines has shown tolerance for LGB people, it is generally viewed as a country that does not really understand or support homosexuality.
Adding to the climate in these countries with regard to lesbian rights are further reports that these subcontractors are worked extremely hard, are expected to make a decision on a piece of content in 10 seconds, and often are not able to view the entire post for content and context. This has led NPR to conclude that Facebook’s Community Operations Team may be “the biggest editing — aka censorship — operation in the history of media” (Shahani, 2016).
To grasp the potential consequences for the lesbian community of how Facebook’s Community Operations Team functions, consider this example. In the same NPR article, it was reported that when Facebook first opened its India office, employees interpreted French kissing as inappropriate sexual content, and senior management was floored: they had not anticipated such a cultural influence on interpretations of the Community Standards. Seriously, what was Facebook thinking? It appears they weren’t. So the questions are: “Who is sitting in the cubicle judging your dyke post?”, “Did they even see your post?” and “What exactly do they believe about dykes?”
Facebook’s Network of Support (NOS)
In 2010, Facebook responded to bullying, harassment, hate speech and increasing suicides among LGBT youth by forming a consultation group of LGBT advocacy organizations to offer guidance on what, how and who to monitor for hate speech against LGBT youth. The organizations are GLAAD (formerly known as the Gay and Lesbian Alliance Against Defamation), GLSEN (formerly known as the Gay, Lesbian and Straight Education Network), The Human Rights Campaign (HRC), The National Center for Transgender Equality (NCTE), PFLAG (formerly known as Parents, Families and Friends of Lesbians and Gays), and The Trevor Project (Facebook, 2017). Although we don’t have enough information on the extent of their influence seven years later, we do know that several of these organizations have taken adversarial stances against lesbians, including changing definitions of lesbian and woman and working against protections for women and girls. Many in the feminist community know that members of the radical feminist community do not support these new definitions. Regardless of how someone feels about this, it is crucial to understand the impact that organizations can have on influencing what can be seen by the public, and that someday these organizations might just decide to come after you through mediums like Facebook. At this time, that focus is on limiting the voices of lesbians and their allies, and it appears a lot of people are okay with that. More on the NOS later (and algorithms!).
Facebook’s Online Civil Courage Initiative
In January of 2016, Facebook was experiencing extreme pressure from European countries, led by Germany, to combat online hate speech. The result was a pilot program called the Online Civil Courage Initiative, which focused its efforts in France, Germany and the UK. By September 2016, they had decided to expand the program by offering advertising credits and marketing advice to NGOs and other groups willing to work online to “counteract extremist messaging” (Toor, 2016). On June 23, 2017, Facebook announced that it had officially launched this program in the UK to “curb the spread of hate speech and extremist material online,” offering “funding and training to help local organizations track and counteract hate speech and terrorist propaganda” (Toor, 2017).
Just a week earlier, on June 15th 2017, Facebook announced new measures it was taking to combat terrorist propaganda and violent material. Organizations that participate in this program will be able to communicate with Facebook via a “dedicated support desk” (Toor, 2017). They also announced a new series of blogs, to be released over time, that Facebook will use to convey to its users information about how it works behind the scenes, especially in the area of controlling content. The first blog, entitled “Hard Questions: How We Counter Terrorism”, was written by Monika Bickert (see above) and Brian Fishman, Counterterrorism Policy Manager. In the first section, labeled “Artificial Intelligence”, they lay out the following methods Facebook will be using. I will add Facebook’s own description of each, in abbreviated form.
- Image matching: When someone tries to upload a terrorist photo or video, our systems look for whether the image matches a known terrorism photo or video.
- Language understanding: We have also recently started to experiment with using AI to understand text that might be advocating for terrorism. That analysis goes into an algorithm that is in the early stages of learning how to detect similar posts. The machine learning algorithms work on a feedback loop and get better over time.
- Removing terrorist clusters: [When] we identify Pages, groups, posts or profiles as supporting terrorism, we also use algorithms to “fan out” to try to identify related material that may also support terrorism. We use signals like whether an account is friends with a high number of accounts that have been disabled for terrorism, or whether an account shares the same attributes as a disabled account.
- Recidivism: We’ve also gotten much faster at detecting new fake accounts created by repeat offenders.
- Cross-platform collaboration: [We] have begun work on systems to enable us to take action against terrorist accounts across all our platforms, including WhatsApp and Instagram (Bickert & Fishman, 2017).
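The “image matching” and “recidivism” items above describe matching new uploads against a database of fingerprints of previously flagged material. Facebook has not published its implementation (and real systems use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding), but as a rough illustration of the general technique, a minimal sketch might look like this — all names here are hypothetical:

```python
import hashlib

# Hypothetical blocklist of fingerprints of known banned images.
# This entry is sha256(b"foo"), standing in for a real flagged file.
KNOWN_BANNED_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying the uploaded file's exact bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_banned(data: bytes) -> bool:
    """True if the upload matches a previously flagged image byte-for-byte."""
    return fingerprint(data) in KNOWN_BANNED_HASHES

print(is_known_banned(b"foo"))  # True: matches the blocklisted fingerprint
print(is_known_banned(b"bar"))  # False: unknown content passes through
```

Note that a cryptographic hash like this only catches byte-identical re-uploads; that is exactly why production systems rely on fuzzier perceptual matching, and why the “recidivism” bullet is about repeat offenders rather than repeat files.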
They also acknowledge their use of “human expertise” through the Community Operations Team, as well as their partnerships with governments and organizations to run these initiatives. Finally, they mention something called “counterspeech training,” in which they have partnered with NGOs and community groups to “empower the voices that matter most”. Even in a counter-terrorism context, this sent shivers down my dyke spine. And now we see that there is an AI actually running amok on Facebook looking for those voices that don’t “matter most”, partnering with organizations that are dictating what those important voices are saying (I’m thinking here about the Network of Support!), all being judged by an underpaid and overworked Community Operations employee from I-don’t-know-where, who thinks I-don’t-know-what about lesbians and our right to exist anywhere, let alone Facebook.
Remember, algorithms are not neutral and an AI is a bunch of algorithms written by people with a mess of biases and prejudices. We all know that logically, none of these entities at play here are chock-full of dykes and dyke influence. In fact, if history holds, it’s quite the opposite. So if you’ve been seeing your dyke posts getting removed faster and faster, and your bans getting longer and longer, look at what policies, procedures, programs and personnel Facebook has been bringing to their platform over the last seven years, and especially over the last year and a half. And if you have seen, like we have, a huge increase in anti-dyke activity by Facebook in June 2017, look at what Facebook has introduced in this month alone.
The perfect storm continues…
US Pride and Dyke March
So we know Facebook relies heavily on its community members sending in reports of material they deem inappropriate and those hard working Community Operations employees get to work putting our posts into context and pushing the red button or the green button. They see lovely posts that say “I LOVE DYKES!!!”, laugh, and hit the green button and they see “DYKES BURN IN HELL!!!”, get very angry, and hit the red button.
Yeah, right. We wish.
As we explored earlier, it just doesn’t work that way at Facebook HQs around the world, and we’re all smart enough to know that any algorithm is going to struggle to make a distinction between nice dyke posts and hate speech. We also have no idea who is actually influencing the Online Civil Courage Initiative, or the guidelines for “counterspeech” measures. So, into this Facebook-created hell we bring US Pride Month and Dyke March!
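The point about algorithms struggling with context can be made concrete. A deliberately naive keyword filter — purely hypothetical, and not a claim about Facebook’s actual code — matches the word, not the intent, so it cannot tell a reclaimed, affirming use of a word from a slur:

```python
# A deliberately naive keyword filter, illustrating why word-matching
# moderation cannot distinguish affirming speech from hate speech.
FLAGGED_WORDS = {"dyke", "dykes"}

def naive_filter(post: str) -> bool:
    """Flag any post containing a listed word, regardless of context."""
    words = {w.strip("!.,?").lower() for w in post.split()}
    return bool(words & FLAGGED_WORDS)

print(naive_filter("I LOVE DYKES!!!"))        # True: affirming, flagged anyway
print(naive_filter("DYKES BURN IN HELL!!!"))  # True: hateful, flagged
print(naive_filter("Happy Pride!"))           # False
```

Anything smarter than this — a machine-learning classifier, say — still learns from labeled examples, so if the reports it is trained on overwhelmingly flag the word itself, the model inherits that bias.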
You would have to be living under a rock (or not be a dyke) not to know that the L is having issues with the GBT. Whatever side you are on, we are where we are. Dykes are being told to stay away from Dyke Marches if they won’t accept dick into their life, and Pride has become a practice in extreme queer theory with a tendency to alienate and shame lesbians for being all about each other. Again, believe what you want, but June has been an intense month for lesbians.
What I am suggesting here, and I’m sure many of you have guessed already, is that trolls abound this June and any dyke-positive group or individual is at greater risk of being reported, even if we are just a bunch of dykes going out for a walk. However, this is only one piece of the perfect storm that is converging on June.
Victoria Brownworth wrote:
Lesbians are being no-platformed out of our very existence, whether through the insidiousness of silencing or the oppressive demands of compulsory heterosexuality or through violence that at best leaves us shattered and at worst, dead. Lesbians deserve the same level of autonomy as any other group, be it minority or majority. If you aren’t supporting that autonomy, then you are inadvertently or directly a participant in the erasure that is perhaps slowly but very definitely steadily, wiping us off the face of the earth. (Brownworth, 2015)
Here’s What We Can Do
Join Listening2Lesbians in asking Facebook the hard questions. On June 15th 2017, Facebook said, “the decisions we make at Facebook affect the way people find out about the world and communicate with their loved ones” (Bickert & Fishman, 2017). They also said, “we take seriously our responsibility — and accountability — for our impact and influence. We want to broaden that conversation” (Schrage, 2017).
This sounds really good to me! Dykes are awesome at conversation!
Send your dyke concerns, screencaps, stories, experiences and more to email@example.com. Listening2Lesbians will also be sending a version of this piece. Let’s try to find out what is going on and raise a stink. Speaking out about the silencing of dykes needs to happen now, because we really don’t know what it’s going to look like for us once this storm passes.
We would like to thank all the dykes out there for coming together and helping us see what has been happening to us on Facebook these days. We would also like to thank our bisexual and straight women allies who also threw themselves in front of the Facebook bus to test out this theory. Their love of dykes got them the ban hammer too. Looking back at the last couple of days, we actually created women’s space on the very ground of the oppressor. We did that. We can do it again.
HALL OF SHAME
Women who have had their posts removed or been banned for a pro-lesbian use of the word dyke:
- On the same SF dyke march thread, another lesbian had this comment removed by Facebook for apparently breaching the Facebook Community Standards:
- This lesbian had her post removed and was banned for 30 days for enjoying a dyke band at a farmer’s market.
- This lesbian was banned three times for posting this picture – once in January 2017 for 24 hours, once in early March 2017 for 3 days and once in late March for 30 days.
- This was also removed in March 2017 but without a Facebook ban.
Other comments removed for breaching the standards:
- Another post removed in the context of identity and US dyke marches
- A telling example is the woman, a lesbian ally, who posted “I love dykes!!!!” to test for us all whether dyke really was a banned word. This post does nothing but support lesbians, and still she had the post removed and was banned from Facebook for 7 days.
- This young lesbian posted this video of the Lesbian Avengers starting dyke marches with the comment “When dyke marches were still for dykes ❤ “.
Her post was removed.
She was banned.
- An older dyke, well known in international lesbian circles, had a post inviting friends to go for a walk in nature (with dyke in the text) removed three times for breaching Facebook Community Standards. She was later banned.
- Kate Hansen, also lesbian, was banned for 30 days after posting that lesbians were getting banned for using the word dyke.
- Another woman posted, on a rainbow background, “I love dykes! Dykes for Dykes!” Her post was removed for breaching community standards and she was banned from Facebook for 24 hours.
- The post of this tweet was removed on June 10, presumably for the comment about dyke action and visibility. The poster was banned for 24 hours, banned for another 24 after that and threatened with a permanent ban after 9 years on Facebook without any warnings. She submitted an appeal which Facebook did not respond to. Others reposted the original tweet without the comment and were not banned.
Another woman posted about dykes on bikes, with hearts, and the post was removed. This occurred in Pride Month.
Another woman had a photo of her and her partner, captioned dyke pride, removed, again still in Pride Month.
Other women have been banned by Facebook for using the word dyke but we haven’t been able to contact them due to their ban, which speaks to the power of banning and consequently isolating women.
Removal and ban photos and stories sent to us after this was posted:
- This lesbian had her post removed after sharing the video of the Lesbian Avengers starting dyke marches and was banned for 3 days. Facebook has not responded to her appeal.
- This woman has had yet another dyke post removed (her 5th that we know about).
If you have more screen caps of lesbians being banned or having posts removed for using the word dyke or pro-lesbian statements please:
- post them in a comment
- email them to us at firstname.lastname@example.org or Lisa@listening2lesbians.com
- message us via facebook at https://www.facebook.com/LlSTEN2LESBlANS/
Bickert, M., & Fishman, B. (2017, June 15). Hard Questions: How We Counter Terrorism. Retrieved from Facebook Newsroom: https://newsroom.fb.com/news/2017/06/how-we-counter-terrorism/
Brownworth, V. (2015, March 5). Erasure: The New Normal for Lesbians. Retrieved from A Room of Our Own: http://www.aroomofourown.org/erasure-the-new-normal-for-lesbians-by-vabvoc/2015
Facebook. (2017). What is the Facebook Network of Support (NOS) and what NOS resources are available for LGBTQ people? Retrieved from Facebook: https://www.facebook.com/help/202924156415780
Goel, V. (2015, March 16). Facebook Clarifies Rules on What It Bans and Why. Retrieved from The New York Times: https://mobile.nytimes.com/blogs/bits/2015/03/16/facebook-explains-what-it-bans-and-why/?referer=
Green, C. (2015, February 13). What Happens When You ‘Report Abuse’? The Secretive Facebook Censors Who Decide What Is-and What Isn’t Abuse. Retrieved from Independent: http://www.independent.co.uk/life-style/gadgets-and-tech/features/what-happens-when-you-report-abuse-the-secretive-facebook-censors-who-decide-what-is-and-what-isnt-10045437.html
Schrage, E. (2017, June 15). Hard Questions. Retrieved from Facebook Newsroom: https://newsroom.fb.com/news/2017/06/hard-questions/
Shahani, A. (2016, November 17). From Hate Speech To Fake News: The Content Crisis Facing Mark Zuckerberg. Retrieved from NPR: http://www.npr.org/sections/alltechconsidered/2016/11/17/495827410/from-hate-speech-to-fake-news-the-content-crisis-facing-mark-zuckerberg
Sheftalovich, Z. (2016, May 11). Latvia, Lithuania and Poland worst countries to be gay in EU. Retrieved from Politico: http://www.politico.eu/article/latvia-lithuania-and-poland-worst-countries-to-be-gay-in-eu/
Sherr, I. (2016, September 9). How Facebook censors your posts (FAQ). Retrieved from CNET: https://www.google.com/amp/s/www.cnet.com/google-amp/news/how-zuckerberg-facebook-censors-korryn-gaines-philando-castile-dallas-police-your-posts-faq/
Toor, A. (2016, September 22). Facebook is expanding its campaign to combat hate speech. Retrieved from The Verge: https://www.theverge.com/2016/9/22/13013440/facebook-hate-speech-campaign-expansion
Toor, A. (2017, June 23). Facebook launches program to combat hate speech and terrorist propaganda in the UK. Retrieved from The Verge: https://www.theverge.com/2017/6/23/15860868/facebook-hate-speech-terrorism-uk-online-civil-courage-initiative?yptr=yahoo