Mass. Peace Action organized a study group about the danger of right wing nationalism on February 13 (https://masspeaceaction.org/…/study-action-group…/), and we posted the event on Facebook.
The photo we used was taken at the Unite the Right rally in 2017 — the one in Charlottesville where Nazis and white nationalists rioted and killed a woman, and Trump said there were "very fine people on both sides." We took the photo from Wikipedia's page about the event: https://en.wikipedia.org/wiki/Unite_the_Right_rally. It shows people carrying Nazi and Confederate flags.
Facebook removed our posts on Feb 8 and Feb 11, put our page (and every page I administer) under restriction, removed me as administrator, and threatened to unpublish the pages. But the event we were posting about was an anti-Nazi, anti-right-wing-nationalist study group, illustrated with a picture from Wikipedia. A glance at our post would make it clear that we don't support those ideas and are organizing to oppose them. Does it make sense to remove our post and put us in the penalty box?
Facebook doesn’t think so. Their Community Standards say: “We recognize that people sometimes share content that includes someone else’s hate speech to condemn it or raise awareness. In other cases, speech that might otherwise violate our standards can be used self-referentially or in an empowering way. Our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.”
Presumably Facebook’s algorithm saw flags in our image and didn’t understand our words.
So, our first complaint is that Facebook isn’t following its own standards.
Facebook’s “Page Quality” area makes clear what triggered their action. But we didn’t know to look there, and the warning messages they gave us on desktop (see below) didn’t link to Page Quality or say which posts were the problem. We only figured out what was going on when I happened to look at Facebook on my phone; the messages there are much clearer and more detailed, and link to Page Quality.
So our second complaint is that Facebook's desktop messages aren't clear, didn't specify which posts triggered the action, and didn't link to any supporting material other than the 10,000-word Community Standards document. And the Page Quality area, when we found it, does show the image that gave offense — but not the text we posted with it, which made clear we oppose the behavior shown in the image.
Our third complaint is that Facebook offers no way to appeal a ruling. I've sent four messages so far to Help > Report a Problem and two messages in response to "Was this information helpful?" None have been answered.
Mark Zuckerberg, we’re here to fight fascism and right wing nationalism. Stop penalizing Mass. Peace Action and threatening to unpublish us! Clean up your product and make things clear! Provide an avenue to appeal mistakes!
And to state the obvious: it's a huge, huge problem when an unaccountable, opaque private corporation controls the public square where political ideas are debated. Dozens of progressive voices have been silenced by @Facebook — even if we weren't hit because we were progressive but due to a product error, others have been: CODEPINK's national co-director Ariel Gold; Olivia Katbi-Smith, the Palestinian co-chair of the Portland DSA; reporting on the coup in Bolivia; analyst Paul Jay; and many more. Indeed, Facebook is so notorious for censoring or banning Palestinian and anti-Zionist voices that Palestinian activists formed the "Facebook, We Need to Talk" campaign, which we support.