After Korryn Gaines entered her Baltimore apartment earlier this month, she pulled out a shotgun and then her phone. She opened the Instagram and Facebook apps so she could post details of her standoff with the police.
Facebook cut her off.
Gaines wasn’t the first black person whose run-in with the police played out on Facebook. Philando Castile’s fiancee streamed live video on the social network as he died after a police officer shot him in July. A few days later, a protester streamed a sniper attack on Dallas police.
But Gaines’ clash marked a turning point in the conversation about Facebook’s role in empowering speech. That’s because the friends watching her posts did more than just watch: They urged her not to acquiesce to the police.
Facebook temporarily shut down Gaines’ account at the request of the police. The shutdown came right before officers opened fire, killing her and wounding her 5-year-old son.
The decision put a spotlight on another Facebook activity: its routine censorship of users.
Mark Zuckerberg’s company has teams of people working in offices around the world, including Austin, Texas; Dublin, Ireland; Hyderabad, India; and its Menlo Park, California, headquarters. Those teams process millions of reports a week.
Facebook doesn’t disclose how many times people post, how many live videos are streamed or how many posts it has deleted. It has said it responded to about 20,000 law enforcement requests over a six-month period last year.
Activists say Facebook needs to spell out its policies more clearly, particularly after what happened with Gaines.
“The lack of transparency is part of the problem,” said Rashad Robinson, the executive director of Color Of Change, an online organization focused on racial issues.
His organization and consumer advocacy group SumOfUs sent a letter on Monday to Facebook demanding the social network stop censoring personal accounts at the request of the police.
“News isn’t just getting shared on Facebook, it’s being broken on Facebook,” said Reem Suleiman, a campaigner for SumOfUs. “If Facebook is making decisions about how news reaches the public, then it needs to be transparent about how those decisions are made.”
Just how far is Facebook willing to go to keep its platform in check? Chances are you haven’t read the fine print in its terms of service. But even if you had, you’d find Facebook’s censorship policies vague and confusing.
So, I’ve put together this FAQ to help you understand how Zuckerberg and his team judge the comments you post — and decide what can stay and what they’ll pull from your feed.
What is Facebook’s general policy on censorship?
Facebook wants to be a place where people feel free to express themselves, within reason. Hate speech and terrorism are no-nos, for example, as are most forms of nudity and pornography. Contrary to popular belief, breastfeeding photos are allowed on the site.
In the last year, Facebook has touted “counter speech” as an alternative to censorship. The idea is that responding to a hateful post by saying it’s wrong can have more impact than simply removing the post.
For example, Facebook’s content team will likely allow a photo of graphic violence with the caption “This is wrong.” But post the same photo with wording that celebrates the violence or encourages others to imitate it, and it will likely be removed if Facebook gets complaints about it.
Does Facebook cooperate with law enforcement?
It does, though we don’t know to what extent. Many of its rules are outlined on its website.
For example, it automatically alerts law enforcement about suspicious activity involving minors, such as child porn. Facebook also contacts the authorities in cases where it believes there’s imminent danger or credible threats of violence.
The company also said it turned over at least some data in more than 81 percent of the 19,235 law enforcement requests it received between July and December 2015. That works out to more than 15,500 requests.
Can Facebook limit free speech?
Freedom of speech isn’t what you think it is. The First Amendment to the US Constitution begins, “Congress shall make no law…” That means, within reason, the government doesn’t mess with what you say. But Facebook isn’t the government. It can set its rules pretty much however it wants.
To that end, Facebook does have “community standards,” including rules against nudity, terrorism and hate speech.
Are there different types of video on Facebook?
Yes. You can upload videos to Facebook’s main service or its Instagram photo-sharing service.
The company also has a new feature called Facebook Live, a video-streaming service made available in April to all 1.7 billion people who use the social network each month. Like Facebook itself, it’s free.
The feature is built into the Facebook app for iPhones and devices powered by Google’s Android software. It’s super easy to use: All you do is open the app, tap to write a status update and choose live video. Now you’re broadcasting to the world.
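For the technically inclined, there’s also a programmatic route. Below is a minimal sketch of going live through Facebook’s Graph API rather than the app. The live_videos endpoint, the API version and the stream_url field follow Facebook’s own Live API documentation, but the access token is a placeholder and none of this is covered in the story above, so treat the details as assumptions.

    # A minimal sketch: going live via Facebook's Graph API instead of the app UI.
    # Assumptions: the v2.6 live_videos endpoint with its "status" and
    # "stream_url" fields, and a placeholder token with publishing permissions.
    import requests

    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder; obtained via Facebook Login

    # Create a live video object on your own timeline ("me").
    resp = requests.post(
        "https://graph.facebook.com/v2.6/me/live_videos",
        data={
            "access_token": ACCESS_TOKEN,
            "status": "LIVE_NOW",  # go live immediately
            "title": "Broadcasting to the world",
        },
    )
    resp.raise_for_status()
    live_video = resp.json()

    # Facebook returns an RTMP ingest URL; point a streaming encoder
    # (OBS, ffmpeg and the like) at it to send the actual audio and video.
    print("Send your stream to:", live_video["stream_url"])

Either way, the broadcast ends up in the same place and plays by the same rules: the community standards discussed throughout this FAQ.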
Can Facebook censor live videos?
It’s possible in cases where something in the video violates Facebook’s community standards. Facebook is considering plans to check on particularly popular live videos, but otherwise the company’s employees aren’t watching. Facebook relies on its community to alert it when people break the rules.
Why isn’t Facebook able to stop bad posts?
With more than 1.7 billion people using the social network every month, Facebook can’t monitor everything that passes through its site. The company does have teams of people around the globe devoted to policing Facebook, but those teams largely rely on the site’s community of users to call out one another’s behavior.
I saw an offensive cartoon about a man killing a cop. How is this OK?
This is one of those cases where things get confusing.
If you’re talking about this sickening and graphic cartoon, it’s an interesting case with no clear answers. At first, Facebook defined it as art, even though many users complained that it was offensive and awful. Eventually, the post was removed. The page of the group it was posted to, the Black Panther Party of Mississippi, continues to operate despite its apparent links to an extremist organization. Facebook declined to discuss the specifics of its decision in this instance.
I heard ISIS posts on Facebook. Does the company allow terrorists to post?
Facebook doesn’t allow terrorist speech, but it’s up to the people using the site to report it when they see it.
Why did the Philando Castile video get taken down?
Facebook blamed a technical glitch for temporarily removing the video by Castile’s fiancee. The video was restored shortly afterward.
Why did Facebook attach a graphic warning to the Castile video after putting it back?
Facebook has begun adding warnings to videos and photos it decides need them. Consider that people as young as 13 can join Facebook.
Why doesn’t Facebook remove racist posts my [insert family member here] wrote?
Facebook weighs the value of removing content on a case-by-case basis. Sometimes, it relies on counter-speech to do the job. It also takes into account artistic merit, commentary and other factors.
The challenge, the company often says, is that it’s trying to be a gathering place for nearly everyone online. So forcing a specific ideology or set of political principles on all those people would inevitably tick some of them off.
And being a global gathering place also means Facebook needs to create a set of policies that apply across all its users. That’s more than half the world’s online population.
Who decides these things?
Facebook.
Who gave Facebook the right to do this anyway?
You did. And so did I. Everyone who logs on to Facebook gives it this power. Make no mistake about it: There are more people using Facebook than there are citizens of any country on Earth.
Zuckerberg was never elected to any office. But he’s built one of the most important websites in the world, and he makes decisions that affect all of us. There’s no way for any of us to appeal.
OK. There is one way.
Log off for good.
Source: CNET
AUTHOR: IAN SHERR