It is tempting to make every fiasco at Facebook about the power (and the abuse of power) of the algorithm. The "napalm girl" controversy does not neatly fit that storyline. A little-known team of humans at Facebook decided to remove the iconic photo from the site this week.
That move revealed, in a klutzy way, just how much the company is struggling internally to exercise the most basic editorial judgment, despite claims by senior leadership that the system is working.
In a public address in 2014, CEO Mark Zuckerberg said his company's goal is to be the "perfect personalized newspaper for every person in the world."
Well, if that's the case, Facebook doesn't show much deference to the editorial wisdom of newsrooms.
Back in 1972, in the bloodied fields of the Vietnam War, an Associated Press photographer took a photo of children screaming, mouths wide open, as they fled a napalm attack. One of them, a 9-year-old girl, was naked.
Fast-forward to 2016. A Norwegian writer shared that image and six others on Facebook, in a post about photos that "changed the history of warfare." His account was suspended. He had violated Facebook's bans on nudity and child porn.
The decision was extraordinary. A photo deemed so significant that it won a Pulitzer Prize did not make the cut for Facebook's Community Standards — the rules on what you can and cannot post.
It took a global campaign — with the editor-in-chief of a Norwegian newspaper running a front-page letter to Zuckerberg, with the Norwegian prime minister re-posting the now-banned photo, and with the world watching — for Facebook to back down, days later, and say: OK, we'll let the photo stay after all.
Notably, Zuckerberg did not issue a public apology. He stayed silent.
The company put a vice president in charge of media partnerships, Justin Osofsky, out front. Osofsky wrote in a post that Facebook "made a mistake" but, on the bright side, "one of the most important things about Facebook is our ability to listen to our community and evolve, and I appreciate everyone who has helped us make things right."
It was a teachable moment, and the lesson was learned.
Facebook users had different reactions. Uma Venkatraman shot back in a comment: "A simple 'sorry, we were wrong' would have been more effective."
The Humans At The Wheel
Facebook does use algorithms to decide what viral stories should be "trending news," even when the "news" is factually false. But in this "napalm girl" controversy, humans are at the wheel.
One could point out that Facebook has algorithms to spot what might be child porn, in order to take it down. But in this instance, it was a human who flagged the post and a human inside the company who decided to hit the delete button.
In fact, with very few exceptions (for example, for spam attacks), people at Facebook are the ones manually removing content, according to Monika Bickert, Facebook's head of policy.
Bickert has a curious dual role. She's responsible for deciding what stays up and what comes down on the site; and she has to run around the world and kiss the rings of global leaders who are angry with Facebook. She faces the unenviable challenge of developing global standards that work across markets, striking the right balance between free speech and suppression.
In a July interview with NPR about U.S. hate speech, Bickert explained that humans have to remain in charge of moderating posts because of the problem of context.
Algorithms can't tell whether a racial slur is being used to attack a person or as social commentary in a rap song. They can't tell whether a violent photo mocks victims or educates the public. The technology simply isn't there yet.
"Context is so important, it's critical," Bickert said and repeated for emphasis. "Context is everything."
Her commitment to context is so clear, it's hard to imagine how the "napalm girl" photo ever got taken down. Unless, that is, you look at which humans are making the call.
Facebook did not reveal details of its internal decision-making process. NPR scraped LinkedIn for the resumes of a few hundred employees and contractors in the "community operations" teams, the self-described "safety specialists" in charge.
The team members are scattered around the world — in California, Ireland, India. Many are recent college grads with little training for what, in legacy newsrooms, would be considered very tough decisions that only veteran editors get to make.
And the volume of work is extraordinary. While Bickert would not say how many posts Facebook removes on average, her colleague Osofsky shared in his non-apology: "It's hard to screen millions of posts on a case-by-case basis every week."
Repeat: millions. That would make the unit an editorial sweatshop.
Facebook did not have to remove "napalm girl" under law, according to Thomas Vinje, an attorney in the European Union who represents American internet companies. The idea that anyone in the company had a legitimate concern about liability is "unimaginable," he says. "Anyone with basic knowledge knows it's a very famous picture. It's a strange situation."
It's a revealing move about the business. While Facebook maintains it is just a platform (so it's not liable for the content that users choose to share), the company is also a multinational entity that's trying to build a digestible product: a digital Coca-Cola that everyone, from hipster San Francisco to conservative Jordan, can enjoy.
That'll take a lot of editorial judgment.
Vinje says in that respect, Facebook could be playing a dangerous game. If the company goes too far in pulling content at its discretion, "then they cross the line into becoming something at least akin to a publisher."
As traditional newsrooms know, with great power comes great responsibility.
AUDIE CORNISH, HOST:
Facebook is being accused of censorship and abuse of power after it banned an iconic photo from the Vietnam War. It's the photo of a naked 9-year-old girl fleeing from a napalm attack. Facebook says it took the image down because it violated the site's standards on nudity. The company is now restoring the photo.
I spoke with NPR's Aarti Shahani about this earlier. She first explained how Facebook got into this mess over what's clearly a historical image.
AARTI SHAHANI, BYLINE: Here is a really important detail. OK, while it is true that Facebook uses automated systems to identify prohibited content - right? - they've got software crawling through the site and looking for nudity and porn videos. In this case, software was not at work. The company says a human being flagged the photo. That means a real-life Facebook user saw it. Maybe it was in their feed. You know, they're friends of the writer or somebody who shared it.
The point is a person saw it and reported it, and inside Facebook, a person, not an algorithm, decided to take it down. Now, this is an extraordinary decision. A photo deemed so significant that it won a Pulitzer Prize did not make the cut for Facebook's community standards, the rules the company set for what's OK.
CORNISH: Has Facebook said who that person was inside the company who made the decision on the photograph?
SHAHANI: No, we have no clue about that. We don't know the name of the person or even what country they're in. And Facebook has a small army of quote, unquote, "safety specialists" - people responsible for removing content. They're scattered all over the world in the U.S., India, Ireland.
The initial decision could have come from a recent college grad with zero background in journalism who saw a stark photo and felt, oh, I've yanked off things like this before and, you know, hit delete.
CORNISH: So where does Facebook go from here? Will the company essentially rethink this kind of black-and-white policy on nudity?
SHAHANI: Well, it's funny you say black and white because they would say - in fact they have told me in past interviews that they are nuanced. They always consider context unless something is clearly illegal. And by the way, I did run this issue by a lawyer in the European Union. The user who initially posted the photo was in Norway. And the lawyer says it's unimaginable, legally speaking, that Facebook would have thought this is child porn. So the fact that it was a discretionary decision - that leaves us with two big "so whats."
OK, one is, how much does Facebook exercise editorial judgments anyway - a lot, a little, is it arbitrary? The company says it's just a platform, but it's acting like a publisher when it decides to edit and remove things. And number two is, how much do they explain themselves to their stakeholders? Do they adopt best practices that traditional publishers have had when they choose to act and explain to the public, or is it OK to be a black box of sorts?
CORNISH: That's NPR's Aarti Shahani. Aarti, thanks so much.
SHAHANI: Thank you. Transcript provided by NPR, Copyright NPR.