For better market penetration
I wonder if Zuckerberg ever lies awake at night thinking about the Rohingya. They think about him.
Facebook’s negligence facilitated the genocide of Rohingya Muslims in Myanmar after the social media network’s algorithms amplified hate speech and the platform failed to take down inflammatory posts, according to legal action launched in the US and the UK.
The platform faces compensation claims worth more than £150bn under the coordinated move on both sides of the Atlantic.
A class action complaint lodged with the northern district court in San Francisco says Facebook was “willing to trade the lives of the Rohingya people for better market penetration in a small country in south-east Asia.”
It adds: “In the end, there was so little for Facebook to gain from its continued presence in Burma, and the consequences for the Rohingya people could not have been more dire. Yet, in the face of this knowledge, and possessing the tools to stop it, it simply kept marching forward.”
Marching forward for the market penetration.
In the US and UK, the allegations against Facebook include: Facebook’s algorithms amplified hate speech against the Rohingya people; it failed to invest in local moderators and fact checkers; it failed to take down specific posts inciting violence against Rohingya people; and it did not shut down specific accounts or delete groups and pages that were encouraging ethnic violence.
The US complaint cites Facebook posts that appeared in a Reuters report, with one in 2013 stating: “We must fight them the way Hitler did the Jews, damn Kalars [a derogatory term for Rohingya people].” Another post in 2018, showing a photograph of a boatload of Rohingya refugees, says: “Pour fuel and set fire so that they can meet Allah faster.”
You know, you’d think once it’s a national news story it would be worth taking action, if only to cover the corporate ass.
The Facebook whistleblower Frances Haugen has alleged the platform is fanning ethnic violence in countries including Ethiopia and is not doing enough to stop it. She said 87% of the spending on combating misinformation at Facebook is spent on English content, while only 9% of users are English speakers.
Facebook says no no it has a strategy, really it does.
Sounds similar to the way Facebook deals with violent threats against women: doing nothing. I would say that only white men matter to Zuckerberg, but I think the truth is one particular white man matters to Zuckerberg and no other.
What iknklast said.
Also, this reminds me of the interrogation of a representative of Big Tobacco in court at some point in the 1990s:
PROSECUTOR: You knowingly put a chemical that is used as a rat poison in a product that you were marketing for human consumption.
REP: Yes.
PROSECUTOR: Why couldn’t you have removed it?
REP: That would have affected the taste, and thus potentially lowered sales.
And while we’re on the topic, I remember another such interview with some high-up in Big Tobacco:
Q: Do you yourself smoke?
REP: No sir.
Q: Why not? Aren’t you always defending the constitutional “Right to Smoke”?
REP: Yes sir. But we reserve that right for the young, the poor, the black, and the stupid.
Well yeah, of course they proceeded. Even the most impoverished nation still has some money, and as we know all money needs to be swept up by the billionaires of the world. They are just that much better than us, they deserve it all.
Can you sue an algorithm? Can it be held accountable? If corporations are people, why not bits of procedural code? Of course it would end up saying “I was only following orders!”
I suppose it would cut too deeply into profits to have more actual humans monitoring things. So much easier to just blame the machines.
Bruce,
A lot of people have been trying very hard to answer that question for the last couple of decades. Sooner or later, one government or another will work up the courage to use it as the basis for breaking up the social media giants into smaller, more autonomous, individually evil units instead of one giant collectively evil one. For now, they are biding their time because they don’t fancy their chances. Neither do I. It’s a shame, because there will definitely be an enormous amount of tragedy written off as unavoidable accountability failure before anything is done.
latsot and Bruce,
Here is a series of essays on the social aspects of technology that tentatively lays out a potential roadmap for how we might come to grips with it. I found it quite informative.
So much press is wasted on accountability for teachers (I’m not against it, but I promise you they are doing it all wrong). Nobody cares about accountability for the Pentagon or for social media.
Fun fact: teaching is dominated by women.
Gee, I wonder if my first observation has anything to do with my second one?