From FAIR <[email protected]>
Subject 'Hate Speech and Disinformation Flow on Facebook'
Date July 10, 2020 11:47 PM
Links have been removed from this email. Learn more in the FAQ.

FAIR
View article on FAIR's website ([link removed])
'Hate Speech and Disinformation Flow on Facebook' Janine Jackson ([link removed])

Janine Jackson interviewed Free Press’s Jessica González about Facebook promoting hate for the July 3, 2020, episode ([link removed]) of CounterSpin. This is a lightly edited transcript.
MP3 Link ([link removed])
WaPo: Zuckerberg once wanted to sanction Trump. Then Facebook wrote rules that accommodated him.

Washington Post (6/28/20 ([link removed]) )

Janine Jackson: Civil rights and social justice groups have been grappling for years with ways to address hateful speech, harassment and disinformation on Facebook. The issue is on the front burner again, as major companies like Unilever ([link removed]) and Starbucks ([link removed]) are pausing their ads—the platform's source of revenue—as part of a coordinated effort to get Facebook to change policies that allow politicians and others to make false and incendiary claims.

A Facebook security engineer quit in disgust when the platform refused to take down a post from Brazil's Jair Bolsonaro that said ([link removed]) , “Indians are undoubtedly changing. They are increasingly becoming human beings just like us.” That would seem to be a clear violation of internal guidelines against “dehumanizing speech,” but as revealed in a recent Washington Post exposé, ([link removed]) the engineer was told that it didn't qualify as racism, and “may have even been a positive reference to integration.”

That sort of casuistry has marked Facebook's actions, and activists have heard enough. The group Free Press ([link removed]) has been one of those working for change; we're joined now by Free Press co-CEO Jessica González. She joins us by phone from Los Angeles. Welcome back ([link removed]) to CounterSpin, Jessica González.

Jessica González: Hi, Janine. Thanks for having me.
Hit Pause on Hate

Stop Hate for Profit ([link removed])

JJ: It's worth stating at the outset that Free Press, like FAIR, opposes censorship, believes in the free flow of ideas and in debate. That doesn't require acceptance of the promotion of dangerous medical misinformation ([link removed]) , Holocaust denial ([link removed]) or instigations to violence ([link removed]) against people protesting police brutality. We have to grapple with the tremendous influence of social media somehow. So that said, tell us about the Stop Hate for Profit ([link removed]) campaign, which companies from Adidas to Williams-Sonoma are taking part in. What are the problems that the campaign is looking to address?

JG: You're right, Janine; Free Press stands for a free press. And we imagine a free press that frees people from oppression. We imagine a free press that holds the powerful accountable. So unlike calls for government to censor speech, the Stop Hate for Profit campaign is seeking for advertisers to vote with their feet. It’s seeking to hold up the really vast amount of hate, bigotry and disinformation that is happening on Facebook’s platform.

Facebook has known about this problem. Our organizations have been in dialogue with Facebook for some time. We've been calling on them to institute comprehensive change, to keep people safe on the platform, because we understand that when hate speech and disinformation flow on Facebook, it puts people's lives at risk in real life, and it also makes it harder for people from historically oppressed groups to speak out, because when we do speak out, we face an onslaught of hate and harassment.

So what the campaign is calling for is for all major advertisers on a global scale to drop their advertising on Facebook for the month of July. And we're now up to over 700 advertisers ([link removed]) that have agreed to drop from Facebook, including Honda, Ford, Unilever, Coke and other major brands that have essentially called on Facebook to meet our requests. And the interesting thing here is that the companies came along really easily, because it's not good for their brands to be associated with the types of hate and disinformation that are running rampant on the platform.

JJ: It isn't that Facebook just allows extremist or toxic content. There's something, isn't there, in the business model that encourages polarization?

JG: You're absolutely right. Ninety-nine percent of Facebook's business model is advertising ([link removed]) . And we are the product on Facebook: Facebook is selling access to us, consumers, individuals that use the platform. That's what they're selling to their advertisers.

So how do they make the most money? By keeping us, their product, on the platform as much as possible. And we know that hate, harassment and wild disinformation are the types of content that garner high attention and high engagement, and keep us on the platform. Even when we don't agree with those things and are, in fact, fighting back against hate and disinformation, it's still generating time on the platform, engagement on the platform, and that is how they make their money.

So, yes, this is built right into their business model. And until now, nobody's really been talking about that. Or we've been talking about it, but it hasn't received the widespread attention that it's receiving in this moment.
WSJ: Facebook Executives Shut Down Efforts to Make the Site Less Divisive

Wall Street Journal (5/26/20 ([link removed]) )

JJ: The Wall Street Journal, as some listeners may know, reported ([link removed]) on an internal Facebook report that executives got in 2018, which found that the company was well aware that its recommendation engine stoked divisiveness and polarization. But they ignored those findings, because they thought any changes would disproportionately affect conservatives, which is just, I think, mind-blowing. So this is not a problem that they don't know about. And the Journal also cites a separate report ([link removed]) from 2016, which said that 64% of people who joined an extremist group on Facebook did so only because the company's algorithm recommended it to them. So this is, as you're saying, not passive.

JG: Right. It's absolutely not. This is intentional. They've known these things. This reminds me of how the tobacco industry hid information about the damaging health effects of cigarettes, back in the day. This is Facebook hiding information about the toxic effects of their own platform. And it's really shameful, frankly, that it's taken this much to get the attention on to what Facebook has been up to.

JJ: It's not passive, but it's also not equal opportunity. It tends to go in one direction, right?

JG: No, and this whole conservative bias red herring that gets thrown out there as a reason not to do anything ought to be really offensive to conservatives. Last time I checked, they haven't said that conservatism and antiracism are opposites. I think this is a nonpartisan issue, or at least it should be. We all have an interest, regardless of political party, race, religion and whatnot, in ending racism in our society, and to use this red herring as a reason not to is really immoral.
Forbes: Black Employees Allege Racial Discrimination At Facebook In New Legal Complaint

Forbes (7/2/20 ([link removed]) )

JJ: It seems relevant that a group of Black workers at Facebook just filed ([link removed]) a class action with the EEOC, alleging that Facebook discriminates against Black workers and applicants in hiring, evaluations, promotions and pay. Black people are just 3.8% of Facebook's workforce ([link removed]) and 1.5% of its tech workers, and that hasn't increased ([link removed]) , even as the company's gone from 9,000 workers to nearly 45,000 ([link removed]) . One wonders how that company culture has bearing on their decision-making about when something is racist.

JG: Oh, absolutely. And I'm not surprised at all that workers are facing discrimination inside of Facebook, because the product itself is discriminatory. There's discriminatory algorithms ([link removed]) at play, and there's a business model that is essentially hate profiteering. So this isn't much different than things I've thought about in the past with hate radio, for instance: some of these really hateful pundits are often on iHeartRadio ([link removed]) , a company you hear a lot of complaints about hate and harassment within. This is a pervasive cultural issue at companies that trade in hate.

JJ: This June 28 Washington Post piece ([link removed]) charts how Facebook shifted its policies to accommodate Trump. The engineer who quit in disgust, David Thiel, is quoted saying, “The value of being in favor with people in power outweighs almost every other concern for Facebook.” For Trump, that's meant that everything he says is newsworthy just because he said it, no matter how false or racist or inflammatory, and that carveout for politicians is galling to people, but it's not, of course, the only problem. But that does seem to be a serious thing, to simply say that because someone's a politician, they can say whatever they want.

JG: Right. This really speaks to the question of, “What are we talking about when we talk about a free press?” When I think of a free press, I think of the Fourth Estate, one that holds the powerful accountable. And [Zuckerberg has] done just the opposite. There's a set of content moderation rules ([link removed]) that users have to follow, but that the president [and] other powerful leaders don't. That's an incredibly big problem. The free press is supposed to hold power accountable; it's not supposed to give them a free ride.

And, frankly, it shows an appalling lack of awareness about the moment we're in, the cultural moment we're in, where we are reckoning with racism across the government, in our society, in our businesses, and in our own organizations and minds. All of us need to be thinking about anti-blackness in particular. And it shows that he's really not thinking about that, or if he is, he's made a calculated decision to put profit over morals.

JJ: Let's talk about some of the recommendations or next steps that the campaign has put forward. What would you like to see happen? What are some of the elements?

JG: We have a number of recommendations that are on our website, StopHateForProfit.org ([link removed]) , but I'll highlight a few of them. Facebook needs a permanent civil rights infrastructure and accountability system inside the company. They need to comply with regular third-party audits that track how they are doing in complying with the civil rights infrastructure that needs to be built, and they need to overhaul their content moderation system.

The Change the Terms coalition ([link removed]) , made up of over 55 civil rights and racial justice organizations, has put forth a comprehensive set of model policies ([link removed]) aimed at Facebook and other social media companies. And we're asking them to ban hateful activities, to ban white supremacists, and to significantly invest in enforcement, in transparency about their content moderation process, and in rights of appeal, so that people of color, religious minorities and others who are protesting racism and hate are not the ones who get taken down, but that it's actually the hate, the proliferation of racism and the recruitment into white supremacist groups that gets taken down. We're calling ([link removed]) for Facebook to ban all state actor bot and troll campaigns that trade in hateful activities.

And so we have a larger set of policy recommendations ([link removed]) on StopHateForProfit.org ([link removed]) , including a call for Facebook to develop a hotline, so that users who are experiencing hate and harassment have somewhere to call to get the problem taken care of, much like you might call your internet service provider or your water company if you are having a problem there.

So those are some of the policy changes that we’re calling for from Facebook.

JJ: At the end of this Washington Post piece ([link removed]) , we see Mark Zuckerberg saying ([link removed]) Facebook is going to start labeling problematic newsworthy content. I read somewhere they're talking about commissioning research on polarization ([link removed]) . Does this look to you like genuine engagement with the problems that you're talking about? And I wonder, you've been working with them for so long, do you think that they have evolved? Or has your way of engaging with them changed over time? And how seriously do you think they're taking this right now?

JG: I think this is more chipping away at the edges and failing to do comprehensive reform. So if they think they're done, they're sorely mistaken. And while I think it's a step in the right direction, we're super tired of steps in the right direction. I don't know whether or not this is sincere; I think not. I think it's a response to all the bad PR that they're experiencing and all the dissent they're feeling, even inside the company. And there are some things that I'm interested in tracking: for instance, they've claimed that they are going to ban hateful activities aimed at people based on immigration status, they've claimed they're not going to allow hate in ads, and they claim they're going to apply the rules to politicians. But I frankly don't believe them, because they've made a lot of promises over the years and failed to enforce them.

JJ: What, finally, comes next? What if they do the same kind of hand-waving that they've done in the past and nothing really changes? Where do we go from there?
Jessica González

Jessica González: "There's a real question over whether Facebook is just too damn powerful, and whether we need further regulatory and legislative interventions to hold this company accountable to the people."

JG: That's a really good question. Right now, we are continuing to organize to move this campaign to the global level. So we will continue to levy advertiser pressure. And, listen, there's a real question over whether Facebook is just too damn powerful, and whether we need further regulatory and legislative interventions to hold this company accountable to the people. And those are not off the table as far as Free Press is concerned. We've already called, at Free Press, for an ad tax on Facebook, taxing 2% of their profit, and reinvesting that money back into quality local and independent news production, to support reporters who are going to have to do the hard work of putting Facebook's hate in context, and correcting the record on the disinformation that runs rampant on their sites.

We've also called for robust reform in the privacy realm ([link removed]) , and we have a piece of model legislation that we are recommending the US Congress adopt, to make sure that Facebook is not violating our privacy rights or our civil rights; that the power over what data Facebook collects about us, and how it then monetizes that data, is in the control of us, the people; that we have more transparency about what they're collecting; and that we have a private right of action when Facebook is violating our rights.

So I think, at a minimum, those need to be seriously considered now, and I think there's probably further interventions that need to happen in Congress. If Facebook refuses to comply with these demands, and perhaps even if they do comply, this really shines a light on just how powerful they are.

JJ: We've been speaking with Jessica González, co-CEO of the group Free Press. They're online at FreePress.net ([link removed]) , and you can learn more about this campaign at StopHateForProfit.org ([link removed]) . Jessica González, thank you so much for joining us today on CounterSpin.

JG: Thank you for having me, Janine.


Read more ([link removed])

© 2020 Fairness & Accuracy in Reporting. All rights reserved.
You are receiving this email because you signed up for email alerts from
Fairness & Accuracy in Reporting

Our mailing address is:
FAIRNESS & ACCURACY IN REPORTING
124 W. 30th Street, Suite 201
New York, NY 10001

FAIR's Website ([link removed])

FAIR counts on your support to do this work — please donate today ([link removed]) .

Follow us on Twitter ([link removed]) | Friend us on Facebook ([link removed])
