25 October 2019 — TRNN
Mark Zuckerberg’s Capitol Hill testimony shows just how dangerous Facebook has become to democracy, and underscores the need for public debate about how to deal with the company.
This is a rush transcript and may contain errors. It will be updated.
Marc Steiner: Welcome to The Real News Network. I’m Marc Steiner. Good to have you all with us today.
Now, Facebook founder Mark Zuckerberg has been grilled by the House Financial Services Committee pretty intensely.
AOC: Under your policy, using census data as well, could I pay to target predominantly black zip codes and advertise them the incorrect election date?
M. Zuckerberg: When we roll out the census suppression policy, we will take that content down.
AOC: So you will … There is some threshold where you will fact-check political advertisements. Is that what you’re telling me?
M. Zuckerberg: Well, Congresswoman, yes, for specific things like that where there’s imminent risk of harm, but also-
AOC: Could I run ads targeting Republicans in primaries saying that they voted for the Green New Deal?
M. Zuckerberg: Sorry, can you repeat that?
AOC: Would I be able to run advertisements on Facebook targeting Republicans in primaries saying that they voted for the Green New Deal? I mean, if you’re not fact-checking political advertisements, I’m just trying to understand the bounds here. What’s fair game?
M. Zuckerberg: Congresswoman, I don’t know the answer to that off the top of my head. I think probably?
AOC: So you don’t know if I’ll be able to do that.
M. Zuckerberg: I think probably.
AOC: One more question. In your ongoing dinner parties with far-right figures, some of whom advanced the conspiracy theory that white supremacy is a hoax, did you discuss so-called social media bias against conservatives? And do you believe there is a bias?
M. Zuckerberg: Congresswoman, sorry, I don’t remember everything that was in the question [crosstalk 00:01:31]-
AOC: That’s all right. I’ll move on.
Marc Steiner: That, of course, was Congresswoman AOC, pressing him with some really interesting questions. She’s very tenacious, it’s clear. We’re not talking about that today, though. The questions from the committee ranged from allowing and limiting what Facebook defines as hate speech, to data privacy and the sale of personal information, to the lack of diversity at Facebook itself and what some argue is discrimination against people of color, another issue that needs to be explored, to why they allow lies to be told by political figures yet monitor others. And now they’re working with the Murdoch world, in the news world, to create a new newsfeed. What will that mean?
Well, we’re about to talk with Timothy Karr, who is a senior director at Free Press, which has been covering this story intensely. Timothy, welcome. Good to have you with us here on The Real News.
Timothy Karr: Happy to be with you.
Marc Steiner: So let me begin this way. Let me begin with a quote from the Senior Policy Counsel at the Free Press, Gaurav Laroia. So he wrote: “Facebook’s newsworthy exemption and ad policies are broken if the company is allowing its platform to be the vector for misinformation in the lead-up to the 2020 election. The company has learned nothing from 2016, when it allowed malicious foreign actors to use the platform to influence the U.S. election. By profiting off politicians selling false statements to the public, Facebook is complicit in the erosion of our civic health, discourse, and democracy. The company should show some courage and stand up for the truth – at least in its advertising policies.”
So let’s unpack that.
Timothy Karr: Sure.
Marc Steiner: Because in the work you all have been doing, you really hit this hard in your reporting. So let’s talk a bit about what all that means.
Timothy Karr: Sure. Well, I think central to that concern is this idea, and we’ve seen it all week, that Facebook, by allowing politicians to lie without recourse, is in some way a champion of free expression. Mark Zuckerberg gave a very lengthy speech at Georgetown, and he’s been making the rounds in Washington, D.C., repeating a lot of these themes. And one of the things he does say is that what’s most important for Facebook is that it gives everyone a voice.
But consider that this voice is not equal. It’s sort of like the George Orwell line that “some animals are more equal than others.” In this case, politicians are given free rein to lie, to say things that are in some cases dangerous and dishonest, while Facebook’s own community standards don’t allow its regular users to tell the same lies. So clearly, this isn’t about the principles of free expression as much as it is about the politics of Washington.
And there’s an interesting backdrop to all of this: while Mark Zuckerberg is on this free-expression tour, a lot of politicians on both the left and the right are talking about antitrust. They’re talking about taking measures to break up Facebook. So my interpretation of giving free rein to Donald Trump, in this instance to lie in political ads, is that it’s really about currying favor with certain politicians so that they don’t pursue the other option, which is, out of anger, to push for more antitrust.
And Zuckerberg himself said a number of weeks ago that antitrust action posed an existential threat to the organization. So while they’re making a public show of championing free expression, I think, in the back rooms and corridors of these meetings, he’s really most concerned about antitrust action.
Marc Steiner: So let me talk a bit about this. I want to lead up to antitrust, because I think it’s a fascinating topic that has not really been explored in depth, in terms of what it really means for the 21st century and what it would mean for this new industry that dominates our economy, our country, and, actually, the world.
But one of the things you alluded to here is that places like Facebook have become the commons, the place where people have dialogue and express their opinions, and it’s controlled by one company that can easily say no to a person whose speech it defines as hate speech, or yes to content because it brings in money, as it did during the 2016 political campaign and apparently is doing again in 2020. That, to me, seems the clearest danger, especially for those of us who work in the press and try to build ways to have free expression in this country. I think therein lies a huge danger.
Timothy Karr: Yeah. The problem is that Facebook and a lot of the other online platforms don’t like to think of themselves as publishers. They don’t want to have any liability for the third-party content that goes across their network. And so they’re in this situation where, on the one hand, they’re working with news organizations and trying to make sure that content on their site isn’t false or misleading, but on the other hand, they don’t want to have anything to do with it. Because they know that when you have more than 2.5 billion users who, according to Zuckerberg’s testimony before Congress, are posting 100 billion pieces of content a day, it is virtually impossible for a network of that scale to effectively monitor and police the content that goes across it.
So they either take this laissez-faire approach, which says, “We should let everything go,” or they attempt to do something about it: in this case, hire 30,000 content moderators and improve their artificial intelligence so it can flag this stuff. Either way, it’s very problematic. On the one hand, you certainly don’t want a social network with that much power to allow any and all speech. There are concerns about child sexual abuse being spread via social media. It’s used to sell drugs. There’s racism. There’s a whole range of bad things happening via social media. So you need some controls against that.
But at the same time, do we really want Facebook and these other social networks deciding what is appropriate content and what is not? As we’ve seen in the case of political ads from Donald Trump that tell lies, they’re deciding that that is appropriate. And there are a lot of questions about that. Representative Ocasio-Cortez brought that up very effectively, asking, “How do you decide what’s appropriate and what’s not? When does it go too far? Where’s your line?” And I don’t think Mark Zuckerberg gave a very good answer to that. In fact, I don’t think he answered it at all.
Marc Steiner: Because maybe he can’t answer it, given the way they do things. I mean, when they made this switch in their rules and talked about having an independent third party do fact-checking, what does that mean? And then admitting they couldn’t do it with … and they’re not doing it the way they should when it comes to putting up political ads. That raises a lot of issues. And the 30,000 moderators monitoring everything else: who are these people? How much are they paid? What are their qualifications?
Timothy Karr: Right.
Marc Steiner: This is not like a newspaper or a public radio station where you have fact-checkers to make sure [inaudible 00:09:04] your people say things that are correct. This is something much deeper and more complex than that, because of the nature of that institution.
Timothy Karr: Yeah. And I think there could be an antitrust argument made there, because when you have 2.5 billion members and, according to Mark Zuckerberg, 100 billion pieces of content being posted and uploaded across your various social networks in a day, you really are too big. You’re too big to govern.
But the other issue here is the economics of Facebook. The economics of a lot of these online platforms are built on this idea that they harvest our data and then target content to us that will most likely elicit a response. Originally, that concept, what some people call surveillance capitalism, was built around gathering information on users so you could appropriately target ads to sell them things. But it’s also being abused, as we learned from the Cambridge Analytica scandal, to target misinformation designed to discourage people from voting and to spread lies about political opponents. And it’s risen to a level where this economic model has become unethical. It is being abused in ways that pose fundamental threats to our functioning democracy. And Facebook is not willing to change that model. You would have to basically rip up the whole organization from the roots.
And so, one of the things we’ve been advocating for at Free Press is to ask, beyond Facebook, what does social media look like in a world without this kind of surveillance capitalism, where people aren’t being treated as data points that can be sold to the highest bidder? Can we build a social network that doesn’t rely on that kind of predatory business model?
Marc Steiner: So that raises, before we run out of time, the whole question of monopolies and antitrust, and what that means in this 21st century, in this kind of industry. As we talked about before we started the conversation, this is not the steel mills and banks of the early 20th century; we’re talking about a much different kind of economic model that has a huge, pervasive effect on society. And when we wrestle with the question of what that means, I think we’ve only begun to scratch the surface. It goes beyond what Senator Warren or others would say: “Simply break them up.” What does that even mean, in this context?
Timothy Karr: Well, I mean, you’re looking at an industry that earns hundreds of billions of dollars every year. And most of it is through this targeted online advertising. And we think that the antidote to misinformation, the antidote to all of the negative impacts of networks like Facebook, is good, hard-hitting, independent journalism.
And one of the proposals that we put forth at Free Press involves taxing online advertising to create what’s called a public interest media endowment. In the United States, for example, a 2% tax on the online advertising industry, which is dominated by Facebook and Google, would generate an annual fund of $2 billion that could then support the kind of local independent journalism that, again, acts as the antidote to the type of misinformation that’s being spread across these networks.
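Marc Steiner: [For readers following along, the arithmetic behind that proposal can be sketched out. The 2% rate and the $2 billion annual fund are the figures Karr cites; the implied size of the U.S. online advertising market is simply the back-calculation from those two numbers.]

```python
# Back-of-the-envelope check of the Free Press proposal figures
# cited in the interview: a 2% tax on U.S. online advertising
# generating an annual fund of roughly $2 billion.
tax_rate = 0.02                   # 2% tax, per the interview
annual_fund = 2_000_000_000       # $2 billion fund, per the interview

# Implied taxable base: the ad revenue needed to raise that fund.
implied_ad_revenue = annual_fund / tax_rate

print(f"Implied U.S. online ad revenue: ${implied_ad_revenue / 1e9:.0f} billion per year")
```

That implied base, about $100 billion a year, is consistent with Karr’s description of an industry earning hundreds of billions of dollars annually and dominated by Facebook and Google.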
Marc Steiner: That’s a really interesting proposal. And I think the next time we have a chance to talk, we should really probe that one in depth and discuss the new models for the 21st century that we have to wrestle with. Because we do have to create something: a) if we’re going to have a democracy, and b) if we’re going to thrive as a society. We have to come up with new ideas that fit the time we’re in and not just hearken back to the past, though we can learn from it.
But it is fascinating stuff. And Timothy Karr, I really appreciate your work, and appreciate what the Free Press does, and appreciate you taking your time with us today. Look forward to more.
Timothy Karr: Thank you.
Marc Steiner: And I’m Marc Steiner here at The Real News Network. Please go online and let us know what you think about the controversy around Facebook and about all the issues around antitrust. We’d love to hear it as we develop this series of conversations. Take care.