Facebook pushes back on whistleblower claims it's endangering users

STEVE INSKEEP, HOST:

How does Facebook defend itself against revelations from inside the company? A former employee is testifying before a Senate committee today. Frances Haugen left the company with documents in hand. We have reported how those documents reflected the concerns of Facebook employees. They questioned whether their company was doing enough to stop the spread of extremism or misinformation, among other things. Facebook is an NPR sponsor; we cover it like any other company. We begin our coverage this morning with Monika Bickert, the company's vice president of content policy.

MONIKA BICKERT: The documents that were taken by this employee and the way that they're being portrayed, it just is not an accurate representation of the work that this company does every day to ensure safety on our sites.

INSKEEP: Let me ask about election information or misinformation - political conversations. Frances Haugen spoke to this on "60 Minutes," and let's listen to a little bit of what she had to say.

(SOUNDBITE OF TV SHOW, "60 MINUTES")

FRANCES HAUGEN: But its own research is showing that content that is hateful, that is divisive, that is polarizing - it's easier to inspire people to anger than it is to other emotions.

INSKEEP: Monika Bickert, do you face a fundamental business problem, or moral problem, really? Facebook is built on encouraging more interactions, and it's easier to encourage interactions when you give people content that makes them angry.

BICKERT: Our business interest is in making sure that people have a good experience so that they want to stay with these sites and use them over the long term. And in fact, when we changed our News Feed algorithm in January of 2018, we expected that engagement would go down, and we knew we would take a hit, and that all happened. And that's because what we were doing was promoting meaningful social interactions. That means content from family and friends that is likely to help people have conversations about things that are important to them rather than prioritizing public content.

INSKEEP: I want to grant some of the efforts that you've just highlighted there, but we have some of the quotes now from some of these internal memos suggesting that something else was happening after that change of the algorithm in 2018. The Wall Street Journal quotes an internal memo saying, "misinformation, toxicity and violent content are inordinately prevalent among reshares." We have, in 2019, a political party in Poland saying that it discovered, after the change in the algorithm, that it had to share more negative content in order to get engagement. In April 2020, there was an internal memo about a presentation to Mark Zuckerberg about proposals to address this problem, and Zuckerberg seems to resist this because it would damage meaningful social interactions. In other words, he is concerned about cutting down traffic on the site. What is going on there?

BICKERT: Well, let me also remind you that anecdotes about, you know, people being concerned about how their posts are getting engagement or not getting engagement - those are certainly things to look into. That's not the same thing as a deliberate move to prioritize, you know, engagement bait. In fact, we do the opposite. And if you have any doubt about whether or not we would prioritize engagement or profits over safety, look at what we did with political ads around the 2020 election. We paused all ads on political or social issues before the election, even though we got criticism about it, because, of course, people wanted to be able to advertise on those issues. We kept that pause in place through March. So we were directly sacrificing revenue and getting complaints about it. We were doing that because we wanted to err on the side of safety.

INSKEEP: I would grant you that people complained about your ad policy, but this April 2020 memo appears to show Mark Zuckerberg himself weighing an additional safety measure and saying, I don't want to do this if it costs us too many interactions.

BICKERT: Meaningful social interactions was about prioritizing family and friends' conversations, so those interactions are interactions that the research tells us are good for well-being.

INSKEEP: Frances Haugen argues ultimately that Facebook is making dangerous trade-offs between speech and your business and safety. Would you agree that that is the trade-off you have to make in order to stay in business in a free society?

BICKERT: Even the nature of social media itself requires balancing issues like freedom of expression, an important value, with other values like privacy, like safety. Those are all issues that we take very seriously and, frankly, that we think shouldn't be ours to decide alone. And that's why, for more than 2 1/2 years, we have been asking governments to play a role by passing regulation. And we're happy to be a part of that effort, but we think that it needs to happen.

INSKEEP: Your company, Facebook, is without a doubt the largest medium for public discourse in the world. Facebook is not the whole public discourse. It's not the only thing influencing public discourse. But you're huge, and you have been huge for more than 15 years. Would you argue that public discourse is better now than it was 15 years ago?

BICKERT: It depends on what you mean by public discourse, Steve. I mean, on a number of issues, I think the ability of people to reach out online - you know, we see this especially with people who are dealing with health issues or other wellness issues - to be able to get support can be just so meaningful. On political issues, polarization in the United States has been on a steady increase since the late '70s or early '80s, and that's decades before social media, so very important to keep that in mind. The other thing to keep in mind is that, in other Western democracies that have social media and Facebook usage rates as high as the United States, we see polarization declining. So as tempting as it may be to want to blame social media or Facebook for issues like polarization, I think we as a society have to look deeper if we want to solve those issues.

INSKEEP: Do you worry that, on some level, for some people, Facebook may be like a bartender who says, you're drinking too much, have another drink?

BICKERT: I actually think we do a very good job of trying to make the experience good for people but also being mindful of the wellness concerns. We are being thoughtful about these issues and doing the best we can to give people a healthy and positive experience, and we'll keep doing that.

INSKEEP: Monika Bickert, thanks for taking the questions.

BICKERT: Thanks very much, Steve.

Transcript provided by NPR, Copyright NPR.