
New studies: Facebook doesn't make people more partisan

Social media platforms have become targets of both the right, which accuses them of squelching conservative views, and the left, which sees them as vehicles for right-wing misinformation.

The studies released Thursday tried to tease out the influence of particular factors, such as Facebook’s algorithm for serving up content to users. Two studies published in the journal Science that examined the effects of Facebook’s algorithm and reshare feature during the fall of 2020 found that both features increased user engagement — but neither affected people’s existing political attitudes or polarization.

A separate study published in the journal Nature found that reducing users’ exposure to sources that echo their existing beliefs didn’t affect their political attitudes either.

Meta trumpeted the results in a memo circulated ahead of the studies’ release: “Despite the common assertions that social media is ‘destroying democracy,’” the company wrote, “the evidence in these and many other studies shows something very different.”

Social media critics — many of whom have spent years sounding the alarm about the ways it has changed American politics — suggested the studies were too limited, and too close to Meta itself, to be persuasive.

Frances Haugen, the former Facebook product manager who leaked internal company files in 2021, called the studies “badly designed” and suggested the three-month study period may be too short to capture changes in users’ views.

Jesse Lehrich, co-founder of Accountable Tech, an advocacy group focused on information controls for social media, said it was “a little bit absurd” to draw conclusions from studies that altered a single facet of a user’s social media experience over a three-month period.

Lehrich, who worked on the Hillary Clinton campaign’s response to Russian meddling in 2016, credited Meta for cooperating with outside researchers, but pointed out that social media companies and their critics both tout studies that support their preferred conclusions.

“Maybe my worst fears are super overblown,” said Lehrich. “But to take a couple of these data points and conclude that anyone worried about their impact on the world is disproven by research is pretty disingenuous.”

A fourth study, also published in Science, found that a cluster of news sources consumed by conservatives produced most of the misinformation flagged by the platform’s third-party fact-checking system. (A study co-author, Sandra González-Bailón of the University of Pennsylvania, declined to provide a list of those sources.)

The studies were the result of a collaboration between Meta and 17 outside researchers from universities including Northeastern, Stanford and Princeton.

An independent rapporteur tasked with evaluating the collaboration vouched for the soundness of its results, but said its framework gave Meta influence over the ways in which outside researchers evaluated its platforms.

“Meta set the agenda in ways that affected the overall independence of the researchers,” wrote Michael Wagner, a professor at the University of Wisconsin-Madison’s School of Journalism and Mass Communication.

He called for governments to mandate that social media companies share data so that researchers are not dependent on platforms’ cooperation in order to study their practices.

The role of social media in American politics has been a matter of intense debate since 2016, when critics of former President Donald Trump blamed “fake news” stories and a covert Russian propaganda campaign for his upset presidential victory.

It remains unclear what effect, if any, Russia’s social media campaign had on the outcome of that election. Some studies have found correlations between the influence campaign and metrics like polling and election betting markets. But a study published in January in Nature Communications found no discernible effect on voters that year, and other studies have failed to substantiate claims that Russian internet meddling was a meaningful factor in the race’s outcome.

In the years since Trump’s election, liberals and the establishment-minded have generally decried the free-wheeling information environment on social media, arguing that it is a breeding ground for dangerous disinformation and extremism. Populists and conservatives have resisted efforts to rein in the online information ecosystem, arguing that such efforts give liberal-leaning institutions cover to censor politically inconvenient facts and opinions.

By undermining arguments that blame social media for polarization while affirming that conservative-linked sources produce the lion’s share of flagged misinformation, this new batch of studies offers ammunition to both camps, and is unlikely to put these arguments to rest.

Billionaire real estate developer and philanthropist Frank McCourt, a Meta critic who is working on alternative social media models, said that the studies do not address the most fundamental civic issues created by concentrating power over information flow in the hands of for-profit businesses.

“You get what you optimize for,” said McCourt, “and social media platforms are not optimizing for a healthy society.”

Katie Harbath, who served as Facebook’s public policy director during the 2020 campaign, said that “more research is needed,” and that ongoing updates to Facebook’s algorithm mean that research from 2020 may already be out of date.

“Algorithms are always changing, and so while this is a very helpful snapshot, it is just that — a snapshot,” wrote Harbath, who is now a senior advisor for technology and democracy at the International Republican Institute, in an email. “This is why transparency is important.”

Correction: An earlier version of this article misstated Katie Harbath’s title.

Rebecca Kern contributed to this report.
