Exclusive: Elon Musk’s X restructuring curtails disinformation research, spurs legal fears

Nov 6 (Reuters) – Social media researchers have canceled, suspended or changed more than 100 studies about X, formerly Twitter, as a result of actions taken by Elon Musk that limit access to the social media platform, nearly a dozen interviews and a survey of planned projects show.

Musk’s restrictions on critical methods of gathering data on the global platform have hampered researchers’ ability to untangle the origin and spread of false information during real-time events such as Hamas’ attack on Israel and the Israeli airstrikes in Gaza, researchers told Reuters.

The most important method was a tool that gave researchers free access to data on about 10 million tweets per month. Twitter notified researchers in February it would end free academic access to this application programming interface (API) as part of an overhaul of the tool, according to an email seen by Reuters.

The survey of 167 academic and civil society researchers conducted at Reuters’ request by the Coalition for Independent Technology Research in September quantifies for the first time the number of studies that have been canceled due to Musk’s policies.

It also shows a majority of survey respondents fear being sued by X over their findings or use of data. The worry follows X’s July lawsuit against the Center for Countering Digital Hate (CCDH) after it published critical reports about the platform’s content moderation.

Musk did not respond to a request for comment and an X representative declined to comment. The company has previously said that nearly all content views are of “healthy” posts.

Musk’s first year of ownership of X has been marked by advertisers fleeing the site, concerned that their ads could appear next to harmful content. X’s U.S. ad revenue has declined at least 55% year-over-year each month since Musk’s acquisition, Reuters previously reported.

The survey showed 30 canceled projects, 47 stalled projects and 27 where researchers changed platforms. It also revealed 47 ongoing projects, though some researchers noted that their ability to collect fresh data would be limited.

The affected studies include research on hate speech and topics that have garnered global regulatory scrutiny. In one example, a stalled project sought to study child safety on X. The platform was recently fined by an Australian regulator for failing to cooperate with a probe into its anti-child-abuse practices.

The researcher behind the stalled project, like several others who responded to the Coalition’s survey, requested anonymity. An author of the survey said researchers may have done so to avoid backlash from X or to protect ongoing studies.

European Union regulators are also currently investigating X’s handling of disinformation, which was the focus of multiple stalled or canceled independent research studies, the survey found.

The reduced ability to study the platform “makes users on (X) vulnerable to more hate speech, more misinformation and more disinformation,” said Josephine Lukito, an assistant professor at the University of Texas at Austin.

She helped conduct the research survey for the coalition, a global group of more than 300 members that works to advance the study of technology’s impact on society.

The survey was sent in mid-September by email to the coalition’s members as well as email lists for other academic groups, such as experts focused on political communication or social media.

The EU’s investigation of X, under strict new internet rules that took effect in August, underscores the potential regulatory threat to the San Francisco-based company. Any violation could result in fines of up to 6% of global revenue.

An EU Commission spokesperson said it is currently monitoring compliance by X, as well as other large platforms, with the law’s obligations, which include allowing researchers who meet certain conditions to gain access to publicly available data.


UNAFFORDABLE COST

Before Musk bought Twitter for $44 billion, a large proportion of social media studies focused on Twitter because the platform was a valuable source of information about politics and current events and its data was easily accessible, four researchers told Reuters.

But almost from the moment Musk stepped into Twitter’s headquarters, he began slashing costs and laying off thousands of employees, including those who worked on the research tools.

The ‘X’ logo is seen on top of the headquarters of the messaging platform X, formerly known as Twitter, in downtown San Francisco, California, U.S., July 30, 2023. REUTERS/Carlos Barria/File Photo

Now, X offers three paid tiers of the API ranging from $100 to $42,000 per month, and the lower-priced tiers provide less data than researchers previously received for free. Nearly every researcher who spoke with Reuters said they could not afford the costs.

One former employee, who declined to be named for fear of backlash from Musk, said the decision to shut down free academic API access came down to an urgent need to focus on boosting revenue and cutting costs in the aftermath of Musk’s takeover.

A majority of survey respondents cited the API changes as the reason for canceling or pausing their studies about the platform.

Paying more to receive less data than was previously available means that research ahead of 2024, a major election year globally, is severely challenged, Lukito said.

Tim Weninger, a professor of engineering at the University of Notre Dame, said his team has been “flying blind” while trying to track China-linked information operations because the cost of the API is prohibitive.

Several researchers told Reuters they now have limited options to study X, such as manually analyzing posts.

Researchers also face limitations in gathering data from other social platforms. Short-form video app TikTok announced an academic research API earlier this year, but its onerous terms and conditions limit its usefulness for researchers, said Megan A. Brown, a doctoral student at the University of Michigan, in a blog post she wrote for Tech Policy Press while a researcher at New York University.

Meta Platforms, the owner of Facebook and Instagram, has partnered with external researchers on studies. Such partnerships are not a substitute for independent research, but they show Meta’s willingness to collaborate, Lukito said.

LEGAL CONCERNS

The CCDH, an organization that says it aims to fight hate speech and disinformation, published several reports after Musk’s acquisition claiming the social media platform failed to moderate harmful content and also profited from it.

X sued CCDH in July, accusing the organization of improperly accessing data from the platform and promoting false claims about X’s moderation.

“Musk wants to silence any criticism of the way he does business,” said CCDH Chief Executive Imran Ahmed, adding CCDH stood by its reports.

In the survey from the Coalition for Independent Technology Research, 104 out of 167 respondents cited the possibility of legal action due to either their use of data or their research findings as a concern about their projects.

“The move against the CCDH communicates to researchers looking at misinformation and hate speech on online platforms that there is intrinsic liability in publicly disseminating findings,” said Bond Benton, an associate professor at Montclair State University, which produced a study last year that found hate speech increased on Twitter in the hours after Musk’s takeover.

One researcher, who declined to be named, was studying how the subject of rape is discussed on X and told the survey they were worried about legal risk and the scientific validity of data collected without access to the API. The researcher said they moved the study to examine a different social media platform.

Musk and X CEO Linda Yaccarino have articulated a new policy called “freedom of speech, not reach” that restricts the distribution of some posts but refrains from deleting them from the platform.

X has said 99% of content that users see on the platform is “healthy,” which the company attributed in July to estimates from Sprinklr, a software company that helps brands monitor market trends and customer sentiment online.

A spokesperson for Sprinklr, which is listed as an official partner of Twitter, declined to confirm the figures cited in the July post when Reuters requested comment, saying that “any recent external reporting prepared by Twitter/X has been done without Sprinklr’s involvement.”

The spokesperson pointed to a March blog post that said toxic posts on X received three times fewer views than non-toxic posts.

Reporting by Sheila Dang in Austin; additional reporting by Zeba Siddiqui in San Francisco, Martin Coulter in London and Supantha Mukherjee in Stockholm; editing by Kenneth Li and Anna Driver
