In 2018, Lisa Kaplan assembled a small team inside the re-election campaign for Senator Angus King, an independent from Maine. Wary of how Russia had interfered in the 2016 presidential election, the team set out to find and respond to political disinformation online.
The team noticed some false statements shared by voters, and traced the language back to Facebook pages with names like “Boycott The NFL 2018.” It alerted Facebook, and some pages were removed. The people behind the posts, operating from places like Israel and Nigeria, had misled the company about their identity.
Today, Ms. Kaplan said, she knows of no campaigns, including among the 2020 presidential candidates, that have similar teams dedicated to spotting and pushing back on disinformation.
They may “wake up the day after the election and say, ‘Oh, no, the Russians stole another one,’” she said.
Less than a year before the 2020 election, false political information is moving furiously online. Facebook users shared the top 100 false political stories over 2.3 million times in the United States in the first 10 months of this year, according to Avaaz, a global human rights organization.
The examples are numerous: A hoax version of the Green New Deal legislation went viral online. Millions of people saw unsubstantiated rumors about the relationship between Ukraine and the family of former Vice President Joseph R. Biden Jr. A canard about the ties between a Ukrainian oil company and a son of Senator Mitt Romney, the Utah Republican, spread widely, too.
Still, few politicians or their staffs are prepared to quickly notice and combat incorrect stories about them, according to dozens of campaign staff members and researchers who study online disinformation. Several of the researchers said they were surprised by how little outreach they had received from politicians.
Campaigns and political parties say their hands are tied, because big online companies like Facebook and YouTube have few restrictions on what users can say or share, as long as they do not lie about who they are.
But campaigns should not just be throwing their hands up, said some researchers and campaign veterans like Ms. Kaplan, who now runs a start-up that helps fight disinformation. Instead, they said, there should be a concerted effort to counter falsehoods.
“Politicians must play some defense by understanding what information is out there that may be manipulated,” said Joan Donovan, a research director at Harvard University’s Shorenstein Center. Even more important for politicians, she said, is pushing “high-profile and consistent informational campaigns.”
Too many campaigns are now left on their heels, said Simon Rosenberg, who tried to thwart disinformation for the Democratic Congressional Campaign Committee before the 2018 midterm election.
“The idea of counterdisinformation doesn’t really exist as a strategic objective,” he said.
Political groups are not ignoring false information. Bob Lord, the chief security officer of the Democratic National Committee, encourages campaigns to alert his organization when they see it online.
The committee also gives advice on when and how to respond. He said campaigns must decide when the cost of ignoring a falsehood outweighs the risk of drawing additional attention to it by speaking out.
But he said his reach was limited.
“The amount of disinformation that is floating around can cover almost any possible topic,” Mr. Lord said, and his team cannot look into each reported piece. If campaigns need connections to social media companies, he said, “we’re happy to make some.”
In September, President Trump’s re-election campaign released an ad that included an incorrect statement about Mr. Biden’s dealings with Ukraine. The campaign posted the ad on Facebook and the president’s Twitter account. Between the two services, the ad has been viewed more than eight million times.
Mr. Biden’s campaign publicized letters that it had written to Facebook, Twitter, YouTube and Fox News, asking the companies to ban the ad. But it remained up. In mid-November, the Biden campaign released a website called Just the Facts, Folks.
Jamal Brown, a spokesman for the Biden campaign, said it was not the campaign’s responsibility alone to push back on all falsehoods. But, he said, “it is incumbent upon all of us, both public- and private-sector companies, users, and elected officials and leaders, to be more vigilant in the kinds of content we engage and reshare on social media.”
Several months ago, a team at the Democratic Congressional Campaign Committee flagged some ads on Facebook to the office of Representative Ilhan Omar, a Minnesota Democrat. The ads called for an investigation into unfounded accusations that she had violated several laws.
After the committee and Ms. Omar’s campaign contacted Facebook, the company said it would limit the prominence of the ads in people’s feeds. But the ads, which have now reached over one million views, remain active.
Facebook does not remove false news, though it does label some stories as false through a partnership with several fact-checking organizations. It has said politicians like Mr. Trump can run ads that feature their “own claim or statement — even if the substance of that claim has been debunked elsewhere.”
Last month, Twitter announced plans to forbid all political ads. But the company does not screen for false accusations. Twitter said it did not want to set a precedent for deciding what is and is not truthful online.
In an email, Ms. Omar said it was “not enough” to rely on private companies alone.
“We as a nation need to think seriously about ways to address online threats to our safety and our democracy while protecting core values like free speech,” she said.
Academics and researchers reiterated their surprise at how little outreach there had been from campaigns that faced disinformation operations. Many of the researchers can pinpoint when a false idea first appeared online, and how it spread.
Graham Brookie, the director of the Atlantic Council’s Digital Forensic Research Lab, said there needed to be “more ingrained information sharing” among politicians, campaign staff, social media companies, civil society groups and, in some cases, law enforcement to counteract the increasing volume of election disinformation.
But when disinformation is used as a tool in partisan politics, Mr. Brookie said, the discussion becomes “a Rorschach test to reaffirm each audience’s existing beliefs, regardless of the facts.”
“One side will accuse the other, and then disinformation itself is weaponized,” he said.
Chris Pack, communications director of the National Republican Congressional Committee, said the disinformation that his party fought was “perpetuated by a liberal press corps that is still incapable of wrapping their heads around the fact that President Trump won the 2016 election.”
That leaves some in the research community wary of wading in at all, said Renee DiResta, the technical research manager for the Stanford Internet Observatory, which studies disinformation.
“I think this is a concern for a lot of academics who don’t want to work directly with a campaign,” Ms. DiResta said, “because that would be problematic for their neutrality.”
Nick Corasaniti contributed reporting.