• Facebook is once again presenting problems ahead of a presidential election, but this time it’s local officials sounding the alarm.
  • Interviews with nearly a dozen regional and statewide officials with election-related duties underscore the challenges Facebook poses.
  • Layoffs in trust and safety and customer service along with Facebook deprioritizing news have made it difficult to dispel misinformation, officials said.

Derek Bowens has never had such an important job. He’s the director of elections in Durham County, North Carolina, one of the most populous areas of a state that’s increasingly viewed as crucial to the 2024 presidential contest.

So when a former precinct official emailed Bowens in July to warn him of a post containing voting misinformation that was spreading virally on Facebook, Bowens quickly recognized that he might be facing a crisis.

The post, written as if from an authority on the subject, said voters should request new ballots if a poll worker, or anyone else, writes anything on their form, because it would be invalidated. The same incorrect message spread on Facebook during the 2020 election, but the platform flagged the content at the time as “false information” and linked to a story by Facebook’s fact-checking partner, USA Today, that debunked the rumor.

Bowens said no such tag appeared on the post, which was widespread enough that the North Carolina State Board of Elections had to issue a press release on Aug. 2, informing voters that false “posts have been circulating for years and have resurfaced recently in many N.C. counties.”

“It was spreading and there wasn’t anything happening to stop it until our state put out a press release and we started engaging with our constituency on it,” Bowens told CNBC in an interview.

The elections board wrote a post on Facebook, telling voters to “steer clear of false and misleading information about elections,” with a link to its website. As of Wednesday, the post had eight comments and 50 shares. Meanwhile, multiple Facebook users in states like North Carolina, Mississippi and New Jersey continue to share the ballot misinformation without any notification that it’s false.

CNBC flagged posts with the false information to Meta. A company spokesperson said, “Meta has sent them to third-party fact-checkers for further review.”

Across the U.S., with 40 days until the Nov. 5 election, state and local officials say they are puzzled by what to expect from Facebook. As in the past two presidential election cycles, the spread of misinformation on the social network has threatened to disrupt voting in what’s expected to be another razor-thin contest decided by thousands of voters in a handful of states. Recently, a Facebook post containing a false claim about Haitian immigrants eating pets in Springfield, Ohio, ballooned out of control and gained resonance after it was repeated by Republican nominee Donald Trump in a debate.

In 2016, Facebook was hammered by Russian operatives who pushed out false posts about Hillary Clinton to bolster Trump. In 2020, the site hosted rampant misinformation about politically charged issues like Covid treatments, masking and voter fraud.

The big difference this go-round is that Facebook has largely removed itself from the equation. In 2021, Meta began pushing political and civic content lower in its algorithms, which contributed to a dramatic decline in news traffic last year for publishers. Earlier this year, Meta announced that it would deprioritize the recommendation of political content on Instagram and its Twitter-like Threads service, a move the company said better aligns with what consumers want to see in their feeds.

Still, posts with false information can spread rapidly across wide swaths of users, along with comments that amplify the misinformation, and government agencies have little ability to counteract them because the agencies’ own accounts have such limited reach on the platform.