On Oct. 7, during the first round of Brazil’s presidential election, Facebook employees noticed something suspicious on the social network.

A story posted on the site incorrectly claimed the election was delayed because of protests. Facebook’s data scientists and operations team scrambled to pull down the misinformation before it went viral.

These employees were working face-to-face in Facebook’s new “war room,” an operation the social media giant launched in September to stamp out fake news, disinformation and fake accounts in real time.

“We know that when it comes to an election really every moment counts,” said Samidh Chakrabarti, Facebook’s director of elections and civic engagement work.

As the clock winds down before the run-off elections in Brazil and the midterm elections in the US, Facebook is trying to prove to both the public and lawmakers it’s more prepared to combat election interference on its network. There’s a lot at stake not only for democracy but for Facebook, which has seen Russians, Iranians and even Americans exploit the social network to spread hoaxes and sow discord.

“You bear this responsibility. You’ve created these platforms. And now they are being misused. And you have to be the ones to do something about it. Or we will,” Sen. Dianne Feinstein, D-Calif., told Facebook, Twitter and Google lawyers last year during a congressional hearing about Russia’s interference in the 2016 US presidential election.

While the social media giant’s executives say the company is moving swiftly, Facebook also has to balance concerns that it’s censoring certain voices. The tech firm, which has tried to avoid being the “arbiter of truth,” has been looking at the behaviour of fake accounts rather than their content to decide whether to pull them down.

And with more than 2 billion monthly active users worldwide, Facebook is relying on both man and machine to police the large amounts of content flowing not only through its site but also through its other apps, such as WhatsApp and Instagram.

Despite the tech firm’s efforts, other fake news stories tied to Brazil’s presidential election still spread on Facebook and WhatsApp, highlighting the difficulties facing the world’s largest social network. Last week, Brazil’s electoral court ordered Facebook to pull down links to 33 fake news stories about vice presidential candidate Manuela D’Ávila.

“Facebook has a very tough challenge to be able to deal with the problem of on the one hand keeping false information out of the hands of voters who might be misled by it, but on the other hand protecting free speech where they’re not seen as taking sides,” said Richard Hasen, a political science and law professor at UC Irvine.

Inside the ‘war room’

From the outside, Facebook’s “war room” looks like a typical conference room on the tech firm’s Menlo Park, California, campus. But on the inside, the open space, flags, clocks, TV screens, posters and blue and white labels next to computer screens signal that it isn’t your average meeting room.

A red, white and blue sign tacked onto the door reads, “war room.”

On Tuesday, Facebook invited reporters for a rare glimpse into the work the company’s doing to thwart election meddling.

Surrounded by TV screens broadcasting news and dashboards displaying real-time data, about 20 Facebook employees work shifts in the company’s war room at any given time. They come from different teams across the tech firm, including data science, engineering, legal and security, and work together to combat election meddling. In total, about 20,000 staffers work on safety and security at Facebook, representing the tech firm’s line of defence against election interference.

Next to the Brazilian and American flags, clocks displaying the times in both countries hang on the wall. Time is of the essence when it comes to the spread of misinformation, and Facebook thinks there’s no substitute for face-to-face interaction.

Chakrabarti said workers in the war room keep track of data such as the volume of foreign political content and user reports of voter suppression. They also monitor other social media sites, including Twitter and Reddit, while keeping in touch with employees in other Facebook offices and with the company’s partners.

If the team detects something out of the ordinary, it’s placed on a “situation board,” and data scientists investigate the problem before passing it on to operations specialists, who determine how to apply Facebook’s rules, including those against hate speech, false news and spam.

Experts said it’s too early to tell if Facebook’s efforts to stop election interference have been effective.

It’s also unclear if Facebook’s war room will remain a permanent fixture at the tech firm. Since the war room is new, executives said they’re still evaluating how well it’s working.

But even if the physical space gets dissolved, Facebook executives noted that their work to protect democracy will never be over.

“This is really going to be a constant arms race,” said Katie Harbath, director of Facebook’s global politics and government outreach team. “This is our new normal because bad actors are going to keep trying to get more sophisticated in what they’re doing.”