Beneath an American flag, 20 people packed tight into a beige conference room are Facebook's, and so too the Internet's, first line of defense for democracy. This is Facebook's election security war room. Screens visualize influxes of foreign political content and voter suppression attempts as high-ranking team members from across divisions at Facebook, Instagram, and WhatsApp coordinate rapid responses. The hope is that through face-to-face, real-time collaboration in the war room, Facebook can speed up decision-making to minimize how misinformation influences how people vote.
In this video, TechCrunch takes you inside the war room at Facebook’s Menlo Park headquarters. Bustling with action beneath the glow of the threat dashboards, you see what should have existed two years ago. During the U.S. presidential election, Russian government trolls and profit-driven fake news outlets polluted the social network with polarizing propaganda. Now Facebook hopes to avoid a repeat in the upcoming US midterms as well as elections across the globe. And to win the hearts, minds, and trust of the public, it’s being more transparent about its strategy.
“It’s not something you can solve at scale with just humans. And it’s not something you can solve with just technology either,” says Facebook’s head of cybersecurity Nathaniel Gleicher. “I think artificial intelligence is a critical component of a solution, and humans are a critical component of a solution.” The two approaches combine in the war room.
Who’s In The War Room And How They Fight Back
- Engineers – Facebook’s coders develop the dashboards that monitor political content, hate speech, user reports of potential false news, voter suppression content, and more. They build in alarms that warn the team of anomalies and spikes in the data, triggering investigation by…
- Data Scientists – Once a threat is detected and visualized on the threat boards, these team members dig into who’s behind an attack, and the web of accounts executing the misinformation campaign.
- Operations Specialists – They determine if and how the attacks violate Facebook’s community standards. If a violation is confirmed, they take down the appropriate accounts and content wherever they appear on the network.
- Threat Intelligence Researchers and Investigators – These cybersecurity professionals have deep experience deciphering the sophisticated tactics used by Facebook’s most powerful adversaries, including state actors. They also help Facebook run war games and drills to practice defense against last-minute election day attacks.
- Instagram and WhatsApp Leaders – Facebook’s acquisitions must also be protected, so representatives from those teams join the war room to coordinate monitoring and takedowns across the company’s family of apps. Together with Facebook’s high-ups, they dispense info about election protection to Facebook’s 20,000 security staffers.
- Local Experts – Facebook now starts working to defend an election 1.5 to 2 years ahead of time. To provide maximum context for decisions, local experts from countries with upcoming elections join to bring knowledge of cultural norms and idiosyncrasies.
- Policy Makers – To keep Facebook’s rules about what’s allowed up to date and bar the latest election interference tactics, legal and policy team members join to turn responses into process.
Beyond fellow Facebook employees, the team works with external government, security, and tech industry partners. Facebook routinely cooperates with other social networks to pass along information and synchronize takedowns. Facebook may have to get used to this. Following the midterms, it will evaluate whether it needs to operate a war room permanently. But after being caught by surprise in 2016, Facebook accepts that it can never turn a blind eye again.
Facebook’s director of global politics and government outreach Katie Harbath concludes: “This is our new normal.”