Alex Stamos, who recently stepped down as Facebook's Chief Security Officer, published an op-ed in The Washington Post today. In it, Stamos says that Facebook could have responded earlier to Russian interference on its platform. He also argues that the issue is much larger than Facebook: Congress needs to update its laws regarding political advertisers, and social media users need to “adjust to a media environment in which several dozen gatekeepers no longer control what is newsworthy.”
Stamos opened by confirming one detail from the recent New York Times report on how slowly the social media giant dealt with Russia-linked activity: COO Sheryl Sandberg yelled at him after he told Facebook’s board of directors that his team was still working to uncover the extent of the misinformation on the platform, and she later apologized for it.
Stamos went on to say that Facebook and other tech companies made a number of mistakes in 2016, and that they “were so enamored with the utility of our own products” that it was difficult for them to see how their tools were being misused. He specifically called out Facebook, saying it spent too long hoping to minimize the issue and that it “should have responded to these threats much earlier and handled disclosure in a more transparent manner.”
Stamos noted that there was plenty of blame to go around: the Russian cyberwarfare issues Facebook faced were the same ones that had stymied the US intelligence community, and the US government did little to help the companies respond in the aftermath. He also said that major media outlets helped disseminate online disinformation campaigns by reporting on stolen information, amplifying the misinformation, and added that tech companies weren’t well equipped to understand geopolitical threats.
In the end, he listed a handful of things the US could do. “Congress needs to codify standards around political advertising”: the existing laws are decades out of date and don’t cover the types of platforms that exist today, and new laws are needed to limit the ability to “micro-target[] tiny segments of the population with divisive political narratives.” Companies such as Facebook, Google, and Twitter should be part of that effort, “instead of quietly opposing it.” Companies should also be given more guidance on how to work alongside government agencies. And while reporters are now more aware of misinformation, media outlets need to figure out how best to cover material like leaked data without aiding the bad actors who perpetrate the leaks in the first place.
Ultimately, foreign tampering succeeded only because of its targets’ unwitting willingness to participate. We no longer live in a world where a majority of the US population gets its news from three dominant television networks. Because the media landscape now contains countless outlets, “the last line of defense will always be citizens who are willing to question what they see and hear, even when it means questioning our own beliefs.”
How Facebook responds going forward will be essential. Earlier this week, CEO Mark Zuckerberg said that the company would change how it delivers content to its users, deemphasizing sensational content and misinformation in hopes of discouraging people from posting it in the first place.
Stamos is doing his part. Yesterday, he announced the launch of the Stanford Internet Observatory, a center intended to help Silicon Valley and Washington, DC address these very issues.