Editor’s note: Ahead of her upcoming keynote presentation at ISACA North America Conference: Digital Trust World, which will take place in Boston, Massachusetts, USA, 9-11 May 2023, ISACA Now visited with closing keynote speaker Frances Haugen. Haugen, known as the “Facebook Whistleblower,” is a data engineer who revealed internal documents showing how the social media giant handled misinformation. The following Q&A interview has been edited for length and clarity:
ISACA Now: How did the way you think about misinformation and digital trust change during your time with Facebook?
During my time at Facebook, I think the way my outlook on safety online—particularly safety on social media—changed the most was that I became aware of how important the design and the oversight of digital platforms are. We’ve been told a very simple story by Facebook, by Meta—and they have spent $250 million or more promoting this story—the idea that the only way we can be safe is through content moderation. They should be allowed to do whatever they want with their platforms; they should be able to run free and wild, and they will just pluck out “the bad content” after the fact. The thing I learned at Facebook was that very simple decisions—things like how the share button should work, the process of resharing, how all those systems should work—really make a huge difference in how much misinformation and violence-inciting content spreads through the system.
It’s little things like, should you have to click on a link before you reshare it? Things as simple as that. But something like that can reduce the amount of misinformation in a system by 10-15 percent. It’s really dramatic. And so the key message I want to leave people with is that there is hope—we can design safe systems that we feel good about, but we have to do it intentionally, and we can’t just hope to solve it after the fact.
ISACA Now: Is there more risk for companies than they generally realize when it comes to their brand presences on social media?
When we think about risk, particularly around brands and their presence on social media, there are a couple of different kinds of risk we need to think about. The first is really basic and obvious—as we saw with the rise of the blue checkmarks on Twitter recently, people were coming in, impersonating brands and making statements that sometimes had very, very dramatic impacts on how those companies were perceived and on their stock prices. So, impersonation and things like that matter. And social platforms need to show that they respect the advertisers and brands that operate on their systems, because otherwise, very rapidly, they become forces of chaos and destroy a lot of real-world value.
But I think the larger issue is how the public’s perception of social media is changing. One of the things that will be vitally important in the next 5-10 years is that brands, particularly large advertisers, are going to need to take a proactive stance on how these platforms should operate. Should they have to be transparent? Should they have to open the curtains a little bit so we can see how they’re operating? Brands hold the power of the purse. They can go in there and say, ‘Hey, I’m not going to participate unless I understand what this product is.’ Because we have seen that people respond very dramatically—many have opted out of platforms like Facebook because when they found out the truth about how those platforms operated, it was such a shock. So, I think there are two sides: one is, “I need to push for brand safety,” and the second is making sure that brands are seen not just as complicit in enabling these platforms but as actively advocating for the public good.
ISACA Now: How has your outlook changed the most since the Facebook whistleblowing incident?
When I blew the whistle on Facebook, I had very, very limited expectations. The thing that mattered most to me was that I felt like I was carrying a secret, and that if I didn’t act, many people would be harmed—particularly people in some of the most fragile areas of the world, where Facebook is the Internet and yet they receive very, very few safety protections. The thing that has changed the most for me is seeing progress begin to spread around the world. The European Union, for example, passed a generational law called the Digital Services Act, which says that, for the first time, Europeans at least get to ask questions about how these platforms operate and get real answers—get access to the real data.
That is a game-changer, because part of the reason the platforms have been able to slide as much as they have is that they knew all of their decisions were hidden behind a curtain, and unless they told the public, the public would never find out how they were actually operating. And seeing movement in places like the European Union, the UK, Australia and Canada is very, very exciting to me, because it shows that, globally, people are beginning to wake up, take a stance and say, “No, no, no, no, tech companies are not beyond the democratic process.” They need to exist within the same framework of governance and oversight as every other similarly powerful industry.