In 2021, the year she blew the whistle on Facebook, Frances Haugen graced the cover of Time magazine, which stated that “no single person has been as effective as Haugen in bringing public attention to Facebook’s negative impacts.”
A former product manager at Facebook (now called Meta), Haugen vaulted into the headlines after she released thousands of internal Facebook documents that showed the social media giant was aware of harms caused by its products. One article in a Wall Street Journal series revealed that Facebook’s own research found that its Instagram app worsened body image issues for one in three teen girls who faced those concerns.
Haugen, who advocates for accountability and transparency in social media, has brought her big tech expertise to McGill’s Centre for Media, Technology and Democracy, where she is the new senior fellow-in-residence.
In heralding her arrival, the centre said her appointment comes at a crucial time, with Canada’s federal government gearing up to introduce online safety legislation.
“Of places where I think I could make a difference, working in Canada and raising awareness around what’s possible is one of the highest impact places in the world right now,” says Haugen, who calls the team at the centre “one of my favourite groups working on public policy in this space.”
Moreover, the academic environment has become very politicized in the United States, according to Haugen, who doesn’t feel she’ll have to compromise her bipartisan inclination at McGill. “I don’t think this is a political issue. We have options that are far beyond content moderation.”
At the centre, part of McGill’s Max Bell School of Public Policy, Haugen will support its research and public engagement on online safety policy, youth digital rights, and data transparency.
“Canada has been slow to act when it comes to online safety, but we now have a chance to learn from the experiences of others and get it right,” the centre’s founding director Taylor Owen said in a release about Haugen’s appointment. “There are few people in the world better positioned to help us than Frances,” added Owen, an associate professor at the Max Bell School and the Beaverbrook Chair in Media, Ethics and Communications.
In an interview, Haugen praised the European Union for passing the Digital Services Act, which she called “a generational law in terms of saying things like hey, if you know you have a risk to your product, you have to tell us about it.”
For the first time, the law also gives the public the right to ask questions and get answers from social platforms. “And it sounds really basic – of course we should have that right,” Haugen acknowledges. The reality is that more and more of the big technologies that play an increasingly large role in our lives run on data centers or chips “that we can’t inspect. These companies know that if they don’t want to answer a question, they don’t have to because we can’t get the information on our own.”
When countries introduce laws regarding tech accountability, the conversation basically starts from zero, Haugen says. She and colleagues at the centre see a big opportunity around developing a playbook for society to engage in conversations about what people want their relationship to be with social media companies.
“Another reason why I’m so excited about the public policy centre at McGill is that I think they have a lot more of that participatory DNA,” she says. “The way we deal with messy complicated issues is by working together and having conversations about how we move forward as a society.”
One of those conversations will take place in late June in Winnipeg, at an event Haugen plans to attend. The centre is hosting a Youth Assembly on Digital Rights and Safety with a few dozen 18-year-olds to discuss how they interact with online platforms and how they hope to change them.
“The more research we do over a longer period of time, the clearer it is that kids are being very, very harmed by social media,” Haugen says. “There’s definitely good there. But there’s a huge amount of harm that is unnecessary.”
The Centre for Media, Technology and Democracy has been a prominent voice in the policy debate about regulating social media platforms. Owen is a member of an expert advisory group that has provided advice to the federal government on a legislative and regulatory framework to tackle harmful content online. It recommended a duty to act responsibly; online platforms would have to design their systems in a way that minimizes harm.
“What’s interesting about these platforms and about AI in general, is that you can have very good intentions, and the systems can take very different paths from what you intended,” Haugen says.
“I don’t think anyone intended to design Facebook’s algorithms to amplify extremism. But they made choices that benefited the business that had the side effect of amplifying extremism. And so, I think the larger question is what system of incentives do we want to have these technologies operate within?
“And in a world where we have the ability to ask questions and monitor the performance of these systems – and there’s consequences – that motivates these companies internally to figure out how to solve their problems more proactively.”
Haugen’s new book, The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook, hits stores this month. She also co-founded Beyond the Screen, a non-profit focused on online harms that she’s now getting off the ground. “One of our intentions is to build a simulated social network, so that we can teach data science classes that are as messy as what students would actually encounter in real life,” she says. “And the hope is to be able to do that co-development with faculty at McGill.”
“What’s really interesting about building a simulated social network is that we can begin to expose people to the idea that choices are being made when these systems are built” – and there are trade-offs associated with those choices, she says.
“If we can build that lab bench, where students can go in and say, hey, you know, these tools that I live [with] online, they’re not inevitable. They’re full of choices. That gives people an empowerment to begin asking questions around how do we govern these systems? That’s part of what I’m really excited about.”