Now, hours of testimony and thousands of pages of documents from Facebook whistleblower Frances Haugen have renewed scrutiny of the impact Facebook and its algorithms have on teens, democracy and society at large. The fallout has raised the question of just how much Facebook, and perhaps platforms like it, can or should rethink using a bevy of algorithms to determine which pictures, videos and news users see.
But algorithms that pick and choose what we see are central not just to Facebook but to numerous social media platforms that followed in Facebook’s footsteps. TikTok, for example, would be unrecognizable without content-recommendation algorithms running the show. And the bigger the platform, the bigger the need for algorithms to sift and sort content.
Algorithms are not going away. But there are ways for Facebook to improve them, experts in algorithms and artificial intelligence told CNN Business. It will, however, require something Facebook has so far appeared reluctant to offer (despite executive talking points): more transparency and control for users.
What’s in an algorithm?
An algorithm is a set of mathematical steps or instructions, particularly for a computer, telling it what to do with certain inputs to produce certain outputs. You can think of it as roughly akin to a recipe, where the ingredients are inputs and the final dish is the output. On Facebook and other social media sites, however, you and your actions — what you write or images you post — are the input. What the social network shows you — whether it’s a post from your best friend or an ad for camping gear — is the output.
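The input-to-output idea above can be sketched in a few lines of code. This is a toy illustration only, not Facebook's actual ranking system; every signal name and weight here is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    is_from_friend: bool

def score(post: Post) -> float:
    """Combine a few engagement signals into one ranking score."""
    s = post.likes * 1.0 + post.comments * 2.0  # comments weighted higher
    if post.is_from_friend:
        s += 50.0                               # boost posts from friends
    return s

def rank_feed(posts: list[Post]) -> list[Post]:
    # Input: the posts available to show. Output: the order the user sees.
    return sorted(posts, key=score, reverse=True)
```

In this simplified picture, the "recipe" is the `score` function: the same ingredients (posts and signals) could yield a very different feed if the weights were chosen differently.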
At their best, these algorithms can help personalize feeds so users discover new people and content that matches their interests based on prior activity. At their worst, as Haugen and others have pointed out, they run the risk of directing people down troubling rabbit holes that can expose them to toxic content and misinformation. In either case, they keep people scrolling longer, potentially helping Facebook make more money by showing users more ads.
Many algorithms work in concert to create the experience you see on Facebook, Instagram, and elsewhere online. This can make it even more complicated to tease out what’s going on inside such systems, particularly in a large company like Facebook where multiple teams build various algorithms.
“If some higher power were to go to Facebook and say, ‘Fix the algorithm in XY,’ that’s really hard because they’ve become really complex systems with many many inputs, many weights, and they’re like multiple systems working together,” said Hilary Ross, a senior program manager at Harvard University’s Berkman Klein Center for Internet & Society and manager of its Institute for Rebooting Social Media.
“You can even imagine having some say in it. You might be able to select preferences for the kinds of things you want to be optimized for you,” she said, such as how often you want to see content from your immediate family, high school friends, or baby pictures. All of those things may change over time. Why not let users control them?
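The kind of user control Ross describes could be sketched as a set of adjustable weights applied on top of a ranking function. Again, this is a hypothetical illustration: the category names and weight values are invented, and no platform exposes exactly this interface today.

```python
# Hypothetical sketch: a user sets per-category preference weights,
# and the feed is ranked by those weights rather than engagement alone.

def rank_with_preferences(posts: list[dict], prefs: dict[str, float]) -> list[dict]:
    def score(post: dict) -> float:
        # Categories the user hasn't configured default to weight 1.0
        return prefs.get(post["category"], 1.0) * post["engagement"]
    return sorted(posts, key=score, reverse=True)

prefs = {"family": 3.0, "high_school_friends": 0.5, "baby_pictures": 0.0}
posts = [
    {"category": "family", "engagement": 10},
    {"category": "ads", "engagement": 40},
    {"category": "baby_pictures", "engagement": 100},
]
feed = rank_with_preferences(posts, prefs)
```

Setting a category's weight to zero effectively removes it from the top of the feed, which is one concrete way "letting users control" the algorithm could work in practice.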
Transparency is key, she said, because it incentivizes good behavior from the social networks.
Another way social networks could be pushed in the direction of increased transparency is by increasing independent auditing of their algorithmic practices, according to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League. They envision this as including fully independent researchers, investigative journalists, or people inside regulatory bodies — not social media companies themselves, or companies they hire — who have the knowledge, skills, and legal authority to demand access to algorithmic systems in order to ensure laws aren’t violated and best practices are followed.
James Mickens, a computer science professor at Harvard and co-director of the Berkman Klein Center’s Institute for Rebooting Social Media, suggests looking to the ways elections can be audited without revealing private information about voters (such as who each person voted for) for insights about how algorithms may be audited and reformed. He thinks that could give some insights for building an audit system that would allow people outside of Facebook to provide oversight while protecting sensitive data.
Other metrics for success
A big hurdle to making meaningful improvements, experts say, is social networks' current focus on engagement: the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads.
Changing this is tricky, experts said, though several agreed that it may involve considering the feelings users have when using social media and not just the amount of time they spend using it.
“Engagement is not a synonym for good mental health,” said Mickens.
Can algorithms truly help fix Facebook’s problems, though? Mickens, at least, is hopeful the answer is yes. He does think they can be optimized more toward the public interest. “The question is: What will convince these companies to start thinking this way?” he said.
In the past, some might have said it would require pressure from advertisers whose dollars support these platforms. But in her testimony, Haugen seemed to bet on a different answer: pressure from Congress.