Frances Haugen
Frances Haugen, a former Facebook product manager, appears before the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security on Capitol Hill on October 5, 2021. (Chris Kleponis/Sipa USA)

This week, I’ve lamented more than I usually do that my high school math education stopped just short of algorithms. Those step-by-step procedures were the hot topic on Capitol Hill Tuesday, as Congress heard from Facebook whistleblower Frances Haugen, who opened her testimony by explaining how Facebook’s algorithms work—and why they are dangerous. Facebook’s products, she said, “harm children, stoke division, and weaken our democracy.” She also called out Facebook’s leadership for choosing profits over public health.

I hope kids today are paying attention to Euclid’s rules for solving a problem in a finite number of steps—because, apparently, algorithms are more important now than they were in the 1980s. If you are searching the internet or scrolling a social media platform, then you are being driven by a formula that knows more about you than you will likely ever know about it.
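For anyone whose math education, like mine, stopped just short of this, here is a small sketch in Python of what Euclid had in mind: his method for finding the greatest common divisor of two numbers, a procedure guaranteed to finish in a finite number of steps.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: the greatest common divisor of a and b."""
    # Each step replaces (a, b) with (b, a mod b); the second value
    # strictly shrinks, so the loop ends in a finite number of steps.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # prints 6
```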

This is scary enough for adults, but it’s terrifying when you think about children being yanked around by their brain stems from one dopamine fix to the next. As I wrote in the Washington Monthly last week, the consequences of social media use for kids can be dire—and it’s up to adults, including tech company executives and elected officials, to protect children from the darkest parts of the web.

Instead, as we learned from Haugen’s testimony and her Sunday-night appearance on 60 Minutes, Facebook uses “amplification algorithms” and “engagement-based rankings” that increasingly drive teenagers to destructive content that fuels body image issues, mental health crises, and even online bullying and exploitation. Worse yet, Haugen added, the company knows these are the consequences of its algorithms—but it continues to employ them anyway, all for more clicks and more revenue.
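To make that phrase concrete, here is a deliberately simplified sketch, in Python, of what an “engagement-based ranking” means in principle. Every name and weight below is invented for illustration; Facebook’s actual system is proprietary and far more elaborate. The point is only that a feed sorted by predicted engagement surfaces whatever is expected to provoke the strongest reaction.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float     # hypothetical engagement signals;
    predicted_reactions: float  # real systems predict many more
    predicted_reshares: float

def engagement_score(post: Post) -> float:
    # Toy weights, invented for illustration: reshares count most,
    # because they spread a post to new audiences.
    return (post.predicted_clicks
            + 2.0 * post.predicted_reactions
            + 5.0 * post.predicted_reshares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Whatever is predicted to draw the most interaction rises to
    # the top, regardless of whether it is true or healthy to see.
    return sorted(posts, key=engagement_score, reverse=True)
```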

So I was discouraged to read that Facebook spokeswoman Lena Pietsch scrambled to discredit Haugen’s testimony by saying that Haugen didn’t have any staff, didn’t attend the right meetings, and didn’t work at the company long enough to know what she’s talking about.

Haugen is hardly the first person to raise her hand and speak up about Facebook’s growing and often unhealthy reach. In fact, in 2018, the Washington Monthly featured a cover story by Roger McNamee, one of Mark Zuckerberg’s earliest advisers, in which he laid out how the platform was being used in unintended and dangerous ways.

Of McNamee’s 6,000-plus words, there are two that I’ve never forgotten: “brain hacking.” Two years before writing his Monthly piece, McNamee interviewed Tristan Harris, a former design ethicist at Google and an expert in persuasive technology, who “described the techniques that tech platforms use to create addiction and the ways they exploit that addiction to increase profits. He called it ‘brain hacking.’”

These very public conversations began more than five years ago—and still, nothing has been done to curb the algorithms’ most harmful tendencies.

Facebook’s original, ambient sales pitch, when it exploded onto the scene at the beginning of the 21st century, was that it would give each person a voice and a face on the internet and that it would connect us with faraway friends and loved ones. It has certainly given each of us a face, and it has connected us, for better or worse, but I’m not sure about the voice. There are currently 2.85 billion people on Facebook. Being just one of them makes me feel very small. It means my voice is like a Who in Dr. Seuss’s Whoville screaming to prove I exist. And the voices smaller than mine, the voices of the youngest among us, can they be heard at all? Or are they just guinea pigs in a large social corporate experiment?

I’m grateful to Frances Haugen for raising her voice on behalf of the rest of us down here in Whoville. I hope our elected representatives will listen to her and figure out a way to get the horse and cart in the right order in Silicon Valley. Maybe they could start by reforming and updating Section 230 of the Communications Decency Act, which protects platforms from liability for content posted by their users. In fact, Section 230 reform is one of the few issues in Washington on which there is genuine bipartisan energy. Hopefully, something can be done about it.

I’m sure reining in 2.85 billion scrollers and posters will be no easy feat—but on the other hand, there must be an algorithm for that.

Sarah P. Weeldreyer

Sarah P. Weeldreyer is a freelance writer and editor whose work has appeared in The Atlantic, the Arkansas Democrat-Gazette, and other outlets.