As I watched Facebook whistleblower Frances Haugen’s Senate testimony today, I found myself oddly encouraged by the proceedings.
No, it doesn’t make me happy to hear how the people running Facebook have consciously chosen profit over the health and welfare of their users. If Mark Zuckerberg could have somehow been happy with just twenty billion dollars of personal wealth instead of $120 billion, his company could have done a hundred billion dollars less damage to the world. Countless deaths, psychological traumas, political meltdowns, and waves of disinformation could have been avoided, and our vital social justice movements would not have been steered by fake activist organizations down such dangerous ideological dead-ends.
But I am delighted to hear an algorithmic product management expert explain the structural deficits of Facebook’s entire ethos and functional strategy, and do so in a way that even U.S. senators can understand. Yes, many of us have been making these same arguments for over a decade, but now they’re coming from a manager at Facebook, accompanied by proof that the company was fully aware of the precise, documented damage it has been causing, all while pretending it had no idea.
What makes Haugen’s testimony different and especially important, however, is that she is calling attention to the automated fashion in which companies like Facebook are run. This is Silicon Valley, remember, where “scaling” a business means getting human sensibilities out of the way. Put in place a system of metrics for people to reach, and let the games begin. In Haugen’s words, the company is “flat.” It has the largest single floor of workers in the world, she reminded us. The pretense is that this is a holacracy, where no one is really any more important than anyone else. Everyone is just working to achieve the metrics.
In reality, what this means is that the company has no executive function. There is no one in charge. No leader. Literally everyone is just following orders from the machine. The result, as Haugen neatly explained, is that no one is accountable.
Instead, everyone just does whatever they have to in order to make their metric, user engagement, go up. Everyone makes their own tiny decisions about how to do this. In aggregate, all these short-term decisions lead to terrible outcomes, not just for society but for the company itself. The company is like Zuckerberg himself, who dropped out of college before the myelin sheaths had fully formed around his prefrontal cortex, leaving him without the benefit of adult impulse control or full executive function.
What Facebook needs is some human intervention: actual human beings with broader concerns and awareness than any algorithm. Only human beings can consider the implications of decisions, slow things down, and even arrest certain kinds of progress when it is driving the company or its users off a cliff. Facebook needs a few conscious human actors like Haugen, not simply to blow a whistle but to steer the company.
Facebook is itself asking for regulation. Zuckerberg says he wants government to make sure all the social media platforms have measures in place for stopping the spread of disinformation. In Facebook’s case, that means algorithms, which according to Haugen (who managed them) can catch only up to 10% of the content they’re looking for.
While regulations can help, they’re not the same as true intervention. They’re really just more rules, and programmers see rules as simple, temporary challenges to code around, much the way a good tax lawyer can always seem to find a loophole in the tax code. It’s an invitation for more of the same disastrously automated approach to “problem solving.”
No, the real solution is to invest Facebook (or, by its own logic, to infect it) with living, breathing, thinking people: human beings with as much authority as the algorithms, and with values that carry as much weight as the metrics.