Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets
In the shadow of the exponential rise of “big tech” come journalists like Jeff Horwitz of the Wall Street Journal to report on it. Yet Horwitz’s experience, and this book, are not standard-fare analysis of a subject he has covered as a technology reporter; the book is instead about news that he essentially broke himself, having cultivated a Facebook insider who shared some 25,000 pages of internal company documents with him. These made global news in 2021 as the “Facebook Files,” revealing the incredible influence of the firm and the disconnect between many of its professed objectives and its actual practices.
In essence this is the “broken code” of the book’s title. Horwitz is hugely indebted to Frances Haugen, the data scientist turned whistleblower, who risked so much both professionally and personally to provide concrete evidence for widely held concerns about how Facebook operated. The process of gathering and then releasing this confidential information comes at the end of a book that is frontloaded with an analysis of the issues and contradictions that Facebook and its senior leadership are trying to balance.
How to be both a media company and have a social mission while growing at a vast pace, with all the incentives and pressures that brings? How to walk the line of defending free speech while not becoming a megaphone for hate speech, extremist views and polarising campaigns that look to hijack your algorithms for their own benefit? Horwitz recognises what to most readers is surely obvious: that the proliferation of Facebook as a platform gives it “immense power and societal influence,” yet as a private entity in an emerging sector there are huge gaps in understanding how it actually works and what safeguards are in place for when it doesn’t.
This challenge is amplified by the centralized nature of a social media giant that is inextricably linked to its founder Mark Zuckerberg, a person who “had once declared himself the only guy capable of fixing Facebook.” Zuckerberg’s public vision to connect people, be good and facilitate “meaningful social interaction” is juxtaposed with how the system’s code rewards those who seek confrontation through hyper-engagement and sharing, at the expense of the passive and moderate majority. Within America’s divided body politic, Facebook almost becomes symbolic of the rise of the radical, and Horwitz paints a fascinating picture of the difficulty of regulating subjective experiences, such as what constitutes bullying or hate speech.
The central narrative is that Facebook’s “inattention and indifference” to those who would manipulate it for nefarious ends is driven by its competing incentives as an organization, primarily between growth on the one hand and the purity or quality of its mission on the other. This story is told through the teams and individuals at Facebook trying to steer the organization in different directions, with those in “Civic” and “Public Policy” completely at odds for much of the time. With this tension at its heart, Facebook becomes, in the words of a former employee, “inherently incendiary, unstable and prone to manipulation.” This was especially the case outside America, in countries where Facebook did not even have the language skills to understand, for example, the outbreak of mass violence in Myanmar or communal tensions in India.
Facebook’s scandals and controversies are well known: links to Russian election disinformation, manipulation by Cambridge Analytica, and more specific issues such as facilitating domestic servitude in the Gulf. Where Horwitz’s access to internal documents takes the book is into a more specialist space, where Facebook’s policies for countering manipulation or division—its “break the glass” measures—are deployed. Horwitz is generally scathing about their effectiveness, describing them as “remedial” at best.
Frances Haugen, Horwitz’s whistleblower, said it best in explaining her motives for going public to warn of the dangers of Facebook: the company was, she argued, “consistently turning up the temperature of societal discourse. If the company didn’t change course, its products would kill a lot of people.”