Thoughts On "An Ugly Truth: Inside Facebook's Battle for Domination"
The challenge of fully and accurately capturing Facebook's impact on society is a significant one.

July 17, 2021


Every time I think of writing a non-fiction book about anything other than my personal experience, I shiver. As someone who has mostly published a few reports about large corporations, I can attest that doing research based on primary sources is painstaking work and requires a huge amount of time for what can often be little immediate payoff. Add in the complexities of handling human sources, many of whose identities need to be protected or hands need to be held (often virtually) to assure them that their non-disclosure agreements won’t come back to haunt them, and the task grows even more daunting.

An Ugly Truth: Inside Facebook's Battle for Domination is therefore an impressive work in that it manages to cram so many facts, apparently from 400 sources, into only about 300 pages. It’s definitely an overall enjoyable read and anyone who thinks about Facebook the company or Facebook the product at all seriously should read it. At the same time, however, it is not the book I expected it would be, in a number of respects. First of all, given Facebook’s global footprint and long history, there is so much more important material that would have undoubtedly required chopping down numerous additional forests to include. So, without hurting a single tree, I hope to provide pointers to some of that material here, piggybacking on what Sheera Frenkel and Cecilia Kang have already set forth as the starting point for a discussion our entire nation needs to be having right now.

Second, with two accomplished and respected New York Times journalists as authors and the title “An Ugly Truth,” I thought that their much-hyped and eagerly anticipated book would not merely pierce but obliterate the thick narrative that Facebook spin doctors have been using to mask reality for the better part of two decades. In 2021, it goes without saying that the consequences of Facebook’s corporate propaganda, as well as the propaganda on Facebook, have been disastrous to an extent that cannot be overstated, actually putting the very existence of the United States of America in jeopardy. Yet I found that while the book certainly contributes a few new and interesting insights, especially about Russia and the company’s failure to handle the genocide it caused in Myanmar, Frenkel and Kang have unwittingly contributed to the morass of misinformation that their book seeks to expose and, I think, have unintentionally done Facebook an enormous favor by blindly repeating many of its talking points. Post-publication, the company’s narrative armor is still very much intact.

I should stop here to explain that I in no way think that Frenkel and Kang have any sympathies for or biases toward (or against) Facebook or its executives, which is something that cannot be said of just about any of their predecessors in this genre—an interesting fact in and of itself. Ben Mezrich, author of a 2009 supposedly-non-fiction-but-really-not-even-close book about Facebook’s origins, will gladly lick the boots of anyone perceived to be wealthy if he thinks he can get a movie option deal out of it. Mezrich also lies with abandon so long as it makes for a story where a white kid gets rich. David Kirkpatrick, who published a biography of the company in 2010, is a corporate shill of the worst kind. (This book describes him as “friendly” to Facebook and mentions Mezrich not at all.) Steven Levy, who set out to write the “authoritative” book on Facebook not long ago, comes from a generation of journalists who for decades thought that technology companies could do no wrong, and is accordingly all about access, which he had in spades. Like Kirkpatrick’s, his book had Mark Zuckerberg’s blessing, no fact checking, and was biased accordingly. Even Roger McNamee, author of Zucked, maintained in a feat of pretzel logic that it would be improper of him to sell his valuable Facebook stock. An Ugly Truth is arguably the most objective serious book written about Facebook so far.

The book’s biggest problems arise from what I will attribute to editing. Frenkel and Kang may have conducted exhaustive interviews to obtain material, but once they had it, either they or their editors were rather careless with its description and arrangement. The book is full of errors—material errors—of multiple types that should not have made it past pre-publication review. In many areas it also lacks crucial context and omits essential facts, and as a result, ends up reading more like an extremely long, at times interesting, at times tedious, newspaper column containing summaries of stories long since considered old news. The book’s vocabulary seems limited, much like a newspaper’s is, perhaps to attract maximum readership. That makes it an easy read, but it also hampers the message, making deeply alarming events seem ho-hum and yawn-inducing when they definitely are not.

Given that The Facebook has had more of an impact on my life than on most people’s—I had a role in creating it, as the book notes in Chapter 2—I of all people should not be bored. Even though I am extremely glad to have been among the authors’ sources (for reasons I’ll get to later), one has to wonder whether some of their time would have been better spent interviewing fewer people and instead including irrefutable, easily citable documents, many of which Facebook has been forced to hand over in various court cases. The fact that the leading headline about the book on publication day (something that would have been planned in advance) was that Facebook engineers had abused their access privileges—something we have known for a long time—indicates at least to me that the endeavor to find new information that might explain how our current political environment was shaped by this company, this technology, and this particular set of reckless executives, was not entirely a success.

Just as disappointing as the book’s somewhat lazy approach to its subject material, at least in certain chapters, is the way in which the authors have been discussing it on their promotional tour. On an unrecorded Twitter Spaces event the night of publication, they emphasized that lots had been left on the cutting-room floor, so to speak, and that they had decided that the book they really wanted to write was not so much a historical analysis of Facebook as it was a “page-turner.” A comprehensive historical analysis is precisely what we all need right now, and by including more of the literally unbelievable details, readers would undoubtedly have been glued to the pages.

The book starts out with a not-at-all-surprising description of Facebook engineers stalking their ex-girlfriends by abusing their access privileges. There are some interesting paths the authors could have led readers down from this starting point, but nonetheless, the phrase “God mode” never appears in the book, nor does “root access.” Mark portraying himself as “The Creator,” labeling every page with “A Mark Zuckerberg production,” and putting “I’m CEO, bitch” on his business cards are omitted completely or mentioned only in passing, free of any damning context (and only one of the above, on page 194).

In Chapter 2, I make an appearance, and since I know the facts of my own life fairly well, I can confirm that what is printed in the book is often inaccurate. This is something that at this point I am used to. Mezrich plagiarized my writing, called me by the wrong name in his hardcover edition, invented much of his story, and generally implied that I was an idiot who made a text-based “BBS” of the kind that was used in the early 1990s with dial-up modems. Kirkpatrick never even contacted me, but described me as arrogant nonetheless. Levy, who did interview me, conjured up emotions I never had and at various turns fabricated and confused numerous details.

The upshot of these serially unflattering and mostly wrong descriptions of events in 2003 and 2004 at Harvard is that for nearly all of my adult life, I have been radically mischaracterized in the public sphere by writers with significant financial incentives to gloss over or distort any detail that might threaten Mark Zuckerberg’s “genius” narrative. None of these works ever made it clear the extent to which Mark’s product overlapped with mine, how much he used it as he built his, what we discussed, or the degree to which I predicted so many of Facebook’s ultimate problems because I understood what I was building. Instead, starting with Ben Mezrich’s 2008 e-mail to me inquiring about my willingness to cooperate for his book, I began facing the recurring dilemma of agreeing to be portrayed incorrectly or not being portrayed at all. As it turns out, both are really bad options, and I experienced each first-hand: errors with each of the books, and total omission with the blockbuster movie.

A former Facebook employee turned prominent critic, Yaël Eisenstat, recently wrote in response to the publication of An Ugly Truth,

“For an individual with no institutional or financial backing to speak out against the most powerful company in the world is not an easy choice. I turned down all media requests and sought counsel from my closest friends and confidants. To be frank, it was a dark period.”

I have no doubt she is speaking from her heart. Some context is useful, however: when Yaël left Facebook in November 2018, she already had under her belt a 13-year career as a national security advisor to the White House and a Senior Intelligence Officer at the Central Intelligence Agency, a 2-year stint at ExxonMobil, and an Adjunct Professorship at NYU—and it was still understandably difficult for her to speak up. For the sake of comparison, when I began speaking out against Mark Zuckerberg, I had zero institutional backing of any kind (least of all from Harvard), a group of friends who hadn’t witnessed the events in question and generally had little idea what I was talking about, barely any income, and unlike the Winklevoss twins, no lawyers at the family business or in the family at all to turn to. The end result was that even though I did nothing wrong, my reputation ended up being shaped by Facebook’s PR spin (and later, Sorkin’s screenplay, which Facebook provided feedback on) from the minute I entered the workforce, and the damage has been lasting and real. It has torpedoed amazing job opportunities, countless social relationships, my ability to raise capital for my ideas, and my overall reputation. To this day I am the target of vocal social media trolls who insist that I must be mentally ill for firmly insisting upon what actually happened in Cambridge. One stalker, against whom I have a restraining order, has made efforts to have me involuntarily committed to a psychiatric ward and interfered with my vehicle registration, making it impossible for me to drive legally for a time.

For these reasons, I am sensitive to printed errors about my involvement with Facebook, and I dream of the day when Facebook’s origins truly get a historian’s treatment, laying out every detail, minute by minute, of the interplay between my work, Mark’s copying of that work, the Winklevoss twins, and even criminal Paul Ceglia. For now, the reality is that there is no such book (though I took a stab at it in a 2019 report I published, declaring my bias up-front). Sadly, even after I informed Sheera Frenkel’s independent fact checker of several errors when we spoke on December 30, 2020, they were not corrected. It’s hard to understand how this could have happened. Frenkel, Kang, and their publisher deserve credit for going to the trouble of hiring a fact checker in the first place. It’s something many book authors and publishers simply don’t do. But why bother checking facts if problems are not fixed?

Much of the material in Chapter 2 comes from an interview Frenkel conducted with me on November 13, 2020 for about an hour. I told her that I first read about Mark Zuckerberg in Fifteen Minutes, the Harvard Crimson’s weekly news magazine, which covered his “Synapse” MP3 player software. In the book, this turned into “around campus.”

I developed (by which I mean conceived of, coded, designed, tested, and marketed) The Universal Face Book, also called “The Face Book” and “The Facebook” in writing, in August 2003 as part of a student portal I built called houseSYSTEM. While Mark incorrectly gets credit in the book for making a site that was “limited to Harvard” and “private by design,” houseSYSTEM was designed that way months prior. Thanks to an overzealous and particularly loud student I had never met in Lowell House (where I lived; he is now an attorney at the Securities and Exchange Commission), a flame war on the “lowell-open” mailing list erupted about privacy in part because of a false accusation that I was stealing student passwords, and the Harvard administration began making noises about shutting down houseSYSTEM and kicking me out of school. (These events became the subject of a memoir I published in 2008, when very few details about Mark’s Facebook that we know now were available, and when no major publisher would agree to put their imprint on a book critical of the site.)

Harvard’s administration was antsy when, as part of a round of negotiations, I informed them that I would be launching my own Face Book on houseSYSTEM (as opposed to Harvard’s extremely basic, often black-and-white, online Face Books for each residential house), and they found new absurd pretexts, such as the word “The” appearing in a copyright disclaimer, to limit what I could build. They were concerned solely about students posting their own information. What never happened, and what Frenkel and Kang misstate in their book, was a controversy over “allowing students to post personal details about classmates.” Harvard administrators were upset by the notion that students could post details about themselves, because Harvard’s Office of the General Counsel viewed the University as the holder of copyrighted material, e.g. student ID photographs (which I never planned to use and never did). Digital cameras were just starting to gain popularity, and there was no need to rely on Harvard’s photographs. Eventually, Harvard realized it had no legal grounds to stand on whatsoever, and after I hired pro bono attorneys and caught an administrator in a lie, the gang of elderly academics in University Hall so put off by an undergraduate taking initiative mostly backed down.

On January 8, 2004, just after we had dinner, Mark wrote to me “would you be interested in possibly partnering up to make a site if it would not be incorporated into housesystem” and “which isn't something i want right off the bat, and maybe not at all” regarding combining his then-unspecified project with my work. An Ugly Truth essentially says the opposite. Mark only “floated the idea of combining” our efforts explicitly to reject the idea, which we had also discussed hours earlier, in a confusing conversation where he refused to fully disclose the nature of his project…because it was my project.

At a different point in the conversation, which is quoted in An Ugly Truth, Mark responded via AIM to my analogy, that “delta has song airlines,” with a seeming restatement: “Delta owns song airlines”. His response was a question with a missing question mark (because it was written on AIM). It was not Mark correcting me Aaron Sorkin-style, which is what the book suggests given the lack of context and the fact that most people’s impressions of these early days have been formed by the film The Social Network, in which Jesse Eisenberg, playing Mark, frequently lectures others in rapid-fire fashion. Mark had virtually no knowledge about business in 2004. For example, he also didn’t know that Viacom owned MTV:

ThinkComp: i'm pretty sure viacom does...
ThinkComp: they're basically just a holding company for smaller companies
zberg02: oh

This might seem a petty point that’s irrelevant in 2021, except that even Frenkel and Kang seem to occasionally fall victim to the narrative surrounding Mark, and having him seemingly correct a classmate about an obscure business detail—when in fact he has no clue—only serves to bolster that narrative. It bears repeating: if Facebook’s success has been built on anything, it’s narrative.

An Ugly Truth is incorrect when it says “both versions of Thefacebook (as the name was now being styled)” in Chapter 2. On January 8, 2004, the only names being used interchangeably were “The Facebook,” “The Face Book” and “The Universal Face Book” for my site because Mark’s version hadn’t launched yet. Mark didn’t even purchase the domain name thefacebook.com until three days later—of course, without telling me.

When the authors write, “‘I kind of want to be the new MTV,’ [Mark] told friends,” one might naturally wonder who those friends were. As it turns out, “friends” refers to me. That quote comes directly from the AIM conversations Frenkel interviewed me about—which is why I was talking about Viacom. It’s therefore strange that the book’s footnote 4 cites a Rolling Stone article by Claire Hoffman, which discusses this exchange because Hoffman, too, interviewed me back in 2010. The article even mentions me by name in the section cited. The roundabout citation, which transforms me into nameless “friends,” is perplexing.

Frenkel and Kang then quote me in the book saying, “Mark was acquiring data for the sake of data because, I think he is a lot like me. I think he saw that the more data you had, the more accurately you could build a model of the world and understand it” and “Data is extremely powerful, and Mark saw that. What Mark ultimately wanted was power.”

That data is important isn’t necessarily an incorrect point, but it’s not the point I actually made in the interview. I said Mark and I were alike in that we were both interested in building models of the world around us. I raised the issue of this quotation being incomplete with the fact checker, Hilary McClellen. Here’s what I told Hilary:

“[Sheera] might have combined a few things I said in different places there into one quote,” and “One thing that she didn’t include that I said, that I think is important… I did say that I think he’s like me in that we both understand that in order to build an accurate model of the world you need a lot of data—but I also said that if you end up with duplicate data, you end up with a very inaccurate model. And that’s one of the major differences [between us]: that I was concerned with ensuring that there was quality in my data, whereas he was more concerned with quantity.”

Sheera Frenkel spent some additional time as we talked trying to drill down on why this mattered, and I did my best to explain it: the billions of fake accounts on Facebook are, in fact, the direct result of Mark ultimately ignoring privacy issues I made certain to address with houseSYSTEM’s Facebook and my warnings about data quality and unchecked growth, which as I told him, inevitably leads to “spam.” Regrettably, none of this made its way into the book despite its direct importance to today’s political reality. Every Internet Research Agency campaign, every Russian influence operation, every Macedonian fake news site—essentially everything wrong with Facebook that isn’t a pure privacy issue ultimately boils down to fake accounts.
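To make the quality-versus-quantity point concrete, here is a minimal sketch, entirely my own illustration (not code from houseSYSTEM or Facebook), of how duplicate and fake records inflate even the simplest model of a user base:

```python
# Hypothetical sign-up records: three real people plus a duplicate and three bots.
signups = [
    "alice@example.com", "bob@example.com", "carol@example.com",
    "alice@example.com",                                               # duplicate of a real person
    "bot001@example.com", "bot002@example.com", "bot003@example.com",  # fake accounts
]

raw_count = len(signups)        # what a growth chart would report: 7 "users"
distinct = len(set(signups))    # after naive deduplication: 6 distinct addresses
real_people = 3                 # ground truth, invisible to the data alone

print(raw_count, distinct, real_people)  # 7 6 3
```

The raw count more than doubles the true number of people, and any model built on top of it, whether of engagement, reach, or growth, inherits the same distortion.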

Here are the facts that I wish could have made it into the book, which render two decades of events in a much different light:

- As a student, Mark actually did steal his users’ passwords to break into the e-mail accounts of Harvard Crimson reporters—something he likely asked those reporters to stay quiet about years later through the use of non-disclosure agreements, possibly paid, since his actions violated 18 U.S.C. § 1030
- While I was fending off critics eager to make nuanced points about encryption and MD5 hashing that had little bearing on houseSYSTEM’s actual risk profile but sounded scary enough to impede trust, Mark was able to spy on student reporters because his Facebook stored passwords in plaintext for years (see the brief sketch below for why that distinction matters)
- Many of the integrated features on Facebook, e.g. digital RSVPs to on-campus events advertised with images on a virtual calendar, birthday reminders, etc., actually appeared on houseSYSTEM first, and Mark implemented them one by one without my permission—something he later did to numerous other competitors
- I struggled with what to do about “Organizations” on houseSYSTEM (which on Facebook are “Groups”) because of the risk that making them too open would lead to data quality and real-world problems—an issue that did not stop Mark—but it’s difficult to earn much credit for not building something risky
- In 2005, my explicit, written warnings to Mark and his subsequent decision to ignore (and blame Dustin Moskovitz for) a security issue I discovered implicating a friend-of-friend mass disclosure problem—something we now refer to as “Cambridge Analytica”—were a direct precursor to events disclosed in 2018, and prove that Mark understood but ignored the risks that came back to bite him years later
- Also in 2005, my written warning that his conduct was likely violating Section V of the Federal Trade Commission Act presaged FTC actions against Facebook
- We reached a settlement agreement in 2009 which Facebook insisted be kept confidential and announced the Friday before Memorial Day weekend just before midnight to bury it in the press, which worked like a charm
- The probable reason we reached a settlement agreement was so that the Kremlin could have a clean investment in Facebook through VTB Bank and Yuri Milner—a deal announced (without the part about the Kremlin) the next business day, which I had absolutely no knowledge of ahead of time
- Mark brazenly lied at a Stanford University event in 2005, claiming that no other similar product had existed at Harvard before his, showing his eagerness to erase from history someone he had once called one of the six smartest people in the world (a ridiculous notion to be sure, but indicative of his simplistic thinking)
- The release of The Social Network in 2010 supercharged positive media coverage of and Silicon Valley’s obsession with the company and, since I was excluded, made my warnings seem more esoteric, unbelievable and alarmist than ever

Some issues never came up in the interview for An Ugly Truth because I didn’t know what content the book would focus on (and perhaps the authors didn’t yet either) and was never asked about them. For example, had I been asked about Andrew Bosworth, I would have told Frenkel that when I took Computer Science 50 in 2001, Bosworth was the TF (or Teaching Fellow, basically a student teacher) who, as I recall, refused to help me with a question on a problem set, reviewed my code, and clarified my issue with, “You’re wrong.” After I asked why, he repeated, “You’re just wrong.” Then he rushed away to attend to the questions of a female student. Two years later, when I started a seminar called “! CS50” (read “Not CS50”) to teach students computer skills that were actually useful in the internet age as opposed to just the C programming language, Bosworth sent out a furious e-mail nominally on behalf of all CS50 TFs, calling the endeavor “offensive” and “outrageous.” (Harvard completely revamped the CS50 curriculum soon after, with the University publicly citing, of course, Facebook as an inspiration.) Just as with his more famous and demented memo, “The Ugly,” Bosworth managed to get everything completely backwards while being offensive himself.
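Because the plaintext-versus-hashing distinction in the second item above is easy to gloss over, here is a brief sketch, entirely my own illustration rather than anything from houseSYSTEM or Facebook, of the difference between the two approaches to storing a password:

```python
import hashlib
import os

# Plaintext storage: anyone with access to the database (or a stolen copy of it)
# can read every user's password directly.
def store_plaintext(password: str) -> str:
    return password

# Salted hashing: only a one-way digest is kept, so the original password cannot
# simply be read back out of the database.
def store_hashed(password: str) -> tuple[str, str]:
    salt = os.urandom(16).hex()
    digest = hashlib.sha256((salt + password).encode("utf-8")).hexdigest()
    return salt, digest

def verify_hashed(password: str, salt: str, digest: str) -> bool:
    return hashlib.sha256((salt + password).encode("utf-8")).hexdigest() == digest

salt, digest = store_hashed("hunter2")
print(verify_hashed("hunter2", salt, digest))      # True
print(verify_hashed("wrong-guess", salt, digest))  # False
```

(In 2003-2004 the common shortcut was an unsalted MD5 digest, which is weak by today's standards; a modern system would use bcrypt or Argon2. Even so, any hashing at all is a world apart from keeping the passwords themselves readable.)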

Nor did we discuss Ami Vora, one of two former presidents of the Harvard Computer Society (HCS) who went on to work at Facebook in senior positions, and who probably should have appeared in An Ugly Truth for their contributions to global misery. (The other, Carl Sjogreen, served as a sort of mentor from the time I was a 13-year-old computer camper in Cleveland until just after college graduation. In April 2006, he heard the short version of the story of my involvement with Mark over lunch in a Google cafeteria and adjudged that I just complained too much. Soon after, Carl left Google to work for Mark, where he created the infamous Facebook Platform, the flip side of the product referred to in the book as the Open Graph, where users could throw sheep at one another and fake accounts really began to take off.) Vora, who expressed absolutely no interest in anything I suggested for HCS, such as focusing on digital video, was openly hostile to the idea of even web-based e-mail, writing, “webmail is a Thing to be Despised” “because we try to be at least nominally hardcore techie.” Now, her name appears atop a recently unsealed damning e-mail (presumably written from a 2002-era Digital UNIX terminal somewhere) in which she fears the very thing I called Facebook out for in my 2019 report and later testified about before the U.K. Parliament’s DCMS Sub-Committee: the overwhelming, actually criminal, number of fake accounts on the platform. This, and not anti-trust law, is the key to taming Facebook’s influence. In Ami’s words, Facebook’s worst nightmare is the headline “Facebook lies about its user #s to get record profits”—because that’s true.

If only that could have been squeezed in before publication.

By Chapter 3, with the company scaling its growth trajectory, my direct role in Facebook, Inc.’s development was admittedly over. Mark had taken the features he wanted from houseSYSTEM, ignored my cautionary advice for the sake of growth, and left it and me for dead. I stopped speaking with Mark in 2005, with the exception of a brief telephone call in 2007 after his lawyer e-mailed me out of the blue due to Mark’s difficulties with the Winklevoss twins that he thought I might be able to help with (by proving that Mark stole from me and not from them). I ended up being deposed for the ConnectU litigation as a result, with Facebook covering the cost of my attorney. And lunch.

In the rest of the first half of the book, some of the content about the company’s early days and Mark’s interactions with true adults such as Donald Graham and Sheryl Sandberg is quite interesting. But errors still plague the book throughout. A reviewer on Amazon.com points out:

“There are multiple errors of basic fact throughout the book. A few examples:

An early chapter refers to Pedro C as "director of product engineering" to support the narrative that management didn't want to slow down product development by requiring approvals to access data. Pedro was director and later VP of production engineering (read SRE), not product.

The distinction between product and production engineering may seem small, but it dramatically changes reporting chain and motivation. His org's interest was in making sure that people could fix problems quickly and keep the site up, not in building the next cool feature or selling ads faster. One could have easily verified his title and role by searching his name and looking at any of the conference talks that he's given.

Less critical, but the term "tribes" has not been used for Facebook's company-internal groups since pre-2016 and not widely used since 2014.

If errors like these made it past fact checking, what less obvious errors slipped through as well?”

Another significant point was raised by David Carroll of The Social Dilemma fame and advertiser lobbyist Jason Kint, who follows the company closely:

David Carroll (@profcarroll): “We are reminded of a factual error in the book #AnUgluTruth: Facebook learned about Cambridge Analytica before the Guardian report. This is public knowledge sourced from the Washington DC AG’s lawsuit against FB RE: CA.”

Jason Kint (@jason_kint): “Exactly. And I was disappointed in that even this outstanding book repeated the falsehoods core to FB’s cover-up: that they learned about Cambridge Analytica from Guardian, that the data was transferred rather than sold, and ignores they had an employee who was directly involved.” (12:22 PM · Jul 14, 2021)

Focusing on errors alone is perhaps losing the forest for the trees, as An Ugly Truth does have an important overall message to send, which is that the company’s leadership never quite wants to take responsibility for anything. However, we already knew (and the quotations on the back cover remind us) that Facebook is a serial offender. The question of why deserves a discussion, but there’s no room in the book for it. More on that in a bit.

In Chapter 7, entitled “Company Over Country” after a remarkably tone-deaf yet completely accurate mantra Zuckerberg used in the company’s early days, the authors allude to Mark’s short-lived pass at a presidential campaign. There is unfortunately no examination of this cataclysmically bad idea, and like so many other major topics, it is frustrating that the authors do not even admit that it happened, referring to his fifty-state tour as “a chance to show he cared about the country.” While the very title of the chapter would seem to discredit such a description of the obvious political stunt, the authors do not seem to have written this phrase with any sense of irony, which again makes the reader wonder whose side they are really on. For my taste, their “ugly truth” is at times a bit too anodyne to do Facebook’s narrative machine any real damage. Yes, Mark and Sheryl did not always get along. That doesn’t seem to have made much of a difference to the company’s stock price or influence in Washington.

The book does a good job of summarizing what must have been many hours of interviews with Alex Stamos, who is in some ways the star of the show. Stamos deserves credit, first and foremost, for quitting. And yet, although the book portrays him as a concerned person trying to do his best, anyone familiar with large corporations would have to wonder why he didn’t quit sooner, or maybe immediately on his first day at work, given that his position, identifying security issues, reported not to Sheryl, who brought him in, and not to Mark, who was running the show, but to the General Counsel, whose true job in any large, bureaucratic organization is “risk management”: making problems disappear. The book also does an excellent job of peeling back the curtain around the internal deliberations behind numerous decisions that were widely reported in the news, all of which seem to exhibit the same pattern of chaos and incompetence thanks to Mark’s fabulously inept attempts at “leadership” and Sheryl’s simultaneous loyalty, unflappable denialism, and obsession with optics.

I was disappointed that Frenkel and Kang did not describe the characters in this endless tragedy in more depth. I think it would have been useful to explain that Sandberg is but one of thousands of smart, hyper-ambitious students who have, through the decades, entered Harvard Yard (or before that, Radcliffe) hoping to “save the world,” only to find after much research, networking, and the application of their own ingenuity, that business school (for Sandberg, HBS), consulting (McKinsey), and banking (the World Bank) are the only ways to achieve world-saving. This persona is so predictable that the deans practically keep a cookie cutter in a closet in University Hall. So too is the end result: the stripping of a person’s moral fiber and, in some exceptional cases such as this one, the ultimate destruction of everything they touch. (See also MBAs and health care.)

Zuckerberg is someone I personally detest, but his character is a bit more complex because he actually suffers from an obvious but publicly undisclosed neurological condition, which the authors only reference in passing when they mention “Zuckerberg didn’t seem to blink.” This extremely notable oddity, picked up on immediately by anyone who has ever met him in real life, is just the tip of the iceberg that is his person: cold, emotionless, forced, robotic, calculating, and immature. In sum, a textbook sociopath, incapable of relating to the average person, and far more comfortable treating the world as a video game—an environment where rigid rules and explicit quantitative metrics are the only things that matter. The true quirk with Zuckerberg, however, is that he is a sociopath who knows virtually nothing, except a bit of classical history, how to code, and how to exploit others. The book confirms this again and again, and it’s one of Frenkel and Kang’s best thrusts at the narrative. Such a lopsided person would face significant challenges in a normal environment. Yet with the gift of a missing conscience, the willingness to lie and exploit more than compensates.

The years 2015-2020 demonstrated in truly painstaking detail how and why the media is unable to handle this type of sociopathic persona. The average journalist acts in reasonably good faith and therefore, on average, assumes sources are acting in good faith. Sociopaths never act in good faith. That is why they are, to borrow the words of Masha Gessen, “bad-faith actors.” This describes Donald Trump perfectly, but also Mark Zuckerberg, and Elon Musk, and Bill Gates, and Larry Ellison, and on and on down the list of billionaires. There are some exceptions, but in general, our society selects for these qualities at the top, and once a sociopath is vaulted into a position of power, the first step is to hide past violations of laws and ethical norms, quickly followed by step two, which is to co-opt the media with public relations firms and lawyers. It is a tried-and-true playbook, and one that could use much, much more discussion in public discourse.

For all the good this book will do, this is the most exasperating aspect of Frenkel and Kang’s work. Eighteen years into the morass that began with well-intentioned, reasonably secure, functional code on my hard drive, many journalists, even some very good ones, still do not seem to understand what it means for Mark Zuckerberg to be a bad-faith actor. In this, they are not alone. Even having written this book, Kang frequently repeats company statistics and terms (“Zuck” is not your friendly next-door neighbor and never was; “AI” is a largely meaningless buzzword and is not the same as hashing or regular expressions, which are usually the tools being employed to combat content issues) without questioning them. Both authors emphasize the difficulty of the underlying societal problems but forget to mention that they are very much avoidable. No one made Facebook do business in Myanmar, a country where its profits barely register, if they even exist.
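To illustrate what that parenthetical means in practice, here is a minimal sketch, entirely my own invention and not anything from Facebook’s actual systems, of the kind of hash lookups and regular expressions that typically sit behind the “AI” label when content issues are being fought:

```python
import hashlib
import re

# Hypothetical block list: SHA-256 digests of previously removed posts.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Hypothetical keyword patterns for common spam phrasing.
KEYWORD_PATTERNS = [
    re.compile(r"\bbuy\s+followers\b", re.IGNORECASE),
    re.compile(r"\bfree\s+crypto\b", re.IGNORECASE),
]

def flag_content(text: str) -> bool:
    """Return True if the post matches a known-bad hash or a keyword pattern."""
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    if digest in KNOWN_BAD_HASHES:
        return True
    return any(pattern.search(text) for pattern in KEYWORD_PATTERNS)

print(flag_content("Get free crypto now!"))       # True
print(flag_content("Happy birthday, Grandma!"))   # False
```

Nothing here involves machine learning; it is ordinary string matching, which is exactly why wrapping it in the word “AI” is more narrative than substance.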

As a result of such oversights, the “big lie” goes completely unaddressed. For Trump, the big lie, now so often discussed in the media, used to be that he was a successful businessman, and now is that the 2020 election was “stolen,” when it was not. For Musk, it is that he is the genius eco-friendly founder of Tesla, when he is actually a wasteful fool who did not found the financially unsustainable company and happily charges its electric vehicle fleet with diesel generators. For Zuckerberg, it is that he created Facebook in his Harvard dorm room and that it now has “2 billion” users worldwide, or now “3 billion,” as Kang often recites, when in fact probably about 90% of those users are actually fake accounts invented to scam real people and exploited by Facebook to cement the appearance of perpetual growth, enhance the political influence that such growth commands, and fraudulently elevate the company’s stock price.

Without a doubt, Frenkel and Kang’s best work is filling in the blanks around what happened with Facebook’s investigations into Russian interference. The chapter on the genocide in Myanmar should be required reading for every employee at every technology company. The description of Sheryl Sandberg’s role throughout leaves one wondering what it is that makes her worried about being fired, as opposed to worried about working for the company, and specifically Mark, in the first place. For all of her suave social skills and political pull, Sandberg seems to be utterly blind to reality, and captivated by the radical idiocy spouted by her boss, that the world really is better off being “more open and connected” even as people literally die from it.

This is all to say that we have an enormous problem. To reiterate, two accomplished New York Times journalists decided to take a stab at the narrative around Mark Zuckerberg, putting his face on the cover of their book just behind the word “Ugly”—and still so much remains unsaid, unexamined, or off. If two Times reporters with all of the resources at their disposal, let alone the newspaper itself, are not up to the task of holding a large corporation accountable, it’s really not clear who is left to do the job.

My 2019 report was intended to put forward the long-form article I had waited for a journalist or historian to write for over 15 years, but the terrifying reality is that no one ever did. I’ve spoken to somewhere around 50 journalists over time about Facebook, but save for John Markoff (now mostly retired from the Times), virtually none of them have ever expressed any interest in the full range of documents I have compiled. CBS 60 Minutes, Bloomberg Television, CNBC, and GQ refused to print or air what I had to say. One reporter I spoke with is a name you read in the paper daily now, building her personal brand as leading the charge against Facebook, when only a few years ago she was Mark Zuckerberg’s sassy-mother-figure-slash-cheerleader. There has yet to be any examination of the media’s vital enabling role in this scandal, and there needs to be one starting with the Harvard Crimson and even including The New York Times, just as there must be regarding the media’s coverage of Donald Trump.

In order to understand and effectively treat a disease, or even a computer virus, you have to understand how it works. An Ugly Truth gets us closer to that goal, but it does not get us over the line. One aspect of the disease I have only recently begun to understand is that there exists a God Mode Paradox, which is to say that once a tech billionaire achieves sufficient scale with a product that has more intelligence data than the intelligence agencies, or affords more influence on world affairs in some narrow respect than the President of the United States, that person, that human being, is effectively a god among men with one important difference. In the conventional Judeo-Christian tradition, God is everywhere but he is nowhere; God can be addressed in general but he never responds. In contrast, Jack Dorsey is somewhere. So is Mark Zuckerberg. They are physical beings, with home addresses (or estates), telephone numbers, and e-mail accounts. They can be addressed, specifically. So when a vast swath of humanity is upset by their product for whatever reason, these mortal deities not only hear about it rather directly, but they are expected to respond. This is the hidden cost of God Mode, of being The Creator, of having the bank accounts with a combined balance that requires nine or ten zeroes to write out.

As it turns out, dealing with the life-and-death problems of hundreds of millions of people is actually a pretty heavy burden for your average thirty- or forty-something; the One Ring really does turn out to consume its wearer. It sucks. There’s a tendency to tune out after a while; to want to stop playing that real-life video game you thought, and that everyone told you, you’d won. One option is to stop playing completely, but if actually carried out, that might involve losing billions of dollars to a tanking stock price, not to mention the infinite respect of society. The next best option is to make it simply look like you’re still playing, when in fact, you have totally checked out. That way, the benefits of oligarchy remain yours as you call in occasionally from a remote island somewhere. No one needs to know that you’re not really playing anymore. That is what narrative is for.

This dynamic already pretty well describes Larry Page, Sergey Brin, Jack Dorsey, Elon Musk, and I think increasingly, Mark Zuckerberg. In June 2004, Mark told the Crimson, “I’m just like a little kid. I get bored easily and computers excite me.” He was already bored of Facebook, working on Wirehog, when he said that. In this context, doomed projects out of left field like Oculus make sense. Managing the world’s problems is drudgery when there’s escapism—Facebook’s initial goal—to think about.

Whether I’m right or wrong about this as it applies to Mark almost doesn’t matter. As President Biden said yesterday when asked what his message is about Facebook, “They’re killing people.” This is the predictable result of clinically inflexible and arbitrary dogma, apathy, perverse financial incentives, and decades of regulatory decay. This is what happens when, as a Facebook employee muses in An Ugly Truth, you build “a radio for a future Hitler.” That’s exactly what Mark did, and he still doesn’t care. He doesn’t even care that he’s partially responsible for at least one genocide himself.

Of course, once you admit the ugly truth about how Mark works as a person, you realize that he truly does not care about anything or anyone but himself because he is incapable of doing so and doesn’t actually know much about the world besides. You realize that he will never tire of issuing empty apologies as the company engages in more rapacious and destructive activity. You realize that the media will never stop cravenly vying for access, because journalists simply are not trained to work the system the way that lawyers are, because access is the only way most know how to operate, and because there are a hundred lawyers willing to lie for every excellent journalist. You realize that so long as everyone pretends that the “billions” of users are real, whether Facebook is split into pieces or not, Mark will never lose his grip on power.

I was hoping that this book would mark the moment we all stop pretending. I’m glad it’s out there. Alas, we still have a long way to go.
