
Latest Read: An Ugly Truth

An Ugly Truth: Inside Facebook’s Battle for Domination by Sheera Frenkel and Cecilia Kang. Sheera is a prize-winning technology reporter based in San Francisco. Cecilia covers technology and regulation.


No account of Facebook’s corruption and missing moral compass could be clearer than this book. It is also a story of greed, control, and the egos of company leaders. At first glance you may think this is simply not possible.

The authors tap into current and former employees, all of whom paint a rather horrific picture of Facebook’s single focus: profits.

Above all, documents and interviews reveal how a company-wide lack of action weakened American democracy. Ignoring the genocide of the Rohingya people across Myanmar paints an even more shocking picture of Facebook as a company.

A key point often overlooked is that Facebook ‘arrived’ at the beginning of the internet’s gilded age, when no rules applied: just profits, whatever the cost. Profit is the sole focus for Mark Zuckerberg, and anyone attempting to derail that focus is met with aggressive, sometimes illegal, tactics.

Chapter 7: Company over Country

That Facebook’s security team acknowledged Russian spies had hacked the accounts of GOP politicians was not a surprise. That the Russians also hacked the accounts of those candidates’ children was genuinely shocking. All of this activity culminated in an internal report to company leadership.

Yet Facebook did nothing. No notice went to American intelligence services. Meanwhile, Facebook engineers retained an unrivaled view into all user interactions.

Employees’ views of Mark Zuckerberg and Sheryl Sandberg

Revelations from internal emails and company meetings weaken the sterling reputations of both Zuckerberg and Sheryl Sandberg. Sheera and Cecilia bring Sandberg’s actions and many faults into view:

And perhaps most damning of all, she was out of touch with the reality of the majority of working women, women without partners at home to share domestic duties or the financial resources to hire nannies, tutors, and cleaners, as Sandberg could.
pg. 255

She had been warned that she would draw criticism for the book. One friend told her she came across as elitist: most women would not be able to relate to her experience as a white, highly educated woman with great wealth. But Sandberg was blindsided by the naysayers.
pg. 256

Over the summer, Zuckerberg and Sandberg had provided testimony to the FTC and states, and their aides thought their messages were airtight. Appearing in video testimony from her home, Sandberg casually kicked off her shoes and folded her legs under her, as she often does in meetings, and spooned the foam off her cappuccino while taking questions…As had been the case with FTC officials a decade earlier, Sandberg’s casualness took some regulators by surprise and showed her misreading the seriousness of their intent. It was as if she were having a chat with friends, one person recalled. For all the talk of her vaunted political instincts, time and again, she revealed herself to be curiously oblivious and overconfident.
pgs. 467-468.

Both are seen as strategically out of touch, yet both continue to hold control over every key business decision.

Alex Stamos

Alex Stamos arrived at Facebook in 2015 as its new chief security officer. He quickly documented flaws in Facebook’s architecture that permitted privacy violations:

Worst of all, Stamos told them, was that despite firing dozens of employees over the last eighteen months for abusing their access, Facebook was doing nothing to solve or prevent what was clearly a systemic problem. In a chart, Stamos highlighted how nearly every month, engineers had exploited the tools designed to give them easy access to data for building new products to violate the privacy of Facebook users and infiltrate their lives. If the public knew about these transgressions, they would be outraged: for over a decade, thousands of Facebook’s engineers had been freely accessing users’ private data. The cases Stamos highlighted were only the ones the company knew about. Hundreds more may have slipped under the radar, he warned.

Zuckerberg was clearly taken aback by the figures Stamos presented, and upset that the issue had not been brought to his attention sooner. “Everybody in engineering management knew there were incidents where employees had inappropriately managed data. Nobody had pulled it into one place, and they were surprised at the volume of engineers who had abused data,” Stamos recalled.

Why hadn’t anyone thought to reassess the system that gave engineers access to user data? Zuckerberg asked. No one in the room pointed out that it was a system that he himself had designed and implemented. Over the years, his employees had suggested alternative ways of structuring data retention, to no avail. “At various times in Facebook’s history there were paths we could have taken, decisions we could have made, which would have limited, or even cut back on, the user data we were collecting,” said one longtime employee, who joined Facebook in 2008 and worked across various teams within the company. “But that was antithetical to Mark’s DNA. Even before we took those options to him, we knew it wasn’t a path he would choose.”
pgs. 22-24.

Alex Stamos finds Russian hackers

This example is key to understanding the company’s complete failure to stop Russian spies from exploiting Facebook’s platform. Stamos understood the risks:

Within a year of joining Facebook, Stamos had unearthed a major problem on the platform. But no one was responding to his reports, and the Russian activity was only escalating. On July 27, 2016, one of Facebook’s security engineers watched from his sofa as cable news broadcast a live press conference of Donald Trump speculating about the DNC hack. Trump wasted no time zeroing in on his opponent, suggesting that Russia might have intercepted some of Clinton’s emails from her private server. “I will tell you this: Russia, if you’re listening, I hope you’re able to find the thirty thousand emails that are missing,” he said. “I think you will probably be rewarded mightily by our press.”

The candidate’s words left the engineer astonished. He opened up his laptop to check if news sites were reporting on what he had just heard. Had a U.S. presidential candidate just actively called for Russia to hack his opponent? The engineer walked to the shower, where he stood under the hot water for a long time. “It just felt,” he said, “so wrong.” All summer his company had witnessed the Russian hacking firsthand, and now Trump was cheering the hackers on to go even further.
pgs. 173-174.

Catching Russian spies in real-time inside Facebook accounts

It is astonishing to discover how deep Facebook engineers’ access to user data ran. Then Russia began exploiting the platform ahead of the 2016 US presidential election:

Ned Moran was staring at his laptop watching a conversation unfold between a Russian hacker and an American journalist. A security analyst who worked with a specialized group at Facebook known as the threat intelligence team, Moran had earned a name among cybersecurity professionals for his prodigious knowledge of and experience in studying aggressive, state-backed hackers ….when he did speak, his words were so softly and deliberately spoken that people stopped what they were doing to lean in and listen. His reputation guaranteed that whatever he was about to say was worth hearing.

Moran knew more about foreign-backed hackers than nearly any other cybersecurity professional, but even he had not seen an exchange between hackers and a journalist target before. At times, minutes passed between messages as he waited, along with the American journalist, to see what the Russian would say next. From his perch in Facebook’s DC office, he knew the identities and locations of the two users trading messages.

Ever since he discovered the DCLeaks page earlier that August, he had been obsessively reading all its chats. In any other circumstance, he would not have been spying on the real-time communications of a journalist. But he had been following the Russians across Facebook and had seen when they started up a chat in Facebook Messenger with the journalist.

Moran saw what type of devices they were using and what type of searches they were running within Facebook. He knew that the Facebook page calling itself “DCLeaks” was a Russian asset. Just weeks before, he had uncovered the page while following a trail of bread crumbs left by the Russians as they sought to establish Facebook pages, groups, and accounts before the U.S. elections.

The Russians had created DCLeaks on June 8, and now they were using it to try to lure a U.S. journalist into publishing documents that the same group of Russian hackers had stolen from the Democratic Party.

It was an unprecedented moment of espionage, breaking every norm of cyberwarfare previously established. Moran knew it was significant, and he reported it to his superiors.

Just a few miles away from where Moran sat, U.S. intelligence officers were scrambling to learn as much as possible about the Russian hackers who had infiltrated the Clinton campaign. As experienced as the intelligence officers were, they lacked Facebook’s bird’s-eye view.

Facebook had no playbook for the Russian hackers, no policies for what to do if a rogue account spread stolen emails across the platform to influence U.S. news coverage. The evidence was clear: Russian hackers posing as Americans were setting up Facebook groups and coordinating with one another to manipulate U.S. citizens. But Facebook didn’t have a rule against it.
pgs. 159-164.

Sandberg and Zuckerberg were never compelled to act on these findings. HBO’s The Perfect Weapon sheds further light on Facebook’s culpability, and Nicole Perlroth’s This Is How They Tell Me the World Ends highlights how Russia used Facebook as a weapon. Yet another textbook response from Zuckerberg, on November 11, 2016, confirms his arrogance:

I think the idea that fake news on Facebook––of which it is a very small amount of the content––the idea that influenced the election in any way is a pretty crazy idea.
pg. 188

Really? Thankfully, Mark was absolutely roasted for this comment. Could there be a clearer example of why Facebook has gone so wrong as a company?

Permitting genocide by looking the other way

The most damning example of the company’s drive for profits above humanity was exposed during the genocide of the Rohingya. Of all the failures the company tripped over, Mark’s decision to ignore this holocaust is simply inexcusable. In 2018, a human rights group led by Matthew Smith called upon Facebook for help in tracking genocide attacks:

Most Burmese soldiers had Facebook on their phones, so the company would have records of the locations of army units’ soldiers to match with attacks on Rohingya villages.

Officials at Facebook had recently removed thousands of Facebook accounts and pages secretly run by the Burmese military that were flooded with hate speech, disinformation, and racist tropes. Facebook took down the posts following a front-page story that ran in the October 16 issue of the New York Times.

Smith approached members of Facebook’s policy and legal teams and asked them for their cooperation and to give the human rights groups evidence to prosecute. “Facebook had a lot of actionable information that could be used in prosecution for crimes,” Smith said. “They had information that could connect soldiers to where massacres took place. I told them that that could be used by prosecutors looking to bring people to justice at the International Criminal Court.”

The lawyers at Facebook refused the request, claiming that handing over data could be a violation of privacy terms. Then they got bogged down in legal technicalities. Soldiers could claim they were carrying out orders from military leaders and sue Facebook for exposing them.

Facebook would cooperate, Smith was told, only if the United Nations created a mechanism to investigate human rights crimes. When Smith pointed out that the United Nations had such a system in place, known as the Independent Investigative Mechanism for Myanmar, the Facebook representative looked at him with surprise and asked him to explain. “I was shocked. We were meeting with Facebook to talk about international justice in Myanmar, and they didn’t know the basic frameworks created by the UN.”
pgs. 294-298.

My review excludes the many privacy violations, both to stay under 1,000 words and to keep my blood pressure down. But worry not: all of Facebook’s invasions of user privacy are well documented in the book.

In conclusion, Sheera Frenkel and Cecilia Kang tell an amazing story of lost values and hockey-stick profits that harmed the world. After reading a number of powerful books this year, An Ugly Truth may well be my book of the year.


PBS NewsHour | Is Facebook putting company over country?

MSNBC | Book ‘An Ugly Truth’ Looks At Facebook And Its ‘Battle For Domination’

Yahoo Finance | Cecilia Kang discusses her book, ‘An Ugly Truth’

WGN News | “An Ugly Truth: Inside Facebook’s Battle for Domination”

The Realignment | Sheera Frenkel and Cecilia Kang

ABC News (Australia) | New book looks at inner workings of Facebook