"I'll have my team get back to you" was the constant refrain from Facebook Inc. (FB) CEO Mark Zuckerberg during the April Congressional hearing that forced the company to defend its data practices. (In case you weren't counting, Zuckerberg promised follow-ups at least 21 times.)
Well, Facebook finally delivered, in the form of a 450-page document released Monday night. In it, Facebook responded to some 2,000 follow-up questions on topics ranging from the Cambridge Analytica scandal to antitrust concerns, including a 75-page cross-examination from Sen. Ted Cruz focused on political bias in content moderation.
The flap isn't over for Facebook. Last week, a previously undisclosed data-sharing partnership with Huawei, a Chinese device maker flagged as a security threat by U.S. intelligence, rankled lawmakers. The House Judiciary Committee has also called representatives of Facebook, Google and Twitter to testify further on political bias on June 26.
So far, there's no reason to suspect that the ongoing scrutiny will seriously damage Facebook's empire, according to GBH Insights analyst Daniel Ives.
"It's background noise for Facebook and Zuckerberg, as post-Beltway and Brussels meetings it appears regulatory risk is minimal with the company's advertising kingdom thus far showing no signs of weakness post Cambridge," said GBH Insights analyst Daniel Ives.
Still, Facebook's lengthy, oftentimes evasive written statements pulled back the curtain—just a little bit—on the company's position on regulation, its vast data trove, and more. Below are some of Facebook's most intriguing answers, and non-answers.
Is Facebook a Monopoly?
Citing the combined $2.8 trillion market capitalization of the four largest tech companies (Facebook, Amazon (AMZN), Alphabet (GOOGL) and Microsoft (MSFT)), Sen. Dan Sullivan (R-AK) asked if Facebook's power is preventing 'the next Facebook' from emerging. Facebook didn't respond with a yes or no, but said that "the average American uses eight different apps to communicate with their friends and stay in touch with people." Name-checking numerous other services, the company argued that users have a wealth of choice in sharing content: "For instance, if you want to share a photo or video, you can choose between Facebook, DailyMotion, Snapchat, YouTube, Flickr, Twitter (TWTR), Vimeo, Google Photos and Pinterest among many other services." It didn't mention Instagram, among the most popular photo-sharing apps, or WhatsApp, both of which Facebook owns.
Data Sharing: Everybody's Doing It
Facebook defended its data-sharing practices as a standard feature of the Internet, citing companies such as Google and even the Senate's own websites as a counterpoint to the charge that it knows too much about users. "Most websites and apps share this same information with multiple different third-parties whenever people visit their website or app. For example, the Senate Commerce Committee's website shares information with Google and its affiliate DoubleClick and with the analytics company Webtrends," Facebook wrote.
Still, the company's data collection abilities are vast, and direct answers regarding the extent of its reach often read like a deliberate filibuster. Outside of its own website, Facebook admitted it can harvest data from "nearby Wi-Fi access points, beacons, and cell towers... information such as the operating system, hardware and software versions, battery level, signal strength, available storage space, browser type, app and file names and types, and plugins." And that barely scratches the surface. The 'Like' button and Pixel, a web tag used to harvest information for advertisers, were also cited as means of shadowing users around the Internet. "The button appeared on 8.4M websites, the Share button on 931K websites covering 275M webpages, and there were 2.2M Facebook pixels installed on websites," Facebook said.
Content Moderation: We're Only Human
Despite its vast resources, Facebook argued that when it comes to managing its own content, it's only human. In response to a lengthy grilling by Sen. Ted Cruz (R-TX), Facebook again admitted to "mistakes" in how it deals with issues like offensive content and its controversial Trending Topics section, which the company recently axed. Blaming human error, Facebook also reiterated that it's hiring several thousand more humans to keep an eye on content around the world. "Our content review teams around the world—which grew by 3,000 people last year—work 24 hours a day and in dozens of languages to review these reports. By the end of 2018, we will have doubled the number of people working on safety and security as compared to the beginning of the year—to a total of 20,000," Facebook wrote.
GDPR Rules Coming to America?
The European Union's General Data Protection Regulation (GDPR) has forced companies that do business in Europe to change how they collect and handle user data. Asked whether those protections would be extended to Americans and other users outside Europe, Facebook repeatedly made the case that it's already doing so. "In any case, the controls and settings that Facebook is enabling as part of GDPR are available to people around the world, including settings for controlling our use of face recognition on Facebook and for controlling our ability to use data we collect off Facebook Company Products to target ads," Facebook wrote.