Facebook (FB) is no longer allowing the data it has on its nearly 2 billion members to be used for surveillance purposes, such as locating activists and protesters, the company announced on its privacy page.
"We are committed to building a community where people can feel safe making their voices heard," Facebook deputy chief privacy officer Rob Sherman said in the post.
The move to protect its users' data comes only after heavy nudging from the American Civil Liberties Union, which released a report in October detailing how software developers at Geofeedia were using data from Facebook, Instagram (owned by Facebook) and Twitter (TWTR) to help police officers locate participants in the racially charged protests in Baltimore and Ferguson, Mo.
In one email obtained by the ACLU through public record requests, a Geofeedia employee told police officers about its special partnerships with Twitter and Instagram that give it access to data on their users.
Geofeedia marketed its platform to law enforcement officials and even suggested it could be used to specifically target activists of color, Nicole Ozer, the technology and civil liberties policy director at the ACLU of California, wrote in a post on Medium in September. Geofeedia said its services have been used by more than 500 organizations and 10,000 users.
After the ACLU alerted the three social media platforms last fall, they all cut off data access to Geofeedia. But the ACLU said that Facebook needed to create a public policy that specifically banned developers from using its data for surveillance, which the company did on Monday.
"Today we are adding language to our Facebook and Instagram platform policies to more clearly explain that developers cannot use data obtained from us to provide tools that are used for surveillance," Sherman wrote on Facebook's privacy page.
"So there's kind of a push and pull there where the public is getting two different messages about Facebook," Snyder said. "It will be interesting to see how these conflicting messages ultimately wash out."
Snyder said he mainly worries about the chilling effect now that protesters have reason to believe their locations could be monitored by the police. "That can have a very serious chilling effect on protesters' ability to exercise their First Amendment rights," he said. "It's going to cause some people to have second thoughts."
Cybersecurity expert John Sileo said he doesn't see this development making people more at ease about using Facebook. "I think as people learn more and more about what Facebook has been doing and it comes to light what the company's policy has been, I don't necessarily think they will get more comfortable," he said.
Sileo agreed, saying this was a good first step by Facebook, but that the real question is how strongly Facebook will implement the policy. "Do you have oversight and accountability? Do you have actions to take if someone violates the policy?" he pointed out. "It will be really interesting to see the first case where the policy gets utilized and to see how Facebook takes action on it," he added.
Of course, Facebook will still allow access to public posts for advertisers' software developers who want to monitor what customers are saying about their products, as well as for organizations like the Red Cross, which has used social data to provide better assistance during natural disasters. The policy update simply bars third-party software developers from using its data for surveillance.
The fact that Facebook doesn't actually define "surveillance" in its policy update shows that the company is allowing flexibility in its language, Sileo said. "My guess is that Facebook will never take steps that could limit revenue potential," he said.
Last year, Facebook came under fire over concerns that its advertising policy was violating anti-discrimination laws by allowing housing, employment or credit ads to exclude certain races from their target audience, ProPublica first reported. This could have been in violation of the Fair Housing Act of 1968 and the Civil Rights Act of 1964. "There are many nondiscriminatory uses of our ethnic affinity solution in these areas, but we have decided that we can best guard against discrimination by suspending these types of ads," Erin Egan, Facebook's chief privacy officer, said in response through a blog post.
Facebook has faced broader concerns about the privacy of its users in the past, prompting it to roll out a privacy checkup to each of its 1.28 billion users worldwide (at the time) in 2014. The company wants people to feel safe sharing photos and updates on its site because that's what keeps the platform alive and growing.
"What we really want is to enable people to share what they want," Mark Zuckerberg, Facebook's co-founder and chief executive, told the New York Times in 2014.
But Facebook is not the only tech giant that has been put in a difficult position in the debate over whether such companies should hand over private data to the government when asked.
In 2016, Apple (AAPL) refused to accommodate the U.S. Federal Bureau of Investigation's (FBI) request that it unlock the iPhone of Syed Farook, who had killed 14 people in the San Bernardino attack. Apple CEO Tim Cook said the request was "chilling," considering the company would have to build new software that could then be used to unlock millions of other iPhones. The FBI ended up using a third party to gain access to the smartphone.
E-commerce giant Amazon.com (AMZN) found itself in a similarly sticky situation earlier this year when it received a search warrant asking it to turn over audio recorded by one of its Echo devices near an alleged murder in Bentonville, Ark., in November 2015. Amazon fought the request, arguing that complying would violate defendant James Andrew Bates' First Amendment rights and would "chill users" from exercising their free speech in their own homes.
While Facebook's case differs from Apple's and Amazon's because Facebook was not asked to provide information directly to the government through a legal process, all three companies made decisions from a business perspective, Snyder said. "They have a business interest in making sure their users feel that their information is safe and not being sold off willy-nilly," he explained.
For example, while Apple should be praised for not being willing to crack its own code for the FBI, the company didn't want users to think, "Well if Apple can crack the code, then others can too," Snyder said. "Something like that could diminish people's confidence in Apple's products," he said. "These companies all have a real business interest."