NEW YORK (MainStreet) -- From our financial backgrounds to health information that could affect our ability to get insurance or land a job, how much data are we really giving away about ourselves?
The risks take on lots of different shapes: You've got the hackers in one corner, threatening to steal your identity and take advantage of you through illegal means. You've got corporations and governments in another, happily learning everything about you that they can, usually from data whose release you signed off on, whether you knew it or not. And then you've got the in-betweens: the flashlight apps that gather details about you that far transcend turning on the flash of your phone, the companies (and governments) that cross the line. In a largely unregulated new world of information technology, it's often hard for consumers to know where the line is at all, or if it even exists.
When thinking about which parts of your personal information are being shared, how they're being shared and whether that sharing is the default, the core question of "Can I trust this company with my data?" is something consumers need to address, said Chris Babel, CEO of online privacy management company TRUSTe.
Sometimes we choose to let companies collect data about us because of the benefits we receive: While addressing a room at the recent Internet Week conference in New York, Babel said he could see his Google Nest account in real time and confirm that the heat was off in his house. "This app doesn't collect much data, just if the heat is on or off, if you're home, and so on," he said. "The program was bought by Google, but they repeatedly insist they're not merging their data with Google's." Is the customer to believe that? It comes down to how much you trust the company.
Babel also owns a watch that collects data on where he is at all times. He uses it specifically for kite surfing, so the location information is both interesting and "almost critical," he says; the technology is particularly valuable for protecting himself from danger at sea. But that same data could be lucrative for a company or person to have. The question in any trade of information for utility is whether the benefits outweigh the compromise of privacy.
Other information could prove damning in unforeseen ways that affect a person's wallet. For example, Babel says, he ran a race in 2012 and tracked his activity levels. It made him nervous that, should the tracking continue, his insurer, Anthem, might follow his activity levels and adjust his rates over time.
As wearable technology grows and anxiety heightens, authorities are taking a more active approach to consumer protection.
FTC Commissioner Julie Brill plainly predicts that data connectivity will be a constant -- leading to the proliferation of personal information.
"Instead of opening a laptop or firing up your smartphone, you'll be online all the time, because all your devices will be connected," she said. "This will have a profound impact on consumers and the data flowing about them." Marketers will be able to target us demographically from our smart fridges, smart cars and smart digital scales -- sometimes offering us better rates based on our good behaviors, and perhaps worse rates based on our weaknesses.
Not all data-gathering is bad, Brill says, but the biggest issue is when information is being collected out of context, such as that infamous flashlight app. She believes that the mere act of forcing companies into transparency will create a feedback loop in which they start to think about what consumers would and wouldn't like: "It's the 'Aunt Emily' test. If you have to tell her and she's going to have a problem with it, maybe you shouldn't be doing that."
In one FTC study of 12 wearables and apps, the agency found that the companies were collecting and sending information to as many as 70 third parties, including data regarding pregnancy, ovulation and other deeply sensitive pieces of information about health. "It's one thing if the information is going where the consumer expects it to go, but once we're talking about third parties, we're outside the context of what consumers think is happening," Brill said.
Unbeknownst to many consumers, detailed profiles are being built to sort them into demographic categories used by marketers. "Information can be benign for marketing purposes, like 'Do you enjoy gardening? Do you like cats?'" Brill said. But these data brokers also create demographic profiles covering wide swaths of people, which she finds more troubling when it leads to categorizing people by ethnic group. Though this arguably can still have good uses -- such as reaching and helping a demographic at risk for a particular health condition -- the very same information can be used to target those individuals for scams and other problematic purposes. Brill described a case the FTC brought against a company that used such information to steer susceptible consumers toward payday loans and scam artists.
Brill is also interested in the potentially discriminatory effects of some data-gathering efforts. For example, when FICO scores first came out in the late 1990s and early 2000s, Congress required the FTC and other federal agencies to study whether they were having a discriminatory impact, she said. "It took us three years, and we determined it wasn't. That's one reason they're so ubiquitous right now, because they got a clean bill of health." Some companies' data models might have discriminatory effects the companies aren't even aware of, she said, including in the ways they use their own proprietary data -- "if you are thinking about which customers you want to push to the head of your customer service line and who to put on hold."
Regulating an industry as new and ever-evolving as data technology is deeply difficult, because we don't yet have well-defined structures for determining what is and isn't privileged information. "As a society, we decided that health information will be sensitive," Brill said. "We created HIPAA back in the '90s, when it was thought that medical information would only be in the hands of providers, so it was a siloed law." Fast forward 20 years, and deeply sensitive health information doesn't appear just in a doctor's medical record, but in wearables, apps and visits to sites like WebMD. That information could find its way into a data broker's file, or be combined with other information in ways potentially harmful to consumers.
"Whenever you try to bring regulation into high tech, the minute the law is written, it's instantly outdated," Babel said. There's the Children's Online Privacy Protection Rule, which is supposed to be updated every five years, he said, but the last round of updates took a year and a half in its own right and didn't take effect until two years after it was supposed to. Some measure of regulation comes from the industry itself, such as the Digital Advertising Alliance, which has set up a self-regulatory framework.
As businesses conceptualize their own privacy strategies, Babel recommends "privacy by design" -- thinking through the plethora of privacy issues at the outset rather than after the fact, including how companies collect data and what they will do with it.
Beyond how companies use data for themselves and distribute it to third parties, there's also the risk posed by hackers. "Especially with the Internet of Things, security is a huge issue," Brill said. "Hewlett-Packard did a study and found that 90% of devices are collecting personal information and 70% are sending it out unencrypted."
So what's a consumer to do? Terms and conditions don't exactly make light reading, but Brill and the FTC are working on layered permissions so consumers can say yes to certain line items and not others, instead of trudging through legalese. Brill also envisions a comprehensive control panel to manage all of your Internet of Things devices in one place, and to control permissions and understand privacy settings for all of them at once.
In the meantime, make sure you have strong passwords, and be conscious of which programs you grant permissions to. It may take a while yet for regulations and consumer protections to catch up, but we can at least protect ourselves.