Privacy Versus The 'Tyranny Of The Algorithm'
Health, social media, buying trends, and other data and activity are routinely bartered for profit, but at what cost to the consumer or user?
November 5, 2014
PRIVACY XCHANGE FORUM 2014 -- Scottsdale, Ariz. -- There's the black market trade of patient medical information, and then there's the legal one. Twitter, for example, sells tweets that mention mind and body wellness to some data analytics firms, privacy advocates say.
"The worst thing possible has happened: Information about our minds and bodies is for sale and in millions of databases," Dr. Deborah Peel, founder and chairwoman of Patient Privacy Rights, said here at the Privacy XChange Forum this week.
A recent study looked at more than 500,000 tweets about depression, took 4,000 tweets that mentioned a diagnosis or medication, and followed those Twitter users in order to create an app that predicts suicide. This use of tweets crosses a line, Peel said. "This is far more intrusive" than standard data-gathering from social media.
"There's so much legal trade in medical data now," said Peel, whose organization advocates for patients' rights to control the bartering and use of their electronic health record information and activity. You can't stop the data collection, but there needs to be a "chain of custody" and some boundaries and disclosure for patients.
Medical data is also valuable to criminals, and medical identity theft often takes longer to discover than other types. By the time many victims learn that their medical insurance was used by someone else, their insurance carrier may have dropped them or jacked up their premium, according to Peel. Criminals target electronic medical records, prescriptions, and insurance information to pay for their own medical expenses or to acquire prescription drugs illegally.
Cybercrime today is dominated by theft of payment cards and other personally identifiable information that can be monetized easily and quickly in the online carder community and other nefarious forums. But consumers and users are also at risk of their privacy being abused or inadvertently exposed to attackers by legal data brokers (a.k.a. data analytics companies), whose business is all about the gathering, buying, and selling of information that can be aggregated into intelligence for marketing and business purposes.
The Internet of Things also comes with privacy and security implications. Kevin Ashton, general manager of Belkin International's Conserve and a creator of sensor-based technology used in smart grids and smart meters, says the convenience of emerging network-connected devices also comes with some risk. "It means your privacy is not just at risk when you interact with a device, but your privacy is now at risk when you interact with the world" of devices around you, he said in a keynote address here.
"Privacy is not the default setting," he said. And most free online things come at the price of privacy. "If you're not paying for it, you're the product. The price online is free, but not free of the cost of personal privacy."
David Vladeck, former director of the Federal Trade Commission's Consumer Protection Bureau and now a faculty member of Georgetown University Law Center, says data analytics companies hire data analysts and cognitive psychologists to manipulate consumers into certain purchasing decisions.
Jonathan Mayer, a computer scientist and lawyer from Stanford University, led Vladeck and Jules Polonetsky, executive director and co-chairman of Future of Privacy Forum, in a head-to-head debate at the Privacy XChange Forum over controversial ways companies like Target and Facebook have used customer and member information.
"It's what I call the tyranny of the algorithm," Vladeck said. "What happens on the Internet is driven by algorithms. There are ethical constraints that need to be debated."
Mayer cited a recent social experiment by Facebook and Cornell University, where Facebook skewed some members' news feeds to show positive-sounding posts to see if it would result in more positive engagement on the social network.
"How we police algorithms and ethics will be the defining moral and ethical issue for this generation," Polonetsky said of the study. "Where and when are these decisions going to be shaped by corporate greed? Government benevolence? We don't even have a clear way to think about it."
Then there's the problem with the quality of the data being gathered by the major data brokers. Vladeck cited a recent Federal Trade Commission report on nine of the largest brokers. The report found that much of the data they are gathering is of poor quality, and some of it is inaccurate. "They get about half of it right and half of it wrong," he said.
Meanwhile, privacy is not exactly high on the corporate budget priority list. A new survey of Fortune 1000 companies by the International Association of Privacy Professionals found that, though 33% plan to hire more staff with privacy skills, the privacy budget today is dwarfed by the average security budget. Privacy gets about $2.4 million per year from the average Fortune 1000 company, while security gets about $4.1 million.