Advances in artificial intelligence have created new threats to the privacy of health data, a new UC Berkeley study shows.
The study, led by Anil Aswani, a professor in the Department of Industrial Engineering & Operations Research (IEOR) in the College of Engineering, suggests that current laws and regulations are nowhere near sufficient to keep an individual’s health status private in the face of AI development. The research was published today in JAMA Network Open.
In the work, which was funded in part by UC Berkeley’s Center for Long-Term Cybersecurity, Aswani shows that artificial intelligence can identify individuals by learning daily patterns in step data (like that collected by activity trackers, smartwatches and smartphones) and correlating them with demographic data. Mining two years’ worth of data covering more than 15,000 Americans led the team to conclude that the privacy standards associated with 1996’s HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked.
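The study’s actual approach relies on machine-learned patterns in step data; as a toy illustration of the underlying linkage risk, the sketch below (entirely fabricated data, simplified exact-match logic rather than the study’s methods) shows how a “de-identified” dataset can be re-identified by joining on leftover demographic fields:

```python
# Illustrative re-identification by record linkage. All data here is
# fabricated; real attacks use richer features and probabilistic matching.

# "Anonymized" activity data: names stripped, but demographics remain.
activity = [
    {"avg_daily_steps": 4200, "age": 34, "sex": "F", "zip": "94704"},
    {"avg_daily_steps": 9100, "age": 61, "sex": "M", "zip": "94110"},
]

# A separately obtained, identified demographic dataset.
identified = [
    {"name": "Alice", "age": 34, "sex": "F", "zip": "94704"},
    {"name": "Bob",   "age": 61, "sex": "M", "zip": "94110"},
]

def link(records, roster, keys=("age", "sex", "zip")):
    """Attach a name to each record whose quasi-identifiers match
    exactly one entry in the identified roster."""
    linked = []
    for rec in records:
        matches = [p for p in roster
                   if all(p[k] == rec[k] for k in keys)]
        if len(matches) == 1:  # unique match => record re-identified
            linked.append({**rec, "name": matches[0]["name"]})
    return linked

for r in link(activity, identified):
    print(r["name"], r["avg_daily_steps"])
```

The point of the sketch is that neither dataset alone reveals the pairing of name and health measure; the join on shared quasi-identifiers does, which is why stripping names alone offers weaker protection than many people assume.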
“We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.,” Aswani says. “The results point out a major problem. If you strip all the identifying information, it doesn’t protect you as much as you’d think. Someone else can come back and put it all back together if they have the right kind of information.”
“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” he explains. “Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”
Aswani makes it clear that the problem isn’t with the devices, but with how the information the devices capture can be misused and potentially sold on the open market.
“I’m not saying we should abandon these devices,” he says. “But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it’s a net positive.”
Though the study specifically looked at step data, Aswani says the results suggest a broader threat to the privacy of health data. “HIPAA regulations make your health care private, but they don’t cover as much as you think,” he says. “Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It’s supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it.”
Aswani says he is worried that as advances in AI make it easier for companies to gain access to health data, the temptation for companies to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance.
“Ideally, what I’d like to see from this are new regulations or rules that protect health data,” he says. “But there is actually a big push to even weaken the regulations right now. For instance, the rule-making group for HIPAA has requested comments on increasing data sharing. The risk is that if people are not aware of what’s happening, the rules we have will be weakened. And the fact is the risks of us losing control of our privacy when it comes to health care are actually increasing and not decreasing.”