Privacy & Cybersecurity

How Are We Going To Use Our Health Data For Public Good?

November 18, 2019

Google is accessing the health data of millions of Americans, supposedly to develop algorithms able to diagnose some medical problems. What it is doing is legal, but it has set off a privacy scare and a federal inquiry after a company employee wrote an anonymous article in The Guardian highlighting the lack of anonymity and raising concerns about how the data might be used in the future.

Known internally as Project Nightingale, the project involves some 250 employees from Google and from the health giant Ascension. Google has denied any wrongdoing or privacy violations, stating that the company is simply building a new internal search tool for the Ascension hospital network and that no patient data is being used for Google’s artificial intelligence research. This is an extremely interesting project, but one that requires strict privacy protections. Google’s parent company, Alphabet, which also works on health-related issues through its subsidiaries Calico and Verily, and which recently acquired Fitbit, is not the only technology company interested in collecting and analyzing health data: Apple has been hiring doctors for some time and working with Stanford University on large-scale studies using heart rate data from the Apple Watches of almost half a million users, and Amazon also appears to have set its sights on the healthcare market.

These types of studies, which combine machine learning with these companies’ evident expertise in handling health data at scale, are a new frontier for medicine, where research is typically far less ambitious and involves much smaller amounts of data, and they offer the possibility of great advances for society. That said, concerns are justified if such studies take place under conditions that allow, by action or omission, participants’ health data to be exposed or used for purposes other than those originally established, or if people are not allowed to opt out, as appears to be the case here.

The full Forbes article can be viewed at this link.  

Patients’ and public views and attitudes towards the sharing of health data for research: a narrative review of the empirical evidence

November 17, 2019

International sharing of health data opens the door to the study of so-called ‘Big Data’, which holds great promise for improving patient-centred care. The failure of recent data sharing initiatives indicates an urgent need to invest in societal trust in researchers and institutions. Key to an informed understanding of such a ‘social license’ is identifying the views patients and the public hold with regard to data sharing for health research.

We performed a narrative review of the empirical evidence addressing patients’ and public views and attitudes towards the use of health data for research purposes. The literature databases PubMed (MEDLINE), Embase, Scopus and Google Scholar were searched in April 2019 to identify relevant publications. Patients’ and public attitudes were extracted from selected references and thematically categorised.

Twenty-seven papers were included in the review, comprising qualitative and quantitative studies and systematic reviews. Results suggest widespread, though conditional, support among patients and the public for data sharing for health research. Although participants recognised actual or potential benefits of data research, they expressed concerns about breaches of confidentiality and potential abuses of the data. Studies showed agreement on the following conditions: value, privacy, risk minimisation, data security, transparency, control, information, trust, responsibility and accountability.

Our results indicate that a social license for data-intensive health research cannot simply be presumed. To strengthen the social license, identified conditions ought to be operationalised in a governance framework that incorporates the diverse patient and public values, needs and interests.

The full article can be downloaded below.  

Public Concern About Monitoring Twitter Users and Their Conversations to Recruit for Clinical Trials: Survey Study

November 03, 2019

Social networks such as Twitter offer the clinical research community a novel opportunity for engaging potential study participants based on user activity data. However, the availability of public social media data has led to new ethical challenges about respecting user privacy and the appropriateness of monitoring social media for clinical trial recruitment. Researchers have voiced the need for involving users’ perspectives in the development of ethical norms and regulations.

This study examined the attitudes and level of concern among Twitter users and nonusers about the monitoring of Twitter users and their conversations to recruit potential clinical trial participants.

We used two online methods to recruit study participants: the open survey was (1) advertised on Twitter and (2) deployed on TurkPrime, a crowdsourcing data acquisition platform, both between May 23 and June 8, 2017. Eligible participants were adults, 18 years of age or older, who lived in the United States. People with and without Twitter accounts were included in the study.

While nearly half the respondents, recruited on Twitter (94/603, 15.6%) and on TurkPrime (509/603, 84.4%), indicated agreement that social media monitoring constitutes a form of eavesdropping that invades their privacy, over one-third disagreed and nearly 1 in 5 had no opinion. A chi-square test revealed a positive relationship between respondents’ general privacy concern and their average concern about Internet research (P<.005). We found associations between respondents’ Twitter literacy and their concerns about researchers’ ability to monitor their Twitter activity for clinical trial recruitment (P=.001), and about whether they consider Twitter monitoring for clinical trial recruitment to be eavesdropping (P<.001) or an invasion of privacy (P=.003). As Twitter literacy increased, so did people’s concerns about researchers monitoring Twitter activity. Our data support the previously suggested nonexceptionalist methodology for assessing social media in research, insofar as social media-based recruitment does not need to be considered exceptional and, for most respondents, is considered preferable to traditional in-person interventions at physical clinics. The expressed attitudes were highly contextual, depending on factors such as the type of disease or health topic (eg, HIV/AIDS vs obesity vs smoking), the entity or person monitoring users on Twitter, and the monitored information.
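
As a purely illustrative sketch (with synthetic counts, not the study’s data), the snippet below shows how a chi-square test of independence of the kind reported above can relate general privacy concern to concern about Internet research:

```python
# Illustrative only: a synthetic contingency table, not the survey's data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: general privacy concern (low, medium, high)
# Columns: concern about Internet research (low, medium, high)
observed = np.array([
    [60, 25, 15],
    [30, 50, 40],
    [10, 35, 75],
])

# chi2_contingency tests whether the row and column variables are independent
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4g}")
```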

The data and findings from this study contribute to the critical dialogue with the public about the use of social media in clinical research. The findings suggest that most users do not think that monitoring Twitter for clinical trial recruitment constitutes inappropriate surveillance or a violation of privacy. However, researchers should remain mindful that some participants might find social media monitoring problematic when it is connected with certain conditions or health topics. Further research should isolate factors that influence the level of concern among social media users across platforms and populations and inform the development of clearer and more consistent guidelines.

The full article can be downloaded below.  

Long-term integrity protection of genomic data

November 03, 2019

Genomic data is crucial to the understanding of many diseases and to the guidance of medical treatments. Pharmacogenomics and cancer genomics are just two areas in precision medicine seeing rapidly growing utilization. At the same time, whole-genome sequencing costs are plummeting below $1000, so a rapid growth in full-genome data storage requirements is foreseeable. While privacy protection of genomic data is receiving growing attention, integrity protection of this long-lived and highly sensitive data has received much less. We consider a scenario inspired by future pharmacogenomics, in which a patient’s genome data is stored over a long time period while random parts of it are periodically accessed by authorized parties such as doctors and clinicians. A protection scheme is described that preserves the integrity of the genomic data in that scenario over a time horizon of 100 years. Over such a long period, cryptographic schemes will potentially be broken, and the scheme therefore allows the integrity protection to be updated. Furthermore, the integrity of parts of the genomic data can be verified without compromising the privacy of the remaining data. Finally, a performance evaluation and cost projection shows that privacy-preserving long-term integrity protection of genomic data is resource demanding, but within reach of current and future hardware technology, and that it has negligible storage costs.
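
One way to get a feel for part of what the abstract describes, verifying the integrity of individual parts of the data without exposing the rest, is a Merkle tree over genome chunks. The sketch below is an illustrative stand-in under that assumption, not the authors’ scheme, and it does not cover the long-term renewal of cryptographic protection that the paper addresses; the chunking, hash choice, and helper names are all made up for illustration.

```python
# Minimal sketch: a Merkle tree over data chunks lets an authorized party
# verify one chunk against a stored root hash without seeing the other chunks.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(chunks):
    """Return the list of tree levels, leaf hashes first, root last."""
    level = [h(c) for c in chunks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                    # duplicate last node if odd
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def proof(levels, index):
    """Sibling hashes needed to recompute the root for one leaf."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        sibling = index ^ 1                   # sibling index at this level
        path.append((level[sibling], index % 2 == 0))  # (hash, node-is-left?)
        index //= 2
    return path

def verify(chunk, path, root):
    node = h(chunk)
    for sibling, is_left in path:
        node = h(node + sibling) if is_left else h(sibling + node)
    return node == root

# Example: store only the root; later verify chunk 2 without the other chunks.
chunks = [b"chunk-%d" % i for i in range(8)]  # stand-ins for genome segments
levels = build_tree(chunks)
root = levels[-1][0]
print(verify(chunks[2], proof(levels, 2), root))  # True
```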

The full article can be downloaded below.  

Building a Secure Biomedical Data Sharing Decentralized App: Tutorial

October 26, 2019

Decentralized apps (DApps) are computer programs that run on a distributed computing system, such as a blockchain network. Unlike the client-server architecture that powers most internet apps, DApps that are integrated with a blockchain network can execute app logic that is guaranteed to be transparent, verifiable, and immutable. This new paradigm has a number of unique properties that are attractive to the biomedical and health care communities. However, instructional resources are scarcely available for biomedical software developers to begin building DApps on a blockchain. Such apps require new ways of thinking about how to build, maintain, and deploy software. This tutorial serves as a complete working prototype of a DApp, motivated by a real use case in biomedical research requiring data privacy. We describe the architecture of a DApp, the implementation details of a smart contract, a sample iPhone operating system (iOS) DApp that interacts with the smart contract, and the development tools and libraries necessary to get started. The code necessary to recreate the app is publicly available.
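
To make the idea concrete, here is a minimal, hypothetical sketch (in plain Python rather than a contract language, and not the tutorial’s published code) of the kind of access-control logic a biomedical data-sharing smart contract might expose: only a hash of the record is registered, grants are owner-controlled and time-limited, and every action is appended to an event log that mimics on-chain transparency. All class, method, and field names are illustrative assumptions.

```python
# Hypothetical model of data-sharing contract logic; data itself stays off-chain.
import hashlib
import time

class DataSharingContract:
    def __init__(self, owner: str):
        self.owner = owner
        self.permissions = {}   # (data_hash, grantee) -> expiry timestamp
        self.event_log = []     # append-only, mimicking on-chain events

    def _emit(self, event: str, **fields):
        self.event_log.append({"event": event, "time": time.time(), **fields})

    def register_record(self, record_bytes: bytes) -> str:
        """Store only a hash of the record; the record itself stays off-chain."""
        data_hash = hashlib.sha256(record_bytes).hexdigest()
        self._emit("RecordRegistered", data_hash=data_hash, owner=self.owner)
        return data_hash

    def grant_access(self, caller: str, data_hash: str, grantee: str, ttl_s: int):
        """Owner-only, time-limited access grant."""
        if caller != self.owner:
            raise PermissionError("only the owner can grant access")
        self.permissions[(data_hash, grantee)] = time.time() + ttl_s
        self._emit("AccessGranted", data_hash=data_hash, grantee=grantee)

    def has_access(self, data_hash: str, grantee: str) -> bool:
        expiry = self.permissions.get((data_hash, grantee))
        return expiry is not None and time.time() < expiry

# Example usage
contract = DataSharingContract(owner="patient-0x01")
record_hash = contract.register_record(b"de-identified study record")
contract.grant_access("patient-0x01", record_hash, grantee="researcher-0x02", ttl_s=3600)
print(contract.has_access(record_hash, "researcher-0x02"))  # True
```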

The full article can be downloaded below.  

HealthGuard: A Machine Learning-Based Security Framework for Smart Healthcare Systems

September 29, 2019

The integration of Internet-of-Things and pervasive computing in medical devices has made the modern healthcare system "smart." Today, the function of the healthcare system is no longer limited to treating patients. With the help of implantable medical devices and wearables, a Smart Healthcare System (SHS) can continuously monitor different vital signs of a patient and automatically detect and prevent critical medical conditions. However, these increasing functionalities of the SHS raise several security concerns, and attackers can exploit the SHS in numerous ways: they can impede the normal function of the SHS, inject false data to change vital signs, and tamper with a medical device to change the outcome of a medical emergency. In this paper, we propose HealthGuard, a novel machine learning-based security framework to detect malicious activities in an SHS. HealthGuard observes the vital signs of different connected devices of an SHS and correlates the vitals to understand the changes in the patient's body functions and to distinguish benign from malicious activities. HealthGuard utilizes four different machine learning-based detection techniques (Artificial Neural Network, Decision Tree, Random Forest, k-Nearest Neighbor) to detect malicious activities in an SHS. We trained HealthGuard with data collected from eight different smart medical devices for twelve benign events, including seven normal user activities and five disease-affected events. Furthermore, we evaluated the performance of HealthGuard against three different malicious threats. Our extensive evaluation shows that HealthGuard is an effective security framework for the SHS, with an accuracy of 91% and an F1 score of 90%.
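
As a rough illustration of the classification step (not the authors’ code or data), the sketch below trains one of the four detector types, a random forest, on synthetic vital-sign snapshots in which "malicious" samples break the usual correlation between devices. The feature set and distributions are assumptions made for the example.

```python
# Synthetic-data sketch of a vital-sign anomaly classifier (not HealthGuard itself).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)

# Benign vitals: heart rate, systolic BP, SpO2, glucose with plausible correlations
benign = np.column_stack([
    rng.normal(75, 8, 1000),    # heart rate (bpm)
    rng.normal(120, 10, 1000),  # systolic BP (mmHg)
    rng.normal(97, 1, 1000),    # SpO2 (%)
    rng.normal(100, 15, 1000),  # glucose (mg/dL)
])
# Malicious vitals: injected false readings that break cross-device consistency
malicious = np.column_stack([
    rng.normal(75, 8, 1000),
    rng.normal(180, 10, 1000),  # implausible BP given a normal heart rate
    rng.normal(85, 2, 1000),
    rng.normal(100, 15, 1000),
])

X = np.vstack([benign, malicious])
y = np.array([0] * len(benign) + [1] * len(malicious))  # 0 = benign, 1 = malicious
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("F1:", f1_score(y_test, pred))
```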

The full article can be downloaded below.  

A review on intelligent wearables: Uses and risks

September 21, 2019

Intelligent wearable technology is becoming very popular in application fields such as clinical medicine and healthcare, health management, workplaces, education, and scientific research. Using the four-element model of technological behavior, the first part of this review briefly introduces issues related to the uses of intelligent wearables, including the technologies (i.e., what kinds of intelligent wearables are used?), the users (i.e., who uses intelligent wearables?), the activities involving the technologies (i.e., in what activities or fields are intelligent wearables used?), and the effects of technology usage (i.e., what benefits do intelligent wearables bring?). The second part of this review focuses on the risks of using intelligent wearables. It summarizes five common risks (i.e., privacy risks, safety risks, performance risks, social and psychological risks, and other risks) in the use of intelligent wearables. The review ends with a discussion of future research.

The full article can be downloaded below.  

Opportunities and Challenges in Interpreting and Sharing Personal Genomes

September 01, 2019

The 2019 “Personal Genomes: Accessing, Sharing and Interpretation” conference (Hinxton, UK, 11–12 April 2019) brought together geneticists, bioinformaticians, clinicians and ethicists to promote openness and ethical sharing of personal genome data while protecting the privacy of individuals. The talks at the conference focused on two main topic areas: (1) Technologies and Applications, with emphasis on personal genomics in the context of healthcare. The issues discussed ranged from new technologies impacting and enabling the field, to the interpretation of personal genomes and their integration with other data types. There was particular emphasis and wide discussion on the use of polygenic risk scores to inform precision medicine. (2) Ethical, Legal, and Social Implications, with emphasis on genetic privacy: How to maintain it, how much privacy is possible, and how much privacy do people want? Talks covered the full range of genomic data visibility, from open access to tight control, and diverse aspects of balancing benefits and risks, data ownership, working with individuals and with populations, and promoting citizen science. Both topic areas were illustrated and informed by reports from a wide variety of ongoing projects, which highlighted the need to diversify global databases by increasing representation of understudied populations.
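
As a toy illustration of the polygenic risk score idea discussed at the conference, a PRS is simply a weighted sum of risk-allele dosages; all variant IDs, effect sizes, and genotypes below are made up for the example.

```python
# Hypothetical example: PRS = sum_i (effect_size_i * dosage_i)
# One person's genotype: dosage of the risk allele at each variant (0, 1, or 2)
dosages = {"rs0001": 1, "rs0002": 0, "rs0003": 2, "rs0004": 1}
# Per-variant effect sizes (e.g., log odds ratios) from a hypothetical GWAS
effect_sizes = {"rs0001": 0.12, "rs0002": 0.08, "rs0003": -0.05, "rs0004": 0.20}

prs = sum(effect_sizes[variant] * dosage for variant, dosage in dosages.items())
print(f"PRS = {prs:.3f}")
```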

The full conference report can be downloaded below.  

Health Law Advisor: Free the Data! … Better Think Twice … Legal Issues Regarding Data Sharing and Secondary Data Use

August 29, 2019

Blog from Patricia Wagner & Alaap Shah of Epstein Becker Green.

Data is king! A robust privacy, security and data governance approach to data management can position an organization to avoid pitfalls and maximize value from its data strategy. In fact, some of the largest market cap firms have successfully harnessed the power of data for quite some time. To illustrate this point, The Economist boldly published an article entitled “The world’s most valuable resource is no longer oil, but data.” This makes complete sense when research shows that 90% of all data today was created in the last two years, which translates to approximately 2.5 quintillion bytes of data per day.

Download to read more...

Health Law Advisor: Follow the Leader: California Paves the Way for Other States to Strengthen Privacy Protections

August 29, 2019

March 7, 2019 blog by Daniel Kim & Alaap Shah of Epstein Becker Green.

Consumer privacy protection continues to be top of mind for regulators given a climate where technology companies face scrutiny for lax data governance and poor data stewardship. Less than a year ago, California passed the California Consumer Privacy Act (CCPA) of 2018 to strengthen its privacy laws. In many regards, the CCPA served as a watershed moment in privacy due to its breadth and its similarities to the E.U.’s sweeping General Data Protection Regulation (GDPR).

Download to read more...