Security and Privacy Challenges for Healthcare
Item: Discovering mHealth Users’ Privacy and Security Concerns through Social Media Mining (2023-01-03)
This study explores the privacy and security concerns that social media users convey about mHealth wearable technologies, using Grounded Theory and Text Mining methodologies. The emerging theory categorizes user concerns as relating to data management, data surveillance, data invasion, technical safety, or legal and policy issues. The results show that, over time, mHealth users remain concerned about areas such as security breaches, real-time data invasion, surveillance, and how companies use the data collected from these devices. Further, the emotion and sentiment analyses revealed that users generally exhibited anger and fear and expressed predominantly negative sentiment. Theoretically, the results also support the literature on user acceptance of mHealth wearables as influenced by distrust of companies and their use of personally harvested data.
Item: At What Price? Exploring the Potential and Challenges of Differentially Private Machine Learning for Healthcare (2023-01-03)
The increased generation of data has become one of the main drivers of technological innovation in healthcare. This applies in particular to the adoption of Machine Learning models used to generate value from the growing volume of available healthcare data. However, the increased processing of sensitive healthcare data brings challenges in terms of data privacy. Differential privacy, the method of adding randomness to the data to increase privacy, has gained popularity in recent years as a possible solution. However, while the added randomness increases privacy, it also reduces overall model performance, creating a privacy-utility trade-off. Examining this trade-off, we contribute to the literature with an empirical paper that experimentally evaluates two prominent and innovative methods of differentially private Machine Learning on medical image and text data, deepening the understanding of the potential and challenges of such methods for the healthcare domain.
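The privacy-utility trade-off described in this abstract can be illustrated with the classic Laplace mechanism: the smaller the privacy-loss parameter epsilon, the more noise is added and the less accurate the released statistic becomes. The sketch below is not from the paper; the data and function names are illustrative, and it uses a differentially private mean rather than a full Machine Learning model.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. Exponential(1/scale) draws
    # is Laplace-distributed with the given scale.
    return scale * (math.log(random.random()) - math.log(random.random()))

def dp_mean(values, lower, upper, epsilon):
    """Epsilon-differentially private mean via the Laplace mechanism.

    Each value is clamped to [lower, upper], so one record can shift
    the mean by at most (upper - lower) / n; adding Laplace noise with
    scale sensitivity / epsilon yields epsilon-differential privacy.
    """
    n = len(values)
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)

random.seed(0)
# Hypothetical patient ages; not the paper's dataset.
ages = [random.randint(20, 90) for _ in range(1000)]
for eps in (0.01, 0.1, 1.0, 10.0):
    print(f"epsilon={eps:<5} noisy mean={dp_mean(ages, 0, 100, eps):.2f}")
```

Running the loop shows the trade-off directly: estimates at epsilon = 0.01 swing widely around the true mean, while at epsilon = 10 the noise is nearly negligible.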
Item: Privacy-Preserving Deep Learning Model for Covid-19 Disease Detection (2023-01-03)
Recent studies demonstrated that X-ray radiography showed higher accuracy than Polymerase Chain Reaction (PCR) testing for COVID-19 detection. Therefore, applying deep learning models to X-ray and radiography images increases the speed and accuracy of identifying COVID-19 cases. However, under Health Insurance Portability and Accountability Act (HIPAA) compliance requirements, hospitals are unwilling to share patient data because of privacy concerns. To maintain privacy, we propose using differentially private deep learning models to secure patients' private information. A dataset from the Kaggle website is used to evaluate the designed model for COVID-19 detection. The EfficientNet model version was selected for its highest test accuracy. Differential privacy constraints were then injected into the best-obtained model to evaluate performance. Accuracy is reported while varying the trainable layers, the privacy loss, and the information limit per sample. We obtained 84% accuracy with a privacy loss of 10 during the fine-tuning process.
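Differentially private training of deep models of the kind this abstract describes is typically done with a DP-SGD-style update: clip each example's gradient to bound its influence, then add Gaussian noise to the aggregate. The paper's own code is not published; the sketch below is a minimal, library-free illustration of that mechanism on plain gradient vectors, with all names hypothetical (production systems would use a framework such as Opacus or TensorFlow Privacy instead).

```python
import math
import random

def clip(grad, max_norm):
    # Scale a per-example gradient so its L2 norm is at most max_norm.
    norm = math.sqrt(sum(g * g for g in grad))
    factor = min(1.0, max_norm / (norm + 1e-12))
    return [g * factor for g in grad]

def dp_sgd_step(weights, per_example_grads, lr, max_norm, noise_mult):
    """One DP-SGD-style step: clip each example's gradient,
    sum, add Gaussian noise scaled by noise_mult * max_norm,
    then apply the averaged noisy gradient."""
    n = len(per_example_grads)
    clipped = [clip(g, max_norm) for g in per_example_grads]
    summed = [sum(gs) for gs in zip(*clipped)]
    noisy = [s + random.gauss(0.0, noise_mult * max_norm) for s in summed]
    return [w - lr * (g / n) for w, g in zip(weights, noisy)]

random.seed(0)
# Toy example: two model weights, two per-example gradients.
w = dp_sgd_step([0.0, 0.0],
                [[3.0, 4.0], [0.6, 0.8]],
                lr=0.1, max_norm=1.0, noise_mult=1.1)
print(w)
```

The clipping bound caps any single patient's influence on the update, and the noise multiplier (together with the number of steps) determines the cumulative privacy loss, the epsilon of 10 reported in the abstract being one such accounting outcome.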
Item: Toward an Effective SETA Program: An Action Research Approach (2023-01-03)
This study uses action research methods at a large US healthcare facility to create a security education, training, and awareness (SETA) program focused on three threats: phishing, unauthorized use of cloud services, and password sharing. The SETA training was based on self-regulation theory. Findings indicate that the training was effective at helping users identify and avoid all three threats to the environment. Future research directions based on this study are also discussed.
Item: Design Features and User Perspectives of Patient Confidentiality and Consent Features for Substance Use Disorder (2023-01-03)
Software initiatives that collect, store, and share substance use disorder (SUD) patient data face significant barriers, including patient privacy and confidentiality requirements beyond HIPAA. This paper reports on a design science research effort aimed at managing peer support and clinician interactions with individuals with SUD and sharing their pertinent information with emergency rooms (ERs). Privacy and confidentiality challenges are presented along with the resulting system design, which was implemented in the state of Alabama. A usability and security survey was administered to system end users (n=18) to assess information security perceptions. Findings suggest respondents were satisfied with the system, while several users felt that the system was unnecessarily complex due to its multi-factor authentication (MFA) and auto-logoff features. Specific confidentiality considerations and logic mechanisms should be designed into the application functionality at the project outset to facilitate transitions-of-care benefits for SUD patients.