AI Needs to Protect Data

Although healthcare generates vast amounts of data year after year, most of the data isn’t available because of the need to protect identifiable patient information.

With limited data access, AI models are often less reliable in the real world, which limits how healthcare professionals can use them.

To expand AI applications while still protecting patient data, the Department of Energy (DOE) has committed $1 million to a one-year collaborative research project titled PALISADE-X: Privacy Preserving Analysis and Learning in Secure and Distributed Enclaves and Exascale Systems (https://palisadex.net).

The goal of the project is to create a secure AI framework that enables healthcare organizations to improve AI models used in biomedicine while keeping sensitive data secure.

DOE’s Argonne National Laboratory is leading the project in collaboration with DOE’s Lawrence Livermore National Laboratory (LLNL), the University of Chicago, the Broad Institute, and Mass General Brigham. Researchers will also collaborate with NIH to create a framework that supports Bridge2AI, an NIH program that is developing new datasets to be used with AI to improve healthcare.

PALISADE-X will deliver a framework that enables organizations to train AI models on data held across multiple organizations while keeping protected data secure. Once built, the PALISADE-X team will test the system with AI models that predict the severity of COVID-19 using public and private biomedical datasets, and will also use the framework to predict the risk of developing cardiovascular disease.
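The article doesn't detail PALISADE-X's internal design, but one common technique for training a model across organizations without sharing raw records is federated averaging: each site trains locally on its own data, and only model parameters (never patient records) are sent to a central aggregator. A minimal sketch under that assumption, using a simple logistic-regression model, plain NumPy, and synthetic "hospital" data (illustrative only, not the project's actual framework):

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=20):
    """One site trains on its own data; raw records never leave the site."""
    w = w.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # logistic predictions
        w -= lr * X.T @ (p - y) / len(y)      # gradient descent step
    return w

def federated_average(w, site_data, rounds=10):
    """A central server averages each site's parameters, weighted by site size."""
    for _ in range(rounds):
        updates = [local_update(w, X, y) for X, y in site_data]
        sizes = np.array([len(y) for _, y in site_data], dtype=float)
        w = np.average(updates, axis=0, weights=sizes)
    return w

# Two hypothetical hospitals with synthetic data; the label depends on feature 0.
rng = np.random.default_rng(0)
def make_site(n):
    X = rng.normal(size=(n, 3))
    y = (X[:, 0] > 0).astype(float)
    return X, y

sites = [make_site(200), make_site(300)]
w = federated_average(np.zeros(3), sites)
print(np.argmax(np.abs(w)))  # the shared model weights feature 0 most heavily: 0
```

Only the parameter vector `w` crosses organizational boundaries here; in practice, systems like the one described would add further protections (e.g. secure enclaves) on top of this basic pattern.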

If successful, the research will make it possible for organizations to develop and confidently share their AI models with scientists and relevant research groups without risk of leaking private information.

This project is sponsored by the Office of Advanced Scientific Computing Research within DOE’s Office of Science. Go to https://science.osti.gov/ascr for more information.