More accurate diagnostics, earlier detection of disease, more personalised treatment, and greater opportunities for preventing ill health. All this could become a reality with the help of AI – but it requires a toolbox of advanced technologies for privacy protection.
Artificial intelligence has the potential to revolutionise healthcare. Algorithms are already used to interpret X-ray and ultrasound images, for example. For these AI models to interpret images as well as or better than an experienced doctor, they need to be trained on large volumes of high-quality data. That is easier said than done, since health data is sensitive information subject to various laws.
“The laws exist to protect individuals’ privacy,” says Rickard Brännvall, senior researcher at RISE. “Perhaps they will need to be amended in the future, but for now we must comply with them. It’s important to work with the needs owners and those who know the law to understand how we can use various advanced privacy protection technologies to fully utilise the potential of the data collected.”
According to Brännvall, a whole toolbox of techniques is available. One of the tools is federated learning. Simply put, it means that algorithms are trained on data held by different organisations without the data ever leaving their IT systems:
“Using federated learning, healthcare providers can jointly build an AI model, without having to share their private datasets. Instead, they exchange model updates.”
The process is repeated over many rounds, and the end result is a better model than if each party had trained separately. There is a risk, however, that the updates leak information that could be traced back to individuals.
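The round-based exchange described above is often implemented as federated averaging: each provider trains locally, only the updated weights leave the site, and a coordinator averages them into a new global model. The following is a minimal sketch in pure Python, where the "model" is just a weight vector and the local training step is a stand-in for real gradient descent; the provider data is invented for illustration.

```python
def local_update(weights, local_data, lr=0.1):
    """One local training step: nudge each weight toward the mean
    of the provider's own data (a stand-in for real SGD)."""
    target = sum(local_data) / len(local_data)
    return [w + lr * (target - w) for w in weights]

def federated_round(global_weights, providers):
    """Each provider trains on its private data; only the updated
    weights are exchanged. The coordinator averages them."""
    updates = [local_update(global_weights, data) for data in providers]
    n = len(updates)
    return [sum(u[i] for u in updates) / n for i in range(len(global_weights))]

# Three hypothetical providers, each holding private measurements.
providers = [[1.0, 2.0], [3.0, 5.0], [2.0, 2.0]]
weights = [0.0, 0.0]
for _ in range(100):
    weights = federated_round(weights, providers)
# weights converge toward the average of the providers' local means (2.5),
# even though no raw data point ever left its provider.
```

Note that what travels over the network is only the `updates` list, which is exactly the surface that the privacy techniques discussed next aim to protect.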
This is where the homomorphic encryption tool is especially useful. Homomorphic encryption allows encrypted data to be processed without first being decrypted. In the example of federated learning between healthcare providers, homomorphic encryption provides enhanced protection of healthcare provider data. The combination of these tools enables training of algorithms for use in healthcare, with a significantly reduced risk of sensitive data being compromised.
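To make "computing on encrypted data" concrete, here is a toy additively homomorphic scheme in the style of Paillier, with deliberately tiny hardcoded primes. This is purely illustrative: real deployments use vetted libraries and keys thousands of bits long, and the particular numbers below are assumptions for the sketch.

```python
import math
import random

# Toy Paillier-style keypair with tiny primes -- for illustration only.
p, q = 17, 19
n = p * q                      # public modulus
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # private key
mu = pow(lam, -1, n)           # precomputed decryption factor

def encrypt(m):
    """Encrypt m < n. With generator g = n + 1, g^m = 1 + m*n (mod n^2)."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(c):
    x = pow(c, lam, n2)
    return (x - 1) // n * mu % n

# Homomorphic property: multiplying two ciphertexts adds the plaintexts,
# so a server can aggregate values it cannot read.
c = encrypt(12) * encrypt(30) % n2
assert decrypt(c) == 42
```

In the federated-learning setting, this is the property that lets a coordinator sum encrypted model updates from several healthcare providers without ever seeing any individual update in the clear.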
“We have the opportunity to be involved in building an infrastructure and developing different types of models through federated learning and homomorphic encryption,” says Joakim Börjesson, a unit manager at RISE. “It will benefit the primary use of data, as well as secondary use for innovation and research.”
Based on the right type of data sources, you can predict a change
Börjesson highlights an example of the primary use of data made possible using privacy-protecting technologies. It involves utilising wellness data from our own mobile phones:
“By correlating data collected during almost all of our waking hours with data generated during healthcare visits, which may amount to only an hour a year, we can detect behavioural changes and deviations. With the right type of data sources, you can predict a change. When care is needed, or perhaps even before it is sought, underlying problems can be identified from the collected wellness data.
“When we visit a healthcare facility, new measurements are taken, because doctors currently lack the means to access our wellness data. Many, myself included, argue that the data we generate ourselves should be taken into account when making diagnoses.”
If the providers of health apps could make their data available in a secure way, this data could be used to prevent ill health and reduce the burden on healthcare.
RISE runs several projects in this area. In the Sjyst data! (Fair Data) project, RISE helps operators in the business sector to tackle challenges related to data protection and privacy.
“We study the companies’ use cases and support the companies with expertise in how to use different privacy-protecting technologies,” says Brännvall. “This is an example of a meeting space where we discuss so-called close-to-market solutions, and it can serve as a good entry point for companies and industry organisations.”
In another project, Brännvall and his research colleagues have developed a solution that enables secure sharing and analysis of sensitive data from people with diabetes, service providers, and healthcare providers.
“By working together with healthcare providers and the business community, RISE can help define platforms with standardised interfaces, which allow you to work with health data and gain access to various tools for privacy protection,” explains Brännvall. “It’s very important to get all the pieces in the right place, including secure management of encryption keys. Otherwise, there is a risk of making sensitive data accessible. RISE can help with both the construction of platforms and by acting as a sounding board, such as through testing in Cyber Range, our testbed for cybersecurity.”
In federated learning, an AI model is trained on users’ data without that data having to be gathered at a central learning point. What is sent instead are model updates, that is, changes to the model. In practice, this can mean that an AI model is trained to make diagnoses based on medical records, without the records themselves having to be shared.
Homomorphic encryption makes it possible to perform computations on encrypted data. We use encryption every day, when we send data over the internet or save files to the cloud. Homomorphic encryption adds the ability to also process and compute with encrypted data, not just transfer and store it.
When personal data is used, it is important to observe the principles of data minimisation and purpose limitation: only data necessary for the specific task should be shared, and data should not be used for purposes other than those originally intended. Using homomorphic encryption during the model-update phase of federated learning limits the amount of information each party can access and what it can be used for.
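The principle of limiting what each party learns during the update phase can also be illustrated with additive masking, the core idea behind secure-aggregation protocols and a lighter-weight relative of homomorphic encryption. Each pair of clients agrees on a random mask; one adds it, the other subtracts it. Individually, the masked updates look like noise, but the masks cancel when the server sums them, so the server learns only the aggregate. The numbers below are invented for the sketch.

```python
import random

MODULUS = 10**9  # all arithmetic done modulo a large public constant

def mask_updates(updates, modulus=MODULUS):
    """Each pair of clients (i, j) shares a random mask r; client i
    adds it and client j subtracts it. Each masked value alone looks
    random, but the masks cancel in the sum."""
    masked = list(updates)
    for i in range(len(updates)):
        for j in range(i + 1, len(updates)):
            r = random.randrange(modulus)
            masked[i] = (masked[i] + r) % modulus
            masked[j] = (masked[j] - r) % modulus
    return masked

# Three clients' private (integer-scaled) model updates.
updates = [103, 250, 47]
masked = mask_updates(updates)
# The server sums only the masked values; the pairwise masks cancel,
# so it recovers the total without seeing any individual update.
total = sum(masked) % MODULUS
assert total == sum(updates)
```

Full protocols such as Bonawitz et al.'s secure aggregation add handling for clients that drop out mid-round, which this sketch omits.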