10 Break-Out Sessions
[timetable id="9" column_title="0" filter_visible="1" filter_multiple="1" event_box_time="0"]
Have you met your digital twin? You may have caught glimpses of him or her when looking at personalised advertisements. Digital twins come from the engineering industry, for example in airplane manufacturing. Instead of flying an airplane into a thunderstorm, a flight simulator with a virtual model of the plane is used to assess what happens. A similar idea has arisen in healthcare. Rather than finding the best medication for a patient by trial-and-error, a computer model could test each medication on a digital twin, a virtual model of someone’s genetic data, medical history and daily behaviour.
The virtual traces we leave behind have great potential for healthcare, especially the by-products of our daily life. Most people do not see their doctors often, yet they use their smartphones every day. A smartphone's sensors register movements, location, and screen use, which are indicators of our circadian rhythm, mobility and sociability. These indicators can reveal health problems long before a doctor can detect them: smartphone data can predict complications after operations and the onset of psychosis, and can quantify patients' behaviour during a pandemic.
Health data can save lives – especially when people avoid care-seeking because of an ongoing pandemic. But is everybody keen to get a digital twin? There is a data sharing paradox: we object to sharing data when we are asked, yet share it extensively in our daily lives.
On the one hand, public concern over the consequences of data sharing and linkage has been rising. In 2016, the European Union passed the General Data Protection Regulation (GDPR), a law that gives citizens more control over their personal data. Various governments started implementing systems for health data sharing that were later abandoned over privacy concerns. Companies with a business model built on linking and sharing data have come under increasing scrutiny, and millions of people have switched to messaging apps that do not share data.
On the other hand, people use more digital technology than ever. Our search history, phone contacts and daily movements leave digital traces everywhere, accessible to apps and network providers. From these by-products, elaborate digital twins can be created and sold. While granting one app access to your contact list does not seem a big deal, these digital glimpses can become full-colour pictures when linked to datasets collected for other purposes. Marketing companies use these digital twins to develop personalised advertisements; governments use them to study mobility; journalists have used them to identify secret agents, locate secret military bases, identify visitors to New York strip clubs, and find out whether celebrities tip their cab drivers.
Healthcare researchers are caught within this paradox. They want to explore and unlock the potential of our digital twins. If people share the by-products of their phone use, it may help in the fight against COVID-19 or in preventing suicides. If researchers can link health data sources, we can build a learning health system that studies the outcomes of past patients-like-you to find out which treatment is best for you. Yet when people get a free choice to share their data, they often opt out.
Interestingly, healthcare researchers have found that when people are well-informed, they may be more willing to share data, not less. In an innovative way to gauge and gain the public's trust, researchers invented the 'citizen's jury on data sharing'. It is modelled after American trials-by-jury, in which a cross-section of the population is invited to decide the outcome of a lawsuit. A Manchester-based research team invited a cross-section of society to deliver a jury verdict on how their organisation should protect and use health data. Over three days, the jury heard expert 'witnesses' give their views on data sharing and linkage. After being thoroughly informed, the jury voted on whether health data should be shared with public institutions and companies, and on whether people should opt in or opt out. Over the course of the three days, jury approval for some uses of data increased – especially uses with a clear public benefit.
Healthcare researchers depend on citizens' willingness to opt in to data sharing. Where commercial apps can collect (and sell) data completely unrelated to the services they provide, healthcare researchers have to justify why the data they collect is essential to answering their research question. If it is not essential, ethics committees will refuse approval. Researchers must specify how the data will be used and provide options to opt in or out of secondary use by other parties. Commercial apps, by contrast, can bury that information in lengthy Terms & Conditions full of legal terminology, and then require users to 'agree and proceed'.
Having two standards that are so fundamentally different – one for commercial use of data, one for healthcare use – undermines public trust. If we gain more control over the digital by-products of our daily life, we may be more willing to reveal parts of them to the right people, for the right purpose. That is important, because our digital twin can save lives.