Author of review into aborted GP data sharing in England withdrew from program

The author of a government review into medical data sharing has personally opted out of the aborted plan to share GP health data, a parliamentary committee has heard.
Professor Ben Goldacre, a former Guardian columnist and author of the Goldacre review, has exercised his right to withdraw from the government’s General Practice Data for Planning and Research program, he told the Commons science and technology committee, because he was concerned about the risks of de-anonymization.
“I withdrew my consent,” Goldacre said, “because I know so much about how this data is used and how people can be de-anonymized. And also because, in the past, I have been in the public eye doing public engagement work, and I have friends whose data was accessed illegally through national datasets, though not health datasets. And I guess that because I work in this area, the risks are very important to me.”
The government should even consider jail time for those who misuse sensitive data, Goldacre suggested, citing the finding that more than 30 Metropolitan police staff were caught accessing case notes relating to the murder of Sarah Everard.
He said: “That’s over 30 people, working in highly trusted roles, illegally accessing data outside the purposes of their job, even in an environment where most or all of them must have known that they were subject to audit.
“You need to block people from misusing data, you need to make sure you catch them when they do, and you need to make sure the penalties are so high that people are afraid to do it.”
Goldacre criticized the idea that “data is the new oil”, arguing that it was more like nuclear material. “When you first access it, it’s not really very useful. It needs to be refined and processed. But after being refined and processed, two things happen. First, it becomes extremely powerful.
“But secondly, it also becomes quite dangerous. Once there is a leak, it cannot be undone, and you have to work very carefully with it in order to do some good with it while minimizing the harm.”
Women are particularly at risk of de-anonymization, the committee heard, because “childbirth is something that shows up in your medical records, and it’s also something that’s typically known about by co-workers, or people at the school gates, and so on”.
Goldacre said: “The classic example that appears in security engineering textbooks is that you can re-identify Tony Blair in health data, because you know the approximate dates on which he had an abnormal heart rhythm reset while he was prime minister. And knowing the week in which that happened, the sort of procedure he had on two dates, his approximate age and his approximate location, you would probably find only one person with those characteristics. Having found a unique ID for that person, you can then look at everything else in their record.
“And women are particularly at risk, in my opinion.”
He added that, as a result, future efforts to share NHS data with private industry should take place in “trusted research environments”, which would make the data accessible to legitimate users without the risk of leaks. “I’m confident that by doing this, not only can you mitigate risk, but you can also start to build public trust.”