Interview transcript:

Terry Gerton You are doing important work, and I want to talk about this program specifically, the All of Us Research Program at NIH. Tell us what that’s all about.

Charles Summers NIH has the goal of enrolling a million-plus Americans into a research program built around genetics and precision medicine. It's really the future of medicine, where things are headed. Instead of treating you based on age or race or any of those things, they would treat you based on your genetics. So if a medicine is specifically for you, it would help you better than the guesswork we rely on for some things today. That's the overall goal of what NIH wants to do with this research data and research outputs.

Terry Gerton A million people, that’s a lot of folks. And genetic information is pretty personal. What is going on in terms of being able to protect participant data?

Charles Summers For us at the OIG, that's definitely one of our highest and most important priorities: protecting the data, securing the data, and lending a hand on oversight by pointing out where there are places to make improvements, or recommendations to improve and secure data to the requirements that are out there, as well as flagging anything that just doesn't seem right. So our main goal is to protect that data, because it's not just their data. It's our data, it's our parents' data, and it's American data.

Terry Gerton What prompted you then to look at the data protection processes in this research program? Was there a red flag or something that happened?

Charles Summers We've done work at NIH since I've been working here; we go through risks and things like that. With the All of Us program and its genomic data, the Office and National Council has said that this is a national security concern. We've also done some work at NIH in the past with the All of Us program, so we checked up on some of those findings. And there are persistent cybersecurity and national security threats that have increased challenges across the board as far as technology goes, and those are vectors NIH must lock down to protect this data. All of those combined, these are critical operations and programs that we feel are everybody's concern, as well as partnering with HHS on how we're going to protect the program as a whole.

Terry Gerton It seems like if you’re going to get a million people to volunteer, one of the core guarantees is that their data be protected. What did you find as you got into the audit?

Charles Summers Absolutely, and I do want to emphasize what you said. It is core to the program, because if you lose that trust, fewer people are going to join. Currently, NIH says it has over 600,000 participants already enrolled in the program. The goal is a million, but that's where they are today: over 600,000 already enrolled.

Some of the key cybersecurity gaps we found: inadequate access controls allowed the systems to be accessed from abroad, and that includes employees. It's not just somebody trying to break in; employees could access that data from abroad. We also found that the system permitted research participants to download data, even though policies and procedures restricted that and did not allow for it. For both of those access controls, there were warning banners that pop up to say, “hey, you can’t do this.” It’s kind of like a door with a scary sign on it that says, “hey, don’t go in here,” but the door is unlocked and you can walk through it. So you could close those banners and go ahead and access, and even download, some of this data. That's not what we would normally expect to see. We would want it blocked, and if there was a legitimate need, you would go through the proper procedures to have those controls opened for you.

The other two things we found: NIH had failed to inform the Data Research Center, the awardee here, of the national security concerns around genomic data. So when that Data Research Center awardee was setting up the system and going through checks and balances on security controls, they set it to a certain level based on the risk assessments they perform. But NIH didn't give them that key piece of information, which could have changed the level that security should be set at; maybe it should be up another notch. That wasn't factored in when they were doing risk assessments. Lastly, we found that the Data Research Center was not remediating findings, whether ones they identified themselves or ones that came up from an audit, within time requirements that align with the federal requirements and that were in the contract. They had some different timeframes … in the system security plan, so those were in opposition to what they had already agreed to in the contract.
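The distinction Summers draws between a warning banner and an enforced restriction can be sketched in a few lines. The following is a minimal, hypothetical example, not NIH's or the Data Research Center's actual code; the country list, role names, and action names are assumptions made for illustration. The point is that a deny-by-default check refuses the action outright instead of showing a dismissible banner.

# Hypothetical sketch of a deny-by-default access check (illustration only;
# the policy values below are assumptions, not drawn from the audit).

ALLOWED_COUNTRIES = {"US"}  # assumption: policy restricts access to domestic users
DOWNLOAD_AUTHORIZED_ROLES = {"approved_researcher"}  # assumption: downloads require explicit approval


def authorize(action: str, user_country: str, user_roles: set[str]) -> bool:
    """Return True only when policy explicitly permits the action (deny by default)."""
    if user_country not in ALLOWED_COUNTRIES:
        return False  # block access from abroad instead of showing a banner
    if action == "download" and not (user_roles & DOWNLOAD_AUTHORIZED_ROLES):
        return False  # block downloads barred by policy and procedure
    return True


if __name__ == "__main__":
    # A participant abroad requesting a download is denied, not merely warned.
    print(authorize("download", user_country="FR", user_roles={"participant"}))          # False
    print(authorize("view", user_country="US", user_roles={"approved_researcher"}))      # True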

Terry Gerton You mentioned as you walked through those findings that the genomic database is managed by a contractor, DRC. Is that a normal thing for these kinds of research projects and what does that say to you about the need for better oversight between HHS, NIH and the contractor?

Charles Summers NIH does have the responsibility for oversight of all its contractors, and that is very typical for this type of program across these large op-divs. Throughout HHS we leverage contractors and grantees and things like that. These areas demand so much expertise, and a lot of times that is where you have to go to get it, because you need large numbers of people. So it is very much NIH's responsibility to have that in the contracts, as well as the oversight to ensure that the proper security levels are being maintained. And I think that's noted in some of the findings. For instance, there's the finding of not providing them the information that genomic data is a national security concern. That would factor into the risk assessment, which the agreement requires the DRC to complete. Without that complete information, things may not be set at the correct level. Definitely the burden is on NIH for that oversight and for ensuring it is taking place.

Terry Gerton So, you made five recommendations in this report. Are any of them, do you think, particularly urgent, and what are you hoping to see NIH do quickly to respond?

Charles Summers We don't normally rank these findings, but access controls tend to drift to the top, because they are the gate for getting into the system and for restricting access by people you don't want in the systems. For instance, the first recommendation was to enforce the restrictions for remote users. That's very important to us, as there are certain countries of concern … so we want to ensure that if people are there, we're using proper security and they have proper approvals. That was one of our findings as well. Also blocking unauthorized downloads of data; both of those are around access control, so very high. And remediating findings timely, because those are weaknesses you know are there. You want the timeframes to remediate them to be as short as possible, to close those windows of known vulnerabilities that adversaries may use to try to gain access to the system. NIH did reply and respond to us; the response was very acceptable, and they concurred with the findings as we described them. They have already taken some actions on all of these findings, and part of our tracking system is where they're at. We have timeframes, and they report: here's where we're at, here's what we're going to do next. That goes back and forth until completion, so we ensure those are closed out.
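As a rough illustration of the remediation-timeframe point, the sketch below flags open findings that have exceeded an agreed window. This is an assumption-laden example; the 30-day window, field names, and finding IDs are placeholders, not figures from the audit or the contract.

# Hypothetical remediation-deadline tracker (illustration only; the window is a placeholder).
from datetime import date, timedelta

REMEDIATION_WINDOW = timedelta(days=30)  # assumption: placeholder SLA, not the contractual value


def overdue_findings(findings: list[dict], today: date) -> list[str]:
    """Return IDs of open findings whose remediation window has lapsed."""
    return [
        f["id"]
        for f in findings
        if f["status"] == "open" and today - f["opened"] > REMEDIATION_WINDOW
    ]


if __name__ == "__main__":
    sample = [
        {"id": "AC-1", "status": "open", "opened": date(2024, 1, 2)},
        {"id": "AC-2", "status": "closed", "opened": date(2024, 1, 2)},
    ]
    print(overdue_findings(sample, today=date(2024, 3, 1)))  # ['AC-1']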
