FDA’s Patient Engagement Advisory Committee (PEAC)

September 10, 2019; Gaithersburg, MD; Highlights - Draft

Executive Highlights

  • The FDA’s Patient Engagement Advisory Committee (PEAC) convened in Maryland to create recommendations for the FDA on cybersecurity in medical devices. Committee members based their recommendations on a morning of testimony from invited experts and members of the public, an afternoon of roundtable discussion, and lengthy question and answer periods.

  • Connectivity and interoperability of diabetes technology were constant topics of discussion, with insulin pumps and DIY closed loop systems featuring prominently. Throughout the day, patient advocates reminded the committee of the huge benefits of DIY systems, and almost everyone agreed on this point. Still, many were fearful that devices are quickly moving to a level of wireless connectivity that, without proper safeguards, could eventually enable large-scale, remote attacks. To this end, some attendees emphasized the need for devices to fail “gracefully,” maintaining a level of operation even after part of the system fails.

  • A beach analogy beautifully illustrated the balance between perceived risk and probable risk: when people go to the beach, they are often so afraid of shark attacks that they forget to wear sun protection. Dr. Nathanael Paul (University of Tennessee) tied this nicely to diabetes: glycemic control and hypoglycemia are “sunburn,” while cybersecurity is “the shark.” The latter feels scarier, but fixating on it distracts from the more common, pressing concerns that connected/interoperable devices can help address.

  • The committee outlined a few suggestions for communicating discovered vulnerabilities to the public, though striking the proper balance is certainly no easy task. PEAC recommended the FDA create communications in collaboration with all stakeholders (particularly manufacturers and patients), inform users without causing undue burden or anxiety, and be redundant without creating fatigue.

Acting FDA Commissioner Dr. Ned Sharpless was on hand in Gaithersburg, MD today to kick off the Patient Engagement Advisory Committee (PEAC) meeting on cybersecurity in medical devices. Though the meeting was focused on cybersecurity in all medical devices, many patients with diabetes were in attendance and diabetes technology was the focus of much of the discussion. See our top highlights below!

1. Automated Insulin Delivery (AID) will likely be one of the first major cybersecurity challenges for the FDA to tackle

While the PEAC meeting was intended to give the FDA suggestions on regulating cybersecurity in any medical device, insulin pumps and automated insulin delivery were clearly on the minds of committee members, FDA representatives, and attendees throughout the meeting. Insulin pumps/AID and pacemakers/defibrillators comprised the vast majority of specific devices discussed.

  • Ms. Karen McChesney (JDRF New England) gave a ten-minute presentation in the testimonial portion of the morning with positive feedback on the FDA’s June safety communication regarding old Medtronic pumps, calling it adequate, clear, and timely. (We’d take issue with “timely,” as the vulnerability had been known for eight years.) According to patients with whom Ms. McChesney talked, the language used in the safety communication was not “panic-inducing” and gave most people enough information to make an informed decision on their own. Importantly, patients also responded very positively to the seven recommendations provided by the FDA in the safety communication to minimize the risk of an attack. One patient even reported, “I thought to myself that perhaps all of us using devices to help manage our T1D should do this.”

    • Despite the safety communication, Ms. McChesney emphasized that using these old pumps, especially as part of DIY closed loop systems, was a risk most people were willing to take. In an informal survey, one patient told her they suspected Medtronic’s communication was really a marketing push to get people to purchase newer devices. Additionally, many patients took comfort in the physical proximity required to actually exploit the pump’s vulnerability.

  • Renowned diabetes hacker Mr. Jay Radcliffe (Thermo Fisher Scientific) compared his experiences hacking his Medtronic pump in 2011 and his Animas Ping in 2016, demonstrating how cybersecurity practices changed over that period. Mr. Radcliffe was one of the first (along with Ben West and Barnaby Jack) to successfully reverse engineer an insulin pump about eight years ago – see his talk from SXSW in 2015 for more details. At the time, he published and presented his findings at the Black Hat security conference in 2011, where his talk “became a huge deal” and necessitated a press conference. Medtronic was not prepared for the discovery, and Mr. Radcliffe found himself bombarded with emails from patients asking for advice (e.g., “Should I stay on this pump?”, “Should I wrap my pump in aluminum foil?”). Mr. Radcliffe switched to the Animas Ping in 2016, when he found a similar vulnerability that allowed for unauthorized insulin delivery. By this time, the industry had changed, and he was able to share the information in cooperation with Johnson & Johnson in a way that didn’t cause users to panic.

    • Despite the industry’s maturation between 2011 and 2016, Mr. Radcliffe ended his presentation with a warning: newer insulin pumps connect to smartphones over Bluetooth, which has some security features but also proven vulnerabilities. Additionally, as pump control migrates to the smartphone and eventually even to the cloud, the potential for malicious actors to remotely target hundreds or thousands of people at a time becomes real, he emphasized.

2. Patient benefits from connectivity, interoperability, and customizability trump any risks (for now)

Cybersecurity risks represented the obvious focus of the meeting, but for diabetes technology, nearly everyone on the committee and in attendance seemed to agree that the benefits outweighed the risks. During the open public hearing, Dr. Nathanael Paul (University of Tennessee) took perhaps the strongest position, stating that the benefits of DIY, open-source systems were so great that these systems should never be taken away from patients. In fact, according to Dr. Paul, the “greatest risk” to his glycemic control and quality of life was actually device manufacturers moving away from connectivity, interoperability, and customizability for fear of cybersecurity vulnerabilities. Dr. Paul closed with a specific request for the FDA: for devices that may be recalled or returned over cybersecurity fears, allow manufacturers to re-distribute them to others who are willing to take on that risk. We love that!

  • With an excellent analogy, Dr. Catina O’Leary (Health Literacy Media) underscored how public perceptions of risk are often based more on the perceived magnitude of harm and less on the probability of that harm occurring: when people go to the beach, they are often so afraid of shark attacks that they forget to wear sun protection. During his public statement, Dr. Paul made the comparison more explicit, calling his glycemic control “sunburn” and cybersecurity risks “the shark.” The data support Dr. O’Leary’s comparison. Based on the recent batch of T1D Exchange data, ~75,000 (6%) of type 1s in the US had a seizure or loss of consciousness due to hypoglycemia in the prior three months. With DIY systems, there has been only one confirmed case of hospitalization due to severe hypoglycemia (though, anecdotally, there may have been others). Of course, countless lives have been saved with remote CGM monitoring and connectivity.

  • In the roundtable discussion period of the meeting, the potential for malicious actors to perform large-scale attacks on connected medical devices was a common fear. As devices become connected wirelessly (and particularly to the cloud), they open themselves to potential exploits. The WannaCry ransomware attack, which hit a third of British hospitals, was brought up as one common example, though strictly it was not an attack on a “medical device.” Dr. Kevin Fu (University of Michigan) discussed the relative ease with which his group was able to wirelessly grab the device state, patient name, date of birth, make, model, and serial number of a commercial pacemaker in 2006. Just two years later, they were able to induce a fatal heart rhythm in a pacemaker. To this end, Dr. Paul claimed that one of his students was able to write code that identified all Nightscout users across the world.

    • Addressing the risk of attacks that could affect people at scale, Dr. Fu and other attendees encouraged manufacturers and the FDA to consider systems that fail “gracefully.” Fault-tolerant design would allow systems to continue functioning, likely at a reduced level, even when components of the system fail. The Medtronic MiniMed 670G switching to Manual Mode is one example of this, though even that has issues – e.g., Auto Mode will time out if delivery has been stopped for a number of hours (due to hypoglycemia), potentially putting someone at higher risk when they revert to their baseline basal rates. Fault-tolerant, “safe” failure will be key for the field to understand, especially for AID systems when the CGM may be offline or not functioning.
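To make the “fail gracefully” idea concrete, here is a minimal, purely illustrative sketch (not taken from any real pump’s firmware – the function names, threshold, and rates are all our own assumptions) of how an AID controller might degrade to a pre-programmed fixed basal rate when CGM data is missing or stale, rather than halting delivery or trusting bad data:

```python
from dataclasses import dataclass
from typing import Optional

# Assumption for illustration: readings older than this are untrusted
CGM_STALE_AFTER_MIN = 15.0

@dataclass
class CgmReading:
    glucose_mg_dl: float
    age_min: float  # minutes since the reading was taken

def next_basal_rate(reading: Optional[CgmReading],
                    auto_rate_u_per_hr: float,
                    fallback_rate_u_per_hr: float) -> float:
    """Return the basal rate (U/hr) to deliver for the next interval.

    Graceful degradation: if the CGM signal is missing or stale,
    fall back to the pre-programmed fixed basal rate instead of
    stopping insulin delivery entirely.
    """
    if reading is None or reading.age_min > CGM_STALE_AFTER_MIN:
        return fallback_rate_u_per_hr  # degraded but still operating
    return auto_rate_u_per_hr  # normal closed-loop operation

# Fresh reading -> closed-loop rate; stale or missing -> fallback
assert next_basal_rate(CgmReading(120, 5), 1.2, 0.8) == 1.2
assert next_basal_rate(CgmReading(120, 30), 1.2, 0.8) == 0.8
assert next_basal_rate(None, 1.2, 0.8) == 0.8
```

The design choice here mirrors the 670G’s Manual Mode fallback described above: the system keeps doing something safe and predictable when its smartest component is unavailable, instead of failing catastrophically or silently.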

3. Best practices for communicating discovered vulnerabilities: collaborate with all stakeholders, inform without causing burden, be redundant without creating fatigue

At the end of the day, the committee made specific recommendations to the FDA on how to handle cybersecurity communications and underscored the difficult balancing act the Agency must perform. Committee members viewed the FDA as “the only arbiter in the marketplace,” and unanimously agreed that the burden for patching cybersecurity vulnerabilities in medical devices falls on manufacturers, rather than patients.

  • In a nod toward a multi-stakeholder approach to security communications, the testimonial portion of the meeting closed with forty minutes of presentations on communications from CDRH, industry, and patients. Presenters discussed the many sources patients look to for information on their devices: healthcare providers, FDA, manufacturers, media, friends, family, and social media. PEAC members emphasized that communicating cybersecurity risks (and later discovered vulnerabilities) would require properly educating all sources to empower patients to make an informed decision.

  • There was some disagreement among committee members on whether vulnerabilities should be immediately disclosed to device users, particularly when the risks may not be clear or when risk mitigation steps are not available. While some found it “paternalistic” to withhold information from patients, others maintained that patients shouldn’t be unduly burdened. The Medtronic pump cybersecurity issue is a case in point – it was not communicated by the FDA until eight years after the issue was discovered. Ultimately, it’s clear that this is a multi-dimensional issue, but any communication should put patient perspectives first, including numbers on the probable risk of harm and consideration of the risks of discontinuing a therapy. One popular sentiment was that the FDA and manufacturer should develop a timeline for a patch when vulnerabilities are discovered, and that the timeline should be shared in the initial communication with patients.

    • One PEAC member brought up a benefit of immediately disclosing discoveries before a patch is available: notified patients can serve as a sort of real-world gauge for the risk of the vulnerability. How many patients are being affected and how quickly? How gracefully or catastrophically are devices failing?

  • To ensure that communications reach all users, PEAC members encouraged redundancy, though some pushed back, citing the possibility of alarm fatigue. The committee encouraged regular communication between device manufacturers, the FDA, and patients, delivered through various media and organizations. Many of the patients Ms. McChesney talked to learned about the Medtronic security communication through the FDA’s device-related Twitter account; while most people preferred the digital format (i.e., through email, published online, etc.), some would have also liked a hard copy sent in the mail. Additionally, communications should be written in simple, plain English (or even “visual English,” according to one member encouraging the use of pictures).

4. Patients want access to their data and the algorithms driving their devices

During the public comment period, Dexcom’s Mr. Benjamin West (“representing [his] own opinions”) encouraged the FDA to reveal certain device details needed for the public to examine the security of a device before it is on the market. Information on manufacturers’ algorithms is generally protected by law, but Mr. West said that this creates a less-than-ideal system of “security by obscurity.” According to Mr. West, this type of open, pre-market review could have mitigated much of the confusion when pumps were first hacked. Mr. West is certainly qualified to speak on this, as his foundational work on cracking the Medtronic pump code took years of effort but ultimately led to the thriving DIY AID ecosystem of today. He also stated his belief that device users should have access to their data, which is usually owned by the device manufacturer. It’s worth noting that PEAC made a similar recommendation to the FDA in November 2018.

  • UCSD Professor Dr. Christian Dameff asked the committee to consider what the informed consent process should look like for cybersecurity. It is a difficult question to answer given the challenge of quantifying risks without any historical context, the lack of similar “unconnected” versions of devices, limited cybersecurity literacy among providers, and the risk of causing patients undue anxiety. Dr. Dameff suggested a “tiered approach,” in which a provider may broach the subject on an initial visit, then slowly go into more detail as the patient requests over the next few visits before device implantation.

 

--by Albert Cai and Kelly Close