An extended audio version of this story can be heard on Episode 18 of Work in Progress, Slack’s podcast about the meaning and identity we find in work.
What if someone told you there’s a chance the electronic pacemaker defibrillator implanted in your heart — the one that monitors your heartbeat and revives you in case of sudden cardiac arrest — could be remotely manipulated by a hacker? It sounds far-fetched, but Karen Sandler, Executive Director of the not-for-profit organization Software Freedom Conservancy, says that, though improbable, it’s not impossible.
Sandler knows this thanks to her own background in computer science and law, and her work as an advocate for open source software. But her research was initially prompted by a personal matter: her own heart.
In her 30s, she discovered she’d inherited a congenital heart defect. The only way to survive, the doctor said, was to get a device implanted in her heart.
As she examined a sample device in the doctor’s office, her doctor wasn’t quite prepared for her line of questioning: “What type of code do these things run?” she asked.
It’s a reasonable question coming from someone like Sandler, given that pacemaker defibrillators are designed to administer a shock if they detect irregularities in a person’s heartbeat. She wanted to know exactly what kind of software the device operated on and how often that software would be updated by the manufacturer. It was, after all, going to live inside her body.
Her doctor didn’t have an answer. Sandler would discover that most doctors didn’t.
“I wasn’t convinced about getting it, but I realized that getting the device was unavoidable,” she says. “Exploring the safety and ethics of its software had to be secondary.”
Post-surgery, she dedicated her expertise to conducting a research project on the issue.
Makers of consumer software products will sometimes release a product’s source code to a wider community of professional security researchers, who test it for bugs and vulnerabilities and report back. In her research, Sandler found that this kind of program is nonexistent among medical device manufacturers.
“If the manufacturer experiences some kind of catastrophic failure, like losing their data, or being sabotaged, or even just going out of business, there’s no governing agency in the world that reviews the source code to make sure it’s still effective,” says Sandler. “There isn’t a repository of the code anywhere public. People just live with the devices they have or have to get surgery to have them replaced.”
Sandler eventually published her research paper on the potential risks of allowing medical device manufacturers to keep their source code closed off from the public, though she kept her personal story out of it. She posted the report to online patient forums for people with heart defects.
Her research was met with skepticism, and several forum members accused her of fearmongering. For Sandler, the fear of stoking unwarranted paranoia was secondary to her belief that patients have a right to this kind of knowledge, and to her hope that they would work together to ensure manufacturers institute and uphold ethical policies in making life-saving devices.
These reactions illuminated just how much of this information is kept from patients, but they also underscored the importance of communicating these issues in a more humane way, free of technical jargon and legalese.
It might also have behooved her to mention in the paper that she herself had a device implanted.
But her explorations through various patient forums also wound up connecting her to a whole other community of people just like her.
“In many cases young, tech-savvy people are turning their careers toward medical devices because they’ve had to confront these issues the same way I have,” she says.
Over the years, she’s seen progress that she believes is correlated, however indirectly, with the awareness generated by her organization. One example is former Vice President Dick Cheney’s well-publicized decision to have the wireless functionality of his own defibrillator disabled as a matter of national security.
In the fall of 2016, a former colleague of Sandler’s from the legal research community was hired to consult on a potential vulnerability found in one of Johnson & Johnson’s insulin pumps. The company proactively released a warning to diabetic patients, letting them know there was a risk their insulin pumps could be exploited by a hacker, even though the risk was low.
This kind of transparency is not typical protocol in the industry, says Sandler, but it’s a big step forward in protecting the interests of patients.
Work in Progress story produced by Dan Misener.