Internet of Things security is anything but a homogeneous concept. It is, rather, highly dependent on the type of products being developed and – in many cases – the sort of regulatory restrictions they are subject to.
Of all the sectors where IoT is proliferating, however, it is arguably the medical sector that is most fraught. In medical IT, developers have to operate in a minefield of intense regulation, life-and-death safety issues, and an unusually high (and, of course, very much unwelcome) degree of scrutiny from hackers.
The hacking of medical data is a popular criminal enterprise, particularly in the US, where just last week UCLA Health said hackers may have accessed the personal information and medical records of as many as 4.5 million patients.
However, while no-one would be overjoyed at the thought of something as intimate as their medical records falling into the hands of digital crooks, it is arguably the patient who has the least to worry about here. The main targets of medical data theft are US insurance companies and the institutions that administer Medicare. In the US, patients usually collect medication and leave it to pharmacists to bill the insurance companies.
A single refill for five months’ medication can easily add up to a few thousand dollars, so the rewards for effective fraud – with hackers posing as pharmacists – are large. Insurance companies, of course, foot the bill, while for the patients whose identities are misused the consequences can include lost time, stress and, in worst-case scenarios, a potentially dangerous delay in securing their medication.
It’s just one example of why security around medical data – medical IoT’s bread and butter – has to be so tight.
Someone extremely familiar with the territory is Sridhar Iyengar, one of the founders of AgaMatrix. At AgaMatrix, Iyengar helped develop the first iPhone-connected medical device, a glucose monitor called iBGStar, then a revolutionary innovation for diabetes sufferers.
Nowadays Iyengar’s focus is on Misfit, a wearables company built around fitness rather than illness, but he is still deeply involved with issues surrounding IoT, health, and security. In September, he will attend the Internet of Things Security conference in Boston as a keynote speaker, where he will draw on his expertise in diabetes to illustrate the wider challenges confronted by developers in the realm of medical IoT.
“The Holy Grail in this world of diabetes is what they call an artificial pancreas,” he says, “meaning that, if you can sense how much glucose is in your blood, you can pump in the right amount of insulin to automatically regulate it. Nobody has made a commercial version of that. Partly because the folks who make a glucose sensor are different to the folks that make the pumps and it has been difficult for the two to cooperate due to trade secrets and the complexities of sharing the liability of devices from different manufacturers that must work in unison. The patients are left to suffer.”
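For illustration only, the closed loop Iyengar describes boils down to something like the sketch below: read a glucose level, compare it with a target, and deliver a proportional dose of insulin. Every name and number in it is invented for the example – this is a toy model, not a medical algorithm, and a real device would add extensive safety logic and regulatory validation.

```python
# Toy sketch of a closed-loop "artificial pancreas" cycle. All values are
# hypothetical and chosen purely for illustration; this is NOT a medical algorithm.

TARGET_MG_DL = 100            # hypothetical target blood glucose (mg/dL)
GAIN_UNITS_PER_MG_DL = 0.01   # hypothetical proportional gain (units of insulin per mg/dL above target)
MAX_BOLUS_UNITS = 2.0         # hypothetical per-cycle safety cap

def insulin_dose(glucose_mg_dl: float) -> float:
    """Return an insulin dose for one control cycle, based on a sensor reading."""
    error = glucose_mg_dl - TARGET_MG_DL
    if error <= 0:
        return 0.0  # never dose when at or below target in this toy model
    return min(error * GAIN_UNITS_PER_MG_DL, MAX_BOLUS_UNITS)

if __name__ == "__main__":
    for reading in (90, 140, 220):
        print(f"{reading} mg/dL -> {insulin_dose(reading):.2f} units")
```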
In one famous incident, this frustrating discontinuity was first overcome by a “citizen scientist,” a father who hacked his diabetic child’s separate devices and was able to link the two together. While this was never marketed, it signalled that the race for a commercially viable artificial pancreas was very much on. However, while no-one would resent such intrepid ingenuity on the part of the “citizen scientist,” Iyengar points out that it also demonstrates the devices in question were very much hackable.
“If somebody hacks into an insulin pump you could kill someone,” he says. “They overdose, they go into a coma, they die. None of these insulin pump manufacturers are going to open source anything: they can’t, because of the deadly consequences of someone hacking it.”
Ultimately, it will prove an interesting challenge to future regulators to establish precisely where to draw the line on issues such as this. Still, the capacity for others to easily take control of (for instance) a connected pacemaker is bound to generate a degree of concern.
Many of these issues are complicated by existing regulations. The US Health Insurance Portability and Accountability Act (HIPAA) requires that medical data can only be shared after it has been completely anonymised, which presents something of a paradox for medical IoT, where data is useful precisely because it is tied to an individual patient. In practice it frequently requires complex architectures and dual databases, with pointers that let healthcare professionals blend the two back together and actually make sense of them.
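The general shape of that dual-database pattern can be sketched in a few lines: identifying details live in one store, de-identified device readings in another, and a pseudonymous pointer links them for authorised clinical use. The sketch below is an assumption-laden illustration, not any particular vendor’s architecture – real HIPAA-compliant systems add access control, auditing and formal de-identification, none of which is shown here.

```python
# Minimal sketch (illustrative assumptions throughout) of a dual-database,
# pointer-based design: identity data and device readings are kept apart and
# only re-joined for an authorised clinical view.

import secrets

identity_db = {}   # pseudonym -> identifying details (tightly restricted store)
readings_db = []   # de-identified readings, keyed only by pseudonym

def register_patient(name: str, mrn: str) -> str:
    """Create a random pseudonym and store identifying data separately."""
    pseudonym = secrets.token_hex(16)
    identity_db[pseudonym] = {"name": name, "mrn": mrn}
    return pseudonym

def record_reading(pseudonym: str, glucose_mg_dl: float) -> None:
    """Store a device reading with no directly identifying information."""
    readings_db.append({"pseudonym": pseudonym, "glucose_mg_dl": glucose_mg_dl})

def clinician_view(pseudonym: str):
    """Re-join the two stores so a clinician can make sense of the data."""
    patient = identity_db[pseudonym]
    readings = [r["glucose_mg_dl"] for r in readings_db if r["pseudonym"] == pseudonym]
    return patient["name"], readings

if __name__ == "__main__":
    p = register_patient("Jane Doe", "MRN-0001")   # hypothetical patient
    record_reading(p, 142.0)
    record_reading(p, 118.0)
    print(clinician_view(p))
```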
Issues like this mean developers can’t rely on industry-standard architectures.
“You can’t rely on this network immune system that exists in the consumer software space where many different parties are vigilant in monitoring breaches and bugs because multiple vendors’ code is used by a product,” says Iyengar, picking an apt metaphor. “If you want to develop security related features you kind of have to do it yourself.” In turn this means that, if there are breaches, you have to address them yourself. “It raises this interesting dilemma,” he says. “On the one hand the way that software’s written in the medical field, it’s supposed to be more safe. But in some situations it may backfire and the entire industry suffers.”