Security, privacy issues we need to solve before non-medical implants become pervasive
The cybernetic revolution is happening, and it’s imperative that civil liberties and privacy issues are addressed by system designers, innovators, regulators, and legislators, says James Scott, a Senior Fellow at cybersecurity think tank ICIT (Institute for Critical Infrastructure Technology).
In a recently released paper on implantable devices, he provides a comprehensive overview of how medical and non-medical implants are currently used, the security and privacy issues that have already arisen from their use, and those likely to arise in the future.
He expects sophisticated cybernetic implant systems to be deployed much more widely in the next decade, and would like to see security-by-design prioritized while implant devices are still in their early stages. If we fail to do that, he says, there may be no way to mitigate the onslaught of privacy and security harms poised to disrupt humanity’s potential evolution.
What we need is “responsible regulatory legislation that does not pander to the whims of metadata curators and data brokers and that mandates security-by-design.”
Wearables
Scott views wearables as a gateway technology that has desensitized consumers to the continual monitoring and collection of biological and behavioral data, which is then used and capitalized on by the companies that develop the devices, by associated third parties, and possibly by cyber threat actors who manage to get their hands on it.
“Even if the data were only used ‘by authorized parties,’ users could suffer if the data were deterministically employed in decisions determining hiring or credit. For instance, hypothetically, users could be charged more for life insurance based on the data collected from a fitness monitor,” he notes.
And, of course, these are devices that users can choose to take off and stop using at any time. But what happens when they can’t?
Implants
The security of medical implants has lately gained a degree of visibility, thanks to the many security researchers who, for personal or altruistic reasons, have decided to put it to the test.
Medical implants are welcomed by users out of necessity: if you can’t see or hear, wouldn’t you embrace the possibility of an implant that will help you regain a modicum of vision or hearing – security and privacy issues be damned?
And if you need a pacemaker or an insulin pump, you are more likely to prefer one that can be repeatedly reconfigured to your needs, that can be accessed remotely by medical personnel if necessary, and that can collect and provide information for better healthcare decisions in the future.
Unfortunately, if security protections are absent or poor, all of those positive capabilities can be exploited by attackers for nefarious purposes. Still, to most users the idea of a cyber attack against their implant does not seem realistic, and the pros outweigh the cons, so they gladly accept them.
Implantable NFC chips, on the other hand, are not something that most users will currently accept, even though they can be convenient (they can be used to share contact information, make Bitcoin payments, unlock mobile devices or doors, start smart vehicles, and so on).
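To give a concrete sense of what "sharing contact information" means in practice, here is a minimal sketch, not drawn from Scott's paper, that dumps whatever NDEF records an NFC chip exposes to any reader in range. It assumes the open-source nfcpy library and a USB contactless reader; whether a given implant carries a contact card, a URI, or a payment address depends entirely on what its owner has written to it.

    # Minimal sketch (not from Scott's paper): print the NDEF records an NFC
    # tag or implant exposes to any nearby reader, using the nfcpy library.
    import nfc

    def dump_tag(tag):
        # tag.ndef is None if the chip holds no NDEF message
        if tag.ndef is None:
            print("No NDEF data stored on this chip.")
        else:
            for record in tag.ndef.records:
                # e.g. a contact (vCard) record, a URI, or a payment address
                print(record.type, record)
        return False  # returning False lets connect() return immediately

    # Assumes a USB contactless reader is attached
    with nfc.ContactlessFrontend('usb') as clf:
        clf.connect(rdwr={'on-connect': dump_tag})

The point of the sketch is that nothing in the chip itself decides who gets to read those records; any access control has to be designed into the surrounding system.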
But the time may come when they become ubiquitous, and the choice to implant them or not could have considerable positive or negative (and potentially disastrous) economic consequences. And is it really a choice if the consequence of refusing to have your behavior tracked through an implantable device is, in the most extreme case, your family starving?
Other issues that need to be addressed
The aforementioned “involuntarily voluntary” adoption of implantable devices is just one of the problems that should be addressed before implants become ubiquitous.
Other issues that should be tackled are the lack of security-by-design, users’ tendency to view cybersecurity features as barriers rather than defenses, and insecure connections.
“Users naively assume that devices will not be targeted because ‘they are too niche’ or ‘an attacker would not gain much;’ however, threat actors have consistently proven that if a device is vulnerable, it is worth exploiting,” Scott noted.
“Attackers are not always logical or correct. A script kiddie might detect an implant device via local traffic sniffing or the Shodan search engine and decide to compromise it ‘for fun.’ Some aspiring cybercriminal might detect an IP address and try to infect the associated device with ransomware without bothering to check if it is ‘worth compromising.’”
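To illustrate how little effort such opportunistic discovery takes, here is a minimal sketch using the official shodan Python client. The API key and the query string are placeholders for illustration, not real indicators for implant devices, and the example is not taken from Scott's paper.

    # Minimal sketch (not from Scott's paper): search Shodan for devices whose
    # banners match an arbitrary keyword, the way an opportunistic attacker might.
    import shodan

    api = shodan.Shodan("YOUR_API_KEY")  # placeholder API key

    # Hypothetical query; a real attacker would use whatever banner text they
    # associate with a target device class.
    results = api.search("example-implant-gateway")

    print("Matching devices:", results["total"])
    for match in results["matches"][:5]:
        print(match["ip_str"], match.get("port"), match.get("org"))

A handful of lines like these are all it takes to turn "too niche to target" into a list of candidate IP addresses.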
Also, we should have a guarantee that implants can’t be repurposed in the future without our knowledge and/or approval. And we should know how the data they collect is stored, secured, and used.
“‘Cyborgification’ raises a host of ethical questions and security and privacy concerns that private sector organizations should address before adopting emerging implant technologies,” he notes.