Humans Won't Have Memory Implants Anytime Soon, But The Technology To Hack Them Already Exists
Plenty of tech experts, when considering the future, talk about how eventually humanity will progress to the point where we integrate ourselves with machines. But while we're considering how cool that might be, no one is thinking about the security risks.
That's one thing Kaspersky is investigating in its latest report, in partnership with the University of Oxford's Functional Neurosurgery Group. It describes a possible future where people augment their brains with memory implants, either to save digital versions of their memories, to interface directly with the devices they use, or even to borrow those devices' processing power.
The problem is that, because the idea is so nascent, no one is even discussing a security protocol for it. The technology to exploit such hardware, on the other hand, already exists, even if the threats it will be part of are still decades away. So one day, hackers could literally enter our minds to steal data.
Scientists at Oxford have been studying how memories are created in the brain by electrical impulses, as well as how to target, restore, and enhance them using brain stimulation devices. These are called implantable pulse generators (IPGs) or neurostimulators, and they send electrical impulses to specific targets in the brain. In some cases, doctors use these to treat conditions like Parkinson's disease, major depression, and OCD.
The thing is, these devices come with software for both physicians and patients, installed on regular tablets and smartphones that connect via Bluetooth. And that's where the litany of problems begins.
For one thing, the data being transferred is unencrypted, meaning anyone can intercept the connection and steal critical data about a patient or the hospital. Worse, they can also tamper with the functioning of the device, causing anything from mild annoyance to pain and even paralysis. And if other patients connect their implants to the same infrastructure, they're compromised too.
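To make that first problem concrete, here is a minimal Python sketch. It is not from the Kaspersky report; the telemetry fields and the use of the cryptography package are illustrative assumptions. It simply shows why a plaintext payload is readable to anyone who captures the traffic, and how symmetric encryption would change that:

```python
# Minimal sketch (not from the report): plaintext telemetry is trivially
# readable to anyone who captures the bytes; encrypted telemetry is not.
# The payload fields below are hypothetical.
import json
from cryptography.fernet import Fernet  # pip install cryptography

# A made-up record of the kind an implant companion app might transmit.
telemetry = {"patient_id": "P-0042", "amplitude_mA": 2.5, "frequency_Hz": 130}

plaintext = json.dumps(telemetry).encode()
print("Eavesdropper's view without encryption:")
print(plaintext)  # fully readable JSON

# With a shared symmetric key, captured bytes are opaque ciphertext.
key = Fernet.generate_key()   # in practice, provisioned and stored securely
cipher = Fernet(key)
ciphertext = cipher.encrypt(plaintext)
print("Eavesdropper's view with encryption:")
print(ciphertext)  # opaque, tamper-evident token

# Only a holder of the key can recover and verify the original record.
recovered = json.loads(cipher.decrypt(ciphertext))
assert recovered == telemetry
```

In a real deployment the key would have to be provisioned and protected on both ends, which is exactly the kind of security protocol work the report says nobody is discussing yet.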
Another point is that these devices are geared towards the patient's safety rather than security. They need to be easily accessible to a doctor in case of an emergency, which translates to no password and a software backdoor. For a hacker trying to crack into such a device, that's child's play.
And lastly, of course, there's the usual vulnerability in any tech product: the human element. People are always the weakest link in a security chain, and the researchers found that some hospitals never changed the default password on the software linked to these devices. That means access still came down to the factory default, known to the programmers behind the software and to anyone familiar enough with the technology.
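As a purely hypothetical illustration of how low that bar is, the sketch below tries a short list of made-up default credentials against a stand-in login function. No real product, interface, or password list is being described:

```python
# Illustrative only: how little effort it takes to try published default
# credentials against a management interface that was never reconfigured.
COMMON_DEFAULTS = [("admin", "admin"), ("admin", "password"), ("clinician", "1234")]

def login(username: str, password: str) -> bool:
    """Stand-in for a real login check; here the factory default was
    never changed, the situation described above."""
    return (username, password) == ("admin", "admin")

for username, password in COMMON_DEFAULTS:
    if login(username, password):
        print(f"Access granted with default credentials: {username}/{password}")
        break
```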
The really scary part is when you realise that, by expert estimates, we're only about five years away from scientists being able to electronically record the brain signals that build memories, and then enhance or even rewrite them before putting them back into the brain. They believe that in 10 years we'll have the first memory-boosting implants on the market. And all the same vulnerabilities that threaten neurostimulators will threaten these devices as well. IPGs, at least, are less likely to be targeted, because an attacker could derive only sick pleasure from hitting one. Memory implants, on the other hand, could hold a wealth of useful data, a powerful incentive to attack them.
Hackers could steal passwords from our very minds, glean information to blackmail someone, or even 'lock' memories in return for a ransom. All of which are terrifying prospects.