Scientists have created a dry sensor that opens up new avenues of mind control. Yep, with these dry sensors, humans could soon operate robots directly from their mind. Such dry sensors are able to measure the brain's electrical activity without relying on conductive gels and wet sensors that are currently used to diagnose neurological disorders and to develop brain-computer interfaces, like Elon Musk's Neuralink.
Fitted into an elastic headband and paired with an augmented reality (AR) headset, these 3D-patterned, graphene-based dry sensors let people control a robot hands-free by interpreting their brain signals.
The research, published in ACS Applied Nano Materials, notes that dry sensors are not yet as effective as wet sensors, but the work marks a step towards non-invasive brain-computer interfaces that can be deployed easily.
What would you need to control a robot through these dry sensors? Just a specialised electronic headband and, of course, a robot. The sensor can read the brain's electrical signals even through hair and the bumps and curves of your head.
Current ways of reading electrical signals from the brain rely on electroencephalography (EEG), wherein specialised electrodes are placed on the scalp or implanted. EEG helps diagnose neurological disorders, and is also being used to develop brain-computer interfaces that let people control external devices through brain waves - be it a prosthetic limb, a robot, or even a video game.
To develop this graphene-based dry sensor, scientists used polycrystalline graphene that could read brain activity without any adhesive gel. Of the 3D graphene-coated structures they created, a hexagonal pattern worked best on the curved, hairy surface at the back of the head, over the occipital region.
Eight of these sensors were incorporated into an elastic headband and combined with an augmented reality headset that displayed visual cues. The sensors could detect which cue the wearer was looking at, and a computer then converted these signals into commands to control a four-legged robot.
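The paper's own decoding pipeline isn't detailed here, but cue-based setups like this are commonly implemented as SSVEP-style classifiers: each visual cue flickers at a distinct frequency, and the strongest matching frequency in the EEG spectrum reveals which cue the wearer is watching. A minimal Python sketch of that idea, where the cue frequencies, sampling rate, and command mapping are all illustrative assumptions rather than values from the study:

```python
import numpy as np

# Hypothetical cue flicker frequencies (Hz) mapped to robot commands.
# Real SSVEP paradigms use distinct rates; these values are illustrative.
CUE_COMMANDS = {7.0: "forward", 9.0: "left", 11.0: "right", 13.0: "stop"}
FS = 250  # assumed EEG sampling rate in Hz

def decode_command(eeg_window: np.ndarray) -> str:
    """Return the command whose cue frequency has the strongest spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg_window))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)

    def band_power(f: float) -> float:
        # Sum power in a narrow band around the candidate frequency
        mask = (freqs > f - 0.5) & (freqs < f + 0.5)
        return float(spectrum[mask].sum())

    best = max(CUE_COMMANDS, key=band_power)
    return CUE_COMMANDS[best]

# Simulate a 2-second window dominated by a 9 Hz response plus noise
t = np.arange(0, 2, 1.0 / FS)
window = np.sin(2 * np.pi * 9.0 * t) + 0.3 * np.random.randn(len(t))
print(decode_command(window))  # expected: "left"
```

A real decoder would first band-pass filter and average across the eight occipital channels, but the core step - matching spectral power against known cue frequencies - is the same.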
What do you think about this fascinating research? Let us know in the comments below. For more in the world of technology and science, keep reading Indiatimes.com.