Humans have long wished to see through the eyes of other animals — like Bran Stark’s Warg ability, say — but so far the best we’ve achieved is mounting GoPros on them. One Harvard research team, though, has just brought us a step closer to that goal with a prototype noninvasive brain-to-brain interface allowing test subjects to control a rat’s tail with nothing more than their thoughts.
The study, published in the journal PLOS ONE, used an EEG headset to “read” the human’s brain activity, an intermediary computer to interpret and translate that data into electronic signals, and a transcranial focused ultrasound device to activate a specific region of an anesthetized rat’s brain, causing the rodent’s tail to twitch. It’s essentially a human-rat mind meld, with a computer translating between the two brains. As the research results explain:
Transcranial focused ultrasound (FUS) is capable of modulating the neural activity of specific brain regions, with a potential role as a non-invasive computer-to-brain interface (CBI). In conjunction with the use of brain-to-computer interface (BCI) techniques that translate brain function to generate computer commands, we investigated the feasibility of using the FUS-based CBI to non-invasively establish a functional link between the brains of different species (i.e. human and Sprague-Dawley rat), thus creating a brain-to-brain interface (BBI). The implementation was aimed to non-invasively translate the human volunteer’s intention to stimulate a rat’s brain motor area that is responsible for the tail movement. The volunteer initiated the intention by looking at a strobe light flicker on a computer display, and the degree of synchronisation in the electroencephalographic steady-state-visual-evoked-potentials (SSVEP) with respect to the strobe frequency was analysed using a computer. Increased signal amplitude in the SSVEP, indicating the volunteer’s intention, triggered the delivery of a burst-mode FUS (350 kHz ultrasound frequency, tone burst duration of 0.5 ms, pulse repetition frequency of 1 kHz, given for 300 msec duration) to excite the motor area of an anesthetized rat transcranially. The successful excitation subsequently elicited the tail movement, which was detected by a motion sensor.
The best part: the technique is completely non-invasive. Neither the rat nor the human had to have their heads sawed open or electrodes embedded in their grey matter. Instead, the human subject activates the system by watching a specific, computer-generated flicker pattern. The EEG headset looks for the brain waves generated by that specific pattern and, once it finds them, sends a command to the intermediary computer. Initial tests have shown the system to be about 94 per cent accurate, with just a 1.5 second lag between initial thought and tail twitch.
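For the curious, the trigger logic described above can be sketched in a few lines. This is purely an illustration, not the researchers’ actual code: it checks whether the EEG spectrum has a strong peak at the strobe’s flicker frequency (the SSVEP signature) and fires a stimulation command when it crosses a threshold. The 15 Hz flicker rate, 256 Hz sampling rate, and threshold value are all made-up numbers for the demo.

```python
import numpy as np

def ssvep_amplitude(eeg, fs, strobe_hz):
    """Amplitude of the EEG spectrum at the strobe flicker frequency."""
    spectrum = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Take the frequency bin closest to the strobe rate
    return spectrum[np.argmin(np.abs(freqs - strobe_hz))]

def should_stimulate(eeg, fs, strobe_hz, threshold):
    """Return True when the SSVEP amplitude at the strobe frequency
    exceeds the threshold -- i.e. the subject is attending to the flicker
    and the (hypothetical) ultrasound pulse should be triggered."""
    return ssvep_amplitude(eeg, fs, strobe_hz) > threshold

# Simulated one-second EEG epochs: background noise, plus a 15 Hz
# component that appears when the subject watches the flicker.
fs, strobe_hz = 256, 15.0          # illustrative values, not from the paper
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
attending = 2.0 * np.sin(2 * np.pi * strobe_hz * t) + 0.3 * rng.standard_normal(fs)
idle = 0.3 * rng.standard_normal(fs)

print(should_stimulate(attending, fs, strobe_hz, threshold=0.5))  # True
print(should_stimulate(idle, fs, strobe_hz, threshold=0.5))       # False
```

In the real setup the output of a detector like this would be translated into the 350 kHz burst-mode ultrasound pulse described in the abstract, rather than a print statement.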
This is still a very preliminary design, so don’t expect to be Dolittling your way through the local zoo anytime in the next decade. Researchers aren’t even really sure yet what happens if they reverse the flow of commands and put a rat at the controls of a human body. Plus there’s the whole ethical dilemma of technological telepathy: whether it ends up more like the cool internal comms from GITS or like Big Brother literally getting inside your head, that will need to be sorted out. [PLOS One via Extreme Tech]