In the 2009 blockbuster “Avatar,” a human remotely controls the body of an alien by, in effect, projecting human intelligence into a remotely located biological body. Although such a feat remains science fiction, recent advances in electronics and computing have allowed researchers to develop so-called ‘brain-computer interfaces’ (BCIs): technologies that can ‘read’ human thought and use it to control machines, for example humanoid robots.
New research has demonstrated the possibility of combining a BCI with a device that transmits information from a computer to a brain, or a so-called ‘computer-to-brain interface’ (CBI). The combination of these devices could be used to establish a functional link between the brains of different species. Now, researchers from the Korea Advanced Institute of Science and Technology (KAIST) have developed a human-turtle interaction system in which a signal originating from a human brain can affect where a turtle moves.
Unlike previous research that tried to control animal movement with invasive methods, most notably in insects, the KAIST researchers propose a conceptual system that guides an animal’s path by triggering its instinctive escape behaviour. They chose the turtle for its cognitive abilities and its capacity to distinguish different wavelengths of light. Specifically, turtles recognize a white light source as an open space and move toward it, and they show a specific avoidance response to anything that obstructs their view. In other words, turtles move toward light and away from obstacles in a predictable manner, and it was this instinctive, predictable behaviour that the researchers harnessed in their system.
The entire human-turtle setup is as follows: A head-mounted display (HMD) is combined with a BCI to immerse the human user in the turtle’s environment. The human operator wears the BCI-HMD system, while the turtle has a ‘cyborg system’ — consisting of a camera, a Wi-Fi transceiver, a computer control module and a battery — all mounted on the turtle’s upper shell. Also included on the turtle’s shell is a black semi-cylinder with a slit, which forms the ‘stimulation device’. This can be turned ±36 degrees via the BCI.
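The shell-mounted stimulation device can be modelled as a simple rotating actuator clamped to its mechanical range. The sketch below is a minimal illustration, assuming hypothetical class and field names not taken from the paper; only the ±36 degree range comes from the article.

```python
from dataclasses import dataclass
from typing import ClassVar


@dataclass
class StimulationDevice:
    """Illustrative model of the black semi-cylinder with a slit
    mounted on the turtle's shell; it rotates within +/-36 degrees."""

    angle_deg: float = 0.0
    MAX_ANGLE: ClassVar[float] = 36.0  # range reported in the article

    def turn(self, delta_deg: float) -> float:
        # Clamp the requested rotation to the device's mechanical range.
        self.angle_deg = max(-self.MAX_ANGLE,
                             min(self.MAX_ANGLE, self.angle_deg + delta_deg))
        return self.angle_deg
```

In use, a command arriving over Wi-Fi would simply call `turn()` with a left or right increment, and the clamp guarantees the slit never exceeds its physical travel.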
The control process works like this: the human operator receives images from the camera mounted on the turtle. These real-time video images allow the operator to decide where the turtle should move. The operator provides thought commands, which the wearable BCI system recognizes as electroencephalography (EEG) signals. The BCI can distinguish between three mental states: left, right and idle. The left and right commands activate the turtle’s stimulation device via Wi-Fi, turning it so that it obstructs the turtle’s view on one side. This invokes the turtle’s natural instinct to move toward light, causing it to change direction. Finally, the human receives updated visual feedback from the shell-mounted camera and in this way continues to remotely steer the turtle’s trajectory.
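The closed loop above can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: `classify_eeg` stands in for a real EEG decoder (here it just picks a state at random), and the per-step turn increment is an assumption; only the three mental states and the ±36 degree device range come from the article.

```python
import random

# The three mental states the BCI distinguishes (from the article).
COMMANDS = ("left", "right", "idle")

# Illustrative per-step rotation of the stimulation device (assumed value).
STEP_DEG = {"left": -12, "right": +12, "idle": 0}
MAX_DEG = 36  # device range reported in the article


def classify_eeg(eeg_window):
    """Stand-in for the EEG decoder: a real BCI would extract
    features from the EEG window; here we pick a state at random."""
    return random.choice(COMMANDS)


def control_step(eeg_window, device_angle):
    """One iteration of the loop: decode the operator's intent, then
    rotate the stimulation device so it obstructs the turtle's view
    on one side, clamped to the device's mechanical range."""
    command = classify_eeg(eeg_window)
    device_angle = max(-MAX_DEG,
                       min(MAX_DEG, device_angle + STEP_DEG[command]))
    return command, device_angle


# Simulated session: in the real system, video from the shell-mounted
# camera gives the human operator feedback after every step.
angle = 0
for _ in range(10):
    command, angle = control_step(eeg_window=None, device_angle=angle)
```

The key design point is that the loop is closed through the human: the machine never plans a path itself, it only translates decoded intent into a view-obstructing stimulus.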
The research demonstrates that this BCI-based animal-guiding scheme works in a variety of environments, with turtles moving indoors and outdoors, on surfaces such as gravel and grass, and around obstacles such as shallow water and trees. The technology could be extended with positioning systems and improved augmented- and virtual-reality techniques, enabling applications that include devices for military reconnaissance and surveillance.
Learn more: Controlling turtle motion with human thought