Human Thoughts Control New Robot

Scientists have created a way to control a robot with signals from a human brain.

By generating the proper brainwaves, which are picked up by a cap of electrodes that senses the signals and relays the wearer's instructions, a person can direct a humanoid robot to move to specific locations and pick up certain objects.

The commands are limited to moving forward, picking up one of two objects, and bringing it to one of two locations. The researchers have achieved 94 percent accuracy in matching the robot's actions to the user's thought commands.

"This is really a proof-of-concept demonstration," said Rajesh Rao, a researcher from the University of Washington who leads the project. "It suggests that one day we might be able to use semi-autonomous robots for such jobs as helping disabled people or performing routine tasks in a person's home."

The person wearing the electrode cap watches the robot's movements on a computer screen via two cameras, one installed on the robot and one above it.

When the robot's camera sees the objects that can be picked up, it passes the information to the user's computer screen, where each object lights up at random. When the object a person wants picked up happens to light up, the brain registers surprise; the computer detects this burst of brain activity and relays it to the robot as the chosen object. The robot then proceeds to pick up that object.
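The selection step works like a classic "oddball" interface: candidate objects flash at random, and the flash that coincides with a detected jolt of surprise in the brain signal wins. The Python sketch below is only an illustration of that loop; the simulated surprise detector, object names, timing, and detection rate are assumptions for demonstration, not the team's actual system.

```python
import random
import time

# Illustrative sketch of the surprise-driven selection described above.
# The simulated EEG check and all constants are assumptions, not the
# researchers' code.

CANDIDATES = ["object_A", "object_B"]   # the two objects the robot can fetch
FLASH_INTERVAL = 0.5                    # seconds between on-screen flashes (assumed)
USER_WANTS = "object_B"                 # simulated user intent, for demonstration only


def detect_surprise(flashed: str) -> bool:
    """Stand-in for an EEG classifier: fires when the flashed object is the one
    the user wants, with some noise to mimic an imperfect brain signal."""
    hit = flashed == USER_WANTS
    return hit and random.random() < 0.9   # 90% detection rate, purely illustrative


def choose_object() -> str:
    """Flash candidates in random order until a flash coincides with surprise."""
    while True:
        target = random.choice(CANDIDATES)
        print(f"flashing {target} on the user's screen")
        time.sleep(FLASH_INTERVAL)
        if detect_surprise(target):
            return target                   # the robot picks up this object


if __name__ == "__main__":
    print("robot will pick up:", choose_object())
```

Flashing the objects at random is what makes the wanted object's flash unexpected, and that unexpectedness is what produces the measurable surprise response the computer listens for.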

A similar algorithm is used to decide where the robot will go.

"One of the important things about this demonstration is that we're using a 'noisy' brain signal to control the robot," Rao said. "The technique for picking up brain signals is non-invasive, but that means we can only obtain brain signals indirectly from sensors on the surface of the head, and not where they are generated deep in the brain. As a result, the user can only generate high-level commands such as indicating which object to pick up or which location to go to, and the robot needs to be autonomous enough to be able to execute such commands."

In the future, the researchers hope to make the robot more adaptive to its environment by having it carry out more complex commands.

"We want to get to the point of using actual objects that people might want the robot to gather, as well as having the robot move through multiple rooms," Rao said.

The results of this research were presented last week at the Current Trends in Brain-Computer Interfacing meeting in Whistler, B.C.

Sara Goudarzi
Sara Goudarzi is a Brooklyn writer and poet and covers all that piques her curiosity, from cosmology to climate change to the intersection of art and science. Sara holds an M.A. from New York University, Arthur L. Carter Journalism Institute, and an M.S. from Rutgers University. She teaches writing at NYU and is at work on a first novel in which literature is garnished with science.