First Human Mind-Meld Created

Two researchers have demonstrated the possibility of a human mind-meld, with one man's brain signals directing the other man's hand to move. (Image credit: University of Washington)

One man has controlled the movements of another person by sending brain signals via the Internet.

The demonstration is the first example of two human brains directly interacting.

"The Internet was a way to connect computers, and now it can be a way to connect brains," said University of Washington researcher Andrea Stocco, who participated in the experiment, in a statement. "We want to take the knowledge of a brain and transmit it directly from brain to brain."

The findings have not been published yet in a peer-reviewed journal.

The field of human-computer interfaces has grown by leaps and bounds in recent years. Scientists have developed methods that let quadriplegics move robotic limbs to grasp chocolate or give high-fives. Other researchers have created mind-melds between two rats, and between a rat and a human. [9 Cyborg Enhancements Available Right Now]

On Aug. 12, University of Washington computer scientist Rajesh Rao donned a cap studded with electrodes that was attached to an electroencephalography machine. The cap read Rao's electrical brain activity.

In another room across campus, Stocco wore a swim cap fitted with a transcranial magnetic stimulation coil placed over his motor cortex, the brain region that controls movement. The coil delivered magnetic pulses that stimulate activity in Stocco's brain.

As Rao played a video game and imagined moving his fingers to aim a cannon on the screen at a virtual target, his brain signals were sent to Stocco's brain via the Internet. Stocco immediately noticed his hand twitching involuntarily, his index finger moving toward the space bar as if to press the key that fires the cannon.
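The basic data flow is simple in outline: read a signal on one machine, send a short trigger message over the Internet, and act on it at the other end. The sketch below is purely illustrative and is not the researchers' software; the port number, confidence threshold and "fire" message format are invented for this example, and the receiver only prints a placeholder where a real system would pulse the TMS coil.

    import json
    import socket

    TRIGGER_PORT = 9999        # hypothetical port on the receiving machine
    IMAGERY_THRESHOLD = 0.8    # hypothetical confidence cutoff for an "imagined key press"

    def send_trigger(receiver_host, confidence):
        """Sender side: relay a 'fire' event if the imagined movement is confident enough."""
        if confidence < IMAGERY_THRESHOLD:
            return
        message = json.dumps({"event": "fire", "confidence": confidence}).encode()
        with socket.create_connection((receiver_host, TRIGGER_PORT)) as conn:
            conn.sendall(message)

    def listen_for_triggers():
        """Receiver side: wait for 'fire' events; a real system would trigger the TMS coil here."""
        with socket.create_server(("", TRIGGER_PORT)) as server:
            while True:
                conn, _addr = server.accept()
                with conn:
                    event = json.loads(conn.recv(1024).decode())
                    if event.get("event") == "fire":
                        print("Trigger received: stimulate motor cortex (placeholder)")

In the real experiment, of course, the hard part is not the network hop but decoding the EEG signal reliably enough to know when the sender intends to "fire."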

"It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain," Rao said in a statement.

So far, the technology can only read simple signals, not a person's complicated thoughts. But the team hopes later iterations will become more sophisticated, perhaps helping disabled people translate their thoughts into actions.

"This was basically a one-way flow of information from my brain to his. The next step is having a more equitable two-way conversation directly between the two brains," Rao said.

