Your Brain in 2050: A Mishmash of Biology and Implants?

Brain technology could one day allow humans to naturally control robotic limbs or replace human sight. (Image credit: Lobke Peers | Shutterstock)

NEW YORK — Cathy Hutchinson suffered a brain-stem stroke that left her paralyzed and unable to speak. But 12 years later, a brain implant gave her the ability to move a robotic arm to pick up a bottle and drink from it, using her thoughts alone.

A video of Hutchinson using the robotic arm was shown at a talk here at the World Science Festival Thursday (May 29) entitled "Cells to Silicon: Your Brain in 2050," which explored the brain technology of the future. (You can watch webcasts of the festival talks on Live Science.)

While scientists are a long way from being able to read people's innermost thoughts, brain-interface technology has advanced rapidly. Brain implants are becoming better at taking information from the brain by listening to the whispered conversations of neurons, and using it to control devices in the real world. Other implants can import information into the brain, to restore vision and other senses. [5 Crazy Technologies That Are Revolutionizing Biotech]

As the technology evolves, there may come a day when humans have prosthetic bodies, or create computer copies of their minds. These possibilities, however, raise questions about what it means to be human. First, though, scientists must deepen their understanding of the brain, much of which remains a mystery.

Downloading from the brain

Hutchinson was using the BrainGate system, which was developed by researchers at Brown University, Stanford University, Massachusetts General Hospital and the Providence VA Medical Center.

In the BrainGate system, an M&M-size array of electrodes is implanted in the brain region that controls arm movements. The array records the tiny electrical signals from neurons, which are then amplified and decoded to control a robotic arm, said panelist John Donoghue, a neuroscientist at Brown University.
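The decoding step Donoghue describes can be illustrated with a classic simplification from neuroscience: a population-vector decoder, in which each recorded neuron fires most strongly for a "preferred" movement direction, and a weighted sum of firing rates recovers the intended movement. This is only a toy sketch with made-up numbers, not BrainGate's actual algorithm:

```python
import numpy as np

# Toy population-vector sketch of neural decoding. Each simulated neuron
# has a preferred 2D movement direction and fires more when the intended
# movement aligns with it (cosine tuning). All parameters are illustrative.

rng = np.random.default_rng(0)

n_neurons = 32
angles = rng.uniform(0, 2 * np.pi, n_neurons)        # hypothetical tuning
preferred = np.stack([np.cos(angles), np.sin(angles)], axis=1)

def firing_rates(intended_direction):
    """Cosine-tuned rates: neurons aligned with the intent fire more."""
    baseline, gain = 10.0, 8.0
    return baseline + gain * (preferred @ intended_direction)

def decode(rates):
    """Population-vector decode: rate-weighted sum of preferred directions."""
    v = (rates - rates.mean()) @ preferred
    return v / np.linalg.norm(v)

intent = np.array([1.0, 0.0])            # user imagines moving right
decoded = decode(firing_rates(intent))   # decoded direction, roughly [1, 0]
```

Real systems face noisy recordings, drifting electrodes and nonstationary tuning, which is why practical decoders are recalibrated and far more sophisticated; the principle of pooling many weak single-neuron signals into one movement command is the same.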

The state-of-the-art prosthetics require a wire that plugs into the implant through a connector on the skull. Such a system is cumbersome, and may not function well for the patient's entire lifetime for a number of reasons, such as movement of the implant or scar tissue buildup.

What if there were a way to communicate with the brain wirelessly? That's a question panelist Michel Maharbiz, an electrical engineer at the University of California, Berkeley, is exploring. He and his colleagues are developing microscopic sensors — known as neural dust — that could record the electrical signals from neurons. The neural dust system would use ultrasound to provide power and communication to the "dust" particles.

Such a system might allow scientists to record signals from thousands of neurons at once, painting a fuller picture of brain activity.

Uploading to the brain

While some scientists are investigating how to let neurons speak to robotic limbs in the outside world, others are working in the opposite direction, developing biomedical implants that take outside information — which people would normally sense through their eyes and ears — and bring it into the brain. [Bionic Humans: Top 10 Technologies]

Although they're still far from making futuristic cyborgs with enhanced vision and hearing, scientists have made great progress in developing these so-called neuroprosthetics, which include cochlear implants to restore hearing in deaf people and bionic eyes to reconstruct vision for the blind.

Sheila Nirenberg, another researcher on the panel and a professor of physiology and biophysics at Weill Cornell Medical College, is working on developing artificial retinas to treat blindness in people with retinal damage. The goal is to make a chip that not only transfers outside information to the brain, but does so with the high-definition quality of real retinas.

When light enters the eye and hits the photoreceptor cells of the retina, those cells convert the information it carries into electrical impulses that travel to the brain. Each image produces a distinct pattern of activity, so the impulses leaving the retina take the form of patterns, or codes.

Having deciphered the neural codes of the retinal cells, researchers have been able to make a tiny chip that produces and sends to the brain the same electrical pattern that the retina would, while bypassing damaged retinal cells, Nirenberg said. Their approach has been successful in mice, and the researchers are testing the technique on primates before it's used in people.
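One common textbook simplification of such a retinal code is the linear-nonlinear (LN) model: a cell applies a center-surround filter to a patch of the image, then a rectifying nonlinearity turns the filtered value into a firing rate. The sketch below uses this standard model with made-up parameters — it is not Nirenberg's actual encoder:

```python
import numpy as np

# Hedged sketch of a retinal "encoder" as a linear-nonlinear (LN) model:
# a difference-of-Gaussians (center-surround) filter followed by
# rectification. Kernel size and scaling are arbitrary illustrations.

def center_surround_kernel(size=7, sigma_c=1.0, sigma_s=2.5):
    """Excitatory center minus inhibitory surround; sums to zero."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_c**2))
    surround = np.exp(-r2 / (2 * sigma_s**2))
    return center / center.sum() - surround / surround.sum()

def encode(image_patch, kernel):
    """LN encoder: linear filter, then rectification -> spike rate."""
    drive = float(np.sum(image_patch * kernel))
    return max(0.0, drive) * 100.0   # nominal rate in spikes/s

kernel = center_surround_kernel()

# A bright spot on the center drives the cell strongly, while uniform
# illumination cancels out: the cell signals contrast, not raw light.
spot = np.zeros((7, 7)); spot[3, 3] = 1.0
uniform = np.ones((7, 7))
```

An artificial retina built on this idea would run such an encoder on camera input and deliver the resulting pulse patterns past the damaged photoreceptors — which is why matching the code, not just delivering pulses, matters for image quality.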

Closing in on the brain

There may come a day when the brain controls an entirely robotic body, or perceives the world through artificial senses. It's less likely, however, that scientists could ever faithfully reconstruct a brain in a computer, said panelist Gary Marcus, a cognitive psychologist and science writer at NYU. And even if they could, it might not be "you" anymore, Marcus said.

The technology of today, no matter how impressive, is still far from uncovering the mysteries of the brain, the panelists said. Scientists can zero in on a single neuron, and interpret the activity of a large ensemble of neurons, but they still know little about what happens in between — how the firing of individual neurons gives rise to the symphony of brain activity that makes up conscious experience.

"That middle ground is the great new adventure for brain sciences in the next 50 years," Donoghue said.

Editor's Note: This article has been updated at 6:45 p.m. ET June 3, to change Weill Medical College of Cornell University to Weill Cornell Medical College.

Follow Tanya Lewis and Bahar Gholipour. Original article on Live Science.

Bahar Gholipour
Staff Writer
Bahar Gholipour is a staff reporter for Live Science covering neuroscience, odd medical cases and all things health. She holds a Master of Science degree in neuroscience from the École Normale Supérieure (ENS) in Paris, and has done graduate-level work in science journalism at the State University of New York at Stony Brook. She has worked as a research assistant at the Laboratoire de Neurosciences Cognitives at ENS.