11 Mar 2013

Some Recent Scientific Developments in Brain Machine Interface (for Robotic Prosthesis), Neuroplasticity, Neurocomputation, and Whole Brain Emulation

summary by Corry Shores

 

[All boldface is my own]







Brief Summary: New scientific advances support the posthuman vision of robotically enhanced and reconstructed post-humans. Neuroplasticity and brain machine interface (also called brain computer interface) allow brains to control robotic parts just as they do biological ones. Whole brain emulation and cognitive prosthetics could allow brain-implanted chips to replace or enhance our brain functioning, perhaps even completely “uploading” our brain onto a computerized simulation. Progressive replacement of bodily and neural parts with robotic and computerized ones could enable a complete and continuous transition from human to robot.

 




"Brain" In A Dish Acts As Autopilot Living Computer

Explore: Research at the University of Florida

Spring 2005 Vol. 10 No.1

http://www.research.ufl.edu/publications/explore/v10n1/extract2.html


Thomas DeMarse has created a miniature living brain in a dish. He cultured rat cortical neurons that grew connections to form a network, which can perform tasks in a virtual world.

“It’s essentially a dish with 60 electrodes arranged in a grid at the bottom,” DeMarse said. “Over that we put the living cortical neurons from rats, which rapidly begin to reconnect themselves, forming a living neural network — a brain.”

The brain and the simulator establish a two-way connection, similar to how neurons receive and interpret signals from each other to control our bodies. By observing how the nerve cells interact with the simulator, scientists can decode how a neural network establishes connections and begins to compute, DeMarse said.

When DeMarse first puts the neurons in the dish, they look like little more than grains of sand sprinkled in water. However, individual neurons soon begin to extend microscopic lines toward each other, making connections that represent neural processes. “You see one extend a process, pull it back, extend it out — and it may do that a couple of times, just sampling who’s next to it, until over time the connectivity starts to establish itself,” he said. “(The brain is) getting its network to the point where it’s a live computation device.”

To control the simulated aircraft, the neurons first receive information from the computer about flight conditions: whether the plane is flying straight and level or is tilted to the left or to the right. The neurons then analyze the data and respond by sending signals to the plane’s controls. Those signals alter the flight path and new information is sent to the neurons, creating a feedback system.
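The feedback loop described above (flight state in, control signal out, adaptation from the results) can be sketched in a few lines of Python. This is a toy illustration only: a single adaptive gain stands in for the living neural network, and the learning rule and constants are invented, not DeMarse's actual setup.

```python
import random

def run_closed_loop(steps=3000, lr=0.1, seed=1):
    """Toy closed-loop 'autopilot': the plant is the aircraft's roll angle,
    and a single adaptive gain plays the role of the neural network."""
    random.seed(seed)
    gain = 0.0   # the untrained network: its output is uncorrelated with the task
    roll = 1.0   # the aircraft starts tilted
    for _ in range(steps):
        observed = roll + random.gauss(0, 0.01)              # 1. flight state sent in
        control = -gain * observed                           # 2. network responds
        roll = 0.9 * roll + control + random.gauss(0, 0.01)  # 3. flight path altered
        gain += lr * observed * roll                         # 4. feedback adapts the network
    return gain, roll
```

After enough iterations the gain has moved away from zero and the simulated roll stays near level, mirroring the way the dish network "gradually learns to fly the aircraft."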

“Initially when we hook up this brain to a flight simulator, it doesn’t know how to control the aircraft,” DeMarse said. “So you hook it up and the aircraft simply drifts randomly. And as the data come in, it slowly modifies the (neural) network so over time, the network gradually learns to fly the aircraft.”

Although the brain currently is able to control the pitch and roll of the simulated aircraft in weather conditions ranging from blue skies to stormy, hurricane-force winds, the underlying goal is a more fundamental understanding of how neurons interact as a network, DeMarse said.

“There’s a lot of data out there that will tell you that the computation that’s going on here isn’t based on just one neuron. The computational property is actually an emergent property of hundreds or thousands of neurons cooperating to produce the amazing processing power of the brain.”



Monkeys Think, Moving Artificial Arm as Own

By Benedict Carey

New York Times

Published: May 29, 2008

http://www.nytimes.com/2008/05/29/science/29brain.html?_r=0


Two monkeys with brain-controlled prosthetics successfully use their robotic arms to reach for food and feed it to themselves.

Two monkeys with tiny sensors in their brains have learned to control a mechanical arm with just their thoughts, using it to reach for and grab food and even to adjust for the size and stickiness of morsels when necessary, scientists reported on Wednesday.

The report, released online by the journal Nature, is the most striking demonstration to date of brain-machine interface technology. Scientists expect that technology will eventually allow people with spinal cord injuries and other paralyzing conditions to gain more control over their lives.


ALSO reported at MIT Technology Review

Monkey Thinks Robot into Action

A monkey is able to feed itself with a robotic arm.

    By Emily Singer

MIT Technology Review

May 28, 2008

http://www.technologyreview.com/news/410189/monkey-thinks-robot-into-action/


“It’s the first time a monkey–or a human–is directly, with their brain, controlling a real prosthetic arm,” says Krishna Shenoy, a neuroscientist at Stanford University who was not involved in the research. (Singer)




TED

Henry Markram: A brain in a supercomputer
Filmed Jul 2009 • Posted Oct 2009 • TEDGlobal 2009

http://www.ted.com/talks/henry_markram_supercomputing_the_brain_s_secrets.html


Supercomputers are being used to simulate brain activity. Researchers began with animal brains and are moving up to the human brain. They first catalogued neuron types and described their interactive behavior, and they can now simulate human neuronal activity on a small scale. Also see:

http://en.wikipedia.org/wiki/Blue_Brain_Project




Rat memory under computer simulation

Eric Mankin

Public release date: 17-Jun-2011

Restoring memory, repairing damaged brains
Biomedical engineers analyze -- and duplicate -- the neural mechanism of learning in rats

Eureka Alert

http://www.eurekalert.org/pub_releases/2011-06/uosc-rmr061211.php


Scientists have developed a way to turn memories on and off—literally with the flip of a switch.

Using an electronic system that duplicates the neural signals associated with memory, they managed to replicate the brain function in rats associated with long-term learned behavior, even when the rats had been drugged to forget.

"Flip the switch on, and the rats remember. Flip it off, and the rats forget," said Theodore Berger of the USC Viterbi School of Engineering's Department of Biomedical Engineering. (Mankin)



ALSO reported in The New York Times

Memory Implant Gives Rats Sharper Recollection

By Benedict Carey

The New York Times

Published: June 17, 2011

http://www.nytimes.com/2011/06/17/science/17memory.html?_r=0


The authors said that with wireless technology and computer chips, the system could be easily fitted for human use. (Carey)




New horizons in auditory prostheses

Zeng, Fan-Gang PhD

Hearing Journal

November 2011 - Volume 64 - Issue 11 - pp 24,26,27

http://journals.lww.com/thehearingjournal/Fulltext/2011/11000/New_horizons_in_auditory_prostheses.5.aspx


There are many recent developments in cochlear implants.

All contemporary cochlear implants use similar signal processing that extracts temporal envelope information from a limited number of spectral bands, and delivers these envelopes successively to 12-22 electrodes implanted in the cochlea. As a result, these implants produce similarly good speech performance: 70-80 percent sentence recognition in quiet, which allows an average cochlear implant user to carry on a conversation over the telephone. Interestingly, though, sentence recognition in quiet has essentially remained at this same level since 1994. (Figure 1.)
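The envelope-extraction strategy described in the passage can be sketched roughly as follows. This is a simplified illustration, not the algorithm of any particular implant: it uses a plain DFT filterbank, and the frame length and band count are arbitrary choices.

```python
import cmath
import math

def band_envelopes(signal, n_bands=12, frame_len=128):
    """Split a signal into n_bands contiguous spectral bands and return,
    frame by frame, the temporal-envelope level of each band -- in a
    cochlear implant, one such level would drive each electrode."""
    bins_per_band = (frame_len // 2) // n_bands
    envelopes = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        # DFT magnitudes for the positive-frequency bins of this frame
        mags = [abs(sum(x * cmath.exp(-2j * math.pi * k * n / frame_len)
                        for n, x in enumerate(frame)))
                for k in range(frame_len // 2)]
        # the band's envelope sample is its mean magnitude in this frame
        envelopes.append([sum(mags[b * bins_per_band:(b + 1) * bins_per_band])
                          / bins_per_band
                          for b in range(n_bands)])
    return envelopes
```

Feeding it a pure tone lights up mostly one band; a real processor would additionally smooth the envelopes and compress them into each electrode's usable current range.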




Active tactile exploration using a brain–machine–brain interface

Joseph E. O’Doherty, Mikhail A. Lebedev, Peter J. Ifft, Katie Z. Zhuang, Solaiman Shokur, Hannes Bleuler & Miguel A. L. Nicolelis

Nature 479, 228–231 (10 November 2011)

http://www.nature.com/nature/journal/v479/n7372/full/nature10489.html


Monkeys operating virtual robotic arms received artificial touch sensations delivered directly to their brains.



ALSO reported by The Huffington Post

Is It Possible To Feel Textures Using Just Brain Waves? New Study Shows How

The Huffington Post

Amanda Chan Posted: 10/07/11 11:49 AM ET

http://www.huffingtonpost.com/2011/10/07/brain-touch-texture-feelings-senses_n_996844.html

 

"This is basically one of the holy grails of this field," study researcher Miguel Nicolelis, a neurobiology professor and co-director of the Duke Center for Neuroengineering, told Bloomberg. "No other study has provided an artificial sensory channel directly to the brain of animals. This is really needed to restore in patients that have a spinal cord injury not only their mobility, but their sense of touch." (Chan)




Going mental: Study highlights brain’s flexibility, gives hope for natural-feeling neuroprosthetics

By Sarah Yang, Media Relations

UC Berkeley News Center

March 4, 2012

http://newscenter.berkeley.edu/2012/03/04/brain-flexibility-gives-hope-for-neuroprosthetics/


Researchers at the University of California, Berkeley have shown that neurons normally used for physical tasks can be retrained for brain machine interface use. This suggests that neuroprosthetics could come to feel natural.

Their new study, to be published Sunday, March 4, in the advanced online publication of the journal Nature, shows that through a process called plasticity, parts of the brain can be trained to do something they normally do not do. The same brain circuits employed in the learning of motor skills, such as riding a bike or driving a car, can be used to master purely mental tasks, even arbitrary ones.

[…]

To clarify these issues, the scientists set up a clever experiment in which rats could only complete an abstract task if overt physical movement was not involved. The researchers decoupled the role of the targeted motor neurons needed for whisker twitching with the action necessary to get a food reward.

The rats were fitted with a brain-machine interface that converted brain waves into auditory tones. To get the food reward – either sugar-water or pellets – the rats had to modulate their thought patterns within a specific brain circuit in order to raise or lower the pitch of the signal.

Auditory feedback was given to the rats so that they learned to associate specific thought patterns with a specific pitch. Over a period of just two weeks, the rats quickly learned that to get food pellets, they would have to create a high-pitched tone, and to get sugar water, they needed to create a low-pitched tone.

If the group of neurons in the task were used for their typical function – whisker twitching – there would be no pitch change to the auditory tone, and no food reward.

“This is something that is not natural for the rats,” said Costa. “This tells us that it’s possible to craft a prosthesis in ways that do not have to mimic the anatomy of the natural motor system in order to work.”
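The reward scheme in this experiment amounts to a simple mapping from ensemble activity to pitch, with rewards at the pitch extremes. A minimal sketch, in which the base pitch, step size, and reward thresholds are all invented for illustration:

```python
def bmi_tone(rate_up, rate_down, base_pitch=440.0, step=20.0):
    """Map the firing rates of two neuron ensembles to an audible pitch:
    more activity in the 'up' ensemble raises the feedback tone, more in
    the 'down' ensemble lowers it."""
    return base_pitch + step * (rate_up - rate_down)

def reward(pitch, high=600.0, low=280.0):
    """A high-pitched tone earns a food pellet, a low-pitched tone earns
    sugar water; pitches in between earn nothing."""
    if pitch >= high:
        return "pellet"
    if pitch <= low:
        return "sugar water"
    return None
```

For example, `reward(bmi_tone(10, 1))` returns `"pellet"` and `reward(bmi_tone(1, 10))` returns `"sugar water"`; the rats' task was to discover thought patterns that push the tone to either extreme.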





Simulated brain scores top test marks

First computer model to produce complex behaviour performs almost as well as humans at simple number tasks.

    Ed Yong

Nature | News

29 November 2012

http://www.nature.com/news/simulated-brain-scores-top-test-marks-1.11914


A computer simulated brain with 2.5 million virtual neurons can perform simple mathematical calculations.




Mind-controlled robot arms show promise

People with tetraplegia use their thoughts to control robotic aids.

    Alison Abbott

Nature | News

16 May 2012

http://www.nature.com/news/mind-controlled-robot-arms-show-promise-1.10652

[AP Report here]

Two tetraplegic patients use a brain machine interface to regain some lost abilities.

“Neurosurgeons implanted tiny recording devices containing almost 100 hair-thin electrodes in the motor cortex of their brains, to record the neuronal signals associated with intention to move.” (Abbott)

Cathy can use her thoughts to direct the motion of a robotic arm, steering it to grab a bottle of coffee and lift it to her lips. Bob operates the arm successfully as well. Another subject uses the interface to operate a computer cursor, as if moving a computer mouse. The subjects used the BrainGate2 brain implant system [image below from the BrainGate wiki page].

[Image: BrainGate2 implant system model, from Wikipedia]




Paralyzed Man Uses Thoughts Alone to Control Robot Arm, Touch Friend's Hand, After Seven Years

Science Daily

Feb. 8, 2013 —

http://www.sciencedaily.com/releases/2013/02/130208124818.htm

Based on this journal article

Wei Wang et al.

An Electrocorticographic Brain Interface in an Individual with Tetraplegia. PLoS ONE, 2013; 8 (2): e55344

http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0055344


Researchers at the University of Pittsburgh School of Medicine and UPMC describe in PLoS ONE how an electrode array sitting on top of the brain enabled a 30-year-old paralyzed man to control the movement of a character on a computer screen in three dimensions with just his thoughts. It also enabled him to move a robot arm to touch a friend's hand for the first time in the seven years since he was injured in a motorcycle accident. (Science Daily)


ALSO reported by AP

Paralyzed Man Uses Mind-Powered Robot Arm To Touch
Tim Hemmes

By Lauran Neergaard  

10/10/11 10:04 AM ET  

AP

http://www.huffingtonpost.com/2011/10/10/mind-powered-robot-arm_n_1003204.html


"It wasn't my arm but it was my brain, my thoughts. I was moving something," Hemmes says. (Neergaard)




Bionic Eye Implant Approved by U.S. for Rare Disease
By Anna Edney

Bloomberg

Feb 15, 2013 7:01 AM GMT+0200

http://www.bloomberg.com/news/2013-02-14/bionic-eye-implant-approved-by-u-s-for-rare-disease.html


New neuroprosthetic eye implant restores some visual capabilities.

While the $100,000-plus system won’t restore sight, it gives patients the ability to perceive the difference between light and dark. The device consists of a video camera, a transmitter mounted on a pair of eyeglasses and a processing unit that transforms images into electronic data sent to an implanted retinal prosthesis, the FDA said.
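The processing chain the FDA describes (camera image in, per-electrode stimulation data out) can be sketched as a coarse downsampling step. The grid size and grayscale averaging below are illustrative assumptions, not the device's actual algorithm:

```python
def frame_to_electrodes(frame, grid_rows=6, grid_cols=10):
    """Downsample a grayscale camera frame (a 2-D list of 0-255 values)
    to one stimulation level per retinal electrode by averaging the
    pixels that fall in each electrode's patch of the image."""
    h, w = len(frame), len(frame[0])
    levels = []
    for r in range(grid_rows):
        row = []
        for c in range(grid_cols):
            patch = [frame[y][x]
                     for y in range(r * h // grid_rows, (r + 1) * h // grid_rows)
                     for x in range(c * w // grid_cols, (c + 1) * w // grid_cols)]
            row.append(sum(patch) / len(patch))
        levels.append(row)
    return levels
```

At this resolution a user perceives coarse light-dark structure rather than detailed images, which is consistent with the abilities reported: distinguishing light from dark, detecting curbs, and recognizing large letters.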

[…]

Konstantopoulos, of Glen Burnie, Maryland, said he was diagnosed with retinitis pigmentosa when he was in his early 40s and became completely blind about six months ago. He can see shadows now with the device and tell if the sun is behind a tree. Argus II is comfortable and the surgery was painless, he said.

[…]

A clinical study of 30 people showed the eye device helped patients recognize large letters or words, detect street curbs, walk on a sidewalk without falling and match black, gray and white socks.




Rats With Linked Brains Work Together
Megan Gannon, News Editor

Live Science

Date: 28 February 2013 Time: 12:23 PM ET

http://www.livescience.com/27544-rats-with-linked-brains-work-together.html


Brain plasticity is so great that a brain can use information received directly from another brain.

Scientists have engineered something close to a mind meld in a pair of lab rats, linking the animals' brains electronically so that they could work together to solve a puzzle. And this brain-to-brain connection stayed strong even when the rats were 2,000 miles apart.

The experiments were undertaken by Duke neurobiologist Miguel Nicolelis, who is best known for his work in making mind-controlled prosthetics.

"Our previous studies with brain-machine interfaces had convinced us that the brain was much more plastic than we had thought," Nicolelis explained. "In those experiments, the brain was able to adapt easily to accept input from devices outside the body and even learn how to process invisible infrared light generated by an artificial sensor. So, the question we asked was, if the brain could assimilate signals from artificial sensors, could it also assimilate information input from sensors from a different body?"

For the new experiments, Nicolelis and his colleagues trained pairs of rats to press a certain lever when a light went on in their cage. If they hit the right lever, they got a sip of water as a reward.

When one rat in the pair called the "encoder" performed this task, the pattern of its brain activity — something like a snapshot of its thought process — was translated into an electronic signal sent to the brain of its partner rat, the "decoder," in a separate enclosure. The light did not go off in the decoder's cage, so this animal had to crack the message from the encoder to know which lever to press to get the reward.

The decoder pressed the right lever 70 percent of the time, the researchers said.
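The encoder/decoder setup can be sketched as a noisy one-bit channel between the two brains. The `fidelity` parameter below is a stand-in for how reliably the decoder interprets the transmitted activity pattern (about 70 percent in the study); everything else is invented for illustration.

```python
import random

def brain_to_brain_trial(correct_lever, fidelity=0.70, rng=random):
    """One trial of the protocol: the encoder sees the light and presses
    the correct lever (0 or 1); a noisy reading of its brain activity is
    transmitted, and the decoder presses whichever lever the received
    signal indicates."""
    encoder_press = correct_lever               # the encoder performs the task
    if rng.random() < fidelity:
        signal = encoder_press                  # pattern decoded correctly
    else:
        signal = 1 - encoder_press              # pattern misread
    return signal == correct_lever              # was the decoder rewarded?
```

Over many trials the decoder's hit rate converges on the channel fidelity, matching the roughly 70 percent success rate reported.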

[…]

"We saw that when the decoder rat committed an error, the encoder basically changed both its brain function and behavior to make it easier for its partner to get it right," Nicolelis explained in a statement.

[…]

The connection was not lost even when the signals were sent over the Internet and the rats were placed on two different continents, 2,000 miles (3,219 kilometers) apart.





