  LongeCity
              Advocacy & Research for Unlimited Lifespans





Interpreting and Using Brainwaves


27 replies to this topic

#1 kevin

  • Member, Guardian
  • 2,779 posts
  • 822

Posted 13 October 2003 - 06:42 AM


Link: http://www.reuters.c...storyID=3600620
Additional Link to a Wired Article: http://www.wired.com.....51039,00.html
Date: 10-13-03
Author: Maggie Fox
Source: Reuters
Title: Monkey Think, Monkey Do Study May Help Paralyzed


Monkey Think, Monkey Do Study May Help Paralyzed
Mon October 13, 2003 12:05 AM ET
By Maggie Fox, Health and Science Correspondent


WASHINGTON (Reuters) - Dr. Miguel Nicolelis knew he had nailed it when the monkey stopped using her arm to play the computer game.

An implanted device had allowed the monkey to control the game using only her thoughts, Nicolelis and colleagues report in the Public Library of Science Biology journal on Monday.

And changes in the way the monkey's brain cells worked suggested the brain was physically adjusting to the device, they reported in the new online science journal.

Nicolelis hopes the device will eventually allow paralyzed patients to regain some ability to use their upper bodies -- virtually, if not physically.

"The monkey suddenly realized that she didn't need to move her arm at all," Nicolelis said in a statement.

"Her arm muscles went completely quiet, she kept the arm at her side and she controlled the robot arm using only her brain and visual feedback."

Three years ago, Nicolelis and colleagues at Duke University in North Carolina reported that they had allowed a monkey to move a robotic arm using only her thoughts and implanted electrodes. But the monkey continued to move her arm.

In the latest experiment, they said two monkeys figured out what was happening and played a computer game using thoughts alone.

AURORA AND IVY

"It's very different because these animals now receive feedback information," Nicolelis added in a telephone interview.

"They could learn to correct their errors and achieve a very high level of proficiency, using brain activity alone to reproduce reaching and grabbing hand movements."

Nicolelis and colleagues first implanted microelectrodes -- each smaller than the diameter of a human hair -- into the brains of two female rhesus macaque monkeys named Aurora and Ivy.

One got 96 electrodes in her frontal and parietal lobes -- known to be the source of commands for muscular movement. The second monkey got 320 implants.

The electrodes transmit faint signals to a computer system the researchers have developed to recognize patterns of signals that represent particular movements by an animal's arm. These signals are translated and in turn control a robotic arm.
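To make that translation step concrete, here is a minimal sketch (in Python, with synthetic data standing in for real recordings) of how binned firing rates can be mapped to arm positions with a linear model fit by least squares. This is illustrative only, not the Duke group's actual algorithm; all names and numbers are assumptions.

```python
import numpy as np

# Hypothetical sketch of a linear neural decoder -- not the Duke lab's code.
# spike_counts: firing rates binned at ~100 ms, shape (n_bins, n_electrodes)
# hand_xy: simultaneously recorded hand position, shape (n_bins, 2)
rng = np.random.default_rng(0)
n_bins, n_electrodes = 3000, 96          # 96 electrodes, as in the article
true_w = rng.normal(size=(n_electrodes, 2))
spike_counts = rng.poisson(5.0, size=(n_bins, n_electrodes)).astype(float)
hand_xy = spike_counts @ true_w + rng.normal(scale=2.0, size=(n_bins, 2))

# Calibration: fit weights so that hand_xy is approximated by spike_counts @ W
W, *_ = np.linalg.lstsq(spike_counts, hand_xy, rcond=None)

# At run time, each new bin of firing rates yields a predicted position
# that can drive the robotic arm instead of the real one.
new_bin = rng.poisson(5.0, size=(1, n_electrodes)).astype(float)
print(new_bin @ W)
```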

At first the animals were taught to use a joystick to control the cursor of a video game -- which Nicolelis said they enjoyed playing. The researchers recorded and analyzed the electrical activity of the neurons near the implanted electrodes.

As the game became more complex, the monkeys learned how to control the cursor.

The group has started working with a small group of human patients, but Nicolelis said he could not give any details yet.

Potential users include 200,000 people in the United States alone who have partial or nearly total permanent paralysis. An estimated 11,000 people a year suffer severe spinal cord injuries, for instance, and sufferers of Parkinson's disease and amyotrophic lateral sclerosis, also called ALS or Lou Gehrig's disease, may also become paralyzed.

"We hope the brain will learn to adapt to the devices and incorporate them as if they were the patient's own limbs," Nicolelis said.

His team is working to miniaturize the device so it can be useful to a human patient outside a laboratory setting.

"There is certainly a great deal of science and engineering to be done to develop this technology and to create systems that can be used safely in humans," he added.

"However, the results so far lead us to believe that these brain-machine interfaces hold enormous promise for restoring function to paralyzed people."

#2 chubtoad

  • Life Member
  • 976 posts
  • 5
  • Location:Illinois

Posted 13 October 2003 - 08:20 AM

Interesting, I've heard of similar things before but never to this extent.


#3 chubtoad

  • Life Member
  • 976 posts
  • 5
  • Location:Illinois

Posted 15 October 2003 - 12:51 AM

Here is a really detailed report on it. Look at the pictures and the graphs, or read it if you have a lot of spare time.
http://www.plos.org/...-02-carmena.pdf

#4 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 24 March 2004 - 04:21 AM

link: http://www.eurekaler...c-hss032304.php

Public release date: 23-Mar-2004

Contact: Dennis Meredith
dennis.meredith@duke.edu
919-681-8054
Duke University Medical Center



Human studies show feasibility of brain-machine interfaces


DURHAM, N.C. -- In their first human studies of the feasibility of using brain signals to operate external devices, researchers at Duke University Medical Center report that arrays of electrodes can provide usable signals for controlling such devices. The research team is now working to develop prototype devices that may enable paralyzed people to operate "neuroprosthetic" and other external devices using only their brain signals.

While the new studies provide an initial proof of principle that human application of brain-machine interfaces is possible, the researchers emphasize that many years of development and clinical testing will be required before such neuroprosthetic devices are available.

The research team, led by neurosurgeon and professor of neurobiology Dennis Turner, M.D., and neurobiologist Miguel Nicolelis, M.D., will publish their results in the July 2004 issue of the journal Neurosurgery. Principal members of the research team also include Parag Patil, M.D., a resident in neurosurgery and lead author of the study, and Jose Carmena, Ph.D., a post-doctoral fellow in neurobiology. The research was supported by the Defense Advanced Research Projects Agency and the National Institutes of Health.

The research builds on earlier studies in the Nicolelis laboratory, in which monkeys learned to control a robot arm using only their brain signals.

In the initial human studies, Patil and colleagues recorded electrical signals from arrays of 32 microelectrodes, during surgeries performed to relieve the symptoms of Parkinson's disease and tremor disorders. These surgical procedures routinely involve implanting electrodes into the brain and then stimulating the brain with small electrical currents to relieve the patient's symptoms. The patients are awake during surgery, and the neurosurgeons typically record brain signals to ensure that permanent electrodes are placed into the optimal location in the brain.

In the experiments being reported in Neurosurgery, the researchers added a simple manual task to the surgical procedure. While brain signals were recorded using the novel 32-channel electrode array, the 11 volunteer patients were asked to play a hand-controlled video game.

In subsequent analysis of the signals from these experiments, the team found that the signals contained enough information to be useful in predicting the hand motions. Such prediction is a necessary prerequisite for reliably using neural signals to control external devices.
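As a rough illustration of such an offline analysis (a sketch under assumed numbers -- 32 channels, a few minutes of synthetic data -- not the published model), one can fit a linear predictor on the early part of a session and score it on the rest with a correlation coefficient:

```python
import numpy as np

# Hedged sketch: scoring offline predictions of hand motion from neural data.
# X: neural features (n_samples, 32 channels); y: a 1-D hand trajectory.
rng = np.random.default_rng(1)
X = rng.normal(size=(600, 32))
y = X @ rng.normal(size=32) + rng.normal(scale=0.5, size=600)

split = 400                                  # train on early data, test on late
W, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
y_hat = X[split:] @ W

r = np.corrcoef(y_hat, y[split:])[0, 1]      # a common decoding metric
print(f"prediction correlation r = {r:.2f}")
```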

"Despite the limitations on the experiments, we were surprised to find that our analytical model can predict the patients' motions quite well," said Nicolelis. "We only had five minutes of data on each patient, during which it took a minute or two to train them to the task. This suggests that as clinical testing progresses, and we use electrode arrays that are implanted for a long period of time, we could achieve a workable control system for external devices," he said.

While other researchers have demonstrated that individually implanted electrodes can be used to control a cursor on a computer screen, complex external devices would require data from large arrays of electrodes, said the Duke researchers.

According to Nicolelis, another major difference between the initial human studies and the monkey studies is that recordings in the human patients were made from electrodes inserted deeper into the brain, in subcortical structures, rather than at the cortical surface.

"This shows that one can extract information not only from cortical areas, but from subcortical ones, too," said Nicolelis. "This suggests that in the future, there will be more options for sampling neuronal information to control a prosthetic device," he said.

According to Turner, the progression to human clinical studies presents a number of challenges. For example, he said, the data with monkeys were obtained from electrodes attached to the surface of the cerebral cortex.

"We initially used subcortical electrodes, because they are more stable because they are buried deeper," said Turner. Also, he said, the deeper regions present other advantages. "The way the brain works, all the signals for motor control are filtered through these deep regions of the brain before they reach the final cortical output," he said. "So, they are theoretically easier to record from than cortical areas. The subcortical areas are also denser, which means there are more cells to record from in a smaller area.

Working with Duke biomedical engineers, the research team is currently developing the initial prototype of a neuroprosthetic device that will include a wireless interface between the patient and the device.

According to Turner, while the most obvious application of such technology would be a robot arm for a quadriplegic, he and his colleagues are planning other devices as well. One would be a neurally controlled electric wheelchair, and another a neurally operated keyboard, whose output could include either text or speech. Such devices could help both paralyzed people and those who have lost speech capabilities because of stroke or amyotrophic lateral sclerosis (Lou Gehrig's disease).

A key question in future clinical studies will be whether humans can incorporate such devices into their "schema," or neural representation of the external world, said Turner. The monkeys in Nicolelis' studies appeared to do just that.

"We do know that for all kinds of motor training, such as riding a bicycle, people incorporate an external device into their schema, and the process becomes subconscious," he said. "We will build on that phenomenon in our human studies. It's known, for example, that patients who don't have use of their arm still show in MRI studies that the control centers in the brain are working normally. When they are asked to imagine moving their arm, the control centers become active. So, we have good hope that the neurons in those centers can still provide the same signals, even though the arm isn't physically working."

As their next major step, said Turner, the researchers have already applied for federal approval to begin implanting experimental electrode arrays long-term in quadriplegic patients. Such tests, conducted over the next three to five years, would involve implanting the arrays in specific regions, asking the patients to perform specific tasks and then exploring which tasks are optimally controlled by that region.


###

#5 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 24 March 2004 - 08:27 AM

wonderful progress on the neurobiology, neuroprosthetic front! thanks Kevin

#6 JonesGuy

  • Guest
  • 1,183 posts
  • 8

Posted 25 March 2004 - 06:11 AM

Monkey Mind Control
Brain Computer Interfaces - Another Option

from Quirks and Quarks.
The first two links are audio articles about this topic. I've already burnt the articles to CD and lent them to a couple of friends. I find the site quite useful, because my driving time is better spent if I'm learning on the side.

#7 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 13 April 2004 - 09:37 PM

Link: http://www.nytimes.c...ml?pagewanted=1




With Tiny Brain Implants, Just Thinking May Make It So


By ANDREW POLLACK

Published: April 13, 2004

[Photo: Dr. John P. Donoghue founded Cyberkinetics, which hopes to implant devices in the brain that will allow injured patients to operate computers by thought alone.]
Can a machine read a person's mind? A medical device company is about to find out.

The company, Cyberkinetics Inc., plans to implant a tiny chip in the brains of five paralyzed people in an effort to enable them to operate a computer by thought alone.

The Food and Drug Administration has given approval for a clinical trial of the implants, according to the company.

The implants, part of what Cyberkinetics calls its BrainGate system, could eventually help people with spinal cord injuries, strokes, Lou Gehrig's disease or other ailments to communicate better or even to operate lights and other devices through a kind of neural remote control.

"You can substitute brain control for hand control, basically," said Dr. John P. Donoghue, chairman of the neuroscience department at Brown University and a founder of Cyberkinetics, which hopes to begin the trial as early as next month.

The melding of man and machine has long been a staple of science fiction. Indeed, the participants in Cyberkinetics's clinical trial, who have not yet been chosen, will have a cable sticking out of their heads to connect them to computers, making them look something like characters in "The Matrix."

But in real life, several research groups have already implanted devices in monkeys that allow them to control cursors on computer screens or move robot arms using their brainpower alone, setting the stage for the trial in people.

"Among many people in the field, there's a feeling now that the time is here for moving the technology to test in humans," said Dr. Richard A. Andersen, professor of neuroscience at the California Institute of Technology, who is working on his own device for the brain. Still, for the trial, there is trepidation mixed with anticipation.

"A disaster at this early stage could set the whole field back," said Dr. Dawn M. Taylor, a research associate at Case Western Reserve University and the Cleveland Veterans Affairs Medical Center, who is testing similar systems in monkeys.

Devices have long been implanted in the brains of patients with Parkinson's disease to deliver pulses of electricity that reduce tremors and rigidity.

But systems like BrainGate do not deliver current.

Instead, they listen to the electrical signals produced by the brain's neurons as they work. The aim is to discern a pattern of neuronal activity indicating the intention to initiate a particular physical movement.

In typical monkey trials of neural implants, the animals, which are not paralyzed, are trained to perform a task, like moving a cursor with a joystick, while a tiny subset of their neurons is monitored.

After different patterns of neuronal signals are matched with different body movements, cursor control is shifted to their brains.

In some studies, the monkeys eventually appeared to realize that they no longer had to move their arms to perform the tasks.

In a sense, this is a form of mind reading, scientists say. But in addition to passively letting its thoughts be read, the brain also learns to control the cursor actively, just as it acquires any new skill.

The quadriplegics in the trial will not be able to move their arms to train the system, making things a little harder. Instead, they must imagine moving their arms.

Researchers have already shown that this can be done. Dr. Philip Kennedy, a neurologist in Atlanta who started Neural Signals Inc., implanted electrodes into several severely disabled people starting in 1996, and at least one could type through this method, though only three words a minute.

Some other implants have been tested briefly on people undergoing brain surgery for other reasons. Dr. Jonathan R. Wolpaw of the New York State Department of Health has developed a system that does not require implants but uses electroencephalography to pick up brain waves using sensors attached to the scalp.

Though Cyberkinetics is not the first to try neural control in people, it seems the most intent on bringing a product to market, perhaps by 2007 or 2008, said its chief executive, Timothy R. Surgenor.

Started in 2001 and based in Foxborough, Mass., the company has raised $9 million for the project.

Cyberkinetics argues that its system will perform better than other systems tested on people so far. Devices that use sensors outside the skull do not pick up signals as clearly as electrodes under the skull. And though Dr. Kennedy implanted two electrodes per patient, the Cyberkinetics chip has 100 electrodes. That means more neurons can be monitored, providing clearer information, the company says.

To implant the chip, a small hole will be cut in the patient's skull, above the ear. The chip, which measures about 2 millimeters (or just under one-tenth of an inch) square, will be placed on the surface of the brain at the motor cortex, which controls movement.

The electrodes, which are like spikes protruding from the chip's surface, will extend into the brain to a depth of 1 millimeter.

The surgery will be performed at Rhode Island Hospital by Dr. Gerhard Friehs, an associate professor of neuroscience at Brown and a co-founder of Cyberkinetics, who performed the operations on the monkeys. Another neurosurgeon without connection to the company will monitor the procedure to ensure that financial interests do not dictate proceeding with surgery if it is not safe.

Technicians from Cyberkinetics will later visit the participants, whose identities will not be disclosed at first, several times a week at their homes to test the system for an hour or two a day. The trial will last about a year, and then the chips will be removed in a second operation.

Some scientists question whether the benefits will outweigh the risks.

One reason is that the signals from the chip are carried out of the body by wires coming through the skull. When the system is to be used, a cable will be connected to the wires.

The cable will carry the signals to a cart full of electronic equipment that will analyze them and convey the results to the computer. The opening in the skin is permanent and poses a risk of infection.

"We don't like to hang around with wires coming out of our head," Dr. Kennedy said. His system used brain implants that transmitted signals by radio, so no break in the skin was required.

Some experts also question Cyberkinetics's requirement that participants in the trial be able to talk. Such people, the experts say, can control computers through other options, like speech recognition systems or head or eye movements.

"If you are only talking about moving a cursor up and down on the screen, you don't need to get into the brain to do that," said Dr. Miguel Nicolelis, professor of neurobiology at Duke.

In contrast, Dr. Nicolelis said, the system he is developing will control a robot arm, making three-dimensional movements that will be too complex to do without a neural implant. Dr. Kennedy performed his experiments on people who could not talk and had virtually no other means of communicating.

Mr. Surgenor of Cyberkinetics, however, said that having participants who could talk would speed development of the system.

"We need the feedback of what they are imagining when it doesn't work and what they are imagining when it works," he said.

Dr. Jon Mukand of the Sargent Rehabilitation Center in Warwick, R.I., who will be the principal investigator of the trial and will select the participants, said the system had been proved safe in tests on 18 monkeys.

Infections were rare and treatable, Dr. Mukand said, and the incidence should be even lower in people, who understand the risk.

Dr. Donoghue of Cyberkinetics said the prototypes for many medical devices, including pacemakers and cochlear implants, had involved wires coming out of the body.

One uncertainty is whether the implants will move around over time or cause scarring. Either could lead to loss of the neuron signal.

Another question is whether the system, if it does work, will prove superior to more mundane methods like voice recognition or even Dr. Wolpaw's electroencephalography system.

Cyberkinetics said it expected that its system would be faster than other methods, perhaps allowing people to type 20 to 30 words a minute, as fast as a healthy person could type on a BlackBerry hand-held computer.

Marcie Roth, executive director of the National Spinal Cord Injury Association, said the needs of paralyzed people varied. A system like BrainGate "could be fantastically useful for some people," she said.

Cyberkinetics said that it did not intend to sell the current version of BrainGate, but that it hoped eventually to offer an improved model. It would use miniaturized electronics and would be fully implanted in the brain, transmitting information without wires. The price is likely to be in the tens of thousands of dollars, the company said.

Eventually, the system may come in a form that does not have to be calibrated by technicians each time it is turned on. It would always be ready to use, as if it were a part of the body.

"You don't wake up and turn on your hand," Mr. Surgenor said.

#8 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 16 April 2004 - 06:27 AM

Link: http://www.rednova.c...4/story001.html


The FDA Approves Brain Implants
By JUSTIN POPE


[Photo: Maryam Saleh, a Cyberkinetics Inc. clinical research assistant, holds a BrainGate sensor, which will be implanted beneath the skull.]


BOSTON (AP) -- For years, futurists have dreamed of machines that can read minds, then act on instructions as they are thought. Now, human trials are set to begin on a brain-computer interface involving implants.

Cyberkinetics Inc. of Foxboro, Mass., has received Food and Drug Administration approval to begin a clinical trial in which four-square-millimeter chips will be placed beneath the skulls of paralyzed patients.

If successful, the chips could allow patients to command a computer to act - merely by thinking about the instructions they wish to send.

It's a small, early step in a mission to improve the quality of life for victims of strokes and debilitating diseases like cerebral palsy or Lou Gehrig's. Many victims of such ailments can now survive for long periods thanks to life support, but their quality of life is poor.

"A computer is a gateway to everything else these patients would like to do, including motivating your own muscles through electrical stimulation," said Cyberkinetics chief executive Tim Surgenor. "This is a step in the process."

The company is far from the only research group active in the field. An Atlanta company, Neural Signals, has conducted six similar implants as part of a clinical trial and hopes to conduct more. But for now, its device contains relatively simple electrodes, and experts say Cyberkinetics will be the first to engage in a long-term, human trial with a more sophisticated device placed inside a patient's brain. It hopes to bring a product to market in three to five years.

A number of research groups have focused on brain-computer links in recent years.

In 1998, Neural Signals researchers said a brain implant let a paralyzed stroke victim move a cursor to point out phrases like "See you later. Nice talking with you" on a computer screen. The next year, other scientists said electrodes on the scalp of two Lou Gehrig's disease patients let them spell messages on a computer screen.

Cyberkinetics founder Dr. John Donoghue, a Brown University neuroscientist, attracted attention with research on monkeys that was published in 2002 in the journal Nature.

[Photo: Timothy Surgenor, president and CEO of Cyberkinetics Inc., poses with a brain model at his company in Foxboro, Mass.]


Three rhesus monkeys were given implants, which were first used to record signals from their motor cortex - an area of the brain that controls movement - as they manipulated a joystick with their hands. Those signals were then used to develop a program that enabled one of the monkeys to continue moving a computer cursor with its brain.

The idea is not to stimulate the mind but rather to map neural activity so as to discern when the brain is signaling a desire to make a particular physical movement.

"We're going to say to a paralyzed patient, 'imagine moving your hand six inches to the right,'" Surgenor said.

Then, he said, researchers will try to identify the brain activity associated with that desire. Someday, that capacity could feed into related devices, such as a robotic arm, that help patients act on that desire.

It's misleading to say such technologies "read minds," said Dr. Jonathan Wolpaw, of the New York State Department of Health, who is conducting similar research. Instead, they train minds to recognize a new pattern of cause and effect, and adapt.

"What happens is you provide the brain with the opportunity to develop a new skill," he said.

Moving the experiment from monkeys to humans is a challenge. Cyberkinetics' "Brain Gate" contains tiny spikes that will extend down about one millimeter into the brain after being implanted beneath the skull, monitoring the activity from a small group of neurons.

The signals will be monitored through wires emerging from the skull, which presents some danger of infection. The company is working on a wireless version.

But Richard Andersen, a Cal Tech expert conducting similar research, said the field is advanced enough to warrant this next step.

"I think there is a consensus among many researchers that the time is right to begin trials in humans," Andersen said, noting that surgeons are already implanting devices into human brains - sometimes deeply - to treat deafness and Parkinson's disease. "There is always some risk but one considers the benefits."

Wolpaw said it isn't clear that it's necessary to implant such devices inside the brain; other technologies that monitor activity from outside the skull may prove as effective. But, he said, the idea of brain implants seems to attract more attention.

"The idea that you can get control by putting things into the brain appears to have an inherent fascination," he said.

Andersen, however, said that for now devices inside the brain provide the best information.

"It would be nice if in the future some technology comes along that would let you non-invasively record form the brain," he said. "MRIs do that. But unfortunately, it's very expensive and cumbersome, and the signal is very indirect and slow."

-----

#9 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 12 May 2004 - 02:02 AM

Link: http://physicsweb.or...icle/news/8/5/5

More progress in determining what's happening under the hood without the need for implants is promising. With new high-temp superconducting materials likely to appear in the near future, the technology described below might become commonplace at your local clinic... or police station.



Brain scans made easy
11 May 2004

Magnetic measurements of brain activity could be free from noise in the future thanks to a new helmet-like device developed by medical physicists at the Los Alamos National Laboratory in the US. Magnetoencephalography (MEG) is the only technique that can directly measure neuronal activity in the brain, but it is plagued by background noise that interferes with signals from the brain itself. The new helmet could provide much more accurate information on brain function (P Volegov et al. 2004 Phys. Med. Biol. 49 2117).

MEG is a non-invasive technique that provides detailed information on the brain in almost real time by using superconducting quantum interference device (SQUID) sensors to measure the magnetic fields generated by currents flowing in and around neurons. However, these magnetic field signals are extremely weak -- typically between about 10^-14 and 10^-13 tesla -- and are therefore easily overwhelmed by background magnetic noise. Although various techniques exist to reduce this noise, none are entirely satisfactory because they can also reduce the size of the signals produced by the brain itself.

The helmet designed by the Los Alamos team is made from a layer of superconducting lead and is placed around the SQUID sensors. The helmet needs to be kept at temperatures below 8 kelvin -- in a liquid helium cryostat -- for the lead to be superconducting. The device works on the principle that Meissner currents flow on the surface of the superconductors in the helmet. These currents expel magnetic flux, preventing any external magnetic fields from penetrating the helmet. Moreover, unlike previous methods, the helmet can be placed close to the head without affecting signals produced by the brain.

The scientists have already tested their helmet on real patients and say that background noise signals can be reduced by more than six orders of magnitude, making it the most effective system to date. However, the device still needs to be improved because noise levels are still relatively high around the brim.
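For a sense of scale (a back-of-the-envelope check, not a figure from the paper), six orders of magnitude is 120 dB of shielding, enough to push a plausible ambient magnetic noise level down toward the brain signals themselves:

```python
import math

# Back-of-the-envelope check on the shielding figure quoted above.
attenuation_factor = 1e6                 # "more than six orders of magnitude"
attenuation_db = 20 * math.log10(attenuation_factor)

ambient_noise_t = 1e-7                   # ~100 nT of ambient noise (assumed)
residual_t = ambient_noise_t / attenuation_factor
print(f"{attenuation_db:.0f} dB of shielding; residual noise ~ {residual_t:.0e} T,")
print("comparable to the 10^-14 to 10^-13 T brain signals being measured")
```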

Author
Belle Dumé is Science Writer at PhysicsWeb

#10 chubtoad

  • Life Member
  • 976 posts
  • 5
  • Location:Illinois

Posted 09 June 2004 - 10:27 PM

http://news-info.wus...normal/911.html

Human subjects play mind games

'Look, Ma, no hands!'
By Tony Fitzpatrick
June 9, 2004 — That's using your brain.
For the first time in humans, a team headed by researchers at Washington University in St. Louis has placed an electronic grid atop patients' brains to gather motor signals that enable patients to play a computer game using only the signals from their brains.

WUSTL researchers have shown it is possible to play video games with your brain, not your hands.
The use of a grid atop the brain to record brain-surface signals is a brain-machine interface technique that uses electrocorticographic (ECoG) activity -- data taken invasively, right from the brain surface. It is an alternative to the status quo, frequently used in studying humans, called electroencephalographic (EEG) activity -- data taken non-invasively by electrodes placed outside the brain, on the skull.

The breakthrough is a step toward building biomedical devices that can control artificial limbs -- one day, for instance, enabling the disabled to move a prosthetic arm or leg by thinking about it. The study was published in the June 8, 2004 issue of the Journal of Neural Engineering and was partially funded by the National Institutes of Health (NIH).

Eric C. Leuthardt, M.D., a physician in the Department of Neurological Surgery, Barnes-Jewish Hospital, and Daniel Moran, Ph.D., assistant professor of biomedical engineering, performed their research on four adult epilepsy patients who had the grids implanted so that neurologists could find the area of the brain serving as the focus for an epileptic seizure, with hopes of removing it to avoid future seizures. To do this, the patients and their doctors must wait for a seizure.

With approval of the patients and the Washington University School of Medicine Institutional Review Board, Leuthardt and Moran connected the patients to a sophisticated computer running a special program known as BCI2000 (developed by their collaborators at the Wadsworth Center), which involves a video game that is linked to the ECoG grid. They then asked the patients to do various motor and speech tasks -- moving their hands various ways, talking, and imagining -- and could see from the data which parts of the brain correlate to these movements.

They then asked the patients to play a simple, one-dimensional computer game involving moving a cursor up or down toward one of two targets. The patients were asked to imagine various movements, or to imagine saying the word "move," but not to actually perform the movements with their hands or speak any words aloud. When they saw the cursor in the video game, they then controlled it with their brains.

"We closed the loop," said Moran. "After a brief training session, the patients could play the game by using signals that come off the surface of the brain. They achieved between 74 and 100 percent accuracy, with one patient hitting 33 out of 33 targets correctly in a row."



#11 lightowl

  • Guest, F@H
  • 767 posts
  • 5
  • Location:Copenhagen, Denmark

Posted 10 June 2004 - 04:58 AM

I think this is the first step toward cyborgizing humans. When these non-invasive brain joysticks are given to children of all ages, they will certainly begin to use the technology to communicate, play and fight as children do, with an open mind. This brain output, in convergence with skin sensation and visual input, will make children look like aliens to their parents, much more than children with their mobile phones and game computers seem like aliens today. The apparent ESP abilities of those children would make them seem superhuman, with the ability to hold conversations without talking and play without moving. In a few decades I think the teenagers of that generation will start gathering brain-power in a previously unseen way. Mind communities able to combine thought to invent could be the first step toward Bio-AI with the human mind as the enabling processor.

#12

  • Lurker
  • 0

Posted 10 June 2004 - 11:02 AM

ECoG requires an implant? This is not feasible, in terms of upgradability and the high degree of risk associated with such invasive surgery. If we're going to start implanting devices, they need to be well tested and do far more than control motion in one dimension.

A little off topic, but I've always wondered whether the brain can take part in distributed computing with other computers located near it. For instance, with extremely high data throughput the brain and surrounding computers could work seamlessly. For the duration of the time they are connected, the consciousness of the person may exist in both brain and computer, or so it seems. For this to happen, the approximation of certain activity will not be enough; there will have to be some sort of consistent, extensive neuronal connection.

#13 lightowl

  • Guest, F@H
  • 767 posts
  • 5
  • Location:Copenhagen, Denmark

Posted 10 June 2004 - 03:46 PM

This is not feasible, in terms of upgradability

We may want to create some kind of interfacing implant that doesn't need to be upgraded with every advance in cognitive devices. Only that interface would be directly connected to the brain. Of course, that would require some kind of implanted plug. Removing the device connected via that plug would, I think, be like temporarily removing a limb or a cognitive ability like speech or orientation, although not restricted to one ability.

I think external reading of activity is still feasible for some extent of control and communication, not only involving axis control. Non-invasive interfacing will certainly be easier for the general public to accept in the beginning. If a cheap and generally applicable device were offered to the public, it would not take long before the world's computer freaks started developing software for their own amusement. Then it is only a question of time before a boost, like the one the internet was subject to, will commence.

Edited by lightowl, 13 June 2004 - 12:14 AM.


#14 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 16 June 2004 - 05:04 AM

Link: http://www.scienceda...40615075143.htm


New Technique Developed For Deciphering Brain Recordings Can Capture Thinking As It Happens
A team led by University of California San Diego neurobiologists has developed a new approach to interpreting brain electroencephalograms, or EEGs, that provides an unprecedented view of thought in action and has the potential to advance our understanding of disorders like epilepsy and autism.

[Image: the brain with colored spheres indicating clusters of activity. (Photo credit: Scott Makeig)]
The new information processing and visualization methods that make it possible to follow activation in different areas of the brain dynamically are detailed in a paper featured on the cover of the June 15 issue of the journal Public Library of Science Biology (plos.org). The significance of the advance is that thought processes occur on the order of milliseconds -- thousandths of a second -- but current brain imaging techniques, such as functional magnetic resonance imaging and traditional EEGs, are averaged over seconds. This provides a "blurry" picture of how the neural circuits in the brain are activated, just as a picture of waves breaking on the shore would be a blur if it were created from the average of multiple snapshots.

"Our paper is the culmination of eight years of work to find a new way to parse EEG data and identify the individual signals coming from different areas of the brain," says lead author Scott Makeig, a research scientist in UCSD's Institute for Neural Computation of the Swartz Center for Computational Neuroscience . "This much more comprehensive view of brain dynamics was only made possible by exploiting recent advances in mathematics and increases in computing power. We expect many clinical applications to flow from the method and have begun collaborations to study patients with epilepsy and autism."

To take an EEG, recording electrodes--small metal disks--are attached to the scalp. These electrodes can detect the tiny electrical impulses nerve cells in the brain send to communicate with each other. However, interpreting the pattern of electrical activity recorded by the electrodes is complicated because each scalp electrode indiscriminately sums all of the electrical signals it detects from the brain and non-brain sources, like muscles in the scalp and the eyes.

"The challenge of interpreting an EEG is that you have a composite of signals from all over the brain and you need to find out what sources actually contributed to the pattern," explains Makeig. "It is a bit like listening in on a cocktail party and trying to isolate the sound of each voice. We found that it is possible, using a mathematical technique called Independent Component Analysis, to separate each signal or "voice" in the brain by just treating the voices as separate sources of information, but without other prior knowledge about each voice."

Independent component analysis, or ICA, looks at the distinctiveness of activity in each patch of the brain's cortex. It uses this information to determine the location of the patch and separate out the signals from non-brain sources. Because ICA can distinguish signals that are active at the same time, it makes it possible to identify the electrical signals in the brain that correspond to the brain telling the muscles to take an action -- which in the paper was deciding whether or not to press a button in response to an image flashed on a computer screen -- and to separate this signal from the signals the brain uses to evaluate the consequences of that action.
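A toy version of that cocktail-party separation can be run with any ICA implementation (here scikit-learn's FastICA, purely as a stand-in; the study's own methods are distributed in the EEGLAB software linked at the end of the article):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy cocktail party: three hidden "voices" summed into three "electrodes".
rng = np.random.default_rng(4)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * np.pi * 5 * t),           # a smooth rhythm
                np.sign(np.sin(2 * np.pi * 3 * t)),  # a square-wave "voice"
                rng.laplace(size=t.size)]            # a noise-like source

mixing = rng.normal(size=(3, 3))      # each electrode sums all three sources
electrodes = sources @ mixing.T       # what the scalp electrodes would record

# ICA recovers the independent sources with no prior knowledge of the mixing.
recovered = FastICA(n_components=3, random_state=0).fit_transform(electrodes)
print(recovered.shape)                # (2000, 3): one unmixed "voice" per column
```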

According to Makeig, UCSD was a leader in developing the earlier methods of interpreting EEGs forty years ago. "The new, more general 'ICA' method continues this tradition of UCSD excellence in cognitive electrophysiology research," he says.


The coauthors on the paper, in addition to Makeig, include Arnaud Delorme and Tzyy-Ping Jung, Swartz Center for Computational Neuroscience; Marissa Westerfield and Jeanne Townsend, UCSD's Department of Neurosciences; Eric Courchesne, Children's Hospital Research Center and UCSD's Department of Neurosciences; and Terrence Sejnowski, UCSD professor of biology and Howard Hughes Medical Institute professor at the Swartz Center for Computational Neuroscience and the Salk Institute for Biological Studies. The study was funded by the Swartz Foundation, the National Institutes of Health and the Howard Hughes Medical Institute.

Software for performing the EEG analysis is openly available at no cost at http://www.sccn.ucsd.edu/eeglab.



#15 rahein

  • Guest
  • 226 posts
  • 0

Posted 16 June 2004 - 01:36 PM

This study proves that there is little we cannot solve if we just throw enough computing power at it.

I will be really interested when we have that kind of computing power in a PDA-sized device and can wear wireless EEG nodes. Then we will have a noninvasive brain interface.

Does anyone know how much computing power it took them to do this in real time?

Dell just came out with a 667 MHz PDA with Bluetooth, and the next Game Boy will be dual-processor. I don't think we are too far away from having that kind of power.

#16 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 11 October 2004 - 08:29 AM

Link: http://www.usatoday....ate-cover_x.htm


Scientists gingerly tap into brain's power
By Kevin Maney, USA TODAY

FOXBOROUGH, Mass. — A 25-year-old quadriplegic sits in a wheelchair with wires coming out of a bottle-cap-size connector stuck in his skull.

The wires run from 100 tiny sensors implanted in his brain and out to a computer. Using just his thoughts, this former high school football player is playing the computer game Pong.

It is part of a breakthrough trial, the first of its kind, with far-reaching implications. Friday, early results were revealed at the American Academy of Physical Medicine and Rehabilitation annual conference. Cyberkinetics Neurotechnology Systems, the Foxborough-based company behind the technology, told attendees the man can use his thoughts to control a computer well enough to operate a TV, open e-mail and play Pong with 70% accuracy.

"The patient tells me this device has changed his life," says Jon Mukand, a physician caring for him at a rehabilitation facility in Warwick, R.I. The patient, who had the sensors implanted in June, has not been publicly identified.

The trial is approved by the FDA. Cyberkinetics has permission to do four more this year.

The significance of the technology, which Cyberkinetics calls Braingate, goes far beyond the initial effort to help quadriplegics. It is an early step toward learning to read signals from an array of neurons and use computers and algorithms to translate the signals into action. That could lead to artificial limbs that work like the real thing: The user could think of moving a finger, and the finger would move.

"It's Luke Skywalker," says John Donoghue, the neuroscientist who led development of the technology at Brown University and in 2001 founded Cyberkinetics.

The brain in control

Further out, some experts believe, the technology could be built into a helmet or other device that could read neural signals from outside the skull, non-invasively. The Defense Advanced Research Projects Agency (DARPA) is funding research in this field, broadly known as Brain Machine Interface, or BMI.

DARPA envisions a day when a fighter pilot, for instance, might operate some controls just by thinking.

BMI is a field about to explode. At Duke University, a research team has employed different methods to read and interpret neural signals directly from the human brain. Other research is underway at universities around the world. Atlanta-based Neural Signals — a pioneer in BMI for the handicapped — has also been developing a system for tapping directly into the brain.

To be sure, the technology today is experimental and crude, perhaps at a stage similar to the first pacemaker in 1950, which was the size of a boombox and delivered jolts through wires implanted in the heart.

The Cyberkinetics trial "is great," says Jeff Hawkins, author of On Intelligence, a book about the brain out this month. But measuring enough neurons to do complex tasks like grasp a cup or speak words isn't close to feasible today. "Hooking your brain up to a machine in a way that the two could communicate rapidly and accurately is still science fiction," Hawkins says.

Layered on all of the BMI research are ethical and societal issues about messing with the brain to improve people. But those, too, are a long way from the research happening now.

Monkeys chasing dots

The Cyberkinetics technology grew out of experiments with monkeys at Brown.

Donoghue and his research team implanted sensors in the brains of monkeys, and got them to play a simple computer game — chasing dots around a screen with a cursor using a mouse — to get a food reward. As the monkeys played, computers read signals from the sensors and looked for patterns. From the patterns, the team developed mathematical models to determine which signals meant to move left, right, up, down and so on. After a while, the team disconnected the mouse and ran the cursor off the monkeys' thoughts. It worked: The monkeys could chase the dots by thinking of what they'd normally do with their hands.

A driving concept is to make the computer control natural, so a patient doesn't have to learn new skills.

The reason it works has to do with a discovery made by neuroscientists in the 1990s. The billions of neurons in each region of the brain work on physical tasks like an orchestra, and each neuron is one instrument.

With an orchestra, if you listen to only a few of the instruments, you could probably pick up what song is being played, but you wouldn't get all its richness and subtlety. Similarly, scientists found that if you can listen to any random group of neurons in a region, you can decipher generally what the region is trying to do — but you wouldn't get the richness and subtlety that might let a person do complex tasks.

The more neurons you can listen to, the more precisely you can pick out the song.
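That scaling is easy to see in a toy population-vector simulation (a standard textbook decoding scheme, offered here only as an illustration; the cosine tuning model and noise level are invented):

```python
import numpy as np

# Toy illustration: estimating movement direction from cosine-tuned neurons.
# With more neurons "in the orchestra," the direction estimate sharpens.
rng = np.random.default_rng(2)
true_dir = np.deg2rad(40.0)

for n_neurons in (2, 10, 100):
    pref = rng.uniform(0, 2 * np.pi, n_neurons)   # preferred directions
    rates = np.cos(true_dir - pref) + rng.normal(scale=0.8, size=n_neurons)
    # Population vector: rate-weighted sum of preferred-direction unit vectors
    est = np.arctan2((rates * np.sin(pref)).sum(), (rates * np.cos(pref)).sum())
    err = np.rad2deg(abs(np.angle(np.exp(1j * (est - true_dir)))))
    print(f"{n_neurons:3d} neurons: direction error ~ {err:.1f} degrees")
```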

Cyberkinetics' big breakthrough is listening to up to 100 neurons at once and applying the computing power to make sense of that data almost instantly. The 100 sensors stick out from a chip the size of a contact lens. Through a hole in the skull, the chip is pressed into the cortex surface "like a thumbtack," Donoghue says.

Most of the sensors get near enough to a neuron to read its pattern of electrical pulses as they turn on and off, much like the 1s and 0s that are the basis for computing. Wires carry the signals out through a connector in the skull, and the computer does the rest.

Patient gaining accuracy

Cyberkinetics technicians work with the former football player three times a week, trying to fine-tune the system so he can do more tasks. He can move a cursor around a screen. If he leaves the cursor on a spot and dwells on it, that works like a mouse click.
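The dwell-to-click behavior is simple to state in code. A sketch follows; the radius and dwell time are invented thresholds, not values from the article: if the decoded cursor stays within a small radius for long enough, register a click.

```python
# Hedged sketch of dwell-based clicking; both thresholds are assumed.
DWELL_SECONDS = 1.0
RADIUS = 15.0                       # pixels

def dwell_click(positions, dt):
    """positions: iterable of (x, y) cursor samples taken every dt seconds."""
    anchor, held = None, 0.0
    for x, y in positions:
        if anchor is not None and ((x - anchor[0]) ** 2 +
                                   (y - anchor[1]) ** 2) ** 0.5 <= RADIUS:
            held += dt
            if held >= DWELL_SECONDS:
                return anchor       # click at the dwell point
        else:
            anchor, held = (x, y), 0.0
    return None

print(dwell_click([(100, 100)] * 60, dt=0.02))   # 1.2 s of stillness -> click
```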

Once he can control a computer, the possibilities get interesting. A computer could drive a motorized wheelchair, allowing him to go where he thinks about going. It could control his environment — lights, heat, locking or unlocking doors. And he could tap out e-mails, albeit slowly.

At this point, though, the equipment is unwieldy. The computer, two screens and other parts of the system are stacked on a tall cart. The processor and software can't do all the computations quite fast enough to move the cursor in real time — not instantly, the way your hand moves when you tell it to move. And because the sensors tap no more than 100 neurons, the cursor doesn't always move precisely. That's why a one-time athlete can play Pong at only 70% accuracy.

Though implanting a chip in the brain might seem alarming, devices are already regularly implanted in brains to help people who have severe epilepsy, Parkinson's disease or other neurological disorders. "We put drugs in our brains to improve them, even caffeine," says Arthur Caplan, head of the Center for Bioethics at the University of Pennsylvania. "I don't think the brain is some sacrosanct organ you can't touch."

Not everyone is a fan of Cyberkinetics' human trials. "I am very skeptical," says Miguel Nicolelis, co-director of the Duke center doing similar research. "They seem to want to simply push their views and make a buck without much consideration of what is appropriate and safe to suggest to different patients."

At the moment, though, "The patient is very, very happy," says Mukand, who is also functioning as the FDA's investigator on the case.

Help with prosthetic limbs?

One way or another, neuroscience and technology are crashing together.

The Duke team has not implanted a permanent device in a human, but it has implanted sensors in monkeys who then move a robot arm by thought. Duke's results, published in July in the journal Neuroscience, show that the idea of using neurons to guide a prosthetic device can work.

To really be useful, the technology will have to get smaller, cheaper and wireless — perhaps a computer worn behind the ear. Down the road, it will have to tap many more neurons, and then the challenge will be building software to analyze more complicated patterns from so many more neurons.

"Brains are incredibly complex organs," author Hawkins says. "There are 100,000 neurons in a square millimeter of cortex. There are very precise codes in the neurons. The details matter."

A yet bigger challenge — the one DARPA faces — will be reading neural signals without drilling holes in people's skulls. Over the past decade, researchers have used the electroencephalogram (EEG) to pick up brain waves through electrodes attached to the head. After months of training, users can learn to play simple video games — such as making a wheel turn faster — with their thoughts. But EEG readings are too broad and weak to drive more specific tasks.

In June, researchers at Washington University, St. Louis, reported using a different approach -- electrocorticography (ECoG), which records from the surface of the brain beneath the skull -- to get more precise readings. With a few hours of training, users could track targets on a screen.

But researchers at Duke, Brown and Cyberkinetics believe that the only way to get signals that can operate a robot arm, do e-mail or move a wheelchair is to touch the brain directly.

As with most technological developments, the devices will get smaller and better and the software will be made smarter, until some of what now seems bizarre becomes real. Society will be forced to debate the questions the technology raises.

"There are those who say this is slippery slope stuff — that this technology is opening the door to dangerous technologies that could enhance, improve and optimize someone," says bioethicist Caplan. "But I'm unwilling to hold hostage this kind of exciting medical research for those kinds of fears."

#17

  • Lurker
  • 0

Posted 11 October 2004 - 10:05 AM

A yet bigger challenge — the one DARPA faces — will be reading neural signals without drilling holes in people's skulls.


LOL, something to consider in the meantime.

I'm glad the comments of the bioethicists in that article are guarded and not totally in favor of stifling this research. I don't like bioethicists who are simply a voice box for those who fear new technology that can improve and enhance human life, ability, intelligence, etc.

Anyway thanks for posting the article.

#18 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 11 October 2004 - 04:53 PM

Arthur Caplan is one of the few bioethicists I've read who is very careful about calling for an end to any particular variety of research that has the potential for much good. He recently did an interview in which he supported ESC research.

#19 Bruce Klein

  • Guardian Founder
  • 8,794 posts
  • 242
  • Location:United States

Posted 15 October 2004 - 07:28 AM


Paralysed man sends e-mail by thought

http://www.nature.co...l/041011-9.html

Roxanne Khamsi

Brain chip reads mind by tapping straight into neurons.

A pill-sized brain chip has allowed a quadriplegic man to check e-mail and play computer games using his thoughts. The device can tap into a hundred neurons at a time, and is the most sophisticated such implant tested in humans so far.

Many paralysed people control computers with their eyes or tongue. But muscle function limits these techniques, and they require a lot of training. For over a decade researchers have been trying to find a way to tap directly into thoughts.

In June 2004, surgeons implanted a device containing 100 electrodes into the motor cortex of a 24-year-old quadriplegic. The device, called the BrainGate, was developed by the company Cyberkinetics, based in Foxborough, Massachusetts. Each electrode taps into a neuron in the patient's brain.

The BrainGate allowed the patient to control a computer or television using his mind, even when doing other things at the same time. Researchers report for example that he could control his television while talking and moving his head.

The team now plans to implant devices into four more patients.

Brain waves

[Image: The tiny sensor consists of an array of 100 electrodes to capture signals from the brain. © Alamy]

Rival teams are building devices to read brain activity without touching neurons. Neural Signals, based in Atlanta, has patented a conductive skull screw that sits outside the brain, just under the skull. Other researchers are developing non-invasive technologies, for example using an electroencephalogram to read a patient's thoughts.

But BrainGate's creators argue that such techniques only give a general picture of brain activity, and that the more direct approach allows more numerous and more specific signals to be translated. "This array has 100 electrodes, so one can theoretically tap into 100 neurons," says Jon Mukand, an investigator on the team based at the Sargent Rehabilitation Center in Rhode Island.

This makes the technology faster and more flexible, he argues. "It's far more versatile when one can get a larger number of neurons."

But Stephen Roberts, an engineer at Oxford University, UK, who has worked on brain-computer interfaces, says the field is still waiting for a breakthrough. "We have to make something that works robustly and without a lot of patient training," he says. "Most of these devices work well on a small subset of patients, but there's a long way to go before getting them to work for the general population."

#20 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 28 October 2004 - 02:39 AM

Link : http://www.wired.com...html?tw=rss.TOP


Advent of the Robotic Monkeys

If a monkey is hungry but has his arms pinned, there's not much he can do about it. Unless that monkey can control a nearby robotic arm with his brain.

And that's exactly what the monkey in Andrew Schwartz's neurobiology lab at the University of Pittsburgh can do, feeding himself using a prosthetic arm controlled solely by his thoughts.

If mastered, the technology could be used to help people with spinal cord injuries, amputees or stroke victims. "I still think prosthetics is at an early stage ... but this is a big step in the right direction," said Chance Spalding, a bioengineering graduate student who worked on the project.

The prosthetic limb, the size of a child's arm, has working shoulder and elbow joints and is equipped with a simple gripper to grasp and hold food. The monkey's arms are restrained at its sides and as the monkey thinks about bringing the food to his mouth, electrodes in the monkey's brain intercept the neuronal firings that are taking place in the motor cortex, a region of the brain responsible for voluntary movement.

The brain activity is fed to a computer where an algorithm developed by the University of Pittsburgh interprets the neuronal messages and sends them to the robotic arm. "We have learned to understand the patterns of firing rates and can decode them into movement, direction, velocity and speed," said Schwartz.
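One common way such decoded output drives an arm (a generic sketch, not necessarily the Pittsburgh algorithm) is to decode a velocity on every update and integrate it into a position command:

```python
import numpy as np

# Generic sketch: integrating a stream of decoded velocities into an arm
# trajectory. The update rate and velocity statistics are invented.
dt = 0.03                                    # 30 ms control updates (assumed)
rng = np.random.default_rng(5)
decoded_vel = rng.normal([0.02, 0.0, 0.01], 0.005, size=(200, 3))  # m/s, xyz

position = np.zeros(3)                       # gripper starts at the shoulder
for v in decoded_vel:
    position += v * dt                       # simple Euler integration
print(f"endpoint after {200 * dt:.1f} s: {position.round(3)}")
```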

Schwartz expounded on the research Tuesday at the annual meeting of the Society for Neuroscience in San Diego.

The unique aspect of Schwartz's research is that he conducted what is known as "closed loop" brain experiments. In a "closed loop" experiment, the monkey is conscious of the robotic arm and is making an effort to control it. Monkeys in previous experiments did not understand that they were having an effect on the world at all. Duke University performed such prosthetic arm experiments as far back as 2000. In one case they even sent the electrode signals over the internet, allowing the monkey to move an arm 600 miles away at MIT.

"The open loop experiment was really very crude," said Schwartz. "The closed loop introduces us into a whole new field because the animal actually sees the arm and the consequence of what it is doing." For Schwartz's monkey the robotic arm is incorporated into its mental body representation, making it an extra limb.

"Getting the monkey to learn that he is controlling this robotic device was the hardest part. For him to figure out that it was under his control, and to decipher the mapping took a very long time," noted Spalding.


To achieve this state of computer-aided telekinesis, the monkey had to go through various stages of training in a virtual environment. First the monkey learned what the task was by using its arms, which were tracked in VR, to hit a blue ball.

Next the monkey had to repeat the task while its arms were restrained in a process called "brain control." The lessons at this stage were necessary as they provided a learning space for the monkey to adapt to using the robotic arm.

Because the prosthetic arm relies on a small percentage of the thousands of neurons that fire when the monkey intends to move its real arm, the monkey had to retrain its natural thinking process in order to gain steady control over the robotic arm.

In the virtual space the monkey learned, through biofeedback, how to modify the firing rates of the neurons being recorded and relayed to the robotic arm as movement commands. By the end of its "brain control" lessons the monkey had mastered this new form of movement and could control its phantom limb in virtual reality by firing the few key neurons needed.

After graduating from these virtual lessons the monkey moved to the robot arm. While sitting in a high chair with its arms restrained at its sides, the monkey had to move the robotic arm, mounted at its shoulder, from different locations to its mouth so it could eat.

"The initial movement to the mouth is pretty good, but when it gets to his mouth he is concentrating on the food and not on the arm movements so it gets a little clumsy," said Schwartz.

Even further down the road is a plan to give the monkey a more realistic arm. Schwartz wants to replace the simple one-movement gripper at the end of the current prosthetic arm, custom-built by Keshen Prosthetics in Shanghai, China, with a realistic hand containing finger movement.

"It is much more complicated, but we can take it in stages. We can grip first and then try to work individual fingers," said Schwartz.

While the professor thinks applications are far off, he is excited about the advancement that this experiment means for understanding the brain.

"Every time there is a technological advance, we can use it to better understand the goings-on in the brain," which leads to more scientific discoveries, said Schwartz.

John Donoghue of Cyberkinetics has already extended this research to humans. He has implanted electrodes into the motor cortex of a quadriplegic, allowing the patient to move a computer cursor to access e-mail or use other applications. "The human phase of this has moved forward tremendously," said Donoghue. Cyberkinetics will continue its pilot study by expanding the trial to four more patients.

#21 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 11 November 2004 - 06:24 PM

Link: http://www.eurekaler...s-itm111004.php


Public release date: 10-Nov-2004

UK CONTACT - Claire Bowles
claire.bowles@rbi.co.uk
Tel: 44-207-611-1210
New Scientist Press Office, London
US CONTACT – Toni Marshall
toni.marshall@newscientist.com
1-617-558-4939
New Scientist Boston office

Implants that move in your brain
A DEVICE that automatically moves electrodes through the brain to seek out the strongest signals is taking the idea of neural implants to a new level. Scary as this sounds, its developers at the California Institute of Technology in Pasadena say devices like this will be essential if brain implants are ever going to be made to work.

Implants could one day help people who are paralysed or unable to communicate because of spinal injury or conditions such as amyotrophic lateral sclerosis (Lou Gehrig's disease). Electrodes implanted in the brain could, in principle, pick up neural signals and convey them to a prosthetic arm or a computer cursor.

But there is a problem. Implanted electrodes are usually unable to sense consistent neuronal signals for more than a few months, according to Igor Fineman, a neurosurgeon at the Huntington Hospital, also in Pasadena.

This loss of sensitivity has a number of causes: the electrodes may shift following a slight knock or because of small changes in blood pressure; tissue building up on the electrodes may mask the signal; or the neurons emitting the signals can die.

To get around these problems, Joel Burdick and Richard Andersen at Caltech have developed a device in which the electrodes sense where the strongest signal is coming from, and move towards it. Their prototype, which is mounted on the skull, uses piezoelectric motors to move four electrodes independently of each other in 1-micrometre increments. It has successfully been used to decode motor signals in rats and intention signals in monkeys.

When surgeons implant electrodes in the brain they normally have to "tune" individual electrodes by positioning them to pick up signals from a single brain cell. With the new device, which the Caltech team calls an autonomous microdrive, an fMRI scan is enough to locate the electrodes in the general area where the signals are coming from. Each electrode then homes in on the strongest nearby signal.

Piezoelectric motors were chosen for the microdrive because they are capable of moving the electrodes hundreds of micrometres with great accuracy, Burdick says. Applying a voltage to the crystal causes it to expand momentarily, and because it has a roughened edge, feeding it a sequence of voltage pulses causes it to ratchet forward another surface in contact with it – in this case an electrode.

To stop it damaging neurons, the microdrive has been given a collision-avoidance capability. "If the signal voltage starts rising very rapidly we know we are in danger of puncturing a neuron, so it backs off," Burdick says.

While the animal tests have shown that the microdrive can home in on the strongest neural signals, it is still too bulky to be used for people. The team is working with Yu-Chong Tai of Caltech, an expert in microelectromechanical systems (MEMS), to make a smaller version with up to 100 electrodes. The researchers say that within a year they expect to be able to fit a paralysed person with a microdrive implant that will allow them to control a computer cursor and navigate the web.

Autonomous microdrives could also eventually be used in other types of implant, such as the deep brain stimulators used to treat Parkinson's disease. Fineman says the microdrives could help make them more effective by providing a feedback mechanism from single neurons.
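The article gives enough detail to sketch the control loop. Below is a minimal, purely illustrative Python version of one self-tuning electrode: step in 1-micrometre increments toward a stronger signal, and back off when the voltage rises fast enough to suggest a neuron is about to be punctured. The callables read_signal and move, and every threshold, are hypothetical stand-ins, not part of the Caltech design:

STEP_UM = 1.0  # the 1-micrometre piezo increment quoted in the article

def tune_electrode(read_signal, move, max_steps=500,
                   danger_slope=0.5, target_snr=8.0):
    """Hill-climb one electrode toward the strongest nearby unit.

    read_signal() -> (snr, voltage) at the current depth; move(um)
    steps the piezo motor. Both callables and all thresholds are
    hypothetical stand-ins for illustration.
    """
    snr, v = read_signal()
    direction = 1.0                        # start by advancing
    for _ in range(max_steps):
        move(direction * STEP_UM)
        new_snr, new_v = read_signal()
        if new_v - v > danger_slope:       # voltage rising too fast:
            move(-direction * STEP_UM)     # back off (collision avoidance)
            direction = -direction
        elif new_snr < snr:                # signal got weaker, so
            direction = -direction         # climb the other way
        snr, v = new_snr, new_v
        if snr >= target_snr:              # isolated a strong unit
            return True
    return False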


###
Author: DUNCAN GRAHAM-ROWE

This article appears in New Scientist issue: 13 November 2004

#22 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 03 April 2005 - 01:02 AM

Link: http://news.bbc.co.u...lth/4396387.stm


Brain chip reads man's thoughts

The 'chip' reads brain signals

A paralysed man in the US has become the first person to benefit from a brain chip that reads his mind.
Matthew Nagle, 25, was left paralysed from the neck down and confined to a wheelchair after a knife attack in 2001.

The pioneering surgery at New England Sinai Hospital, Massachusetts, last summer means he can now control everyday objects by thought alone.

The brain chip reads his mind and sends the thoughts to a computer to decipher.

Mind over matter

He can think his TV on and off, change channels and alter the volume thanks to the technology and software linked to devices in his home.

Scientists have been working for some time to devise a way to enable paralysed people to control devices with the brain.

Studies have shown that monkeys can control a computer with electrodes implanted into their brain.


Recently four people, two of them partly paralysed wheelchair users, were able to move a computer cursor while wearing a cap with 64 electrodes that pick up brain waves.

Mr Nagle's device, called BrainGate, consists of nearly 100 hair-thin electrodes implanted a millimetre deep into part of the motor cortex of his brain that controls movement.

Wires feed the information from the electrodes into a computer which analyses the brain signals.

The signals are interpreted and translated into cursor movements, offering the user an alternative way to control devices such as a computer with thought.

Motor control

Professor John Donoghue, an expert on neuroscience at Brown University, Rhode Island, is the scientist behind the device produced by Cyberkinetics.

He said: "The computer screen is basically a TV remote control panel, and in order to indicate a selection he merely has to pass the cursor over an icon, and that's equivalent to a click when he goes over that icon."

Mr Nagle has also been able to use thought to move a prosthetic hand and robotic arm to grab sweets from one person's hand and place them into another's.

Professor Donoghue hopes that ultimately implants such as this will allow people with paralysis to regain the use of their limbs.

The long-term aim is to design a package the size of a mobile phone that will run on batteries, and to electrically stimulate the patient's own muscles.

This will be difficult.

The simple movements we take for granted in fact involve complex electrical signals which will be hard to replicate, Dr Richard Apps, a neurophysiologist from Bristol University in the UK, told the BBC News website.

He said there were millions of neurones in the brain involved with movement. The brain chip taps into only a very small number of these.

But he said the work was extremely exciting.

"It's quite remarkable. They have taken research to the next stage to have a clear benefit for a patient that otherwise would not be able to move.

"It seems that they have cracked the crucial step and arguably the most challenging step to get hand movements.

"Just to be able to grasp an object is a major step forward."

He said it might be possible to hone this further to achieve finer movements of the hand.

#23

  • Lurker
  • 1

Posted 03 April 2005 - 04:26 AM

I realize that it is not the stuff of cutting-edge BMI science, but I am wondering if anyone here has had any experience with IBVA?

#24

  • Lurker
  • 0

Posted 07 April 2005 - 11:49 AM

http://www.newscient...=mg18624944.600


Sony patent takes first step towards real-life Matrix
07 April 2005
Exclusive from New Scientist Print Edition
Jenny Hogan
Barry Fox


IMAGINE movies and computer games in which you get to smell, taste and perhaps even feel things. That's the tantalising prospect raised by a patent on a device for transmitting sensory data directly into the human brain - granted to none other than the entertainment giant Sony.

The technique suggested in the patent is entirely non-invasive. It describes a device that fires pulses of ultrasound at the head to modify firing patterns in targeted parts of the brain, creating "sensory experiences" ranging from moving images to tastes and sounds. This could give blind or deaf people the chance to see or hear, the patent claims.

While brain implants are becoming increasingly sophisticated, the only non-invasive ways of manipulating the brain remain crude. A technique known as transcranial magnetic stimulation can activate nerves by using rapidly changing magnetic fields to induce currents in brain tissue. However, magnetic fields cannot be finely focused on small groups of brain cells, whereas ultrasound could be.

If the method described by Sony really does work, it could have all sorts of uses in research and medicine, even if it is not capable of evoking sensory experiences detailed enough for the entertainment purposes envisaged in the patent.

Details are sparse, and Sony declined New Scientist's request for an interview with the inventor, who is based in its offices in San Diego, California. However, independent experts are not dismissing the idea out of hand. "I looked at it and found it plausible," says Niels Birbaumer, a pioneering neuroscientist at the University of Tübingen in Germany who has created systems that let people control devices via brain waves.

The application contains references to two scientific papers presenting research that could underpin the device. One, in an echo of Galvani's classic 18th-century experiments on frogs' legs that proved electricity can trigger nerve impulses, showed that certain kinds of ultrasound pulses can affect the excitability of nerves from a frog's leg. The author, Richard Mihran of the University of Colorado, Boulder, had no knowledge of the patent until New Scientist contacted him, but says he would be concerned about the proposed method's long-term safety.

Sony first submitted a patent application for the ultrasound method in 2000, which was granted in March 2003. Since then Sony has filed a series of continuations, most recently in December 2004 (US 2004/267118).

Elizabeth Boukis, spokeswoman for Sony Electronics, says the work is speculative. "There were not any experiments done," she says. "This particular patent was a prophetic invention. It was based on an inspiration that this may someday be the direction that technology will take us."



#25 Guest_da_sense_*

  • Lurker
  • 0

Posted 18 April 2005 - 06:35 AM

Is it better to use headphones or earphones when listening to brainwave CDs/programs? I've looked for some recommendations but couldn't find any. I see that professional equipment comes with pretty large headphones.

#26 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 12 August 2005 - 01:04 AM

Link: http://news.bbc.co.u...lth/4715327.stm


'Thoughts read' via brain scans
Scientists say they have been able to monitor people's thoughts via scans of their brains.

Teams at University College London and the University of California, Los Angeles, could tell what images people were looking at or what sounds they were listening to.

The US team say their study proves brain scans do relate to brain cell electrical activity.

The UK team say such research might help paralysed people communicate, using a "thought-reading" computer.

In their Current Biology study, funded by the Wellcome Trust, people were shown two different images at the same time - a red stripy pattern in front of the right eye and a blue stripy pattern in front of the left.

The volunteers wore special goggles which meant each eye saw only what was put in front of it.

In that situation, the brain then switches awareness between both images, sometimes seeing one image and sometimes the other.

While people's attention switched between the two images, the researchers used fMRI (functional Magnetic Resonance Imaging) brain scanning to monitor activity in the visual cortex.

It was found that focusing on the red or the blue patterns led to specific, and noticeably different, patterns of brain activity.

The fMRI scans could reliably be used to predict which of the images the volunteer was looking at, the researchers found.
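Conceptually, this is pattern classification: train a classifier on scans labelled with the perceived image, then predict the percept on held-out scans. The Python sketch below uses synthetic "voxel" data to stand in for the recorded visual-cortex patterns; the study's actual analysis pipeline is not described here, so everything below is illustrative:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy stand-in for "predict which image the volunteer perceives"
# from fMRI voxel patterns. The two percepts are simulated as two
# fixed voxel templates corrupted by noise.
rng = np.random.default_rng(1)
n_voxels, n_trials = 200, 120
pattern_red = rng.normal(0, 1, n_voxels)    # template: red grating
pattern_blue = rng.normal(0, 1, n_voxels)   # template: blue grating

labels = rng.integers(0, 2, n_trials)       # 0 = red, 1 = blue
scans = np.where(labels[:, None] == 0, pattern_red, pattern_blue)
scans = scans + rng.normal(0, 2.0, (n_trials, n_voxels))  # noisy "BOLD"

# Train on the first 80 trials, test on the remaining 40.
clf = LogisticRegression(max_iter=1000).fit(scans[:80], labels[:80])
print("held-out accuracy:", clf.score(scans[80:], labels[80:]))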

Thought-provoking?

The US study, published in Science, took the same theory and applied it to a more everyday example.

The US researchers used electrodes placed inside the skull to monitor the responses of brain cells in the auditory cortex of two surgical patients as they watched a clip of "The Good, the Bad and the Ugly".

They used this data to accurately predict the fMRI signals from the brains of another 11 healthy subjects who watched the clip while lying in a scanner.

Professor Itzhak Fried, the neurosurgeon who led the research, said: "We were able to tell one part of a scene from another, and we could tell one type of sound from another."

Dr John-Dylan Haynes of the UCL Institute of Neurology, who led the UK research, told the BBC News website: "What we need to do now is create something like speech-recognition software, and look at which parts of the brain are specifically active in a person."

He said the study's findings proved the principle that fMRI scans could "read thoughts", but he said it was a very long way from creating a machine which could read anyone's mind.

But Dr Haynes said: "We could tell from a very limited subset of possible things the person is possibly seeing."

"One day, someone will come up with a machine in a baseball cap.

"Then it really could be helpful in everyday applications."

He added: "Our study represents an important but very early stage step towards eventually building a machine that can track a person's consciousness on a second-by-second basis.

"These findings could be used to help develop or improve devices that help paralyzed people communicate through measurements of their brain activity.

But he stressed: "We are still a long way off from developing a universal mind-reading machine."

Dr Fried said: "It has been known that different areas of the temporal lobe are activated by faces, or houses.

"This UCL finding means it is not necessary to use strikingly different stimuli to tell what is activating areas of the brain."

#27 shermhead

  • Guest
  • 13 posts
  • 0

Posted 13 August 2005 - 08:48 PM

It's just more hype playing off the ability we have to measure brainwave current from a simplistic view of current, amplitude, and frequency, whether it's there or not. If a really good interface were ever made, I am sure the brain could learn how to use the device as it would anything else: not just mechanical parts but additional "brain modules".

Some more work on wireless communication with human-powered electrodes, using existing wireless communication technologies that have already been miniaturized, would be neat, I think.


#28 kevin

  • Topic Starter
  • Member, Guardian
  • 2,779 posts
  • 822

Posted 15 August 2005 - 05:25 PM

Link: http://www.the-scien.../2005/1/31/25/1


Optical Topography and the Color of Blood

OT gives neuroscientists a new and faster view of the brain, and an alternative to fMRI
By Laura Spinney

Image courtesy of Atsushi Maki
OT TO GO:
This young girl sports a cap used for optical topography measurement. Consisting of optical fibers and an elastic cap, this apparatus frees patients from tense, immobilized stays inside magnetic resonance imaging (MRI) tunnels. It also frees scientists to conduct extended neural analyses and studies of brain activity during movement – research that is impossible or impractical with MRI.

Anyone who has been subjected to a magnetic resonance imaging (MRI) scan knows its limitations: the claustrophobia-inducing tunnel, the machine gun rattle, the instruction not to move – none of which is conducive to relaxation. For confused patients and newborn babies, MRI scans are not possible, and researchers who study movement or hearing are severely restricted in what they can test. Now, an alternative noninvasive brain-imaging technique called optical topography (OT) is illuminating areas of brain function previously considered inaccessible and exploding some myths about brain development at the same time.

"The concept of optical topography is really unique," says Atsushi Maki, a cognitive neuroscientist at the Hitachi Advanced Research Laboratory in Saitama, Japan, where OT has been under development since 1987. "It is small, easy to use, and it can measure brain function under natural conditions, from neonates to older people."

It was with the idea of bringing brain imaging into the clinic and daily life that Hitachi put its faith in OT to begin with, says Maki. Its investment seems to be paying off. OT is already in routine clinical use, and even veteran users of functional MRI (fMRI) are turning to OT as their preferred tool for certain kinds of research. "Eyeballing it, it's fantastic," says Brian Butterworth of the Institute of Cognitive Neuroscience in London. "Unlike with fMRI, you can actually see a reconstruction of the blood flowing as it happens."

Actually, both techniques measure the brain's hemodynamic response. But where fMRI distinguishes oxy- from deoxyhemoglobin by the way their iron responds to a magnetic field, OT relies on the different light-absorption characteristics of the two forms. "We are measuring the color of blood," says Maki.

HISTORY OF OT

The science behind OT dates back to the late 1970s, when F.F. Jobsis of Duke University published a method for monitoring oxygen consumption in living tissue.[1] Called near-infrared (NIR) spectroscopy, the technique depends on the transparency of tissue to light in the near-infrared range. Photons of NIR light traveling through tissue can be detected as they emerge – the same effect people see when they hold a hand up to a candle flame. Since the two conformational forms of hemoglobin absorb light at different wavelengths, Jobsis could deduce from the two signals the average oxygen consumption across a cat's head.

At University College London, Dave Delpy quickly saw the value of Jobsis' technique for his research. Delpy was studying the brains of premature and brain-damaged infants, which have roughly the same diameter as that of a cat. His team built an instrument consisting of one light-transmitting fiber and one receiving fiber that could detect abnormalities in oxygen consumption in neonate brains. A Japanese firm, Hamamatsu Photonics, later developed the instrument.

For anything larger than an infant's head, though, the technique didn't work, because too much light was absorbed by the scalp, skull, and brain. So several groups hit on the idea of having multiple transmitter-receiver optode or optical fiber pairs across the scalp, and measuring the light reflected back out of the brain at each location, rather than the light passing through it.

Britton Chance at the University of Pennsylvania was one of the first to show that enough light was reflected to produce a signal representing localized oxygen consumption in the cortex.[2] After that it became a question of constructing arrays of transmitter-receiver pairs that could cover the whole scalp, each one generating an image of oxygen consumption at that site. Thus, in 1995 OT as a brain-imaging tool was born.

INITIAL SKEPTICISM

Several companies now produce OT instruments worldwide. At Hitachi, Hideaki Koizumi has been in charge of OT development since 1987. His team independently came up with the idea of simultaneous recording from different optode pairs. But clinicians were initially skeptical of the device.

"Neurosurgeons did not believe that near-infrared light could pass through the thick human skull," says Koizumi. So in collaboration with Eiju Watanabe, a neurosurgeon at the Tokyo Metropolitan Police Hospital, they set out to measure the light transmittance of the human skull. "We found that 0.1 percent of light could pass through," says Koizumi. "It was enough to obtain a good signal-to-noise ratio and rapid observations."

Hitachi's standard OT instrument sports 24 transmitter-receiver pairs or channels, and the most advanced model to date has 120 channels. Over the years the company's scientists have refined the wavelengths of the transmitted light to the peak absorption wavelengths of hemoglobin in its two forms: 690 nm for deoxygenated hemoglobin and 830 nm for its oxygenated counterpart. The equipment consists of a foam-lined, optode-studded helmet that is both light and easy to wear. Patients are thus mobile while they are wearing it. Moreover, recordings can be made over extended periods of time: Delpy has recorded continuously in babies over a full 24 hours.
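Given measurements at those two wavelengths, the standard way to recover the two haemoglobin changes is the modified Beer-Lambert law: the change in optical density at each wavelength is a weighted sum of the oxy- and deoxyhemoglobin changes, so two wavelengths give a 2x2 linear system. The Python sketch below uses rough, uncalibrated coefficients that only preserve the qualitative ordering (deoxyhemoglobin absorbs more strongly at 690 nm, oxyhemoglobin at 830 nm); the optode spacing and pathlength factor are likewise illustrative, not values from the article:

import numpy as np

# Modified Beer-Lambert sketch. The extinction matrix below is a
# rough placeholder in 1/(mM*cm); only the qualitative ordering of
# its entries reflects the article.
#                HbO2   Hb
E = np.array([[0.095, 0.492],   # 690 nm: deoxy-Hb absorbs more
              [0.232, 0.179]])  # 830 nm: oxy-Hb absorbs more

def hb_changes(d_od_690, d_od_830, distance_cm=3.0, dpf=6.0):
    """Solve E @ [dHbO2, dHb] * L * DPF = dOD for the two species.

    distance_cm (optode spacing) and dpf (differential pathlength
    factor) are typical textbook magnitudes, not values from the text.
    """
    d_od = np.array([d_od_690, d_od_830])
    return np.linalg.solve(E * distance_cm * dpf, d_od)

# Example: attenuation grows more at 830 nm than at 690 nm, which
# comes out as a rise in oxygenated haemoglobin, as expected for an
# activation-driven blood response.
d_hbo2, d_hb = hb_changes(0.005, 0.012)
print(f"dHbO2 = {d_hbo2:+.4f} mM, dHb = {d_hb:+.4f} mM")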

OT does have a lower spatial resolution than fMRI, at least at standard magnetic field strengths of 1.5 or 3 tesla, but it is also faster. Whereas fMRI can detect changes occurring within only a second of each other, the temporal resolution of OT is in the millisecond range. That makes OT ideal for neurosurgeons, who tend not to need a great deal of spatial detail, but for whom near-real-time images are invaluable. Japanese doctors, at least, have embraced the technique.

Watanabe says OT is now in routine clinical use in Japan, mostly by neurosurgeons who wish to accurately identify the focus of epileptic seizures, and the cerebral hemisphere dominant for language before they surgically remove epileptogenic tissue. If the focus of the seizures turns out to be in the language-dominant hemisphere, then the surgeon must weigh the advantages to the patient of eliminating the seizures, against the disadvantages of any potential language impairment surgery may cause.

"OT is easy to use," says Watanabe. "It takes only 10 minutes to complete the exam, compared to at least one hour if we use fMRI or PET [positron emission tomography] to determine language dominance." With fMRI, because of the difficulty of coinciding a scan with a seizure, it is almost impossible to image the seizure as it happens. But as the OT helmet can be worn for long periods of time without restricting mobility, real-time imaging of seizures becomes feasible.

SPATIAL PROBLEMS

Image courtesy of Makoto Iwata and Hitachi, Ltd., Advanced Research Laboratory
THE BRAIN AT WORK:
An optical topography image of writing-induced brain activity superimposed on a structural image obtained by MRI demonstrates the power of combining neural imaging techniques. Clearly visible are significant responses in Broca's and Wernicke's areas. Both regions are related to speech, language, and writing; Wernicke's area is also implicated in language comprehension.

In contrast to neurosurgeons, researchers who are interested in mapping brain function have a critical need for good spatial resolution. OT and fMRI both measure deoxyhemoglobin, making it possible to directly compare the level of spatial detail the two methods show.

Last year Silvina Horovitz of Yale University and John Gore of Vanderbilt University used OT to scan the brains of human subjects performing a reading task. At some point during the task, the subjects were presented with an anomalous phrase such as "heard shirts." When the researchers compared their responses to this anomaly as measured by OT with fMRI data on the same task, they found good agreement between the two. "We wanted to see whether we could see differences in localization of activation between a regular reading task and a particular semantic anomaly," says Horovitz. "And we did find that with OT we were able to distinguish a slight shift in these activations."[3]

While the spatial resolution of OT may be good enough for most research purposes, it does have one major scientific limitation: In contrast to fMRI, which can generate a full three-dimensional image of the brain including all its subcortical structures, OT doesn't penetrate the brain more than a few centimeters. Light scatters on entering living tissue, with a small proportion being reflected back to the detector at the surface. A probabilistic map of the scattering shows that the reflected light generally follows the path of a shallow, banana-shaped curve. The depth of that curve, about 2 cm at its thickest point, is as far as one can see into the brain with OT.

Fortunately for OT proponents, that depth corresponds roughly to the thickness of the cortex, which Delpy says is sufficient for most researchers. "Almost all the grey matter is in that outer cortex anyhow," he says. At least some researchers agree. Jacques Mehler of the International School for Advanced Studies in Trieste, Italy, purchased his first Hitachi instrument six years ago. He persevered with it despite early technical problems and published one of the first brain-imaging studies in neonates in 2003.[4]

For 30 years Mehler has been studying how infants acquire language. Never trusting behavioral measures alone, he was always searching for physiological indicators to corroborate them, from heart rate and measures of respiration to noninvasive brain imaging. But he was wary of fMRI.

"fMRI is so noisy, using it for speech research is particularly difficult," he says. "Besides, I'm not sure that putting a neonate in such a noisy environment is sympathique ." OT seemed to him less invasive, and therefore a more promising tool for his purposes. With Marcela Pena, Maki, and others, he has used OT to demonstrate that even at birth, infant brains are able to process speech differently from other sounds, with a bias towards the left hemisphere, as exists in most adults.

PUSHING OT TO THE LIMITS

At the University of Tokyo, Gentaro Taga uses OT in his work on developmental plasticity. In an as-yet-unpublished study, Taga and his colleagues scanned the brains of sleeping infants between two and nine months of age, to see how they responded to speech. The team found that speech activated not only the auditory area in the left temporal cortex, as might be expected, but also areas in the right temporal cortex and the occipital cortex, which is more usually associated with vision.

Image courtesy of Atsushi Maki
BANANA ON THE BRAIN:
This illustration shows how light propagates in the human head. Near-infrared light irradiated at the surface of scalp penetrates to a depth of about 2 cm; a small fraction of that light bounces back to a detector located at another position, following a shallow, banana-shaped curve.

Taga wants to replicate his findings before he draws any firm conclusions, but if they turn out to be robust, he says they could provide evidence for the breadth of sensory connections that exists in a baby's brain, before synaptic pruning and programmed cell death divide it up neatly according to sensory modality. He also thinks a disruption or delay in this developmental pruning process could lead to synesthesia, the sensory cross-wiring that causes people to see flashing lights when they hear sounds, for instance.

At the very least, his findings raise intriguing questions about the nature of a baby's sensory experience. "Although there is no direct evidence that infants are synesthetes, many behavioral studies have shown that infants have some ability for cross-modal transfer of perception," says Taga.

Meanwhile, Butterworth has co-opted OT for his long-term study of the development of mathematical ability, and of what goes wrong in the brains of those who are dyscalculic, or unable to calculate. "The plan is to do something that is really very hard to do with fMRI, and that is to look at the development of the function of a particular brain area," he says.

His investigation will push OT to its limits. Butterworth's team has already conducted fMRI studies that implicate the parietal lobe, towards the back of the brain, in numerical processing. A deep groove or fissure (the intraparietal sulcus) in that lobe appears to be particularly important for number crunching. It is by no means clear, however, that OT will be able to penetrate that fissure, though Delpy says the laser transmitters can be angled to guide the light along a cortical groove.

Image courtesy of Atsushi Maki
THE COLOR OF BLOOD:
OT measures oxygen consumption in the brain based on differences in the absorption properties of oxygenated and deoxygenated hemoglobin.

Butterworth's plan also demands the detection of subtle changes in brain activation as development proceeds. At the moment, there is no reliable method for analyzing OT images and determining whether increases in blood flow are statistically significant or not. So his team is collaborating with Delpy and others to adapt Statistical Parametric Mapping (SPM), a modeling package developed for fMRI, for the analysis of OT images.
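The core idea the team would be borrowing from SPM is the general linear model: regress each channel's haemoglobin time series on a model of the task, then test whether the fitted task effect is reliably non-zero. The self-contained Python sketch below runs on synthetic data; it illustrates the generic GLM idea, not the actual SPM-for-OT adaptation described above:

import numpy as np

# Generic GLM sketch: regress one channel's simulated HbO2 trace on
# a task "boxcar" plus a constant, then t-test the task coefficient.
rng = np.random.default_rng(2)
n = 300                                    # time points for one channel
boxcar = (np.arange(n) % 60) < 30          # 30-sample on/off task blocks
X = np.column_stack([boxcar.astype(float), np.ones(n)])

signal = 0.4 * X[:, 0] + rng.normal(0, 1.0, n)   # synthetic HbO2 trace

beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
resid = signal - X @ beta
dof = n - X.shape[1]                       # residual degrees of freedom
sigma2 = resid @ resid / dof               # noise variance estimate
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])  # SE of task beta
print(f"task beta = {beta[0]:.3f}, t({dof}) = {beta[0] / se:.2f}")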

Even more ambitious plans are in the works. Maki thinks OT could provide a crude means of communicating with patients who have locked-in syndrome, by training them first to alter their brain responses to simple yes-no questions. Koizumi is launching a large child-cohort study involving 10,000 Japanese children, to look at the roots of antisocial behavior using OT. Delpy, meanwhile, has gone back to building full 3-D images of infant brains from transmitted light. This technique, called optical tomography, was once abandoned, but he now considers it viable, thanks to rapid technological advances.

In any event, fMRI proponents needn't worry that OT will crowd them out. "The two techniques are totally complementary; one doesn't replace the other," says Delpy. "MRI gives you high spatial resolution, and OT high temporal resolution. You can then map one onto the other to produce a great combined image."

 

References
1. Jobsis FF, "Noninvasive, infrared monitoring of cerebral and myocardial oxygen sufficiency and circulatory parameters," Science, 1977, 198:1264-7.
2. Chance B, et al., "Comparison of time-resolved and -unresolved measurements of deoxyhemoglobin in brain," Proc Natl Acad Sci, 1988, 85:4971-5.
3. Horovitz SG, Gore JC, "Simultaneous event-related potential and near-infrared spectroscopic studies of semantic processing," Hum Brain Mapp, 2004, 22:110-5.
4. Pena M, et al., "Sounds and silence: An optical topography study of language recognition at birth," Proc Natl Acad Sci, 2003, 100:11702-5.



