| Date & Time | November 21, 2007 (Wed), 10:30 |
|---|---|
| Speaker | Dr. Byron Yu |
| Affiliation | Stanford University |
| Contact | Division of Cognitive and Behavioral Development, Isa (ext. 7761) |
| Abstract | The prospect of helping disabled patients by translating neural activity from the brain into control signals for prosthetic devices has flourished in recent years. Rapid progress on such neural prosthetic systems has been possible because of systems neuroscience discoveries and major advances in computational and neural-recording technologies. For example, several research groups have now demonstrated that monkeys can learn to move a computer cursor to various target locations simply by activating neural populations that participate in natural arm movements. Despite tremendous advances in the past decade, even these compelling proof-of-concept laboratory systems fall short of the level of control needed for many everyday behaviors, such as typing words rapidly on a keyboard or reaching straight for a cup of water. I will first present the design and demonstration of a fast and accurate key selection system, capable of transmitting up to 6.5 bits/s, or approximately 15 words per minute. Next, I will describe how arm trajectories can be accurately decoded from neural activity using probabilistic state-space models. Taken together, these developments should substantially increase the clinical viability of neural prostheses in humans. |
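The abstract does not specify which probabilistic state-space model is used for trajectory decoding; a standard choice in this setting is a linear-Gaussian model whose hidden state is the arm kinematics and whose observations are binned spike counts, decoded causally with a Kalman filter. The sketch below is illustrative only and is not the speaker's implementation: the variable names, state dimensions, and least-squares parameter fitting are assumptions.

```python
import numpy as np

def fit_kalman_params(X, Y):
    """Fit linear-Gaussian state-space parameters from training data.

    X : (T, d_state) array of arm kinematics (e.g. hand position and velocity per time bin)
    Y : (T, d_obs)   array of binned spike counts from the recorded neural population
    Returns A, Q (state dynamics and its noise covariance) and C, R (observation model).
    Illustrative sketch only; parameter choices are assumptions, not the speaker's method.
    """
    X0, X1 = X[:-1], X[1:]
    # Dynamics model: x_t ~ A x_{t-1} + noise, fit by least squares
    A = np.linalg.lstsq(X0, X1, rcond=None)[0].T
    Q = np.cov((X1 - X0 @ A.T).T)
    # Observation model: y_t ~ C x_t + noise, fit by least squares
    C = np.linalg.lstsq(X, Y, rcond=None)[0].T
    R = np.cov((Y - X @ C.T).T)
    return A, Q, C, R

def kalman_decode(Y, A, Q, C, R, x0, P0):
    """Causally decode the kinematic trajectory from a sequence of spike-count vectors Y."""
    x, P = x0, P0
    I = np.eye(len(x0))
    traj = []
    for y in Y:
        # Predict: propagate the kinematic state through the dynamics model
        x = A @ x
        P = A @ P @ A.T + Q
        # Update: correct the prediction using the observed spike counts
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (y - C @ x)
        P = (I - K @ C) @ P
        traj.append(x)
    return np.array(traj)

# Example with purely synthetic data (for illustration only):
rng = np.random.default_rng(0)
X_train = rng.standard_normal((500, 4))   # hypothetical state: 2-D position and velocity
Y_train = X_train @ rng.standard_normal((4, 30)) + rng.standard_normal((500, 30))
A, Q, C, R = fit_kalman_params(X_train, Y_train)
decoded = kalman_decode(Y_train, A, Q, C, R, x0=np.zeros(4), P0=np.eye(4))
```

The appeal of this class of decoder is that the prediction step encodes prior knowledge about smooth arm dynamics, so the estimated trajectory does not have to be read out from the noisy spike counts alone at each time bin.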