
Study Puts More Natural Movement for Prosthetic Limbs within Reach

by The O&P EDGE
December 1, 2014

In new research that brings natural movement by prosthetic limbs closer to reality, University of California, San Francisco, scientists have shown that monkeys can learn simple brain-stimulation patterns that represent their hand and arm position, and can then make use of this information to execute reaching maneuvers precisely.

Goal-directed arm movements involving multiple joints, such as those we employ to extend and flex the arm and hand to pick up a coffee cup, are guided both by vision and by proprioception. Previous research has shown that movement is impaired when either of these sources of information is compromised. The most sophisticated prosthetic limbs, which are controlled via brain-machine interfaces (BMIs), rely on users’ visual guidance and do not yet incorporate proprioceptive feedback.

“State-of-the-art BMIs generate movements that are slow and labored; they veer around a lot, with many corrections,” said Philip Sabes, PhD, a professor in the Physiology Department and a senior author of the new study, published November 24, 2014, in the advance online edition of Nature Neuroscience. “Achieving smooth, purposeful movements will require proprioceptive feedback.”

Many scientists have believed that solving this problem requires a “biomimetic” approach: understanding the neural signals normally employed by the body’s proprioceptive systems and replicating them through electrical stimulation. But theoretical work by Sabes’ group over the past several years has suggested that the brain’s learning capacity might allow for a simpler strategy.

In the new research, conducted in the Sabes laboratory by Maria C. Dadarlat, PhD, and postdoctoral fellow Joseph E. O’Doherty, PhD, monkeys were taught to reach toward a target, but their reaching arms and the target were obscured by a tabletop. A sensor mounted on the monkeys’ reaching hands detected the distance from, and direction toward, the target, and this information was used to create a “random-dot” display on a computer monitor that the monkeys could use for visual guidance. Random-dot displays allow researchers to precisely control the usefulness of visual cues. In addition to controlling the random-dot display, information from the sensor was also converted in real time to patterns of electrical stimulation that were delivered by eight electrodes implanted in the monkeys’ brains.
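The article does not describe the actual mapping from sensor signal to stimulation, so as a purely illustrative sketch, one simple way to encode a two-dimensional hand-to-target vector across eight electrodes is cosine direction tuning, a common population-coding idea; every name, parameter, and the encoding rule below are assumptions, not the study’s method.

```python
import math

# Hypothetical sketch: encode the vector from hand to target as
# stimulation rates on 8 electrodes, each assigned a "preferred"
# direction. This is an illustration of population coding, not the
# encoding used in the study described above.

NUM_ELECTRODES = 8
PREFERRED_DIRS = [2 * math.pi * i / NUM_ELECTRODES for i in range(NUM_ELECTRODES)]

def encode(dx, dy, max_rate=300.0, max_dist=20.0):
    """Map the hand-to-target vector (dx, dy, in cm) to one
    stimulation rate (pulses/s) per electrode: rate grows with
    distance and with how closely the vector matches each
    electrode's preferred direction."""
    dist = math.hypot(dx, dy)
    if dist == 0:
        return [0.0] * NUM_ELECTRODES  # on target: no stimulation
    angle = math.atan2(dy, dx)
    gain = max_rate * min(dist, max_dist) / max_dist
    # Cosine tuning, clipped at zero so rates stay non-negative.
    return [gain * max(0.0, math.cos(angle - p)) for p in PREFERRED_DIRS]

rates = encode(10.0, 0.0)  # target 10 cm to the right of the hand
# The electrode whose preferred direction is 0 rad responds most strongly.
```

Under this kind of scheme, the stimulation pattern varies smoothly with the reach vector, which is what gives the brain’s sensory-integration machinery a consistent signal to learn from.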

The monkeys first learned to perform the task using just the visual display. The brain stimulation was then introduced, and the coherence of the random-dot display was gradually reduced, which required the monkeys to rely increasingly on the brain-stimulation patterns to guide them to the target. The monkeys were engaged in natural movement, so proprioception continued to provide them with information about the absolute position of their reaching hands. Eventually, the monkeys were capable of performing the task in a dark room, guided by electrical stimulation alone. They made the most efficient reaching movements, however, when the information from the brain stimulation was integrated with that from the visual display, which Sabes takes as evidence that “the brain’s natural mechanisms of sensory integration ‘figured out’ the relevance of the patterns of stimulation to the visual information.”

“To use this approach to provide proprioceptive feedback from a prosthetic device,” the authors write, instead of capturing information relative to a target, the brain stimulation “would instead encode the state of the device with respect to the body, for example joint or endpoint position or velocity. Because these variables are also available via visual feedback, the same learning mechanisms should apply.”


Editor’s note: This story was adapted from materials provided by the University of California, San Francisco.
