
Computer Vision Helps Prostheses Navigate Varying Terrain

by The O&P EDGE
January 2, 2022

On-glasses camera configuration using a Tobii Pro Glasses 2 eye tracker.

Photographs courtesy of NC State.

Researchers at North Carolina State University (NC State) have developed software that can be integrated with existing hardware to enable people using robotic prostheses or exoskeletons to walk in a safer, more natural manner on different types of terrain. The new framework incorporates computer vision into prosthetic leg control and includes artificial intelligence (AI) algorithms that allow the software to better account for uncertainty.

 

“Lower-limb robotic prosthetics need to execute different behaviors based on the terrain users are walking on,” said Edgar Lobaton, PhD, co-author of a paper on the work and an associate professor of electrical and computer engineering at the university. “The framework we’ve created allows the AI in robotic prostheses to predict the type of terrain users will be stepping on, quantify the uncertainties associated with that prediction, and then incorporate that uncertainty into its decision-making.”

 

The researchers focused on distinguishing between six different terrains that require adjustments in a robotic prosthesis’s behavior: tile, brick, concrete, grass, upstairs, and downstairs.

 

Lower-limb data acquisition device with a camera and an IMU chip.


“If the degree of uncertainty is too high, the AI isn’t forced to make a questionable decision—it could instead notify the user that it doesn’t have enough confidence in its prediction to act, or it could default to a ‘safe’ mode,” said Boxuan Zhong, PhD, lead author of the paper.
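The confidence-gated decision logic Zhong describes can be sketched roughly as follows. This is an illustrative sketch only, not the paper's implementation; the threshold value and all names (`TERRAINS`, `choose_behavior`, the `"SAFE_MODE"` fallback) are hypothetical.

```python
# Sketch of confidence-gated terrain decisions, assuming a classifier
# that returns one probability per terrain class. All names and the
# threshold value are hypothetical, not taken from the paper.

TERRAINS = ["tile", "brick", "concrete", "grass", "upstairs", "downstairs"]
CONFIDENCE_THRESHOLD = 0.8  # assumed tuning parameter

def choose_behavior(class_probs):
    """Pick a terrain-specific behavior, or fall back to a safe mode."""
    best = max(range(len(TERRAINS)), key=lambda i: class_probs[i])
    if class_probs[best] < CONFIDENCE_THRESHOLD:
        # Uncertainty too high: don't force a questionable decision.
        return "SAFE_MODE"
    return TERRAINS[best]

print(choose_behavior([0.02, 0.03, 0.90, 0.02, 0.02, 0.01]))  # confident -> concrete
print(choose_behavior([0.30, 0.25, 0.20, 0.15, 0.05, 0.05]))  # uncertain -> SAFE_MODE
```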

 

The new “environmental context” framework incorporates both hardware and software elements. The researchers designed the framework for use with any lower-limb robotic exoskeleton or robotic prosthetic device, but with one additional piece of hardware: a camera. In their study, the researchers used cameras worn on eyeglasses and cameras mounted on the lower-limb prosthesis itself. The researchers evaluated how the AI was able to make use of computer vision data from both types of camera, separately and when used together.

 

“Incorporating computer vision into control software for wearable robotics is an exciting new area of research,” said Helen Huang, PhD, a co-author of the paper. “We found that using both cameras worked well but required a great deal of computing power and may be cost prohibitive. However, we also found that using only the camera mounted on the lower limb worked pretty well—particularly for near-term predictions, such as what the terrain would be like for the next step or two.” Huang is the Jackson Family Distinguished Professor of Biomedical Engineering in the Joint Department of Biomedical Engineering at NC State and the University of North Carolina at Chapel Hill.

 

The most significant advance, however, is to the AI itself, the researchers said.

 

“We came up with a better way to teach deep-learning systems how to evaluate and quantify uncertainty in a way that allows the system to incorporate uncertainty into its decision making,” Lobaton said. “This is certainly relevant for robotic prosthetics, but our work here could be applied to any type of deep-learning system.”
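One generic way a deep-learning system can quantify uncertainty is predictive entropy over several stochastic forward passes (the style of Monte Carlo dropout). The sketch below illustrates that general technique only; it is not the paper's specific method, and all function names are hypothetical.

```python
import math

# Illustrative uncertainty measure: entropy of the mean class
# distribution across several stochastic forward passes. A generic
# technique (MC-dropout style), not the paper's exact method.

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predictive_entropy(passes):
    """Entropy of the averaged class probabilities over forward passes."""
    n_classes = len(passes[0])
    mean = [sum(p[i] for p in passes) / len(passes) for i in range(n_classes)]
    return -sum(p * math.log(p) for p in mean if p > 0)

# Agreeing passes -> low entropy (low uncertainty).
low = predictive_entropy([softmax([5, 0, 0]), softmax([4.8, 0, 0])])
# Disagreeing passes -> higher entropy (high uncertainty).
high = predictive_entropy([softmax([5, 0, 0]), softmax([0, 5, 0])])
print(low < high)  # True
```

A controller can then compare this entropy against a threshold, just as with the confidence check above, to decide whether to act or defer.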

 

To train the AI system, the researchers had able-bodied individuals wear the cameras while walking through a variety of indoor and outdoor environments. The researchers then did a proof-of-concept evaluation by having a person with lower-limb amputation wear the cameras while traversing the same environments.

 

“We found that the model can be appropriately transferred so the system can operate with subjects from different populations,” Lobaton said. “That means that the AI worked well even though it was trained by one group of people and used by somebody different.”

 

The next steps are incorporating the framework into the control systems of working robotic prostheses and reducing the amount of visual data and processing the system requires.

 

The paper, “Environmental Context Prediction for Lower Limb Prostheses with Uncertainty Quantification,” is published in IEEE Transactions on Automation Science and Engineering.

 

Editor’s note: This story was adapted from materials provided by NC State.

© 2023 The O&P EDGE