SmartHMD for Improved Mobility
Study Details
Study Description
Brief Summary
The National Eye Institute estimated that about 3 million people over age 40 in the US had low vision in 2010, and projects an increase to nearly 5 million in 2030 and 9 million in 2050. Current assistive technologies are a patchwork of mostly low-technology aids with limited capabilities; they are often difficult to use and are not widely adopted. This shortfall in the capabilities of assistive technology often stems from the lack of a user-centered design approach and is a critical barrier to improving the everyday activities of life (EDAL) and the quality of life (QOL) of individuals with low vision.
An intuitive head-mounted display (HMD) system that enhances orientation and mobility (O&M) and crosswalk navigation could improve independence, potentially decrease falls, and improve EDAL and QOL. The central hypothesis is that an electronic navigation system incorporating computer vision will enhance O&M for individuals with low vision. The goal is to develop and validate a smartHMD by incorporating advanced computer vision algorithms and flexible user interfaces that can be precisely tailored to an individual's O&M needs. This project will address the specific question of mobility while the subject crosses a street at a signaled crosswalk. This is a dangerous and difficult task for visually impaired patients and a significant barrier to independent mobility.
Condition or Disease | Intervention/Treatment | Phase |
---|---|---|
| | N/A |
Study Design
Arms and Interventions
Arm | Intervention/Treatment |
---|---|
Experimental: HMD We have several versions (listed below) of a head-worn smartHMD. Each can provide verbal and/or tactile feedback to the user. Feedback is controlled either by the experimenter or by computer vision algorithms. 1) ODG Smartglasses: a commercially available system that uses computer vision to guide the user to a destination using audio and/or vibration feedback. 2) Tactile stimulator array: an Arduino Micro, an HC-05 Bluetooth module, an L293D motor driver, and coin vibration motors attached to a head-worn headband or glasses frame. The motors can be controlled directly by an experimenter or by computer vision algorithms. 3) Computer vision navigation prototyping system: an Intel RealSense camera and an Alienware M15 laptop. All participants will receive the same three interventions: no HMD used, HMD worn but not active, and HMD worn and active. Participants may be tested with any or all of the systems described above. |
Device: No HMD used
Participants will use their existing mobility skills and strategies to navigate toward a goal. If the participant cannot perform this task, the participant will not be forced to.
Device: HMD worn but not active
Participants will wear the HMD, but the HMD will not be active. This will test whether the HMD's physical components obscure the participant's remaining vision and reduce the participant's ability to navigate toward a goal.
Device: HMD worn and active
Participants will wear the HMD and the HMD will be active. This will test the HMD function for navigation toward a goal.
|
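The description above says the active HMD delivers audio or vibration cues driven by computer vision. As a minimal sketch of what such cue logic might look like — assuming a hypothetical `heading_cue` function, a simple left/right motor mapping, and a dead zone matching the study's ±10 degree alignment tolerance, none of which are specified by the protocol — the core decision could be:

```python
# Hypothetical sketch of directional cue logic for a head-worn tactile
# array: map the user's heading error (degrees off the cued path) to a
# left/right vibration cue, with a dead zone so small errors give no cue.
# The threshold and the function/return names are illustrative assumptions.

DEAD_ZONE_DEG = 10.0  # mirrors the study's +/-10 degree alignment tolerance

def heading_cue(heading_error_deg: float) -> str:
    """Return which side to vibrate for a given heading error.

    Convention assumed here: positive error means the user is veering
    right of the cued path, so the left motor cues a corrective turn.
    """
    if abs(heading_error_deg) <= DEAD_ZONE_DEG:
        return "none"  # within tolerance: no feedback
    return "left" if heading_error_deg > 0 else "right"

if __name__ == "__main__":
    for err in (-25.0, -5.0, 0.0, 8.0, 18.0):
        print(f"{err:+.1f} deg -> cue {heading_cue(err)}")
```

In a real system this decision would run per camera frame, with the chosen side forwarded over Bluetooth to the motor driver; the dead zone prevents constant buzzing when the user is already acceptably aligned.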
Outcome Measures
Primary Outcome Measures
- Mobility accuracy: Percentage Correct Alignment [2 hours]
Percentage of trials in which the participant is correctly aligned at the crosswalk (yes/no classification). Alignment is measured as foot position relative to sidewalk lines parallel to the street and perpendicular to the crosswalk; within +/- 10 degrees is considered correct.
- Mobility accuracy: Veering [2 hours]
The amount of deviation, in degrees, from the optimal path (optimal path for baseline and sham conditions: straight down the middle of the pathway; optimal path for the smartHMD condition: the path cued by the smartHMD)
- Mobility accuracy: Percentage Cue Usage [2 hours]
Percentage of trials in which the subject needed cues from the smartHMD, and how well they responded to the cues (smartHMD condition only)
Secondary Outcome Measures
- Detection of Signal [2 hours]
Percentage of trials in which the subject correctly identified the displayed signal on the pedestrian signal (yes/no classification)
- Time-to-Complete [2 hours]
Duration from the start of a trial to the subject locating the pedestrian signal (seconds)
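The mobility-accuracy measures above reduce to simple per-trial arithmetic. As a sketch of how they could be scored — the 10-degree tolerance comes from the outcome description, while the function names and data layout are illustrative assumptions, not the study's analysis plan — one might compute:

```python
# Illustrative scoring of two primary outcome measures from per-trial
# heading data: percentage of correctly aligned trials (yes/no within
# +/-10 degrees) and mean veering (mean absolute deviation in degrees
# from the optimal path). Function names are assumptions for the sketch.

ALIGN_TOL_DEG = 10.0  # tolerance stated in the outcome description

def pct_correct_alignment(alignment_errors_deg):
    """Percentage of trials aligned within +/-10 degrees of the crosswalk."""
    trials = list(alignment_errors_deg)
    correct = sum(1 for e in trials if abs(e) <= ALIGN_TOL_DEG)
    return 100.0 * correct / len(trials)

def mean_veering(deviations_deg):
    """Mean absolute deviation, in degrees, from the optimal path."""
    devs = [abs(d) for d in deviations_deg]
    return sum(devs) / len(devs)

if __name__ == "__main__":
    errors = [3.0, -12.0, 7.5, 15.0, -4.0]  # made-up example trials
    print(f"correct alignment: {pct_correct_alignment(errors):.1f}%")
    print(f"mean veering: {mean_veering(errors):.2f} deg")
```

Treating each trial as a yes/no classification (rather than averaging raw errors) matches the outcome's stated design and makes the measure robust to a single large misalignment.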
Eligibility Criteria
Criteria
Inclusion Criteria:
- Diagnosed with low vision
- Self-reported difficulty with mobility, finding doors (either indoors or outdoors), and using signalized crosswalks
- Vision stratified so that half of participants have best-corrected vision better than or equal to 20/100 and half have best-corrected vision worse than 20/100
- Ability to use a smartphone
- Ability to cooperate for tests
- Able to participate in all visits
Exclusion Criteria:
- Unable to use head-mounted display or smartphone technology
- Unstable age-related macular degeneration within the past 3 months
- Unstable diabetic retinopathy within the past 3 months
- Unstable diabetes within the past 3 months
- Ocular infection or ocular inflammation within the past 3 months
- Ocular trauma within the past 6 months
- Intraocular surgery within the past 6 months
- Optical coherence tomography retinal findings of concern to the investigator for unstable vision during the study
- Women who are pregnant (due to risk of falls and change in gait)
- Uncontrolled seizure disorder in the past 6 months
- Cerebrovascular accident occurring in the past 6 months
- Parkinson disease or other neurological condition that limits mobility
- Alzheimer disease or other forms of dementia
- Conditions of concern to the investigator that would confound orientation and mobility, such as severe arthritis, pain that limits ambulatory activities, or orthopedic surgery (e.g., hand, arm, shoulder, knee, or hip surgery within the past 12 months)
Contacts and Locations
Locations
Site | City | State | Country | Postal Code |
---|---|---|---|---|
North Campus Research Complex | Ann Arbor | Michigan | United States | 48105 |
Sponsors and Collaborators
- James Weiland
Investigators
- Principal Investigator: James Weiland, PhD, University of Michigan
Study Documents (Full-Text)
None provided.
More Information
Publications
None provided.
- HUM00141598