The Perceptual Experience of Argus II Users

Sponsor
University of California, Santa Barbara (Other)
Overall Status
Active, not recruiting
CT.gov ID
NCT05285618
Collaborator
Johns Hopkins University (Other), University of Michigan (Other), University of Minnesota (Other), National Eye Institute (NEI) (NIH)

Study Details

Study Description

Brief Summary

The goal of this project is to quantify and computationally model the perceptual experiences of Argus II retinal prosthesis patients. The investigators will produce visual percepts in patients either by directly stimulating electrodes or by asking them to view a computer or projector screen, using the standard FDA-approved stimulation protocols for their devices to convert the screen image into pulse trains on their electrodes. Performance of patients will be compared to that of sighted control subjects viewing a simulation of the vision generated by Argus II in virtual reality.

Intervention/Treatment:
  • Device: Argus II
Phase:
N/A

Detailed Description

Retinal dystrophies such as retinitis pigmentosa and macular degeneration result in profound visual impairment in more than 10 million people worldwide. One treatment technology, visual neuroprostheses, aims to restore vision by electrically stimulating surviving cells in the retina or early visual cortex, analogous to cochlear implants. Two devices are approved for commercial use, with others scheduled to start clinical trials within a few years. However, current devices suffer from substantial perceptual distortions due to interactions between the implant electronics and the underlying neurophysiology. Here the investigators will combine human psychophysics, computational modeling, virtual reality (VR), and machine learning (ML) to develop and validate novel stimulation protocols for visual prosthesis patients that reduce perceptual distortions and thereby improve the effectiveness of neuroprostheses.

Aim 1: To optimize stimulation protocols using a psychophysically validated phosphene model.

The simplistic stimulation protocols currently used by prosthetic device manufacturers do not account for the perceptual distortions reported in clinical trials. In contrast, a computational model based on psychophysical data may better predict the perceptual distortions of Argus II (Second Sight) users.

  1. The investigators will psychophysically validate the computational model by stimulating arbitrary combinations of single, paired, and multiple electrodes. The model will be tested in Argus II patients, and their performance will be compared to that of 'virtual patients': sighted subjects viewing percepts generated by the model (a simplified rendering sketch follows this list).

  2. Using training sets generated by the model, the investigators will train a deep neural network to learn an approximate inverse transform of the model that can be used to identify the stimulation protocol that minimizes perceptual distortions for oriented gratings and moving bars. The investigators will then test whether this improves performance of Argus II and virtual patients in an orientation discrimination and direction-of-motion identification task.
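
For concreteness, the following is a minimal sketch of how percepts for 'virtual patients' might be rendered, assuming a deliberately simplified 'scoreboard' model in which each active electrode produces an isotropic Gaussian blob. The investigators' psychophysically validated model additionally captures distortions (e.g., elongated, axon-aligned phosphenes) that this sketch ignores; the 6x10 grid geometry and ~525 micrometer electrode pitch are assumptions used here only for illustration.

```python
# Simplified 'scoreboard' rendering of simulated prosthetic vision (illustration only).
# Each active electrode is drawn as an isotropic Gaussian blob whose brightness
# scales with stimulation amplitude; the investigators' validated model is richer.
import numpy as np

def render_percept(amplitudes, pitch_um=525, blob_sigma_um=300, px_per_um=0.1):
    """Render a 2D image from per-electrode amplitudes (rows x cols array)."""
    rows, cols = amplitudes.shape
    margin = 30
    h = int(rows * pitch_um * px_per_um) + 2 * margin
    w = int(cols * pitch_um * px_per_um) + 2 * margin
    yy, xx = np.mgrid[0:h, 0:w]
    img = np.zeros((h, w))
    sigma_px = blob_sigma_um * px_per_um
    for r in range(rows):
        for c in range(cols):
            if amplitudes[r, c] <= 0:
                continue
            cy = margin + r * pitch_um * px_per_um
            cx = margin + c * pitch_um * px_per_um
            img += amplitudes[r, c] * np.exp(
                -((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma_px ** 2))
    return img / img.max() if img.max() > 0 else img

# Example: a vertical bar encoded by activating a single electrode column
amps = np.zeros((6, 10))          # assumed 6x10 Argus II grid
amps[:, 4] = 1.0
percept = render_percept(amps)    # image a 'virtual patient' would view
print(percept.shape)
```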

Aim 2: To develop an improved model by incorporating electrophysiological data that can further optimize stimulation protocols.

The model developed in Aim 1 was based on psychophysical data. However, the distortions described above have also been observed in vitro. In this aim, the investigators will develop an improved model to study how spatiotemporal neuronal dynamics impact prosthetic effectiveness. This will expand the search space for optimized stimulation protocols. The investigators will:

  1. Incorporate key physiological observations from the literature into the model described above in a step-by-step fashion. After each step, the effect of these alterations on the model's ability to predict psychophysical data will be measured.

  2. Use a deep neural network to learn an approximate inverse transformation of this more complex, improved model, and test whether this improves performance of Argus II and virtual patients in an orientation discrimination and direction-of-motion identification task (a minimal inverse-mapping sketch follows this list).
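
As a rough illustration of the inverse-mapping idea shared by Aims 1 and 2, the sketch below trains a small network to map a target percept back to per-electrode amplitudes, using the simplified render_percept() forward model from the earlier sketch to generate (stimulus, percept) training pairs. The architecture, loss, and training set are placeholders; the record does not describe the investigators' actual network.

```python
# Toy inverse model: percept image -> per-electrode amplitudes (illustration only).
# Assumes render_percept() from the earlier sketch is in scope; PyTorch is used
# here merely as an example framework.
import numpy as np
import torch
import torch.nn as nn

GRID = (6, 10)                                   # assumed Argus II layout

def downsample(img, factor=8):
    # Coarse downsampling keeps this toy network small
    return img[::factor, ::factor]

IMG_SHAPE = downsample(render_percept(np.zeros(GRID))).shape

class InverseNet(nn.Module):
    def __init__(self, img_shape=IMG_SHAPE, n_electrodes=GRID[0] * GRID[1]):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(img_shape[0] * img_shape[1], 256),
            nn.ReLU(),
            nn.Linear(256, n_electrodes),
            nn.Sigmoid(),                        # normalized amplitudes in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

model = InverseNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):                          # tiny training loop for illustration
    amps = np.random.rand(*GRID) * (np.random.rand(*GRID) > 0.7)  # sparse patterns
    percept = downsample(render_percept(amps))   # forward model generates the target
    x = torch.tensor(percept, dtype=torch.float32).unsqueeze(0)
    y = torch.tensor(amps.ravel(), dtype=torch.float32).unsqueeze(0)
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```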

Aim 3: To develop novel stimulation protocols by applying computer vision to 'virtual patients' in VR.

The investigators will embed the model developed in Aim 2 in VR to generate 'VR virtual patients' to quickly and efficiently test novel stimulation protocols using visually typical individuals. Protocols that result in good VR performance will be validated in real prosthesis patients. The investigators will:

  1. Use VR virtual patients to predict systematic errors of Argus II users in a two-point discrimination task.

  2. Assess the impact of eye movements on behavioral performance using a gaze-contingent display in VR.

  3. Assess the potential of computer vision preprocessing methods such as edge enhancement, retargeting, and decluttering to improve vision by measuring behavioral metrics on an object recognition task in VR (a simple preprocessing sketch follows this list). The most promising techniques will be tested in Argus II patients.
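
As a concrete illustration of one such preprocessing step, the sketch below edge-enhances a camera frame and downsamples it to a 6x10 electrode grid. OpenCV is used here only for illustration; the record does not name the investigators' software, and retargeting or decluttering would replace or extend the edge-detection step.

```python
# Edge-enhancement preprocessing of a camera frame before conversion to
# electrode amplitudes (illustration only).
import cv2
import numpy as np

def preprocess_frame(frame_gray, grid_shape=(6, 10)):
    """Edge-enhance a grayscale frame and downsample to the electrode grid."""
    # Smooth to suppress noise, then extract edges
    blurred = cv2.GaussianBlur(frame_gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    # Downsample the edge map to one value per electrode (assumed 6x10 grid)
    rows, cols = grid_shape
    grid = cv2.resize(edges, (cols, rows), interpolation=cv2.INTER_AREA)
    # Normalize to [0, 1] stimulation amplitudes
    return grid.astype(np.float32) / 255.0

# Example with a synthetic frame containing a bright square
frame = np.zeros((480, 640), dtype=np.uint8)
frame[180:300, 260:380] = 255
amps = preprocess_frame(frame)
print(amps.shape)  # (6, 10)
```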

These experiments are designed to quantify the perceptual experiences of retinal prosthesis patients and follow standard procedures for collecting behavioral data. Performance in patients will be compared to performance in normally sighted control subjects viewing stimuli manipulated to match the expected perceptual experience of prosthesis patients.

The investigators will produce visual percepts in patients either by directly stimulating electrodes (using FDA-approved pulse trains) or by asking them to view a computer or projector screen, using the standard FDA-approved stimulation protocols for their devices to convert the screen image into pulse trains on their electrodes. All stimulation protocols will be FDA-approved. Control subjects will view stimuli on a computer monitor or through a virtual reality (VR) system. In some cases, the stimuli presented to control subjects with normal vision will be distorted so as to match the perceptual experience of subjects with prostheses.

The normal method of stimulation is a chain from a camera mounted on eyeglasses through a video processing unit (VPU), which converts the video image into FDA-approved electronic pulse trains.

Sometimes subjects will be tested using the camera. More often, the investigators will carry out 'direct stimulation' by using an external computer to directly specify pulse trains (e.g. a 1 s, 10 Hz cathodic pulse train with a current amplitude of 100 microAmps and a pulse width of 45 microseconds delivered to Electrode 12). These direct pulse trains are then sent to the VPU.

This VPU contains software that ensures these pulse trains remain within FDA-approved safety limits. For example, pulses must be charge-balanced (equal anodic and cathodic charge) and must have a charge density below 35 microCoulombs/cm2. Direct stimulation is delivered via Second Sight's FDA-approved software suite (including their Clinical Fitting System). Two stages in the software ensure that pulse trains remain within FDA limits: (1) every app in the software suite is built on an FDA-approved application programming interface (API) that carries out checks and will "gracefully error" if the specified pulse trains are outside acceptable bounds, and (2) the VPU's firmware is programmed to only deliver FDA-approved pulses.
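
As a back-of-the-envelope illustration of these limits, the sketch below checks the example pulse described above (100 microAmps, 45 microseconds per phase) against the 35 microCoulombs/cm2 charge-density limit. The 200-micrometer electrode diameter is an assumption used here for illustration and is not taken from this record.

```python
# Back-of-the-envelope check of the example pulse against the limits above.
# The 200-um electrode diameter is an assumption for illustration; the
# 35 uC/cm^2 charge-density limit is the one quoted in this record.
import math

amp_ua = 100.0              # current amplitude (microamps)
phase_width_us = 45.0       # pulse width per phase (microseconds)
electrode_diam_um = 200.0   # assumed electrode diameter (micrometers)

# Charge per phase: Q = I * t (uA * s = uC)
charge_uC = amp_ua * (phase_width_us * 1e-6)

# Electrode surface area, treated as a flat disc (cm^2)
radius_cm = (electrode_diam_um * 1e-4) / 2.0
area_cm2 = math.pi * radius_cm ** 2

charge_density = charge_uC / area_cm2
print(f"Charge per phase: {charge_uC:.4f} uC")
print(f"Charge density:   {charge_density:.1f} uC/cm^2 (limit: 35 uC/cm^2)")
assert charge_density < 35.0, "Pulse exceeds the charge-density limit"

# Charge balance: a biphasic pulse must deliver equal anodic and cathodic charge
anodic_uC, cathodic_uC = charge_uC, charge_uC
assert math.isclose(anodic_uC, cathodic_uC)
```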

The frequency and current amplitude of the pulse train are not actually critical safety issues, since the electronic/neural interface is robust to extremely high rates of stimulation and high current levels.

However, high-frequency or high-amplitude pulse trains can produce discomfort in patients (analogous to going from a dark movie theatre into sunlight) by inducing large-scale neuronal firing. The investigators will normally focus on pulse-train frequencies and amplitudes within the range patients typically use with their devices.

If the investigators use parameters that might be expected to produce a more intense neural response (and therefore have the potential to cause discomfort), they will always introduce them in a step-wise fashion (e.g. gradually increasing amplitude) while checking that the sensation is not 'uncomfortably bright', and they will immediately decrease the intensity of stimulation if patients report that the sensation approaches discomfort. The PI has experience with this approach and will train all personnel on these protocols.
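
A minimal sketch of such a step-wise ramp is shown below. The deliver_pulse_train() and ask_comfort_rating() functions are hypothetical placeholders standing in for the FDA-approved stimulation software and the experimenter's verbal query; they are not part of any real Second Sight API.

```python
# Illustrative sketch of a step-wise amplitude ramp with a comfort check.
# deliver_pulse_train() and ask_comfort_rating() are hypothetical placeholders.

def deliver_pulse_train(electrode, amp_ua, freq_hz, pulse_width_us, duration_s):
    # Placeholder: the real study routes stimulation through the FDA-approved
    # Clinical Fitting System / VPU; here we only print the requested parameters.
    print(f"Stimulating {electrode}: {amp_ua} uA, {freq_hz} Hz, "
          f"{pulse_width_us} us, {duration_s} s")

def ask_comfort_rating():
    # Placeholder: the experimenter asks the patient verbally.
    return input("Comfort rating (comfortable/bright/uncomfortable): ").strip()

def ramp_amplitude(electrode, start_ua=50, step_ua=10, max_ua=200):
    """Increase amplitude in small steps, backing off before discomfort."""
    amp = start_ua
    last_comfortable = None
    while amp <= max_ua:
        deliver_pulse_train(electrode, amp_ua=amp, freq_hz=10,
                            pulse_width_us=45, duration_s=1.0)
        if ask_comfort_rating() == 'uncomfortable':
            # Immediately return to the last comfortable level
            return last_comfortable
        last_comfortable = amp
        amp += step_ua
    return last_comfortable
```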

In response to the stimulation/image on the monitor, subjects will be asked to make behavioral judgments. Examples include detecting a stimulus ('did you see a light on that trial'), reporting size by drawing on a touch screen, reporting shape by selecting a tactile object of similar shape, or identifying what letter has been presented. The investigators will use standard protocols for collecting behavioral data. The patient will either use verbal report, will draw shapes on a touch screen, or will use the keyboard or keypad attached to a computer as a means of reporting about the stimuli. Both patient response and reaction time will be recorded.

None of these stimuli will elicit emotional responses or be aversive in any way.

In some cases, the investigators will also collect data measuring subjects' eye position. This is a non-invasive procedure that will be carried out using standard eye-tracking equipment via an infrared camera that tracks the position of the subjects' pupil. Only measurements like eye position or eye blinks will be recorded, so these data do not contain identifiable information.

Patients will sit in a comfortable chair approximately 100 cm from either a computer monitor or a back-projection screen onto which an image is projected using a video projector. The patient's chin will rest on a chin rest so as to prevent head movements. Auditory stimuli may be used to cue the beginning of trials/response period. These auditory cues will be at comfortable loudness levels.

Patients are encouraged to take breaks as often as needed (they may leave the testing room). The investigators will use various experimental techniques, including:

  1. Same-different: e.g. subjects are shown two percepts and asked whether they are the same or different.

  2. Method of adjustment: e.g. subjects are asked to adjust a display/stimulation intensity until a percept is barely visible.

  3. Two-alternative forced choice (2AFC): e.g. patients are presented with two stimuli and asked which of the two is brighter (a minimal trial-loop sketch follows this list).

  4. Identification: e.g. patients are asked to identify which letter was presented.
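
A minimal sketch of a single 2AFC brightness-comparison trial, including response and reaction-time logging, is shown below. The present_stimulus() function is a hypothetical placeholder for delivery through the device software or presentation on a monitor/VR headset.

```python
# Sketch of one 2AFC brightness-comparison trial with response and reaction-time
# logging. present_stimulus() is a hypothetical placeholder.
import random
import time

def present_stimulus(label, amp_ua):
    # Placeholder for actual stimulus delivery
    print(f"Presenting {label} at {amp_ua} uA")
    time.sleep(0.5)

def run_2afc_trial(reference_ua=100, test_ua=120):
    """Present reference and test in random order; ask which was brighter."""
    order = ['reference', 'test']
    random.shuffle(order)
    amps = {'reference': reference_ua, 'test': test_ua}
    for i, name in enumerate(order, start=1):
        present_stimulus(f"interval {i}", amps[name])
    t0 = time.perf_counter()
    resp = input("Which interval was brighter (1/2)? ").strip()
    rt = time.perf_counter() - t0
    chose_test = order[int(resp) - 1] == 'test'   # test pulse has higher amplitude
    return {'order': order, 'response': resp, 'rt_s': rt, 'chose_test': chose_test}

# Example: run one trial and log the result
print(run_2afc_trial())
```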

In some cases, as well as measuring accuracy, the investigators will also measure improvement with practice by repeating the same task across multiple sessions (up to 5 sessions, each carried out on different testing days).

Study Design

Study Type:
Interventional
Anticipated Enrollment:
105 participants
Allocation:
N/A
Intervention Model:
Single Group Assignment
Masking:
None (Open Label)
Primary Purpose:
Basic Science
Official Title:
Predicting the Perceptual Experience of Argus II Retinal Prosthesis System Users
Actual Study Start Date:
Jan 11, 2022
Anticipated Primary Completion Date:
Aug 31, 2023
Anticipated Study Completion Date:
Aug 31, 2023

Arms and Interventions

Arm Intervention/Treatment
Experimental: Predicting the perceptual experience of retinal prosthesis patients

This intervention will assess the effect of different stimulation strategies on the perceptual experience of retinal prosthesis patients. The investigators will produce visual percepts in patients either by directly stimulating electrodes (using FDA-approved pulse trains) or by asking them to view a computer or projector screen, using the standard FDA-approved stimulation protocols for their devices to convert the screen image into pulse trains on their electrodes. Existing blind users of the Argus II will be recruited for this study. Performance of Argus II users will be compared to performance of sighted subjects viewing a prosthetic vision simulation in virtual reality.

Device: Argus II
In response to the stimulation/image on the monitor, subjects will be asked to make perceptual judgments and complete simple behavioral tasks. Examples include detecting a stimulus ('did you see a light on that trial'), reporting size by drawing on a touch screen, reporting shape by selecting a tactile object of similar shape, or locating a target by moving around, rotating the head, and pointing. Sighted control subjects will perform similar tasks in virtual reality.

Outcome Measures

Primary Outcome Measures

  1. Phosphene shape [through study completion, an average of 1 year]

    The effect of stimulation strategy on phosphene shape elicited by electrical stimulation as measured by patient drawings

  2. Phosphene brightness relative to reference stimulus [through study completion, an average of 1 year]

    The effect of stimulation strategy on phosphene brightness elicited by electrical stimulation compared to reference stimulus, as measured by verbal responses

  3. Pattern discrimination accuracy [through study completion, an average of 1 year]

    The effect of stimulation strategy on the ability to discriminate patterns elicited by electrical stimulation as assessed by verbal responses

  4. Scene understanding performance [through study completion, an average of 1 year]

    The effect of stimulation strategy on the ability to locate objects of interest and relay accurate descriptions of the visual scene, as assessed by verbal responses

Eligibility Criteria

Criteria

Ages Eligible for Study:
18 Years and Older
Sexes Eligible for Study:
All
Accepts Healthy Volunteers:
Yes
Criteria for inclusion of Argus II users:
  • Subject must be at least 25 years of age;

  • Subject has been implanted with the Argus II system;

  • Subject's eye has healed from surgery and the surgeon has cleared the subject for programming;

  • Subject has the cognitive and communication ability to participate in the study (i.e., follow spoken directions, perform tests, and give feedback);

  • Subject is willing to undergo psychophysics testing for up to 4-6 hours per day on 3-5 consecutive days;

  • Subject is capable of understanding patient information materials and giving written informed consent;

  • Subject is able to walk unassisted.

Criteria for inclusion of sighted control subjects:
  • Subject speaks English;

  • Subject must be at least 18 years of age;

  • Subject has visual acuity of 20/40 or better (corrected);

  • Subject has the cognitive and communication ability to participate in the study (i.e., follow spoken directions, perform tests, and give feedback);

  • Subject is capable of understanding participant information materials and giving written informed consent.

  • Subject is able to walk unassisted.

Exclusion criteria:
  • Argus II: Subject is unwilling or unable to travel to testing facility for at least 3 days of testing within a one-week timeframe;

  • Sighted controls: Subject has a history of motion sickness or flicker vertigo;

  • All: Subject does not speak English;

  • All: Subject has language or hearing impairment.

Contacts and Locations

Locations

  1. University of California, Santa Barbara (Santa Barbara, California, United States, 93106)
  2. Johns Hopkins University (Baltimore, Maryland, United States, 21287)
  3. University of Michigan, Ann Arbor (Ann Arbor, Michigan, United States, 48109)
  4. University of Minnesota (Minneapolis, Minnesota, United States, 55455)

Sponsors and Collaborators

  • University of California, Santa Barbara
  • Johns Hopkins University
  • University of Michigan
  • University of Minnesota
  • National Eye Institute (NEI)

Investigators

  • Principal Investigator: Michael Beyeler, PhD, University of California, Santa Barbara

Study Documents (Full-Text)

None provided.

More Information

Publications

None provided.
Responsible Party:
Michael Beyeler, PhD, Professor, University of California, Santa Barbara
ClinicalTrials.gov Identifier:
NCT05285618
Other Study ID Numbers:
  • R00EY029329
First Posted:
Mar 17, 2022
Last Update Posted:
Mar 17, 2022
Last Verified:
Mar 1, 2022
Individual Participant Data (IPD) Sharing Statement:
Yes
Plan to Share IPD:
Yes
Studies a U.S. FDA-regulated Drug Product:
No
Studies a U.S. FDA-regulated Device Product:
Yes
Product Manufactured in and Exported from the U.S.:
Yes
Additional relevant MeSH terms:

Study Results

No Results Posted as of Mar 17, 2022