Viewing Study NCT06117332

Study NCT ID: NCT06117332
Status: ACTIVE_NOT_RECRUITING
Last Update Posted: 2023-11-07
First Post: 2023-10-25

Brief Title: AI-Powered Artificial Vision for Visual Prostheses
Sponsor: University of California Santa Barbara
Organization: University of California Santa Barbara

Study Overview

Official Title: AI-Powered Artificial Vision for Visual Prostheses
Status: ACTIVE_NOT_RECRUITING
Status Verified Date: 2023-11
Last Known Status: None
Delayed Posting: No
If Stopped, Why?: Not Stopped
Has Expanded Access: False
If Expanded Access, NCT#: N/A
Has Expanded Access, NCT# Status: N/A
Acronym: None
Brief Summary: Visual impairment is one of the ten most prevalent causes of disability and poses extraordinary challenges to individuals in our society, which relies heavily on sight. Living with acquired blindness not only lowers the quality of life of these individuals but also strains society's limited resources for assistance, care, and rehabilitation. However, to date there is no effective treatment for many patients who are visually handicapped as a result of degeneration or damage to the inner layers of the retina, the optic nerve, or the visual pathways. Therefore, there are compelling reasons to pursue the development of a cortical visual prosthesis capable of restoring some useful sight in these profoundly blind patients.

However, the quality of current prosthetic vision is still rudimentary. A major outstanding challenge is translating electrode stimulation into a code that the brain can understand. Interactions between the device electronics and the retinal neurophysiology lead to distortions that can severely limit the quality of the generated visual experience. Rather than aiming to one day restore natural vision, which may remain elusive until the neural code of vision is fully understood, one might be better off thinking about how to create practical and useful artificial vision now.

The goal of this work is to address fundamental questions that will allow the development of a Smart Bionic Eye: a device that relies on AI-powered scene understanding to augment the visual scene (similar to the Microsoft HoloLens), tailored to specific real-world tasks that are known to diminish the quality of life of people who are blind (e.g., face recognition, outdoor navigation, reading, self-care).
Detailed Description: The investigators will perform basic experimental studies involving humans (BESH) designed to quantify the perceptual experiences of retinal and cortical prosthesis patients. These experiments will follow standard procedures for collecting behavioral data and involve simple perceptual tasks (e.g., signal detection, object recognition) and behavioral tasks (e.g., walking towards a goal location).

The investigators will produce visual percepts in CORTIVIS and Argus II patients either by directly stimulating electrodes using FDA-approved pulse trains, or by asking them to view a computer or projector screen and using standard stimulation protocols (as are standardly used for their devices) to convert the computer or projector screen image into pulse trains on their electrodes. Informed by psychophysical data and computational models, the investigators will test the ability of different stimulus encoding methods to support simple perceptual and behavioral tasks (e.g., object recognition, navigation). These encoding methods may include computer vision and machine learning methods to highlight important objects in the scene or to highlight nearby obstacles, and may be tailored to each individual patient. Performance of prosthesis patients will be compared both across stimulus encoding methods and to performance in normally sighted control subjects viewing stimuli manipulated to match the expected perceptual experience of prosthesis patients.
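
For illustration only (this is not the study's actual software), the following is a minimal sketch of one possible scene-simplification encoder of the kind described above: it reduces a camera frame to a coarse grid of per-electrode stimulation amplitudes using simple edge detection. The grid size and maximum amplitude are assumed placeholder values, not device specifications.

    # Illustrative sketch only: one possible "scene simplification" encoder that
    # reduces a camera frame to a coarse grid of stimulation amplitudes via edge
    # detection. Grid size and maximum amplitude are assumed placeholders.
    import cv2
    import numpy as np

    def encode_frame(frame_bgr, grid_shape=(6, 10), max_amp_ua=100.0):
        """Map a camera frame onto a hypothetical electrode grid (rows, cols)."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, threshold1=50, threshold2=150)   # highlight contours/obstacles
        # Average edge energy within each electrode's patch of the image.
        coarse = cv2.resize(edges.astype(np.float32),
                            (grid_shape[1], grid_shape[0]),
                            interpolation=cv2.INTER_AREA)
        # Normalize to per-electrode current amplitudes in microAmps.
        amps = max_amp_ua * coarse / (coarse.max() + 1e-6)
        return amps   # array of requested amplitudes, one per electrode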

The normal method of stimulation is a chain from a camera mounted on eyeglasses through a video processing unit (VPU), which converts the video image into FDA-approved electronic pulse trains. Sometimes the investigators will test subjects using the camera. More often, the investigators will carry out direct stimulation, using an external computer to directly specify pulse trains (e.g., for Argus II, a 1 s, 10 Hz cathodic pulse train with a current amplitude of 100 microAmps and a pulse width of 45 microseconds to Electrode 12). These direct pulse trains are then sent to the VPU. The VPU contains software that makes sure that these pulse trains are within FDA-approved safety limits. For example, these pulses must be charge-balanced (equal anodic/cathodic charge) and must have a charge density below 35 microCoulombs/cm2.
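
For illustration only, the following is a minimal sketch of the kind of safety pre-check described above: it computes the charge per phase and the charge density of a pulse and compares the latter against the 35 microCoulombs/cm2 limit quoted in the description. The electrode diameter is an assumed example value, not a figure taken from this record.

    # Illustrative sketch only: compute per-phase charge and charge density and
    # compare against the limit quoted above. Electrode diameter is an assumed
    # example value. Charge balance (equal anodic and cathodic phase charge) is
    # checked separately by the device software.
    import math

    CHARGE_DENSITY_LIMIT_UC_PER_CM2 = 35.0   # limit quoted in the description

    def check_pulse(amp_ua, phase_width_us, electrode_diameter_um=200.0):
        """Return (charge_per_phase_uC, charge_density_uC_per_cm2, within_limit)."""
        charge_uc = (amp_ua * 1e-6) * (phase_width_us * 1e-6) * 1e6   # A * s -> microCoulombs
        radius_cm = (electrode_diameter_um * 1e-4) / 2.0              # micrometers -> cm
        area_cm2 = math.pi * radius_cm ** 2                           # geometric electrode area
        density = charge_uc / area_cm2
        return charge_uc, density, density <= CHARGE_DENSITY_LIMIT_UC_PER_CM2

    # Example from the text: a 100 microAmp, 45 microsecond cathodic phase gives
    # 0.0045 microCoulombs per phase, or roughly 14 microCoulombs/cm2 for an
    # assumed 200 micrometer electrode, which is below the 35 microCoulombs/cm2 limit.
    print(check_pulse(100.0, 45.0))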

Important parameters for safety include: (a) pulses must be charge-balanced (an anodic pulse must be followed quickly by a cathodic pulse, and vice versa, or the electrode will dissolve); (b) charge density should be limited. The frequency of the pulse train and the current amplitude of the pulse train are not actually critical safety issues, since the electronic/neural interface is robust to extremely high rates of stimulation and high current levels. However, high-frequency or high-amplitude pulse trains can produce discomfort in patients (analogous to going from a dark movie theatre to sunlight) due to inducing large-scale neuronal firing. The investigators will normally be focusing on pulse-train frequencies and amplitudes that are in the normal range used by the patient when using their device. If the investigators use parameters that might be expected to produce a more intense neural response, and therefore have the potential to cause discomfort, they will always introduce them in a step-wise fashion (e.g., gradually increasing amplitude while checking that the sensation is not uncomfortably bright), and the investigators will immediately decrease the intensity of stimulation if patients report that the sensation approaches discomfort. The PI has experience in this approach and will train all personnel on these protocols.
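
For illustration only, the following is a minimal sketch of the step-wise introduction of potentially more intense parameters described above; stimulate and subject_reports_discomfort are hypothetical placeholders for the actual stimulation and verbal-report procedures.

    # Illustrative sketch only: raise amplitude gradually, checking the subject's
    # report after every step, and back off immediately if the sensation
    # approaches discomfort. Function arguments are hypothetical placeholders.
    def ramp_amplitude(start_ua, step_ua, max_ua, stimulate, subject_reports_discomfort):
        amp = start_ua
        while amp <= max_ua:
            stimulate(amp)                            # deliver one pulse train at this amplitude
            if subject_reports_discomfort():          # checked after every step
                amp = max(start_ua, amp - step_ua)    # immediately decrease intensity
                break
            amp += step_ua                            # otherwise increase gradually
        return amp                                    # highest comfortable amplitude reached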

In response to the stimulation/image on the monitor, subjects will be asked to either make a perceptual judgment or perform a simple behavioral task. Examples include detecting a stimulus ("Did you see a light on that trial?"), reporting size by drawing on a touch screen, or walking to a target location. Both patient response and reaction time will be recorded.

None of these stimuli will elicit emotional responses or be aversive in any way

In some cases the investigators will also collect data measuring subjects' eye position. This is a noninvasive procedure that will be carried out using standard eye-tracking equipment: an infra-red camera that tracks the position of the subject's pupil. Only measurements like eye position or eye blinks will be recorded, so these data do not contain identifiable information.

Subjects are encouraged to take breaks as often as needed; they may leave the testing room. The investigators use various experimental techniques, including: (1) Same-different: e.g., subjects are shown two percepts and are asked if they are the same or different; (2) Method of adjustment: e.g., subjects are asked to adjust a display/stimulation intensity until a percept is barely visible; (3) Two-alternative forced choice (2AFC): e.g., subjects are presented with two stimuli and asked which of the two stimuli is brighter; (4) Identification: subjects are asked to identify which letter was presented.
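
For illustration only, the following is a minimal sketch of the general structure of a 2-alternative forced choice trial, with response and reaction time recorded as described above; present_stimulus and get_keypress are hypothetical placeholders for the actual stimulation/display and response-collection routines.

    # Illustrative sketch only: one 2AFC trial in which two stimuli are presented
    # in random order and the subject's choice and reaction time are recorded.
    import time, random

    def run_2afc_trial(stim_a, stim_b, present_stimulus, get_keypress):
        order = [stim_a, stim_b]
        random.shuffle(order)                        # randomize interval order
        for interval, stim in enumerate(order, start=1):
            present_stimulus(stim, interval)         # show/stimulate interval 1, then 2
        t0 = time.monotonic()
        choice = get_keypress({"1", "2"})            # e.g., "Which interval was brighter?"
        reaction_time = time.monotonic() - t0
        correct = order[int(choice) - 1] is stim_a   # here stim_a is taken as the brighter stimulus
        return {"choice": choice, "correct": correct, "rt_s": reaction_time}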

In some cases, as well as measuring accuracy, the investigators will also measure improvement with practice by repeating the same task across multiple sessions (up to 5 sessions, each carried out on a different testing day).

Study Oversight

Has Oversight DMC: None
Is an FDA Regulated Drug?: False
Is an FDA Regulated Device?: True
Is an Unapproved Device?: None
Is a PPSD?: None
Is a US Export?: False
Is an FDA AA801 Violation?: None