A Data-Based Artificial Neural Network Assisted Testing Framework for the Assessment of Electrical Stimulation Patterns for Retinal Implants

DSpace Repository


URI: http://hdl.handle.net/10900/99489
Document type: PhD thesis
Date: 2022-02-14
Language: English
Faculty: 7 Mathematisch-Naturwissenschaftliche Fakultät
Department: Biologie
Advisor: Zrenner, Eberhart (Prof. Dr.)
Day of Oral Examination: 2020-03-12
DDC Classification: 500 - Natural sciences and mathematics
600 - Technology
Keywords: Machine Learning, Pattern Recognition, Retina
Other Keywords:
Decoding Framework
Artificial Neural Network
Retinal Implant
Artificial Vision
License: http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=de http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=en


Electrical stimulation (E-stim) of the retina with electrode arrays can evoke visual sensations in patients blinded by photoreceptor dystrophy due to retinitis pigmentosa. Although E-stim through electrical retinal implants (E-retinal-implants) benefits affected patients in daily life, the temporal and spatial resolution of the perceived visual sensations still needs to be improved. A driving question is whether the visual sensations perceived by patients appropriately represent the objects in their visual field. To date, only patients themselves can answer this question, and in the development of E-retinal-implants this is a very late stage at which to obtain a qualified answer. In this work, an experimental and analytical concept was developed that allows estimation of the degree to which the retina has encoded a given object well enough for recognition. With regard to the minimal requirements for object recognition by blind patients, a black-box approach is described at the meta-level in which all recorded retinal ganglion cell (RGC) responses are considered. As a fundamental building block, an electrophysiological multielectrode array (MEA) setup was used to record both light-stimulation (L-stim) and E-stim induced RGC responses in epiretinal configuration from healthy mouse retinal explants. L-stim patterns were applied as single bars and as double bars with different bar spacings, moving at different velocities in four directions. From these recordings, a library of L-stim induced RGC responses comprising 104 classes was created. E-stim patterns were presented analogously through the MEA electrodes. Since this approach involves complex stimulus-response constellations, artificial neural networks (ANNs) for pattern recognition were applied to detect E-stim induced objects; convolutional ANNs were employed as a robust architecture.
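The classification step described above can be sketched as a small convolutional network over binned RGC spike responses. This is a minimal illustration, not the thesis's actual architecture: the electrode count, time-bin count, layer sizes, and the class name `RGCResponseCNN` are all assumptions made for the example; only the 104-class output matches the abstract.

```python
import torch
import torch.nn as nn

N_ELECTRODES = 60   # assumed MEA channel count (hypothetical)
N_TIME_BINS = 200   # assumed binned spike-count window (hypothetical)
N_CLASSES = 104     # size of the L-stim class library from the abstract

class RGCResponseCNN(nn.Module):
    """1-D convolutional classifier over binned RGC spike responses."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_ELECTRODES, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):
        # x: (batch, electrodes, time bins)
        h = self.features(x).squeeze(-1)  # (batch, 64)
        return self.classifier(h)         # (batch, N_CLASSES)

model = RGCResponseCNN()
batch = torch.randn(8, N_ELECTRODES, N_TIME_BINS)  # synthetic responses
logits = model(batch)
```

In this sketch each trial is a channels-by-time spike-count matrix, and the network outputs one logit per L-stim class; training on the 104-class library would then proceed with a standard cross-entropy objective.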
To test and assess the approximation quality of E-stim, E-stim induced response sequences were presented to ANNs trained on the L-stim induced classes. Networks trained exclusively on the 104-class library of light-evoked responses classified six E-stim classes correctly with approximately 96 % accuracy. For different object structures, this approach can reveal redundant components in the stimulus structure. It can thus be estimated to what degree a certain object structure is required for L-stim induced responses to be usefully approximated by E-stim induced responses. The set of analysis tools developed here supports the evaluation of how well E-stim induced RGC responses approximate L-stim induced responses. This allows the functionality of new E-stim patterns to be assessed before respective solutions are implemented in E-retinal-implants and ahead of the testing phase with affected patients. It was possible to reduce the original test dataset of 24 E-stim induced classes to a subset of the most useful classes. With such an approach, new E-stim strategies can be narrowed down to a limited parameter space, which in turn could help shorten future design procedures for retinal implants.
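The reduction of the 24 E-stim classes to a subset of the most useful ones can be sketched as a simple per-class accuracy screen. The accuracy values and the 0.9 threshold below are illustrative assumptions, not results from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n_estim_classes = 24  # original E-stim test-set size from the abstract

# Simulated per-class accuracies of E-stim trials as scored by the
# L-stim-trained network (values are random placeholders, not measured).
per_class_acc = rng.uniform(0.1, 1.0, size=n_estim_classes)

# Keep only E-stim classes whose responses the network recognizes
# reliably; the 0.9 cutoff is a hypothetical choice.
threshold = 0.9
useful_classes = np.flatnonzero(per_class_acc >= threshold)
```

With real accuracies in place of the placeholders, `useful_classes` would hold the indices of the E-stim patterns worth carrying forward into a narrowed parameter space.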
