COGSTIM: Online Computational Modulation of Visual Perception.
Project leaders:
Prof. Dr. Kristine Krug, Dr. Corentin Gaillard
Funding:
EU HORIZON Europe;
HORIZON TMA MSCA Postdoctoral Fellowship - European Fellowship for Dr. Corentin Gaillard:
Computational models of vision often address problems with a single, definite end-point, such as visual recognition: for example, finding a ripe banana in a complex scene. However, not all computation is of this form. Visual information is processed continuously in sensory areas, and the nervous system can alter or halt an ongoing behavioural response when incoming information changes. We can therefore react flexibly to updated sensory input or to changed requirements for motor output. At the same time, these same neuronal mechanisms must also support perceptual stability, so that noisy signals do not cause us to lose track of a crucial goal. In project COGSTIM, I will investigate the functional neuronal networks within primate visual areas that support the balance between perceptual flexibility and stability. I will use a highly innovative approach, combining dense electrophysiological recording with online (real-time) decoding of the neuronal correlates of the subject's perceptual choice, based on adaptive machine-learning algorithms. To control visual perception effectively and predictably, closed-loop electrical stimulation will be applied, under dynamically adjusted feedback, to identified neuronal circuits that causally modulate the associated percepts. Crucially, this novel approach of joint decoding and stimulation in real time will allow me to target visual percepts dynamically, representing a significant advance in our understanding of the ongoing, continuous computations of the primate brain. Such developments offer a promising basis for the future development of rehabilitative therapeutic protocols, as well as innovative brain-machine interfaces suitable for real-world use.
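The methodological core of the project is a closed loop: decode the subject's current perceptual state from ongoing neural activity, then gate electrical stimulation on that decoded state. Purely as an illustration of such a control cycle, and not as part of the project itself, the Python sketch below pairs a simple adaptive decoder (online logistic regression) with a threshold-based stimulation trigger. All names (OnlineChoiceDecoder, closed_loop_step), the parameters and the surrogate Poisson spike counts are hypothetical stand-ins for the actual recording and stimulation interfaces and for the adaptive machine-learning models used in COGSTIM.

import numpy as np

class OnlineChoiceDecoder:
    """Adaptive logistic-regression decoder, updated trial by trial."""

    def __init__(self, n_channels, learning_rate=0.01):
        self.w = np.zeros(n_channels)   # one weight per recording channel
        self.b = 0.0
        self.lr = learning_rate

    def predict_proba(self, x):
        # Probability that the current activity pattern reflects the target percept.
        return 1.0 / (1.0 + np.exp(-(self.w @ x + self.b)))

    def update(self, x, label):
        # One stochastic-gradient step on the logistic loss
        # (label: 1 = target percept reported, 0 = alternative percept),
        # so the decoder can track slow drifts in the recorded signals.
        err = self.predict_proba(x) - label
        self.w -= self.lr * err * x
        self.b -= self.lr * err

def closed_loop_step(decoder, spike_counts, threshold=0.5):
    """One decode-and-stimulate cycle: flag stimulation when the decoded
    probability of the target percept drops below the threshold."""
    p = decoder.predict_proba(spike_counts)
    return p, p < threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    decoder = OnlineChoiceDecoder(n_channels=64)
    for trial in range(200):
        x = rng.poisson(5.0, size=64).astype(float)   # surrogate spike counts
        label = rng.integers(0, 2)                     # surrogate reported choice
        p, stimulate = closed_loop_step(decoder, x)
        # In a real loop, `stimulate` would gate a brief current pulse to the
        # identified circuit; here we only update the decoder with the outcome.
        decoder.update(x, label)

In the actual system, the decision to stimulate would be fed back to the stimulator within the same processing cycle, so that stimulation parameters track the decoded percept in real time.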
Contact
Prof. Dr. Kristine Krug
Otto-von-Guericke-Universität Magdeburg
Faculty of Natural Sciences
Leipziger Str. 44
39120 Magdeburg
Tel.: +49 391 6755051