How do we recognise oriented features in the visual world?
The eye is not just a pixel camera. Visual information is detected by the retina, where it is already processed before it is transmitted to higher visual brain centers. This includes the extraction of salient features from visual scenes, such as motion directionality or contrast, through neurons belonging to distinct neural circuits. Some retinal neurons are tuned to the orientation of elongated visual stimuli. Such ‘orientation-selective’ neurons are present in the retinae of most, if not all, vertebrate species analyzed to date, with species-specific differences in frequency and degree of tuning.
We are interested in the development, assembly and function of retinal cells belonging to this orientation-selective circuit.
Type II (left) and type III (right) amacrine cells share a strong asymmetry in their dendritic arborisations, which is crucial for their function.
We are exploring the molecular, cellular and functional mechanisms that lead to this elongated structure.
Using two-photon calcium imaging, we are characterising the functional development of orientation-selective amacrine cells.
In collaboration with Prof. Kevin Briggman at the Max Planck Institute for Neurobiology of Behavior – caesar, we use correlative light and 3D electron microscopy to identify synaptically connected retinal neurons belonging to the orientation-selective circuit.
Our work will decipher the molecular and cellular mechanisms that build a functional visual system, enabling the extraction of specific features from the visual world.