Open-source repository containing neural network models of the retina. The models in this repository are inspired by and contain code adapted from sinzlab/neuralpredictors. Accompanying preprint: *openretina: Collaborative Retina Modelling Across Datasets and Species*.
openretina supports installation via pip:

```shell
# (Recommended) using a package manager like uv
uv pip install openretina

# Or directly via pip if you prefer
pip install openretina
```

If you want to train your own models, run Jupyter notebooks, contribute to the project, or modify the source code of openretina, we recommend installing from source.
Consider using uv, a fast and flexible project and package manager. If you are not familiar with uv, check out their simple quickstart guide.
```shell
git clone git@github.com:open-retina/open-retina.git
cd open-retina

# Sync with uv
uv sync --extra dev

# Alternatively, install in editable mode via pip
pip install -e ".[dev]"
```

Test openretina by downloading a model and running a forward pass:
```python
import torch

from openretina.models import load_core_readout_from_remote

# Download a pre-trained model and place it on the CPU
model = load_core_readout_from_remote("hoefling_2024_base_low_res", "cpu")

# Forward a random stimulus of 50 time steps through the model
responses = model.forward(torch.rand(model.stimulus_shape(time_steps=50)))
```

Before raising a PR, please run:
```shell
# Fix formatting of Python files
make fix-formatting

# Run type checks and unit tests
make test-all
```

With this repository we provide pre-trained retina models that can be used for inference and interpretability out of the box, as well as dataloaders and model architectures to train new models. For training new models, we rely on PyTorch Lightning in combination with Hydra to manage the configurations for training and dataloading.
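As a rough illustration of how Hydra-style configuration composition works, a training config might be organized along the following lines. Note that every key and value below is hypothetical and chosen for illustration only; consult the repository's actual config files for the real schema.

```yaml
# Hypothetical Hydra config sketch -- NOT the package's actual schema.
defaults:
  - model: core_readout        # which model architecture to compose in
  - data_io: hoefling_2024     # which dataloader group to compose in

trainer:
  max_epochs: 100
  accelerator: cpu

dataloader:
  batch_size: 32
```

Hydra merges such groups at launch time and lets you override any key from the command line, which is what makes it convenient for sweeping over models and datasets.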
The openretina package is structured as follows:
- `modules`: PyTorch modules that define layers and losses
- `models`: PyTorch Lightning models that can be trained and evaluated (i.e. models from specific papers)
- `data_io`: dataloaders to manage access to the data used for training
- `insilico`: methods to perform in silico experiments with the above models
  - `stimulus_optimization`: optimize inputs for neurons of the above models according to interpretable objectives (e.g. most exciting inputs)
  - future options: gradient analysis, data analysis
- `utils`: utility functions used across the above submodules
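To give a flavor of the stimulus-optimization idea, here is a minimal gradient-ascent sketch for a toy linear-nonlinear neuron. This is plain NumPy, not the openretina API; the filter, step size, and iteration count are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "neuron": a linear filter followed by a softplus nonlinearity.
w = rng.normal(size=100)

def response(x: np.ndarray) -> float:
    return float(np.log1p(np.exp(w @ x)))  # softplus(w . x)

def grad_response(x: np.ndarray) -> np.ndarray:
    # d/dx softplus(w . x) = sigmoid(w . x) * w
    return w / (1.0 + np.exp(-(w @ x)))

# Gradient ascent on the input under a fixed-norm constraint:
# the "most exciting input" (MEI) objective in its simplest form.
x = rng.normal(size=100)
x /= np.linalg.norm(x)
for _ in range(200):
    x = x + 0.1 * grad_response(x)
    x /= np.linalg.norm(x)  # project back onto the unit sphere

# For a linear-nonlinear neuron, the MEI converges to the filter direction,
# so the cosine similarity between x and w approaches 1.
alignment = float(w @ x) / np.linalg.norm(w)
```

The same projected-ascent loop carries over to deep models by replacing the hand-written gradient with autograd; the norm constraint is what keeps the optimized stimulus in a physically plausible range.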
openretina currently includes dataloaders and models for the following datasets:

- `hoefling_2024`: Originally published by Höfling et al. (2024), eLife
  - Paper: A chromatic feature detector in the retina signals visual context changes
  - Dataset originally deposited at: https://gin.g-node.org/eulerlab/rgc-natstim
- `karamanlis_2024`: Originally published by Karamanlis et al. (2024), Nature
  - Paper: Nonlinear receptive fields evoke redundant retinal coding of natural scenes
  - Dataset: Karamanlis D, Gollisch T (2023) Dataset - Marmoset and mouse retinal ganglion cell responses to natural stimuli and supporting data. G-Node. https://doi.org/10.12751/g-node.ejk8kx
- `maheswaranathan_2023`: Originally published by Maheswaranathan et al. (2023), Neuron
  - Paper: Interpreting the retinal neural code for natural scenes: From computations to neurons
  - Dataset: Maheswaranathan, N., McIntosh, L., Tanaka, H., Grant, S., Kastner, D., Melander, J., Nayebi, A., Brezovec, L., Wang, J., Ganguli, S., Baccus, S. (2023). Interpreting the retinal neural code for natural scenes: from computations to neurons. Stanford Digital Repository. Available at https://purl.stanford.edu/rk663dm5577
- `goldin_2022`: Originally published by Goldin et al. (2022), Nature Communications
  - Paper: Context-dependent selectivity to natural images in the retina
  - Dataset originally deposited at: https://zenodo.org/records/6868362
- `sridhar_2025`: Originally published by Sridhar et al. (2025), bioRxiv
  - Paper: Modeling spatial contrast sensitivity in responses of primate retinal ganglion cells to natural movies
  - Dataset: Sridhar S, Gollisch T (2025) Dataset - Marmoset retinal ganglion cell responses to naturalistic movies and spatiotemporal white noise. G-Node. https://doi.gin.g-node.org/10.12751/g-node.3dfiti/
  - Models: Models trained on this dataset were developed as part of *A systematic comparison of predictive models on the retina*
The paper *Most discriminative stimuli for functional cell type clustering* explains the discriminative stimulus objective we showcase in `notebooks/most_discriminative_stimulus`.
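As a rough illustration of such an objective (not the paper's actual formulation), consider a stimulus optimized to drive one toy cell type strongly while leaving another silent. For linear toy neurons the unit-norm maximizer even has a closed form; everything below is hypothetical and exists only to illustrate the idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear filters standing in for two "cell types" (illustrative only).
w_a = rng.normal(size=64)
w_b = rng.normal(size=64)

def discriminability(x: np.ndarray) -> float:
    # A crude contrast objective: respond strongly as type A, weakly as type B.
    return float(w_a @ x - w_b @ x)

# Because the objective is linear in x, the best unit-norm stimulus is simply
# the normalized difference of the two filters (by Cauchy-Schwarz).
x_star = (w_a - w_b) / np.linalg.norm(w_a - w_b)
```

For real nonlinear models there is no closed form, and the notebook instead optimizes the stimulus iteratively, analogous to the most-exciting-input procedure.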
