
Neural Morphology (Neuro-Morpho)

Neurons form complex dendritic arbors to integrate signals from many sources at once. The structure of a neuron is so essential to its function that classes of neuron can be identified by their structure alone. Additionally, the morphology of a neuron gives important insight into the mechanisms of nervous system development and dysfunction. Software that can accurately trace the structure of the dendritic arbor is therefore essential. Such software should not require human supervision, which is time-consuming and introduces biases and inconsistencies, and it should keep pace with modern imaging techniques that can rapidly generate large datasets. To address these issues, we propose developing open-source software based on convolutional neural networks (CNNs, specifically U-Net) to segment/skeletonize neural dendrites.

Installation instructions

For additional commands, see the Conda cheat sheet.

  • Download and install either miniconda or anaconda.
  • Create new environment: conda create -n <environment_name>
  • Activate/switch to new env: conda activate <environment_name>
  • Clone the repository: git clone git@github.com:ssec-jhu/neuro-morpho.git <repo_dir>
  • cd into repo_dir (neuro-morpho, if repo_dir wasn't set in previous command).
  • Install python and pip: conda install python=3.11 pip
  • Install all required dependencies (assuming local dev work). There are two ways to do this:
    • If working with tox (recommended): pip install -r requirements/dev.txt.
    • If you would like to set up an environment with all requirements to run outside of tox: pip install -r requirements/all.txt.
  • Build and install package in <environment_name> conda env: pip install .
  • Do the same but in dev/editable mode (changes to the repo will be reflected in the env installation upon python kernel restart): pip install -e . NOTE: This is the preferred installation method for dev work.
    • NOTE: If you didn't install dependencies from requirements/dev.txt, you can install a looser-constrained set of deps using: pip install -e .[dev].
    • NOTE: For GPU acceleration, PyTorch can be re-installed with its accelerator options; see the PyTorch installation docs. E.g., pip install --force -r requirements/pytorch.txt --index-url https://download.pytorch.org/whl/cu126. Since PyTorch is installed via requirements/prd.txt, --force or --upgrade must be used to re-install the accelerator versions. --force is preferable as it will error if the distribution is not available at the given index URL, whereas --upgrade may not.

Usage

Preprocessing

Run the neuro_morpho/notebooks/data_organizer.ipynb notebook to partition the data into three disjoint groups: training, validation, and test sets. The partition ratios are hardcoded in the notebook and are currently set to 60% of the data for training, 20% for validation, and 20% for testing.
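The partitioning logic can be sketched in plain Python (the split_dataset helper and file names below are illustrative, not the notebook's actual API):

```python
import random

def split_dataset(files, train_frac=0.6, val_frac=0.2, seed=0):
    """Shuffle file names and partition them into disjoint
    training/validation/test sets. The 60/20/20 defaults mirror
    the ratios hardcoded in the notebook."""
    files = list(files)
    random.Random(seed).shuffle(files)  # deterministic shuffle for reproducibility
    n_train = int(len(files) * train_frac)
    n_val = int(len(files) * val_frac)
    train = files[:n_train]
    val = files[n_train:n_train + n_val]
    test = files[n_train + n_val:]  # remainder goes to the test set
    return train, val, test
```

Changing train_frac/val_frac (or editing the hardcoded ratios in the notebook itself) adjusts the partition sizes; the three groups always remain disjoint and together cover the full dataset.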

Main pipeline

Pipeline configuration is maintained in the unet.config.gin file. Using the command line interface (i.e., from a terminal prompt):

```shell
python -m neuro_morpho.cli
```

This command runs the pipeline, which consists of four separate modules. Each module can be run on its own, or all four can be run one after another.
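Each stage is toggled and parameterized through the gin file. As a rough sketch (paths are placeholders, and only parameters mentioned in this README are shown; consult the shipped unet.config.gin for the full set):

```
run.training_x_dir = "/data/train/images"
run.training_y_dir = "/data/train/labels"
run.validating_x_dir = "/data/val/images"
run.validating_y_dir = "/data/val/labels"
run.get_threshold = True
run.testing_x_dir = "/data/test/images"
run.testing_y_dir = "/data/test/labels"
run.infer = True
run.logger = @CometLogger()
```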

Training

The relevant parameters in the config file are:

```
run.training_x_dir = "/Path/to/training/images"
run.training_y_dir = "/Path/to/training/labels"
run.validating_x_dir = "/Path/to/validation/images"
run.validating_y_dir = "/Path/to/validation/labels"
run.logger = @CometLogger()
```

Threshold calculation

The relevant parameter in the config file is: run.get_threshold = True
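The README does not spell out the criterion used to pick the threshold, but conceptually this step searches for the probability cutoff that best binarizes the model's output. A sketch of one common approach, maximizing F1 over candidate thresholds on validation data (all names and data below are illustrative, not the pipeline's actual implementation):

```python
def f1_at_threshold(probs, labels, t):
    """F1 score of the binarization (probs >= t) against 0/1 ground-truth labels."""
    preds = [1 if p >= t else 0 for p in probs]
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def best_threshold(probs, labels, candidates=None):
    """Return the candidate threshold with the highest F1 on validation data."""
    if candidates is None:
        candidates = [i / 100 for i in range(1, 100)]  # sweep 0.01 .. 0.99
    return max(candidates, key=lambda t: f1_at_threshold(probs, labels, t))
```

The chosen threshold is then reused unchanged for testing and inference, so that reported metrics reflect a fixed binarization rule rather than one tuned per image.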

Testing

The relevant parameters in the config file are:

```
run.testing_x_dir = "/Path/to/testing/images"
run.testing_y_dir = "/Path/to/testing/labels"
```

Image inference

The relevant parameters in the config file are run.infer = True, plus the same testing paths used in the Testing section above.

Using tox

  • Run tox: tox. This will run all of the linting, security, test, docs, and package-building steps within tox virtual environments.
  • To run an individual step, use tox -e {step} for example, tox -e test, tox -e build-docs, etc.

Typically, the CI tests run in GitHub Actions will use tox as described above. See also ci.yml.

Outside of tox:

The steps below assume you are running without tox and that all requirements are installed into a conda environment, e.g., with pip install -r requirements/all.txt.

NOTE: Tox will run these steps for you; the instructions below are specifically for cases where you need to set up an environment and run them outside the purview of tox.

Linting:

Checks for typos, syntax, style, and other simple code-analysis issues.

  • cd into repo dir.
  • Switch/activate correct environment: conda activate <environment_name>
  • Run ruff .
  • This can be run automatically (recommended for devs) every time you git push by installing the provided pre-push git hook available in ./githooks. Instructions are in that file; in short: cp ./githooks/pre-push .git/hooks/; chmod +x .git/hooks/pre-push.

Security Checks:

Checks for security concerns using Bandit.

  • cd into repo dir.
  • bandit --severity-level=medium -r neuro_morpho

Unit Tests:

Tests core package functionality at a modular level.

  • cd into repo dir.
  • Run all available tests: pytest .
  • Run specific test: pytest tests/test_util.py::test_base_dummy.

Regression tests:

Tests whether core data results differ during development.

  • WIP

Smoke Tests:

Tests at the application and infrastructure level.

  • WIP

Build Docs:

Builds the docs so they can be tested and viewed locally.

  • cd into repo dir.
  • pip install -r requirements/docs.txt
  • cd docs
  • make clean
  • make html
  • To view the docs in your default browser, run open docs/_build/html/index.html from the repo root.
