mgp

An educational NumPy-based autodiff engine and neural network library. See my blog post for a breakdown of the key logic.
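To give a flavor of what an autodiff engine does, here is a minimal sketch of a reverse-mode scalar engine in the spirit of micrograd. The `Value` class and its methods are illustrative only and are not mgp's actual API; see the blog post for the real implementation.

```python
# Minimal reverse-mode autodiff sketch (illustrative, not mgp's actual API).
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # closure that propagates out.grad to inputs
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run each node's _backward in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x, y = Value(3.0), Value(2.0)
z = x * y + x          # z = x*y + x, so dz/dx = y + 1 = 3, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```

The NumPy engine applies the same idea, but each node holds an array and the `_backward` closures use vectorized gradient rules.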

Installation

pip install microgradpp

How To Use It

The mgp.np module contains the primary NumPy engine with the code from the blog post, while mgp.vanilla contains a baseline scalar engine similar to micrograd.

  • See scripts/train_mnist.py, where we use mgp.np to train a convolutional neural network for MNIST image classification. With the stated hyperparameters, it achieves an accuracy of 0.97+.
  • scripts/train_xor.py runs MLPs built on both the vanilla and NumPy-based engines on the XOR task. The NumPy-based engine trains several times faster.
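For context, the XOR task can be solved by a small MLP trained with plain NumPy. The sketch below is a hedged stand-in for what `scripts/train_xor.py` does through the library; it hand-derives the gradients rather than using mgp, and the layer sizes and hyperparameters are illustrative.

```python
import numpy as np

# Tiny NumPy MLP on XOR: 2 -> 8 (tanh) -> 1 (sigmoid), binary cross-entropy.
# Gradients are derived by hand here; mgp's engine computes them automatically.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    # backward pass (gradient of mean binary cross-entropy w.r.t. logits)
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = dlogits @ W2.T
    dz1 = dh * (1.0 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print((p > 0.5).astype(int).ravel())
```

With an autodiff engine, the entire backward-pass section collapses to a single `loss.backward()` call, which is the point of the library.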

In general, I tried to keep the API as close to PyTorch's as possible while keeping the implementation simple.