Simple MNIST Neural Network from Scratch

In this notebook, I implemented a simple two-layer neural network and trained it on the MNIST digit recognizer dataset.

The goal of this project is not state-of-the-art accuracy, but to serve as an instructional example to understand the underlying math of neural networks — forward propagation, backward propagation, and gradient descent.

Architecture

  • Input layer: 784 units (28x28 pixels)
  • Hidden layer: 10 units with ReLU activation
  • Output layer: 10 units with Softmax activation
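The parameter shapes implied by this architecture can be sketched as follows (a minimal sketch; the variable names and small-random initialization are my choices, not necessarily those in the notebook):

```python
import numpy as np

# Hypothetical initialization matching the architecture above:
# small random weights, zero biases.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((10, 784)) * 0.01  # hidden layer: 784 inputs -> 10 units
b1 = np.zeros((10, 1))
W2 = rng.standard_normal((10, 10)) * 0.01   # output layer: 10 units -> 10 classes
b2 = np.zeros((10, 1))
```

With inputs stacked as columns of a `(784, m)` matrix, each layer is then a matrix multiply plus a broadcast bias.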

What you'll learn

  • Forward propagation equations
  • Backward propagation (gradients for W and b)
  • Parameter updates using gradient descent
  • How accuracy improves during training
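The three steps above (forward pass, backward pass, parameter update) can be sketched in NumPy roughly as below. This is a generic implementation of the standard equations for a ReLU + softmax network with cross-entropy loss, not a copy of the notebook's code; function and variable names are mine:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0)

def softmax(z):
    # Subtract the column max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def forward(W1, b1, W2, b2, X):
    """Forward propagation. X has shape (784, m)."""
    Z1 = W1 @ X + b1
    A1 = relu(Z1)
    Z2 = W2 @ A1 + b2
    A2 = softmax(Z2)          # class probabilities, shape (10, m)
    return Z1, A1, Z2, A2

def backward(Z1, A1, A2, W2, X, Y_onehot):
    """Gradients of the cross-entropy loss w.r.t. W1, b1, W2, b2."""
    m = X.shape[1]
    dZ2 = A2 - Y_onehot                     # softmax + cross-entropy gradient
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)           # ReLU derivative is 1 where Z1 > 0
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2

def update(W1, b1, W2, b2, dW1, db1, dW2, db2, alpha):
    """One gradient-descent step with learning rate alpha."""
    return (W1 - alpha * dW1, b1 - alpha * db1,
            W2 - alpha * dW2, b2 - alpha * db2)
```

Repeating forward → backward → update over the training set is what drives the accuracy improvement tracked during training.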

Dataset

  • MNIST handwritten digits dataset (0–9)
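To compare the softmax output against the integer labels 0–9, the labels are typically one-hot encoded. A minimal sketch (the helper name is mine):

```python
import numpy as np

def one_hot(Y, num_classes=10):
    """Convert integer labels of shape (m,) into a (num_classes, m) matrix
    with a single 1 in each column at the row given by the label."""
    out = np.zeros((num_classes, Y.size))
    out[Y, np.arange(Y.size)] = 1
    return out
```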

How to run

Clone this repo and open the notebook:

git clone <your-repo-link>
cd <repo-folder>
jupyter notebook
