Neural Nets From Scratch

This repository implements a simple feedforward neural network framework in Python using only NumPy. It is designed for educational purposes, demonstrating the core mechanics of neural networks, including forward and backward propagation, custom activation functions, and training on the MNIST dataset.

Features

  • Modular layer and activation function design
  • Custom implementations of Sigmoid, Softmax, and ReLU activations
  • Batch training with backpropagation
  • Model saving and loading via pickle
  • One-hot encoding and normalization for MNIST data
  • Training and evaluation with cross-tabulation of predictions
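As a taste of what the custom activations involve, here is a minimal NumPy sketch of the three functions the framework implements (the repository's own versions live in activation_function.py and may differ in detail):

```python
import numpy as np

def sigmoid(x):
    # Logistic function, squashes inputs into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the row-wise max before exponentiating for numerical stability;
    # each row of the result sums to 1.
    shifted = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=-1, keepdims=True)
```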

Getting Started

Prerequisites

  • Python 3.12+
  • Poetry for dependency management

Installation

  1. Clone the repository:
git clone https://github.com/yourusername/neural-nets-from-scratch.git
cd neural-nets-from-scratch
  2. Install the dependencies:
poetry install
  3. Download the MNIST CSV files and place them in the mnist-in-csv directory. The dataset in CSV format can be obtained, for example, from Kaggle.
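The Kaggle CSV files put the label in the first column followed by the pixel values. A minimal sketch of the normalization and one-hot encoding step, using a tiny in-memory stand-in for the CSV (only 4 pixel columns instead of 784, to keep the example short):

```python
import io
import numpy as np

# Stand-in for the Kaggle MNIST CSV: label first, then pixel values 0-255.
csv_text = "5,0,128,255,64\n0,255,0,0,32\n"

raw = np.loadtxt(io.StringIO(csv_text), delimiter=",")
labels = raw[:, 0].astype(int)
pixels = raw[:, 1:] / 255.0            # normalize pixel values to [0, 1]

# One-hot encode the labels into a (n_samples, 10) matrix.
one_hot = np.zeros((labels.size, 10))
one_hot[np.arange(labels.size), labels] = 1.0
```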

Usage

Train and evaluate the model on MNIST:

poetry run python neural_nets_from_scratch/main.py

This will:

  • Load and preprocess the MNIST data
  • Train a neural network with two hidden layers
  • Save the trained model to model.pickle
  • Print a confusion matrix of predictions vs. actual labels
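The final cross-tabulation can be computed with a few lines of NumPy; a sketch of the idea (the script's own output format may differ):

```python
import numpy as np

def confusion_matrix(actual, predicted, n_classes=10):
    # Cross-tabulate predictions: rows are actual digits, columns are
    # predicted digits, so correct predictions land on the diagonal.
    m = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(m, (actual, predicted), 1)
    return m

m = confusion_matrix(np.array([0, 1, 1]), np.array([0, 1, 0]))
```

Here `m[1, 0] == 1` records the one sample whose true label was 1 but was predicted as 0.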

Customization

  • Modify main.py to change the network architecture, learning rate, batch size, or number of epochs.
  • Implement additional activation functions or layers in activation_function.py and layer.py.
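For example, a new activation might look like the sketch below. The base-class interface here is hypothetical — a stand-in for whatever abstract methods activation_function.py actually defines — so adapt the method names to the real class:

```python
import numpy as np

# Hypothetical stand-in for the abstract base in activation_function.py;
# the real interface may use different method names.
class ActivationFunction:
    def forward(self, x):
        raise NotImplementedError
    def backward(self, x):
        raise NotImplementedError

class LeakyReLU(ActivationFunction):
    """Example of a new activation: leaky ReLU with negative slope alpha."""
    def __init__(self, alpha=0.01):
        self.alpha = alpha

    def forward(self, x):
        return np.where(x > 0, x, self.alpha * x)

    def backward(self, x):
        # Derivative with respect to the pre-activation input.
        return np.where(x > 0, 1.0, self.alpha)
```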

Code Overview

  • DenseLayer: Implements a fully connected layer with customizable activation.
  • SequentialModel: Manages layers, training, forward/backward passes, and model persistence.
  • ActivationFunction: Abstract base for activation functions; includes Sigmoid, Softmax, and ReLU.
  • utils.py: Contains data shuffling utilities for batch training.
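The core mechanics these classes implement can be sketched in a few lines of NumPy — an affine forward pass, gradients for the backward pass, and an SGD update. This is an illustrative sketch, not the repository's actual code; shapes, names, and the update rule may differ:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, size=(4, 3))   # weights: 4 inputs -> 3 outputs
b = np.zeros(3)

x = rng.normal(size=(8, 4))           # a batch of 8 samples

# Forward pass: affine transform (an activation would follow).
z = x @ W + b

# Backward pass: given dL/dz from the next layer, compute parameter
# gradients and the gradient to pass to the previous layer.
dz = rng.normal(size=z.shape)
dW = x.T @ dz / len(x)                # averaged over the batch
db = dz.mean(axis=0)
dx = dz @ W.T

# SGD update with learning rate lr.
lr = 0.1
W -= lr * dW
b -= lr * db
```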

