A from-scratch implementation of a flexible neural network using only NumPy. This project demonstrates the inner workings of neural networks without relying on high-level deep learning libraries.
- Configurable multi-layer neural network
- Support for multiple activation functions (ReLU, Sigmoid; sketched below)
- Implementation of the backpropagation algorithm (see the training-loop sketch below)
- Choice of optimizers (SGD, Adam; update rules sketched below)
- Training script with data generation and visualization
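As a rough illustration of the activation-function support, here is a minimal sketch of ReLU and Sigmoid with their derivatives. The function names (`relu_grad`, `sigmoid_grad`) are illustrative, not necessarily the project's actual API:

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: max(0, x)
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU w.r.t. its input: 1 where x > 0, else 0
    return (x > 0).astype(float)

def sigmoid(x):
    # Logistic sigmoid; may emit an overflow warning for large negative x,
    # but NumPy still returns the correct limiting value (0.0)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative expressed through the forward value: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)
```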
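To show how backpropagation fits together with these activations, the following is a hedged sketch of a tiny two-layer network trained by hand-derived backprop on XOR-style toy data. The architecture, variable names, and hyperparameters are assumptions chosen for demonstration, not the repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 2 features; XOR-style binary target
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights for a 2 -> 4 -> 1 network
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros((1, 1))
lr = 0.5

for step in range(5000):
    # Forward pass
    z1 = X @ W1 + b1
    a1 = np.maximum(0.0, z1)           # ReLU hidden layer
    z2 = a1 @ W2 + b2
    a2 = 1.0 / (1.0 + np.exp(-z2))     # Sigmoid output

    # Backward pass for mean squared error L = mean((a2 - y)^2),
    # applying the chain rule layer by layer
    dz2 = (a2 - y) * a2 * (1 - a2) * (2 / len(X))  # dL/dz2
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0, keepdims=True)
    dz1 = (dz2 @ W2.T) * (z1 > 0)                  # through the ReLU
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # Plain SGD parameter update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Predictions should move toward [0, 1, 1, 0]; exact convergence
# depends on the random initialization
print("final predictions:", a2.ravel().round(3))
```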
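Finally, a sketch of the two optimizer update rules, assuming a simple `update(params, grads)` interface; the class names and that interface are hypothetical stand-ins for whatever the project exposes:

```python
import numpy as np

class SGD:
    """Vanilla stochastic gradient descent: p <- p - lr * g."""
    def __init__(self, lr=0.01):
        self.lr = lr

    def update(self, params, grads):
        for p, g in zip(params, grads):
            p -= self.lr * g  # in-place update of each parameter array

class Adam:
    """Adam: per-parameter adaptive learning rates with bias correction."""
    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m, self.v, self.t = None, None, 0

    def update(self, params, grads):
        if self.m is None:
            # Lazily allocate first/second moment estimates
            self.m = [np.zeros_like(p) for p in params]
            self.v = [np.zeros_like(p) for p in params]
        self.t += 1
        for i, (p, g) in enumerate(zip(params, grads)):
            # Exponential moving averages of the gradient and its square
            self.m[i] = self.beta1 * self.m[i] + (1 - self.beta1) * g
            self.v[i] = self.beta2 * self.v[i] + (1 - self.beta2) * g**2
            # Bias-corrected estimates, then the Adam step
            m_hat = self.m[i] / (1 - self.beta1**self.t)
            v_hat = self.v[i] / (1 - self.beta2**self.t)
            p -= self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

With an interface like this, switching optimizers only means swapping which object's `update` is called inside the training loop.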