NumPy_ML is a personal education & exploration project that implements a grammar of deep-learning tools.
- Built From Scratch 🐥🥚
- 100% NumPy 💯🐍
- Extensible architecture with Auto-Grad 🗻🗺️
- Easy & Modular 🥥🌴
- Expressive, type-hinted, human code 🧼🛀
Please see the MNIST demo as a 'Tutorial by Example' 🧐🧮
The code is useful for ML students who want to see essential ANN algorithms implemented from scratch ;) 🍿🍿🍿
Algorithms such as:
- 2D Convolution (Numba accelerated)
- Softmax
- Backpropagation / Chain Rule
- Adam Optimisation
- Auto-Initialisation (e.g. ReLU -> Kaiming; Softmax -> Xavier)
- Cross-Entropy Loss
- Confusion Matrix
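As a taste of what the list above covers, here is a minimal NumPy sketch of a numerically stable softmax followed by cross-entropy loss. Function names and signatures are my own illustration, not this project's API:

```python
import numpy as np

def softmax(z: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)  # shift logits so exp() cannot overflow
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean negative log-likelihood of the true classes."""
    n = labels.shape[0]
    eps = np.float32(1e-7)  # guard against log(0)
    return float(-np.log(probs[np.arange(n), labels] + eps).mean())

logits = np.array([[2.0, 1.0, 0.1]], dtype=np.float32)
p = softmax(logits)
loss = cross_entropy(p, np.array([0]))
```

A nice property exploited during backpropagation: when softmax and cross-entropy are fused, the gradient with respect to the logits reduces to `probs - one_hot(labels)`, which is why the two usually appear together in the output layer.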
Network operations are float32-based. The design supports quick prototyping & deployment of small networks of up to roughly 12-16 layers, unless you're good at keeping gradients alive (which may require a non-sequential architecture). 🐙
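The auto-initialisation rules listed earlier (ReLU → Kaiming; Softmax → Xavier) are one of the main tools for keeping gradients alive at depth. A minimal sketch of the two schemes in plain NumPy, assuming a dense-layer weight shape of `(fan_in, fan_out)` (function names are my own illustration, not this project's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def kaiming_normal(fan_in: int, fan_out: int) -> np.ndarray:
    """He initialisation for ReLU layers: std = sqrt(2 / fan_in)."""
    w = rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)
    return w.astype(np.float32)  # match the project's float32 convention

def xavier_uniform(fan_in: int, fan_out: int) -> np.ndarray:
    """Glorot initialisation for softmax/tanh layers: U(-limit, limit)."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    w = rng.uniform(-limit, limit, (fan_in, fan_out))
    return w.astype(np.float32)

W1 = kaiming_normal(512, 256)   # hidden ReLU layer
W2 = xavier_uniform(256, 10)    # softmax output layer
```

The idea in both cases is the same: scale the initial weights so the variance of activations (and gradients) stays roughly constant from layer to layer, rather than shrinking or exploding with depth.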
The full project was originally intended for non-linear Deep Reinforcement-Learning workflows, hence it's a bit over-engineered for its current capabilities, but the essential roadmap is all laid out if I ever wish to revisit this old project. There are, of course, many more features I wish I had added. Please see pyproject.toml for build requirements. Note that Torchvision is a dependency of this project only for convenience and reproducibility in fetching the MNIST dataset for the classification demo.
All work within this repository is licensed under Attribution-NonCommercial-ShareAlike 4.0 International
(see license.md or visit https://creativecommons.org/licenses/by-nc-sa/4.0/)
Thanks for reading❗️😄
If my work was useful in any way, please support it with a star ⭐️👍