autograd-light

A lightweight implementation of PyTorch's autograd, for educational purposes. Inspired by Karpathy and Hotz.

It includes enough operations to create and train a fully connected neural network from nothing but NumPy. It uses automatic differentiation, the principle underlying all widely used deep learning frameworks today.
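The snippet below is not this repo's Tensor implementation, just a minimal scalar sketch (in the spirit of micrograd) of what automatic differentiation means here: each operation records its inputs and a local gradient rule, and backward() replays the chain rule over the recorded graph in reverse.

# Minimal scalar sketch of reverse-mode autodiff -- illustrative only,
# the names and API here are not this repo's Tensor class.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = lambda: None

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            # local gradients: d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = backward_fn
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward_fn()

x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0, i.e. d(xy+x)/dx = y+1 and d(xy+x)/dy = x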

Builds upon karpathy/micrograd. See geohot/tinygrad for how this could be extended further, and then of course PyTorch itself.

Installation

pip install autograd

Tests require Python 3.8 (walrus operator).

Example

from autograd.tensor import Tensor
from autograd.other import SGD, NLLLoss, fetch_mnist, init_layer

class Net:
    def __init__(self):
        # two fully connected layers: 28*28 = 784 -> 800 -> 10
        self.l1 = Tensor(init_layer(28*28, 800))
        self.l2 = Tensor(init_layer(800, 10))

    def forward(self, x):
        # linear -> ReLU -> linear -> log-softmax over the 10 classes
        return x.dot(self.l1).relu().dot(self.l2).log_softmax()

    def parameters(self):
        return [self.l1, self.l2]

net = Net()
optim = SGD(net.parameters(), lr=1e-7)
loss_fn = NLLLoss
BS = 128

See test_net.py for the full training loop; it achieves ~94% accuracy on MNIST.
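The sketch below only shows how the pieces above might fit together; the exact signatures of fetch_mnist, NLLLoss, zero_grad, backward and step are assumptions rather than this repo's documented API, so defer to test_net.py for the real loop.

import numpy as np

# Assumed: fetch_mnist returns train/test arrays; Tensor supports backward();
# SGD exposes zero_grad() and step(). Check test_net.py for the actual API.
X_train, Y_train, X_test, Y_test = fetch_mnist()

for step in range(1000):
    # sample a random mini-batch of BS flattened 28x28 images
    idx = np.random.randint(0, X_train.shape[0], size=BS)
    x = Tensor(X_train[idx].reshape(-1, 28*28))
    y = Y_train[idx]

    out = net.forward(x)    # log-probabilities, shape (BS, 10)
    loss = loss_fn(out, y)  # negative log-likelihood of the target classes

    optim.zero_grad()       # clear gradients from the previous step
    loss.backward()         # reverse-mode autodiff through the graph
    optim.step()            # SGD update of l1 and l2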
