This blog post is pretty cool
It builds a GPT in 243 lines of Python, including the autograd engine.
It learns baby names and then creates new ones.
Q: How is this different from a simple transition matrix that builds probabilities from word transitions? Is it the token size, or how it builds patterns?
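For contrast, here is a minimal sketch of the transition-matrix baseline the question refers to: a character-level bigram model over names. The name list below is illustrative, not the post's dataset. The key difference is that a bigram matrix conditions each prediction on only the previous character, while a GPT's attention lets every prediction condition on the entire preceding context:

```python
import random
from collections import defaultdict

# Illustrative tiny training set (the blog post uses a real baby-names dataset).
names = ["emma", "olivia", "ava", "isabella", "sophia", "mia", "amelia"]

# Build bigram transition counts: how often char b follows char a.
# "." marks both the start and end of a name.
counts = defaultdict(lambda: defaultdict(int))
for name in names:
    chars = ["."] + list(name) + ["."]
    for a, b in zip(chars, chars[1:]):
        counts[a][b] += 1

def sample_name(rng):
    """Generate a name by walking the transition matrix one char at a time.

    Note the limitation: each step looks at exactly one previous character,
    so long-range structure (syllable patterns, typical name length) is lost.
    A GPT conditions on all previous characters at once via attention.
    """
    out, ch = [], "."
    while True:
        choices = list(counts[ch].keys())
        weights = [counts[ch][c] for c in choices]
        ch = rng.choices(choices, weights=weights)[0]
        if ch == ".":          # end-of-name marker sampled
            return "".join(out)
        out.append(ch)

rng = random.Random(0)
print(sample_name(rng))
```

Even this toy version generates plausible-looking names, which is why the question is a good one; the answer is that the transformer's advantage only shows up once predictions need to depend on more than the single previous token.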
I think this could form the foundation of an interesting lecture using python.