Leangrad is a Lean 4 neural network library inspired by Karpathy's Micrograd.
- Automatic Differentiation Engine: Implementation of reverse-mode autodiff with a computation graph
- Basic Operations: Support for addition, multiplication, power operations, and ReLU activation
- Neural Network Components:
  - Single neurons with customizable activation
  - Neural network layers
  - Multi-layer perceptrons (MLPs)
- Module System: Type class-based module system for parameter management and gradient zeroing
The engine provides the fundamental autodiff functionality:
- `Value`: Core data structure representing a node in the computation graph
- Basic operations: `add`, `mul`, `pow`, `relu`
- Automatic gradient computation via `backward`
- Operator overloads for natural mathematical syntax
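As a sketch of how these engine pieces fit together (the exact signatures of `mkValue`, `add`, `mul`, and `backward` are assumed here, based on the MLP example below), a minimal autodiff session might look like:

```lean
import Leangrad.Engine

open Leangrad.Engine Leangrad.Engine.Ops

-- Minimal sketch: build c = a * b + a, then run reverse-mode autodiff.
-- Assumes mkValue, add, and mul run in IO, as in the MLP example below.
def demo : IO Unit := do
  let a  ← mkValue 2
  let b  ← mkValue (-3)
  let ab ← mul a b        -- intermediate node a * b
  let c  ← add ab a       -- output node c = a * b + a
  -- Reverse-mode pass from c: by the chain rule this should
  -- accumulate ∂c/∂a = b + 1 = -2 and ∂c/∂b = a = 2
  backward c
```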
The neural network module provides:
- `Neuron`: Single neuron implementation with weights, bias, and optional nonlinearity
- `Layer`: Collection of neurons
- `MLP`: Multi-layer perceptron implementation
- `Module` typeclass for parameter management
The following example creates an MLP with 3 inputs, two hidden layers of 4 neurons each, and a single output, then computes gradients via backpropagation:
```lean
import Leangrad.Engine
import Leangrad.NN
import Std

open Std Leangrad.Engine Leangrad.Engine.Ops Leangrad.NN

def main : IO Unit := do
  -- Create an MLP with 3 inputs and [4, 4, 1] architecture
  let mlp ← mkMLP 3 [4, 4, 1]
  -- Create input values
  let x1 ← mkValue 2
  let x2 ← mkValue (-3)
  let x3 ← mkValue 10
  let input := #[x1, x2, x3]
  -- Forward pass
  let out ← mlpForward mlp input
  -- Backward pass (on the first output)
  match out[0]? with
  | some firstOut => backward firstOut
  | none => pure ()
```