# kolmogorov-arnold-networks

Here are 38 public repositories matching this topic...

An implementation of the KAN architecture, which uses learnable activation functions, applied to knowledge distillation on the MNIST handwritten-digits dataset. The project distills a three-layer teacher KAN into a more compact two-layer student KAN and compares the distilled student's performance against a non-distilled baseline.

  • Updated May 11, 2024
  • Python
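The distillation setup described above is typically driven by a loss that blends hard-label cross-entropy with a temperature-softened KL term against the teacher's outputs. The sketch below is illustrative only, not taken from the repository: the function name, the temperature `T`, and the blend weight `alpha` are all assumptions, and the KAN-specific layers are abstracted away behind plain logit arrays.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hypothetical KD objective (names and defaults are assumptions):
    alpha * hard-label cross-entropy
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
    The T^2 factor keeps the soft-target gradient scale comparable
    across temperatures."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    ).mean()
    hard_probs = softmax(student_logits)[np.arange(len(labels)), labels]
    hard = -np.log(hard_probs + 1e-12).mean()
    return alpha * hard + (1 - alpha) * (T ** 2) * kl
```

In a setup like the repo describes, `teacher_logits` would come from the frozen three-layer teacher and `student_logits` from the two-layer student being trained.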
