Visual Pokedex for LLMs
- Go to the live demo with preprocessed models.
- Run your own models in Google Colab.
- After this, just run `npm install` and `npm run dev` in the `dex` folder. This should start the application.
- For deploying to a remote server, use `npm run build`.
- Model repos on Hugging Face contain metadata files, and the transformers repo contains the classes that are supposed to handle each model. We extract that information and save it in assets so that all of this can run in the browser. We use the torch `meta` device to avoid loading weights into memory.
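The `meta` device trick above can be sketched as follows: modules instantiated under `torch.device("meta")` record only shapes and dtypes, so architecture metadata (layer names, parameter counts) can be extracted without allocating any weight memory. This is a minimal illustration, not the project's actual extraction code.

```python
import torch

# Instantiate a module on the "meta" device: no real weight storage
# is allocated, only shape/dtype metadata is tracked.
with torch.device("meta"):
    layer = torch.nn.Linear(4096, 4096)

# Parameters exist structurally but hold no data.
print(layer.weight.device)   # meta
print(layer.weight.shape)    # torch.Size([4096, 4096])

# Parameter counts can still be computed purely from metadata.
n_params = sum(p.numel() for p in layer.parameters())
print(n_params)              # 4096*4096 weights + 4096 bias
```

The same pattern applies to a full model built from its config, which is what makes browser-ready metadata extraction cheap even for multi-billion-parameter checkpoints.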
- When used from inside a notebook, we do not write any files; we keep the processed data in memory and pass that information as a query string to the iframe.
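The in-memory hand-off above amounts to serializing the processed data and URL-encoding it into the iframe `src`. A hedged sketch (the field names, port, and payload are illustrative, not the project's actual schema):

```python
import json
from urllib.parse import urlencode, parse_qs

# Hypothetical processed metadata kept in memory (illustrative fields).
metadata = {"model": "gpt2", "n_layers": 12, "n_params": 124_000_000}

# Serialize to JSON and embed it as a query string on the iframe src.
query = urlencode({"data": json.dumps(metadata)})
iframe_src = f"http://localhost:5173/?{query}"

# The browser side can recover the exact payload from the query string.
recovered = json.loads(parse_qs(query)["data"][0])
assert recovered == metadata
```

Keeping everything in the URL means the iframe page needs no server round-trip or file I/O to render, at the cost of browser URL-length limits for very large payloads.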
```bibtex
@article{attentionmech2025dex,
  title={dex: interactive visual explorer for open source LLMs},
  author={attentionmech},
  year={2025}
}
```
This is the post where it all started, and this is where it went viral.

