A fancy name for my personal AI experiments.
Welcome to the Pomilon Intelligence Lab. Despite the official-sounding name, this isn't a company or a research team. It's just me (@pomilon) organizing my chaotic "learning-by-doing" projects so they don't clutter up my personal profile.
I'm a university student and hobbyist exploring how LLMs work under the hood. I run most of my code on my laptop or local Linux servers, trying to squeeze performance out of consumer hardware.
I use this organization to host:
- Source code for experimental architectures I'm testing.
- Training pipelines and inference scripts.
- Weird proofs-of-concept that might break at any moment.
Note: The heavy model weights for these projects are hosted over at my Hugging Face profile.
| Project | Type | Description | Links |
|---|---|---|---|
| Aetheris | LLM | A 294M-parameter hybrid Mamba-MoE model. My attempt at smashing two architectures together to see what happens. | Weights (HF) |
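
If you just want to poke at Aetheris, here's a rough sketch of what loading it from Hugging Face with the `transformers` library might look like. The repo id below is a placeholder (check my Hugging Face profile for the real one), and a custom hybrid architecture like this will typically need `trust_remote_code=True` so the library can pull its modeling code along with the weights.

```python
# Rough sketch only -- the repo id is a placeholder, not the actual repo name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "pomilon/aetheris-294m"  # placeholder: see my Hugging Face profile

# trust_remote_code=True lets transformers load the custom Mamba-MoE modeling code
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# Quick generation smoke test
prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 294M parameters the model should fit comfortably in RAM on an ordinary laptop, which is the whole point of these consumer-hardware experiments.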
(There are other repositories here, but they are currently private/classified pending further stability tests. 🤫)
Built with ❤️ (and a lot of coffee) by Pomilon.