The world's slowest 3D Gaussian Splatting renderer... built that way on purpose.
splaty is a deliberately slow, crystal-clear Python implementation of a 3D Gaussian Splatting renderer designed for learning. No GPU acceleration. No CUDA kernels. No performance tricks obscuring the algorithm. Just pure Python and PyTorch, making every single step explicit and understandable.
This isn't about not knowing how to optimize—it's about recognizing that teaching tools have different requirements than production code. When you're trying to understand a complex algorithm, the last thing you need is layers of optimization obscuring the core ideas. Sometimes the best way to learn is to slow down and see every step clearly.
This repository is the official companion to my Medium series on building a 3D Gaussian Splatting renderer from scratch:
| Part | Title | Link |
|---|---|---|
| Part 1 | I Built the Slowest 3D Gaussian Splatting Renderer... On Purpose | Blog Post |
| Part 2 | Circles Are Not Gaussians (But Let's Pretend They Are) | Blog Post |
| Part 3 | Splat Your Own Gaussians: Covariance, Ellipses, and Real 2D Projection | Blog Post |
| Part 4 | The Tricks That Make Production 3DGS Fast (Even If Ours Isn't) | 🔜 coming soon |
Each blog post walks through the theory and implementation details for one stage of the renderer. The code here implements everything discussed in the series.
Most 3D Gaussian Splatting implementations are optimized for performance, which can make them harder to learn from. The math is often in CUDA kernels, operations are batched and parallelized, and you need to know what you're looking for.
splaty takes a different approach: clarity over speed. Operations are explicit, transformations are visible, and the blog series walks through the implementation step by step.
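To give a flavor of what "explicit" means here, the following sketch (pure Python with illustrative names, not code from this repository) splats a single 2D Gaussian the naive way: visit every pixel, evaluate the Gaussian density there, and alpha-blend the color in.

```python
import math

def splat_gaussian(image, mean, cov, color, alpha):
    """Naively splat one 2D Gaussian onto an image: visit every pixel,
    evaluate the Gaussian density there, and alpha-blend the color in.
    This is O(pixels x gaussians) -- exactly the kind of loop a teaching
    renderer keeps visible instead of hiding in a CUDA kernel."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # inverse covariance (the "conic")
    for y in range(len(image)):
        for x in range(len(image[0])):
            dx, dy = x - mean[0], y - mean[1]
            # squared Mahalanobis distance under the inverse covariance
            power = -0.5 * (inv[0][0] * dx * dx + 2 * inv[0][1] * dx * dy + inv[1][1] * dy * dy)
            weight = alpha * math.exp(power)
            # alpha-blend against whatever is already in the pixel
            image[y][x] = [(1 - weight) * p + weight * c_ for p, c_ in zip(image[y][x], color)]
    return image

# splat one red Gaussian onto a tiny black 9x9 canvas
canvas = [[[0.0, 0.0, 0.0] for _ in range(9)] for _ in range(9)]
splat_gaussian(canvas, mean=(4.0, 4.0), cov=[[2.0, 0.0], [0.0, 2.0]], color=[1.0, 0.0, 0.0], alpha=0.8)
```

Production renderers cull pixels far from each Gaussian and fuse these loops into batched GPU kernels; keeping the double loop visible is the whole point here.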
This project uses pixi for environment management and task running to keep things simple. Install pixi following the official instructions.
Note that this repo uses git submodules (for gsplat integration), so clone with:

```shell
git clone --recurse-submodules https://github.com/sascha-kirch/splaty.git
cd splaty
```

> [!TIP]
> If you already cloned without `--recurse-submodules`, run:
>
> ```shell
> git submodule update --init --recursive
> ```

Install gsplat and its dependencies (including a patch to dump scene normalization info):

```shell
pixi run gsplat-setup
```

> [!WARNING]
> Make sure you have initialized the recursive submodules before running the gsplat setup, or it will fail.
Download the MipNeRF-360 dataset (the bonsai scene used throughout the series):

```shell
pixi run gsplat-download-data
```

Train a 3D Gaussian Splatting scene that you can then render with splaty:

```shell
# Train on the bonsai scene (data factor 4, 30k iterations)
pixi run 3dgs-bonsai

# Visualize the trained model interactively (optional)
pixi run visualize-bonsai
```

> [!WARNING]
> To train a scene with gsplat, you need a CUDA-capable GPU with sufficient memory. You can adjust the `--data_factor` and `--num_iterations` parameters in the task definition to fit your hardware.

> [!TIP]
> Tasks are pre-defined for three scenes, so you can choose between bonsai, kitchen, and bicycle, or use them as templates to define your own. List the available tasks with `pixi task list -s` or look them up directly in `pyproject.toml`.
That's it! You're ready to render.
The repository implements six progressive stages of rendering. Use these pre-configured tasks to render each stage and see the results:
| Example Output | Command | Blog Post |
|---|---|---|
| **Stage 1**<br>Means colored by depth ![]() | `pixi run render-means-full` | Part 1 |
| **Stage 2**<br>Circles colored by depth ![]() | `pixi run render-circles-full` | Part 2 |
| **Stage 3**<br>Circles with spherical harmonics ![]() | `pixi run render-circles-sh-full` | Part 2 |
| **Stage 4**<br>Circles with SH and alpha blending ![]() | `pixi run render-circles-alpha-full` | Part 2 |
| **Stage 5**<br>Gaussian splats with full rendering ![]() | `pixi run render-splats-alpha-quarter` | Part 3 |
| **Stage 6a**<br>Splats with transmittance/conics ![]() | `pixi run render-splats-transmittance-quarter` | 🔜 Part 4: coming soon |
| **Stage 6b**<br>Splats with tiling (very slow!) ![]() | `pixi run render-splats-tiled-quarter` | 🔜 Part 4: coming soon |
> [!WARNING]
> When I said this renderer is slow, I meant it! Stages 5-6 can take several hours, depending on how many Gaussians the scene has and at which resolution you render. Grab a coffee and watch the progress.
> [!TIP]
> **Custom Configs:** The pre-defined tasks use the default config from `render_image.py`. To use a different config, simply append `--config-name <NAME>`, e.g. `pixi run render-means-full --config-name kitchen`.
Outputs are saved to `./outputs/`.

To render from different viewpoints, edit the config files in `config/` (e.g., `config/bonsai.yaml`).
You can either:
- Define custom camera parameters manually
- Extract camera parameters from COLMAP training data:

  ```shell
  pixi run colmap-convert-bonsai
  ```

This converts the COLMAP binary model to text format in `data/360_v2/bonsai/sparse/`. The text files include headers explaining each parameter. See the COLMAP documentation for details.
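For reference, each row of COLMAP's `images.txt` stores a world-to-camera pose as a quaternion `QW QX QY QZ` plus a translation `TX TY TZ`. If you want to derive a viewpoint by hand, the sketch below (illustrative helper names, not code from this repository) shows how those values map to a camera position in world space:

```python
def qvec_to_rotmat(qw, qx, qy, qz):
    """Quaternion (COLMAP order: QW QX QY QZ) to a 3x3 rotation matrix."""
    return [
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qw * qz),     2 * (qx * qz + qw * qy)],
        [2 * (qx * qy + qw * qz),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qw * qx)],
        [2 * (qx * qz - qw * qy),     2 * (qy * qz + qw * qx),     1 - 2 * (qx * qx + qy * qy)],
    ]

def camera_center(qvec, tvec):
    """COLMAP stores world-to-camera (R, t); the camera position in world
    space is therefore -R^T t."""
    R = qvec_to_rotmat(*qvec)
    return [-(R[0][i] * tvec[0] + R[1][i] * tvec[1] + R[2][i] * tvec[2]) for i in range(3)]

# identity rotation: the camera center is simply -t
center = camera_center((1.0, 0.0, 0.0, 0.0), (1.0, 2.0, 3.0))  # [-1.0, -2.0, -3.0]
```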
> [!IMPORTANT]
> gsplat normalizes scenes during training. The patch we applied during installation dumps the normalization transformation to `transform.txt` in the output directory. Copy this transformation into your config to properly undo the normalization (as explained in the blog series).
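Undoing the normalization is just applying the inverse of that transform. The sketch below assumes the dumped transformation is a 4x4 matrix mapping world space to normalized space (helper names are illustrative, not splaty's API):

```python
import numpy as np

def undo_normalization(points, T):
    """Map Nx3 points from the normalized scene space back to the original
    world space by applying the inverse of the 4x4 normalization transform T.
    (Assumes T maps world coordinates into normalized coordinates.)"""
    T_inv = np.linalg.inv(T)
    homo = np.hstack([points, np.ones((len(points), 1))])  # to homogeneous coords
    return (homo @ T_inv.T)[:, :3]

# example: a normalization that scales by 2 and shifts x by 1
T = np.array([[2.0, 0.0, 0.0, 1.0],
              [0.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
pts = np.array([[1.0, 0.0, 0.0]])
world = undo_normalization(pts, T)  # maps back to the origin
```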
This project is for anyone who wants to understand how 3D Gaussian Splatting works under the hood. If you've read about it or used existing implementations and want to see the details, this might help.
You should be comfortable with:
- Python and PyTorch (basic tensor operations)
- Linear algebra fundamentals (matrices, vectors, transformations)
- Basic computer graphics concepts (cameras, projections)
You don't need:
- CUDA or GPU programming experience
- Deep graphics expertise
- Advanced mathematics background
Found a bug? Have a question? Want to improve the code or documentation?
- Issues: Open an issue on GitHub for bugs, questions, or suggestions
- Pull Requests: Contributions are welcome! Please open an issue first to discuss major changes
- Discussions: Share your results, ask questions, or discuss improvements in the GitHub Discussions
If you want to dive deeper into 3D Gaussian Splatting:
- Original Paper: 3D Gaussian Splatting for Real-Time Radiance Field Rendering
- gsplat Library: nerfstudio-project/gsplat
- MipNeRF-360 Dataset: Google Research dataset page
- The gsplat team for their CUDA implementation and scene representation
- Google Research for the MipNeRF-360 dataset
- The original 3D Gaussian Splatting authors
If you found this useful:
- ⭐ Star this repository to help others discover it
- 👏 Clap for the blog posts on Medium
- 💬 Leave comments with your questions or feedback
Built by Sascha Kirch