Implementation of classic N-Gram Language Models with Laplace smoothing, interpolation backoff, and perplexity evaluation — built from scratch in Python.
Updated Oct 15, 2025 - Jupyter Notebook
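A minimal sketch of the techniques the project describes: a bigram model with Laplace (add-one) smoothing, linear interpolation with the unigram distribution as a simple backoff, and perplexity evaluation. Class and method names here are illustrative, not the repository's actual API.

```python
import math
from collections import Counter

class BigramLM:
    """Bigram language model with Laplace smoothing (illustrative sketch)."""

    def __init__(self, alpha=1.0):
        self.alpha = alpha          # Laplace smoothing constant
        self.unigrams = Counter()   # counts of context words
        self.bigrams = Counter()    # counts of (context, word) pairs
        self.vocab = set()

    def train(self, sentences):
        for tokens in sentences:
            padded = ["<s>"] + tokens + ["</s>"]
            self.vocab.update(padded)
            for w1, w2 in zip(padded, padded[1:]):
                self.unigrams[w1] += 1
                self.bigrams[(w1, w2)] += 1

    def prob(self, w1, w2):
        # Laplace smoothing: add alpha to every bigram count,
        # alpha * |V| to the denominator so probabilities still sum to 1.
        V = len(self.vocab)
        return (self.bigrams[(w1, w2)] + self.alpha) / (self.unigrams[w1] + self.alpha * V)

    def interp_prob(self, w1, w2, lam=0.7):
        # Linear interpolation: mix the bigram estimate with a smoothed
        # unigram estimate, a simple form of backoff.
        total = sum(self.unigrams.values())
        V = len(self.vocab)
        p_uni = (self.unigrams[w2] + self.alpha) / (total + self.alpha * V)
        return lam * self.prob(w1, w2) + (1 - lam) * p_uni

    def perplexity(self, sentences):
        # Perplexity = exp(-average log-probability per predicted token).
        log_prob, n_tokens = 0.0, 0
        for tokens in sentences:
            padded = ["<s>"] + tokens + ["</s>"]
            for w1, w2 in zip(padded, padded[1:]):
                log_prob += math.log(self.prob(w1, w2))
                n_tokens += 1
        return math.exp(-log_prob / n_tokens)

train = [["the", "cat", "sat"], ["the", "dog", "sat"]]
lm = BigramLM()
lm.train(train)
print(lm.perplexity(train))
```

Laplace smoothing guarantees no zero probabilities (so perplexity stays finite on unseen bigrams), while interpolation lets the unigram distribution share probability mass with sparse bigram contexts.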
This thesis presents an empirical comparison between PPM★ (Prediction by Partial Matching), a state-of-the-art statistical language modeling algorithm, and IDyOT (Information Dynamics of Thinking), a cognitive architecture designed to model human-like hierarchical learning and prediction.
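For context, the PPM family predicts the next symbol from the longest matching context, "escaping" to progressively shorter contexts when the symbol is unseen. Below is a simplified sketch of PPM with method-C escape probabilities and no exclusions; it is not the thesis's PPM★ implementation, and the function names are hypothetical.

```python
from collections import Counter

def build_counts(seq, max_order=2):
    """Count which symbols follow each context of length 0..max_order."""
    counts = {k: {} for k in range(max_order + 1)}
    for i, sym in enumerate(seq):
        for k in range(max_order + 1):
            if i < k:
                continue
            ctx = tuple(seq[i - k:i])
            counts[k].setdefault(ctx, Counter())[sym] += 1
    return counts

def ppm_prob(counts, history, symbol, max_order=2, n_symbols=256):
    """PPM method C without exclusions: the escape weight at each order
    is (#distinct symbols) / (total + #distinct); unmatched mass backs
    off to shorter contexts and bottoms out in a uniform distribution."""
    p, weight = 0.0, 1.0
    for k in range(min(max_order, len(history)), -1, -1):
        ctx = tuple(history[len(history) - k:])
        c = counts[k].get(ctx)
        if not c:
            continue
        total, distinct = sum(c.values()), len(c)
        p += weight * c[symbol] / (total + distinct)
        weight *= distinct / (total + distinct)   # escape probability
    return p + weight / n_symbols                 # order -1: uniform

seq = list("abracadabra")
counts = build_counts(seq)
print(ppm_prob(counts, list("ab"), "r"))
```

PPM★ extends this scheme with unbounded (deterministic) contexts rather than a fixed maximum order, which is part of what makes it a strong statistical baseline in such comparisons.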