🧠 Incremental learning experiment using Learning without Forgetting (LwF) in PyTorch: training a model on new classes (like horse) without forgetting old ones (cow, donkey, sheep).

pranta-barua007/learning-without-forgetting-pytorch


🧠 Incremental Learning with Learning without Forgetting (LwF)

An educational PyTorch experiment demonstrating how a neural network can learn new classes without forgetting previously learned ones.


🌟 Overview

This project explores incremental learning using the Learning without Forgetting (LwF) approach.
Instead of retraining from scratch when new classes arrive, we retain knowledge from the old model while adapting to new data, thereby reducing catastrophic forgetting.

The experiment is implemented in PyTorch and applied to a subset of the Animal Image Dataset (Kaggle).
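
Concretely, "retaining knowledge from the old model" means keeping a frozen copy of it as a teacher during incremental training. A minimal sketch of that step (the `freeze_teacher` helper is illustrative, not taken from the repository):

```python
import copy
import torch
import torch.nn as nn

def freeze_teacher(model: nn.Module) -> nn.Module:
    """Snapshot the old model before incremental training.

    Returns a frozen, eval-mode copy whose outputs serve as the
    distillation targets ("old knowledge") during Stage 2.
    """
    teacher = copy.deepcopy(model)
    teacher.eval()               # disable dropout / batch-norm updates
    for p in teacher.parameters():
        p.requires_grad_(False)  # never touched by the optimizer
    return teacher
```

Because the teacher is a deep copy, subsequent updates to the trainable model leave its reference outputs unchanged.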


🧩 What’s Inside

  • Stage 1 – Initial Training:
    Train a ResNet-18 model on 3 animal classes: cow, donkey, and sheep.

  • Stage 2 – Incremental Learning:
    Introduce a new class (horse) and fine-tune the model using LwF to preserve old knowledge.

  • Knowledge Distillation:
    Combine standard classification loss with a distillation loss that aligns new model outputs with those of the frozen old model.
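
The combined objective described above can be sketched as follows. This is a minimal illustration, not the repository's exact code; the temperature `T` and weight `alpha` are assumed hyperparameters:

```python
import torch
import torch.nn.functional as F

def lwf_loss(new_logits, old_logits, targets, num_old_classes, T=2.0, alpha=1.0):
    """Classification loss on all classes plus distillation loss on the old ones.

    new_logits: outputs of the updated model, shape (batch, old + new classes)
    old_logits: outputs of the frozen old model, shape (batch, old classes)
    """
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(new_logits, targets)

    # Knowledge distillation: align the new model's predictions for the
    # *old* classes with the frozen model's temperature-softened outputs.
    log_p_new = F.log_softmax(new_logits[:, :num_old_classes] / T, dim=1)
    p_old = F.softmax(old_logits / T, dim=1)
    kd = F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)

    return ce + alpha * kd
```

When the new model's old-class logits match the teacher's exactly, the distillation term vanishes and only the classification loss remains.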

βš™οΈ Implementation Details

| Component | Description |
| --- | --- |
| Backbone | Pretrained ResNet-18 (ImageNet weights) |
| Optimizer | Adam |
| Initial Learning Rate | 0.001 |
| Incremental Learning Rate | 0.0001 |
| Initial Epochs | 20 |
| Incremental Epochs | 10 |

📈 Results Summary

  • The model successfully learns the new class (horse) without completely forgetting the original classes.
  • Knowledge distillation stabilizes logits for old classes, achieving balanced performance across all four categories.
| Stage | Classes | Accuracy (approx.) | Observation |
| --- | --- | --- | --- |
| Initial | cow, donkey, sheep | ~95% | Good base performance |
| Incremental | +horse | ~90–92% | Slight drop, minimal forgetting |

🚀 Future Work

  • Extend to multi-step incremental learning (add more classes sequentially)
  • Add visualization for logits drift and forgetting metrics

👀 Author

Pranta Barua. Educational experiment on continual learning (2025).


⭐ If you found this useful or educational, consider starring the repository!
