alazkiyai09/federated-learning-core

🌐 Federated Learning Core

FedAvg • Non-IID Partitioning • Compression • Personalization • DP-SGD


Core federated learning repository covering algorithm baselines, partitioning strategies, communication efficiency, personalization, and privacy-aware training.


🎯 Overview

federated-learning-core provides reusable FL building blocks:

  • FedAvg/FedProx/FedAdam orchestration layers
  • IID and non-IID client data partitioning
  • Compression strategies for communication cost reduction
  • Personalization modules for client-specific adaptation
  • DP-SGD privacy mechanisms and accounting
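The core of the FedAvg orchestration layer is size-weighted averaging of client parameters. A minimal NumPy sketch of that aggregation step (illustrative only; the `fedavg` function name and signature are not this repository's actual API):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: average client parameters, weighted by
    each client's local dataset size.

    client_weights: list of dicts mapping layer name -> np.ndarray
    client_sizes:   list of local dataset sizes (the weights)
    """
    total = sum(client_sizes)
    keys = client_weights[0].keys()
    return {
        k: sum(w[k] * (n / total) for w, n in zip(client_weights, client_sizes))
        for k in keys
    }

# Two clients: sizes 1 and 3, so the second client's update dominates.
agg = fedavg([{"w": np.array([0.0])}, {"w": np.array([4.0])}], [1, 3])
```

Weighting by dataset size is what distinguishes FedAvg from a plain mean of client models; FedProx and FedAdam keep this aggregation shape but change the client objective and the server-side update rule, respectively.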

📌 About

  • Engineered for experimentation and modular FL system design
  • Supports research workflows, with migration paths toward production
  • Includes ready-to-run scenario and experiment scripts

🏷️ Topics

federated-learning fedavg non-iid flower differential-privacy distributed-ml pytorch privacy

🧩 Architecture

  • src/algorithms/: aggregation and strategy modules
  • src/data/partitioning/: IID and skew partitioners
  • src/training/flower/: orchestrated FL training
  • src/optimization/: compression and personalization
  • src/privacy/dp_sgd/: private training components
  • src/scenarios/: cross-silo and vertical setups
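A common way to produce the label-skew ("non-IID") partitions mentioned above is a per-class Dirichlet split: each class's samples are divided across clients according to a Dirichlet draw, with smaller `alpha` giving heavier skew. A sketch of such a partitioner (the `dirichlet_partition` helper is illustrative and assumed, not the actual interface of `src/data/partitioning/`):

```python
import numpy as np

def dirichlet_partition(labels, n_clients, alpha=0.5, seed=0):
    """Label-skew partition: for each class, split its sample indices
    across clients in proportions drawn from Dirichlet(alpha).
    Smaller alpha -> more skewed (non-IID) client datasets."""
    rng = np.random.default_rng(seed)
    client_idx = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        props = rng.dirichlet([alpha] * n_clients)
        # Convert proportions to cut points and split this class's indices.
        cuts = (np.cumsum(props) * len(idx)).astype(int)[:-1]
        for cid, part in enumerate(np.split(idx, cuts)):
            client_idx[cid].extend(part.tolist())
    return client_idx

labels = np.array([0] * 50 + [1] * 50)
parts = dirichlet_partition(labels, n_clients=4, alpha=0.3)
```

Every sample lands on exactly one client, so the union of the partitions always recovers the full dataset regardless of `alpha`.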

⚡ Quick Start

pip install -r requirements.txt
pytest -q tests/test_public_surfaces.py

🧪 Experiments

  • src/experiments/run_fedavg.py
  • src/experiments/run_flower.py
  • src/experiments/run_compression.py
  • src/experiments/run_cross_silo.py
  • src/experiments/run_vertical.py
  • src/experiments/run_personalization.py
  • src/experiments/run_dp.py
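The compression experiment typically trades accuracy for communication cost via update sparsification. One standard technique is top-k: send only the k largest-magnitude entries of an update plus their indices. Whether this repository uses exactly this scheme is an assumption; the sketch below just illustrates the idea:

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of an update; transmit
    (indices, values, shape) instead of the dense array."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def topk_reconstruct(idx, vals, shape):
    """Server side: rebuild a dense update with zeros everywhere
    except the transmitted entries."""
    out = np.zeros(int(np.prod(shape)))
    out[idx] = vals
    return out.reshape(shape)

g = np.array([0.1, -5.0, 0.2, 3.0])
idx, vals, shape = topk_sparsify(g, k=2)
rebuilt = topk_reconstruct(idx, vals, shape)
```

For k much smaller than the model size this cuts upstream traffic roughly by the sparsity ratio, at the cost of dropping small-magnitude coordinates (often mitigated with error feedback).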

🛠️ Tech Stack

FL/ML: PyTorch, Flower, NumPy, scikit-learn
Optimization: compression + personalization techniques
Privacy: DP-SGD and accounting utilities
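DP-SGD's core loop is: clip each per-example gradient to a fixed L2 norm, sum, and add Gaussian noise calibrated to that clipping norm before averaging. A minimal NumPy sketch of one such step (the `dp_sgd_step` helper is assumed for illustration and is not this repository's actual API; real implementations also track the privacy budget via an accountant):

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD aggregation step:
    1. clip each per-example gradient to L2 norm <= clip_norm,
    2. sum the clipped gradients,
    3. add Gaussian noise with std = noise_multiplier * clip_norm,
    4. average over the batch."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=clipped[0].shape)
    return (np.sum(clipped, axis=0) + noise) / len(per_example_grads)

# With noise_multiplier=0 the step reduces to plain clipped averaging:
rng = np.random.default_rng(0)
out = dp_sgd_step([np.array([3.0, 4.0])], clip_norm=1.0,
                  noise_multiplier=0.0, rng=rng)
```

Clipping bounds each example's influence on the update, which is what lets the accounting utilities translate `noise_multiplier` and the number of steps into an (ε, δ) guarantee.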

About

Core federated learning framework for distributed model training with privacy-preserving collaboration.
