vcl-iisc/PepFedPT




PEPFedPT

Official implementation of PEPFedPT, a federated learning framework that combines class centroid/prototype alignment with a Vision Transformer (ViT) backbone and CCMP integration.

This repository supports multiple datasets (CIFAR-100, DomainNet, Tiny-ImageNet, iNaturalist, etc.) and implements federated averaging with per-class prototype updates for personalized federated learning.


📂 Project Structure

PEPFedPT/
│
├── centroids.py
├── centroidutils.py
├── centroidinit.py
├── constants.py
├── datasets.py
├── get_class_embeddings.py
├── train.sh
│
├── models/
│   └── vit_models_pr_centroid.py
│
├── dataloaders/
├── cifar_embeddings/
├── domain_embeddings/
├── imagenet_embeddings/
├── inat_embeddings/
├── tinyimagenet_models/
├── ...

🔑 Core Files

centroids.py

Main training orchestration script.

  • Handles argument parsing
  • Initializes dataset and model
  • Controls communication rounds
  • Manages client sampling
  • Calls local training routines
  • Performs aggregation
  • Updates global centroids/prototypes

All flag descriptions are defined directly inside this file.
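The round structure described above can be sketched as follows. This is a toy illustration only: `run_federated_training`, `local_train`, `aggregate`, and `update_prototypes` are hypothetical names standing in for the actual routines in `centroids.py` and `centroidutils.py`.

```python
import random

def run_federated_training(global_model, clients, rounds, frac,
                           local_train, aggregate, update_prototypes):
    """Toy orchestration loop mirroring the structure described above."""
    for _ in range(rounds):
        # sample a fraction of clients for this communication round
        k = max(1, int(frac * len(clients)))
        selected = random.sample(clients, k)
        # run local training on each selected client
        local_models = [local_train(global_model, c) for c in selected]
        # aggregate local models into a new global model
        global_model = aggregate(local_models)
        # refresh the global class centroids/prototypes
        update_prototypes(global_model, selected)
    return global_model
```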


centroidutils.py

Contains core federated learning logic:

  • Local training routine
  • Federated averaging (FedAvg)
  • Class prototype / centroid update logic
  • Prototype aggregation across clients

This file implements the main optimization and aggregation mechanisms.
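Conceptually, the two aggregation steps look like the sketch below. The flat-list parameter layout and the function names are assumptions for illustration, not the repository's actual API.

```python
def fedavg(client_weights, client_sizes):
    """Dataset-size-weighted average of client model parameters (FedAvg)."""
    total = sum(client_sizes)
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(len(client_weights[0]))
    ]

def aggregate_prototypes(client_protos, client_counts):
    """Per-class weighted mean of client centroids.

    client_protos: list of {class_id: centroid vector} per client;
    client_counts: list of {class_id: sample count} per client.
    """
    merged = {}
    for protos, counts in zip(client_protos, client_counts):
        for c, vec in protos.items():
            n = counts[c]
            acc, tot = merged.get(c, ([0.0] * len(vec), 0))
            merged[c] = ([a + n * v for a, v in zip(acc, vec)], tot + n)
    return {c: [a / tot for a in acc] for c, (acc, tot) in merged.items()}
```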


models/vit_models_pr_centroid.py

Vision Transformer (ViT) backbone implementation with:

  • CCMP integration
  • Prototype-based classification support
  • Feature extraction for centroid computation

This is the primary model used during training.
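The idea behind prototype-based classification can be illustrated as below: a sample is assigned to the class whose centroid lies nearest to its extracted feature vector. The function name and distance choice (squared Euclidean) are assumptions; see the model file for the actual mechanism.

```python
def nearest_prototype(feature, prototypes):
    """Return the class_id whose centroid is closest to the feature vector.

    prototypes: {class_id: centroid vector of the same dimension as feature}
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda c: sq_dist(feature, prototypes[c]))
```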


train.sh

Provides example commands for running experiments on:

  • CIFAR-100
  • DomainNet
  • Tiny-ImageNet
  • iNaturalist

Use this file as a reference for running different datasets and partition strategies.


🧠 Supported Datasets

  • CIFAR-100
  • DomainNet
  • Tiny-ImageNet
  • iNaturalist

Dataset handling is implemented in:

dataloaders/

Make sure datasets are downloaded and paths are correctly set.


🏃 How to Run

For dataset-specific examples, see:

bash train.sh

⚙️ Important Arguments

All arguments are defined in centroids.py. Key flags include:

Flag           Description
--dataset      Dataset name
--partition    Data partitioning strategy
--batch-size   Local batch size
--lr           Learning rate
--epochs       Local epochs per round
--parties      Total number of clients
--comm_round   Number of communication rounds
--init_seed    Random seed
--reg          Regularization strength
Refer to centroids.py for full details and defaults.
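For reference, the flags above could be declared with argparse roughly as follows. The defaults shown here are placeholders, not the repository's actual values; consult `centroids.py` for the authoritative definitions.

```python
import argparse

def build_parser():
    """Sketch of the flag definitions listed above (defaults are assumptions)."""
    p = argparse.ArgumentParser(description="PEPFedPT training")
    p.add_argument("--dataset", type=str, default="cifar100")     # dataset name
    p.add_argument("--partition", type=str, default="noniid")     # partition strategy
    p.add_argument("--batch-size", type=int, default=64)          # local batch size
    p.add_argument("--lr", type=float, default=0.01)              # learning rate
    p.add_argument("--epochs", type=int, default=1)               # local epochs per round
    p.add_argument("--parties", type=int, default=10)             # total clients
    p.add_argument("--comm_round", type=int, default=50)          # communication rounds
    p.add_argument("--init_seed", type=int, default=0)            # random seed
    p.add_argument("--reg", type=float, default=1e-5)             # regularization strength
    return p
```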


📌 Notes

  • Ensure GPU is available for ViT training.
  • Prototype memory grows with number of classes.
  • Non-IID partitions are supported.
  • Embeddings are optionally saved under dataset-specific folders.
