dcomp-leris/FccIQ_Dataset_MTL

Multi-Task Learning Architectures for Joint Interference Detection and KPI Prediction in 5G Networks

Overview

This project investigates multi-task learning (MTL) architectures for real-time interference detection and Key Performance Indicator (KPI) prediction in 5G Radio Access Networks (RANs). The objective is to jointly solve two heterogeneous tasks:

Interference detection (binary classification)

KPI prediction (continuous regression)

while mitigating negative transfer between tasks and minimizing computational overhead, which is critical for real-time deployment in wireless networks.

We conduct a systematic comparison between Single-Task Learning (STL) and several state-of-the-art MTL architectures, analyzing trade-offs among prediction accuracy, regression error, model complexity, and inference latency.

Implemented Architectures

The following models are implemented and evaluated:

  • STL (Single-Task Learning): separate GCN models, one per task

  • Hard Parameter Sharing: shared GCN backbone with task-specific output heads

  • MMoE (Multi-gate Mixture-of-Experts): shared expert layers with task-specific gating networks

  • Cross-Stitch Networks: learnable feature-sharing layers between tasks

  • PLE (Progressive Layered Extraction): multi-level shared and task-specific experts with gating

  • Attention-based MTL: GCN backbone with task-specific attention mechanisms
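To illustrate one of the sharing mechanisms above, here is a minimal MMoE sketch: a pool of shared experts, with one softmax gate per task mixing expert outputs into a task-specific representation. Dimensions and layer choices are illustrative, not the values used in this repository.

```python
import torch
import torch.nn as nn

class MMoE(nn.Module):
    """Minimal MMoE sketch: shared experts, one softmax gate per task."""

    def __init__(self, in_dim=16, expert_dim=32, n_experts=4, n_tasks=2):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim, expert_dim), nn.ReLU())
             for _ in range(n_experts)]
        )
        # One gating network per task, producing mixture weights over experts.
        self.gates = nn.ModuleList(
            [nn.Linear(in_dim, n_experts) for _ in range(n_tasks)]
        )

    def forward(self, x):
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, D)
        task_reps = []
        for gate in self.gates:
            w = torch.softmax(gate(x), dim=-1).unsqueeze(-1)           # (B, E, 1)
            task_reps.append((w * expert_out).sum(dim=1))              # (B, D)
        return task_reps  # one fused representation per task

reps = MMoE()(torch.randn(8, 16))
```

Each task's head then consumes its own gated representation, which is what lets MMoE modulate sharing per task and limit negative transfer.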

Model Outputs:

regression_out → [RSRP, RSRQ, SINR]

cls_out → interference label

All architectures are fully compatible with the same preprocessing pipeline and training loop.
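The shared output contract can be sketched as a hard-parameter-sharing model with two heads. A plain MLP stands in for the GCN backbone here so the example stays self-contained; hidden sizes are illustrative.

```python
import torch
import torch.nn as nn

class HardSharedMTL(nn.Module):
    """Sketch of the shared-backbone / two-head output contract."""

    def __init__(self, in_dim=16, hidden=64):
        super().__init__()
        # MLP stand-in for the GCN backbone used in the repo.
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.reg_head = nn.Linear(hidden, 3)  # [RSRP, RSRQ, SINR]
        self.cls_head = nn.Linear(hidden, 1)  # interference logit

    def forward(self, x):
        h = self.backbone(x)
        return {"regression_out": self.reg_head(h),
                "cls_out": self.cls_head(h)}

out = HardSharedMTL()(torch.randn(4, 16))
```

Because every architecture emits the same `regression_out`/`cls_out` dictionary, one training loop and one evaluation pipeline serve all six models.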

Training & Evaluation

  • Training loop implementation
  • Validation and testing pipeline
  • Multi-task loss computation
    • Classification loss
    • Per-task regression losses
  • Model statistics computation
    • Number of parameters
    • FLOPs
    • Model size
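The multi-task loss above can be sketched as a weighted sum of a classification loss and per-KPI regression losses. The loss functions (`BCEWithLogitsLoss`, `MSELoss`) and the weights are assumptions for illustration; the repository may combine the terms differently.

```python
import torch
import torch.nn as nn

def multitask_loss(out, y_reg, y_cls, w_cls=1.0, w_reg=(1.0, 1.0, 1.0)):
    """Weighted sum of classification loss and per-KPI regression losses."""
    bce = nn.BCEWithLogitsLoss()
    mse = nn.MSELoss()
    total = w_cls * bce(out["cls_out"].squeeze(-1), y_cls)
    per_task = {}
    for i, name in enumerate(["RSRP", "RSRQ", "SINR"]):
        li = mse(out["regression_out"][:, i], y_reg[:, i])
        per_task[name] = li          # logged separately per KPI
        total = total + w_reg[i] * li
    return total, per_task

# Dummy batch: zero logits vs. zero targets, perfect regression.
dummy = {"regression_out": torch.zeros(4, 3), "cls_out": torch.zeros(4, 1)}
total, parts = multitask_loss(dummy, torch.zeros(4, 3), torch.zeros(4))
```

Tracking the per-task losses separately, as here, is what makes negative transfer visible: a falling total loss can still hide one task's loss rising.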

We also define the network architecture and training hyperparameters for each model; these are documented in the screenshots included in the repository (one per architecture).

Outputs

`regression_out` and `cls_out` predictions (saved and visualized)

Comparison of models by accuracy, loss, training time, inference time, and model size

Requirements

  • numpy
  • pandas
  • torch
  • scikit-learn
  • matplotlib
  • seaborn
  • thop

Project Structure

| File / Folder | Description |
| --- | --- |
| `models.py` | Model architectures |
| `main_training_loop_(train_all_models).py` | Training, validation, and evaluation functions; main script to train all models and compare performance |
| `Load and Preprocess Data.ipynb` | Data loading, cleaning, scaling, and preparation; metrics calculation, plotting, and helper functions |
| `dataset/` | CSV data files |
| `README.md` | Project documentation and instructions |

Results

The results comparison table is provided as a screenshot in the repository.

Contact

For questions or collaboration:

Email: [email protected]

Alternative: [email protected]
