PRiDeL focuses on the co-design of distributed learning algorithms and communication strategies that incorporate privacy guarantees for real-time and streaming data. Sensors and connected devices increasingly process data locally, making distributed learning techniques such as federated learning (FL) both possible and practical. In FL, raw data remains on-device and only model updates are exchanged. This can be viewed as a form of compression, reducing communication overhead, energy consumption, and latency, and it can also improve data privacy, since raw data is never communicated directly. However, the model updates exchanged between devices can still leak sensitive information. PRiDeL addresses this challenge by developing learning algorithms with built-in privacy guarantees.
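To make the exchange concrete, the following is a minimal illustrative sketch of federated averaging in which clients share only noisy model updates, never raw data. It is not PRiDeL's algorithm: the linear-regression setup, the number of clients, and the Gaussian noise added to each update (a crude stand-in for a differential-privacy mechanism) are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K clients each hold private data for a shared
# linear model w; raw data (X, y) never leaves a client.
K, d, n = 5, 3, 50
w_true = rng.normal(size=d)
clients = []
for _ in range(K):
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    clients.append((X, y))

def local_update(w, X, y, lr=0.05, noise_std=0.01):
    """One local gradient step on least-squares loss; Gaussian noise on
    the returned update loosely mimics a privacy mechanism."""
    grad = X.T @ (X @ w - y) / len(y)
    update = -lr * grad
    return update + noise_std * rng.normal(size=update.shape)

# Server loop: aggregate the clients' noisy updates by averaging.
w = np.zeros(d)
for _ in range(200):
    updates = [local_update(w, X, y) for X, y in clients]
    w += np.mean(updates, axis=0)

print("distance to true model:", float(np.linalg.norm(w - w_true)))
```

Even in this toy example, the noise illustrates the central tension PRiDeL studies: stronger privacy noise degrades the accuracy of the aggregated model, so privacy, communication, and learning performance must be designed jointly.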
The associate team will be jointly led by Principal Investigators Hsuan-Yin Lin (Department of Information Theory, Simula UiB) and Malcolm Egan (MARACAS team, Inria Lyon), with Yu-Chih (Jerry) Huang at NYCU serving as Co-PI.





