
The Calgary Machine Learning Lab is a research group led by Yani Ioannou within the Schulich School of Engineering at the University of Calgary. The lab's research focuses on improving Deep Neural Network (DNN) training and models. Topics of research include: Sparse Neural Network Training, Bias and Robustness of Efficient Deep Learning methods, and Efficient Inference with Large Language Models.

A photo of many of the CML members in the Schulich School of Engineering, December 2025. CML had 6 works presented by 5 students at the International Conference on Machine Learning (ICML) 2025, one of the top machine learning conferences.

news

Jan 26, 2026 Mike Lasby’s collaborative work with Cerebras, “REAP the Experts: Why Pruning Prevails for One-Shot MoE Compression” (Lasby et al., 2026), has been accepted at the International Conference on Learning Representations (ICLR) 2026. This work explores the compression of Sparse Mixture of Experts (SMoE) models through expert compression techniques, demonstrating that REAP (Router-weighted Expert Activation Pruning) outperforms existing expert merging and pruning methods in preserving model quality after compression.
Dec 01, 2025 Tejas Pote was awarded the Alberta Innovates Graduate Student Scholarship in the 2025 Graduate Award Competition. The Government of Alberta funds Alberta Innovates Graduate Student Scholarships to support top-quality research related to Information and Communications Technology (ICT), Nanotechnology, and other technology areas.
Nov 19, 2025 Mike Lasby was named one of only 10 RBC Borealis 2025 AI Fellows.

latest blog posts

selected publications

  1. REAP the Experts: Why Pruning Prevails for One-Shot MoE Compression
    Mike Lasby, Ivan Lazarevich, Nish Sinnadurai, Sean Lie, Yani Ioannou, and Vithursan Thangarasa
    In International Conference on Learning Representations (ICLR), Apr 2026
  2. Sparse Training from Random Initialization: Aligning Lottery Ticket Masks using Weight Symmetry
    Mohammed Adnan, Rohan Jain, Ekansh Sharma, Rahul Krishnan, and Yani Ioannou
    In Proceedings of the 42nd International Conference on Machine Learning (ICML), Jul 2025
  3. What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias
    Aida Mohammadshahi and Yani Ioannou
    Transactions on Machine Learning Research (TMLR), Mar 2025
  4. Navigating Extremes: Dynamic Sparsity in Large Output Spaces
    Nasib Ullah, Erik Schultheis, Mike Lasby, Yani Ioannou, and Rohit Babbar
    In 38th Annual Conference on Neural Information Processing Systems (NeurIPS), Dec 2024
  5. Dynamic Sparse Training with Structured Sparsity
    Mike Lasby, Anna Golubeva, Utku Evci, Mihai Nica, and Yani Ioannou
    In International Conference on Learning Representations (ICLR), May 2024