My name is Justin Deschenaux, and I am a PhD student advised by Professor Caglar Gulcehre. I work on diffusion language models: a blazingly fast, controllable, and principled way to generate text.

Latest News
- Jan. 2026
Three papers were accepted to ICLR 2026: Partition Generative Modeling (PGM), Loopholing Discrete Diffusion, and The Diffusion Duality, Chapter II: Ψ-Samplers and Efficient Curriculum. PGM was awarded an oral presentation!
- Nov. 2025
We started a reading group on discrete diffusion with Subham and Zhihan.
- May 2025
The Diffusion Duality was accepted at ICML 2025.
- Jan. 2025
Self-Distillation Through Time was accepted at ICLR 2025.

Selected Work
Scaling Beyond Masked Diffusion Language Models
A scaling-law study of discrete diffusion methods, suggesting that uniform-state diffusion may be the more promising direction at scale despite its worse perplexity than masked diffusion.
The Diffusion Duality, Chapter II: Ψ-Samplers and Efficient Curriculum
Ψ-samplers let uniform-state diffusion keep improving with more steps while reducing training cost.