I'm an ML researcher who values simple yet effective solutions to challenging problems in key areas of innovation. At Orbital, I am currently developing 3D atomistic foundation models for accelerated materials discovery, driving progress in critical areas like carbon capture.
I did my PhD in Machine Learning at the University of Edinburgh, where I was advised by Michael Gutmann. My research primarily focused on unsupervised machine learning in the presence of missing data, but it also has broader implications for probabilistic model estimation, probabilistic inference, and tabular ML.
I also hold an MSc in Artificial Intelligence from the University of Edinburgh and a BEng in Software Engineering from the University of Southampton.
We introduce Orb-v3, a family of universal interatomic potentials that offer near-state-of-the-art performance across various evaluations with a >10x reduction in latency and a >8x reduction in memory.
We introduce a conditional flow-matching method for missing data imputation, which matches or outperforms a wide range of existing methods on tabular and time-series data.
We show that missing data increases the complexity of the posterior distribution of the latent variables in VAEs. To mitigate the increased posterior complexity, we introduce two strategies based on (i) finite and (ii) imputation-based variational mixtures.
We link a structured latent space in VAEs, a commonly desired property, to poor conditional sampling performance of Metropolis-within-Gibbs (MWG). To mitigate the issues of MWG, we introduce two original methods for conditional sampling of VAEs: AC-MWG and LAIR.
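To give a feel for the kind of conditional sampling MWG performs, here is a minimal toy sketch: random-walk Metropolis updates on one coordinate of a correlated bivariate Gaussian while the other coordinate is held fixed at its observed value. This is only an illustration of the generic MWG scheme, not the paper's VAE setting or its proposed AC-MWG/LAIR methods; all names and settings are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8  # correlation of the toy joint p(x1, x2)

def log_joint(x1, x2, rho=rho):
    # Unnormalised log-density of a standard bivariate Gaussian with correlation rho.
    return -(x1**2 - 2 * rho * x1 * x2 + x2**2) / (2 * (1 - rho**2))

def mwg_conditional(x1_obs, n_steps=20000, step=1.0):
    # Metropolis-within-Gibbs: random-walk Metropolis updates on the
    # unobserved coordinate x2, keeping the observed x1 fixed.
    x2 = 0.0
    samples = []
    for _ in range(n_steps):
        prop = x2 + step * rng.normal()
        if np.log(rng.uniform()) < log_joint(x1_obs, prop) - log_joint(x1_obs, x2):
            x2 = prop
        samples.append(x2)
    return np.array(samples[n_steps // 2:])  # discard burn-in

samples = mwg_conditional(x1_obs=1.5)
# Analytic conditional for comparison: x2 | x1 ~ N(rho * x1, 1 - rho^2),
# so the sample mean should be close to 0.8 * 1.5 = 1.2.
print(samples.mean())
```

In a VAE the target conditional is far less benign than this Gaussian, which is precisely where the structured-latent-space pathologies analysed in the paper arise.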
We propose a new method for statistical model estimation from incomplete data, called variational Gibbs inference (VGI). Whilst being general-purpose, the proposed method outperforms existing VAE- and normalising-flow-specific methods.