I am a PhD student in the Department of Econometrics and Business Statistics at Monash University. Despite the department’s name, my research is largely unrelated to “business and economics.” I focus instead on fundamental aspects of statistical methodology. Before transferring to Monash in late 2024, I was studying at the University of Melbourne in the School of Mathematics and Statistics.
During my PhD, I’ve had the privilege of working with several amazing academics. My primary supervisor is Susan Wei, and I am also supervised/mentored by Liam Hodgkinson, Weichang Yu, and David Frazier.
My interests are broad but center around Bayesian statistics. I work on algorithms for fitting Bayesian models, on understanding neural networks through a Bayesian lens, and on how misspecified Bayesian models affect downstream tasks.
I am a statistician by training, having completed my undergraduate and MPhil studies at the University of Western Australia under the supervision of Berwin Turlach and Kevin Murray. Before starting my PhD, I worked as a statistician and data scientist at the Department of Primary Industries and Regional Development in Western Australia—a role I still maintain part-time.
Singular learning theory (SLT) combines Bayesian statistics and algebraic geometry to study the properties of singular models, such as neural networks and mixture models. For an introduction, see this article and this resource. Since I lack the mathematical background to contribute to SLT’s theoretical development, my focus is on integrating its tools into modern deep learning workflows. This is especially challenging because modern networks operate in the interpolating regime (\(p > n\)), whereas most key results in SLT rely on large-\(n\) asymptotics (\(n \gg p\)).
Model and prior misspecification can significantly impact the downstream performance of Bayesian models. My research explores tools like loss-based inference (here’s a great blog), empirical likelihood (Owen, 1988), and strange things such as the cold posterior effect. Recently, I have also taken an interest in model- and prior-free approaches for constructing predictive densities (e.g., TabPFN) and posteriors (e.g., the martingale posterior).
I am continually searching for tools to reliably compute or approximate posteriors, especially for distributions that are multimodal, high-dimensional, or have awkward supports (e.g., Bayesian empirical likelihood). I have experience with Langevin-based Markov chain Monte Carlo methods, variational Bayes approaches, and expectation-propagation techniques.
PhD in Statistics (2024–)
Monash University
Transferred from the University of Melbourne (2022–2024), where I worked with Susan
MPhil in Statistics (2017–2019)
The University of Western Australia
Dean’s List for top 5% thesis
BSc (Honours) in Engineering & Mathematics (2013–2017)
The University of Western Australia
Feel free to reach out! My email address starts with kenyon.ng — just add dpird.wa.gov.au for DPIRD matters or monash.edu for research.
You can also find me on GitHub and LinkedIn.
Last updated: 21 Jan 2025