Welcome!
I am a third-year PhD student at the Australian National University, focusing on theoretical machine learning and deep learning theory. I am very grateful to be supervised by Stephen Gould, Thalaiyasingam Ajanthan, and Marcus Hutter. Prior to my PhD, I earned a Master’s in Machine Learning and Computer Vision from ANU, where I worked on adversarial machine learning and deep declarative networks under the supervision of Stephen Gould. I also hold a Bachelor of Science (BSc Hons) in Computer Science from Harokopio University of Athens, where I did research on emotion recognition and affective computing using deep learning under Dimitrios Michail and Iraklis Varlamis.
My research lies at the intersection of deep learning, optimisation, and geometry, with an emphasis on building a more rigorous theoretical understanding of neural networks.
Core Research Interests
The Geometry of Deep Learning: I analyse the geometric structure of the loss landscape and solution space in deep neural networks. My goal is to characterise the properties of optimal solutions in over-parameterised models using tools from differential geometry and frame theory.
Optimisation and Convergence: I derive convergence guarantees for non-convex optimisation problems in neural networks. My work explores how geometric conditions on the manifold of minimisers can be leveraged to design more efficient and reliable training algorithms.
Generalisation and Occam’s Razor: My research connects the principle of Occam’s Razor to how and why neural networks generalise. I develop novel bounds by measuring the algorithmic complexity of learned functions, using tools from algorithmic information theory and topology to analyse the structure of their decision boundaries.
I am always open to new collaborations. If you are interested in working together, please feel free to get in touch.
News
- (September 2025) I’m heading to Melbourne in February 2026 for the Machine Learning Summer School 2026 to explore ‘The Future of AI Beyond LLMs’.
- (September 2025) I got a paper accepted as Spotlight at NeurIPS 2025 on non-convex optimisation, reduction mappings, and their convergence!
- (September 2025) We got a paper accepted as Long Oral at AJCAI 2025 on backpropagation-free gradient estimation. Congratulations to Daniel!
- (July 2025) I will be presenting my work on neural collapse at the 2025 Greeks in AI Symposium in Athens, Greece.
- (March 2025) Our team has received a £40,000 grant from the UK AISI’s Bounty Programme to advance our research into assessing the dangerous capabilities of LLMs for AI safety.
- (November 2024) I will be attending the AMSI Summer School 2025 in January 2025 to learn about optimal transport, Monge-Ampère equations, and elliptic PDEs.
- (October 2024) Our team won second place at the 24-hour ANU Hackathon 2024 for building a system to detect early-stage cognitive decline using visual biomarkers.
- (September 2024) I got a paper accepted at NeurIPS 2024 on neural collapse!
- (July 2024) I will be attending ICML 2024 in Vienna, Austria.
- (May 2024) I will be attending the Annual Graduate School in Mathematical Aspects of Data Science in Darwin, Northern Territory, Australia.
- (November 2023) I will be giving a talk at MATRIX-AMSI PhD Symposium: Rough Path Theory 2023 on neural collapse and local elasticity of neural networks.
- (October 2023) I will participate in the MATRIX-AMSI PhD Symposium: Rough Path Theory 2023 at the Creswick campus of the University of Melbourne.
- (February 2023) Started my PhD at ANU!