I am a PhD student at IST Austria, working on machine learning in the group of Prof. Christoph Lampert. Before joining IST, I completed a four-year master's degree in Mathematics and Statistics at the University of Oxford. My master's thesis, supervised by Prof. Dino Sejdinovic, concerned the use of kernel dependence measures for unsupervised learning.
Full CV here.
Key to the recent success of machine learning algorithms is the availability of large data sets for training models. The scale and variability of the required data, however, often necessitate collecting it from potentially unreliable external sources. Previous work has shown that machine learning models are vulnerable to noise and adversarial perturbations in the training data, and that their performance can also suffer from model misspecification and test-time attacks (e.g. adversarial examples).
I am interested in designing and analysing algorithms with provable robustness guarantees against such problems. Extending classic machine learning theory to account for noise and perturbations in the data at train and test time is important for ensuring reliable performance of learned models at deployment time, and it is a necessary step towards establishing trust in learning systems among the general community.
I am also interested in distributed learning and optimization.