Francesco Orabona


About Me

I am looking for 1-2 post-docs and 1-2 PhD students to work on practical and theoretical aspects of online learning, stochastic optimization, and training of LLMs. The ideal candidate has an exceptional mathematical background and is proficient in coding. If you are interested, send me an email with your CV. Please also include your transcript if you are applying for a PhD position.
Due to the volume of emails and my limited time, I may not be able to reply to everyone: please do not take it personally; it probably just means you were not a good match for my lab.


I am currently an Associate Professor at KAUST in the Computer, Electrical and Mathematical Sciences and Engineering Division.
Previously, I was at Boston University, Stony Brook University, Yahoo Research NY, the Toyota Technological Institute at Chicago, the University of Milan, the IDIAP Research Institute, and the University of Genoa.
My current research interest is parameter-free machine learning. In particular, I am interested in online learning, batch/stochastic optimization, and statistical learning theory.

I manage the OPTIMAL Lab.


Twitter: @bremen79



News
  • NEW! A paper on training without depth limits via batch normalization without gradient explosion, by Alexandru Meterez, Amir Joudaki, me, Alexander Immer, Gunnar Rätsch, and Hadi Daneshmand, has been accepted at ICLR 2024.
  • NEW! A paper on tight concentrations and confidence sequences from the regret of universal portfolio, by me and Kwang-Sung Jun, has been accepted at IEEE Transactions on Information Theory.
  • A paper on tighter PAC-Bayes bounds through coin-betting by Kyoungseok Jang, Kwang-Sung Jun, Ilja Kuzborskij, and me has been accepted at COLT 2023.
  • A paper on optimal stochastic non-smooth non-convex optimization through online-to-non-convex conversion by Ashok Cutkosky, Harsh Mehta, and me has been accepted at ICML 2023.
  • A paper on generalized implicit follow-the-regularized-leader by Keyi Chen and me has been accepted at ICML 2023.
  • Ashok Cutkosky and I gave a (virtual) tutorial at ICML 2020 on Parameter-Free Online Optimization.
  • I have compiled all the lecture notes from my Introduction to Online Learning class into a monograph.

Recent Talks
  • 6/2/23, Adaptive Optimization Methods, SIAM Conference on Optimization.
  • 5/4/23, ML theory seminar, Princeton.
  • 4/26/23, Microsoft Research Asia Theory Lecture Series.

Old articles in the Italian press