Georgios Arvanitidis

Associate professor

Section for Cognitive Systems
Department of Applied Mathematics and Computer Science
Technical University of Denmark (DTU)

name@dtu.dk (replace "name" with "gear")
Building 321, room 224
[ Google Scholar ] [ GitHub ]

About me

Currently, I am an associate professor at the Section for Cognitive Systems at the Technical University of Denmark. Before that, I was a postdoctoral researcher in the Empirical Inference department at the Max Planck Institute for Intelligent Systems, working with Bernhard Schölkopf. I completed my PhD at the Section for Cognitive Systems at the Technical University of Denmark under the supervision of Søren Hauberg and Lars Kai Hansen. During my PhD studies, I also visited the Probabilistic Numerics group for six months, working with Philipp Hennig. I obtained my Master's degree in Computer Science from Saarland University, with the support of the Max Planck Institute for Informatics (IMPRS-CS). Everything started at the Department of Informatics at the Aristotle University of Thessaloniki, where I received my Bachelor's degree.


News
  • January 18, 2024. Congrats to Hadi for the spotlight at ICLR 2024! We develop a neural network model to learn dynamical systems with stability guarantees.
  • January 1, 2024. Starting the year as an associate professor. Let's see what happens...
  • November 30, 2023. The Independent Research Fund Denmark (Danmarks Frie Forskningsfond, DFF) has awarded me a Sapere Aude starting grant! This is absolutely fantastic! I will have two PhD positions open soon.
  • September 22, 2023. Federico's paper (preprint) got accepted at NeurIPS 2023! Cool!
  • July 10, 2023. New preprint from Alison! We analyze the loss landscape of deep nets via differential geometric principles. The first insights are fantastic!
  • June 13, 2023. New preprint from Federico! We propose a Laplace approximation for Bayesian neural nets that adapts locally to the structure of the true posterior, by considering the parameter space as a Riemannian manifold! A simple trick full of potential!
  • April 25, 2023. Great work from Ricardo, and our paper got accepted at ICML 2023! Establishing connections between causality and differential geometry! Groovy!
  • July 11-15, 2022. I participated in the LogML summer school as a tutor! A great event, and very fun to interact with such a cool group of students!
  • March 22, 2022. I gave a talk at SIAM - Geometric Methods for Understanding and Applying Machine Learning.
  • March 16, 2022. New preprint from Hadi, which brings together geometry, robots, variational auto-encoders and change of variables!
  • February 1, 2022. I started as a tenure-track assistant professor at the Section for Cognitive Systems at the Technical University of Denmark. Fantastic to be back again! :-)
  • January 19, 2022. Two papers accepted at AISTATS 2022!
  • November 2, 2021. I gave a talk at the Computer Lab @ University of Cambridge about differential geometry for representation learning. Cool group! I wish I could visit!
  • October 5, 2021. Check our new preprint where we study several properties of random deep nets based on the stable rank of the weight matrices.
  • July 17, 2021. Our paper at R:SS 2021 got a best student paper award. Congratulations Hadi!
  • July 15, 2021. I gave a talk about Riemannian manifolds learned from data at the workshop Geometry and Topology in Robotics: Learning, Optimization, Planning and Control (R:SS 2021).
  • June 9, 2021. New preprint online! Using information geometry, we show that we can compute continuous shortest paths in the latent space of generative models that model non-continuous data. Quite funky!
  • May 11, 2021. Hadi's paper got accepted to R:SS 2021! We control robots with geodesics in the latent space of generative models, while taking objects into account by changing the geometry of the ambient space!
  • May 8, 2021. Christian's paper got accepted to ICML 2021! We apply probabilistic numerics techniques (Bayesian quadrature) to compute expectations on Riemannian manifolds under locally adaptive normal distributions! Cool!
  • March 9, 2021. In our new preprint, we show that the induced Riemannian metric in the latent space of a generative model can be approximated by a conformal metric based on a simple learnable prior. This turns out to be robust and efficient, while the behavior of shortest paths remains similar.
  • January 23, 2021. Our paper "Geometrically Enriched Latent Spaces" accepted at AISTATS 2021!
  • August 4, 2020. Check our new preprint. We replace the Euclidean metric of the ambient space with a data-based Riemannian metric. Thus, we encode domain knowledge into the latent space of a generative model and control the shortest paths therein.
  • June 1, 2020. Best way to start the summer! Dimitris' paper got accepted at ICML 2020! We use Brownian motion on Riemannian manifolds to define a flexible prior in the latent space of a VAE.
  • April 27, 2020. Due to COVID-19, the Machine Learning Summer School (MLSS) 2020 at Tübingen will be a virtual event. This will be the first time such an event is held in virtual form. Eager to see the result.
  • February 19, 2020. Since the latent space of a VAE is a curved space, prior distributions therein can be defined accordingly. Check the new preprint from Dimitris.
  • September 3, 2019. Back to back! For the second year in a row, I got a best reviewer award, this time for NeurIPS 2019.
  • June 25, 2019. I will be one of the organizers of the Machine Learning Summer School (MLSS) 2020 at Tübingen.
  • June 1, 2019. I start my postdoc in the Empirical Inference department at the Max Planck Institute for Intelligent Systems in Tübingen, where I will be working with Bernhard Schölkopf. Das ist fantastisch! :-)
  • May 31, 2019. Here you can find my PhD thesis, in case you are interested in how differential geometry can be applied in machine learning. For any questions or doubts (or potential mistakes), please feel free to contact me. :-)
  • April 16, 2019. Very long trip to Okinawa (Japan) for AISTATS 2019; however, the conference was worth it.
  • April 12, 2019. My PhD is officially finished after a great defence. Special thanks to the wonderful committee: Mark Girolami, David Pfau and Ole Winther.
  • March 25, 2019. I uploaded the code from my PhD thesis and papers here.
  • January 31, 2019. My last day as a PhD student; I just submitted my thesis. Very strange and mixed feelings! Quite unpleasant to leave my awesome group, but I am getting ready for the new challenge.
  • January 23, 2019. The preprint of the AISTATS 2019 paper is now available.
  • December 24, 2018. Our work "Fast and Robust Shortest Paths on Manifolds Learned from Data" accepted at AISTATS 2019. What a fantastic way to finish the PhD. :-)
  • September 9, 2018. It seems that I was one of the best reviewers for NeurIPS 2018. Indeed, very flattering.
  • April 1, 2018. Back to base! Returned from my external stay with fantastic experiences; thanks to all the people there!
  • January 29, 2018. Our paper on the latent space geometry of deep generative models accepted at ICLR 2018. Wonderful!
  • December 8, 2017. At the Geometric Data Analysis workshop at NeurIPS 2017, I presented an overview of our work on the LAND model and its extensions.
  • November 8, 2017. Very happy to give a talk about geometrical machine learning models at the Geometric Science of Information conference. By the way, Paris is beautiful.
  • October 31, 2017. New preprint: when deep learning meets geometry. See why the latent space geometry of the notorious deep generative models is highly non-linear.
  • September 1, 2017. Starting my external stay at the Max Planck Institute for Intelligent Systems in Tübingen. I will be part of the Probabilistic Numerics group for the next six months.
  • July 3, 2017. We continue our research on the LAND model, trying to estimate the Riemannian metric in our new paper at GSI 2017.
  • June 12, 2017. The summer school in Nonlinear Statistics at DIKU was amazing. Super interesting to see how geometry is used in several other research fields.
  • December 6, 2016. We presented our work at NeurIPS 2016! Great experience to be a part of such a conference.
  • September 12, 2016. I participated in the Gaussian Process Summer School at Sheffield. Uncertainty quantification rules!
  • August 12, 2016. Our paper on locally adaptive normal distributions accepted at NeurIPS 2016. This is awesome!
  • June 1, 2016. My personal website is finally online.

Research interests
  • Differential geometry in machine learning
  • Representation learning
  • Generative models
  • Deep learning theory
  • Approximate Bayesian inference

Team

Under Construction...


Alumni

Alison Pouplin (PhD student, co-supervised)

Information

If you are interested in writing a BSc / BEng / MSc thesis in our group, please review the following information:

1. You should have attended relevant courses in the field of machine learning (e.g., 02450, 02460, 02456, 02477), and also have a good background in mathematics (calculus/statistics/linear algebra).

2. I focus on the research topics listed below. Please check whether any of these resonate with your interests.

3. Please send an email with the following information:

  • Transcript of records and current study status.
  • Relevant courses/projects/experience in machine learning.
  • Timeline (start and duration) and number of ECTS.
  • Which topic you are considering.

Ideally, you should work on one of the suggested topics, and we will discuss particular ideas when we meet. If you have another idea, please provide a short description covering the problem, related literature, your planned approach/method, access to relevant data, etc. This process (usually) takes a couple of weeks, so please be proactive.

Suggested topics:

  1. Generative models. Learning the geometry of the manifold that approximates the data enhances representation learning and statistical modeling.
  2. Deep learning theory. One of the most interesting challenges is understanding why deep learning models generalize so well on unseen data.
  3. Optimization techniques. The properties of optimization algorithms influence the behavior of modern machine learning models.