Hugues Van Assel

My primary research interests are in unsupervised and self-supervised learning. I develop computational methods that leverage optimal transport and probabilistic modeling to build robust, well-grounded data representations for real-world applications. I am especially interested in applying these methods to tackle challenges in cell biology.

I am a strong advocate for open and accessible science through projects such as:

  • TorchDR: a modular, GPU-friendly toolbox for dimensionality reduction (DR) that offers a unified interface for state-of-the-art DR methods (see the sketch below).
  • Stable-SSL: a library offering essential boilerplate code for a wide range of self-supervised learning tasks.
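
As a rough illustration of what a unified DR interface looks like in practice, here is a minimal sketch; it is not taken from the TorchDR documentation and assumes a scikit-learn-style TSNE estimator with a fit_transform method and the usual n_components / perplexity parameter names.

    import torch
    from torchdr import TSNE

    # Random features as a stand-in for a real dataset (e.g. single-cell profiles).
    x = torch.randn(1000, 50)

    # Scikit-learn-style estimator: compute a 2D embedding in one call.
    # n_components and perplexity follow the scikit-learn naming convention;
    # treat them as assumptions rather than the library's exact signature.
    z = TSNE(n_components=2, perplexity=30).fit_transform(x)
    print(z.shape)  # expected: (1000, 2)

Other DR estimators in the toolbox would be swapped in the same way, keeping the same fit_transform call.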

While finalizing my PhD thesis in the mathematics department of ENS Lyon, I am a visiting research fellow at Brown University, working with Randall Balestriero. Prior to my PhD, I studied at École polytechnique and in the MVA master's program (CV).

news

Dec 14, 2024 Happy to present our work on balanced data subsampling, with Randall Balestriero, at the NeurIPS SSL workshop.
Sep 17, 2024 TorchDR 0.1 is out! Feel free to give it a try!
Feb 12, 2024 New preprint out! Exciting work relating dimensionality reduction and clustering methods to the Gromov-Wasserstein OT problem.
Dec 10, 2023 I will be at NeurIPS 2023 to present SNEkhorn as well as two posters at the OTML workshop.

selected publications

  1. Preprint
    Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein
    Hugues Van Assel, Cédric Vincent-Cuaz, Nicolas Courty, Rémi Flamary, Pascal Frossard, and Titouan Vayer
    Preprint, 2024
  2. NeurIPS
    SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities
    Hugues Van Assel, Titouan Vayer, Rémi Flamary, and Nicolas Courty
    Advances in Neural Information Processing Systems, 2023
  3. NeurIPS
    A Probabilistic Graph Coupling View of Dimension Reduction
    Hugues Van Assel, Thibault Espinasse, Julien Chiquet, and Franck Picard
    Advances in Neural Information Processing Systems, 2022