Jean Kossaifi

Pioneering tensor methods and neural operators for scientific machine learning

About Me

I lead research at NVIDIA in the field of AI for Engineering and Scientific Simulation, where my work focuses on advancing new algorithmic paradigms to solve complex physics-based problems. My core research combines tensor methods with deep learning to develop efficient and powerful neural architectures.

A central part of my mission is to democratize advanced computational techniques. To that end, I created and lead the development of two widely used open-source libraries: TensorLy, for tensor methods, and NeuralOperator, for scientific machine learning, helping to accelerate scientific discovery for the broader research community.

Prior to NVIDIA, I was a founding member of the Samsung AI Center in Cambridge. My academic background includes a French Engineering Diploma in Mathematics, Computer Science, and Finance, as well as a BSc in advanced mathematics. I then completed my PhD in Artificial Intelligence at Imperial College London.

Research Interests

AI for Engineering · Tensor Methods · Neural Operators · Scientific Machine Learning · Deep Learning

Featured Work

A selection of projects and research I'm particularly proud of.

TensorLy: Democratizing Tensor Methods

Open-Source Software

Created the leading Python library for tensor methods, now with 1,000+ GitHub stars and used by researchers worldwide. TensorLy provides a high-level API for tensor decomposition and tensorized neural networks.
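To give a flavor of what a tensor decomposition computes, here is a minimal CP (PARAFAC) decomposition via alternating least squares, written in plain NumPy. This is an illustrative sketch of the underlying algorithm, not TensorLy's API; the function names and defaults are my own.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Kronecker product: (J, R) x (K, R) -> (J*K, R)."""
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def cp_als(X, rank, n_iter=300, seed=0):
    """Rank-`rank` CP decomposition of a 3-way tensor X by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, -1)                       # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, -1)    # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, -1)    # mode-3 unfolding
    for _ in range(n_iter):
        # Each factor update is a linear least-squares solve in closed form
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Recover a synthetic rank-2 tensor from its entries
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (6, 5, 4))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=2)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

In TensorLy itself, the same computation is a single high-level call (e.g. its `parafac` decomposition), with support for multiple numerical backends.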

ZenCFG

Open-Source Software

Easy configuration of Python projects with minimal boilerplate. A clean, Pythonic approach to configuration management, particularly adapted for deep learning research.
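The general pattern — typed, class-based configs with command-line overrides and no parsing boilerplate — can be sketched with standard-library dataclasses. This is an illustration of the idea only, not ZenCFG's actual API; the class and field names are hypothetical.

```python
from dataclasses import dataclass, fields

@dataclass
class TrainingConfig:
    # Hypothetical experiment settings, not ZenCFG's own schema
    lr: float = 1e-3
    batch_size: int = 32
    model: str = "fno"

def override(cfg, args):
    """Apply 'key=value' command-line style overrides, casting each
    value to the declared type of the corresponding field."""
    types = {f.name: f.type for f in fields(cfg)}
    for arg in args:
        key, raw = arg.split("=", 1)
        setattr(cfg, key, types[key](raw))
    return cfg

# Defaults, selectively overridden from the command line
cfg = override(TrainingConfig(), ["lr=0.01", "batch_size=64"])
```

Declaring configuration as typed classes keeps defaults, types, and overrides in one place, which is especially convenient when sweeping hyperparameters in deep learning experiments.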

Neural Operator

Open-Source Software

Learning operators that map between function spaces for solving PDEs and scientific computing problems. A new paradigm for AI in science.
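The core building block behind Fourier neural operators — a spectral convolution that acts on a function's Fourier coefficients, making the layer independent of the discretization — can be sketched in a few lines of NumPy. The fixed `weights` here stand in for learned parameters; the actual library implements this in PyTorch with trainable complex weights.

```python
import numpy as np

def spectral_conv_1d(u, weights):
    """FNO-style spectral convolution on a 1D grid: transform to Fourier
    space, scale the lowest modes by (learned) complex weights, zero the
    remaining modes, and transform back to the grid."""
    u_hat = np.fft.rfft(u)                    # Fourier coefficients of u
    out_hat = np.zeros_like(u_hat)
    n_modes = len(weights)
    out_hat[:n_modes] = weights * u_hat[:n_modes]
    return np.fft.irfft(out_hat, n=len(u))

# Sanity check: identity weights on every mode recover the input
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(x) + 0.5 * np.cos(3 * x)
w = np.ones(len(u) // 2 + 1, dtype=complex)
u_rec = spectral_conv_1d(u, w)
```

Because the layer is parameterized by Fourier modes rather than grid points, the same weights can be applied to inputs sampled at any resolution — one reason neural operators generalize across discretizations.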

TensorLy-Torch

Open-Source Software

Deep tensorized neural networks in PyTorch. Provides out-of-the-box tensor layers, decomposition methods, and utilities for building efficient deep learning models.
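The simplest instance of a tensorized layer is a dense layer whose weight matrix is replaced by a low-rank factorization, trading a small amount of expressivity for a large reduction in parameters. The sketch below uses plain NumPy to illustrate the parameter count; it is not TensorLy-Torch's API, and the dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 512, 256, 16

# Dense layer: one (d_out, d_in) weight matrix.
# Factorized layer: two thin factors U (d_out, rank) and V (d_in, rank),
# implicitly representing W = U @ V.T without ever forming it.
U = rng.standard_normal((d_out, rank)) / np.sqrt(rank)
V = rng.standard_normal((d_in, rank)) / np.sqrt(d_in)

def factorized_linear(x):
    """Computes x @ W.T with W = U V^T, using rank*(d_in + d_out)
    parameters instead of d_in*d_out."""
    return (x @ V) @ U.T

x = rng.standard_normal((8, d_in))          # a batch of 8 inputs
y = factorized_linear(x)

dense_params = d_in * d_out                 # 131072
factorized_params = rank * (d_in + d_out)   # 12288
```

Libraries like TensorLy-Torch generalize this idea beyond matrices, factorizing full weight tensors (e.g. of convolutional layers) with CP, Tucker, or tensor-train structure.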

Academic Service

Leadership & Editorial

2024 – Present

Peer Review

NeurIPS 2018, 2020 – 2024
ICML 2019, 2023 – 2025
ICLR 2019 – 2020, 2023
CVPR 2019 – 2020 (Outstanding Reviewer 2020)
ICCV 2019 – 2021
ECCV 2020
AAAI 2020
SciPy 2021
JMLR 2017 – 2022
TPAMI 2020 – 2021
Transactions on Signal Processing 2021 – 2022
Image and Vision Computing Journal 2014 – 2018
IEEE Transactions on Emerging Topics in Computing 2019
IEEE Sensors 2022

Selected Publications

TensorGRaD: Tensor Gradient Robust Decomposition for Memory-Efficient Neural Operator Training

Sebastian Loeschcke, David Pitt, Robert Joseph George, Jiawei Zhao, Cheng Luo, Yuandong Tian, Jean Kossaifi, Anima Anandkumar

arXiv preprint arXiv:2501.02379, 2025

Neural operators for accelerating scientific simulations and design

Kamyar Azizzadenesheli, Nikola Kovachki, Zongyi Li, Miguel Liu-Schiaffini, Jean Kossaifi, Anima Anandkumar

Nature Reviews Physics, 2024

A library for learning neural operators

Jean Kossaifi, Nikola Kovachki, Zongyi Li, David Pitt, Miguel Liu-Schiaffini, Robert Joseph George, Boris Bonev, Kamyar Azizzadenesheli, Julius Berner, Valentin Duruisseaux, others

arXiv preprint arXiv:2412.10354, 2024

Estimation of continuous valence and arousal levels from faces in naturalistic conditions

Antoine Toisoul, Jean Kossaifi, Adrian Bulat, Georgios Tzimiropoulos, Maja Pantic

Nature Machine Intelligence, 2021

Factorized Higher-Order CNNs with an Application to Spatio-Temporal Emotion Estimation

Jean Kossaifi, Antoine Toisoul, Adrian Bulat, Yannis Panagakis, Maja Pantic

IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2020

TensorLy: Tensor Learning in Python

Jean Kossaifi, Yannis Panagakis, Anima Anandkumar, Maja Pantic

Journal of Machine Learning Research (JMLR), 2019

Spectral learning on matrices and tensors

Majid Janzamin, Rong Ge, Jean Kossaifi, Anima Anandkumar

Foundations and Trends® in Machine Learning, 2019

Recent News

May 2025

Guest speaker at Oak Ridge National Laboratory's AI Expo, where I presented my work on Neural Operators for AI in Science and Engineering

2024

Co-authored a book chapter, "Tensor methods in deep learning", in the book Signal Processing and Machine Learning Theory

November 2024

Co-organized a workshop with the Topal team at INRIA Bordeaux on efficient scaling of neural architectures, covering re-materialization, offloading, scheduling, and model pipelining

July 2024

Co-organizer, ICML Workshop on Advancing Neural Network Training (WANT): Computational Efficiency, Scalability, and Resource Optimization