Josh Alman  


Research Interests

Theoretical Computer Science, especially Algorithm Design, Complexity Theory, and Algebraic Methods

Research Areas

Algorithms, Data Science

Josh Alman is a theoretical computer scientist who works on designing more efficient algorithms for fundamental problems, and mathematically proving when algorithmic improvements are impossible. Much of his work focuses on how quickly one can perform basic algebraic tasks, and how computational algebraic tools can be applied to problems throughout computer science.

Alman combines insights from both algorithm design and complexity theory to approach problems in his research. Two topics he has studied extensively are algorithms for matrix multiplication and algorithms for computing important linear transforms such as Fourier transforms.

His work on matrix multiplication gives the fastest known algorithm for multiplying two matrices, and also proves mathematical barrier results explaining why computer scientists have been unable to design even faster algorithms for this important problem.
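As background for readers unfamiliar with the problem (this is an illustrative classic, not Alman's algorithm): the schoolbook method multiplies two n-by-n matrices with n^3 scalar multiplications, while Strassen's 1969 recursion uses only 7 products per 2-by-2 block instead of 8, giving roughly O(n^2.81) time. Modern work, including the refined laser method, pushes the exponent lower still. A minimal sketch of Strassen's recursion:

```python
import numpy as np

def strassen(A, B):
    """Multiply square matrices whose size is a power of 2 via Strassen's recursion."""
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    # Split each matrix into four half-size blocks.
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Seven recursive products (instead of the naive eight).
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # Recombine the products into the four blocks of the result.
    top = np.hstack([M1 + M4 - M5 + M7, M3 + M5])
    bottom = np.hstack([M2 + M4, M1 - M2 + M3 + M6])
    return np.vstack([top, bottom])

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, (4, 4))
B = rng.integers(-5, 5, (4, 4))
print(np.array_equal(strassen(A, B), A @ B))  # True
```

The saving of one multiplication per block compounds at every level of recursion, which is where the improved exponent comes from; state-of-the-art algorithms achieve their exponents by very different, non-recursive means.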

His work on linear transforms focuses on a technique called matrix rigidity, which studies how well the matrices underlying computational tasks can be decomposed as a sum of a low rank matrix and a sparse matrix. He has shown that important transforms like the Walsh-Hadamard transform have more efficient decompositions than previously conjectured to be possible, and he has given the first non-trivial construction of matrices without an efficient decomposition.
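To make the rigidity notion concrete: a matrix M is written as M = L + S, where L has low rank and S has few nonzero entries, and a matrix is "rigid" if no such efficient decomposition exists. The toy example below (purely illustrative, not one of the decompositions from this research) builds the 4-by-4 Walsh-Hadamard matrix and checks one simple candidate split:

```python
import numpy as np

def walsh_hadamard(k):
    """Return the 2^k x 2^k Walsh-Hadamard matrix via the Kronecker recursion."""
    H = np.array([[1]])
    base = np.array([[1, 1], [1, -1]])
    for _ in range(k):
        H = np.kron(base, H)
    return H

def check_decomposition(M, L, S):
    """Verify M = L + S and report (rank of L, number of nonzeros in S)."""
    assert np.array_equal(M, L + S)
    return int(np.linalg.matrix_rank(L)), int(np.count_nonzero(S))

H = walsh_hadamard(2)            # the 4x4 Walsh-Hadamard matrix
# One naive decomposition: peel off the rank-1 all-ones part,
# leaving nonzeros exactly where H has a -1 entry.
L = np.ones((4, 4), dtype=int)   # rank 1
S = H - L                        # sparse correction term
print(check_decomposition(H, L, S))  # (rank, nonzeros) -> (1, 6)
```

Rigidity lower bounds assert that every decomposition of a given matrix must pay in one of the two measures (rank of L or sparsity of S); the surprise in this line of work is that transforms like Walsh-Hadamard admit better trade-offs than had been conjectured.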

He has employed these and other techniques from algorithms, complexity, and algebra to a diverse set of problems throughout computer science, including nearest neighbor search, streaming algorithms, dynamic graph algorithms, fine-grained complexity, communication complexity, and data structure lower bounds.

Alman earned a B.S. in mathematics from MIT in 2014, an M.S. in computer science from Stanford in 2016, and a Ph.D. in computer science from MIT in 2019. Before coming to Columbia, he was a Michael O. Rabin postdoctoral fellow in theoretical computer science at Harvard University.


  • Assistant professor of computer science, Columbia University, 2021-
  • Michael O. Rabin Postdoctoral Fellow, Harvard University School of Engineering and Applied Sciences, 2019-2021


  • Machtey Award for Best Student Paper at FOCS 2019
  • Best Student Paper Award at CCC 2019
  • European Association of TCS Distinguished Dissertation Award, 2019
  • George M. Sprowls Award for outstanding Ph.D. theses in Computer Science at MIT, 2019


  • Kronecker Products, Low-Depth Circuits, and Matrix Rigidity
    In 53rd Annual ACM Symposium on the Theory of Computing (STOC 2021)
  • A Refined Laser Method and Faster Matrix Multiplication with Virginia Vassilevska Williams
    In 32nd Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2021)
  • Efficient Construction of Rigid Matrices Using an NP Oracle with Lijie Chen
    In 60th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2019)
  • Limits on the Universal Method for Matrix Multiplication
    In 34th Computational Complexity Conference (CCC 2019)
  • Limits on All Known (and Some Unknown) Approaches to Matrix Multiplication with Virginia Vassilevska Williams
    In 59th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2018)
  • Probabilistic Rank and Matrix Rigidity with Ryan Williams
    In 49th Annual ACM Symposium on the Theory of Computing (STOC 2017)
  • Polynomial Representations of Threshold Functions and Algorithmic Applications with Timothy M. Chan, Ryan Williams
    In 57th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2016)
  • Probabilistic Polynomials and Hamming Nearest Neighbors with Ryan Williams
    In 56th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2015)