Date: September 8, 2025

Time: 3:45-5 p.m.

Location: Gates 310 or via Zoom

Speaker: Robert Andrews, University of Waterloo
 

A color photo of a man smiling in front of a bookcase.

 

Bio: Robert Andrews is a postdoc at the University of Waterloo and the Simons Institute. He was previously a postdoc at the Institute for Advanced Study and received his Ph.D. from the University of Illinois Urbana-Champaign. His research interests include complexity theory, pseudorandomness, and the design of algorithms for algebraic problems.
 

Abstract: Solving systems of polynomial equations is a basic task in computational algebra. The complexity of solving systems of equations varies significantly depending on the parameter regime of interest: linear equations can be solved efficiently, while solving nonlinear equations is easily seen to be NP-hard. Less well-known is the fact that polynomial equations in a constant number of variables can be solved efficiently. For example, solving univariate equations amounts to factoring the GCD of the given polynomials, a task which can be performed in polynomial time.
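The univariate case above can be made concrete: over an algebraically closed field, a system of univariate equations has a common solution exactly when the GCD of the polynomials is nonconstant. A minimal Python sketch of that decision procedure (exact rational arithmetic with coefficient lists; an illustration, not code from the talk):

```python
from fractions import Fraction

def trim(p):
    """Strip leading zeros (coefficients are listed highest degree first)."""
    while len(p) > 1 and p[0] == 0:
        p = p[1:]
    return p or [Fraction(0)]

def poly_mod(a, b):
    """Remainder of a divided by b, via polynomial long division."""
    a = trim(list(a))
    while len(a) >= len(b) and a != [0]:
        q = a[0] / b[0]
        # subtract q * x^(deg a - deg b) * b, cancelling a's leading term
        for i in range(len(b)):
            a[i] -= q * b[i]
        a = trim(a[1:])
    return a

def poly_gcd(a, b):
    """Monic GCD by the Euclidean algorithm (inputs not both zero)."""
    a = trim([Fraction(c) for c in a])
    b = trim([Fraction(c) for c in b])
    while b != [0]:
        a, b = b, poly_mod(a, b)
    return [c / a[0] for c in a]  # normalize to a monic polynomial

def has_common_root(polys):
    """Over the complex numbers, f1 = ... = fk = 0 is solvable
    iff gcd(f1, ..., fk) has degree at least 1."""
    g = polys[0]
    for p in polys[1:]:
        g = poly_gcd(g, p)
    return len(trim(list(g))) > 1
```

For example, x^2 - 1 and x^2 - 3x + 2 share the root x = 1, so `has_common_root([[1, 0, -1], [1, -3, 2]])` returns `True`, while pairing x^2 - 1 with x^2 + 1 gives a constant GCD and the answer `False`.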

Algorithms to solve nonlinear equations in few variables often proceed by reduction to linear algebra. Is there a reduction in the other direction that shows linear algebra is necessary to solve polynomial equations? Recent work shows that this is not the case for equations in one variable: deciding solvability of a system of univariate equations is strictly easier than performing linear-algebraic computation. In this talk, we continue this line of work in the multivariate setting, showing that deciding satisfiability of polynomial equations in a constant number of variables is strictly easier than linear algebra. Specifically, in this parameter regime, the multivariate resultant—a nonlinear generalization of the determinant—can be computed by arithmetic circuits of polynomial size and constant depth, a fact which is false for the determinant itself.

Joint work with Abhibhav Garg.
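For readers unfamiliar with the resultant: in the simplest case of two univariate polynomials, the resultant is the determinant of the Sylvester matrix built from their coefficients, which gives a concrete sense of how the resultant generalizes the determinant. The talk concerns the multivariate resultant and its circuit complexity, not this construction; the following is only an illustrative Python sketch using exact rational arithmetic:

```python
from fractions import Fraction

def sylvester(f, g):
    """Sylvester matrix of f (degree m) and g (degree n);
    coefficient lists are highest degree first, size (m+n) x (m+n)."""
    m, n = len(f) - 1, len(g) - 1
    size = m + n
    rows = []
    for i in range(n):  # n shifted copies of f's coefficients
        rows.append([Fraction(0)] * i
                    + [Fraction(c) for c in f]
                    + [Fraction(0)] * (size - m - 1 - i))
    for i in range(m):  # m shifted copies of g's coefficients
        rows.append([Fraction(0)] * i
                    + [Fraction(c) for c in g]
                    + [Fraction(0)] * (size - n - 1 - i))
    return rows

def det(M):
    """Determinant by Gaussian elimination over the rationals."""
    M = [row[:] for row in M]
    n = len(M)
    d = Fraction(1)
    for col in range(n):
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            M[col], M[pivot] = M[pivot], M[col]
            d = -d  # a row swap flips the sign
        d *= M[col][col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= factor * M[col][c]
    return d

def resultant(f, g):
    """Res(f, g) = 0 exactly when f and g share a root."""
    return det(sylvester(f, g))
```

For instance, `resultant([1, 0, -1], [1, -1])`, i.e. Res(x^2 - 1, x - 1), is 0 because the two polynomials share the root x = 1, whereas Res(x^2 - 1, x - 2) is nonzero.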