Optimal Algorithms for Continuous Non-Monotone Submodular Maximization (via Zoom)

Abstract: In this talk, I will explain our result on designing optimal approximation algorithms for maximizing continuous non-monotone submodular functions over the hypercube. This family of optimization problems has several applications in machine learning, finance, and social network analysis. Our main result is the first 1/2-approximation algorithm for this problem; this approximation factor is the best possible for algorithms that query the objective function at only polynomially many points. For the special case of DR-submodular maximization, i.e., when the submodular function is also coordinate-wise concave along all coordinates, we provide a different 1/2-approximation algorithm that runs in quasi-linear time in the dimension. Both of these results improve upon prior work [Bian et al., 2017; Buchbinder et al., 2012].

Our first algorithm uses novel ideas, such as reducing the analysis of the approximation guarantee to a stylized zero-sum game for each coordinate, and then exploits the geometry of this game to choose a value for that coordinate. Our second algorithm uses coordinate-wise concavity to identify a monotone equilibrium condition that is sufficient for the required approximation guarantee, and finds the equilibrium point by binary search.
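To give a flavor of the second idea: coordinate-wise concavity means the partial derivative of the objective along any coordinate is non-increasing in that coordinate, so a stationary value can be located by binary search. The sketch below is an illustrative toy version of this principle, not the paper's algorithm; the names (f, dim, iters) and the finite-difference derivative estimate are my own assumptions for the example.

```python
# Toy sketch: binary search for a stationary value along each coordinate
# of a coordinate-wise concave function on the hypercube [0, 1]^dim.
# NOT the paper's algorithm -- just an illustration of why coordinate-wise
# concavity makes binary search applicable.

def partial_derivative(f, x, i, h=1e-6):
    """Central finite-difference estimate of df/dx_i at x (clipped to [0, 1])."""
    x_hi = list(x); x_hi[i] = min(1.0, x_hi[i] + h)
    x_lo = list(x); x_lo[i] = max(0.0, x_lo[i] - h)
    return (f(x_hi) - f(x_lo)) / (x_hi[i] - x_lo[i])

def coordinate_binary_search(f, dim, iters=40):
    """Fix coordinates one at a time at an (approximately) stationary value.

    Coordinate-wise concavity implies the partial derivative along
    coordinate i is non-increasing in x_i, so its sign change can be
    bracketed by bisection.
    """
    x = [0.0] * dim
    for i in range(dim):
        lo, hi = 0.0, 1.0
        for _ in range(iters):
            mid = (lo + hi) / 2.0
            x[i] = mid
            if partial_derivative(f, x, i) > 0:
                lo = mid  # derivative still positive: move right
            else:
                hi = mid  # derivative non-positive: move left
        x[i] = (lo + hi) / 2.0
    return x
```

For instance, on the separable DR-submodular function f(x) = sum_i (x_i - x_i^2), each coordinate's derivative 1 - 2x_i vanishes at 1/2, and the bisection converges there.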

The talk is based on the paper "Optimal Algorithms for Continuous Non-monotone Submodular and DR-Submodular Maximization," joint with Tim Roughgarden and Joshua Wang. A preliminary conference version was presented at NeurIPS'18 (oral presentation). Journal version: https://www.jmlr.org/papers/volume21/18-527/18-527.pdf

Bio: Rad Niazadeh is an Assistant Professor of Operations Management at The University of Chicago Booth School of Business. Prior to joining Chicago Booth, he was a visiting researcher at Google Research NYC and a Motwani postdoctoral fellow in the Computer Science Department at Stanford University. He obtained his PhD in Computer Science (with a minor in Applied Mathematics) from Cornell University. He primarily studies the interplay between algorithms, incentives, and learning, with the goal of advancing the theoretical methodologies and foundations of market design and operations in dynamic and complex environments. Rad has received several awards for his research, including the INFORMS Auctions and Market Design 2021 Rothkopf Junior Researcher Paper Prize (first place), the INFORMS Revenue Management and Pricing Dissertation Award (honorable mention), and the Google PhD Fellowship in Market Algorithms.