Non-Stochastic CDF Estimation Using Threshold Queries (via Zoom)

Abstract: Estimating the empirical distribution of a scalar-valued data set is a fundamental task. In this paper, we tackle the problem of estimating an empirical distribution in a setting with two challenging features. First, the algorithm does not directly observe the data; instead, it only asks a limited number of threshold queries about each sample. Second, the data are not assumed to be independent and identically distributed; instead, we allow for an arbitrary process generating the samples, including an adaptive adversary. These considerations are relevant, for example, when modeling a seller experimenting with posted prices to estimate the distribution of consumers’ willingness to pay for a product: offering a price and observing a consumer’s purchase decision is equivalent to asking a single threshold query about their value, and the distribution of consumers’ values may be non-stationary over time, as early adopters may differ markedly from late adopters. Our main result quantifies, to within a constant factor, the sample complexity of estimating the empirical CDF of a sequence of elements of [n], up to ε additive error, using one threshold query per sample. The complexity depends only logarithmically on n, and our result can be interpreted as extending the existing logarithmic-complexity results for noisy binary search to the more challenging setting where the noise is non-stochastic. Along the way to designing our algorithm, we consider a more general model in which the algorithm is allowed to make a limited number of simultaneous threshold queries on each sample. We solve this problem using Blackwell’s Approachability Theorem and the exponential weights method. As a side result of independent interest, we characterize the minimum number of simultaneous threshold queries required by deterministic CDF estimation algorithms.
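To make the observation model concrete, below is a minimal sketch (in Python, with hypothetical names such as `run_protocol` and `choose_threshold`) of the one-threshold-query-per-sample interaction described above: on each round the learner commits to a threshold (a posted price) and observes only a single bit indicating whether the sample lies at or below it (a purchase decision). The sketch illustrates only the query model, not the paper's algorithm; the true empirical CDF is computed at the end purely for comparison.

```python
import random
from bisect import bisect_right

def empirical_cdf(samples, n):
    """True empirical CDF of the sequence over [n]; for comparison only."""
    sorted_samples = sorted(samples)
    return [bisect_right(sorted_samples, v) / len(samples) for v in range(1, n + 1)]

def run_protocol(samples, n, choose_threshold):
    """One-threshold-query-per-sample protocol (illustrative sketch).

    On round t the learner picks a threshold q_t in [n] (a posted price)
    and observes only the bit 1{x_t <= q_t} (the purchase decision).
    `choose_threshold` is a hypothetical learner callback mapping the
    history of (threshold, bit) pairs to the next threshold.
    """
    history = []
    for x in samples:                  # the sample sequence may be adversarial
        q = choose_threshold(history)  # learner commits to a threshold first
        bit = int(x <= q)              # the only feedback about x
        history.append((q, bit))
    return history

def random_threshold(history, n):
    """Toy baseline learner: query a uniformly random threshold each round."""
    return random.randint(1, n)

if __name__ == "__main__":
    n = 100
    samples = [random.randint(1, n) for _ in range(1000)]  # stand-in data sequence
    history = run_protocol(samples, n, lambda h: random_threshold(h, n))
    # Crude point estimate of F(v): fraction of rounds that queried q == v and saw bit 1.
    v = 50
    hits = [b for (q, b) in history if q == v]
    print("estimate of F(50):", sum(hits) / len(hits) if hits else None)
    print("true F(50):     ", empirical_cdf(samples, n)[v - 1])
```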
Joint work with Vaishnavi Gupta, Bobby Kleinberg, and Eleanor Goh, to appear in SODA 2023.

Bio: Princewill is a 3rd-year Ph.D. candidate in the Computer Science Department at Cornell University, advised by Prof. Bobby Kleinberg. His interests are in learning theory, particularly in online and adversarial settings. So far, his work has focused on calibrated forecasting and CDF estimation.