Kilian Q. Weinberger

Bayesian Optimization

Download Matlab CODE


Hyperparameter optimization is one of the biggest practical problems in many settings within and outside of computer science.
The slowest method is to exhaustively try out many different hyperparameter settings on a predefined grid (often referred to as grid search).
You can typically do better with random search. Often the best approach is to train a small machine learning model (for example, a Gaussian process) to predict which parameter settings are most promising,
and then explore the parameter space guided by these predictions, a strategy known as Bayesian optimization (sketched below).
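
The following is a minimal sketch of this model-based approach in Python/NumPy (illustrative only, not the Matlab code linked above): a Gaussian-process surrogate with an RBF kernel is fit to the hyperparameter settings evaluated so far, and the next setting is chosen by maximizing expected improvement over a random set of candidates. The toy objective and all function names are placeholders.

import numpy as np
from scipy.stats import norm

def objective(x):                        # stand-in for "train model, return validation error"
    return np.sin(3 * x) + 0.1 * x**2

def rbf(a, b, length=0.3):               # squared-exponential (RBF) kernel for 1-D inputs
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

def gp_posterior(X, y, Xs, noise=1e-6):  # GP posterior mean and std at candidate points Xs
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks).clip(min=1e-12)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):  # EI for minimization of the objective
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=3)            # a few random initial settings
y = objective(X)
for _ in range(20):
    cand = rng.uniform(-2, 2, size=200)   # candidate hyperparameter settings
    mu, sigma = gp_posterior(X, y, cand)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))
print("best setting:", X[np.argmin(y)], "value:", y.min())

In practice the candidate set, kernel, and acquisition function are tailored to the problem; the papers below extend this basic loop, for example to additive structure and to inequality constraints.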





Relevant publications:

[PDF] Product Kernel Interpolation for Scalable Gaussian Processes
Jacob Gardner, Geoff Pleiss, Ruihan Wu, Kilian Q. Weinberger, Andrew G. Wilson
Artificial Intelligence and Statistics (AISTATS), 2018, in press.

[PDF] Discovering and Exploiting Additive Structure for Bayesian Optimization
Jacob Gardner, Chuan Guo, Kilian Q. Weinberger, Roman Garnett, Roger Grosse
Artificial Intelligence and Statistics (AISTATS), 2017, pages 1311-1319.

[PDF][BIBTEX] Bayesian Active Model Selection with an Application to Automated Audiometry
Jacob Gardner, Gustavo Malkomes, Roman Garnett, Kilian Q. Weinberger, Dennis Barbour, John P. Cunningham
Neural Information Processing Systems (NIPS), 2015, Curran Associates, pages 2386-2394.

[PDF] Psychophysical Detection Testing with Bayesian Active Learning
Jacob R. Gardner, Xinyu Song, Kilian Q. Weinberger, Dennis Barbour, John Cunningham
Conference on Uncertainty in Artificial Intelligence (UAI), Amsterdam, Netherlands, in press.

[DOI] Fast, Continuous Audiogram Estimation Using Machine Learning
Song XD, Wallace BM, Gardner JR, Ledbetter NM, Weinberger KQ, Barbour DL
Ear and Hearing, 2015 Nov-Dec; 36(6):e326-35.


[PDF][CODE][BIBTEX] Bayesian Optimization with Inequality Constraints
Jacob Gardner, Matt Kusner, Zhixiang (Eddie) Xu, Kilian Q. Weinberger, John Cunningham
International Conference on Machine Learning (ICML), Beijing, China. JMLR W&CP 32(1):937-945, 2014.