It is almost impossible to miss the fact that “big data” is a topic du jour, and that numerous conferences, talks, and publications feature the specific combination of optimization with big data. Hence some have asked whether continuous optimization problems without big data have ceased to be interesting.

We try to address this question from the perspective of a numerical analyst/computer scientist by examining recent work on several related and overlapping questions, with a particular interest in “nasty” instances: (i) What makes an optimization problem hard? (ii) How should the size and complexity of an optimization problem be defined? (iii) What advice can reliably be provided about the best methods for solving a given optimization problem (or family of problems)? (iv) And what about the best software?

Margaret H. Wright is the Silver Professor of Computer Science and former Chair of the Computer Science Department at the Courant Institute of Mathematical Sciences, New York University, with research interests in optimization, linear algebra, and scientific computing. She developed an interest in mathematics at an early age and studied the subject at Stanford University, where she received a B.S. degree in Mathematics, and an M.S. and eventually a Ph.D. in Computer Science. She is a member of the National Academy of Sciences and the National Academy of Engineering. She has served as president of the Society for Industrial and Applied Mathematics (SIAM) and is senior editor of the SIAM Review. In 2009 she became a Fellow of SIAM, and in 2012 a Fellow of the American Mathematical Society.