I design better hardware–software abstractions through research in programming languages and computer architecture. Much of my work is on approximate computing: the idea that computers can be more efficient if they are allowed to be imperfect. To help programmers trade off accuracy for efficiency, we need new languages, tools, processors, accelerators, memories, and compilers.

I am an assistant professor in the Department of Computer Science at Cornell University, where I am also part of the Computer Systems Laboratory. I graduated from the University of Washington in 2015. Here’s my CV.

latest blogging: December 4, 2017

FODLAM, a Poorly Named Tool for Estimating the Power and Performance of Deep Learning Accelerators

For a recent project, my group couldn’t find reusable, open-source tools for understanding the hardware costs of deep neural network accelerators. We’ve published a simple first-order model for the latency and energy of CNN execution.
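To illustrate what a first-order model of this kind can look like, here is a minimal sketch, not FODLAM's actual code: it estimates total latency and energy by summing per-layer costs computed from operation counts and assumed hardware constants. All names and the two hardware constants are illustrative assumptions, not measured values.

```python
# Hypothetical first-order accelerator model (illustrative sketch,
# not FODLAM itself): latency and energy each sum linearly over layers.

# Assumed hardware constants (placeholders, not real measurements):
PEAK_OPS_PER_SEC = 1e12   # 1 TOP/s peak MAC throughput
ENERGY_PER_OP_J = 1e-12   # 1 pJ per multiply-accumulate

def layer_cost(macs):
    """Latency (s) and energy (J) for one layer with `macs` MACs."""
    latency = macs / PEAK_OPS_PER_SEC
    energy = macs * ENERGY_PER_OP_J
    return latency, energy

def model_cost(layer_macs):
    """First-order totals: per-layer latencies and energies just add up."""
    costs = [layer_cost(m) for m in layer_macs]
    latency = sum(t for t, _ in costs)
    energy = sum(e for _, e in costs)
    return latency, energy

# Example: a tiny three-layer CNN with per-layer MAC counts.
lat, en = model_cost([2e9, 1e9, 0.5e9])
```

The appeal of a model this simple is that the two constants can be calibrated from a single published datapoint for a real accelerator, after which the model extrapolates to any network whose per-layer operation counts are known.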