I'm Yixuan Li. I am a PhD candidate at Cornell University, advised by John E. Hopcroft. My thesis committee members are Kilian Weinberger and Thorsten Joachims. The goal of my thesis research is to develop computational foundations and practical advances for scaling machine learning methods on web-scale data. I have pursued research on both theoretical and applied aspects of machine learning and perception.
A key focus of my recent work has been on computer vision and deep learning. Projects include deep representation learning for vision tasks, object detection for visual search, improving the computational efficiency of neural networks, adversarial training of deep generative models, improving neural network safety, and theoretical aspects of deep learning.
Prior to coming to Cornell, I graduated from Shanghai Jiaotong University with a B.Eng. in Information Engineering in 2013. I spent two summers at Google Research in Mountain View in 2015 and 2016. I spent summer 2017 as a machine learning scientist intern at GrokStyle, working on cutting-edge visual search technologies with deep learning and computer vision.
I travel and occasionally take photos. Here is my pictorial Travel Memo.
Update (8/5/2017): Selected as one of the Rising Stars in EECS 2017.
Update (6/6/2017): Paper accepted for publication in Transactions on Knowledge Discovery from Data (TKDD).
Update (5/16/2017): I will be speaking at Grace Hopper Conference (GHC) Artificial Intelligence track in October 2017.
Update (3/12/2017): Received ICLR 2017 Student Travel Award.
Update (2/27/2017): Paper on StackedGAN has been accepted into CVPR 2017.
Update (2/6/2017): Paper on Snapshot Ensembles has been accepted into ICLR 2017.
Update (12/20/2016): My summer internship paper at Google Research was invited to the industrial track of WWW 2017.
Update (2/5/2016): I will be interning at Machine Intelligence at Google Research (Mountain View) for the summer. I am very excited about it!
Update (2/4/2016): Paper on Convergent Learning has been accepted for oral presentation (5.7% acceptance rate) at ICLR 2016! (check out preprint here)