Yucheng Lu (陆昱成)
Email: yl2967 [at] cornell [dot] edu
Address: 325 Gates Hall, Cornell University, Ithaca, NY, 14853
Google Scholar / Twitter

Short Bio

I am a Ph.D. student in Computer Science at Cornell University, advised by Prof. Chris De Sa. Our group webpage can be found here. I am broadly interested in building scalable and provably correct machine learning systems. I obtained my BEng degree in Electronic Engineering from Shanghai Jiao Tong University.

I am a Research Intern at Microsoft during summer and fall 2021. In summer and fall 2020, I was an Applied Scientist Intern at Amazon. Prior to Cornell, I spent some time at NetDB and IIOT.

News

[Sep’21] HyperDeception is accepted by NeurIPS’21; we studied justifiable hyperparameter optimization via modal logic!

[Jul’21] DeTAG wins Outstanding Paper Award Honorable Mention at ICML’21!

[May’21] DeTAG is accepted by ICML’21 as a Long Oral (3%); we discussed the theoretical limits of decentralized training and how to achieve them!

[May’21] SCott is accepted by ICML’21; we discussed how to use stratification in training forecasting models!

[Apr’21] A short version of HyperDeception is accepted by the Robust ML workshop at ICLR’21; we discussed how hyperparameter optimization can be misleading!

[May’20] Moniqua is accepted by ICML’20; we discussed how to compress communication in learning systems without additional memory!

Talks

“Decentralized Deep Learning: Theory and Applications”, UC Berkeley, Sep 2021

“Optimal Complexity in Decentralized Training”, ICML 2021 Long Talk & TechBeat, July 2021

“Variance Reduced Training with Stratified Sampling for Forecasting Models”, ICML 2021 Short Talk, July 2021

“Moniqua: Modulo Quantized Communication in Decentralized SGD”, ICML 2020 Short Talk, July 2020