
Yucheng Lu (陆昱成)
Email: yl2967 [at] cornell [dot] edu
Google Scholar / Twitter

Short Bio

I am a Ph.D. candidate in Computer Science at Cornell University, advised by Prof. Chris De Sa. I am broadly interested in building scalable, provably correct, and ubiquitous machine learning systems. My projects have touched on model compression, communication compression, decentralized/distributed ML, and sample ordering, among other topics.

My work has been recognized with an ICML Outstanding Paper Award (Honorable Mention) and a Meta PhD Fellowship. I have also interned with Microsoft DeepSpeed, Google Cerebra, and Amazon Forecast.

I obtained my BEng degree in Electronic Engineering from Shanghai Jiao Tong University.

I am currently looking for full-time positions in both academia and industry; please reach out!

Updates

[Jan’23] 0/1 Adam was accepted to ICLR’23: we propose an Adam variant that accelerates LLM pretraining in distributed systems!

[Oct’22] Received a NeurIPS’22 Scholar Award, thanks!

[Sep’22] GraB was accepted to NeurIPS’22: we propose algorithms that construct provably better data permutations than random reshuffling!

[Feb’22] Won the 2022 Meta PhD Fellowship. Thanks, Meta!

[Jan’22] QMC-Example-Selection was accepted to ICLR’22 as a spotlight (5%): we analyze the complexity of example selection and propose two related algorithms!

[Oct’21] Received an Outstanding Reviewer Award (top 8%) at NeurIPS’21!

[Sep’21] HyperDeception was accepted to NeurIPS’21: we study justifiable hyperparameter optimization via modal logic!

[Jul’21] DeTAG won an Outstanding Paper Award Honorable Mention at ICML’21 (5 out of 5513 submissions)!

[May’21] DeTAG was accepted to ICML’21 as a Long Oral (3%): we discuss the theoretical limits of decentralized training and how to achieve them!

[May’21] SCott was accepted to ICML’21: we discuss how to use stratification when training forecasting models!

[May’20] Moniqua was accepted to ICML’20: we discuss how to compress communication in learning systems without additional memory!