
Yucheng Lu (陆昱成)
Email: yl2967 [at] cornell [dot] edu
Google Scholar / Twitter

Short Bio

Yucheng Lu obtained his Ph.D. in Computer Science at Cornell, where he was advised by Prof. Chris De Sa. Yucheng is broadly interested in building scalable, provably efficient, and ubiquitous deep learning systems. His projects have touched on model compression, communication compression, decentralized/distributed ML, sample ordering, and more.

Yucheng's research has been recognized with an ICML Outstanding Paper Award (Honorable Mention) and a Meta PhD Fellowship. He has also worked or interned at Microsoft DeepSpeed, Google Cerebra, and Amazon Forecast.

Yucheng obtained his BEng degree in Electronic Engineering from Shanghai Jiao Tong University.

I graduated from Cornell in 2023, so this website is no longer updated. Please go to my new website.

Updates

[Apr’23] CocktailSGD is accepted to ICML’23: we empirically evaluate the effect of communication compression on distributed LLM fine-tuning!

[Apr’23] STEP is accepted to ICML’23: we propose an Adam-aware recipe for learning N:M structured sparsity masks on LLMs!

[Jan’23] 0/1 Adam is accepted to ICLR’23: we propose an Adam variant to accelerate LLM pretraining in distributed systems!

[Oct’22] Received a NeurIPS’22 Scholar Award, thanks!

[Sep’22] GraB is accepted to NeurIPS’22: we propose algorithms that construct provably better data permutations than random reshuffling!

[Feb’22] Won a 2022 Meta PhD Fellowship, thanks Meta!

[Jan’22] QMC-Example-Selection is accepted to ICLR’22 as a spotlight (5%): we analyzed the complexity of example selection and proposed two related algorithms!

[Oct’21] Outstanding Reviewer Award (8%) at NeurIPS’21!

[Sep’21] HyperDeception is accepted to NeurIPS’21: we studied justifiable hyperparameter optimization via modal logic!

[Jul’21] DeTAG won the Outstanding Paper Award Honorable Mention at ICML’21 (5 out of 5513 submissions)!

[May’21] DeTAG is accepted to ICML’21 as a Long Oral (3%): we discussed the theoretical limits of decentralized training and how to achieve them!

[May’21] SCott is accepted to ICML’21: we discussed how to use stratification in training forecasting models!

[May’20] Moniqua is accepted to ICML’20: we discussed how to compress communication in learning systems without additional memory!