About Me
Welcome to my page. I am currently a Ph.D. student at UCLA, advised by Prof. Cho-Jui Hsieh. My research focuses on efficient and automated machine learning methods. Besides research, I am also open to venture capital and entrepreneurial opportunities.
I obtained my dual B.S. degree in Computer Science and Statistics at the University of Michigan, graduating with Highest Distinction. During this period, I interned at Microsoft Research and SenseTime, working on machine learning and computer vision, and helped a startup develop its prototype robots. Prior to that, I worked on quantitative investing at the Shanghai Key Laboratory of Finance.
Research Overview
I study the problem of AI for AI: leveraging the power of AI to automate its own development. For the past two years, I have mainly focused on a prominent direction under this umbrella, Automated Machine Learning (AutoML), which includes:
- Dataset Distillation/Condensation (DD/DC)
- Neural Architecture Search (NAS)
- Optimizer Search (OS), e.g., [ENOS]
AutoML is a highly general field with connections to many, if not all, facets of machine learning and its applications. Because of this, I am also involved (or have been involved) in several related topics, such as:
- Transformers (Efficient Inference, Multi-Modality, etc.)
- Scalable Graph Learning Algorithms
- Adversarial Robustness
- Federated Learning
Outreach
- Internship: I am currently looking for internship opportunities for next year. My specialties lie in Dataset Distillation, general AutoML, and efficient Transformers, but I am open to other interesting domains as well.
- VC/Startups: If you are in the VC/startup business and are looking for people with domain knowledge in AI, I'd be delighted to have a chat with you.
News
- [Nov 2022] We released TESLA, one of the first methods to scale up Dataset Distillation to ImageNet-1K, surpassing prior art by a large margin.
- [Sep 2022] The first benchmark for Dataset Condensation methods is accepted at NeurIPS 2022; check out DC-BENCH.
- [Sep 2022] Our Efficient Optimizer Search framework is accepted at NeurIPS 2022; check out [ENOS].
- [Jul 2022] We released DC-BENCH, the first benchmark for evaluating Dataset Condensation methods.
- [May 2022] I started my internship with the Perception Team at Google Research, co-advised by Dr. Boqing Gong and Dr. Ting Liu.
- [May 2022] I received the Outstanding Graduate Student Award for the Master's degree at UCLA.
- [Jan 2022] Two papers (one 1st author) accepted at ICLR 2022.
- [Jul 2021] One paper (1st author) accepted at ICCV 2021.
- [May 2021] I will present our paper in the Outstanding Paper Session at ICLR 2021.
- [Apr 2021] Our paper "Rethinking Architecture Selection in Differentiable NAS" won the Outstanding Paper Award at ICLR 2021.
- [Jan 2021] Two papers (1st author) accepted at ICLR 2021.
Education & Experiences
University of California, Los Angeles
Ph.D. in Computer Science, 2020 - 2024
M.S. in Computer Science, GPA=4.0/4.0
University of Michigan - Ann Arbor
B.S. in Computer Science and Statistics, GPA=4.0/4.0, 2015 - 2019
Shanghai University of Finance and Economics
Finance (Honors Class, 30 students selected from the entire college), GPA=3.93/4.0 (1st), 2013 - 2015 (Transferred)
Under Review
Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory
Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh
arXiv, 2022
[Paper]
FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning
Yuanhao Xiong*, Ruochen Wang*, Minhao Cheng, Felix Yu, Cho-Jui Hsieh
NeurIPS 2022 Workshop on Federated Learning
[Paper]
Publications
* denotes equal contribution
DC-BENCH: Dataset Condensation Benchmark
Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh
To appear in NeurIPS 2022
[Paper] [Code] [Leaderboard]
Efficient Non-Parametric Optimizer Search for Diverse Tasks
Ruochen Wang, Yuanhao Xiong, Minhao Cheng, Cho-Jui Hsieh
To appear in NeurIPS 2022
[Paper] [Code coming soon]
Generalizing Few-Shot NAS with Gradient Matching
Shoukang Hu*, Ruochen Wang*, Lanqing Hong, Zhenguo Li, Cho-Jui Hsieh, Jiashi Feng
In International Conference on Learning Representations (ICLR), 2022.
Learning to Schedule Learning Rate with Graph Neural Networks
Yuanhao Xiong, Li-Cheng Lan, Xiangning Chen, Ruochen Wang, Cho-Jui Hsieh
In International Conference on Learning Representations (ICLR), 2022.
[Paper]
RANK-NOSH: Efficient Predictor-Based NAS via Non-Uniform Successive Halving
Ruochen Wang, Xiangning Chen, Minhao Cheng, Xiaocheng Tang, Cho-Jui Hsieh
In International Conference on Computer Vision (ICCV), 2021.
Rethinking Architecture Selection in Differentiable NAS
Ruochen Wang, Minhao Cheng, Xiangning Chen, Xiaocheng Tang, Cho-Jui Hsieh
In International Conference on Learning Representations (ICLR), 2021. (Outstanding Paper Award)
DrNAS: Dirichlet Neural Architecture Search
Xiangning Chen*, Ruochen Wang*, Minhao Cheng*, Xiaocheng Tang, Cho-Jui Hsieh
In International Conference on Learning Representations (ICLR), 2021.
Awards & Honors
Academia
- Outstanding Graduate Student Award (Master's degree, 1 per department) - UCLA CS Department, 05/2022
- Outstanding Paper Award - ICLR, 04/2021
- Highest Distinction Graduate Award - The University of Michigan, 08/2019
- Berkeley Fung’s Excellence Scholarship - UC Berkeley Graduate Admission, 03/2019
- James B. Angell Scholar - The University of Michigan, 2017-2019
- EECS Scholar - The University of Michigan, 2017-2019
- University Honors - The University of Michigan, 2015-2018
- Shanghai City Scholarship - Shanghai Municipal People's Government, 09/2014
- People's Scholarship (1st) - Shanghai University of Finance and Economics, 09/2014
Industry
- Award of Excellence - Microsoft Research Asia (MSRA), 09/2019
- Outstanding Intern Award - SenseTime, 01/2019
- Honorable Employee - OvoTechnologies, 09/2016
Services
- Reviewer for ICML 2021-2022, NeurIPS 2021-2022, ICLR 2022-2023, TMLR, and CVPR 2023