Ruochen Wang

CS PhD at UCLA

github
linkedin
instagram

About Me

I am currently a Ph.D. student at UCLA, advised by Prof. Cho-Jui Hsieh. My research lies in efficient, automated, and robust deep learning methods. For the past two years, I have mainly focused on

  • Automated Machine Learning (AutoML)
    • Neural Architecture Search (NAS)
    • Learning to Learn (L2L)
    • Dataset Learning
  • Scalable Graph Learning Algorithms (GNNs)

Beyond research, I am also interested in venture capital and entrepreneurship.

I received a dual B.S. degree in Computer Science and Statistics from the University of Michigan, with highest distinction. During that time, I interned at Microsoft Research and SenseTime, working on machine learning and computer vision, and helped a startup develop its prototype robots. Prior to that, I worked on quantitative investing at the Shanghai Key Laboratory of Finance.

News

  • [Jan 2022] Two papers (one 1st-author) accepted at ICLR 2022.
  • [Dec 2021] I obtained my Master's degree.
  • [Jul 2021] One paper (1st-author) accepted at ICCV 2021.
  • [May 2021] I will present our paper in the Outstanding Paper Session at ICLR 2021.
  • [Apr 2021] Our paper "Rethinking Architecture Selection in Differentiable NAS" won the Outstanding Paper Award at ICLR 2021.
  • [Jan 2021] Two papers (1st-author) accepted at ICLR 2021.

Education & Experiences

  • University of California, Los Angeles
    Ph.D. in Computer Science, 2022 - 2024/25 (expected)
    M.S. in Computer Science, GPA=4.0/4.0, 2020 - 2021

  • University of Michigan - Ann Arbor
    B.S. in Computer Science and Statistics, GPA=4.0/4.0, 2015 - 2019

  • Shanghai University of Finance and Economics
    Finance (Honors Class, 30 students selected from the entire university), GPA=3.93/4.0 (ranked 1st), 2013 - 2015 (transferred)

Preprints & Submissions

Publications

* denotes equal contribution

Generalizing Few-Shot NAS with Gradient Matching
Shoukang Hu*, Ruochen Wang*, Lanqing Hong, Zhenguo Li, Cho-Jui Hsieh, Jiashi Feng
To appear in International Conference on Learning Representations (ICLR), 2022.

Learning to Schedule Learning Rate with Graph Neural Networks
Yuanhao Xiong, Li-Cheng Lan, Xiangning Chen, Ruochen Wang, Cho-Jui Hsieh
To appear in International Conference on Learning Representations (ICLR), 2022.

RANK-NOSH: Efficient Predictor-Based NAS via Non-Uniform Successive Halving
Ruochen Wang, Xiangning Chen, Minhao Cheng, Xiaocheng Tang, Cho-Jui Hsieh
In International Conference on Computer Vision (ICCV), 2021.

[PDF] [Code]

Rethinking Architecture Selection in Differentiable NAS
Ruochen Wang, Minhao Cheng, Xiangning Chen, Xiaocheng Tang, Cho-Jui Hsieh
In International Conference on Learning Representations (ICLR), 2021. (Outstanding Paper Award)

[PDF] [Code] [Talk] [Media]

DrNAS: Dirichlet Neural Architecture Search
Xiangning Chen*, Ruochen Wang*, Minhao Cheng*, Xiaocheng Tang, Cho-Jui Hsieh
In International Conference on Learning Representations (ICLR), 2021.

[PDF] [Code]

Awards & Honors

Academic

  • Outstanding Paper Award - ICLR, 04/2021
  • Berkeley Fung’s Excellence Scholarship - UC Berkeley Graduate Admission, 03/2019
  • James B. Angell Scholar - The University of Michigan, 2017-2019
  • EECS Scholar - The University of Michigan, 2017-2019
  • University Honors - The University of Michigan, 2015-2018
  • Shanghai City Scholarship - Shanghai Municipal People's Government, 09/2014
  • People's Scholarship (1st) - Shanghai University of Finance and Economics, 09/2014

Industrial

  • Award of Excellence - Microsoft Research Asia (MSRA), 09/2019
  • Outstanding Intern Award - SenseTime, 01/2019
  • Honorable Employee - OvoTechnologies, 09/2016

Services

  • Reviewer for ICML 2021, NeurIPS 2021, ICLR 2022, ICML 2022.