About Me
(This site will be deprecated soon)
I am a 4th-year Ph.D. student at UCLA advised by Prof. Cho-Jui Hsieh, and the founder and current principal of the ARC Group, an AIGC Research Collaboration that aims to explore core technologies in large-scale foundation models and eventually productize the results.
I am also serving as a long-term Student Researcher at Google Research (Ads ML) working on advancing the state of LLMs, hosted by Prof. Inderjit S. Dhillon, Felix Yu and Si Si.
Prior to the LLM era, I worked on AutoML and Dataset Compression.
I was the recipient of the Outstanding Paper Award at ICLR 2021.
Besides research, I am also interested in venture capital and entrepreneurial opportunities.
I obtained my B.S. degree (dual) in Computer Science and Statistics from the University of Michigan, with the highest distinction.
During this period, I interned at Microsoft Research and SenseTime, working on machine learning and computer vision, and helped a startup develop its prototype robots.
Prior to that, I worked on quantitative investing at the Shanghai Key Laboratory of Finance.
Research Overview
My research aims to enable self-improving AI, i.e., to leverage the power of AI Agents to automate their own development and deployment. To achieve this goal, I founded the AIGC Research Collaboration (ARC), a multi-lab research team focused on developing highly automated and trustworthy Multimodal Language Agents.
- LLM-era: core technologies in Multimodal Language Agents, including MLLM-Diffusion Synergy, Prompt Optimization, Reasoning and Safety of Multimodal Large Language Models (MLLMs).
- Pre-LLM-era: efficient ML pipelines, including AutoML and Dataset Compression.
Outreach
- [ARC Group] ARC (AIGC Research Collaboration) is a collective of multiple laboratories dedicated to exploring key areas in multimodal foundational agents (LLMs, MLLMs, Diffusion Models, etc.). The team aims to push the boundary of SOTA academic research and eventually commercialize the results. Since its establishment in Aug 2023, we have expanded to 15 researchers. For more information on our principles and how to apply, please refer to here.
- [VC/Startups] I’ve been interviewing for VC positions. If you are in the VC/startup business and are looking for people with domain knowledge in A.I., I’d be delighted to have a chat with you.
Highlighted News
- [Dec. 2023] Together with multiple renowned researchers across academia and industry, we are organizing the 1st Workshop on Dataset Distillation @CVPR 2024 to explore new frontiers in data-efficient ML. Please stay tuned for more details on submissions, speakers, and events!
- [Oct. 2023] I will return to Google Research (Ads ML) for another internship hosted by Prof. Inderjit S. Dhillon, Felix Yu, and Si Si, focusing on advancing the state of Large Language Models.
- [Aug. 2023] I formed a Research Alliance to pursue topics in AIGC (see the Outreach section for more info).
- [Apr. 2023] TESLA is accepted at ICML 2023 - one of the first methods to scale up Dataset Distillation to ImageNet-1K.
- [Jul. 2022] We released DC-BENCH - the first benchmark for evaluating Dataset Compression methods.
- [May 2022] I started my internship at Google Research.
- [May 2022] I received the Outstanding Graduate Student Award for the Master’s degree at UCLA.
- [Apr. 2021] Our paper “Rethinking Architecture Selection in Differentiable NAS” won the Outstanding Paper Award at ICLR 2021.
Experiences (Selected)
- ARC - AIGC Research Collaboration - Founder & Principal (Aug 2023 - Present)
Areas: Multimodal Multiagent Foundational Systems
Advisory Board: Tianyi Zhou, Minhao Cheng, Cho-Jui Hsieh
Student Researchers: Yuanhao Ban, Xirui Li, Sohyun An, Sen Li, Hengguang Zhou, Licheng Lan, Andrew Bai, Xiangwen Wang
- Google Research @AdsML - Student Researcher (Oct 2023 - Present)
Multimodal Large Language Models
Hosts: Inderjit S. Dhillon, Felix Yu, Si Si, Cho-Jui Hsieh
- Google Research - Student Researcher (May 2022 - 2023)
Text-to-Image Diffusion Models, Efficient Transformers
Hosts: Ting Liu and Boqing Gong
- Microsoft Research Asia (2019)
Neural Architecture Search
Host: Kai Chen
Publications (1st author)
- arXiv
Yuanhao Ban, Ruochen Wang, Tianyi Zhou, Minhao Cheng, Boqing Gong, Cho-Jui Hsieh
(arXiv), 2024.
ARC group present, Under Review
- arXiv
Sen Li, Ruochen Wang, Cho-Jui Hsieh, Minhao Cheng, Tianyi Zhou
(arXiv), 2024.
ARC group present, Under Review
- arXiv
Xirui Li, Ruochen Wang, Minhao Cheng, Tianyi Zhou, Cho-Jui Hsieh
(arXiv), 2024.
ARC group present, Under Review
- arXiv
Ruochen Wang*, Sohyun An*, Minhao Cheng, Tianyi Zhou, Sung Ju Hwang, Cho-Jui Hsieh (*Equal Contribution)
(arXiv), 2023.
ARC group present, Under Review
- arXiv
Ruochen Wang, Ting Liu, Cho-Jui Hsieh, Boqing Gong
(arXiv), 2023.
Under Review, work done at Google
- arXiv
Justin Cui, Ruochen Wang, Yuanhao Xiong, Cho-Jui Hsieh
(arXiv), 2023.
Under Review
- ICML
Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh
The International Conference on Machine Learning (ICML), 2023.
- CVPR
Yuanhao Xiong*, Ruochen Wang*, Minhao Cheng, Felix Yu, Cho-Jui Hsieh (*Equal Contribution)
The IEEE / CVF Computer Vision and Pattern Recognition Conference (CVPR), 2023.
- NeurIPS
Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh
Advances in Neural Information Processing Systems (NeurIPS), 2022.
- NeurIPS
Ruochen Wang, Yuanhao Xiong, Minhao Cheng, Cho-Jui Hsieh
Advances in Neural Information Processing Systems (NeurIPS), 2022.
- ICLR
Shoukang Hu*, Ruochen Wang*, Lanqing Hong, Zhenguo Li, Cho-Jui Hsieh, Jiashi Feng (*Equal Contribution)
The International Conference on Learning Representations (ICLR), 2022.
- ICLR
Yuanhao Xiong, Li-Cheng Lan, Xiangning Chen, Ruochen Wang, Cho-Jui Hsieh
The International Conference on Learning Representations (ICLR), 2022.
- ICCV
Ruochen Wang, Xiangning Chen, Minhao Cheng, Xiaocheng Tang, Cho-Jui Hsieh
The IEEE / CVF International Conference on Computer Vision (ICCV), 2021.
- ICLR
Ruochen Wang, Minhao Cheng, Xiangning Chen, Xiaocheng Tang, Cho-Jui Hsieh
The International Conference on Learning Representations (ICLR), 2021.
Outstanding Paper Award (1/8)
- ICLR
Xiangning Chen*, Ruochen Wang*, Minhao Cheng*, Xiaocheng Tang, Cho-Jui Hsieh (*Equal Contribution)
The International Conference on Learning Representations (ICLR), 2021.
Something personal
I enjoy learning and acquiring new skills in general, so I have to be selective with my hobbies to stay focused on the main quest. Currently, I’m sticking with reading across a wide range of topics and practicing swordsmanship.
- With books, I enjoy a diverse mix: history, philosophy, science, strategy, business, and biographies. I’m also pro-ebook, since carrying paper books around is impractical.
- For swordsmanship, I’m currently studying Katori Shintō-Ryū (天真正伝香取神道流) under Masashi Sensei, with a lineage tracing directly back to Otake Shihan (大竹利典師範).