Key Significance
Yuke Zhu is a leading researcher in Robot Learning and Embodied AI.
- NVIDIA GEAR Lab Co-Lead: Leads Generalist Embodied Agent research with Jim Fan
- GR00T Project Lead: Spearheads NVIDIA’s humanoid robot foundation model development
- robosuite Creator: Developed the standard simulation framework for robot learning
- Academia-Industry Bridge: Bridges academic research and industry applications through dual roles at UT Austin and NVIDIA
Profile
| Item | Details |
|---|---|
| Position | Associate Professor, UT Austin (2025-), Assistant Professor (2020-2025) |
| Affiliation | NVIDIA Research (Director & Distinguished Research Scientist) |
| Team | GEAR Lab (Generalist Embodied Agent Research) Co-Lead |
| Lab | RPL Lab (Robot Perception and Learning Lab) |
| PhD | Stanford University (2015-2019) |
| Advisors | Fei-Fei Li, Silvio Savarese |
| Citations | 34,000+ (Google Scholar) |
Education
| Period | Degree | Institution | Notes |
|---|---|---|---|
| 2015-2019 | Ph.D. Computer Science | Stanford University | Advisors: Fei-Fei Li, Silvio Savarese |
| 2013-2015 | M.S. Computer Science | Stanford University | |
| 2011-2013 | B.S. Computer Science | Simon Fraser University | First Class with Distinction |
| 2009-2013 | B.E. Computer Science | Zhejiang University | Dual Degree Program |
Career
NVIDIA Research (2020-present)
Director & Distinguished Research Scientist
| Year | Work | Impact |
|---|---|---|
| 2024 | GEAR Lab Founded | Co-founded with Jim Fan, generalist embodied agent research |
| 2024 | GR00T | Humanoid foundation model announcement |
| 2025 | GR00T N1 | Open humanoid VLA model release |
UT Austin (2020-present)
Associate Professor, Computer Science (2025-); Assistant Professor (2020-2025)
| Year | Work | Impact |
|---|---|---|
| 2020 | RPL Lab Founded | Robot Perception and Learning Lab Director |
| 2022 | NSF CAREER Award | Robot manipulation research funding |
| 2023 | MimicGen | Large-scale data generation from few demonstrations |
| 2024 | RoboCasa | Everyday environment simulation framework |
| 2025 | IEEE Early Career Award | Recognition for robot learning contributions |
Stanford (2013-2019)
Ph.D. Student, Stanford AI Lab
| Year | Work | Impact |
|---|---|---|
| 2017 | AI2-THOR | 3D indoor environment simulator |
| 2017 | Target-driven Navigation | Goal-conditioned visual navigation |
| 2019 | Making Sense of Vision and Touch | ICRA Best Paper Award |
| 2020 | robosuite | Robot learning simulation framework |
Research
Research Areas
Core Domains:
1. Robot Learning - Reinforcement learning, imitation learning
2. Computer Vision - Visual perception, scene understanding
3. Embodied AI - Agents that perceive, reason, and act in interactive environments
4. Simulation - Robot simulation, synthetic data generation
Research Philosophy
“My goal is to build algorithms and systems for autonomous robots and embodied agents that reason about and interact with the real world.”
Research Evolution
- 2015-2019 (Stanford): Visual navigation, perception-action loop
- 2019-2022: Simulation frameworks, robot manipulation
- 2022-2024: Data generation, foundation models
- 2024-present: Humanoid robotics, GR00T
Key Publications
Simulation & Benchmarks
- robosuite (2020) - Modular simulation framework for robot learning
- AI2-THOR (2017) - 3D indoor environment simulator
- RoboCasa (2024) - Everyday environment simulation
Robot Learning
- MimicGen (CoRL 2023) - Automated large-scale data generation from few demonstrations
- DexMimicGen (2024) - Bimanual dexterous manipulation data generation
- Making Sense of Vision and Touch (ICRA 2019) - Best Paper Award
Visual Navigation
- Target-driven Visual Navigation (ICRA 2017) - Goal-conditioned visual navigation
- Visual Semantic Planning (ICCV 2017) - Semantic planning
Foundation Models
- GR00T N1 (2025) - Humanoid robot foundation model
- MineDojo (NeurIPS 2022 Outstanding Paper) - Minecraft-based agent benchmark
GR00T Project
Role
Yuke Zhu co-leads NVIDIA’s GR00T (Generalist Robot 00 Technology) project.
GR00T N1 Architecture:
- Dual-system design (System 1 + System 2)
- Vision-Language Module (System 2): Environment interpretation, language understanding
- Diffusion Transformer (System 1): Real-time motor action generation
- Support for various humanoid robots
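The dual-system split above can be caricatured as a control loop in which System 2 refreshes a latent plan at low frequency while System 1 emits a motor action at every tick. The sketch below is a toy illustration under that assumption — every name in it is invented here, and it is not the actual GR00T N1 implementation:

```python
# Toy sketch of a dual-system VLA control loop (illustrative only).
# System 2 (slow): interprets observation + instruction into a latent plan.
# System 1 (fast): turns the latest plan into a motor action every tick.

from dataclasses import dataclass

@dataclass
class Plan:
    goal: str          # stand-in for a latent plan embedding
    step_issued: int   # control step at which the plan was produced

def system2_vlm(observation: str, instruction: str, step: int) -> Plan:
    """Stand-in for the vision-language module (runs at low frequency)."""
    return Plan(goal=f"{instruction} given {observation}", step_issued=step)

def system1_policy(plan: Plan, step: int) -> str:
    """Stand-in for the diffusion-transformer action head (runs every step)."""
    return f"action[{step}] toward '{plan.goal}'"

def control_loop(instruction: str, n_steps: int = 6, replan_every: int = 3):
    actions, plan = [], None
    for step in range(n_steps):
        # Slow loop: refresh the plan only every `replan_every` steps.
        if step % replan_every == 0:
            plan = system2_vlm(f"obs{step}", instruction, step)
        # Fast loop: emit an action at every control tick.
        actions.append(system1_policy(plan, step))
    return actions

actions = control_loop("pick up the cup")
```

The point of the split is that the expensive vision-language reasoning need not run at the robot's control rate; the fast policy keeps acting on the most recent plan between refreshes.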
Key Contributions
- Architecture Design: VLA (Vision-Language-Action) model structure
- Data Pyramid: Leveraging full spectrum from real to synthetic data
- Open Source Release: Democratizing research through GR00T N1 release
- Simulation Integration: Integration with Isaac Lab, robosuite
GR00T N1 (2025)
- NVIDIA’s first open humanoid foundation model
- Natural language instruction understanding and execution
- Human motion imitation learning
- Support for various robot embodiments
GEAR Lab
Collaboration with Jim Fan
GEAR Lab is an NVIDIA Research group co-led by Jim Fan and Yuke Zhu.
GEAR Lab Research Areas:
1. LLM-based Planning - Task planning with large language models
2. Vision-Language Models - Multimodal perception and grounding
3. Robotic Systems - Manipulation and locomotion
4. Simulation Infrastructure - Simulation tooling and synthetic data generation
Division of Expertise
| Researcher | Strengths | Representative Projects |
|---|---|---|
| Jim Fan | LLM, game agents, communication | Voyager, Eureka |
| Yuke Zhu | Robot systems, simulation, manipulation | robosuite, MimicGen |
Joint Projects
- GR00T / GR00T N1: Humanoid foundation model
- MineDojo: Minecraft agent benchmark (NeurIPS 2022 Outstanding Paper)
- GEAR Research Infrastructure: Isaac Lab, Omniverse integration
Awards & Honors
Major Awards
| Year | Award | Organization |
|---|---|---|
| 2025 | IEEE RAS Early Career Award | IEEE Robotics and Automation Society |
| 2022 | Outstanding Paper Award (MineDojo) | NeurIPS |
| 2022 | NSF CAREER Award | National Science Foundation |
| 2022 | Outstanding Learning Paper Award | ICRA |
| 2021 | Amazon Research Award | Amazon |
| 2021 | Best Multi-Robot Systems Paper (Finalist) | ICRA |
| 2019 | Best Conference Paper Award | ICRA |
| 2019 | Best Cognitive Robotics Paper (Finalist) | IROS |
Corporate Research Support
- Amazon Research Award (2021)
- JP Morgan Faculty Award
- Sony Research Award
Open Source Contributions
robosuite
robosuite: Modular simulation framework for robot learning
Features:
- MuJoCo physics engine based
- 10 commercial robot models supported (including GR1 humanoid)
- 9 grippers, 4 bases supported
- Photo-realistic rendering
- Jointly maintained by Stanford SVL, UT RPL, NVIDIA GEAR
Impact:
- Standard simulation environment for robot learning research
- Used in thousands of research projects
MimicGen
MimicGen: Automated large-scale training data generation from a small number of human demonstrations
Results:
- Fewer than 200 human demonstrations → 50,000+ auto-generated trajectories
- 18 tasks, multiple simulator support
- Validated on real robots
Impact:
- Dramatically reduced data collection costs
- Enables large-scale data acquisition essential for foundation model training
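MimicGen's core trick is to express a demonstrated end-effector segment relative to an object's pose, then re-anchor it wherever the object appears in a new scene. The toy sketch below illustrates only that idea, in 2-D with translation only; the actual system works with full 6-DoF poses, object-centric subtask segmentation, and validation of each generated trajectory:

```python
# Toy 2-D, translation-only sketch of MimicGen-style trajectory generation.

def relative_segment(demo_path, obj_pose):
    """Express a demonstrated end-effector path in the object's frame."""
    ox, oy = obj_pose
    return [(x - ox, y - oy) for x, y in demo_path]

def generate_trajectory(rel_segment, new_obj_pose):
    """Re-anchor the object-relative segment at the object's new pose."""
    nx, ny = new_obj_pose
    return [(x + nx, y + ny) for x, y in rel_segment]

# One human demo: the end-effector approaches an object placed at (2, 3).
source_demo = [(0.0, 0.0), (1.0, 1.5), (2.0, 3.0)]
rel = relative_segment(source_demo, obj_pose=(2.0, 3.0))

# Auto-generate trajectories for many new object placements.
new_trajs = [generate_trajectory(rel, pose) for pose in [(5.0, 1.0), (0.0, 4.0)]]
```

Each generated path ends exactly at the object's new pose, which is why a couple hundred demonstrations can be multiplied into tens of thousands of valid trajectories.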
Links
References
- UT Austin: Yuke Zhu Earns IEEE Early Career Award
- UT Austin: NSF CAREER Award
- NVIDIA GEAR Lab
- GR00T N1 Paper
- robosuite GitHub
- MimicGen
See Also
- Jim Fan - GEAR Lab Co-Lead
- Fei-Fei Li - Stanford Advisor
- GR00T - Humanoid Foundation Model
- NVIDIA - Affiliated Company