HCPLab-SYSU / Embodied_AI_Paper_List

[Embodied-AI-Survey-2024] Paper list and projects for Embodied AI

Paper list for Embodied AI

Pengcheng Laboratory & SYSU HCP Lab

arXiv: 2407.06886

We welcome suggestions from peers for improving this paper list and the survey. Please raise an issue or send an email to liuy856@mail.sysu.edu.cn and chen867820261@gmail.com. Pull requests are also welcome!


Aligning Cyber Space with Physical World: A Comprehensive Survey on Embodied AI
Yang Liu, Weixing Chen, Yongjie Bai, Xiaodan Liang, Guanbin Li, Wen Gao, Liang Lin

🏠 About

Embodied Artificial Intelligence (Embodied AI) is crucial for achieving Artificial General Intelligence (AGI) and serves as a foundation for various applications that bridge cyberspace and the physical world. Recently, the emergence of Multi-modal Large Models (MLMs) and World Models (WMs) has attracted significant attention due to their remarkable perception, interaction, and reasoning capabilities, making them a promising architecture for the brain of embodied agents. However, there is no comprehensive survey of Embodied AI in the era of MLMs. In this survey, we present a comprehensive exploration of the latest advancements in Embodied AI. Our analysis first surveys representative works on embodied robots and simulators to clarify current research focuses and their limitations. Then, we analyze four main research targets: 1) embodied perception, 2) embodied interaction, 3) embodied agent, and 4) sim-to-real adaptation, covering state-of-the-art methods, essential paradigms, and comprehensive datasets. Additionally, we explore the complexities of MLMs in virtual and real embodied agents, highlighting their significance in facilitating interaction in dynamic digital and physical environments. Finally, we summarize the challenges and limitations of Embodied AI and discuss potential future directions. We hope this survey will serve as a foundational reference for the research community and inspire continued innovation.

:collision: Update Log

📚 Table of Contents

Books & Surveys 🔝

Embodied Simulators 🔝

General Simulators

Real-Scene Based Simulators

Embodied Perception 🔝

Active Visual Exploration

3D Visual Perception and Grounding

Visual Language Navigation

Non-Visual Perception: Tactile

Embodied Interaction 🔝

Embodied Agent 🔝

Embodied Multimodal Foundation Models

Embodied Manipulation & Control

Sim-to-Real Adaptation 🔝

Datasets 🔝

To be updated...

Embodied Perception

Vision

Tactile

Embodied Navigation

Embodied Question Answering

Embodied Manipulation

Other Useful Embodied Projects & Tools

Resources

Awesome-Embodied-Agent-with-LLMs
Awesome Embodied Vision
Awesome Touch

Simulation Platforms & Environments

A minimal Habitat-Sim usage sketch follows this list.

Habitat-Lab
Habitat-Sim
GibsonEnv
LEGENT
MetaUrban
GRUtopia
GenH2R
Demonstrating HumanTHOR
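
To give a flavor of how these platforms are driven, here is a minimal sketch using Habitat-Sim's Python API: load a scene, attach an RGB camera to an agent, and step through the default discrete actions. The scene path data/scene.glb is a placeholder; substitute any Habitat-compatible asset (e.g., a Replica or HM3D scene).

```python
# Minimal Habitat-Sim loop: load a scene, attach an RGB sensor, step an agent.
# Sketch only -- "data/scene.glb" is a placeholder path for any Habitat-compatible asset.
import habitat_sim

# Backend configuration: which scene asset to load.
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "data/scene.glb"  # placeholder scene path

# Agent configuration with a single RGB camera.
rgb_spec = habitat_sim.CameraSensorSpec()
rgb_spec.uuid = "rgb"
rgb_spec.sensor_type = habitat_sim.SensorType.COLOR
rgb_spec.resolution = [480, 640]  # height, width

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_spec]

sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Step the default discrete actions; observations come back keyed by sensor uuid.
for action in ["move_forward", "turn_left", "move_forward"]:
    obs = sim.step(action)
    print(action, obs["rgb"].shape)  # e.g., (480, 640, 4) RGBA

sim.close()
```

Habitat-Lab builds task, dataset, and training abstractions on top of this low-level simulator API, so agents written against it can be evaluated on standard embodied benchmarks.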

Projects

RoboMamba
MANIPULATE-ANYTHING
DexGraspNet
UniDexGrasp
UniDexGrasp++
OAKINK2

EmbodiedQA

EmbodiedScan

Octopus
Holodeck
AllenAct

LEO
Voyager

:newspaper: Citation

If you find this survey helpful, please feel free to leave a star ⭐️ and cite our paper:

@article{liu2024aligning,
  title={Aligning Cyber Space with Physical World: A Comprehensive Survey on Embodied AI},
  author={Liu, Yang and Chen, Weixing and Bai, Yongjie and Liang, Xiaodan and Li, Guanbin and Gao, Wen and Lin, Liang},
  journal={arXiv preprint arXiv:2407.06886},
  year={2024}
}

👏 Acknowledgements

We sincerely thank Jingzhou Luo, Xinshuai Song, Kaixuan Jiang, Junyi Lin, Zhida Li, and Ganlong Zhao for their contributions.