Knowledge · May 13, 2026 · 1 min read

Awesome Embodied Robotics & Agents

Curated reading list for embodied robotics + agent research. Use it to track key papers, datasets, benchmarks, and trends in embodied AI.

Agent-ready

This asset can be read and installed directly by an Agent.

TokRepo also provides a generic CLI command, an install contract, metadata JSON, per-adapter install plans, and links to the raw content, so an Agent can judge fit, risk, and next actions.

Native · 94/100 · Policy: allowed
Agent entry point
Any MCP/CLI Agent
Type
Memory
Install
Git
Trust
Trust level: Established
Entry
git clone https://github.com/zchoi/Awesome-Embodied-Robotics-and-Agent
Generic CLI install command
npx tokrepo install 81e1cc04-92f1-5b5a-b177-4f37b8fd819d

Overview

This repository curates research materials on embodied robotics and agent systems, helping you quickly navigate papers, datasets, and benchmarks in embodied AI.

Best for: researchers and engineers who need to map the embodied-AI research landscape

Fit: any system; the Markdown index contains many external links, so verify each source and year as you read

Setup time: 5–15 minutes

Key facts (verified)

  • The list repository is under the Apache-2.0 license (verified via the GitHub API).
  • Suggested path: read surveys first, then benchmarks/datasets, then implementation repositories, to build an actionable reading plan.
  • GitHub: 1,782 stars · 96 forks; last updated 2026-05-11 (verified via the GitHub API).

Body

A more actionable way to read the list:

  1. Start from surveys to build a taxonomy (tasks / sensors / environments / metrics).
  2. Pick one benchmark/dataset and trace which papers report results on it.
  3. For the methods you care about, find the implementations and keep reproducibility notes (versions, hyperparameters, dependencies).
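The steps above can be sketched as a small filter over the README's Markdown links, e.g. to pull out survey entries for step 1. This is an illustrative helper, assuming a standard `[title](url)` link format; the function name, regex, and sample URLs are assumptions, not part of the repository.

```python
import re

def extract_links(markdown: str, keyword: str) -> list[tuple[str, str]]:
    """Return (title, url) pairs for Markdown links whose title contains keyword."""
    # Match standard Markdown links: [title](http...). Titles and URLs are captured.
    links = re.findall(r"\[([^\]]+)\]\((https?://[^)\s]+)\)", markdown)
    return [(title, url) for title, url in links if keyword.lower() in title.lower()]

# Hypothetical README fragment (URLs are placeholders, not real entries).
sample = (
    "- [A Survey on Embodied Agents](https://example.org/survey)\n"
    "- [Some Benchmark](https://example.org/benchmark)\n"
)
print(extract_links(sample, "survey"))
# [('A Survey on Embodied Agents', 'https://example.org/survey')]
```

Running this against the cloned repository's README gives a quick, filterable starting list for each step, which you can then verify entry by entry as the Fit note recommends.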

README excerpt (verbatim)

🤖 Awesome Embodied Robotics and Agent Awesome

This is a curated list of "Embodied robotics or agent with Vision-Language Models (VLMs) and Large Language Models (LLMs)" research which is maintained by haonan.

Watch this repository for the latest updates and feel free to raise pull requests if you find some interesting papers!

News🔥

[2026/05/11] 🎉 Add NavSpace: How Intelligent Agents Follow Spatial Intelligence Instructions (ICRA 2026), the first benchmark for evaluating spatial intelligence in embodied navigation, with open-sourced dataset, evaluation code, and baseline SNav. [arXiv] [Github]
[2025/10/30] 🎉 Our survey paper "A Survey on Efficient Vision-Language-Action Models" [arXiv] has been released!
[2025/04/23] Add π-0.5, a lightweight and modular framework designed to integrate perception, control, and learning directly within physical systems.
[2025/03/18] Add some popular vision-language action (VLA) models. 🦾
[2024/06/28] Created a new board about agent self-evolutionary research. 🤖
[2024/06/07] Add Mobile-Agent-v2, a mobile device operation assistant with effective navigation via multi-agent collaboration. 🚀
[2024/05/13] Add "Learning Interactive Real-World Simulators"——outstanding paper award in ICLR 2024 🥇.
[2024/04/24] Add "A Survey on Self-Evolution of Large Language Models", a systematic survey on self-evolution in LLMs! 💥
[2024/04/16] Add some CVPR 2024 papers.
[2024/04/15] Add MetaGPT, accepted for oral presentation (top 1.2%) at ICLR 2024, ranking #1 in the LLM-based Agent category. 🚀
[2024/03/13] Add CRADLE, an interesting paper exploring LLM-based agent in Red Dead Redemption II!🎮

Development of Embodied Robotics and Benchmarks

🙏

Source & acknowledgements

Source: https://github.com/zchoi/Awesome-Embodied-Robotics-and-Agent · License: Apache-2.0 · GitHub stars: 1,782 · forks: 96
