Document detail
ID

oai:arXiv.org:2411.00965

Topic
Computer Science - Robotics
Author
Hsu, Cheng-Chun; Wen, Bowen; Xu, Jie; Narang, Yashraj; Wang, Xiaolong; Zhu, Yuke; Biswas, Joydeep; Birchfield, Stan
Category

Computer Science

Year

2024

Listing date

11/6/2024

Keywords
task demonstrations; object-centric; object pose
Abstract

We introduce SPOT, an object-centric imitation learning framework.

The key idea is to capture each task by an object-centric representation, specifically the SE(3) object pose trajectory relative to the target.
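To illustrate the representation, the sketch below computes one step of an SE(3) object pose trajectory expressed relative to the target, using 4x4 homogeneous transforms. This is an illustration of the general idea only; the variable names and the use of world-frame inputs are assumptions, not the paper's implementation.

```python
import numpy as np

def relative_pose(T_obj, T_target):
    """Pose of the object expressed in the target's frame.

    T_obj, T_target: 4x4 homogeneous SE(3) transforms given in a
    common world frame. The relative pose is T_target^{-1} @ T_obj.
    """
    return np.linalg.inv(T_target) @ T_obj

# A trajectory is then a sequence of such relative poses, one per timestep.
T_target = np.eye(4)
T_target[:3, 3] = [1.0, 0.0, 0.0]   # target one metre along x

T_obj = np.eye(4)
T_obj[:3, 3] = [1.0, 0.5, 0.0]      # object offset from the target in y

T_rel = relative_pose(T_obj, T_target)
print(T_rel[:3, 3])                  # object sits at [0, 0.5, 0] in the target frame
```

Because the representation is relative to the target, it is invariant to where the scene sits in the world frame, which is what allows demonstrations from different embodiments (or a handheld phone) to share one task description.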

This approach decouples embodiment actions from sensory inputs, facilitating learning from various demonstration types, including both action-based and action-less human hand demonstrations, as well as cross-embodiment generalization.

Additionally, object pose trajectories inherently capture planning constraints from demonstrations without the need for manually crafted rules.

To guide the robot in executing the task, the object trajectory is used to condition a diffusion policy.
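A minimal sketch of what "conditioning a diffusion policy on the object trajectory" means is shown below: a reverse-diffusion sampling loop whose denoiser receives the conditioning trajectory as an extra input. The denoiser here is a hand-written stub, not a learned network, and the step rule is a toy simplification; none of this reflects the paper's actual architecture or noise schedule.

```python
import numpy as np

rng = np.random.default_rng(0)

def denoiser(x, t, context):
    """Stub for a learned noise-prediction network.

    In a trained diffusion policy this would be a neural network
    conditioned on `context` (here, the object pose trajectory).
    The stub simply predicts the offset from the conditioning signal.
    """
    return x - context

def sample(context, steps=50):
    """Toy reverse-diffusion loop: start from noise, iteratively denoise
    toward a sample consistent with the conditioning context."""
    x = rng.normal(size=context.shape)   # pure noise
    for t in range(steps, 0, -1):
        eps = denoiser(x, t, context)
        x = x - eps / steps              # crude denoising step
    return x

# Conditioning signal: a (flattened) object pose trajectory feature.
target_traj = np.array([0.0, 0.5, 0.0])
action = sample(target_traj)
```

The point of the structure is that the sampled action sequence depends on the object trajectory only through the conditioning input, so the same policy can follow trajectories extracted from very different demonstration sources.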

We show improvement compared to prior work on RLBench simulated tasks.

In real-world evaluation, using only eight demonstrations recorded on an iPhone, our approach completed all tasks while fully complying with task constraints.

Project page: https://nvlabs.github.io/object_centric_diffusion

Hsu, Cheng-Chun; Wen, Bowen; Xu, Jie; Narang, Yashraj; Wang, Xiaolong; Zhu, Yuke; Biswas, Joydeep; Birchfield, Stan, 2024, SPOT: SE(3) Pose Trajectory Diffusion for Object-Centric Manipulation
