Date / time
Date(s) - 08.04.
10:00 - 11:00
Zoom Meeting ID: 811 9797 2679
Passcode: 258822
https://us02web.zoom.us/j/81197972679?pwd=SG9uU1FuMXJSdW1oQlhub01rWEh4UT09
Abstract:
Accurately recovering the poses of multiple objects and robots in non-instrumented environments is an important problem for granting autonomous systems the ability to solve real tasks in the wild, especially in the context of collaborative robotics. In this talk, I will present our recent work on object and robot pose estimation from one or multiple uncalibrated RGB cameras. First, I will present CosyPose, our state-of-the-art method for single-view 6D pose estimation of rigid objects, which won the BOP challenge at ECCV 2020. Second, I will present our multi-view approach, designed to address the limitations inherent to single-view pose estimation. This multi-view approach significantly improves robustness and accuracy and is able to automatically process noisy or incomplete visual information from multiple cameras into a complete scene interpretation in near real time. Third, I will present our latest work, RoboPose, a method for recovering the 6D pose and the joint angles of an articulated robot from a single RGB image. Our method significantly improves the state of the art for multiple commonly used robotic manipulators. It opens up many exciting applications in visually guided manipulation and collaborative robotics without fiducial markers or time-consuming hand-eye calibration.
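To give a flavor of the render & compare idea behind the work above: the pose estimate is iteratively refined by rendering the model at the current estimate, comparing the rendering to the observation, and updating the estimate to reduce the discrepancy. The sketch below is a deliberately simplified 1D toy (a Gaussian "blob" renderer and plain gradient descent on a single pose parameter), not the authors' actual pipeline, which uses a learned deep network to predict pose updates from rendered and observed images.

```python
import numpy as np

def render(theta, xs):
    # Toy differentiable "renderer": a Gaussian blob centered at pose theta.
    return np.exp(-(xs - theta) ** 2)

def render_and_compare(observed, xs, theta0, lr=0.02, iters=500):
    # Generic render-and-compare loop: render at the current pose estimate,
    # compare to the observed image, and take a gradient step on the pose.
    theta = theta0
    for _ in range(iters):
        rendered = render(theta, xs)
        diff = rendered - observed
        # Analytic gradient of 0.5 * sum(diff**2) with respect to theta.
        grad = np.sum(diff * rendered * 2 * (xs - theta))
        theta -= lr * grad
    return theta
```

Starting from a wrong initial pose, the loop recovers the pose that produced the observation; in CosyPose/RoboPose the same principle operates on full RGB renderings of object or robot models, with the update predicted by a network rather than computed analytically.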
This talk is based on the following papers:
[1] Yann Labbé, Justin Carpentier, Mathieu Aubry, Josef Sivic. CosyPose: Consistent multi-view multi-object 6D pose estimation. ECCV 2020.
[2] Yann Labbé, Justin Carpentier, Mathieu Aubry, Josef Sivic. Single-view robot pose and joint angle estimation via render & compare. CVPR 2021 (Oral).
For more information on IMPACT and AAG seminars, please visit http://impact.ciirc.cvut.cz/seminars/#seminar2021-04-08-labbé
http://aag.ciirc.cvut.cz/seminars/