Human-to-Robot Handovers is one of the five competition tracks within the 10th Robotic Grasping and Manipulation Competition (RGMC) that will be held during the IEEE/RAS International Conference on Robotics and Automation (ICRA) 2025 in Atlanta, USA.
The track is a unique opportunity to benchmark your work against the state of the art and to advance this robotic task in real-world conditions.
Winners will receive cash prizes and certificates. Participants will also be invited to present their work at an IROS Workshop and contribute to a Special Issue (TBC).
Applications
Track participation
If you are interested in participating in our track, please fill out the online application form.
We are still accepting applications and may consider them depending on the availability of participant spots and the readiness of the proposed solution.
Robot usage
Teams are welcome and encouraged to bring their own robots to the competition. Teams that have difficulty bringing their own robot can also apply to use the robots available at the competition site (6x UR5 and 4x Franka Emika Panda), subject to availability.
Travel support
Travel support will be available to selected teams based on financial need. Application details will be provided in the coming weeks.
Real-time visual estimation of the physical properties of objects manipulated by humans is important for informing robot control and performing accurate and safe grasps of objects handed over by humans. However, estimating the physical properties of previously unseen objects using only inexpensive cameras is challenging due to illumination variations, transparencies, reflective surfaces, and occlusions caused by both the human and the robot. Our dynamic human-to-robot handovers track is based on an affordable experimental setup that does not use a motion capture system, markers, or prior knowledge of specific object models. The track focuses on food containers and drinking glasses that vary in shape and size, and may be empty or filled with an unknown amount of unknown content. The goal is to assess the generalisation capabilities of the robotic control when handing over previously unseen objects filled (or not) with unknown content, hence with a different and unknown mass and stiffness. No object properties are initially known to the robot (or the team), which must infer these properties on the fly, during the execution of the dynamic handover, through perception of the scene.
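For illustration only, the sketch below shows one hypothetical way a team might fuse noisy per-frame perception estimates into an object mass estimate before the robot commits to a grasp. The interface (capacity, filling level, filling type, and container mass per frame), the density values, and all function names are assumptions made for this example; they are not part of the track specification or the starting kit.

```python
# Hypothetical sketch: fusing per-frame perception estimates into an
# object mass estimate before the grasp. All names, values, and
# interfaces are illustrative assumptions, not the official protocol.

from dataclasses import dataclass
from statistics import median

# Approximate bulk densities (g/mL) for example filling types;
# a real system would calibrate its own values.
FILLING_DENSITY = {"water": 1.0, "pasta": 0.41, "rice": 0.85, "empty": 0.0}

@dataclass
class FrameEstimate:
    """Per-frame perception output (assumed interface)."""
    capacity_ml: float       # estimated container capacity
    filling_level: float     # estimated filling level in [0, 1]
    filling_type: str        # e.g. "water", "pasta", "rice", "empty"
    container_mass_g: float  # estimated mass of the empty container

def estimate_object_mass(frames: list[FrameEstimate]) -> float:
    """Fuse noisy per-frame estimates into a single mass estimate (grams).

    Uses medians over frames to stay robust to frames corrupted by
    occlusions from the human hand or the robot, then applies
        mass = container_mass + filling_level * capacity * density.
    """
    capacity = median(f.capacity_ml for f in frames)
    level = median(f.filling_level for f in frames)
    container_mass = median(f.container_mass_g for f in frames)
    # Majority vote on the filling type across frames.
    types = [f.filling_type for f in frames]
    filling_type = max(set(types), key=types.count)
    density = FILLING_DENSITY.get(filling_type, 0.0)
    return container_mass + level * capacity * density

if __name__ == "__main__":
    # Two noisy frames of a roughly half-full cup of water (made-up numbers).
    frames = [
        FrameEstimate(240, 0.55, "water", 20),
        FrameEstimate(260, 0.50, "water", 22),
    ]
    print(f"Estimated object mass: {estimate_object_mass(frames):.1f} g")
```

The median fusion and majority vote are a deliberately simple design choice for the sketch: they discard frames in which occlusions or reflections corrupt individual estimates, which matches the challenge conditions described above, but teams may of course use any estimation approach.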
Important dates
All times are given in Pacific Time.
The track includes a preparation phase in which teams can run the handover configurations of the CORSMAL Benchmark protocol remotely in their own laboratory. Please also see the starting kit and documentation to prepare your solution.
Track organisers
Changjae Oh, Queen Mary University of London (UK)
Andrea Cavallaro, Idiap Research Institute and École polytechnique fédérale de Lausanne (Switzerland)
Alessio Xompero, Queen Mary University of London (UK)
Contacts
If you have any questions or enquiries related to the competition track, please contact Dr. Changjae Oh or Prof. Andrea Cavallaro.
If you have any general questions or other enquiries related to the Robotic Grasping and Manipulation Competition, please visit the RGMC webpage.