RGMC ICRA2024 Human-to-Robot Handover Team TCS

00:09:23
https://www.youtube.com/watch?v=UHCnVjyf5A4

Summary

TLDR: The TCS Smart Machines team participated in the ICRA 2024 competition in Yokohama, Japan, focusing on the human-to-robot handover challenge. This involved a human handing an object to a robot arm, which must infer the object's properties on the fly and ensure a natural handover. The hardware setup included a UR5 robotic arm with a two-finger gripper, tactile sensors, and cameras for visual input. The methodology comprised several steps: image acquisition, hand and object detection, mask generation, object tracking, grasp generation, visual servoing, and reactive grasping. The results showcased the system's ability to adaptively grasp various objects, ensuring effective transfer from human to robot.

Highlights

  • 🤖 The TCS Smart Machines team participated in ICRA 2024 in Yokohama.
  • 👐 Focused on human-to-robot handover in robotic grasping.
  • 📦 The robot must infer object properties during handover.
  • 🔧 Hardware included a UR5 robotic arm and tactile sensors.
  • 📸 RealSense D455 cameras mounted on tripods provided visual input.
  • 🔍 Methodology involved image acquisition and object detection.
  • 📊 Results showed effective object transfer from human to robot.
  • ⚙️ Reactive grasping adapted to various object types.
  • 💡 The system ensures a natural handover experience.
  • 🌍 The event brought together top researchers from around the world.

Timeline

  • 00:00:00 - 00:09:23

    The TCS Smart Machines team participated in the ICRA 2024 competition in Yokohama, Japan, focusing on the human-to-robot handover challenge. The problem involves a human handing an object to a robot arm, which must infer the object's properties on the fly and ensure a natural handover. The robot arm used is a UR5 with a two-finger gripper and tactile sensors for detecting texture and pressure. The methodology includes image acquisition, hand and object detection, mask generation, object tracking, grasp generation, visual servoing, and reactive grasping, allowing the robot to adaptively grasp various objects. The results demonstrate the system's effectiveness in facilitating smooth transfers from human to robot.
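
As a rough illustration only: the Python skeleton below wires the seven stages named above into one loop. Every function and interface name in it is a hypothetical placeholder; the team's actual implementation is not shown in the video.

    def handover_loop(camera, arm, gripper):
        # Hypothetical skeleton of the seven-stage pipeline; all names
        # are placeholders, not the team's real code.
        while True:
            color, depth = acquire_images(camera)           # 1. image acquisition
            hand_box, obj_box = detect_hand_object(color)   # 2. hand/object detection
            if obj_box is None:
                continue                                    # no object offered yet
            mask = generate_mask(color, hand_box, obj_box)  # 3. mask generation
            obj_pose = track_object(mask, depth)            # 4. object tracking
            grasp = grasp_from_heuristics(mask, obj_pose)   # 5. grasp generation
            servo_toward(arm, grasp)                        # 6. visual servoing
            if arm.at(grasp):
                reactive_grasp(gripper)                     # 7. reactive (tactile) grasping
                deliver_upright(arm, gripper)               # place upright in target area
                return

    # Stage stubs: bodies omitted, listed only to keep the skeleton self-contained.
    def acquire_images(camera): ...
    def detect_hand_object(color): ...
    def generate_mask(color, hand_box, obj_box): ...
    def track_object(mask, depth): ...
    def grasp_from_heuristics(mask, obj_pose): ...
    def servo_toward(arm, grasp): ...
    def reactive_grasp(gripper): ...
    def deliver_upright(arm, gripper): ...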

Video Q&A

  • What was the main focus of the TCS Smart Machines team at ICRA 2024?

    The main focus was on human-to-robot handover in robotic grasping and manipulation.

  • What hardware was used in the experiment?

    The hardware included a UR5 robotic arm, a Robotiq 2F-85 two-finger parallel gripper, Contactile tactile sensors on the gripper fingers, and two Intel RealSense D455 cameras (a capture sketch follows this list).

  • What are the key steps in the methodology?

    Key steps include image acquisition, hand and object detection, mask generation, object tracking, grasp generation, visual servoing, and reactive grasping (see the mask-and-centroid sketch after this list).

  • How does the robot ensure a natural handover?

    The robot infers the object's properties on the fly and uses visual servoing to adjust its motion as the object moves, ensuring a smooth transfer (see the servoing sketch after this list).

  • What types of objects can the robot grasp?

    The robot can grasp a variety of objects, both hard and soft, by using tactile feedback to dynamically adjust the grasp force (see the force-control sketch after this list).
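
On the hardware question above: a minimal sketch of stage 1, image acquisition, using Intel's pyrealsense2 bindings for the RealSense D455 cameras. The stream resolutions and frame rate are common defaults, not values confirmed in the video.

    import numpy as np
    import pyrealsense2 as rs

    # Configure one D455; settings below are typical defaults (assumed).
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    pipeline.start(config)
    try:
        frames = pipeline.wait_for_frames()
        # Align depth to color so pixel coordinates match across streams
        frames = rs.align(rs.stream.color).process(frames)
        depth = np.asanyarray(frames.get_depth_frame().get_data())
        color = np.asanyarray(frames.get_color_frame().get_data())
    finally:
        pipeline.stop()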
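On the methodology question: one simple way stages 3 and 5 could be realized, deriving an object mask from a detected bounding box with OpenCV's GrabCut and taking the grasp centroid from image moments. This is an illustrative heuristic under assumed inputs, not the team's confirmed method.

    import cv2
    import numpy as np

    def grasp_centroid(color_bgr, obj_box):
        """Return (u, v) pixel coordinates of a heuristic grasp point."""
        x, y, w, h = obj_box
        # Stage 3: segment the object inside its bounding box (GrabCut)
        mask = np.zeros(color_bgr.shape[:2], np.uint8)
        bgd = np.zeros((1, 65), np.float64)
        fgd = np.zeros((1, 65), np.float64)
        cv2.grabCut(color_bgr, mask, (x, y, w, h), bgd, fgd, 5,
                    cv2.GC_INIT_WITH_RECT)
        obj_mask = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD),
                            255, 0).astype(np.uint8)
        # Stage 5: grasp centroid from the mask's image moments
        m = cv2.moments(obj_mask, binaryImage=True)
        if m["m00"] == 0:
            return None  # empty mask: no graspable region found
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])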
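On the natural-handover question: visual servoing can be as simple as a proportional law that drives the error between the tool pose and the tracked grasp pose toward zero on every frame. The gain, speed clamp, and arm interface below are assumptions for illustration.

    import numpy as np

    K_P = 0.8          # proportional gain (assumed)
    MAX_SPEED = 0.25   # m/s safety clamp (assumed)

    def servo_step(tool_pos, grasp_pos):
        """One control tick: return a clamped Cartesian velocity command."""
        error = np.asarray(grasp_pos) - np.asarray(tool_pos)
        v = K_P * error
        speed = np.linalg.norm(v)
        if speed > MAX_SPEED:
            v *= MAX_SPEED / speed  # keep motion gentle near the human
        return v  # send to the arm's Cartesian velocity interface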
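On the last question: a sketch of the reactive-grasping idea, closing until contact and tightening only when the tactile sensors suggest slip, up to a force ceiling that protects soft objects. All thresholds and the tactile/gripper interfaces are hypothetical.

    CONTACT_N = 0.5   # normal force (N) treated as first contact (assumed)
    MAX_N = 8.0       # force ceiling to avoid crushing soft objects (assumed)
    SLIP_EPS = 0.05   # shear-change threshold interpreted as slip (assumed)

    def reactive_grasp(gripper, tactile):
        # Close until the tactile sensors register contact
        gripper.close_until(lambda: tactile.normal_force() > CONTACT_N)
        prev_shear = tactile.shear_force()
        while not gripper.handover_complete():
            shear = tactile.shear_force()
            slipping = abs(shear - prev_shear) > SLIP_EPS
            if slipping and tactile.normal_force() < MAX_N:
                gripper.tighten()  # small force increment on detected slip
            prev_shear = shear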

Subtitles (en)
  • 00:00:03
    hello everyone we the TCS Smart
  • 00:00:06
    Machines team are excited to share our
  • 00:00:08
    journey at the ICRA 2024 competition in
  • 00:00:11
    Yokohama Japan we participated in the
  • 00:00:14
    fourth essential skill track of the RGMC
  • 00:00:16
    named Human-to-Robot Handover focusing on
  • 00:00:19
    robotic grasping and manipulation this
  • 00:00:21
    event brought together top researchers from
  • 00:00:23
    around the world and we are thrilled to
  • 00:00:25
    present our experience problem statement
  • 00:00:28
    the problem statement consists of a
  • 00:00:29
    scenario where a human subject picks up
  • 00:00:31
    an object either filled or empty from a
  • 00:00:34
    table and hands it over to a robot arm
  • 00:00:36
    the challenge here is that the robot
  • 00:00:38
    must infer the object's properties on
  • 00:00:40
    the fly the subject must move the object to
  • 00:00:42
    a location that the robot can reach
  • 00:00:44
    while ensuring that the handover feels
  • 00:00:46
    natural for the human once the robot
  • 00:00:48
    receives the object it is responsible
  • 00:00:50
    for delivering it upright within a
  • 00:00:52
    predefined area which could be the same
  • 00:00:54
    table or a nearby one hardware setup the
  • 00:00:57
    hardware setup for this experiment
  • 00:00:58
    includes a robotic arm with six degrees of
  • 00:01:01
    freedom specifically the UR5 the arm is
  • 00:01:04
    equipped with a two-finger parallel
  • 00:01:05
    gripper the Robotiq 2F-85 which is
  • 00:01:08
    crucial for grasping we also have two
  • 00:01:10
    tactile sensors Contactile PapillArray mounted
  • 00:01:13
    on the fingers of the gripper to detect
  • 00:01:15
    the texture and pressure applied two
  • 00:01:17
    RealSense D455 cameras are used for visual
  • 00:01:20
    input mounted on tripods for stability
  • 00:01:23
    additional equipment includes a digital
  • 00:01:25
    weighing scale a table and various
  • 00:01:27
    containers of different shapes and
  • 00:01:29
    weights methodology the
  • 00:01:31
    methodology involves several steps one
  • 00:01:33
    with image acquisition we start by
  • 00:01:35
    capturing depth and color images from
  • 00:01:37
    both cameras two hand and object
  • 00:01:40
    detection next we identify bounding
  • 00:01:42
    boxes for the hand and objects from
  • 00:01:44
    these RGB images three mask generation
  • 00:01:47
    masks are generated based on these
  • 00:01:49
    bounding boxes to separate the hand and
  • 00:01:51
    objects from the background four object
  • 00:01:54
    tracking the object is then tracked
  • 00:01:56
    across frames using an object tracking
  • 00:01:58
    algorithm five grasp generation heuristics
  • 00:02:01
    heuristics are applied to
  • 00:02:02
    determine the grasp centroid and
  • 00:02:04
    generate grasp poses six visual servoing
  • 00:02:07
    visual servoing is used to guide and
  • 00:02:10
    adjust the robot's actions based on the
  • 00:02:12
    tracked data seven reactive grasping
  • 00:02:15
    finally tactile sensing is utilized to
  • 00:02:17
    adaptively grasp a variety of objects
  • 00:02:19
    whether they are hard or soft ensuring a
  • 00:02:21
    firm grip without slippage by
  • 00:02:23
    dynamically adjusting the grasp force
  • 00:02:25
    and position results in the results we
  • 00:02:28
    can see the stages of our system in
  • 00:02:30
    action from hand object detection mask
  • 00:02:32
    generation object tracking and grasp
  • 00:02:35
    generation to visual servoing and
  • 00:02:37
    reactive grasping these steps work
  • 00:02:39
    together to allow the robot to
  • 00:02:41
    understand and adapt to the object being
  • 00:02:42
    handed over ensuring a smooth and
  • 00:02:44
    effective transfer from the human to the
  • 00:02:51
    robot system at work finally here we
  • 00:02:54
    showcase the system in
  • 00:02:58
    action
Tags
  • robotics
  • human-robot interaction
  • grasping
  • manipulation
  • ICRA 2024
  • UR5
  • tactile sensors
  • object tracking
  • reactive grasping
  • automation