The Marathon 2: A Navigation System (IROS 2020)

00:12:01
https://www.youtube.com/watch?v=QB7lOKp3ZDQ

Summary

TL;DR: The video gives an overview of the new and improved ROS 2 Navigation 2 stack, designed to enhance autonomous navigation in robotics. Developed at Samsung Research America, Navigation 2 draws on lessons from over 10 years of experience with the previous version. It features a behavior-tree-based system offering greater configurability and modularity than its predecessor, is compatible with various positioning technologies, and allows for dynamic adaptation to different tasks. Experiments, such as the Marathon Experiments, demonstrated its effectiveness and autonomy over long distances in real-world scenarios without human intervention or collisions. Future developments include integrating dynamic obstacles and advanced navigation algorithms.

Takeaways

  • 🚀 ROS 2 Navigation 2 offers flexibility and modularity for robotics.
  • 🔧 Utilizes behavior trees for complex task modeling.
  • 🧪 Conducted Marathon Experiments for real-world validation.
  • 🤖 Compatible with various robot types and positioning systems.
  • 📊 Provides dynamic object and waypoint following.
  • 🔍 Offers high software quality and reliability testing.
  • 🌉 Implemented in diverse network environments, like WiFi and 5G.
  • 🛠️ Open-source contributions expand its capabilities.
  • 📚 Extensive tutorials and documentation available.
  • 🔄 Future developments focus on dynamic obstacle integration.

Timeline

  • 00:00:00 - 00:05:00

    The talk introduces the new and improved ROS 2 navigation stack, highlighting its development and features. ROS 2 aims to break ROS 1 out of the laboratory by providing designs for big systems, standardized network interfaces, and compatibility with non-ideal networks like Wi-Fi and 5G. It supports embedded systems and various operating systems, offering real-time compliance and multi-robot teaming capabilities. The new navigation system allows for flexible configurations with behavior trees and modular servers, supporting multiple trajectory planners, recovery behaviors, and positioning systems.

  • 00:05:00 - 00:12:01

    The Navigation 2 stack in ROS 2 succeeds the original navigation stack by incorporating modularity, reconfigurability, and simulation capabilities. It offers extensive testing and quality assurance, including static analysis and unit testing, and is supported by Samsung Research. The system supports different robot types and provides positioning independence, evidenced by the Marathon experiments conducted in dynamic environments. These experiments validated its reliability, with robots navigating autonomously without human intervention. Future developments aim at integrating dynamic obstacle detection, enhanced SLAM, and new algorithms for broader application impact.

Frequently Asked Questions

  • What is ROS 2 Navigation 2?

    ROS 2 Navigation 2 is a framework for autonomous mobile robotics, succeeding the original ROS navigation stack.

  • What experiments were conducted with Navigation 2?

    Marathon experiments were conducted with Navigation 2 on two industrial robots, navigating autonomously in a university environment.

  • How does Navigation 2 differ from the original ROS Navigation stack?

    Navigation 2 uses a configurable behavior tree model instead of a single process state machine, allowing more customization and modularity.

  • What capabilities does Navigation 2 offer?

    It offers full simulation environments, dynamic object following, waypoint navigation, robust lidar-based localization, and more.

  • What is the significance of behavior trees in Navigation 2?

    Behavior trees allow for modeling complex tasks more effectively than finite state machines.

  • What technologies does ROS 2 utilize for networking?

    ROS 2 uses DDS for working on non-ideal networks like Wi-Fi and 5G.

  • What is the outcome of the marathon experiment?

    Robots using Navigation 2 navigated 37 miles autonomously, demonstrating reliable operation without collisions or human intervention.

  • What are the future developments planned for Navigation 2?

    Future developments include dynamic obstacle integration, 3D SLAM, new localization frameworks, and semantic navigation.

  • What types of robots are supported by Navigation 2?

    Navigation 2 supports differential drive, omnidirectional, and other types of robots.

  • How does Navigation 2 ensure high software quality?

    It ensures high quality through extensive testing, static and dynamic analysis, and maintaining high test coverage.

Subtitles (English)

  • 00:00:00
    Hello and welcome to our talk today. We'll be talking about the Marathon 2 and navigation system; this is going to be a basic overview of the new and improved ROS 2 navigation stack. I am the open source robotics engineering lead at Samsung Research America. I develop and maintain a large variety of ROS 1 and ROS 2 packages around navigation and robot perception, and I'm also on the ROS 2 Technical Steering Committee and the navigation project lead. First we'll go through some background and motivating problems behind this work, then we'll talk about the work itself, the ROS 2 Navigation 2 project. Then we'll describe the marathon experiments that we conducted, and then the roadmap for our future development.

  • 00:00:36
    So we based our new navigation system off of ROS 2. ROS 2 was built to break ROS 1 out of the laboratory, so it includes things like design patterns for big systems and well-designed network interfaces to standardize things like sensor messages and certain common aspects of robotics. It works on non-ideal networks using DDS, so things like Wi-Fi and 5G networks. It includes a security standard, it works on embedded systems as well as Linux, macOS, and Windows, it's real-time compliant, and it is often used in multi-robot teaming. On the right you see an image; I'm not sure if this team from the DARPA SubT Challenge used ROS 2, but this is a great example of the things that ROS 2 is built for: working on a non-ideal network underground with multi-robot teaming, and doing things like real-time control loops on the drones and the quadrupeds.

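    To make the DDS point concrete, here is a minimal rclpy sketch (illustrative, not from the talk) of relaxing quality of service for sensor data on a lossy link:

        # Minimal rclpy sketch (illustrative, not from the talk): best-effort
        # QoS drops late samples instead of blocking on retransmission, which
        # suits high-rate sensor data on lossy Wi-Fi or 5G links.
        import rclpy
        from rclpy.node import Node
        from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
        from sensor_msgs.msg import LaserScan

        class ScanRelay(Node):
            def __init__(self):
                super().__init__('scan_relay')
                lossy_qos = QoSProfile(
                    reliability=ReliabilityPolicy.BEST_EFFORT,
                    history=HistoryPolicy.KEEP_LAST,
                    depth=5,
                )
                self.create_subscription(LaserScan, 'scan', self.on_scan, lossy_qos)

            def on_scan(self, msg):
                self.get_logger().info(f'scan with {len(msg.ranges)} returns')

        rclpy.init()
        rclpy.spin(ScanRelay())
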
  • 00:01:22
    If you're unfamiliar with ROS, ROS gives you access to a large variety of software built specifically for robotics. This includes things like sensor and robot drivers, so you can get started very quickly with new hardware you buy for your laboratory. It includes software stacks to do autonomous navigation, manipulation, and autonomous driving tasks. It includes processing pipelines for common sensors, so things like lidars, IMUs, point cloud generators, and images. It includes a lot of data visualization tools like RViz, and simulation through Gazebo and Ignition. We have geometric transformation libraries that help you with time-series data, we have a lot of debugging and diagnostic tools, and we're able to interface your research algorithms into these production frameworks. All of these are made available by individual research and industry contributions; if you have some interesting research that really could be used on a lot of robots, consider releasing that work to pay it forward.

  • 00:02:09
    One of the flagship projects within ROS was the navigation stack, which has been used for over 10 years by research, academia, and industry. The navigation stack is primarily based around move_base, which is an unconfigurable single-process state machine. It does allow for full autonomous navigation, but it really only works effectively on differential-drive and omnidirectional robots, and it only allows you to use a single global and local trajectory planner at one time. The algorithms that it provides are things like a DWA local trajectory planner, a wavefront Dijkstra path planner, and costmap 2D, which is a 2D occupancy-grid-based environmental model. While it is the industry standard, it's been largely undeveloped since the early 2010s and hasn't had any significant refactors or new algorithms since then.

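    For context, driving the ROS 1 stack means sending a single goal to that one move_base process; a minimal actionlib sketch (illustrative; assumes a configured move_base instance is running):

        # Illustrative ROS 1 sketch: all of move_base sits behind one action
        # interface in a single monolithic process.
        import rospy
        import actionlib
        from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

        rospy.init_node('send_goal')
        client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        client.wait_for_server()

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = 'map'
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = 2.0
        goal.target_pose.pose.orientation.w = 1.0

        # Planning, control, and recovery all happen behind this one call.
        client.send_goal(goal)
        client.wait_for_result()
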
  • 00:02:53
    The Navigation 2 stack in ROS 2 is the successor to the original navigation stack. We used the lessons learned over 10 years of use, maintenance, and development to build the next world-leading autonomous mobile robotics navigation framework. Rather than being based on an unconfigurable state machine, we now have a highly configurable behavior-tree-based navigation system. We also use independent modular servers, so things like your recovery, planner, and controller servers can be completely removed or swapped out for your custom applications. We also allow for sensor processing pipelines for collision avoidance, and we enable you to have multiple local trajectory and path planners within a single navigation task. We also have recovery behaviors and are positioning-system agnostic, so you can utilize any 3D, 2D, or visual SLAM and localization system that you like. All of our plugins and algorithms are runtime configurable, including the behavior tree, and we're extremely focused on software quality and production readiness, so we have extremely high test coverage and static and dynamic analysis.

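    A sketch of what this runtime plugin selection looks like in practice; the plugin names below are illustrative and vary by Nav2 release:

        # Sketch of Navigation 2's plugin-driven configuration (plugin names
        # are illustrative and vary by release): each server lists the plugins
        # to load at runtime, so swapping algorithms is a config change.
        import yaml

        params = {
            'planner_server': {'ros__parameters': {
                'planner_plugins': ['GridBased'],
                'GridBased': {'plugin': 'nav2_navfn_planner/NavfnPlanner'},
            }},
            'controller_server': {'ros__parameters': {
                'controller_plugins': ['FollowPath'],
                # e.g. the TEB controller used later in the marathon experiments
                'FollowPath': {'plugin': 'teb_local_planner::TebLocalPlannerROS'},
            }},
        }

        with open('nav2_params.yaml', 'w') as f:
            yaml.safe_dump(params, f, sort_keys=False)
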
  • 00:03:50
    The key behind the Navigation 2 framework that's different from the navigation stack in ROS 1 is the behavior-tree-based navigation. Behavior trees are a tree-based execution model, and with them we're able to model far more complex tasks than is practical with finite state machines. To do this we use the BehaviorTree.CPP library, which enables us to have plugin-based control, action, decorator, and condition nodes in our trees. We can then extend those custom nodes or change the behavior of a navigation task by changing the XML files that we utilize for navigation. This includes extensions like elevator and automatic door APIs; we can do dynamic object following, and we can have multiple different path and trajectory planners in different contexts. We also have the BT navigator, which is the server that hosts our behavior tree and enables you to load your runtime-configurable behavior tree XML files. Each of those XML files can include any number of custom plugins. These different plugins can be calling your specialized servers, so things like your planner, controller, or recovery servers, but they also might compute some value themselves, like calling an elevator API as a very quick action.

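    A minimal behavior tree in the style of Navigation 2's default XML (node names and ports are illustrative and vary by release), written out from Python so the BT navigator can load it at runtime:

        # Replan at 1 Hz while following the path; on failure, fall into the
        # recovery branch that clears costmaps, spins, and waits.
        BT_XML = """
        <root main_tree_to_execute="MainTree">
          <BehaviorTree ID="MainTree">
            <RecoveryNode number_of_retries="6" name="NavigateRecovery">
              <PipelineSequence name="NavigateWithReplanning">
                <RateController hz="1.0">
                  <ComputePathToPose goal="{goal}" path="{path}" planner_id="GridBased"/>
                </RateController>
                <FollowPath path="{path}" controller_id="FollowPath"/>
              </PipelineSequence>
              <SequenceStar name="RecoveryActions">
                <ClearEntireCostmap service_name="local_costmap/clear_entirely_local_costmap"/>
                <ClearEntireCostmap service_name="global_costmap/clear_entirely_global_costmap"/>
                <Spin spin_dist="1.57"/>
                <Wait wait_duration="5"/>
              </SequenceStar>
            </RecoveryNode>
          </BehaviorTree>
        </root>
        """

        # Point bt_navigator at this file to change behavior without recompiling.
        with open('navigate_w_replanning_and_recovery.xml', 'w') as f:
            f.write(BT_XML)
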
  • 00:04:55
    Another main tenet of the Navigation 2 system is its modularity and reconfigurability. We achieve this through independent task servers and plugin interfaces for algorithms. This allows you at runtime to select any number of your custom plugins for use. It also allows you to fully replace these servers if you like, so you can implement them in other languages or with new features. We also leverage multi-core processors using multi-processing frameworks so we can compute more complex navigation tasks. We do this through design patterns: each of these servers has an action server interface to handle goals, provide feedback, and return results to the client. Each of these servers also has a map of plugins, so it can support N plugins per server and you can use different algorithms in different unique contexts. We also have the environmental model that our algorithms need zero-latency access to: our costmap 2D interfaces.

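    A short rclpy sketch of this action-server pattern (illustrative; field names follow nav2_msgs), sending a NavigateToPose goal to the BT navigator and streaming feedback:

        import rclpy
        from rclpy.action import ActionClient
        from rclpy.node import Node
        from nav2_msgs.action import NavigateToPose

        class NavClient(Node):
            def __init__(self):
                super().__init__('nav_client')
                # bt_navigator exposes NavigateToPose as an action server
                self.client = ActionClient(self, NavigateToPose, 'navigate_to_pose')

            def go_to(self, x, y):
                self.client.wait_for_server()
                goal = NavigateToPose.Goal()
                goal.pose.header.frame_id = 'map'
                goal.pose.pose.position.x = x
                goal.pose.pose.position.y = y
                goal.pose.pose.orientation.w = 1.0
                # Feedback (e.g. distance remaining) streams back while the tree runs.
                return self.client.send_goal_async(goal, feedback_callback=self.on_fb)

            def on_fb(self, fb):
                self.get_logger().info(f'remaining: {fb.feedback.distance_remaining:.2f} m')

        rclpy.init()
        node = NavClient()
        node.go_to(2.0, 1.0)
        rclpy.spin(node)
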
  • 00:05:43
    The Navigation 2 system also has a bunch of additional capabilities. That includes full simulation environments and a robot to start off with, so you can get started immediately when you download the code. We also have dynamic object and waypoint following capabilities, and we provide a robust lidar-based localization system out of the box. It also works with all major robot types, including differential drive, omnidirectional, and Ackermann steering, of any shape, circular or arbitrary. We also have a wide-ranging set of tutorials on configuration, use, debugging, custom plugins, and far more. Like we said before, it's positioning-system independent, so you can use any sort of vendor you like, from visual, 2D, or 3D localization, as well as GPS navigation.

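    A waypoint-following sketch using the nav2_simple_commander helper, which was added to Nav2 in releases after this talk (shown for illustration):

        import rclpy
        from geometry_msgs.msg import PoseStamped
        from nav2_simple_commander.robot_navigator import BasicNavigator

        def pose(nav, x, y):
            p = PoseStamped()
            p.header.frame_id = 'map'
            p.header.stamp = nav.get_clock().now().to_msg()
            p.pose.position.x = x
            p.pose.position.y = y
            p.pose.orientation.w = 1.0
            return p

        rclpy.init()
        nav = BasicNavigator()
        nav.waitUntilNav2Active()  # wait for lifecycle bringup to finish
        nav.followWaypoints([pose(nav, 1.0, 0.0), pose(nav, 2.0, 1.0)])
        while not nav.isTaskComplete():
            pass  # poll, or inspect nav.getFeedback() for progress
        print(nav.getResult())
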
  • 00:06:25
    Navigation 2 is very focused on reliability, quality, and having a production mindset. We built this on top of ROS 2, which is a production-grade middleware and robot framework. We conducted reliability testing through the marathon experiments we'll talk about later. It also includes server lifecycle management and real-time heartbeats to ensure that all of our systems come up deterministically and stay alive for the full duration of your navigation task. It includes extensive full-system simulation testing with about 70% test coverage. We also have static analysis, code quality, and unit testing tools. It's professionally developed and maintained by Samsung Research, and we provide Docker images so you can get started on any operating system.

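    A sketch of the deterministic-bringup pattern using the standard ROS 2 lifecycle services (Nav2's lifecycle manager automates these transitions; the server names below are illustrative):

        import rclpy
        from rclpy.node import Node
        from lifecycle_msgs.srv import ChangeState
        from lifecycle_msgs.msg import Transition

        def transition(node, server, transition_id):
            # Every managed node exposes a <name>/change_state service.
            client = node.create_client(ChangeState, f'{server}/change_state')
            client.wait_for_service()
            req = ChangeState.Request()
            req.transition.id = transition_id
            future = client.call_async(req)
            rclpy.spin_until_future_complete(node, future)
            return future.result().success

        rclpy.init()
        node = Node('bringup')
        for server in ['planner_server', 'controller_server', 'bt_navigator']:
            # configure, then activate, in a fixed order
            transition(node, server, Transition.TRANSITION_CONFIGURE)
            transition(node, server, Transition.TRANSITION_ACTIVATE)
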
  • 00:07:02
    So in review, let's look at the block diagram for the Navigation 2 stack. At the top we have a waypoint follower, which might have a number of waypoints to achieve in order to complete your objective. It sends each of those requests to the BT navigator server. The BT navigator server loads a behavior tree XML file at runtime; it then parses the XML file and identifies all of the custom behavior tree node plugins to load at that time. Then, as you're executing your navigation logic, you might call recovery behaviors, ask your local trajectory planner for control efforts, or talk to your path planner and ask for a plan through space. Each of those different behavior tree nodes calls the different servers in order to make full use of multi-core processors. At the end of this, the controller server produces some sort of command velocity to be sent to the robot base to follow a path or complete an objective. As you can see, our TF transformations provide information about our localization system, we have a map that's given to the full navigation system for anyone who needs it, and all of our sensor processing pipelines occur in these costmap modules.

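    An illustrative tf2 sketch of reading the robot's pose in the map frame, the same transform pipeline the diagram shows feeding every server:

        import rclpy
        from rclpy.node import Node
        from rclpy.time import Time
        from tf2_ros import Buffer, TransformListener

        class PoseReader(Node):
            def __init__(self):
                super().__init__('pose_reader')
                self.buffer = Buffer()
                self.listener = TransformListener(self.buffer, self)
                self.create_timer(1.0, self.report)

            def report(self):
                try:
                    # map -> base_link encodes the localization estimate
                    t = self.buffer.lookup_transform('map', 'base_link', Time())
                    p = t.transform.translation
                    self.get_logger().info(f'robot at ({p.x:.2f}, {p.y:.2f})')
                except Exception as err:  # transform not yet available, etc.
                    self.get_logger().warn(str(err))

        rclpy.init()
        rclpy.spin(PoseReader())
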
  • 00:08:09
    The marathon experiments we conducted are an extension of the experiments conducted at Willow Garage for the ROS 1 navigation stack. We conducted these experiments not in an office space but in a large university building, during the daytime, with students and high-traffic passing periods, in very large areas. We did so with two different industrial robots: the PAL Robotics TIAGo and the Robotnik RB-1 base. Each of these was outfitted with an RGB-D camera as well as a 2D SICK lidar. As you can see, in our experiment we navigated through a central stairwell, across a hallway with classrooms and a sky bridge, and then back into a laboratory, and we did so both in static environments as well as in dynamic environments during passing periods. This route is about 300 meters, and the maximum speed that we set these robots to is 0.45 meters per second.

  • 00:08:59
    While the Navigation 2 system can support multiple plugins, for this experiment we only utilized one set of plugins for each class of task. For path planning we used a wavefront A* implementation. For local trajectory planning we used the Timed Elastic Band (TEB) local trajectory planner. For perception we used the Spatio-Temporal Voxel Layer, which is a dynamic 3D voxel layer, as well as the static map that was available to us. We used dead-reckoning odometry via the robot_localization package, fusing the IMU and wheel odometry data with an EKF, and we provided global corrections using AMCL, the robust lidar-based localizer that we provide, during the runtime of the experiment; we produced the map beforehand using the SLAM Toolbox ROS package. We also provide a number of other algorithms that we did not use in this experiment, including things like DWB, which is a new DWA-based local trajectory planner, the voxel layer, the obstacle layer, the non-persistent voxel layer, the backup recovery behavior, an OMPL planner, and a Hybrid-A*-based planner.

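    To illustrate the path-planning class of plugin, here is a self-contained A* over a 2D occupancy grid; this is a textbook sketch, not the Nav2 implementation:

        import heapq

        def astar(grid, start, goal):
            """grid: 2D list, 0 = free, 1 = occupied; start/goal: (row, col)."""
            rows, cols = len(grid), len(grid[0])
            h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan heuristic
            open_set = [(h(start, goal), 0, start, None)]
            came_from, g_cost = {}, {start: 0}
            while open_set:
                _, g, cur, parent = heapq.heappop(open_set)
                if cur in came_from:
                    continue  # already expanded with a cheaper cost
                came_from[cur] = parent
                if cur == goal:  # walk parents back to reconstruct the path
                    path = []
                    while cur is not None:
                        path.append(cur)
                        cur = came_from[cur]
                    return path[::-1]
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nxt = (cur[0] + dr, cur[1] + dc)
                    if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                        ng = g + 1
                        if ng < g_cost.get(nxt, float('inf')):
                            g_cost[nxt] = ng
                            heapq.heappush(open_set, (ng + h(nxt, goal), ng, nxt, cur))
            return None  # no path exists

        # Route around the wall in the middle row:
        print(astar([[0, 0, 0], [1, 1, 0], [0, 0, 0]], (0, 0), (2, 0)))
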
  • 00:09:57
    On the right we have a picture of the behavior tree that we used in these experiments. If the A* planner failed, we might do something like clearing the global environment; if our TEB controller failed, we'd clear the local environment; and if the navigation task itself failed, then we might enter the recovery subtree, which conducts all of our different available recovery behaviors.

  • 00:10:19
    In our experiment we navigated about 37 miles fully autonomously between the two robots, which took about 23 hours to complete. We ended up executing 168 recovery behaviors, which is about 4.3 recoveries per mile; most of these recoveries were due to students and other dynamic obstacles in the way, resulting in clearing the costmaps or other recovery behaviors. That shouldn't be viewed as a negative indicator for the navigation stack, since these recoveries are there to make sure that the robot can fully autonomously navigate without any sort of human assistance. We were able to do so with no collisions and no active human assistance, with zero emergency stops, so it was able to handle these dynamic scenes surprisingly well.

  • 00:11:00
    So what's next? We're always working on new capabilities in open source. We're currently working on dynamic obstacle integration, as well as integrated height maps, official 3D SLAM and visual SLAM integrations, new algorithms for all these different classes of plugins, a new localization framework with implementations for 2D and 3D lidar-based positioning, and semantic navigation. So if you're working on some sort of interesting research that could have real-world applications, you should definitely reach out to us. We're very interested in providing new and modern algorithms in the Navigation 2 system, and it would allow your research to have far larger impact than just publishing a paper.

  • 00:11:38
    Thank you. Here are some interesting links if you're interested in our work: the repository is listed here, and we also have documentation and tutorials on our website. You can see all the different code we used in this experiment, as well as our community Slack, so if you're interested in getting involved in the community, or you're just interested in learning more, this is a great place to be. And here are all of my other co-authors. Thank you for your time.

Tags
  • ROS 2
  • Navigation 2
  • autonomous robotics
  • behavior tree
  • navigation stack
  • Marathon Experiments
  • hybrid systems
  • modular design
  • robotics software
  • high reliability