Behavior Trees

GitHub repo for mr_bt: https://github.com/campusrover/mr_bt

The goal of the MRBT project was to create an efficient, modular, and user-friendly way to program complex behaviors into the Turtlebot3 robot stack. Our approach borrows a method popular in the video game industry for programming the behavior of NPCs (non-playable characters; think enemies in Halo or BioShock). With behavior trees, we can program a complex behavior from a simple tree definition in a JSON file, and we can partition a behavior into multiple subtrees that can be reused elsewhere or run standalone.
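
To give a rough sense of what a tree definition expresses, here is a minimal sketch written as a Python dict. The node types and names are hypothetical, not the actual mr_bt JSON schema; see the examples under mr_bt/src/tree_jsons/ for real definitions.

```python
# Hypothetical structure only -- the real trees are JSON files under
# mr_bt/src/tree_jsons/, and their exact schema may differ.
move_to_position = {
    "type": "selector",
    "children": [
        {"type": "conditional", "name": "at_goal"},            # succeed if already there
        {
            "type": "sequencer",
            "children": [
                {"type": "update", "name": "compute_heading"},    # refresh blackboard data
                {"type": "action", "name": "drive_toward_goal"},  # publish movement commands
            ],
        },
    ],
}
```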

Installation

  • Requires ROS Noetic, Ubuntu 20.04, and a catkin workspace

  • cd into your catkin_ws/src directory

  • Run git clone https://github.com/campusrover/mr_bt.git

  • Run cd ../ && catkin_make

Usage

  • To run one of the example behavior trees, you need a Turtlebot3 robot connected to ROS.

  • Run the example: roslaunch mr_bt btree.launch tree:=move_to_position

  • To run any other example in mr_bt/src/tree_jsons/, pass the name of the example folder containing a root.json as the tree argument to the launch file.

Key Components of the Behavior Tree

Any complex behavior tree can be broken down into a few key components that highlight its overall logical structure.

Blackboard

The Blackboard is the main storage component for data accessible to the nodes in the behavior tree. It is stored as a Python dictionary, and a reference to it is passed down from the root node to every node in the tree. As a result, every node can change and share data through the Blackboard. In addition, data from the robot's sensors (camera, lidar, etc.) is accessible to every node in the tree through the Blackboard.
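
As a minimal sketch of the idea (the key names below are made up, not the real mr_bt blackboard layout): one Python dict is passed by reference, so a value written by one node is immediately visible to every other node.

```python
# Illustrative only: one dict, passed by reference, shared by every node.
blackboard = {
    "scan_ranges": None,     # e.g. filled in from a LaserScan subscriber
    "target_found": False,
}

def update_node(bb):
    # an Update node might process sensor data and store a result
    bb["target_found"] = True
    return "success"

def conditional_node(bb):
    # a Conditional node reads the same dict and reports on it
    return "success" if bb["target_found"] else "failure"

update_node(blackboard)
print(conditional_node(blackboard))   # -> success
```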

Nodes

A behavior tree is a tree-shaped data structure made up of nodes, each of which contains a logical method that executes code. The behavior tree is evaluated recursively, starting at the root node. Each node either runs a script or executes its children, and each node returns one of three results to its parent: "success", "failure", or "running". There are two main types of nodes: control-flow nodes (parent nodes) and leaf nodes.
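
As a sketch of that contract (illustrative only; the actual mr_bt base class and method names may differ), every node can be thought of as exposing one method that receives the blackboard and returns one of the three status strings:

```python
# Illustrative node interface, not the actual mr_bt API.
class Node:
    def tick(self, blackboard: dict) -> str:
        """Run this node's logic and return "success", "failure", or "running"."""
        raise NotImplementedError
```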

Control-Flow Nodes

  • Selector

    • The Selector executes its children sequentially, from left to right (see the sketch after this list).

    • If one of its children returns "success" or "running", it halts execution of its children and returns that child's result.

    • If all of its children return "failure", the Selector will also return "failure".

  • Sequencer

    • The Sequencer executes its children sequentially from left to right.

    • The Sequencer halts only when one of its children returns "failure" or "running", in which case it returns that child's result.

    • If all of its children return "success", the Sequencer returns "success".

  • Multitasker

    • The Multitasker runs all of its children concurrently, each in a separate thread.

    • The Multitasker returns "success" only if all of its children return "success".

    • If any of its children return "running" but none return "failure", the Multitasker will return "running".

    • If any of its children return "failure", the Multitasker will return "failure".
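
The sketch below mirrors the Selector and Sequencer rules listed above. It is illustrative only, not the mr_bt implementation; each child is assumed to expose a tick(blackboard) method that returns "success", "failure", or "running". A Multitasker would follow the same pattern, ticking each child in its own thread and combining the results as described.

```python
# Illustrative control-flow logic, not the actual mr_bt classes.
class Selector:
    def __init__(self, children):
        self.children = children

    def tick(self, blackboard):
        for child in self.children:              # left to right
            result = child.tick(blackboard)
            if result in ("success", "running"):
                return result                    # halt and report that child's result
        return "failure"                         # every child failed


class Sequencer:
    def __init__(self, children):
        self.children = children

    def tick(self, blackboard):
        for child in self.children:              # left to right
            result = child.tick(blackboard)
            if result in ("failure", "running"):
                return result                    # halt on the first non-success
        return "success"                         # every child succeeded
```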

Leaf Nodes

  • Action Nodes

    • Action nodes send ROS topic messages from the behavior tree to the robot.

    • Often the message sent by an Action node is a cmd_vel message, which encodes movement commands for the robot.

  • Update Nodes

    • Update nodes are designated for updating data in the blackboard.

    • Often these updates involve preprocessing or processing message data coming from the robot.

  • Conditional Nodes

    • Conditional nodes return either "success" or "failure", corresponding to the boolean values true and false, respectively.

    • A Conditional node reads data from the blackboard and returns one of the two values above based on whether a particular condition is met in that data (see the sketch after this list).
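
As a concrete example of that pattern, here is a hedged sketch of a Conditional node. The class name and blackboard key are hypothetical, not part of the mr_bt package: it reads lidar data from the blackboard and maps a boolean test onto the two return values.

```python
# Hypothetical Conditional node; class name and blackboard key are made up.
class ObstacleAhead:
    """Returns "success" if the closest lidar reading is under a threshold."""

    def __init__(self, threshold=0.5, key="scan_ranges"):
        self.threshold = threshold
        self.key = key

    def tick(self, blackboard):
        ranges = blackboard.get(self.key)
        if not ranges:
            return "failure"                 # no sensor data on the blackboard yet
        return "success" if min(ranges) < self.threshold else "failure"
```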
