bouncy-objects.md

August Soderberg & Joe Pickens

This guide will give you all the tools you need to create a bouncy surface in Gazebo, whether you want to make a basketball bounce on the ground or make a hammer rebound off of an anvil.

First, you will need to create the URDF file for your model. Examine any of the URDF files given to you for help with this; a generic example of a ball I created is shown below.

<robot name="ball">
    <link name="base_link">
        <visual>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <geometry>
                <sphere radius=".5"/>
            </geometry>
        </visual>

        <collision>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <geometry>
                <sphere radius=".5"/>
            </geometry>
        </collision>

        <inertial>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <mass value="1"/>
            <inertia ixx="8e-03" ixy="-4e-05" ixz="1e-04"
                                 iyy="8e-03" iyz="-4e-06"
                                             izz="8e-03" />
        </inertial>
    </link>
    <gazebo reference="base_link">
        <mu1>1</mu1>
        <mu2>1</mu2>
        <kp>500000</kp>
        <kd>0</kd>
        <minDepth>0</minDepth>
        <maxVel>1000</maxVel>
    </gazebo>
</robot>

A few notes on the sample URDF above: the model is a 0.5 m radius sphere, the inertia tensor is arbitrary as far as we are concerned (a third-party program can generate an accurate tensor for models you designed outside of Gazebo), and the friction coefficients "mu1" and "mu2" are also arbitrary for this demonstration.
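
If you do want a physically accurate tensor for this particular shape, a solid sphere has moment of inertia I = (2/5) m r^2 about every axis, so this 1 kg, 0.5 m ball works out to 0.1 kg*m^2 on the diagonal and zero off the diagonal. A sketch of the "inertial" block using those exact values (not the arbitrary values used in the guide above) would be:

        <inertial>
            <origin xyz="0 0 0" rpy="0 0 0"/>
            <mass value="1"/>
            <!-- Solid sphere: ixx = iyy = izz = (2/5) * m * r^2 = 0.4 * 1 * 0.25 = 0.1 -->
            <inertia ixx="0.1" ixy="0" ixz="0"
                               iyy="0.1" iyz="0"
                                         izz="0.1" />
        </inertial>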

To create a good bounce we will be editing the values within the "gazebo" tags, specifically "kp", "kd", "minDepth", and "maxVel".

maxVel

When objects collide in Gazebo, the simulation imparts a force on both objects along the normal vector of their contact surfaces in order to stop the objects from passing into one another when we have said they should collide. The "maxVel" parameter specifies the maximum corrective velocity (in meters per second) that you will allow Gazebo to apply to your object to simulate surfaces colliding. As you will see later, this value is unimportant as long as it is greater than the maximum velocity at which your object will be traveling.

minDepth

Gazebo only imparts this corrective force on objects to stop them from merging together if the objects are touching. The "minDepth" value specifies how far (in meters) you will allow the objects to pass through each other before this corrective velocity is applied; again we can set this to 0 and forget about it.
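
As far as we understand it, these URDF "gazebo" tags end up as ODE contact parameters in the SDF that Gazebo actually simulates. If you are writing an SDF model directly rather than a URDF, a rough sketch of the equivalent settings inside the collision element (same numbers as the URDF above) would look like this:

<collision name="collision">
    <geometry>
        <sphere>
            <radius>0.5</radius>
        </sphere>
    </geometry>
    <surface>
        <contact>
            <ode>
                <kp>500000</kp>          <!-- contact stiffness -->
                <kd>0</kd>               <!-- contact damping -->
                <max_vel>1000</max_vel>  <!-- cap on the corrective velocity -->
                <min_depth>0</min_depth> <!-- allowed penetration before correction -->
            </ode>
        </contact>
    </surface>
</collision>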

kd

The "kd" value is the most important for our work, this can be thought of as the coefficient of elasticity of collision. A value of 0 represents a perfectly elastic collision (no kinetic energy is lost during collision) and as the values get the collision becomes more inelastic (more kinetic energy is converted to other forms during collision). For a realistic bouncing ball, just start at 0 and work your way up to higher values until it returns to a height that feels appropriate for your application, a value of 100 is more like dropping a block of wood on the ground for reference.

kp

The "kp" value can be thought of as the coefficient of deformation. A value of 500,000 is good for truly solid objects like metals; keep in mind that an object with a "kd" value of 0 and a "kp" value of 500,000 will still bounce all around but will be hard as metal. A "kp" of 500 is pretty ridiculously malleable, like a really underinflated yoga ball, for reference. Also, keep in mind that low "kp" values can often cause weird effects when objects are just sitting on surfaces so be careful with this and err on the side of closer to 500,000 when in doubt.

There you go, now you know how to create objects in Gazebo that actually bounce!
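
To actually watch it bounce, drop the ball from a height onto the default ground plane. Assuming your URDF lives at urdf/ball.urdf inside a (hypothetical) package called bouncy_objects, a minimal ROS 1 launch file to spawn it into an already-running Gazebo might look like this:

<launch>
    <!-- Spawn the ball 2 m above the ground so it has room to bounce -->
    <node name="spawn_ball" pkg="gazebo_ros" type="spawn_model"
          args="-urdf -file $(find bouncy_objects)/urdf/ball.urdf -model bouncy_ball -z 2.0"
          output="screen"/>
</launch>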

Tips When Debugging

Here are a bunch of random tips for debugging:

When testing how much objects bounce, try it on the grey floor of Gazebo and not on custom-created models, since those models will also affect how your object bounces.

Keep in mind all external forces, especially friction. If a robot is driving into a wall and you want it to bounce off, remember that the wheels maintain traction at the speed the robot is travelling; even if the collision quickly reverses that speed, the wheels will cancel out the bouncing movement and the robot will simply stop at the wall.

Other URDF files you edit will often have a section at the top like:

<xacro:include filename="$(find turtlebot3_description)/urdf/common_properties.xacro"/>
<xacro:include filename="$(find turtlebot3_description)/urdf/turtlebot3_waffle.gazebo.xacro"/>

Make sure these files aren't affecting the behavior of your model; they could be specifying values you were unaware of.
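
For example, an included .gazebo.xacro file will often contain its own "gazebo" reference blocks, something along these lines (illustrative only, not the actual TurtleBot3 values):

<gazebo reference="wheel_left_link">
    <mu1>0.1</mu1>
    <mu2>0.1</mu2>
    <kp>500000.0</kp>
    <kd>10.0</kd>
    <minDepth>0.001</minDepth>
    <maxVel>0.1</maxVel>   <!-- a low maxVel here can suppress bouncing for the whole model -->
</gazebo>

A low "maxVel" or a nonzero "minDepth" buried in one of these included files can quietly override the bouncing behavior you tuned in your own links.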

If your simulation is running at several times real-time speed, you could miss the bouncing behavior of the ball; make sure the simulation is running at a normal "real-time factor".

If any small part has a low "maxVel" value, it could change the behavior of the entire model.

Good luck!
