Robot Arm Transforms

By Vibhu Singh

A couple important things about the arm

  1. Moving the arm in multiple steps is not the same as moving it with a single transform, because of how the motors in the arm behave.

    • For example, moving the arm -0.03 and then 0.01 in the x direction is not the same as moving it -0.02.

    • The farther the arm moves in any direction, especially the y direction, the larger this discrepancy becomes.

  2. The arm needs to be firmly secured to a surface; otherwise, the whole apparatus will move around when the arm moves.

  3. The arm should be returned to the sleep position frequently; one team found that the sweet spot was every 5-6 transforms. If that isn't enough, return the arm to the sleep position after each full cycle of transforms.

  4. Brush up on polar coordinates, since they are the best way to think about the arm and its transforms: the y transform plays the role of theta, and the x transform plays the role of the radius, in a polar coordinate system (see the sketch below).
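
As a minimal sketch of this polar framing (assuming the camera coordinates have already been shifted so the origin sits at the arm's base; the function name is illustrative, not from the project code):

    import math

    def to_polar(x, y):
        """Convert arm-origin-relative (x, y) coordinates to (radius, theta)."""
        radius = math.hypot(x, y)  # corresponds (roughly) to the x transform
        theta = math.atan2(y, x)   # corresponds (roughly) to the y transform
        return radius, theta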

How to set up

The arm setup doesn't have to match the image below exactly, but a setup that follows the same principles is ideal.

  • Positioning the camera over the arm so that the arm is at the edge of the frame will make the math you have to do easier

  • As mentioned above, the arm should be fixed to a stable surface

  • For reproducibility of your results, you will need the exact same setup, or as close to it as you can get, so that all the math you do keeps working

    • Depending on what you are trying to grab with the arm, a difference of even 1-2 centimeters is enough to throw off the equations

How to get X transforms

As mentioned at the start of this guide, it's a good idea to brush up on polar coordinates for this project. The x transform is essentially the radius portion of a polar coordinate. The arm has a very limited range of motion for x transforms, from about -0.08 to 0.04, and the transforms only resolve to the second decimal place. The best way to find a relationship between the x transforms and the real coordinate system is to use a stepwise function.

It is important to remember to transform the coordinates so that the origin is at the arm's base. This is why it was important to position the camera so that the arm is at one edge of the frame: it makes it easier to transform the coordinates the camera gives you, since there are fewer transformations to make. A sketch of this shift follows.
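
As a minimal sketch of that origin shift (the arm-base pixel coordinates here are hypothetical; measure them from your own camera frame):

    # Hypothetical pixel location of the arm's base in the camera frame.
    ARM_ORIGIN_X = 20
    ARM_ORIGIN_Y = 240

    def shift_to_arm_origin(px, py):
        """Re-center camera pixel coordinates on the arm's base."""
        return px - ARM_ORIGIN_X, py - ARM_ORIGIN_Y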

To get the "bins" for the stepwise function, the best way so far to do it is to have the arm extend or retract to all possible positions and record the coordinates, apply the transforms, and then convert to polar coordinates. What this will do is allow you to find a way to get ranges of radii that correspond with certain X transforms.

Below is an example of a graph that the Cargo Claw team made to illustrate the x transforms for their project. More details about the project are available in the reports section under the cargoclaw project.

How to get Y transforms

The best way to think about the y transforms is to imagine them as angles of rotation on a unit circle. The first step toward understanding the relationship is to record the coordinates at even intervals across the y transform locations you are trying to reach. Once you have that, apply the transforms to all the coordinates and save the theta values. From there, it's easy to graph the y transforms against the theta values.

From here, to get an actual equation that converts from coordinates to transforms, it's a good idea to graph the y transforms against the theta values so that you can see what sort of pattern the relationship follows. A cubic regression works well because it can account for some of the problems that come with the arm's movement and the inaccuracies that come with it. You might still have to do some fine-tuning in your code, for example:

    # Check the larger threshold first; otherwise the elif is unreachable.
    if y_transform > 0.6:
        y_transform += 0.09
    elif y_transform > 0.5:
        y_transform += 0.05

Alternatively, you can use two different cubic regressions: one for positive y transforms and one for negative ones. You can also try higher-degree regressions, but you will have to decide whether the additional complexity is worth it. A sketch of fitting such a regression follows.
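
A minimal sketch of fitting a cubic regression with NumPy (the theta and y-transform values here are placeholders; use the pairs you recorded from your own setup):

    import numpy as np

    # Measured data: theta values from the polar conversion, and the
    # y transforms that reached them. These numbers are placeholders.
    thetas = np.array([-1.2, -0.8, -0.4, 0.0, 0.4, 0.8, 1.2])
    y_transforms = np.array([-0.60, -0.42, -0.20, 0.0, 0.21, 0.44, 0.63])

    # Fit a cubic polynomial mapping theta -> y transform.
    coeffs = np.polyfit(thetas, y_transforms, deg=3)
    theta_to_y = np.poly1d(coeffs)

    # Predict the y transform needed to reach a new theta.
    print(theta_to_y(0.5))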

Below is an example of a graph that the Cargo Claw team made to show the relationship between theta and the y transform. More details about the project are available in the reports section under the cargoclaw project.


To see an applied, working example of everything described here, look at the code for the cargoclaw project in the box_pickup.py file within the src directory of the repository.
Figures referenced above: "An example setup that works", "x transform graph", "y transform graph".