How can I calculate a better way to rotate?

Author - Eyal Cohen

Given two coordinates in a 2-dimensional plane and your robot's current heading, the smart rotation algorithm calculates the target angle and reports whether your robot should turn left, turn right, or go straight ahead to reach its navigation goal.

How It Works:

Inputs: posex1 and posey1 are floats representing your robot's x and y coordinates. posex2 and posey2 are floats representing the x and y coordinates of your robot's goal. Lastly, theta is your robot's current heading angle, in radians.

from math import atan2, pi


def smartRotate(posex1, posey1, posex2, posey2, theta):
    # Vector from the robot to the goal.
    inc_x = posex2 - posex1
    inc_y = posey2 - posey1
    # Absolute angle to the goal, in radians; atan2 returns values in (-pi, pi].
    angle_goal = atan2(inc_y, inc_x)
    # Normalize into [0, 2*pi) so it is comparable with theta.
    if angle_goal < 0:
        angle_goal = (2 * pi) + angle_goal
    print("angle goal = ", angle_goal)
    # Upper bound of the robot's left rotational hemisphere.
    goal_range = theta + pi
    # How far the left hemisphere wraps past the 0-radian point.
    wrapped = goal_range - (2 * pi)
    if abs(angle_goal - theta) > 0.1:
        print(theta)
        print("goal_range = ", goal_range)
        if goal_range > (2 * pi) and (theta < angle_goal or angle_goal < wrapped):
            # Left hemisphere wraps past 0: the goal is either above theta
            # or inside the wrapped-around portion.
            print("go left")
        elif goal_range < (2 * pi) and (theta < angle_goal < goal_range):
            print("go left")
        else:
            print("go right")
    else:
        # Already pointing (nearly) at the goal.
        print("go straight")

First, angle_goal, the absolute angle from your robot's position to the goal (ignoring the robot's current heading), is calculated by taking the arc tangent (atan2) of the y and x differences between the goal's coordinates and the robot's coordinates.
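
A quick aside on atan2, sketched with Python's standard math module: it returns angles in (-pi, pi], while the hemisphere check below assumes both theta and angle_goal lie in [0, 2*pi), which is why the function normalizes negative results.

from math import atan2, pi

# atan2(y, x) measures counterclockwise from the positive x-axis.
print(atan2(2, 2))    # ~0.785 (pi/4), already in [0, 2*pi)
print(atan2(-2, 2))   # ~-0.785, negative

# Normalizing a negative result into [0, 2*pi):
angle = atan2(-2, 2)
if angle < 0:
    angle += 2 * pi
print(angle)          # ~5.498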

In order to decide whether your robot should go left or right, we must determine where angle_goal lies relative to the robot's current heading. If angle_goal is in the robot's left rotational hemisphere, the robot should rotate left; otherwise it should rotate right. Since we are working in radians, π is equivalent to 180 degrees. To check whether angle_goal is within the robot's left hemisphere, we add π to theta (the robot's current heading) to get the upper bound of the range of values the target angle may fall in. If angle_goal is between theta and that upper bound, then turning left is the most efficient way for the robot to reach its goal; a small sketch of this interval follows.
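
To make the hemisphere idea concrete, here is a minimal sketch (the helper left_hemisphere is hypothetical, not part of the algorithm above) that returns the interval(s) of angles in [0, 2*pi) lying to the robot's left:

from math import pi

def left_hemisphere(theta):
    # Hypothetical helper: interval(s) to the left of heading theta.
    upper = theta + pi
    if upper > 2 * pi:
        # The hemisphere wraps past the 0-radian point.
        return [(theta, 2 * pi), (0, upper - 2 * pi)]
    return [(theta, upper)]

print(left_hemisphere(0))    # [(0, ~3.142)]
print(left_hemisphere(5.5))  # [(5.5, ~6.283), (0, ~2.358)]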

Consider This Example:

    smartRotate(0, 0, 2, 2, 0)

If your robot is at (0, 0), its heading is 0, and its target is at (2, 2), then angle_goal = atan2(2, 2) ≈ 0.785. First we check whether the current heading's deviation from angle_goal is significant by taking the difference and checking whether it is larger than 0.1. If the difference between the angles is insignificant, the robot should go straight toward its goal. In this case, however, angle_goal - theta (0.785 - 0) is greater than 0.1, so we know the robot must turn left or right to approach angle_goal. To find out whether this angle is to the left or the right of the robot's current heading, we add π to the current heading to find the boundary between its left and right hemispheres. Here goal_range = 0 (theta) + π ≈ 3.14, and since angle_goal lies between theta and goal_range, the robot must turn left to reach its goal.
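
For reference, calling the function as defined above should print roughly the following (values approximate):

smartRotate(0, 0, 2, 2, 0)
# angle goal =  0.7853981633974483
# 0
# goal_range =  3.141592653589793
# go left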

However, if theta (your robot's current heading) plus π is greater than 2π (the full circle in radians), then the left hemisphere of your robot partially crosses the 0-radian point of the circle. To account for that case, we calculate how far the goal range wraps around the circle past the origin. If it wraps, we check whether angle_goal is between theta and 2π, or whether angle_goal falls within the wrapped-around remainder of the range. If either of these conditions is met, your robot should turn left to most efficiently arrive at its goal; otherwise it should turn right.
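
As a worked wraparound example (using the normalized smartRotate above): with the robot at (0, 0), heading theta = 5.5 rad, and the goal at (2, 2), angle_goal ≈ 0.785 and goal_range = 5.5 + π ≈ 8.64 > 2π, so the hemisphere wraps by wrapped ≈ 2.36. Since 0.785 < 2.36, the goal falls in the wrapped portion and the robot turns left (a 1.57 rad left turn, versus 4.71 rad to the right):

smartRotate(0, 0, 2, 2, 5.5)
# angle goal =  0.7853981633974483
# 5.5
# goal_range =  8.641592653589793
# go left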
