
howto.md

## Rebuild and upload teensy software

    $ roscd linorobot
    $ cd teensy/firmware
    $ pio run --target upload -e <platform>

Where `<platform>` is currently either teensy31 (Platform 2) or teensy41 (Platform 1).

platform.md

fiducial-detection.md

Example of Fiducial Detection

## Raspicam

  • We use a Raspberry Pi v1 camera whose full documentation, including hardware and software specs, can be found in the Raspicam Docs.

  • Additional information on how to install the Raspicam hardware can be found in the Raspicam Installation guide.

## AprilTags

  • We use the pupil_apriltags package to detect fiducials in the Raspicam image; full documentation for the AprilTags can be found in the Pupil AprilTags docs.

  • The package works by taking the camera parameters, fiducial size and family, and additional fiducial detection parameters, and creating a detector class. The detector has a detect method that takes a camera frame and outputs a list of detected fiducials (see the sketch below).
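A minimal sketch of how such a detector is typically constructed and called. The camera parameters and tag size shown here are illustrative assumptions; the project reads its real values from params.yaml.

```python
import cv2
from pupil_apriltags import Detector

# Illustrative values -- the project reads these from params.yaml
camera_params = (607.0, 607.0, 320.0, 240.0)  # fx, fy, cx, cy in pixels
tag_size = 0.03                               # fiducial edge length in meters

detector = Detector(families="tag36h11")

# Any grayscale frame works; here we just load an example image.
frame = cv2.imread("example_frame.png", cv2.IMREAD_GRAYSCALE)
detections = detector.detect(
    frame,
    estimate_tag_pose=True,
    camera_params=camera_params,
    tag_size=tag_size,
)
for d in detections:
    print(d.tag_id, d.pose_t.ravel())  # tag id and translation relative to the camera
```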

## Parameters and Tuning

  • The params.yaml file contains all of the fiducial vision parameters.

  • The camera parameters used in params.yaml were found online in the Raspicam documentation.

  • The size of the fiducial in meters can be adjusted by printing out new fiducials of a larger size.

## Transformation Math

  • src/transforms.py and src/geometry.py contain the methods used for transforming the fiducial detection results into easier-to-work-with translations and rotations (see the sketch below).

  • Also, parameters in params.yaml are used to slightly adjust the detection results after transformation.
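As an illustration only (the actual math lives in src/transforms.py and src/geometry.py), the pose returned by the detector can be reduced to a planar translation and yaw roughly like this; the function name and frame conventions here are assumptions, not the project's exact code.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def detection_to_planar_pose(pose_R, pose_t):
    """Reduce a detection's rotation matrix and translation vector
    to an (x, y) offset and a yaw angle in radians (illustrative only)."""
    x, y, _z = np.asarray(pose_t).ravel()
    yaw = Rotation.from_matrix(pose_R).as_euler("zyx")[0]
    return x, y, yaw
```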

In order to print new AprilTags you have to follow the instructions at AprilTag Generation and Premade AprilTags.

Fiducial Tag Families

hardware.md

The robot was donated by the Hands-On Robotics Initiative. Due to time constraints and some delays, a prebuilt robot was delivered instead of a parts kit. The build instructions can be found in this Google Doc. It also includes other relevant instructions that will be referenced later, such as motor calibration, software installation, and run instructions.

## Hardware Overview

RaspberryPi

## Default hardware (robot)

The robot consists of 12 C610 motor controllers, 12 M2006 motors, a Teensy board, cables and the chassis.

## Additional hardware

  • Raspberry Pi

  • Raspberry Pi Camera

  • USB Battery Pack for Raspberry Pi

  • Keyboard (for emergency stop)

  • 6S LiPo Battery 22.2V

## Battery Charging Settings

The battery charger in the lab supports charging the 6S LiPo battery. Settings can be found below:

    Chemistry:    LiPo
    Voltage:      22 Volts
    Capacity:     1300mAh
    Charge Rate:  5C
    Series Count: 6S

The batteries took around 1 hour to charge and last around 2-3 hours per charge.

The USB Battery Pack used was a generic one that provided enough voltage for the Pi. These are usually battery packs that support fast charge technology for smart phones. An appropriate cable is needed (e.g: USB 3.0 to Micro-USB or otherwise)

## Calibration Tips

  • Motor calibration instructions can be found in the doc linked earlier. The robot's casing may have to be opened to access some of the motor controllers.

  • Instructions for leg calibration can be found here. Best results were achieved by supporting the robot with a box and adding padding (made of paper towels and tape) between the legs and the box to get the desired motor angles.

Pupper Calibration Setup

    software-setup.md

## Pupper Base Software Setup

Instructions for the base software setup can be found in the doc. See the notes section below for project-specific instructions.

## Notes on the setup doc

  • 3.c: en1 might not exist on the Pi. To find out which network interface to use, run sudo ifconfig and look at the first interface it lists. For this project, it was eth0.

  • 4.d.ii: Although it says Mac/Linux, this is not always the case. If ls /dev | grep tty.usbmodem shows no results, try ls /dev | grep ttyACM. For this project, it was ttyACM0.

## Running base software

Follow the instructions here. The keyboard program mentioned can be found here.

## Setting up this project

Install the dependencies using pip:

## Dependencies

  • picamera

  • pupil_apriltags

  • tqdm

  • scipy

  • UDPComms

  • pyyaml

  • opencv or cv2

  • argparse

  • pickle

  • matplotlib

## Running

The full software can be run using python main.py, or the controller can be run separately using python3 src/controller.py.

    validating.md

When a new robot is built, there are a number of steps needed to verify that it is properly configured.

## Encoders

Encoders need to be connected correctly. The left and right can be swapped, and they may also be backwards. These issues lead to erratic and incomprehensible behavior, so it's best to check them first if your new robot is acting weird.

## Motors

Motors also need to be connected correctly. They can be left-right swapped or forward-reverse swapped. It is again worth testing them separately. The kind of incomprehensible behavior caused by this problem is totally different from the one you see when the encoders are backwards or swapped.

## Front and back

    It is very important that you know what the front of the robot is. Otherwise nothing makes sense. Our robots have the big wheels in front and the casters in back.

## IMU

The IMU is connected via an I2C Quick Connector to the Teensy. We have seen problems where the IMU doesn't work that were caused by the placement or length of the wire, so keep an eye out for that case.

    connections.md

    troubleshooting.md

## lino_base.h Pins

The most common problem with a new build is that the pin numbers are incorrect. You can find the hardware configuration in linorobot/teensy/firmware/lib/config/lino_base.h. In that same directory you will find other versions of that file for our different models. If you are qualifying a new model, then you should add a copy there.

There are numerous variables in this file. The key ones for now are the motor pin and encoder pin definitions (listed at the end of this section).

MOTOR1 is front left, and MOTOR2 is front right. There are several misconfigurations possible, basically all the permutations of left/right and forward/backward, on both the motors and encoders.

## Symptoms

If all wheels are spinning but the robot spins in circles, goes backward, is unresponsive to cmd_vel commands, or in general acts erratically, your first hypothesis should be that one or more of the pins above are incorrect or swapped.

If one of the wheels doesn't spin at all, then it's probably an electrical connection to the motor. If neither wheel spins, then it's probably a power/battery issue.

## Erratic Travel on a newly built robot

### Check Encoders

To check the encoders, disconnect the power from the motors, leaving just the encoders. Then when you roslaunch linorobot minimal.launch it will print four numbers over and over again, which are the readings of the (up to) four encoders (two in front and two in the back).

As I had only two wheels, I just got two non-zero numbers. Put the robot on the floor and push it gently forward. Those two numbers will change. They should both go up at approximately the same rate, even though they are not the same number. In my case, they went in opposite directions. Note the one that is going down. The left wheel is encoder1 and the right wheel is encoder2. In the setup file I switched the pins for the corresponding motor and that solved it.

(Note: This is very obvious but I got it wrong: the caster is on the BACK of the robot, not the front. You need to know which end is the front of the robot to know which wheel is left or right.)

### Check Motors

For me, this didn't quite solve it. So the next thing to check was the motor itself. I changed the Arduino code (.ino) so that instead of sending the calculated numbers to the motors, I sent 100 to the left motor and 500 to the right motor. This let me tell whether the left motor turned slower than the right; if not, I had to switch those pins. Next I had to check that both motors turned in the direction that produces forward motion of the robot. That was incorrect for me too, and it is also fixed by switching pin numbers.

### Check PID

Next came the PID settings. The instructions are good in terms of monitoring the result of the PID calculations, but not as far as what numbers are right. There are countless references on the web about tuning PID, and they are all vague and different.

Here again I made a small change to the Arduino code. I had it print out the error between the desired and actual rate of the left and right wheels. If things are working as they should, that error starts at zero, temporarily moves away from zero when you give the robot a command, and then returns nicely to zero. I don't know yet what "technique" I used, nor whether I have the right parameters yet. But at least the robot is behaving better.

For reference, the relevant pin definitions from lino_base.h:

    #define MOTOR1_IN_A 20
    #define MOTOR1_IN_B 1
    #define MOTOR1_PWM 22

    #define MOTOR2_IN_A 6
    #define MOTOR2_IN_B 8
    #define MOTOR2_PWM 5

    #define MOTOR1_ENCODER_A 14
    #define MOTOR1_ENCODER_B 15

    #define MOTOR2_ENCODER_A 12
    #define MOTOR2_ENCODER_B 11
    Robots in our Lab

Over the years we've built up a varied collection of robots. Some are no longer in use, and several types still are. This section contains a lot of the robot-specific details. Today the main robots we use are the TurtleBot3 from Robotis and the homegrown Platform line.

## TurtleBot3 - Robotis

These have been our workhorses. They are robust and don't seem to break much, probably because they are pretty much standardized from the factory.

## Platform - Brandeis

These robots are homegrown, with hardware designed and manufactured by Charlie Squires and a software stack based on the Teensy.

    stack.md

## Intro to LinoRobot

## Intro

Examine Linorobot again. You will see very detailed instructions for building a robot, both the hardware and the software. In our world, bullet, platform, and cat are all fully compliant Linorobot robots.

## Base hardware stack

  • For the SBC we use either a Raspberry Pi 3B+ or a Raspberry Pi 4

    • Lidar is connected via USB to the Raspberry Pi

    • The Microcontroller is connected via USB to the Raspberry Pi

  • For the microcontroller we use either a Teensy 3.2 or a Teensy 4.x (check this)

    • The motor controller is connected to the Teensy via

    • The IMU is connected to the Teensy via the I2C bus

## SBC Software

    The SBC is running Ubuntu 20.04 and ROS 1.0. It is a standard install which we get from the Linorobot installation. Of course our own ROS code and scripts are then added to it. Certain standard Linorobot Nodes are launched.

## Standard Linorobot Nodes

## Microcontroller Software

The Teensy code is provided by Linorobot. We have tweaked it in small ways. See How To for information on rebuilding it and installing the software. This software has the following jobs:

    1. Read the encoders to determine the apparent speed and direction of the robot

    2. Subscribe to cmd_vel to determine the desired speed and direction

    3. Use a PID controller to drive the motors to meet the desired speed and direction

4. Publish the actual speed and direction as computed by the encoders as ODOM_RAW

5. Read the IMU data (via the I2C bus) and publish it as IMU_RAW

6. Read other I2C sensors and actuators (coming soon!)

    Platform Hardware Specs

    • Consistent across Platform 1, 2 and 3?

    • Raspberry Pi 4b

      • Arduino Teensy 3.1

      • MPU 9250 IMU

      • YDLidar X4

    mrbuild.md

## GoPiGo3 Basic Kit

We recommend that you start with a GoPiGo3 Basic Kit. Build it following the instructions. There is a lot written and a lot to google about it. Here are some useful links:

  • Dexter Forum - where you can get many questions answered

  • Build your GoPiGo3 - where you find step by step instructions

  • MiniRover Purchase Program - instructions for students taking Cosi119a at Brandeis University to order a miniRover

    In addition to what you received with the Basic Kit you will need the following to get through the instructions.

  • A Raspberry Pi 3+

  • A battery pack (with Y-cable and charger)

  • A MicroSD card

  • A Lidar

  • A very short USB cable

    The Battery pack is different from what the instructions talk about. The Y-Cable is used to connect the barrel connector of the battery pack with the corresponding battery connector on the "Red Board" which in turn will power the Raspberry Pi.

The robot will come with a microSD card with the Dexter software. There is some very simple calibration that you need to do.

## Battery Pack

  • The battery pack is specific in terms of voltage and capacity. Don't substitute a different one.

    • Note that the battery pack needs to be in the "on" position in order to have the charger do anything. And obviously it has to be in "ON" for the robot to work.

    • It also comes with a charger and a y-cable. The y-cable is about 10" long and has three barrel connectors on it in a y-configuration.

## Camera

You will have bought the camera separately, but the instructions describe how to mount it to the robot. Make sure it is centered in all three dimensions. A good spot is on the front of the top plexiglass part. It can be done without any extra parts, but Dexter has this bracket, which works nicely too.

## Lidar

The Lidar needs to be mounted on the top plexiglass part. You will need to drill two holes. Make sure that the Lidar is exactly centered and pointed forward; the narrower part of the Lidar is the front, and it should face forward (which is on the other side of the ball caster). I drilled holes for the two posts at the back of the Lidar (the wider part) and kept the drilling to a minimum. I don't think the exact holes matter much, as long as the Lidar ends up straight and centered.

  • We use only two of the three ends of the Y-connector. One goes into the barrel connector on the red board, and one goes into the barrel connector on the battery pack.

  • We then connect the very short USB cable with one end in the battery pack and the other end in the "power" connection of the Lidar board. The Lidar board is tiny, about 1.5" square. On one side you have the Lidar wires and on the other side you have two micro-USB connectors. If you look very carefully, one is marked data and one is marked power. The one marked data is connected with a short USB cable to one of the USB connectors of the Pi. The one marked power is connected with another short USB cable to the battery pack.

Top View of MiniRover
Side View of MiniRover

    minirover

    MiniRover is our modification of the GoPiGo3 robot. It comes with software, a VPN and all kinds of fancy things which is why I decided to give it a special name. This page is under construction as I troubleshoot and make notes about things. I expect feedback from everyone where the instructions don't meet reality!

    linorobot

## What it is

Linorobot is a software package for building your own robots. We have used their software and instructions to construct our own robots and configure the software for them. The above link is a major resource for you. In this section we will explain things that are specific to Brandeis, but we won't repeat what is already covered there.

    mruse.md

## Turning it on

    It is important that you follow a rigid procedure when turning the robot on and off.

    Assuming it is totally off:

    1. Make sure the battery pack is correctly connected to the robot.

    2. Switch on the battery pack

    3. Click the micro button on the red board

4. Look at the Raspberry Pi (not the "red board"). You will see tiny red and green LEDs blinking. Wait until the green one has settled down to slow flickering.

## Turning it off

    1. From your ssh command line on the remote computer, type sudo shutdown now

    2. Once the Raspberry Pi has stopped blinking you can turn off the power switch on the battery pack.

## Charging the Battery

The battery pack came with a charger. It has a light which is red while charging and green when fully charged. Note that the battery will not charge when its switch is set to off.

## rset command

We've implemented a very simple tool to set up the IP addresses correctly. It will be changing as I figure out how to make it better. So for now:

1. If you have a cloud desktop and want to run simulations without an actual miniRover, run rset cloud

2. If you have a cloud desktop and a real robot, run rset robot on your cloud desktop and rset pi on the actual miniRover (over ssh)

3. If you have a local docker based desktop, run rset docker there

rset by itself displays the current status.

    (I know a combination is missing and plan a revision of this)

## Starting the MiniRover ROS applications

Note that all we run on the MiniRover itself are roscore and the nodes needed for the motors, lidar and camera. Everything else runs on your "remote". The following commands are to be run on the MiniRover from ~/catkin_ws:

    # launch the motor controller and lidar
    roslaunch minirover mr_bringup.launch

    # launch camera (optional if you need the camera)
    roslaunch gpg_bran raspicam.launch

## Web Desktop tools

Note that this includes all flavors: cloud based, local docker based, and gpu based browser desktops. If you just want to use the simulators on their own and are not using an actual miniRover, then rset cloud is enough. At that point you can run your ROS programs.

There are numerous scripts, programs and launch files that are preinstalled on your ROS web desktop. I will document only some of them here, but you can look around and find more that are interesting. All of them from the book are here; I have not renamed any of them for that reason.

    # Fun prr examples
    rosrun prrexamples red_light_green_light.py
    rosrun prrexamples wander.py

    # Visualize robot with rviz
    roslaunch gopigo3_navigation gopigo3_rviz.launch

    # Or with gazebo (not as useful)
    roslaunch gopigo3_navigation gopigo3_gazebo.launch

    # Run different simulated stages to experiment with
    roslaunch turtlebot3_gazebo turtlebot3_stage_1.launch
    roslaunch turtlebot3_gazebo turtlebot3_stage_2.launch
    roslaunch turtlebot3_gazebo turtlebot3_stage_3.launch
    roslaunch turtlebot3_gazebo turtlebot3_stage_4.launch

    # Control the robot
    roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

    # Begin building the map on real robot
    roslaunch gopigo3_navigation gopigo3_slam.launch

    # Begin localization with AMCL
    roslaunch gopigo3_navigation amcl.launch

## Rset Command

    In order to facilitate working in all the combinations of environments we have these commands:

    • rset pi - declare that this is the raspberry pi

    • rset cloud - declare that this is a cloud desktop working with sim

    • rset robot - declare that this is a cloud desktop working with a real robot

## Aliases

There are a bunch of handy aliases:

  • myip - this computer's regular local ip

  • myvpnip - this computer's vpn ip if it has one

  • stopnow - immediately stop the robot

  • rset - display settings

  • teleop - run a simple teleop tool

  • ru - connect to rosutils

  • cm - catkin_make

  • cw - cd ~/catkin_ws

  • cs - cd ~/catkin_ws/src

## Accounts and passwords

  • miniRover

    • hostname gopigo3

    • default account pi

    • default password raspberry

  • Cloud or Docker Desktop

    • default password dev@ros

    • desktop: url: http://vnc..ros.campusrover.org

    • vscode: http://code..ros.campusrover.org

## Key Environment Variables

ROS_MASTER_URI=http://100.94.206.80:11311 (example!) should always be set to the computer where roscore is running. If you are using a physical robot, then roscore runs on the robot itself. If you are in the web desktop and working just with simulation, then roscore would run there.

## Robot

ROS_MASTER_URI = robot's own ip address
ROS_IP = robot's own ip address

## Remote Computer

ROS_MASTER_URI = robot's ip address
ROS_IP = remote computer's own IP address

    These IP addresses are on different networks and cannot access each other. So instead we've created what is called a "virtual private network" that connects them together. Both your robot and your cloud desktop have an alternate ip address which they can both see.

## IP Addresses

    • myip returns your local ip address

    • myvpnip returns your vpn ip address (if you have one)



    mrtroubleshooting.md

    Author: Pito Salas

## General

  • An excellent and very detailed troubleshooting guide for the Raspberry Pi

## LED on the Red Board

  • Blinking green is the "starting up" indication.

  • Solid green is the "ready" indication.

  • Solid yellow is the "low battery" caution indication.

  • Solid red is the "low battery" warning.

  • Blinking red is the "shutting down" indication.

  • Blinking purple is (AFAIK) the "I don't know what is going on" indication, but I could be wrong.

Note that all of these indications are only valid when using a Dexter Industries / Modular Robotics operating system (i.e. Raspbian for Robots, GoPiGo OS, or DexterOS). The LED will continue to blink green when using any other OS, and the MiniRover configuration is Ubuntu! Also see: https://www.dexterindustries.com/GoPiGo/get-started-with-the-gopigo3-raspberry-pi-robot/2-connect-to-the-gopigo-3/

## Python default

To change the default python on Ubuntu (assuming you want it to be python3.4):

    sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.4 1

## Check whether SPI is working on the Pi
    >>> import spidev
    >>> spi = spidev.SpiDev()
    >>> spi.open(0,1)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    PermissionError: [Errno 13] Permission denied
mrsetup.md

This section will help you get your robot set up to go. Note that if you received this from the Autonomous Robotics Lab then part of these steps may already be done.

See the important information about turning the robot on and off here: Using the Robot.

## MicroSD Card (if needed)

If your robot was set up by us, then you should skip this step!

You are going to load software onto the microSD card, which will wipe out what you loaded onto it according to the earlier instructions. We provide you a disk image on which to store it. It changes from time to time. This link is to a Google folder that will contain the versions. You should use the latest one in that folder. I recommend you use the app "Balena Etcher" to copy the image onto your microSD card.

To create a new microSD card from the old one, see Backup Raspi Card on MacOS.

## Connecting to the network

    Now we are facing a dilemma. We need to get the robot on your network. There are several ways of doing this. Below are two specific scenarios that we support:

    1. A usb keyboard, usb mouse and a HDMI display and eduroam wifi access

    2. OR A network cable and non-eduroam wifi access

    3. A MiniRover v2 and non-eduroam wifi access. It should be labeled with a name.

## Scenario 1

    1. We will tell you what the name of your robot is. Plug it in below wherever you see gopigo3

    2. Locate a wired network connection (on your router for example) and use a network cable to connect your robot to the network

3. Now turn the power on (see Using MiniRover for instructions). It should boot up into Linux. But you won't know this because there's no keyboard or screen!

4. Using your own computer or development environment that is attached to the network, check that you see the robot. This could be a Linux computer where you program, or a browser web development environment. Broadly speaking, it's the computer "other than" the Raspberry Pi on the robot (we will refer to this as the remote computer from now on).

    ping gopigo3.local

Once this works you know that you have access to the robot from your remote computer. Make note of the robot's IP address. It will likely look like 192.168.1.xx, but not necessarily.

5. Now use ssh (secure shell) to get to the robot from the remote:

    ssh pi@gopigo3.local

It will ask you for the password for account pi, which is raspberry. Once you get in you are halfway there!

6. Now we want to get your robot onto your local wifi network. You need to know the wifi network's name and password. On the robot command line type:

    sudo nmcli d wifi connect <SSID> password <password>

Where <SSID> is the network's name (you can surround it in quotes if there are spaces or other funny characters) and <password> is the password.

7. Next shut down the robot nicely (see above), disconnect the network cable, and start the robot up nicely again (see above).

8. Once it's back, follow the same steps to ssh pi@gopigo3.local and enter the password raspberry, and you should have a wireless-enabled robot.

## Scenario 2

1. Connect your mouse, keyboard and screen to the Raspberry Pi. You will find several free USB ports and an HDMI port; look closely, they are all there.

    2. Boot up the Raspberry Pi and wait until it is running and you see the desktop

    3. Locate the network settings dialog box by clicking the network icon on the top right

    4. Add eduroam as a network, and fill it in as follows:

    Button to reboot
5. Finally shut down the robot, unplug the keyboard, mouse and monitor, and reboot.

6. Once it's back, ssh pi@gopigo3.local and enter the password raspberry, and you should have a wireless-enabled robot.

## Scenario 3 (incomplete and incorrect)

You've received a fully set up and tested MiniRover v2, and you know its name. The following steps are written as if the robot is called mr1. Scenario 3 robots will have the account ubuntu with the password ubuntu (so different from scenarios 1 and 2).

1. Now turn the power on (see Using MiniRover for instructions). It should boot up into Linux. But you won't know this because there's no keyboard or screen!

2. The MiniRover will come up as a Wifi access point. The SSID is mr1XXXX where XXXX is part of the MAC address. The wifi password is robotseverywhere. Connect to it via your computer's wifi.

## Troubleshooting ssh gopigo3.local

Under certain circumstances gopigo3.local will not be found. If so, you need to find out the IP address of your robot when it is on wifi (not wired). Then this should work:

    ssh pi@<ip address>

## Updating the hostname of your Robot

Rename the hostname of your robot. It comes to you called "gopigo3", but they are all called that and this can cause confusion. Let's say you want to call your robot pitosalas. Oddly, you have to change it in two places. Here's how:

    # In the shell of the robot:
    sudo hostname pitosalas

    # In the hostname file:
    sudo nano /etc/hostname

    Now the robot is called pitosalas and at least it will be different from other miniRovers.

## Eduroam

    On eduroam it is often automatically known as gopigo3.dyn.brandeis.edu. So after rebooting it check if this works. It's not guaranteed but often it works just fine.

## VPN

You can run everything on the robot itself, but it doesn't have a screen. You could plug in an external screen, keyboard and mouse and have a complete ROS-enabled computer (the robot) at your fingertips, but it would be really slow. So instead we are going to add the robot to a "VPN" - a virtual private network - which will allow you to work with it remotely.

1. Prepare the VPN configuration by removing any old tailscale state:

    sudo apt-get remove -y tailscale
    sudo rm -rf /var/lib/tailscale/tailscaled.state

    # Reboot right after!
    sudo reboot

2. Set up the VPN:

    # Get the tskey from Pito
    cd ~/rosutils
    chmod +x pi_connect.sh

    # Run the script with the tailscale authkey
    sudo ./pi_connect.sh <tskey-123abc456>

    # On successful connect, you should see this
    # Connected. IP address: 100.xx.xxx.xxx

3. myvpnip should now return that same IP address.

## Updates for your Robot

We have installed some Brandeis specific software which should be updated:

    cd ~/rosutils
    git pull
    cp ~/rosutils/bashrc_template.bash ~/.bashrc
    cd ~/catkin_ws/src/gpg_bran4
    git pull

And then edit the new ~/.bashrc according to the instructions in the file.

## Updates to your Cloud Desktop

1. Check that you have a ~/rosutils directory on your cloud desktop. If not:

    cd
    git clone https://github.com/campusrover/rosutils.git
    cp rosutils/bashrc_template.bash .bashrc

2. Edit .bashrc according to the instructions in it.

## This will continue to be updated


    software-overview.md

The code can be found in the Pupper GitHub Repository.

## Directory

    ├───models
    │   └───agent.py
    │   └───goal.py
    │   └───obstacle.py
    ├───plots
    │   └───map.png
    ├───src
    │   └───boundary_detection.py
    │   └───controller.py
    │   └───fiducial_vision.py
    │   └───geometry.py
    │   └───main.py
    │   └───node.py
    │   └───path_finder.py
    │   └───boundary_profiler.py
    │   └───transforms.py
    │   └───path_test.py
    │   └───viz.py
    ├───params.yaml

## Components Overview

## Models

  • The models can be found at /models/

  • The obstacle, agent, and goal models are of type Shape, which can be found within src/boundary_detection.py

  • The parameters of the models can be found within params.yaml

  • The goal model is a shape containing only a single point

  • The obstacle and agent models are defined by 4 corner points which are then interpolated to create a series of points defining the boundary of the model

  • Each model also has a center, which is its location relative to the Pupper

## Computer Vision

  • The computer vision module can be found at /src/fiducial_vision.py

  • The fiducial and computer vision package requires numerous parameters to work correctly; these include fiducial tag size, lens size, and center pixel of the image

  • The module itself contains the Vision class, which has a capture_continuous method that returns a generator yielding the results of the fiducial detection module on frames from the Raspberry Pi camera (see the sketch below)
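A purely illustrative sketch of how such a generator might be consumed; the constructor arguments and exact call pattern are assumptions based on the description above, not the project's actual signature.

```python
# Hypothetical usage of the Vision class described above (names assumed).
from src.fiducial_vision import Vision

vision = Vision()  # the real code presumably passes camera parameters from params.yaml

for detections in vision.capture_continuous():
    # Each iteration yields the fiducial detections for one camera frame.
    for detection in detections:
        print(detection)
    break  # stop after the first frame in this illustration
```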

## Boundary Generation and Navigation

  • The modules relevant to boundary generation and planning are src/boundary_detection.py, src/path_finder.py, /src/path_profiler.py, and path_test.py

  • Boundary generation works by taking in the detected positions and rotations of fiducials and then creating obstacle classes to represent each fiducial. Then each obstacle's points, of type Point[], are transformed to its corresponding fiducial's location by the src/transform.py class. The Pupper robot model models/agent.py, which is at (0, 0), and each obstacle are then used to calculate the configuration space for the robot using the Minkowski Sum (https://en.wikipedia.org/wiki/Minkowski_addition)

  • Also, the models/goal.py class is used to represent the goal of the Pupper, which corresponds to the fiducial with id = 0

  • Each point in the resulting configuration space is used to generate a graph of the area, where vertices of the graph close to the points in the configuration space are removed, so that when a shortest path search is performed the resulting path only includes valid positions from the robot's current position to the goal

  • Finally, the path is interpolated and converted into an array of distances to travel and the angles to travel at, which is then converted into command interface commands based on the velocity of the robot

## Visualization

    • The visualization class in src/viz.py uses matplotlib to generate a plot of the agent model, obstacles, obstacle boundaries, position of the goal, graph nodes, and the path of the robot

## Main

    • The program can be run simply by navigating to the root of the repository and then running python3 main.py

    controls.md

controller.py provides an API to control the robot by emulating a PS4 controller, similar to this one. The main difference is that some keybinds are different; it can also be called from code.

## The UDP message

This is a breakdown of the fields of the dictionary that is sent through UDPComms to the robot.

| Field | Value | Comments |
| --- | --- | --- |
| R2 | 0 | Nothing |
| R1 | 1 or 0 | Toggle trot/rest |
| L1 | 1 or 0 | Toggle activation/deactivation |
| dpadx | [-1,1] | Move body up or down |
| dpady | [-1,1] | Roll left/right (roll) |
| square | 0 | Nothing |
| circle | 0 | Nothing |
| triangle | 0 | Nothing |
| x | 0 | Nothing |
| message_rate | 20 | Rate at which messages are sent |
| lx | [-1,1] | Walk left or right |
| ly | [-1,1] | Walk forward or backwards |
| rx | [-1,1] | Turn left or right (yaw) |
| ry | [-1,1] | Tilt up or down (pitch) |
| L2 | 0 | Nothing |

Note: a value of [-1,1] means any value between -1 and 1, and "1 or 0" values are a toggle. This means that the first time a 1 is sent, it will cause the value on the pupper to change; a 0 needs to be sent before another 1 will cause a toggle.

    It is a good idea to use some sort of smoothing for lx, ly and rx to avoid abrupt stops.
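A minimal sketch of sending such a dictionary with UDPComms; the port number and field values here are illustrative assumptions, not taken from the project's configuration.

```python
import time
from UDPComms import Publisher

# The port is an assumption for illustration; use whatever port the pupper software listens on.
pub = Publisher(8830)

command = {
    "lx": 0.0, "ly": 0.4,      # walk forward slowly
    "rx": 0.0, "ry": 0.0,
    "dpadx": 0, "dpady": 0,
    "R1": 0, "L1": 0, "R2": 0, "L2": 0,
    "square": 0, "circle": 0, "triangle": 0, "x": 0,
    "message_rate": 20,
}

# Send at message_rate so the robot receives a steady stream of commands.
for _ in range(40):
    pub.send(command)
    time.sleep(1 / 20)
```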

## Keybinds

If running the controller manually, these are the controls:

| Keybind | Function |
| --- | --- |
| space | Toggle trot/rest |
| v | Toggle activation |
| w/s | Move forward/back |
| a/d | Turn left/right |
| q/e | Move left/right |
| x | Stop any sort of movement |

## Control sequence

The robot must first be activated; this will also trigger calibration if the pupper software was run with the --zero flag. Then it must be put in trotting mode, and only then can it be controlled with the other functions. A sketch of this sequence is shown below.
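An illustrative sketch of that sequence using the toggle semantics from the table above (activate with L1, start trotting with R1, then send movement); the port and timings are assumptions.

```python
import time
from UDPComms import Publisher

pub = Publisher(8830)  # assumed port, see the note above

FIELDS = ("lx", "ly", "rx", "ry", "dpadx", "dpady",
          "R1", "L1", "R2", "L2", "square", "circle", "triangle", "x")

def command(**overrides):
    """Build a full command dictionary with everything zeroed except the overrides."""
    msg = {f: 0 for f in FIELDS}
    msg["message_rate"] = 20
    msg.update(overrides)
    return msg

def pulse(field):
    # Toggles are edge triggered: send 1 once, then go back to 0.
    pub.send(command(**{field: 1}))
    time.sleep(0.5)
    pub.send(command())
    time.sleep(0.5)

pulse("L1")                        # 1. activate (calibrates if run with --zero)
pulse("R1")                        # 2. enter trotting mode
pub.send(command(ly=0.3))          # 3. only now do movement commands take effect
```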

    introduction.md

## Context

The Pupper project uses the Stanford Pupper 2.0, which does not currently have public documentation, but information on the original Pupper project can be found at Stanford Pupper. More detailed documentation on the original Pupper can be found in the Pupper Docs. This project explores topics related to fiducial detection, boundary generation, motion planning, controls, and hardware.

Pupper Robot

    planning.md

## Discretization

  • The discretization of the environment is done in /src/path_finder.py in the PathFinder class


  • The parameters for discretization are in the params.yaml file and affect the inflation of obstacles in the graph, the step size of the robot (the distance from one vertex to its immediate neighbors), the length and width of the graph to generate, as well as whether the graph search should explore nodes diagonally

  • The discretization happens in the explore method, which uses two serial, doubly nested lambda expressions to create a matrix of Point objects that gets converted into a matrix of Node objects containing point, distance, previous, and explored, which are the fields required to perform a graph search on this data (see the sketch below)
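A simplified sketch of what such a discretization might look like; the Point and Node fields follow the description above, but the construction here is illustrative rather than the project's actual explore method.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Point:
    x: float
    y: float

@dataclass
class Node:
    point: Point
    distance: float = float("inf")     # best known cost from the start node
    previous: Optional[Point] = None   # predecessor on the best path found so far
    explored: bool = False

def build_grid(width_m, length_m, step_m):
    """Build a matrix of Nodes covering a width x length area at the given step size."""
    cols = int(width_m / step_m) + 1
    rows = int(length_m / step_m) + 1
    return [[Node(Point(c * step_m, r * step_m)) for c in range(cols)]
            for r in range(rows)]

grid = build_grid(width_m=2.0, length_m=2.0, step_m=0.1)
```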

## Graph Search

  • The graph search happens in the PathFinder class in the solve method, which performs a Dijkstra shortest path search from the node corresponding to the agent location to the node nearest to the goal location (see the sketch below)

  • The algorithm operates the same as a standard shortest path search, but has several optimizations built in to account for the limited hardware of the Raspberry Pi
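For reference, a generic Dijkstra search over a dictionary-based graph; this is the textbook algorithm the description refers to, not the project's optimized solve method.

```python
import heapq
import itertools

def dijkstra(neighbors, start, goal):
    """neighbors: dict mapping a node to a list of (neighbor, edge_cost) pairs.
    Nodes only need to be hashable."""
    counter = itertools.count()          # tie-breaker so the heap never compares nodes
    dist = {start: 0.0}
    previous = {}
    queue = [(0.0, next(counter), start)]
    while queue:
        d, _, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                     # stale queue entry
        for nbr, cost in neighbors.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                previous[nbr] = node
                heapq.heappush(queue, (nd, next(counter), nbr))
    if goal not in dist:
        return []                        # goal unreachable
    path, node = [goal], goal
    while node != start:
        node = previous[node]
        path.append(node)
    return list(reversed(path))
```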

## Path Profiling

  • Path profiling is the process of converting the path, a list of Point objects, into a series of distances and headings for the controller to follow

  • The math for it is in src/geometry.py, which finds the angle and distance between two points (see the sketch below)
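The underlying math is just planar geometry; a small illustrative version (not the project's exact src/geometry.py) looks like this.

```python
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def heading(p1, p2):
    """Angle in radians from p1 to p2, measured from the +x axis."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0])

def profile(path):
    """Convert a list of (x, y) points into (distance, heading) segments."""
    return [(distance(a, b), heading(a, b)) for a, b in zip(path, path[1:])]

print(profile([(0, 0), (1, 0), (1, 1)]))  # [(1.0, 0.0), (1.0, 1.5707...)]
```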

## Potential Improvements

Many potential improvements exist to boost the performance, accuracy, and resolution of the planning module. Some ideas are:

    • Use dynamic programming to eliminate redundant in_range_of_boundary_quick checks for the same node

    • Implement a gradient based approach to converting configuration space into edge weights

    • Use a better shortest path search algorithm or a sampling-based approach so discretization is not necessary

    Example Map


testing.md

## Setup

  • We tested with 16 printed 0.03 meter fiducials and a floor sign to hold the goal fiducial

## Requirements

  • A Pupper 2.0 robot flashed with the low-level Teensy software for the motor controllers

    • A successful calibration and correctly configured motor IDs

    • The RaspberryPi and RaspberryPi camera running with the latest software

## Running

  • Running the test is simple: once you have the calibrated pupper robot operating, you can run python3 /src/main.py

## Unit Testing

  • A test for the boundary generation, discretization, and graph search can be run through python3 /src/path_test.py

  • This opens a pickled Environment object containing obstacle, agent, and goal data from a testing session and then performs a graph search using the PathFinder class (see the sketch below)
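A sketch of that pattern; the pickle filename and the PathFinder constructor/solve calls are hypothetical placeholders, not the project's exact names.

```python
import pickle

from src.path_finder import PathFinder

# "environment.pickle" is an illustrative filename for the saved testing-session data.
with open("environment.pickle", "rb") as f:
    env = pickle.load(f)  # Environment with obstacle, agent, and goal data

finder = PathFinder(env)   # hypothetical constructor call
path = finder.solve()      # run the graph search described above
print(path)
```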

    Example Robot Testing Configuration
    Testing Area

    boundary-generation.md

## Environment

  • Boundary generation occurs in the Environment class within src/boundary_detection.py

  • The class works by setting an ego agent (the pupper), adding each detected fiducial as an obstacle, and, if a fiducial with id = 0 was detected, adding a goal object to the class as well

  • Obstacle locations are updated if a fiducial with an existing id is detected, and obstacle locations can be cleared with clear_obstacles

  • Then a list of boundaries corresponding to each obstacle is created using the Minkowski Sum (https://en.wikipedia.org/wiki/Minkowski_addition) formula (see the sketch below)
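To illustrate the idea (a brute-force version, not the project's implementation): the Minkowski sum inflates each obstacle by the agent's footprint, so the agent can then be treated as a single point when planning.

```python
def minkowski_sum(obstacle_pts, agent_pts):
    """Brute-force Minkowski sum of two point sets (lists of (x, y) tuples).
    Give the agent points relative to its center, reflected through the origin,
    if you want the configuration-space obstacle."""
    return {(ox + ax, oy + ay) for ox, oy in obstacle_pts for ax, ay in agent_pts}

# A unit-square obstacle inflated by a small square agent footprint.
obstacle = [(0, 0), (1, 0), (1, 1), (0, 1)]
agent = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
print(sorted(minkowski_sum(obstacle, agent)))
```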

## Configuration Space

    • Configuration space is essentially a series of points that represent infeasible locations and rotations for a robot in an environment

    • We can use this along with discretization to generate a graph of all the feasible locations for the pupper robot in an environment with obstacles, assuming we have the models for the agent and the obstacles

  • Below are two example configuration spaces, shown in red, for an ego agent in green and an obstacle in blue; notice how the configuration space changes when the rotation of the agent changes

    Example Configuration Space 1
    Example Configuration Space 2