
testing.md

# Testing Area

## Setup

  • We tested with 16 printed 0.03 meter fiducials and a floor sign to hold the goal fiducial

## Requirements

  • A Pupper 2.0 robot flashed with the low-level Teensy software for the motor controllers

  • A successful calibration and correctly configured motor IDs

  • The RaspberryPi and RaspberryPi camera running the latest software

## Running

  • Running the test is simple: once the calibrated Pupper robot is operating, run python3 /src/main.py

## Unit Testing

  • A test for the boundary generation, discretization, and graph search can be run with python3 /src/path_test.py

  • This opens a pickled Environment object containing obstacle, agent, and goal data from a testing session and then performs a graph search using the PathFinder class, as sketched below
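A minimal sketch of that flow (the pickle file name and the PathFinder constructor signature are assumptions; the real script may differ):

```python
# Hedged sketch of what path_test.py does; the pickle file name and the
# PathFinder constructor signature are assumptions, not the real API.
import pickle

from boundary_detection import Environment  # needed so pickle can rebuild the object
from path_finder import PathFinder

with open("test_environment.pkl", "rb") as f:  # hypothetical file name
    env = pickle.load(f)  # Environment with obstacle, agent, and goal data

finder = PathFinder(env)  # assumed constructor
path = finder.solve()     # graph search described in planning.md
print(path)
```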

Example Robot Testing Configuration

boundary-generation.md

## Environment

  • Boundary generation occurs in the Environment class within src/boundary_detection.py

  • The class works by setting an ego agent (the Pupper), adding each detected fiducial as an obstacle, and, if a fiducial with id = 0 is detected, adding a goal object to the class as well

  • Obstacle locations are updated when a fiducial with an existing id is detected, and obstacle locations can be cleared with clear_obstacles

  • Then a list of boundaries corresponding to each obstacle is created using the [Minkowski sum](https://en.wikipedia.org/wiki/Minkowski_addition) formula, as sketched below
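A rough illustration of that inflation, assuming models are stored as (N, 2) NumPy point arrays; the real implementation in src/boundary_detection.py may differ:

```python
# Illustrative Minkowski-sum boundary inflation; not the project's code.
import numpy as np

def minkowski_sum(obstacle_pts: np.ndarray, agent_pts: np.ndarray) -> np.ndarray:
    """Return all pairwise sums of obstacle points and reflected agent points.

    Reflecting the agent about its center (here (0, 0)) before summing
    yields the configuration-space obstacle for one fixed agent rotation;
    rotating the agent first gives the rotation-dependent spaces shown below.
    """
    reflected = -agent_pts
    return (obstacle_pts[:, None, :] + reflected[None, :, :]).reshape(-1, 2)
```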

## Configuration Space

  • Configuration space is essentially a set of points that represents the infeasible locations and rotations for a robot in an environment

  • We can use this along with discretization to generate a graph of all the feasible locations for the pupper robot in an environment with obstacles, assuming we have the models for the agent and the obstacles

  • Below are two example configuration spaces, with the configuration space shown in red, the ego agent in green, and the obstacle in blue; notice how the configuration space changes when the rotation of the agent changes

Example Configuration Space 1
Example Configuration Space 2


hardware.md

The robot was donated by the Hands-On Robotics Initiative. Due to time constraints and some delays, a prebuilt robot was delivered instead of a parts kit. The build instructions can be found in this Google Doc, which also includes other relevant instructions referenced later, such as motor calibration, software installation, and run instructions.

## Hardware Overview


## Default hardware (robot)

The robot consists of 12 C610 motor controllers, 12 M2006 motors, a Teensy board, cables and the chassis.

## Additional hardware

  • Raspberry Pi

  • Raspberry Pi Camera

  • USB Battery Pack for Raspberry Pi

  • Keyboard (for emergency stop)

  • 6S LiPo Battery 22.2V

## Battery Charging Settings

The battery charger in the lab supports charging the 6S LiPo battery. Settings can be found below:

| Setting      | Value    |
| ------------ | -------- |
| Chemistry    | LiPo     |
| Voltage      | 22 V     |
| Capacity     | 1300 mAh |
| Charge Rate  | 5C       |
| Series Count | 6S       |

The batteries took around 1 hour to charge and last around 2-3 hours per charge.

The USB battery pack used was a generic one that provided enough voltage for the Pi. These are usually battery packs that support fast-charge technology for smartphones. An appropriate cable is needed (e.g. USB 3.0 to Micro-USB, or similar).

## Calibration Tips

  • Motor calibration instructions can be found in the doc linked earlier. The robot's casing may have to be opened to access some of the motor controllers.

  • Instructions for leg calibration can be found here. Best results were achieved by supporting the robot with a box and adding padding (made of paper towels and tape) between the legs and the box to get the desired motor angles.

Pupper Calibration Setup

fiducial-detection.md

Example of Fiducial Detection

## Raspicam

  • We use a RaspberryPi v1 camera; its full documentation, including hardware and software specs, can be found in the Raspicam Docs

  • Additional information on how to install the Raspicam hardware can be found at Raspicam Installation

## AprilTags

  • We use the pupil_apriltags package to detect fiducials in the Raspicam image; full documentation for the AprilTags can be found at Pupil AprilTags

  • The package works by taking the camera parameters, the fiducial size and family, and additional fiducial detection parameters to create a detector class, which contains a detect method that takes a camera frame as input and outputs a list of detected fiducials, as in the sketch below
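A rough sketch of that detection flow; the tag family, camera parameters, and tag size below are placeholders rather than the project's tuned values from params.yaml:

```python
# Sketch of pupil_apriltags usage; parameter values are placeholders.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # fiducial family (placeholder)

frame = cv2.imread("frame.png")                 # one camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detector expects grayscale

fx = fy = 610.0        # placeholder focal lengths in pixels
cx, cy = 320.0, 240.0  # placeholder image center

detections = detector.detect(
    gray,
    estimate_tag_pose=True,
    camera_params=(fx, fy, cx, cy),  # focal lengths and center pixel
    tag_size=0.03,                   # printed fiducial size in meters
)
for d in detections:
    print(d.tag_id, d.pose_t)  # tag id and translation to the tag
```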

## Parameters and Tuning

  • The params.yaml file contains all of the fiducial vision parameters (an illustrative fragment is sketched after this list)

  • The camera parameters used in the params.yaml file were found online in the Raspicam docs

  • The size of the fiducial in meters can be adjusted by printing new fiducials at a larger size
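For orientation, a params.yaml fragment might look roughly like this; the key names and values are hypothetical, not copied from the project's file:

```yaml
# Hypothetical shape of the fiducial vision parameters
fiducial:
  tag_family: tag36h11
  tag_size: 0.03     # meters, matches the printed fiducials
camera:
  fx: 610.0          # focal lengths in pixels
  fy: 610.0
  cx: 320.0          # center pixel of the image
  cy: 240.0
```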

## Transformation Math

  • src/transforms.py and src/geometry.py contain the methods used for transforming the fiducial detection results into easier-to-work-with translations and rotations

  • Parameters in params.yaml are also used to slightly adjust the detection results after transformation

To print new AprilTags, follow the instructions at AprilTag Generation and Premade AprilTags.

Fiducial Tag Families

software-overview.md

The code can be found at the Pupper GitHub Repository

## Directory

├───models
│   └───agent.py
│   └───goal.py
│   └───obstacle.py
├───plots
│   └───map.png
├───src
│   └───boundary_detection.py
│   └───controller.py
│   └───fiducial_vision.py
│   └───geometry.py
│   └───main.py
│   └───node.py
│   └───path_finder.py
│   └───boundary_profiler.py
│   └───transforms.py
│   └───path_test.py
│   └───viz.py
├───params.yaml

## Components Overview

## Models

  • The models can be found at /models/

  • The obstacle, agent, and goal models are of type Shape, which can be found within src/boundary_detection.py

  • The parameters of the models can be found within params.yaml

  • The goal model is a shape containing only a single point

  • The obstacle and agent models are defined by 4 corner points, which are interpolated to create a series of points defining the boundary of the model (see the sketch after this list)

  • Each model also has a center, which is its location relative to the Pupper
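A hedged sketch of the corner-interpolation idea; the real Shape class in src/boundary_detection.py may differ:

```python
# Illustrative boundary interpolation for a 4-corner model.
import numpy as np

def interpolate_boundary(corners: np.ndarray, pts_per_edge: int = 10) -> np.ndarray:
    """Walk each edge of the shape and linearly interpolate points along it."""
    boundary = []
    n = len(corners)  # 4 for the obstacle and agent models
    for i in range(n):
        a, b = corners[i], corners[(i + 1) % n]
        for t in np.linspace(0.0, 1.0, pts_per_edge, endpoint=False):
            boundary.append(a + t * (b - a))
    return np.array(boundary)
```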

## Computer Vision

  • The computer vision module can be found at /src/fiducial_vision.py

  • The fiducial and computer vision package requires numerous parameters to work correctly; these include the fiducial tag size, lens size, and center pixel of the image

  • The module itself contains the Vision class, which provides a capture_continuous method that returns a generator yielding the results of the fiducial detection module on frames of the RaspberryPi camera (a usage sketch follows this list)
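A hedged usage sketch of that generator; the constructor and its arguments are assumptions, not the module's confirmed API:

```python
# Hypothetical usage of the Vision class described above.
from fiducial_vision import Vision

vision = Vision()  # presumably configured from params.yaml
for detections in vision.capture_continuous():
    # each iteration corresponds to one RaspberryPi camera frame
    for detection in detections:
        print(detection)
```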

## Boundary Generation and Navigation

  • The modules relevant to boundary generation and planning are src/boundary_detection.py, src/path_finder.py, /src/path_profiler.py, and path_test.py

  • Boundary generation works by taking in the detected positions and rotations of fiducials and creating obstacle classes to represent each fiducial. Each obstacle's points (of type Point[]) are then transformed to the corresponding fiducial's location by src/transforms.py. The Pupper robot model (models/agent.py), which is at (0, 0), and each obstacle are then used to calculate the configuration space for the robot using the [Minkowski sum](https://en.wikipedia.org/wiki/Minkowski_addition)

  • Also, the models/goal.py class is used to represent the goal of the Pupper, which corresponds to the fiducial with id = 0

  • Each point in the resulting configuration space is used to generate a graph of the area; vertices of the graph close to points in the configuration space are removed so that, when a shortest path search is performed, the resulting path only includes valid positions from the robot's current position to the goal

  • Finally, the path is interpolated and converted into an array of distances to travel and the angles at which to travel, which is then converted into command interface commands based on the velocity of the robot

## Visualization

  • The visualization class in src/viz.py uses matplotlib to generate a plot of the agent model, obstacles, obstacle boundaries, position of the goal, graph nodes, and the path of the robot

## Main

  • The program can be run simply by navigating to the root of the repository and then running python3 main.py


software-setup.md

## Pupper Base Software Setup

Instructions for the base software setup can be found in the doc. See the notes section for project-specific instructions.

## Notes on the setup doc

  • 3.c: en1 might not exist on the Pi. To find out which network interface to use, run sudo ifconfig and look at the first interface it lists. For this project, it was eth0.

  • 4.d.ii: Although it says Mac/Linux, this is not always the case. If ls /dev | grep tty.usbmodem shows no results, try ls /dev | grep ttyACM. For this project, it was ttyACM0.

## Running base software

Follow the instructions here. The keyboard program mentioned can be found here.

## Setting up this project

Install the dependencies using pip (an example invocation follows the list):

## Dependencies

  • picamera

  • pupil_apriltags

  • tqdm

  • scipy

  • UDPComms

  • pyyaml

  • opencv or cv2

  • argparse

  • pickle

  • matplotlib
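One possible install command, with the caveats that argparse and pickle ship with Python itself, opencv is published on pip as opencv-python, and UDPComms may need to be installed from the Stanford Robotics Club GitHub repository rather than PyPI:

```
pip3 install picamera pupil_apriltags tqdm scipy pyyaml opencv-python matplotlib
```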

## Running

The full software can be run using python main.py, or the controller can be run separately using python3 src/controller.py.


planning.md

Example Map

## Discretization

  • The discretization of the environment is done in /src/path_finder.py in the PathFinder class

  • The parameters for discretization are in the params.yaml file and affect the inflation of obstacles in the graph, the step size of the robot (the distance from one vertex to its immediate neighbors), the length and width of the graph to generate, as well as whether the graph search should explore nodes diagonally

  • The discretization happens in the explore method, which uses two serial, doubly nested lambda expressions to create a matrix of Point objects; this is converted into a matrix of Node objects containing point, distance, previous, and explored, the fields required to perform a graph search on this data (a minimal version is sketched below)
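A minimal sketch of that discretization, using the Node fields named above; everything else is illustrative:

```python
# Illustrative grid discretization into searchable nodes.
import math
from dataclasses import dataclass
from typing import Optional, Tuple

Point = Tuple[float, float]

@dataclass
class Node:
    point: Point
    distance: float = math.inf         # best known cost from the start
    previous: Optional["Node"] = None  # back-pointer for path recovery
    explored: bool = False

def make_grid(length: float, width: float, step: float) -> list:
    """Build a matrix of Node objects spaced one step apart."""
    rows, cols = int(length / step), int(width / step)
    return [[Node((c * step, r * step)) for c in range(cols)]
            for r in range(rows)]
```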

## Graph Search

  • The graph search happens in the PathFinder class in the solve method, which performs a Dijkstra shortest-path search from the node corresponding to the agent location to the node nearest to the goal location (a minimal version is sketched below)

  • The algorithm operates the same as a standard shortest-path search but has several optimizations built in to account for the limited hardware of the RaspberryPi
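A minimal Dijkstra over the Node grid sketched above; the project's solve method adds RaspberryPi-minded optimizations not shown here:

```python
# Illustrative Dijkstra search; neighbors(n) is assumed to yield
# (edge_cost, Node) pairs for n's valid grid neighbors.
import heapq

def dijkstra(start, goal, neighbors):
    start.distance = 0.0
    queue = [(0.0, id(start), start)]  # id() breaks ties between equal costs
    while queue:
        dist, _, node = heapq.heappop(queue)
        if node.explored:
            continue
        node.explored = True
        if node is goal:
            break
        for cost, nxt in neighbors(node):
            if dist + cost < nxt.distance:
                nxt.distance = dist + cost
                nxt.previous = node
                heapq.heappush(queue, (nxt.distance, id(nxt), nxt))
    path, cur = [], goal  # walk back-pointers to recover the path
    while cur is not None:
        path.append(cur.point)
        cur = cur.previous
    return path[::-1]
```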

## Path Profiling

  • Path profiling is the process of converting the path, a list of Point objects, into a series of distances and headings for the controller to follow

  • The math for it is in src/geometry.py, which finds the angle and distance between two points, as sketched below
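A small sketch of that per-segment math; the actual helpers in src/geometry.py may differ:

```python
# Distance and heading between two consecutive path points.
import math

def segment(p1, p2):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)  # radians, measured from the x-axis
    return distance, heading
```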

## Potential Improvements

Many potential improvements exist to boost the performance, accuracy, and resolution of the planning module. Some ideas are:

  • Use dynamic programming to eliminate redundant in_range_of_boundary_quick checks for the same node

  • Implement a gradient-based approach to converting configuration space into edge weights

  • Use a better shortest-path search algorithm or a sampling-based approach so discretization is not necessary

introduction.md

## Context

The Pupper project uses the Stanford Pupper 2.0, which does not currently have public documentation, but information on the original Pupper project can be found at Stanford Pupper. More detailed documentation on the original Pupper can be found in the Pupper Docs. This project explores topics related to fiducial detection, boundary generation, motion planning, controls, and hardware.

Pupper Robot

controls.md

The controller.py provides an API to control the robot by emulating a PS4 controller, similar to this one. The main differences are that some keybinds are different and that it can also be called from code.

## The UDP message

This is a breakdown of the fields of the dictionary that is sent through UDPComms to the robot.

| Field        | Value  | Comments                        |
| ------------ | ------ | ------------------------------- |
| R2           | 0      | Nothing                         |
| R1           | 1 or 0 | Toggle trot/rest                |
| L1           | 1 or 0 | Toggle activation/deactivation  |
| L2           | 0      | Nothing                         |
| dpadx        | [-1,1] | Move body up or down            |
| dpady        | [-1,1] | Roll left/right (roll)          |
| lx           | [-1,1] | Walk left or right              |
| ly           | [-1,1] | Walk forwards or backwards      |
| rx           | [-1,1] | Turn left or right (yaw)        |
| ry           | [-1,1] | Tilt up or down (pitch)         |
| square       | 0      | Nothing                         |
| circle       | 0      | Nothing                         |
| triangle     | 0      | Nothing                         |
| x            | 0      | Nothing                         |
| message_rate | 20     | Rate at which messages are sent |

Note: a value of [-1,1] means any value between -1 and 1, and values of 1 or 0 are a toggle. This means that the first time a 1 is sent, it will cause the value on the pupper to change; a 0 needs to be sent before another 1 will cause a toggle.

It is a good idea to use some sort of smoothing for lx, ly, and rx to avoid abrupt stops.
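For example, a hedged sketch of publishing one command dictionary over UDPComms; the port number is a placeholder, not necessarily the pupper's:

```python
# Illustrative UDPComms publisher; port and field values are placeholders.
import time
from UDPComms import Publisher

pub = Publisher(8830)  # placeholder port

msg = {
    "lx": 0.0, "ly": 0.5,  # walk forward at half speed
    "rx": 0.0, "ry": 0.0,
    "dpadx": 0, "dpady": 0,
    "L1": 0, "R1": 0, "L2": 0, "R2": 0,
    "square": 0, "circle": 0, "triangle": 0, "x": 0,
    "message_rate": 20,
}
for _ in range(20):  # one second of messages at message_rate = 20
    pub.send(msg)
    time.sleep(1 / 20)
```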

## Keybinds

If running the controller manually, these are the controls:

| Keybind | Function                  |
| ------- | ------------------------- |
| space   | Toggle trot/rest          |
| v       | Toggle activation         |
| w/s     | Move forward/back         |
| a/d     | Turn left/right           |
| q/e     | Move left/right           |
| x       | Stop any sort of movement |

## Control sequence

The robot must first be activated; this will also trigger calibration if the pupper software was run with the --zero flag. It must then be put in trotting mode, and only then can it be controlled with the other functions.
