Examine Linorobot again. You will see very detailed instructions for building a robot, both the hardware and the software. In our world, bullet, platform, cat are all fully compliant Linorobot robots.
For the SBC we use either a Raspberry Pi 3B+ or a Raspberry Pi 4
Lidar is connected via USB to the Raspberry Pi
The Microcontroller is connected via USB to the Raspberry Pi
For the microcontroller we use either a Teensy 3.2 or a Teensy 4.x (check this)
The motor controller is connected to the Teensy via
The IMU is connected to the Teensy via I2C bus
The SBC is running Ubuntu 20.04 and ROS 1. It is a standard install which we get from the Linorobot installation; our own ROS code and scripts are then added to it, and certain standard Linorobot nodes are launched.
The Teensy code is provided by Linorobot. We have tweaked it in small ways. See How To for information on rebuilding it and installing the software. This software has the following jobs:
Read the encoders to determine the apparent speed and direction of the robot
Subscribe to cmd_vel to determine the desired speed and direction
Use a PID controller to drive the motors to meet the desired speed and direction (a sketch of this loop follows this list)
Publish the actual speed and direction as computed by the encoders as ODOM_RAW
Read the IMU data (via the I2C bus) and publish it as IMU_RAW
Read other I2C sensors and actuators (coming soon!)
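To make the velocity-control job above concrete, here is a minimal Python sketch of a single-wheel PID speed loop. It is only an illustration of the idea, not the actual Teensy firmware (which is C++ and comes from Linorobot); the gains, loop rate, and the encoder/motor helpers are hypothetical.

```python
import time

# Hypothetical gains; the real values live in the Teensy firmware's configuration.
KP, KI, KD = 0.6, 0.3, 0.05

target_speed = 0.5            # m/s, as requested via cmd_vel
dt = 0.02                     # 50 Hz control loop (assumed rate)
integral, prev_error = 0.0, 0.0

def read_encoder_speed():     # placeholder for the real encoder reading
    return 0.0

def set_motor_pwm(pwm):       # placeholder for the real motor driver call
    pass

while True:
    measured = read_encoder_speed()           # "apparent" speed from the encoder
    error = target_speed - measured
    integral += error * dt
    derivative = (error - prev_error) / dt
    set_motor_pwm(KP * error + KI * integral + KD * derivative)
    prev_error = error
    time.sleep(dt)
```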
Consistent across Platform 1, 2 and 3?
Raspberry Pi 4b
Arduino Teensy 3.1
MPU 9250 IMU
YDLidar X4
Linorobot is a software package for building your own robots. We have used their software and instructions to construct our own robots and configure the software for them. The above link is a major resource for you. In this section we will explain things that are specific to Brandeis, but we won't repeat what is already covered there.
MiniRover is our modification of the GoPiGo3 robot. It comes with software, a VPN and all kinds of fancy things which is why I decided to give it a special name. This page is under construction as I troubleshoot and make notes about things. I expect feedback from everyone where the instructions don't meet reality!
This section will help you get your robot set up to go. Note that if you received this from the Autonomous Robotics Lab then part of these steps may already be done.
See the important information about turning the robot on and off here: Using the Robot.
If your robot was set up by us, then you should skip this step!
You are going to load software onto the microSD card, which will wipe out what you loaded onto it according to the earlier instructions. We provide a disk image for you to copy onto it. It changes from time to time. This link is to a Google folder that will contain the versions; you should use the latest one in that folder. I recommend you use the app "Balena Etcher" to copy the image onto your microSD card.
To create a new MicroSD from the old one, see Backup Raspi Card on MacOS
Now we are facing a dilemma. We need to get the robot onto your network. There are several ways of doing this. Below are three specific scenarios that we support:
A usb keyboard, usb mouse and a HDMI display and eduroam wifi access
OR A network cable and non-eduroam wifi access
A MiniRover v2 and non-eduroam wifi access. It should be labeled with a name.
We will tell you what the name of your robot is. Substitute it below wherever you see gopigo3
Locate a wired network connection (on your router for example) and use a network cable to connect your robot to the network
Now turn the power on (see Using MiniRover for instructions). It should boot up into Linux. But you won't know this because there's no keyboard or screen!
Using your own computer or development environment that is attached to the network, check that you can see the robot. This could be a Linux computer where you program, or a browser-based web development environment. Broadly speaking, it's the computer other than the Raspberry Pi on the robot (we will refer to this as the remote computer from now on).
Once this works you know that you have access to the robot from your remote computer. Make note of the robot's IP address. It will likely look like 192.168.1.xx, but not necessarily.
Now use ssh (secure shell) to get to the robot from the remote computer: ssh pi@gopigo3.local. It will ask you for the password for the account pi; it is raspberry. Once you get in you are halfway there!
Now we want to get your robot onto your local wifi network. You need to know the wifi network's name and password. On the robot command line type:
Where <SSID> is the network's name (you can surround it in quotes if there are spaces or other special characters) and <PASSWORD> is the network's password.
Next shutdown the robot nicely (see above), disconnect the network cable, and start the robot up nicely again (see above.)
Once it's back, follow the same steps to ssh pi@gopigo3.local and enter the password raspberry, and you should have a wireless-enabled robot.
Connect your mouse, keyboard and screen to the Raspberry Pi. You will find several free USB ports and an HDMI port. Look closely; they are all there.
Boot up the Raspberry Pi and wait until it is running and you see the desktop
Locate the network settings dialog box by clicking the network icon on the top right
Add eduroam as a network, and fill it in as follows:
Finally, shut down the robot, unplug the keyboard, mouse and monitor, and reboot
Once it's back, ssh pi@gopigo3.local and enter the password raspberry, and you should have a wireless-enabled robot.
You've received a fully set up and tested MiniRover v2, and you know its name. The following steps are written as if the robot is called mr1. Scenario 3 robots will have the account ubuntu with the password ubuntu (different from scenarios 1 and 2).
Now turn the power on (see Using MiniRover for instructions). It should boot up into Linux. But you won't know this because there's no keyboard or screen!
The MiniRover will come up as a wifi access point. The SSID is mr1XXXX, where XXXX is part of the MAC address. The wifi password is robotseverywhere. Connect to it via your computer's wifi.
Under certain circumstances gopigo3.local will be found. If not, you need to find out the IP address of your robot when it is on wifi (not wired). If it is found, then this should work:
Rename the hostname of your robot. It comes to you called "gopigo3", but they are all called that and this can cause confusion. Let's say you want to call your robot pitosalas. Oddly, you have to change it in two places. Here's how:
Now the robot is called pitosalas and at least it will be different from other miniRovers.
On eduroam it is often automatically known as gopigo3.dyn.brandeis.edu. So after rebooting it, check whether this works. It's not guaranteed, but often it works just fine.
You can run everything on the robot itself, but it doesn't have a screen. You could plug in an external screen, keyboard and mouse and have a complete ROS-enabled computer (the robot) at your fingertips. But it would be really slow. So instead we are going to add the robot to a "VPN" (virtual private network), which will allow you to work with it remotely.
Prepare the VPN configuration by:
Setup VPN
myvpnip should now return that same IP address.
We have installed some Brandeis specific software which should be updated:
And then edit the new ~/.bashrc according to the instructions in the file.
Check that you have ~/rosutils directory on your cloud desktop. If not:
Edit .bashrc according to the instructions in it
Author: Pito Salas
An excellent and very detailed troubleshooting guide for the Raspberry Pi
Blinking green is the “starting up” indication.
Solid green is the “ready” indication.
Solid yellow is the “low battery” caution indication.
Solid red is the “low battery” warning.
Blinking red is the “shutting down” indication.
Blinking purple is (AFAIK) the “I don’t know what is going on” indication, but I could be wrong.
But, note that all of these indications are only valid when using a Dexter Industries / Modular Robotics operating system. (i.e. Raspbian for Robots, GoPiGo OS, or Dexter OS). It will continue to blink green when using any other O/S. The Minirover configuration is Ubuntu! Also see: https://www.dexterindustries.com/GoPiGo/get-started-with-the-gopigo3-raspberry-pi-robot/2-connect-to-the-gopigo-3/
To Change the default python on ubuntu (assuming you want it to be python3.4)
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.4 1
Over the years we've built up a varied collection of robots. Some are no longer in use and several types still are. This section contains a lot of the robot specific details. Today the main robots we use are the TurtleBot3 from Robotis and the home grown Platform line.
These have been our workhorses. They are robust and don't seem to break much, probably because they are pretty much standardized from the factory.
These robots are homegrown; the hardware was designed and manufactured by Charlie Squires, and the software stack is based on the Teensy.
When a new robot is built, there are a number of steps needed to see that it is properly configured.
Encoders need to be connected correctly. The left and right can be swapped, and they may also be backwards. These issues lead to very crazy and incomprehensible behavior, so it's best to check them first if your new robot is acting weird.
Motors also need to be connected correctly. They can be left-right swapped or forward-reverse swapped. It is again worth testing them separately. The kind of incomprehensible behavior from this kind of problem is totally different from the one you get when the encoders are backwards or swapped.
It is very important that you know what the front of the robot is. Otherwise nothing makes sense. Our robots have the big wheels in front and the casters in back.
The IMU is connected via an I2C Quick Connector to the Teensy. We have seen problems where the IMU doesn't work that were caused by the placement or length of the wire, so keep an eye out for that case.
The most common problem with a new build is that the pin numbers are incorrect. You can find the hardware configuration in linorobot/teensy/firmware/lib/config/lino_base.h. In that same directory you will find other versions of that file for our different models. If you are qualifying a new model, then you should add a copy there.
There are numerous variables in this file. The key ones for now are the motor and encoder pin definitions. MOTOR1 is the front left and MOTOR2 is the front right. There are several possible misconfigurations: basically all the permutations of left/right and forward/backward, on both the motors and the encoders.
If all wheels are spinning but the robot spins in circles, goes backward, is unresponsive to cmd_vel commands, or in general acts crazy, your first hypothesis should be that one or more of the pins above are incorrect or swapped.
If one of the wheels doesn't spin at all then it's probably an electrical connection to the motor. If both wheels don't spin then it's probably a power/battery issue.
To check the encoders, disconnect the power from the motors, leaving just the encoders. Then when you roslaunch linorobot minimal.launch, it will print four numbers over and over again, which are the readings of the four potential encoders (two in front and two in the back).
As I had only two wheels, I just got two non-zero numbers. Put the robot on the floor and push it gently forward. Those two numbers will change. They should both go up at approximately the same rate, even though they are not the same number. In my case, they went in opposite directions. Note the one that is going down. The left wheel is encoder1 and the right wheel is encoder2. In the setup file I switched the pins for the corresponding motor and that solved it.
(Note: This is very obvious but I got it wrong: The caster is on the BACK of the robot not the front. You need to know what is the front of the robot to know which wheel is left or right.)
For me, this didn't quite solve it. So the next thing to check was the motor itself. I changed the Arduino code (.ino) so that instead of sending the calculated numbers to the motors, I sent 100 to the left motor and 500 to the right motor. This way I could tell whether the left motor turned slower than the right. If not, I had to switch those pins. Next I had to verify that both motors turned such that the effect was forward motion of the robot. That was incorrect for me too, and was also fixed by switching pin numbers.
Next came the PID settings. The instructions are good in terms of monitoring the result of the PID calculations, but not as far as what numbers are right. There are an infinite number of references on the web about tuning PID, and they are all vague and different.
Here again I made a small change to the Arduino code. I had it print out the error between the desired and actual rate of the left and right wheels. If things are working like they should, that error starts at zero, and when you give the robot a command it temporarily moves away from zero and then returns nicely to zero. I don't know yet what "technique" I used, nor whether I have the right parameters yet. But at least the robot is behaving better.
The controller.py provides an API to control the robot by emulating a PS4 controller, similar to this. The main difference is that some keybinds are different, but it can also be called from code.
This is a breakdown of the fields of the dictionary that is sent through UDPComms to the robot.
| Field | Values | Description |
| --- | --- | --- |
| lx | [-1,1] | Walk left or right |
| ly | [-1,1] | Walk forward or backward |
| rx | [-1,1] | Turn left or right (yaw) |
| ry | [-1,1] | Tilt up or down (pitch) |
| L2 | 0 | Nothing |
| R2 | 0 | Nothing |
| R1 | 1 or 0 | Toggle trot/rest |
| L1 | 1 or 0 | Toggle activation/deactivation |
| dpadx | [-1,1] | Move body up or down |
| dpady | [-1,1] | Roll left/right (roll) |
| square | 0 | Nothing |
| circle | 0 | Nothing |
| triangle | 0 | Nothing |
| x | 0 | Nothing |
| message_rate | 20 | Rate at which messages are sent |
Note: values [-1,1] means any value between -1 and 1, and values 1 or 0 are a toggle. This means that the first time 1 is sent it will cause the value on the pupper to change; a 0 needs to be sent before another 1 will cause a toggle.
It is a good idea to use some sort of smoothing for lx, ly and rx to avoid abrupt stops.
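As an illustration of calling the controller from code and smoothing those fields, here is a minimal hedged sketch that publishes a command dictionary over UDPComms. The UDP port number and the starting values are assumptions; the real field handling lives in controller.py.

```python
import time
from UDPComms import Publisher

pub = Publisher(8830)  # assumed joystick command port; check controller.py

# Start from an all-zero command dictionary with the fields described above.
cmd = {"lx": 0.0, "ly": 0.0, "rx": 0.0, "ry": 0.0,
       "L2": 0, "R2": 0, "R1": 0, "L1": 0,
       "dpadx": 0.0, "dpady": 0.0,
       "square": 0, "circle": 0, "triangle": 0, "x": 0,
       "message_rate": 20}

def smooth(current, target, alpha=0.2):
    """Exponential smoothing so lx/ly/rx don't change abruptly."""
    return (1 - alpha) * current + alpha * target

target_ly = 0.5          # desired forward walking speed
for _ in range(100):     # send at roughly message_rate Hz
    cmd["ly"] = smooth(cmd["ly"], target_ly)
    pub.send(cmd)
    time.sleep(1.0 / cmd["message_rate"])
```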
If running the controller manually, these are the controls:
| Key | Action |
| --- | --- |
| space | Toggle trot/rest |
| v | Toggle activation |
| w/s | Move forward/back |
| a/d | Turn left/right |
| q/e | Move left/right |
| x | Stop any sort of movement |
The robot must first be activated; this will also trigger calibration if the pupper software was run with the --zero flag. Then it must be in trotting mode, and only then can it be controlled with the other functions.
We recommend that you start with a GoPiGo3 Basic Kit. Build it following the instructions. There is a lot written and a lot to google about it. Here are some useful links:
Dexter Forum - where you can get many questions answered
Build your GoPiGo3 - where you find step by step instructions.
MiniRover Purchase Program - instructions for students taking Cosi119a at Brandeis University to order a miniRover.
In addition to what you received with the Basic Kit you will need the following to get through the instructions.
A Raspberry Pi 3+
A battery pack (with Y-cable and charger)
A MicroSD card
A Lidar
A very short USB cable
The battery pack is different from what the instructions talk about. The Y-cable is used to connect the barrel connector of the battery pack to the corresponding battery connector on the "Red Board", which in turn powers the Raspberry Pi.
The robot will come with a microSD card with the dexter software. There is some very very simple calibration that you need to do.
The battery pack is specific in terms of voltage and capacity. Don't substitute a different one.
Note that the battery pack needs to be in the "on" position in order for the charger to do anything. And obviously it has to be on for the robot to work.
It also comes with a charger and a Y-cable. The Y-cable is about 10" long and has three barrel connectors in a Y-configuration.
We use only two of the 3 ends of the Y-connector. One goes into the barrel connector on the red board, and one goes into the barrel connector on the battery pack.
We then connect the very short USB cable with one end in the battery pack and the other end in the "power" connection of the Lidar board. The Lidar board is tiny, about 1.5" square. On one side you have the funny Lidar wires and on the other side you have two micro USB connectors. If you look very carefully, one is marked data and one is marked power. The one marked data is connected with a short USB cable to one of the USB connectors of the Pi. The one marked power is connected with another short USB cable to the battery pack.
You will have bought the camera separately, but the instructions explain how to mount it on the robot. Make sure it is centered in all three dimensions. A good spot is on the front of the top plexiglass part. It can be done without any extra parts, but Dexter has this bracket which works nicely too.
The Lidar needs to be mounted on the top plexiglass part. You will need to drill two holes. Make sure that the Lidar is exactly centered and pointed forward. As far as the drilling: I kept it to a minimum and drilled only two holes. I don't think it matters much, as long as the Lidar is straight and centered. Note that the narrower part of the Lidar is the front. You want it centered and facing forward (which is on the opposite side from the ball caster). I drilled holes for the two posts at the back of the Lidar (the wider part). I don't think it matters much though; that was what was easiest for me.
It is important that you follow a rigid procedure when turning the robot on and off.
Assuming it is totally off:
Make sure the battery pack is correctly connected to the robot.
Switch on the battery pack
Click the micro button on the red board
Look at the Raspberry Pi (not the "red board"). You will see tiny red and green LEDs blinking. Wait until the green one has settled down to slow flickering.
To turn it off: from your ssh command line on the remote computer, type sudo shutdown now
Once the Raspberry Pi has stopped blinking you can turn off the power switch on the battery pack.
The battery pack came with a charger, which has a light that is red while charging and green when fully charged. Note that the battery will not charge when its switch is set to off.
We've implemented a very simple tool to set up the IP addresses correctly. It will be changing as I figure out how to make it better. So for now...
If you have a cloud desktop and want to run simulations without an actual miniRover, run rset cloud
If you have a cloud desktop and a real robot, run rset robot on your cloud desktop and rset pi on the actual miniRover (over ssh)
If you have a local docker-based desktop, run rset docker there.
rset by itself displays the current status
(I know a combination is missing and plan a revision of this)
Note that all we run on the MiniRover itself are roscore and the nodes needed for the motors, lidar and camera. Everything else runs on your "remote". The following commands are to be run on the MiniRover from ~/catkin_ws
Note that this includes all flavors: cloud-based, local docker-based, and GPU-based browser desktops. If you just want to use the simulators on their own and are not using an actual miniRover, then rset cloud is enough. At that point you can run your ROS programs.
There are numerous scripts, programs and launch files that are preinstalled on your ROS web desktop. I will document only some of them here, but you can look around and find more that are interesting. All of them from the book are here. I have not renamed any of them for that reason.
In order to facilitate working in all the combinations of environments we have these commands:
rset pi - declare that this is the raspberry pi
rset cloud - declare that this is a cloud desktop working with sim
rset robot - declare that this is a cloud desktop working with a real robot
rset - display settings
There are a bunch of handy aliases:
myip - this computer's regular local IP
myvpnip - this computer's VPN IP, if it has one
stopnow - immediately stop the robot
teleop - run a simple teleop tool
ru - connect to rosutils
cm - catkin_make
cw - cd ~/catkin_ws
cs - cd ~/catkin_ws/src
miniRover: hostname gopigo3, default account pi, default password raspberry
Cloud or Docker Desktop: default password dev@ros, desktop: http://vnc..ros.campusrover.org, vscode: http://code..ros.campusrover.org
ROS_MASTER_URI=http://100.94.206.80:11311 (example!) should always be set to the computer where roscore is running. If you are using a physical robot, then roscore runs on the robot itself. If you are in the web desktop and working just with simulation, then roscore runs there.
On the robot: ROS_MASTER_URI = robot's own IP address, ROS_IP = robot's own IP address
On the remote computer: ROS_MASTER_URI = robot's IP address, ROS_IP = remote computer's own IP address
These IP addresses are on different networks and cannot reach each other. So instead we've created what is called a "virtual private network" that connects them together. Both your robot and your cloud desktop have an alternate IP address which they can both see.
myip returns your local IP address; myvpnip returns your VPN IP address (if you have one).
Boundary generation occurs in the environment class within src/boundary_detection.py. The class works by setting an ego agent (the pupper), adding each detected fiducial as an obstacle, and, if a fiducial with id = 0 was detected, adding a goal object to the class as well.
Obstacle locations are updated if a fiducial with an existing id is detected, and obstacle locations can be cleared with clear_obstacles. Then a list of boundaries corresponding to each obstacle is created using the [Minkowski Sum](https://en.wikipedia.org/wiki/Minkowski_addition) formula.
Configuration space is essentially a set of points that represent infeasible locations and rotations for a robot in an environment. We can use this, along with discretization, to generate a graph of all the feasible locations for the pupper robot in an environment with obstacles, assuming we have models for the agent and the obstacles. Below are two example configuration spaces (shown in red) for an ego agent (in green) and an obstacle (in blue); notice how the configuration space changes when the rotation of the agent changes.
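As a rough illustration of the Minkowski-sum idea (not the code in src/boundary_detection.py itself), a configuration-space boundary for a translating agent can be approximated by adding every obstacle point to every negated agent point. The point lists below are made up for the sketch.

```python
import numpy as np

def minkowski_sum(obstacle_pts, agent_pts):
    """Approximate C-space obstacle: every obstacle point plus every
    negated agent point (the agent model is taken relative to its center)."""
    obstacle_pts = np.asarray(obstacle_pts)   # shape (N, 2)
    agent_pts = np.asarray(agent_pts)         # shape (M, 2)
    # Broadcast the pairwise sums, then flatten to (N*M, 2).
    return (obstacle_pts[:, None, :] + (-agent_pts)[None, :, :]).reshape(-1, 2)

# Hypothetical square obstacle and square agent footprint (meters).
obstacle = [(1.0, 1.0), (1.0, 1.2), (1.2, 1.2), (1.2, 1.0)]
agent = [(-0.1, -0.1), (-0.1, 0.1), (0.1, 0.1), (0.1, -0.1)]
cspace_boundary = minkowski_sum(obstacle, agent)
```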
We use a RaspberryPi v1 camera whose full documentation can be found at ; this also includes hardware and software specs
Additional information on how to install Raspicam hardware can be found at
The package works by taking the camera parameters, fiducial size and family, and additional fiducial detection parameters, and creating a detector class which contains a detect method that takes a camera frame as input and outputs a list of detected fiducials (a usage sketch appears below).
The params.yaml file contains all of the fiducial vision parameters. The camera parameters used in the params.yaml file were found online in the raspicam section. The size of the fiducial in meters can be adjusted by printing out new fiducials of a larger size.
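To make the shape of this API concrete, here is a minimal hedged sketch using the pupil_apriltags package directly. The camera intrinsics, tag family, and file name are placeholders; the project's real values live in params.yaml.

```python
import cv2
from pupil_apriltags import Detector

# Placeholder camera intrinsics (fx, fy, cx, cy) and tag size in meters.
CAMERA_PARAMS = (600.0, 600.0, 320.0, 240.0)
TAG_SIZE = 0.03

detector = Detector(families="tag36h11")   # assumed tag family

frame = cv2.imread("frame.png")            # one RaspberryPi camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
detections = detector.detect(gray,
                             estimate_tag_pose=True,
                             camera_params=CAMERA_PARAMS,
                             tag_size=TAG_SIZE)
for det in detections:
    print(det.tag_id, det.pose_t.ravel())  # tag id and translation in meters
```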
The src/transforms.py and src/geometry.py files contain the methods used for transforming the fiducial detection results into easier-to-work-with translations and rotations. Also, parameters in params.yaml are used to slightly adjust the detection results after transformation.
The Pupper project uses the Stanford Pupper 2.0 which does not currently have public documentation but information on the original Pupper project can be found at . More detailed documentation on the original Pupper can be found at . This project explores topics related to fiducial detection, boundary generation, motion planning, controls, and hardware.
The discretization of the environment is done in /src/path_finder.py in the PathFinder class.
The parameters for discretization are in the params.yaml file and affect the inflation of obstacles in the graph, the step size of the robot (the distance from one vertex to its immediate neighbors), the length and width of the graph to generate, and whether the graph search should explore nodes diagonally.
The discretization happens in the explore method, which uses two serial doubly nested lambda expressions to create a matrix of Point objects that gets converted into a matrix of Node objects containing point, distance, previous, and explored, which are the fields required to perform a graph search on this data.
The graph search happens in the PathFinder class in the solve method, which performs a Dijkstra shortest-path search from the node corresponding to the agent location to the node nearest to the goal location. The algorithm operates the same as a standard shortest-path search but has several optimizations built in to account for the limited hardware of the RaspberryPi.
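For orientation, here is a hedged sketch of what grid discretization plus a Dijkstra search looks like. It assumes a uniform grid, 4- or 8-connected neighbors, and Euclidean step costs; the real PathFinder class has extra optimizations and different data structures.

```python
import heapq

def dijkstra(blocked, width, height, start, goal, diagonal=False):
    """Shortest path on a grid where cells in `blocked` (a set of (x, y))
    are removed because they fall inside the configuration space."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    if diagonal:
        steps += [(1, 1), (1, -1), (-1, 1), (-1, -1)]
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        x, y = node
        for dx, dy in steps:
            nxt = (x + dx, y + dy)
            if not (0 <= nxt[0] < width and 0 <= nxt[1] < height):
                continue
            if nxt in blocked:
                continue
            nd = d + (dx * dx + dy * dy) ** 0.5
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    if goal != start and goal not in prev:
        return []                      # no path found
    # Walk back from the goal to recover the path.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))
```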
Path profiling is the process of converting the path, a list of Point objects, into a series of distances and headings for the controller to follow. The math for it is in src/geometry.py, which finds the angle and distance between two points.
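The underlying math is just the distance and heading between consecutive points. A minimal hedged version follows; the helper names are illustrative, not the ones in src/geometry.py, and the heading here is measured from the +x axis.

```python
import math

def distance_and_heading(p1, p2):
    """Distance and heading (radians from the +x axis) from p1 to p2."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

def profile_path(points):
    """Convert a list of (x, y) path points into (distance, heading) pairs."""
    return [distance_and_heading(a, b) for a, b in zip(points, points[1:])]
```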
Many potential improvements exist to boost the performance, accuracy, and resolution of the planning module. Some ideas are:
Use dynamic programming to eliminate redundant in_range_of_boundary_quick checks for the same node
Implement a gradient based approach to converting configuration space into edge weights
Use a better shortest path search algorithm or a sampling-based approach so discretization is not necessary
The robot was donated by the . Due to time constraints and some delays, a prebuilt robot was delivered instead of a parts kit. The build instructions can be found on this . It also includes other relevant instructions that will be referenced later, such as motor calibration, software installation, and run instructions.
The robot consists of 12 C610 motor controllers, 12 M2006 motors, a Teensy board, cables and the chassis.
Raspberry Pi
Raspberry Pi Camera
USB Battery Pack for Raspberry Pi
Keyboard (for emergency stop)
The battery charger in the lab supports charging the 6S Lipo Battery. Settings can be found below:
The batteries took around 1 hour to charge and last around 2-3 hours per charge.
The USB Battery Pack used was a generic one that provided enough voltage for the Pi. These are usually battery packs that support fast charge technology for smart phones. An appropriate cable is needed (e.g: USB 3.0 to Micro-USB or otherwise)
The code can be found at
The models can be found at /models/
The obstacle, agent, and goal models are of type Shape which can be found within src/boundary_detection.py
The parameters of the models can be found within params.yaml
The goal model is a shape containing only a single point
The obstacle and agent models are defined by 4 corner points which are then interpolated to create a series of points defining the boundary of the model
Each model also has a center, which is its location relative to the Pupper
The computer vision module can be found at /src/fiducial_vision.py
The fiducial and computer vision package requires numerous parameters to work correctly; these include the fiducial tag size, lens size, and the center pixel of the image
The module itself contains the Vision class, which contains a method capture_continuous that returns a generator yielding the results of the fiducial detection module on frames from the RaspberryPi camera.
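For illustration, a generator of this shape could be built on top of picamera roughly as follows. This is a hedged sketch, not the actual Vision class; the resolution, tag family, and detector setup are assumptions.

```python
import cv2
from picamera import PiCamera
from picamera.array import PiRGBArray
from pupil_apriltags import Detector

def capture_continuous(resolution=(640, 480)):
    """Yield fiducial detections for each frame from the RaspberryPi camera."""
    camera = PiCamera()
    camera.resolution = resolution
    raw = PiRGBArray(camera, size=resolution)
    detector = Detector(families="tag36h11")   # assumed tag family
    for frame in camera.capture_continuous(raw, format="bgr", use_video_port=True):
        gray = cv2.cvtColor(frame.array, cv2.COLOR_BGR2GRAY)
        yield detector.detect(gray)
        raw.truncate(0)                          # reuse the buffer for the next frame
```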
The modules relevant to boundary generation and planning are src/boundary_detection.py, src/path_finder.py, /src/path_profiler.py, and path_test.py.
Boundary generation works by taking in the detected positions and rotations of fiducials and then creating obstacle classes to represent each fiducial. Then each obstacle.points of type Point[] is transformed to its corresponding fiducial's location by the src/transform.py class. Then the Pupper robot model models/agent.py, which is at (0, 0), and each obstacle are used to calculate the configuration space for the robot using the [Minkowski Sum](https://en.wikipedia.org/wiki/Minkowski_addition).
Also, the models/goal.py class is used to represent the goal of the Pupper, which corresponds to the fiducial with id = 0.
Each point in the resulting configuration space is used to generate a graph of the area, where vertices of the graph close to points in the configuration space are removed, so that when a shortest-path search is performed the resulting path only includes valid positions from the robot's current position to the goal.
Finally, the path is interpolated and converted into an array of distances to travel and the angle at which to travel, which is then converted into command interface commands based on the velocity of the robot.
The visualization class in src/viz.py uses matplotlib to generate a plot of the agent model, obstacles, obstacle boundaries, position of the goal, graph nodes, and the path of the robot.
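As a hedged illustration of that kind of plot (not the actual viz.py code; all the point lists here are made up):

```python
import matplotlib.pyplot as plt

# Hypothetical data: obstacle boundary, agent footprint, goal, and planned path.
boundary = [(1.0, 1.0), (1.0, 1.4), (1.4, 1.4), (1.4, 1.0), (1.0, 1.0)]
agent = [(-0.1, -0.1), (-0.1, 0.1), (0.1, 0.1), (0.1, -0.1), (-0.1, -0.1)]
goal = (2.0, 2.0)
path = [(0.0, 0.0), (0.5, 0.8), (1.0, 1.8), (2.0, 2.0)]

plt.plot(*zip(*boundary), "r-", label="obstacle boundary")
plt.plot(*zip(*agent), "g-", label="agent")
plt.plot(*goal, "b*", markersize=12, label="goal")
plt.plot(*zip(*path), "k--", label="path")
plt.axis("equal")
plt.legend()
plt.show()
```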
The program can be run simply by navigating to the root of the repository and then running python3 main.py
Instructions for the base software setup can be found in the . See notes section for project-specific instructions.
3.c: en1 might not exist on the Pi. To find out which network interface to use, run sudo ifconfig and look at the first interface it lists. For this project, it was eth0.
4.d.ii: Although it says Mac/linux, this is not always the case. If ls /dev | grep tty.usbmodem shows no results, try ls /dev | grep ttyACM. For this project, it was ttyACM0.
Follow instructions . The keyboard program mentioned can be found .
Install the dependencies using pip:
picamera
pupil_apriltags
tqdm
scipy
UDPComms
pyyaml
opencv (cv2)
argparse
pickle
matplotlib
The full software can be run using python main.py, or the controller can be run separately using python3 src/controller.py.
We use the pupil_apriltags package to detect fiducials in the Raspicam image, so full documentation for the AprilTags can be found at
In order to print new apriltags you have to follow the instructions at and
Motor calibration can be found in the . The robot's casing may have to be opened to access some of the motor controllers.
Instructions for leg calibration can be found . Best results were achieved by supporting the robot with a box and adding paddings (made of paper towels and tape) between the legs and the box to get desired motor angles.
We tested with 16 printed 0.03 meter fiducials and a floor sign to hold the goal fiducial
A Pupper 2.0 robot flashed with the low-level Teensy software for the motor controllers
A successful calibration and correctly configured motor IDs
The RaspberryPi and RaspberryPi camera running with the latest software
Running the test is simple: once you have the calibrated pupper robot operating, you can run python3 /src/main.py
A test for the boundary generation, discretization, and graph search can be run through python3 /src/path_test.py
This opens a pickled Environment object containing obstacle, agent, and goal data from a testing session and then performs a graph search using the PathFinder class.
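A hedged sketch of what that test flow might look like in code: the pickle filename, the import path, and the PathFinder constructor arguments are assumptions, while the explore and solve method names come from the description above.

```python
import pickle
from src.path_finder import PathFinder    # import path assumed from the repo layout

# Hypothetical filename; path_test.py ships with its own captured test data.
with open("test_environment.pickle", "rb") as f:
    env = pickle.load(f)                   # Environment with obstacle, agent, and goal data

finder = PathFinder(env)                   # constructor arguments are an assumption
finder.explore()                           # discretize the environment into a graph
path = finder.solve()                      # Dijkstra search from agent to goal
print(path)
```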