Autonomous VEX Robotics Competition

The VEX Robotics University competition is a collegiate competition in which teams build robots to play a sport. These robots run pre-programmed actions for the first minute of a match and then have a minute of driver control. I led an undergraduate team attempting to compete with a robot running autonomously for the entire two-minute match, making its own decisions during both the traditionally hand-scripted section and the driver control section.

In 2017 we competed at the world championship with a fully integrated robotics stack running autonomously. Seen below is one of our best matches from the event. Our runs were full of bugs that caused the robot to stop working, and even when everything ran perfectly there was plenty of suboptimal behavior, which unsurprisingly resulted in losses against human drivers.

The autonomy elements were built on a $150 budget using a simple planar lidar, a Raspberry Pi, and a set of encoders. The software stack was built using ROS and PCL. The software is entirely open source and was released to the community in the hope it would spur others to take an interest in autonomy.
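To give a feel for what a lidar pipeline like this starts with, here is a minimal rospy sketch of the first step: converting the planar lidar's polar scan into Cartesian points for downstream processing. The topic and node names are illustrative assumptions, not taken from our released code, and the real stack worked through PCL.

```python
#!/usr/bin/env python
# Minimal sketch: subscribe to a planar lidar scan and convert the
# polar ranges into Cartesian (x, y) points for later clustering.
# The "/scan" topic name is a common ROS convention, assumed here.
import math
import rospy
from sensor_msgs.msg import LaserScan

def scan_callback(scan):
    points = []
    for i, r in enumerate(scan.ranges):
        # Skip returns the driver reports as out of range.
        if r < scan.range_min or r > scan.range_max:
            continue
        theta = scan.angle_min + i * scan.angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    rospy.loginfo("scan -> %d valid points", len(points))

if __name__ == "__main__":
    rospy.init_node("lidar_to_points")
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()
```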

In 2018 we decided to upgrade the entire stack to incorporate state-of-the-art tools instead of hand-built pieces; this would also serve as my senior thesis project. For example, we replaced open-loop dead reckoning with Google Cartographer, a SLAM package, used to map the field and then switch to pure localization at runtime. We used the ROS Nav Stack instead of the simple drive-straight and turn-in-place motion commands from 2017, so we could properly handle robot collisions with walls rather than relying on an overly cautious hack. Below is a video of our demo robotic platform running SLAM and Nav Stack motion planning from a script of high-level commands.
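To make "a script of high-level commands" concrete, here is a hedged sketch of how one such command could dispatch a pose goal to the Nav Stack's move_base action server and block until the planner finishes. The coordinates and frame are placeholders for illustration, not our actual field script.

```python
#!/usr/bin/env python
# Sketch: send one navigation goal to the ROS Nav Stack via the
# standard move_base action interface and wait for the result.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def go_to(x, y):
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"  # pose in the SLAM map frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # face the map's +x axis
    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == "__main__":
    rospy.init_node("high_level_commands")
    go_to(1.5, 0.5)  # hypothetical field coordinate, in meters
```

A script of such calls gives the robot waypoint-level commands while the Nav Stack handles path planning and obstacle avoidance underneath.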

We also replaced our hand-built, very hacky point-cloud-based object classifier with a state-of-the-art convolutional neural network (CNN) running on an onboard GPU. The example below shows part of our hand-labeled dataset, collected by driving the demo platform around while recording our pair of fisheye cameras. Each fisheye camera points downward, and its image is “unwrapped” so that the angle in the original image becomes the x axis of the new image and the pixel distance from the calibrated image center becomes the y axis. This meant that a single image covered a large FOV and that objects in front looked the same as objects behind to a neural network that would not otherwise be rotationally invariant.
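As a rough illustration of that unwrapping, the sketch below remaps a fisheye frame from polar to Cartesian coordinates with OpenCV. The center, radius, and output-size parameters are stand-ins for the real camera calibration.

```python
# Sketch: "unwrap" a downward-facing fisheye image so that angle maps
# to the x axis and distance from the image center maps to the y axis.
# cx, cy, and max_radius stand in for values from a real calibration.
import cv2
import numpy as np

def unwrap_fisheye(img, cx, cy, max_radius, out_w=720, out_h=240):
    # For every output pixel, compute the source pixel it samples.
    angles = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    radii = np.linspace(0, max_radius, out_h)
    theta, r = np.meshgrid(angles, radii)  # shape (out_h, out_w)
    map_x = (cx + r * np.cos(theta)).astype(np.float32)
    map_y = (cy + r * np.sin(theta)).astype(np.float32)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)

# Usage: unwrapped = unwrap_fisheye(frame, cx=640, cy=480, max_radius=450)
```

With this mapping, a rotation of the robot becomes a horizontal shift of the unwrapped image, which is why a translation-equivariant CNN treats objects at any bearing the same way.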

While this updated system was never fully integrated, it was a good demonstration of what is possible with standard tools onboard the robot under constraints like space and power consumption. This project won the WPI Provost Award in the Computer Science Department. Excitingly, since this project the VEX Robotics Competition has added a special division for teams attempting to develop autonomous robots.
