
Welcome to the foocars wiki! Please be patient with us-- we're still adding things.

Foocars: Autonomous cars from Fubar Labs

Fubar got involved with DIY autonomous cars in 2016 to participate in the autonomous races at SparkFun AVC in Denver. Fubar's first autonomous car was OTTO Tractor, which began its life as a kids' Power Wheels John Deere tractor. The autonomous team originally consisted of Rick Anderson, Dan Van Boxel, and Zoltan Sisko, who masterminded OTTO's drill-powered steering setup. Jenny Shane took over as neural network wrangler in mid-2017 when Dan moved away.

Work began on Motto when we realized that the combination of DIY hardware and DIY software on OTTO made it very difficult to diagnose problems quickly. Motto was built from a Traxxas RC car with an added camera, Raspberry Pi, and Arduino: the same autonomous hardware setup used on OTTO.

Our first major success with the autonomous project came in October 2017 at the Pittsburgh Maker Faire, where OTTO did two full laps of the Power Racing track with no intervention (albeit slowly). We had more success at the Bell Works Mini Maker Fest in February 2018, where Motto ran continuously around various loops of cones for hours at a time. We repeated this demo at the USA Science and Engineering Festival in April 2018, with the system running continuously for almost the entire day on Sunday.

We have two other autonomous car projects running out of Fubar: MicroOtto and BananaDrone. MicroOtto is Jim Oslislo's Donkey Car, modified to use foocars' code and hardware. Jim has done a lot of work to improve the foocars codebase and hardware design for reliability and repeatability. Adam Tannir's BananaDrone is designed to be an even cheaper autonomous car solution, using an FPV camera that streams video to a Raspberry Pi for off-board processing.

As Fubar gains more DIY autonomous cars, the goal of foocars will stay the same: to demonstrate that the underlying technology of self-driving cars can be cheap and accessible to anyone who's interested in it. Also, we'd like to win some races!

Our approach to autonomous cars

Even though all the cars at Fubar Labs are different, they all operate on the same principle: a single RGB camera feeds a neural network, which outputs a steering command for the car. Since all the cars use the same method of autonomy, they all follow the same set of steps to become autonomous:

1. Data Collection

Our hardware setup uses a regular remote control and receiver connected to an Arduino-like microcontroller. This allows us to drive the cars and record the signals from the remote control. The RC signals are sent from the Arduino to the Raspberry Pi, which also records video from the camera. This lets us record pairs of training data: an image from the camera along with the correct steering command at the time the image was recorded. These pairs are then transferred from the car to a computer set up for training.
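The actual recording code lives in the repo; the snippet below is only a minimal sketch of the idea, assuming a picamera capture loop, a pyserial link to the microcontroller on /dev/ttyACM0, and steering sent as plain pulse-width numbers. The port, baud rate, resolution, and file names are all illustrative assumptions.

```python
# Sketch of a data-collection loop: pair each camera frame with the most
# recent steering value read from the microcontroller over serial.
# Port, baud rate, resolution, and file naming are assumptions.
import time
import serial                     # pyserial
import numpy as np
from picamera import PiCamera
from picamera.array import PiRGBArray

arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=0.01)
camera = PiCamera(resolution=(160, 120), framerate=30)
raw = PiRGBArray(camera, size=(160, 120))

frames, commands = [], []
steering = 1500                   # neutral RC pulse width in microseconds

for frame in camera.capture_continuous(raw, format="rgb", use_video_port=True):
    line = arduino.readline().decode(errors="ignore").strip()
    if line.isdigit():            # e.g. "1432" sent by the microcontroller
        steering = int(line)
    frames.append(frame.array.copy())   # copy: the capture buffer is reused
    commands.append(steering)
    raw.truncate(0)
    if len(frames) >= 1000:       # save in chunks and hand off for training
        np.savez("run_%d.npz" % time.time(),
                 images=np.array(frames), steering=np.array(commands))
        frames, commands = [], []
```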

2. Data Curation

After the data is recorded, it's very important to review all of it to make sure there are no aberrant recordings. Since a neural network's behavior is determined entirely by the data used to train it, any unintentional recordings can cause unexpected results.
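One lightweight way to do that review (a sketch, assuming the data is stored as .npz image/steering pairs like the collection sketch above) is to flip through the frames with the recorded command displayed, noting any frames captured while the car was off the track or being carried:

```python
# Minimal curation sketch: step through recorded frames with the steering
# value shown, collect indices of bad frames, and save the cleaned set.
# File names are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

data = np.load("run_1523456789.npz")
images, steering = data["images"], data["steering"]

bad = []
for i, (img, cmd) in enumerate(zip(images, steering)):
    plt.imshow(img)
    plt.title("frame %d  steering=%d" % (i, cmd))
    plt.pause(0.05)               # quick flip-book review
    plt.clf()
    # note indices where the car was off the track, picked up, etc.:
    # bad.append(i)

keep = np.setdiff1d(np.arange(len(images)), bad)
np.savez("run_curated.npz", images=images[keep], steering=steering[keep])
```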

3. Training

Once we're certain that we've pruned any unwanted examples from our dataset, we train the network. Our network is a convolutional net with five convolutional layers and two fully connected layers. Once we have a set of weights for the trained network, we transfer the weights to the car.
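As a rough illustration of that shape, here is a Keras sketch with five convolutional layers and two fully connected layers. The filter counts, kernel sizes, and input resolution are guesses for illustration, not the exact foocars architecture.

```python
# Sketch of a five-conv, two-dense steering network; layer sizes are
# illustrative assumptions, not the foocars weights or architecture.
from tensorflow.keras import layers, models

def build_model(input_shape=(120, 160, 3), n_outputs=1):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(32, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(n_outputs),          # steering command regression
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_model()
# images/steering loaded from the curated dataset, e.g.:
# model.fit(images / 255.0, steering, epochs=10, validation_split=0.1)
model.save_weights("steering_weights.h5")  # this file gets copied to the car
```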

4. Self-Driving

Once the weight file is on the car, the network can be run. Currently, our network only steers the car; it can't stop the car.
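A hypothetical version of that driving loop, reusing the build_model definition from the training sketch and the capture and serial assumptions from the collection sketch, might look like the following. The "S&lt;value&gt;" serial command format is made up for illustration, and throttle is left to the human since the network does not control stopping.

```python
# Sketch of a self-driving loop: load trained weights, run each camera frame
# through the network, and send the predicted steering pulse width to the
# microcontroller. Serial protocol and file names are assumptions.
import numpy as np
import serial
from picamera import PiCamera
from picamera.array import PiRGBArray

model = build_model()                     # same architecture as in training
model.load_weights("steering_weights.h5")

arduino = serial.Serial("/dev/ttyACM0", 115200)
camera = PiCamera(resolution=(160, 120), framerate=30)
raw = PiRGBArray(camera, size=(160, 120))

for frame in camera.capture_continuous(raw, format="rgb", use_video_port=True):
    img = frame.array.astype("float32") / 255.0
    steering = float(model.predict(img[np.newaxis], verbose=0)[0, 0])
    arduino.write(("S%d\n" % int(steering)).encode())   # assumed command format
    raw.truncate(0)
```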