πŸš—

Gazebo Simulation


Our robot detecting license plates.

ENPH 353 Software Project Course - Winter 2021

Goal

In this project course, we were provided with a Gazebo world with roads, pedestrians and parked blue β€œcars” with license plates. In a team of two, we had to program (using Python) a driving robot to detect as many plates as possible and read them correctly using a neural network.

Result

You can see our robot in action and some license plate predictions here:

Driving

For driving, we went with a PID controller to keep things simple. My teammate took care of all the driving and pedestrian detection code, using HSV thresholding with a polygon mask for road detection. It worked really well! πŸš—
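As a minimal sketch of what a PID steering loop looks like (the class, gains, and error definition below are illustrative assumptions, not our teammate's actual code):

```python
class PID:
    """Minimal PID controller; gains here are illustrative, not tuned values."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        # Accumulate the integral and estimate the derivative of the error.
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example: steer toward the road centre line. The error is the horizontal
# offset (in pixels) between the image centre and the detected road centre.
pid = PID(kp=0.005, ki=0.0, kd=0.001)
steering = pid.step(error=40.0, dt=0.05)  # positive error -> steer one way
```

Each control cycle, the road mask gives a centre offset, and the PID output is published as the robot's angular velocity.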

Plate detection

We used HSV thresholding with blurring, erosion and dilation to detect specific contours above and below the license plates. Using the corners of these contours, we could crop the plates and apply a perspective transform to straighten them.

Then, we applied a different HSV thresholding on these cropped plates to detect the characters. Using the center of mass of each character, we could crop them easily and send them to our neural network for prediction.
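The center-of-mass cropping step can be sketched with NumPy alone; the crop dimensions below are illustrative, not our actual character size:

```python
import numpy as np


def centroid(mask):
    """Centre of mass (row, col) of a binary character mask."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()


def crop_around(mask, cy, cx, h=24, w=16):
    """Fixed-size crop centred on (cy, cx), clamped to the image bounds.
    The h x w size is a placeholder assumption."""
    top = max(0, min(int(round(cy - h / 2)), mask.shape[0] - h))
    left = max(0, min(int(round(cx - w / 2)), mask.shape[1] - w))
    return mask[top:top + h, left:left + w]
```

Each thresholded character blob yields a centroid, and a fixed-size window around it becomes one input image for the network.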

Neural network to read plates

Type: Convolutional

Module: tensorflow.keras

Training data: characters from a provided license plate generator + cropped characters captured by our robot.

We transformed the generated characters using ImageDataGenerator from tf.keras.preprocessing.image.
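A sketch of the model and augmentation setup is below; the layer sizes, class count, input shape, and augmentation ranges are illustrative assumptions, not our actual configuration:

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

NUM_CLASSES = 26  # e.g. one model for letters; a second handles digits

# Illustrative small CNN for single-character classification.
model = models.Sequential([
    layers.Input(shape=(24, 16, 1)),           # cropped character images
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Augment generated characters so they look more like world captures.
datagen = ImageDataGenerator(rotation_range=5, width_shift_range=0.1,
                             height_shift_range=0.1, zoom_range=0.1)
x = np.random.rand(8, 24, 16, 1)               # stand-in for real data
batch = next(datagen.flow(x, batch_size=8, shuffle=False))
```

Batches from `datagen.flow` feed straight into `model.fit`, so every epoch sees slightly different variants of the generated characters.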

First tries:

The first models we trained only predicted a few plates correctly. Some letters were read as numbers and vice versa.

How we improved our results:

  • Downsize and reupsize generated characters to better reproduce the low quality of world characters.
  • Increase the number of characters captured in the world (vs generated) for training.
  • Train on the captured characters and generated characters separately, to control the number of epochs for each and avoid overfitting.
  • Have two models: one for letters and one for numbers.