Autonomous Line-Following Buggy

In 2017, I designed and developed an autonomous buggy as part of a four-member team. The buggy can follow a line and avoid obstacles such as walls, gaps in the line, and slopes. It implements a PID control system that corrects its line-estimation errors in real time. The sensors are an array of six digital light sensors and one ultrasonic sensor, and the software is embedded on a PIC microcontroller.
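To give a feel for the control loop, here is a minimal sketch of PID line following. This is purely illustrative: the sensor weights, gains, and the `estimate_error` helper are assumptions for the example, not the buggy's actual firmware (which ran in C on the PIC).

```python
# Each of the six light sensors reads 1 when it sees the line.
# The weights map active sensors to a signed position error:
# negative = line is off to the left, positive = off to the right.
# (Illustrative values, not the real calibration.)
SENSOR_WEIGHTS = [-2.5, -1.5, -0.5, 0.5, 1.5, 2.5]

def estimate_error(readings):
    """Weighted average of active sensors -> signed line offset."""
    active = [w for w, r in zip(SENSOR_WEIGHTS, readings) if r]
    return sum(active) / len(active) if active else 0.0

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        """Return a steering correction for the current error."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

On each loop iteration the buggy would read the sensor array, estimate the line offset, and feed the PID output into the motor speed differential, e.g. `correction = pid.update(estimate_error(readings), dt)`.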

The video shows the first buggy prototype.

The buggy successfully completed the track (the 'heats demo') with no issues. It was not the fastest buggy, but we still won a prize for best software documentation, and we could finally sleep without worrying about the project!

These photos show the team after the heats test, as well as the final buggy, which we named Eduardo (or Jacques) because 'he' resembles the shrimp from Finding Nemo.

Robust Mobile Face Recognition System

In 2016, I worked as a summer intern with Dr. Hujun Yin at the University of Manchester, where I developed a mobile video face detection and recognition system on a Raspberry Pi. The system is controlled via speech, uses the Viola-Jones method for detection and Eigenfaces for recognition, and can actively re-learn and add new faces.
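The recognition side of such a pipeline can be sketched with a toy Eigenfaces implementation. This is a simplified illustration, assuming faces have already been detected, cropped, aligned, and flattened into vectors (in the real system that step came from a Viola-Jones detector); the function names are mine, not the project's.

```python
import numpy as np

def train_eigenfaces(faces, n_components):
    """faces: (n_samples, n_pixels) matrix of flattened face crops.
    Returns the mean face and the top principal components
    ("eigenfaces"), computed via SVD of the centered data."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(face, mean, eigenfaces):
    """Project a face into the low-dimensional eigenface space."""
    return eigenfaces @ (face - mean)

def recognise(face, gallery, labels, mean, eigenfaces):
    """Nearest-neighbour match in eigenface space."""
    query = project(face, mean, eigenfaces)
    coords = [project(g, mean, eigenfaces) for g in gallery]
    dists = [np.linalg.norm(query - c) for c in coords]
    return labels[int(np.argmin(dists))]
```

Re-learning a new person then amounts to appending their crops to the gallery and retraining the eigenfaces, which is cheap at this scale.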

The image shown here is the working prototype. It has proven robust to varying lighting conditions and could serve as the basis for a medical assistant robot in the future.

Nigel: Animatronic Assistant

In 2017, during Student Hack IV, a 24-hour hackathon, I designed a smart animatronic monkey assistant as part of a team. It can give restaurant suggestions, respond to questions, and even send text messages.

Honestly, my role in this project was minuscule and involved more gluing than coding, but it was an amazing experience nonetheless, and my team was incredibly gifted. The monkey won first prize at the hackathon, and as a result we were invited to a small conference in Manchester, where we saw the Woz (Steve Wozniak).