# Face tracking with Arduino, Processing and OpenCV

Tracks faces and orients the camera so that the face is always in the middle of the frame

UPDATE 1:

I have stripped down the Processing code to the bare basics so that it can be used however you want. I added lots of comments to make it easier to understand and left it open for you to add your own actions when the criteria are met. I have also added an additional check for whether the face is in the middle of the screen (min_x, max_x, min_y and max_y inside the middle square) so that further actions can be performed; in the case of a robot, it could move towards you.

The stripped-down Processing code: https://www.dropbox.com/sh/vpvbsrpx43zzcen/S8Xi4MmMwj
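To illustrate the middle-square check mentioned above, here is a minimal Java sketch of the idea. The screen size, the bounds of the middle square, and the method names are my assumptions for illustration, not the original Dropbox code.

```java
// Hypothetical sketch of the "face inside the middle square" check.
// Screen size and square bounds are assumed values, not the original code.
public class MiddleCheck {
    static final int SCREEN_W = 640, SCREEN_H = 480;
    // Middle square: the central third of the screen (assumed layout).
    static final int MIN_X = SCREEN_W / 3, MAX_X = 2 * SCREEN_W / 3;
    static final int MIN_Y = SCREEN_H / 3, MAX_Y = 2 * SCREEN_H / 3;

    // True when the whole face bounding box sits inside the middle square.
    static boolean faceInMiddle(int x, int y, int w, int h) {
        return x >= MIN_X && x + w <= MAX_X
            && y >= MIN_Y && y + h <= MAX_Y;
    }
}
```

When this returns true, the camera does not need to move, and a robot could instead trigger a different action such as driving forward.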

UPDATE 2:

Adapted the face-tracking code to control a remote-control car, to show its potential uses.

UPDATE 3:

This is the last update I will post on this project. I was playing around with the basic sketch today and managed to estimate how far away the face is from the screen. All I did was put my face 30 cm away from the screen and work out how many pixels wide the face bounding box was. I then did the same at 60 cm and used the two points to form a linear equation that maps the width of the bounding box to the distance of the face. I recorded this and the link is below:
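The two-point linear fit described above can be sketched in a few lines of Java. The calibration values (200 px at 30 cm, 100 px at 60 cm) are made-up numbers for illustration; you would substitute the pixel widths you measure yourself.

```java
// Distance estimate from face-box width: a straight line through two
// calibration points, as described above. The pixel widths used here
// (200 px at 30 cm, 100 px at 60 cm) are assumed example values.
public class FaceDistance {
    static final double W1 = 200.0, D1 = 30.0; // measured width (px) at 30 cm
    static final double W2 = 100.0, D2 = 60.0; // measured width (px) at 60 cm

    // distance = slope * width + intercept, fitted through (W1, D1), (W2, D2)
    static double estimateDistance(double faceWidthPx) {
        double slope = (D2 - D1) / (W2 - W1);   // -0.3 cm per pixel here
        double intercept = D1 - slope * W1;     // 90 cm here
        return slope * faceWidthPx + intercept;
    }
}
```

A linear fit is only an approximation (apparent size actually falls off as 1/distance), but over a short range like 30–60 cm it works well enough for this purpose.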

How this actually works is described below and also in the video. I suggest you watch the video, as my writing is terrible.

The Processing code: https://www.dropbox.com/sh/v9hkdxuoazoyb0d/fCGQzEfBAK The OpenCV and Arduino libraries are needed, and OpenCV also needs to be installed on the computer.

The Arduino code: https://www.dropbox.com/sh/ujjlahx83ilv1j2/QB0bu8E-EJ

This is something that I have wanted to do for a very long time: face tracking. As I look at all my favorite films I see a common factor, technology, whether it is R2-D2, flying cars or JARVIS from the Iron Man movies. I have already made something inspired by JARVIS, a simple voice-control script written in GlovePIE to control my computer, seen here: http://www.youtube.com/watch?v=i5S1H1nogpI&feature=plcp, but this time I wanted to make something really cool. The idea came to me in the shower: make something that would follow my face, like something out of the film I, Robot, along with a rough plan for how I would do it.

Now to talk about what I did. I broke the screen up into nine regions, drawn visually as a grid of four lines (two vertical, two horizontal) so you can see where the areas are. The program draws a bounding box around each face with (x, y, width, height) values; all I did was check which region the box is in and print the result in Processing. Depending on the region, Processing writes a case character to the Arduino through the serial port. The Arduino code listens for these cases and increments the servo positions accordingly.
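The region-to-command logic above can be sketched as a small Java class. The 3×3 grid, the command letters and the method names are my assumptions for illustration; the real sketch sends its own case characters over the Processing Serial library.

```java
// Sketch of the region logic described above: the screen is split into a
// 3x3 grid, and the face box's position selects a command character to
// send over serial. Grid size and letters are assumed example values.
public class QuadrantTracker {
    static final int SCREEN_W = 640, SCREEN_H = 480;

    // Column (0..2) and row (0..2) of the grid cell holding the box centre.
    static int column(int x, int w) { return Math.min(2, (x + w / 2) * 3 / SCREEN_W); }
    static int row(int y, int h)    { return Math.min(2, (y + h / 2) * 3 / SCREEN_H); }

    // One command per direction; '.' means the face is centred, do nothing.
    static char command(int x, int y, int w, int h) {
        int c = column(x, w), r = row(y, h);
        if (c == 0) return 'L';   // face left of centre: pan left
        if (c == 2) return 'R';   // face right of centre: pan right
        if (r == 0) return 'U';   // face above centre: tilt up
        if (r == 2) return 'D';   // face below centre: tilt down
        return '.';               // face in the middle cell
    }
}
```

In the Processing sketch the returned character would be sent with something like `myPort.write(command(...))`, and the Arduino side reads one byte per loop and nudges the pan or tilt servo by a degree or two in the matching direction.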