Let's Make Robots!

Wall-E in action?!


 

Intro:
This project aims to bring the Wall-E robot from the movie to life! It will have a camera in its eye; the video is processed on a computer, and commands are sent back to Wall-E via Bluetooth. It can also recognise sounds, and can be controlled manually as a spy robot.
The source code in this project is written in C++ with the Qt framework and the OpenCV library.
=============================================
update 08/08/2012

I think I have most of the project planned, and have come up with quite a few functionalities.

For object tracking and recognition, I will write the code myself in C++ with OpenCV. The program will run on a PC; images are transmitted from Wall-E using the wireless webcam, and after processing, the corresponding commands are sent back to Wall-E via Bluetooth.

I have been looking very hard for a programming solution for speech recognition, hoping someone had already written an API of some sort. Then I accidentally bumped into a YouTube video showing a much simpler way of doing this: the EasyVR Arduino shield! So I might use that instead of writing the code myself.


=============================================
update 10/08/2012

Wall-E arrived! :)
I should have started this week, but just before I was going to film the latest version of my hexapod robot, one of its servos broke!! I guess I just have to wait...



It's a great toy for eight-year-olds. It's only got one motor, which means it can only turn left or go forward. It moves its arms as well, but that's pretty much it. Here is a video showing roughly the same one:

http://www.youtube.com/watch?v=VEoh8Iws-kk 



=============================================
update 12/08/2012

Still waiting for the servo gear to arrive. I was so bored that I started working on the robot hardware instead.

I took it apart and was amazed by how well it works considering it has so few components.






It was quite dirty since it's second hand. I had to wash every piece of it with soapy water!
I will leave the assembling for another day.

=============================================
update 14/08/2012

Finally found the time to look at the robot pieces and get started on assembling it.

I recycled the motors and motor driver from my previous robot (Wally, the object-tracking robot).








It was quite challenging to modify the robot to fit the servos, but I managed in the end ^.^
I will start coding another day!



=============================================
update 17/08/2012



Here is the first draft of the colour tracking code:

    // grab the next frame from the webcam
    capwebcam.read(matOriginal);
    if (matOriginal.empty()) return;    // no frame available yet

    // keep only strongly red pixels (BGR thresholds), then smooth the mask
    cv::inRange(matOriginal, cv::Scalar(0, 0, 175), cv::Scalar(100, 100, 255), matProcessed);
    cv::GaussianBlur(matProcessed, matProcessed, cv::Size(9, 9), 1.5);

    // look for circles in the thresholded mask
    cv::HoughCircles(matProcessed, vecCircles, CV_HOUGH_GRADIENT, 2, matProcessed.rows / 4, 100, 50, 10, 400);

    for (itrCircles = vecCircles.begin(); itrCircles != vecCircles.end(); itrCircles++) {
        // log the position and radius of each detected circle
        ui->txtXYRadius->appendPlainText(QString("ball position x =") +
                                         QString::number((*itrCircles)[0]).rightJustified(4, ' ') +
                                         QString(", y =") +
                                         QString::number((*itrCircles)[1]).rightJustified(4, ' ') +
                                         QString(", radius =") +
                                         QString::number((*itrCircles)[2], 'f', 3).rightJustified(7, ' '));

        // mark the centre and outline the detected circle on the original frame
        cv::circle(matOriginal, cv::Point((int)(*itrCircles)[0], (int)(*itrCircles)[1]), 3, cv::Scalar(0, 255, 0), CV_FILLED);
        cv::circle(matOriginal, cv::Point((int)(*itrCircles)[0], (int)(*itrCircles)[1]), (int)(*itrCircles)[2], cv::Scalar(0, 0, 255), 3);
    }

    // convert the OpenCV image to a QImage (OpenCV stores BGR, Qt expects RGB)
    cv::cvtColor(matOriginal, matOriginal, CV_BGR2RGB);

    QImage qimgOriginal((uchar*)matOriginal.data, matOriginal.cols, matOriginal.rows, matOriginal.step, QImage::Format_RGB888);
    QImage qimgProcessed((uchar*)matProcessed.data, matProcessed.cols, matProcessed.rows, matProcessed.step, QImage::Format_Indexed8);

    // update the labels on the form
    ui->lblOriginal->setPixmap(QPixmap::fromImage(qimgOriginal));
    ui->lblProcessed->setPixmap(QPixmap::fromImage(qimgProcessed));

=============================================
update 24/08/2012



The servo gear finally arrived! (Actually the servo did; I guess they must have sent me the wrong thing ^.^)

Anyway, I immediately fixed the hexapod robot and started making the video. I hope I can finally begin working on the code for Wall-E.


=============================================
update 27/08/2012

As a starting point, I wrote a Qt program that detects a colour (red) and sends commands via the serial port to the Arduino, to turn Wall-E's head to follow the object. I will extend the tracking targets to faces, certain objects, light sources, etc.

I struggled so much at the beginning, because every time I connected to the Arduino via the serial port, it froze the video. I later realised it was a threading issue: while the program is waiting for data from the serial port (or reading, or writing, I am not sure), it actually hangs the thread. So I decided to modify both the serial port class and the video class to run in their own threads.

Some people suggest it's not a very good idea to use threads without formal education on the subject. And I did find it confusing to get started with threads, because some say we shouldn't subclass QThread and should instead move an object into a thread. But since the official documentation says to subclass it, I followed the documentation.

I am still very new to Qt and OpenCV, since I only started learning them a few days ago, and I was already attempting multithreading. I now realise how crazy that was!

Frustrated, I spent the whole weekend and my bank holiday just debugging the code. I dropped my diet routine, my exercise, and my movies! But I won in the end. Although it is still not as good as I would like (tracking is quite slow and inaccurate, and the head shakes a lot), at least it works ^.^

I will look around for a better algorithm; in the meantime I might add a few more functions to the program, like adjusting the video properties and better threading code...

See you soon...




=============================================
update 28/08/2012

The whole reason I spent so much time coding the colour tracking was that I needed to write a program that does multithreading, because I need to listen to the serial port for input from the Arduino while processing video.

I need to confirm the Arduino has completed the previous command before I send another one out. But still, it's not fast enough.

I saw someone who has done a similar project, but instead of listening for a signal from the Arduino, he sends out a command from the computer for every frame he processes, and the result is actually better than mine!

I am thinking that, with enough delay between frames, this could work. I could also say goodbye to the confusing multithreaded programming!

I should also stop sending commands when nothing is detected.

I should calculate the middle point of the detected object, so the tracking will work regardless of the size of the detected image.

I might try it out tomorrow.



=============================================
update 29/08/2012
So I tried sending commands without the feedback signal from the Arduino, and it works great! I modified the code based on my initial Qt program, using a single thread, and it wasn't lagging at all! So now I know what was slowing the program down: it must be the serial port data listener. Either I am using it wrong, or it blocks other processes by nature; in any case I should avoid it. But in the future, when I add the voice command recognition functionality, I will need to somehow send data back to the computer to run certain applications. For example, if I want Wall-E to track faces, I would say "Wall-E, follow faces", the Arduino would send the command to the computer, and the computer would open the "track faces" application.
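The planned voice command routing could be sketched like this (the tokens and mode names are hypothetical; nothing here is implemented yet):

```cpp
#include <string>
#include <map>

// Hypothetical sketch of the planned command routing: the Arduino (with
// the EasyVR shield) would send a short token over serial, and the PC
// side would look it up and switch tracking mode. All names are made up
// for illustration.
std::string dispatch(const std::string &token) {
    static const std::map<std::string, std::string> modes = {
        {"FACE",   "track faces"},
        {"COLOUR", "track colour"},
        {"SPY",    "manual / spy mode"},
    };
    auto it = modes.find(token);
    return it != modes.end() ? it->second : "unknown command";
}
```

A table like this keeps the serial protocol to short fixed tokens, which are easy to send from an Arduino and easy to extend later.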



=============================================
update 01/09/2012

Assembling the eye. A hot glue gun is really helpful :)



Ta Da !!!
 
Currently working on object tracking and face tracking. Will update soon! ^.^

 

=============================================

update 05/09/2012


Finally I have some time to sit down and continue my project! I finished the inside layout and tidied up all the cabling tonight. I also tested the servos and motor driver; all seem to be working fine!


But I just have to say how much I hate soldering right now!! I literally spent two hours trying to solder a switch to some long cables. On the first try, I found the cables were a bit out of contact, so I applied hot glue to hold them together, which didn't help. So I worked very hard to take the glue off, replaced all the cables with new ones, and soldered again.


This time was better. I then installed it on the robot and hot-glued it in place, only to find that the switch itself isn't working properly; I occasionally have to push the plastic bit. I guess I might have damaged the switch when I was taking the hot glue off...


There is a reason I love programming so much! When I was at uni doing projects, I always left all the soldering and cabling work to my lab partner, and I would take care of all the coding and maths. I just don't have the hands for these things, I guess... :(

 

 

Motor driver information can be found here:

http://arduin0.blogspot.co.uk/2011/12/what-does-it-do-it-will-take-external.html


 


 

 

=============================================

update 08/09/2012

 

We can now control Wall-E from PC.

 

We can also use it as a spy robot :)

 

With some modifications, we can see the video on a web interface, and control it over the internet.

 

 

 

=============================================

update 22/01/2013


 

Sorry I haven't been making any progress on this robot, as I am currently working on a few website projects.

 

But I have decided that in the next few updates I will be using a Raspberry Pi instead of an Arduino as Wall-E's brain, which will make Wall-E more compact and faster to react.

 


Looking forward to your next project.

Got to say, it's sooooooooooooo cool !

You might consider using PID control for tracking. I found it to be very complementary to OpenCV input, and it's also nice in that PID allows easy extensibility to control other actuators besides servos... that is, if you ever decide to re-use the code with a motor-driven pan/tilt kit.

To be honest, I don't know what a PID controller is yet; could you explain a little? Thanks for the suggestion! :-)

 

Here are some links.

The short of it is: it's a clever function which tries to follow chaotic input. If you think of tracking as trying to follow two chaotic inputs (the X and Y axes), then you use two PID methods to follow them, and it becomes smooth tracking.

There are several constants in the formula you can tweak to suit your specific system and desired attributes.

Here are some examples of people trying different values with MRL 

http://myrobotlab.org/content/fun-arduino-or-how-i-learned-love-mrl

http://myrobotlab.org/content/second-mrl-tutorial-tracking-service-wiring-diagram-0

Good Luck

Brilliant! Very informative! I will definitely try that!

thanks a lot GroG! :)

Hey, it's EZ Robot, lol. It's a good board and program.

No, it uses an Arduino to receive data, and the data are processed in a C++ application I wrote myself.

EZ Robot is for kids, and for people who can't code...

I was just messing with you. You have some of the same parts as their kits, but I knew it was somewhat different. Plus I can't code very well......... yet, lol.

Ditto!!!

Haha... I guess it's a good idea for a new starter to use the EZ kit to get a feel for how to build a robot after all... :)