
View Full Version : [Project] Pi Robot head tracking video



Pi Robot
04-08-2009, 05:13 PM
Hello,

I've been working on a robotics project for a few years and discovered this great forum. Here is my first video of "Pi Robot" performing head tracking and "arm tracking" of a colored object. For some reason YouTube insists on playing the video in fast motion--one way to fix this is to pause it as soon as it starts, drag the position slider back to the beginning, then click the play button again. Another fix is to simply click on the video and watch it on YouTube directly, where the problem seems to go away.

The head tracking shown in the video uses two servos to move the head and a webcam. I am using two Hitec HS-475HB servos on a Lynxmotion pan and tilt platform with a Philips SPC 1300NC webcam on top of the head. I use a Lynxmotion SSC-32 servo controller to drive the servos. The other sensors on the head (e.g. sonar) are not used for the head tracking routine, but the middle sonar and IR sensors are used to get the distance to the balloon for arm tracking.

The video is sent to an instance of RoboRealm running either on my PC or the onboard mini-ITX computer. I configure RoboRealm to detect any bright yellow object (in this case the balloon on the stick). I then wrote a C# program to take the output from RoboRealm and send servo control signals to the SSC-32 to move the head to track the yellow object. One could do this completely within RoboRealm since it has a module for the SSC-32, but the object tracking is just one component of my larger C# control program.
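For anyone curious what the servo side of this looks like: the SSC-32 accepts simple ASCII commands over a serial port of the form `#<channel> P<pulse-width-us> T<time-ms>`. My control program is written in C#, but here is a minimal illustrative sketch in Python of building and sending such a command. The channel numbers, port name, and baud rate below are assumptions for the example, not my actual configuration.

```python
def ssc32_move(channel, pulse_us, time_ms=None):
    """Build an SSC-32 move command string.

    Format: '#<ch> P<pulse>' plus an optional 'T<ms>' travel time,
    terminated by a carriage return, e.g. '#0 P1500 T500\r'.
    """
    cmd = "#{} P{}".format(channel, pulse_us)
    if time_ms is not None:
        cmd += " T{}".format(time_ms)
    return cmd + "\r"

# Sending it over a serial port would look something like this
# (pyserial assumed; port name and baud are placeholders):
#
#   import serial
#   port = serial.Serial("COM3", 115200)
#   port.write(ssc32_move(0, 1500, 500).encode("ascii"))
```

A pulse width of 1500 microseconds centers a typical hobby servo; the optional `T` parameter tells the SSC-32 to interpolate the move over that many milliseconds, which gives much smoother motion than jumping straight to the target.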

Curious to know what you think!

YouTube - Pi Robot Head Tracking + Arm Tracking #2

--
Pi Robot

Semicton
04-08-2009, 07:42 PM
That's awesome. The head tracking and the arms that grab the ball when it is in reach give it a lot of personality! Very, very cool.

Orac
04-09-2009, 04:30 PM
Very nice.

I'm curious if you had to optimise the pipeline much to get such a fast response to the ball's movement.

My experience with RoboRealm was much slower. Maybe I had a slower platform and a bulky processing pipeline.

Thanks

Pi Robot
04-09-2009, 06:52 PM
Hi Orac,

Hopefully you weren't watching the video in the funky "fast motion" mode because it's not *that* fast! However, the tracking does in fact work even faster than shown in the video at normal speed.

But to answer your question, I came up with a number of ways to improve effective frame rates when using RoboRealm. My computer is nothing special--a 2 GHz single-core AMD HP desktop with 3 GB RAM running Windows XP--and I get good results even when using the onboard mini-ITX, which is a 1.2 GHz VIA board with 1 GB RAM. The tricks I use are:

* Always use the lowest resolution you can get away with--I use 160x120. Even going to 320x240 can slow things down noticeably, and 640x480 is usually a non-starter. There is generally no need for higher resolution unless you are trying to track very small objects.

* Turn off all automatic adjustments in the webcam's control panel. In particular, webcams often adjust for low lighting by increasing the exposure time between frames, thereby reducing frame rates considerably. Instead, use RoboRealm's Color Balance module to adjust color and intensity if you need it at all.

* Stay away from the Edge Detection modules in RoboRealm unless you really need them as they seem to be fairly CPU intensive.

* Use a high frame-rate webcam. The Philips SPC 1300NC camera I am using goes up to 90 fps, and it really delivers, as confirmed by RoboRealm's FPS readout.

* Use a camera with a wide-angle lens. Most webcams have a fairly narrow field of view--around 45-60 degrees. The widest I've seen is on the Creative Labs Live! Voice, which they claim is 85 degrees. The reason wider is better is that a moving object can very quickly leave the field of view when you are trying to track it. (According to Wikipedia, the human FOV of both eyes combined is about 200 degrees!) I don't recall the FOV for the Philips 1300 but I believe it is fairly average--I chose it for its high FPS.

* Finally, as you know, RoboRealm returns the X-Y coordinates of the blob you are trying to track. In my C# program, I keep a short running history of these values and compute an estimate of blob speed (in visual coordinates). I then move the head not just to where the blob is currently, but to where it will probably be a few frames in the future. This helps significantly and tends to compensate for the narrow field of view of the camera.

Putting all these together or even some of them can dramatically improve tracking performance.
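To make the last tip concrete, here is an illustrative sketch of the velocity-estimation idea. My actual implementation is in C#; this Python version, including the class name, history length, and lookahead value, is just an example of the technique: keep a short history of blob centroids, estimate the average per-frame velocity, and aim the head where the blob will probably be a few frames from now.

```python
from collections import deque

class BlobPredictor:
    """Keep a short history of blob centroids and extrapolate ahead."""

    def __init__(self, history=5, lookahead=3):
        self.points = deque(maxlen=history)  # recent (x, y) samples
        self.lookahead = lookahead           # frames to predict ahead

    def update(self, x, y):
        """Record the latest blob centroid reported by the vision pipeline."""
        self.points.append((x, y))

    def predict(self):
        """Return the estimated (x, y) position self.lookahead frames ahead."""
        if len(self.points) < 2:
            return self.points[-1] if self.points else None
        (x0, y0) = self.points[0]
        (x1, y1) = self.points[-1]
        n = len(self.points) - 1
        # average per-frame velocity over the stored history
        vx, vy = (x1 - x0) / n, (y1 - y0) / n
        return (x1 + vx * self.lookahead, y1 + vy * self.lookahead)
```

For example, a blob observed at x = 10, 12, 14 on successive frames is moving about 2 pixels per frame, so the predictor aims three frames ahead at x = 20. The head then has a chance to catch up to the target instead of always lagging behind it.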

--
Pi Robot