View Full Version : Basic PIR Motion Tracking????

11-28-2010, 04:21 PM
Hello All,
I'm looking for a little advice and help. I've been building a robot for the last 3 months and here is what I have so far.

I've used a 3D printer, Dynamixel actuators and LabVIEW for movement. I got that working - no problem there. As you can see, I have 2 cameras on the head and I'm actually using only one. I've been doing a lot of LabVIEW motion tracking using the camera (image subtraction and all that) and it's too complicated and difficult for my intelligence anyhow.

I'm planning on putting the robot in our college hallway and I would like the head to follow somebody walking down the hall. The robot will be fixed to a table. Motion tracking using the camera is not going to work, so I need another method.

Would it be possible to use maybe 3 PIR sensors to determine whether someone is to the left, center or right of the robot? The person would be between 0.5 m and 3 m from the table.

There are very large windows near the robot, so the ambient light around it will change. Will this affect the sensors? Maybe I could set different threshold values?

Do you guys have any other advice on the robot? I hope I've explained what I'm trying to do. Thanks for your help.

11-28-2010, 05:08 PM
Hi FunnyRobot,
Thanks for the advice, but I can already do color tracking of objects using the National Instruments Vision software. The problem with someone walking down the hall is you can't rely on them wearing a certain color. I'll check it out, but I don't think it would work.

11-28-2010, 06:42 PM
I've been doing a little research tonight on this and I think I'm going to try connecting 3 ultrasonic sensors at angles from each other. We have Lego Mindstorms kits in the college and I know they use an ultrasonic sensor. I might rig one up tomorrow and see if it works. If I had 3 of them, at least I could get front, left and right positions of somebody walking past the robot?

11-28-2010, 06:53 PM
Here's my thoughts:

PIR -- most of the cheap PIR sensors are very slow to respond (~1 s to trigger, ~10 s to reset after they have triggered), meaning you can't exactly track something. Acroname does carry a much higher grade pyroelectric sensor, which has very fast update rates (but a very narrow field of view, and it's quite expensive) (http://acroname.com/robotics/parts/R1-442-3.html)

Visual -- If I were trying to track a target using a single camera, on a mostly static robot, using a simple algorithm, I'd probably turn to optical flow. Not sure about LabVIEW & vision, but most vision packages (OpenCV, RoboRealm, IVT, etc.) have an optical flow implementation. If your robot's head is not moving, the optical flow should be near 0 most of the time; a person walking in front of the robot would then have optical flow > 0. Once you notice optical flow in a region, pick a color from that region at approximately chest height, lock onto the color of the user's T-shirt, and you can do color tracking from that point!


11-28-2010, 06:57 PM
EDIT: Looks like fergs beat me to the punch.

Can't you just have it look for any rapid change in the colors of a group of pixels over a certain size? Basically, grab an image from the camera stream and compare it to the last few images. If there is a certain area of the image where the color values of the pixels have changed more than some threshold, it could be assumed that something in the environment has changed (an object in motion). Ignore specific colors and just pay attention to the changes in color. You would have to compensate somehow for the movement of the robot while tracking (maybe once it identifies movement, it selects the final color of the moving object as the one it should track?).

Given the changing lighting, you would have to do some averaging of the past and current images to prevent gradual changes in ambient lighting from registering as false positives for motion (pretty much what most IR motion sensors do: any changes slower than some limit do not register as motion).

If you do go with simple PIR motion sensors, you should not have to worry about lighting (as mentioned above, they should only register rapid changes in temperature of the area in view). That said, the tracking will be very basic (three positions: left, right, and center) if you go with only three PIR sensors (if the sensors you have in mind are anything like most are, then there will be no ability to change any thresholds and they will act simply as switches). A better solution may even be to use the PIR sensors to identify motion, turn the camera towards the motion, and grab a color least like the hallway to track (a person).
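As a toy illustration of the three-switch idea, you could map the left/center/right PIR hits straight to a head pan angle. The ±45° zone angles here are assumptions for illustration, not from any real sensor layout:

```python
# Toy sketch: map three PIR switch states (left, center, right) to a pan
# angle for the head. The +/-45 degree zones are made-up placements.
def pan_target(left, center, right, current_deg=0):
    """Return a head pan angle in degrees from three boolean PIR readings."""
    zones = [(-45, left), (0, center), (45, right)]
    active = [angle for angle, fired in zones if fired]
    if not active:
        return current_deg            # nothing moving: hold position
    return sum(active) / len(active)  # average of the triggered zones
```

Averaging the triggered zones at least gives you "left-of-center" when two adjacent sensors fire at once, which is a little better than three hard positions.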

11-28-2010, 08:15 PM
Thank you to Fergs & Tician,
Hmmmm, complicated stuff! So it seems that PIRs are not the way to go. Just to clarify...
The robot is static, but I want the head to move from side to side and maybe tilt a little - let's start with going side to side, though. You're saying it works if the head is not moving - which I think it will be, if it is tracking an object? I've done some image subtraction on grayscale images with the robot and the results are not great. Too much vibration from movement. I will definitely check out the optical flow - I did find the function in LabVIEW's machine vision library.

Have you guys ever seen the ultrasonic sensors on the NXT Mindstorms robot? I don't think they act as switches - they output a distance. If I had 3 of them at different angles I could get more than 3 positions - more like 5: left, left-center, center, center-right, right.
Ok, thanks again and good night to you all.

11-28-2010, 08:39 PM
PIR sensors measure infrared light (intensity is a function of the temperature of an object; see: IR non-contact thermometers). Most PIR sensors that you would find in security systems, and from some electronics suppliers, act simply as switches, using a single pin to notify the user that motion has been detected.

Most ultrasonic sensors and range-finding IR sensors (very different from PIR sensors) give an analog or digital value which can be used to compute the distance of an object from the sensor. If you can afford enough Mindstorms ultrasonic sensors, I see no reason why you couldn't use them, but receiving data only at discrete angles from the robot may result in jerky tracking without some assumptions about the target's motion.
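One way to get smoother-than-discrete bearings from three fixed sensors is to weight each sensor's mounting angle by how close an echo it reports. The mounting angles and max range below are assumptions for illustration, not from your setup:

```python
# Sketch: estimate a target bearing from three ultrasonic range readings
# taken at fixed mounting angles. Angles and max range are assumed values.
def bearing_from_ranges(ranges_cm, angles_deg=(-45, 0, 45), max_cm=300):
    """Weighted average of sensor angles; nearer echoes get more weight."""
    weights = [max(0.0, (max_cm - r) / max_cm) for r in ranges_cm]
    total = sum(weights)
    if total == 0:
        return None                   # nothing within range
    return sum(w * a for w, a in zip(weights, angles_deg)) / total
```

When the target sits between two sensors, both see it at moderate range and the weighted average lands between their angles, so you get intermediate bearings instead of just three fixed positions.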

Pi Robot
11-29-2010, 09:18 AM
If you use visual motion detection or optical flow in some way (e.g. RoboRealm or OpenCV), one trick I have found that works more or less OK to deal with the movement of the camera is to move it in steps, stopping the motion briefly in between steps to take another motion reading, then move it again toward the new target. Depending on the frame rate and field of view of your camera, you can do this fairly smoothly. The wider the field of view, the longer you can wait before having to move the camera again. Also, since the movement of people down your hallway should be fairly predictable, you could do a little trajectory prediction by simply storing the last few movement parameters and extrapolating ahead a few frames.
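The trajectory-prediction part of that could be as simple as storing the last few horizontal positions and extrapolating linearly. A minimal sketch, with made-up history length and look-ahead values:

```python
# Sketch: store recent (frame, x) samples and extrapolate linearly a few
# frames ahead. History length and look-ahead are illustration values.
from collections import deque

class TrajectoryPredictor:
    def __init__(self, history=5):
        self.samples = deque(maxlen=history)  # recent (frame, x) pairs

    def observe(self, frame, x):
        self.samples.append((frame, x))

    def predict(self, frames_ahead=3):
        """Linear extrapolation from the oldest and newest stored samples."""
        if len(self.samples) < 2:
            return None
        (f0, x0), (f1, x1) = self.samples[0], self.samples[-1]
        velocity = (x1 - x0) / (f1 - f0)      # pixels per frame
        return x1 + velocity * frames_ahead
```

Since hallway walkers move at a roughly constant speed, even this crude linear extrapolation should let the head lead the target between motion readings.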


11-29-2010, 10:53 AM
Have you guys ever seen the ultrasonic sensors on the NXT Mindstorms robot? I don't think they act as switches - they output a distance. If I had 3 of them at different angles I could get more than 3 positions - more like 5: left, left-center, center, center-right, right.
Ok, thanks again and good night to you all.

Those NXT ultrasonic sensors are short range - up to 2 feet.
You wanted 2 - 10 feet range, right? Hm, that sounds like a range for Kinect.
You would have to write your own LabVIEW VI or just wait until somebody else does it.