View Full Version : [Contest Entry] Lunar Rover Robot

08-16-2007, 02:56 PM

Hi all, first I'd like to say I found Trossen's customer service refreshingly helpful :)

Now down to business...

Website with additional info/pics/reports: link (http://www.cise.ufl.edu/%7Ebmouring)

2'x2' 1/4" thick high-impact polycarbonate sheet
8'x1-3/4"x1" 1/8" thick aluminum U-channel
Many 4-40 and 6-32 machine screws and nuts
4x Lynxmotion GHM-04 high-torque gear head motors (link (http://www.trossenrobotics.com/store/p/4260-Gear-Head-Motor-7-2vdc-50-1-175rpm-6mm-shaft-.aspx))
4x Encoders for above motors (not currently used)
4x hubs and 3.5"x1.75" wheels (link (http://www.trossenrobotics.com/store/c/2719-Off-Road.aspx))
2x Devantech SRF10 sonar rangers (link (http://www.trossenrobotics.com/store/p/4872-Devantech-SRF10-Ranger.aspx))
2x Dimension Engineering 5v switching regulators
Dimension Engineering Sabertooth 2x10 motor driver board
BDMicro MavricII Atmega128 dev board
Arcom Viper XScale-based PC/104 development kit
D-Link DCS-900 ethernet-based camera
2x random servos scrounged at the last minute
Generic 4-bit LCD
Altera MAX7064 CPLD (not currently used)
Many rechargeable batteries

Important features: Vision, all computational work is done on the robot, completely autonomous (not remote controlled)

For a course at my university, and for a later robotics competition, I built a rover that could support higher-level image processing on-board (a constraint of the competition), since some of the targets I must pick up (painted blocks) sit on backgrounds that can be the same color as the block itself.

Disclaimer: right now, the vision is very basic, but the capability is there.

I started out by building a rugged base from quarter-inch high-impact polycarbonate (top and bottom) and eighth-inch aluminum U-channel (sides) to form a neat little "motor sandwich." The motors, encoders, hubs, and wheels were purchased together to ensure they worked together with minimal fuss.

The motor driver board I got (at a fairly nice student discount) is quite a piece of kit. It is more than sufficient for my project, as even at stall I'm not going past the capabilities of the board. Couple that with regenerative braking, several built-in protection systems, and the flexibility to control the channels with PWM, analog, or serial, and you have a nice unit indeed. At this point, the Mavric was mounted as well to communicate with the driver board, and simple drive tests were performed. It worked well: powerful and relatively fast (it would either push or run over whatever it came into contact with :D).

Hooked up the two delightful SRF10's to the Mavric's I2C/TWI system and got some simple obstacle avoidance up and running. Scared my roommates' dogs pretty well. Now my hard work wouldn't just ram directly into walls, a major plus.

At this point, I started working on a vision system. I had the DCS-900 lying around (I had an opportunity to get a new one for a song, so I couldn't pass that up), and through playing around with it I discovered that making an HTTP request to http://IP_OF_CAMERA/video.cgi would return a video stream of some sort. After dumping a portion of the stream, I realized it was an MJPEG stream: a JPEG with a small header and footer, stacked directly on top of another such JPEG, about 30 times a second. After doing some research into the JPEG standard, I was able to locate and strip JPEGs out of the stream manually after a dump. Time to make computers do what they do best: no, not downloading questionable material :rolleyes:, but doing repetitive tasks quickly.

The Arcom Viper board, donated by Arcom (thanks again; without this board I wouldn't have been able to do the vision as easily!), runs a stripped-down Linux install. The first task was to put together an environment for building binaries that would run on this ARM-based system. The dev kit install CD took care of much of this; however, I wanted a nice IDE with debugging capability, so I hooked Eclipse (a fine IDE that is unfortunately targeted at Java ;)) up to the arm-linux-gcc binaries installed by the dev kit, and built gdbserver so my dev machine could connect to the board over an SSH link for debugging. I solved a few nagging issues this way. Many thanks to Arcom, Eclipse, the CDT plugin group, and Theo de Raadt of OpenSSH (and OpenBSD) fame (even if he can be a bit standoffish at times).

At this point, my attention turned to the documentation for the Independent JPEG Group's (in)famous libjpeg. It's quite a nice library as libraries go, and it even provides an interface for writing so-called decompression source (or compression destination) managers. The default source manager included in the library reads the compressed JPEG from a file.

Rather than writing each received JPEG to a file and then reading it back in for decompression (terribly slow, and it would shorten the life of the built-in flash storage), I wrote a decompression source manager that makes an HTTP request, continuously receives the MJPEG data, parses out a full JPEG, checks (via a mutex-protected flag) whether the decompression engine needs a new JPEG, and if so loads up a mutex-protected shared buffer. The back end took some work to get right (a few over-protective sections would try to obtain the lock twice, simply a result of working too late), but in the end it worked quite well. I ended up able to decompress, lightly filter, and process (only a simple "distance from color" filter plus horizontal and vertical histograms of the filtered image) about 15 images a second while consuming only ~15% of the Viper's resources. More than adequate for my needs.

I would then take the processing results (an X,Y coordinate of the block's location) and send them over a serial link to the Mavric board, which uses this data, along with the sonar readings, to track the block while avoiding obstacles (and to reacquire the block after avoiding them). It works well for now, but there is still work to be done to get the system where it needs to be for the competition. Fortunately the competition isn't for quite a few months, and come fall I'll have other students to help me.

I'll try to get a video of my robot running and post it ASAP.


08-22-2007, 10:05 AM
Awesome project Brad! It's great to hear about all the support that you and your team found along the way. Sometimes that can be a project in itself.

We're excited to see this video!

Thanks for the submission:)