Project Information

Money Pit -- RoboMagellan Rover

Synopsis
I'm converting a 10-year-old RC car (a Nikko humvee truck) to a RoboMagellan competitor for 2013 RoboGames.

The main navigation is a pair of 1080p webcams set 40 degrees apart. Additional sensors and lots of computer power help.
Resources
Created by jwatte | Forum Thread Link
Time to build: 6 months
Weight: 10-20 pounds
Size: 2 feet x 1 foot x 2 feet
Runtime: 1-2 hours
Programming language: C++ (gcc 4.7, avr-gcc 4.6)
Control method: Autonomous C++, plus a wireless e-stop
Power source: LiPo / LiFePO4

Computer power source: 4S LiFePO4 at 10 Ah.
Motor and control power source: 2S LiPo at 2.6 Ah.

Locomotion: Wheel Driven

An old Nikko R/C truck, with all the car parts, radio, and even steering servo ripped out.
Four-wheel drive (two differentials in plastic gearboxes) came with the truck.
Home-brew servo and H-bridge controller.

Controller/CPU: Intel Core i5-2500

A mini-ITX motherboard with 8 GB of RAM, a 30 GB SSD, and an Intel Core i5-2500 provide the main analytical horsepower (not to mention 10 USB ports and a full development environment). A number of homebrew boards based on ATmega328p microcontrollers drive sensors, motors, and interfacing.
The Intel runs Arch Linux (x64 architecture), and the AVR microcontrollers use my own runtime library with support for interrupts, asynchronous tasks, etc.
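The shape of that AVR runtime is basically a 1 ms timer tick plus a cooperative task list. A simplified sketch of that idea (not the library's actual code; compiles with avr-gcc for an ATmega328p at 16 MHz):

    #include <avr/io.h>
    #include <avr/interrupt.h>
    #include <stdint.h>

    typedef void (*TaskFn)();

    struct Task {
        TaskFn   fn;         // function to run
        uint16_t next_ms;    // next deadline, in milliseconds
        uint16_t period_ms;  // re-arm interval
    };

    static volatile uint16_t g_now_ms;
    static Task g_tasks[8];
    static uint8_t g_task_count;

    ISR(TIMER0_COMPA_vect) { ++g_now_ms; }   // 1 ms tick

    void timer_init() {
        TCCR0A = (1 << WGM01);               // CTC mode
        OCR0A = 249;                         // 16 MHz / 64 / 250 = 1 kHz
        TCCR0B = (1 << CS01) | (1 << CS00);  // clk/64 prescaler
        TIMSK0 = (1 << OCIE0A);              // compare-match interrupt
        sei();
    }

    void add_task(TaskFn fn, uint16_t period_ms) {
        Task t; t.fn = fn; t.next_ms = 0; t.period_ms = period_ms;
        g_tasks[g_task_count++] = t;
    }

    void run_forever() {
        for (;;) {
            uint16_t now;
            cli(); now = g_now_ms; sei();    // atomic 16-bit read
            for (uint8_t i = 0; i != g_task_count; ++i) {
                if ((int16_t)(now - g_tasks[i].next_ms) >= 0) {
                    g_tasks[i].fn();         // run, then re-arm
                    g_tasks[i].next_ms = now + g_tasks[i].period_ms;
                }
            }
        }
    }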

Sensors

3x Ping sensors: left flank, right flank, backup
3x Sharp IR sensors: left wheel front, right wheel front, top-down ground sensing (a conversion sketch follows this list)
nRF24L01+ wireless for remote display and e-stop
Pololu MinIMU 9DOF IMU (accelerometer + gyro) for center of gravity (the magnetometer on the same board is unusable because of interference from the motors)
High-mounted SparkFun HMC6352 compass.
GlobalSat BU-353 GPS
Dual 1080p Microsoft LifeCam Studio USB webcams.
Limit switch front contact sensor
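
The Sharp IR sensors report an analog voltage that falls off roughly with the inverse of distance. A simplified ADC-to-centimeters sketch for the ATmega328p (the fit constants are generic GP2Y0A21-style numbers, not calibrated values for this build):

    #include <avr/io.h>
    #include <stdint.h>

    // Read one sample from ADC channel 'ch', AVcc reference.
    static uint16_t adc_read(uint8_t ch) {
        ADMUX  = (1 << REFS0) | (ch & 0x07);
        ADCSRA = (1 << ADEN) | (1 << ADSC) | 0x07;  // enable, start, /128 clock
        while (ADCSRA & (1 << ADSC)) {}             // wait for conversion
        return ADC;
    }

    // Approximate distance in cm. GP2Y0A21-class sensors are roughly
    // inverse-linear over ~10-80 cm; 6787/(raw-3) - 4 is a common fit.
    static int16_t sharp_ir_cm(uint16_t raw) {
        if (raw <= 3) return 80;   // effectively out of range
        int16_t cm = (int16_t)(6787L / (raw - 3)) - 4;
        if (cm > 80) cm = 80;
        if (cm < 10) cm = 10;      // below ~10 cm the response curve folds back
        return cm;
    }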

Actuators

Simple hobby servo for steering; built-in Nikko motors for 4WD propulsion.
Home-built high-amperage MOSFET H-bridge driven by PWM from an AVR microcontroller.
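
The PWM side of that H-bridge is plain Timer1 work on the ATmega328p; gate drive and dead time are a separate (and much hairier) problem that this sketch ignores. Illustrative only, with assumed pin assignments:

    #include <avr/io.h>
    #include <stdint.h>

    void hbridge_init() {
        DDRB |= (1 << PB1) | (1 << PB2);   // OC1A/OC1B as outputs
        // Fast PWM, 8-bit, non-inverting on both channels,
        // clk/8 prescaler: ~7.8 kHz at 16 MHz.
        TCCR1A = (1 << COM1A1) | (1 << COM1B1) | (1 << WGM10);
        TCCR1B = (1 << WGM12) | (1 << CS11];
    }

    // speed: -255..255; the sign picks which leg of the bridge gets PWM.
    void hbridge_set(int16_t speed) {
        if (speed >= 0) {
            OCR1A = (uint8_t)speed;        // forward leg
            OCR1B = 0;
        } else {
            OCR1A = 0;
            OCR1B = (uint8_t)(-speed);     // reverse leg
        }
    }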

Description

The main goal here is to use two webcams as world sensors. The theory is: "If humans can drive using mostly the input from two eyes, then why not robots?"
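
Grabbing frames from both cameras is the boring part; something like this (OpenCV used here purely for illustration, it is not necessarily the capture path in the real application):

    #include <opencv2/opencv.hpp>

    int main() {
        cv::VideoCapture left(0), right(1);   // /dev/video0, /dev/video1
        left.set(CV_CAP_PROP_FRAME_WIDTH, 1920);
        left.set(CV_CAP_PROP_FRAME_HEIGHT, 1080);
        right.set(CV_CAP_PROP_FRAME_WIDTH, 1920);
        right.set(CV_CAP_PROP_FRAME_HEIGHT, 1080);
        cv::Mat l, r;
        while (left.read(l) && right.read(r)) {
            // The two views only partially overlap (40 degrees apart), so
            // each frame is classified on its own rather than as a stereo pair.
        }
        return 0;
    }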

I had initially hoped to get high-resolution GPS input, but given the amount of interference and uncertainty, I am moving away from that and using it mainly for second opinions on heading and velocity (just like the top-mounted compass and the gyro/accelerometer).
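
One simple way to blend those second opinions is a complementary filter: integrate the gyro for fast heading changes and pull slowly toward the absolute heading from the compass or the GPS course. A sketch with an illustrative gain, not tuned values:

    #include <cmath>

    struct HeadingFilter {
        double heading;  // estimated heading, degrees in [0, 360)
        double k;        // small gain, e.g. 0.02: trust in the slow source

        // dt in seconds, gyro_rate in deg/s, absolute in degrees
        // (compass reading or GPS course over ground).
        void update(double dt, double gyro_rate, double absolute) {
            heading += gyro_rate * dt;        // fast path: integrate the gyro
            double err = absolute - heading;
            err -= 360.0 * std::floor((err + 180.0) / 360.0);  // wrap to [-180, 180)
            heading += k * err;               // slow path: pull toward absolute
            heading -= 360.0 * std::floor(heading / 360.0);    // renormalize
        }
    };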

Here is the mount I milled for the cameras:



Here is an example output from the "traffic cone classifier" routine:


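The classifier boils down to color first, shape later: threshold cone-orange in HSV space and keep the largest blob. A simplified sketch of that idea (OpenCV shown for illustration; the HSV bounds and area cutoff are placeholder numbers, not the tuned ones):

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Returns true and fills 'box' if an orange blob big enough to be a
    // cone candidate is found in the frame.
    bool find_cone(const cv::Mat& bgr, cv::Rect& box) {
        cv::Mat hsv, mask;
        cv::cvtColor(bgr, hsv, CV_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(5, 120, 120), cv::Scalar(20, 255, 255), mask);
        cv::erode(mask, mask, cv::Mat());    // knock out speckle noise
        cv::dilate(mask, mask, cv::Mat());

        std::vector<std::vector<cv::Point> > contours;
        cv::findContours(mask, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);

        double best = 200.0;                 // ignore blobs under ~200 px
        bool found = false;
        for (size_t i = 0; i != contours.size(); ++i) {
            double area = cv::contourArea(contours[i]);
            if (area > best) {
                best = area;
                box = cv::boundingRect(contours[i]);
                found = true;
            }
        }
        return found;
    }
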
The main application is written in C++, using FLTK 1.3 for a rough GUI. I develop by simply plugging networking, a display, and a keyboard into the computer itself.
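The GUI really is rough; a bare FLTK 1.3 skeleton of the same general shape (illustrative only):

    #include <FL/Fl.H>
    #include <FL/Fl_Window.H>
    #include <FL/Fl_Box.H>

    int main(int argc, char** argv) {
        Fl_Window win(640, 480, "Money Pit telemetry");
        Fl_Box status(10, 10, 620, 30, "waiting for sensor data...");
        win.end();
        win.show(argc, argv);
        return Fl::run();   // hand control to the FLTK event loop
    }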
The runtime for the computer when idling is over 5 hours on the LiFePO4 pack (4 cells x 3.2 V x 10 Ah = 128 Watt-hours, so roughly 25 W at idle), but compiling or doing analysis draws it down a lot faster.
The runtime for motion is perhaps 30 minutes of continuous driving, and longer when standing still (though there are significant losses in some of the control boards that I'm not yet trying to fix).

I built the LiFePO4 battery pack myself using raw 10 Ah cells and a battery protector/charger circuit board, plus a bunch of labor and soldering.



I'm trying to avoid hard-attaching pieces, so that I can tear the thing apart for debugging, which is needed more often than I'd like.