
Project Information

Xachikoma

Synopsis
I wanted to make a Tachikoma, the cute and deadly tank from Ghost in the Shell: Stand Alone Complex. Lacking the time and resources to make a 100% accurate one, I focused on giving it the exact same movement capabilities.
Resources
Created by Xevel | Forum Thread Link | External Link
500+ hours over 8 months
5-10 pounds
30x30cm footprint; height 43cm (highly variable)
15-30 minutes
Python 2.7, C, C++
Autonomous or WiFi to master PC
Power source: LiFePO4

9.9V 3s1p 1800mAh LiFePO4

Locomotion: Wheel Driven

For most of its life, it has moved using its 4 individually steerable, powered wheels at the ends of its 4-DOF legs. It could also walk, but I haven't implemented that yet :/
The wheels are position- and speed-controlled to provide accurate rolling movement.
The drive system allows for arbitrary holonomic motion.
The legs have a layer of high-level control that maintains the shape of the robot as it moves, using FK and IK.
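To make the holonomic drive idea concrete, here is a minimal sketch of how per-wheel steering angles and rolling speeds can be derived from a desired body twist. The wheel positions and the function itself are illustrative assumptions, not the robot's actual code:

```python
import math

# Hypothetical wheel contact positions in the body frame (meters), square stance.
WHEELS = [(0.15, 0.15), (0.15, -0.15), (-0.15, 0.15), (-0.15, -0.15)]

def wheel_commands(vx, vy, wz):
    """Map a desired body twist (vx, vy in m/s, wz in rad/s) to a list of
    (steering angle in rad, rolling speed in m/s), one per wheel."""
    cmds = []
    for px, py in WHEELS:
        # Ground velocity of the contact point: v + w x p (planar cross product).
        gx = vx - wz * py
        gy = vy + wz * px
        cmds.append((math.atan2(gy, gx), math.hypot(gx, gy)))
    return cmds
```

With all four wheels steered and driven independently like this, any combination of translation and rotation of the body is reachable, which is what makes the platform holonomic.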

The body and legs are driven using IK control, with custom heuristics to resolve the 4th DOF; full-body translation and rotation are implemented to allow Xachi some dance moves :)

The 3-DOF neck is also IK-driven and stabilized relative to the ground, keeping it level and at a fixed height at all times.
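The leveling part can be sketched with plain rotation matrices: if the neck chain applies its joints in the reverse order of the body's tilt, counter-rotating by the negated angles cancels the tilt exactly. The conventions below (body tilt = roll about x composed with pitch about y) are assumptions for illustration, not the actual neck code:

```python
import math

def rot_x(a):  # roll about the x axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # pitch about the y axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def level_neck(body_roll, body_pitch):
    """Neck roll/pitch that cancel the body tilt, assuming the neck chain
    applies its joints in the reverse order of the body's tilt
    (body: roll then pitch -> neck: pitch then roll)."""
    return -body_roll, -body_pitch
```

Since R_head = R_body @ R_neck, choosing R_neck as the reversed, negated tilt makes the product the identity, so the head stays level; the third neck DOF remains free for gaze, and the fixed height is handled by the leg IK.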

Controller/CPU: Beagleboard-xM

ARM Cortex-A8, 1 GHz, 512 MB RAM.
It used to run Ubuntu 10.10 headless for ARM (kernel limited to 800 MHz); it now runs Ubuntu Linaro 12.03 with a customized kernel at full speed.

Sensors

- LIDAR from the Neato Robotics XV-11: 360°, 5 Hz, 1° angular resolution
- PlayStation Eye: USB camera, up to 640x480@60fps and 320x240@120fps (no blurry images!)
- CMUcam3+ running custom color-detection code

Actuators

14 x AX-12+
5 x AX-18F/AX-18A
4 x 12g Metal gear digital hobby servos (to orient the wheels)
4 x 150:1 Pololu Micro Metal Gearmotor HP (to power the wheels)

Description

After years of thinking (obsessing, really) about it, I finally had an opportunity to make a Tachikoma-like robot for the Eurobot contest of 2011.

The Tachikoma are cute, SUV-sized, four-wheeled/legged, spidery-looking robots from the anime Ghost in the Shell: Stand Alone Complex.
Their design is largely inspired by jumping spiders (some may not like that), yet the way they move, their high-pitched voices, and their innocent AI make them probably the most expressive and likable non-humanoid robots I know. Their hybrid locomotion system also makes them remarkably versatile: quick on roads, yet still capable of navigating uneven terrain!
Have a look at some Tachikoma footage with the sound off and you will quickly understand why I think their movements are one of the cornerstones of what makes them likable (cf. the Luxo Jr. effect), and one of the hardest parts to get right. Sound is easy; appearance is harder but has already been done relatively well; yet as soon as it moves, the magic disappears :/

The problem

I chose to try to get at least the biggest part right: the leg movements. I removed the arms and pod to comply with the Eurobot rules.

Designing a kinematically accurate Tachikoma leg is pretty demanding in terms of actuators per leg: in the anime, each leg has a total of 6 DOF, plus a retractable, powered wheel and extendable fingers.
However, one of these DOF is redundant, since it is a rotation shared by two ball joints, and another (the wheel's rotation about the knee-foot axis) does not influence the position of the contact point. The wheels' retractability matters too little kinematically, and is far too demanding in engineering terms, to be included in such a small robot. The same goes for the fingers, which the relatively scarce documentation would have made hard to do right anyway.

So I settled on 4 DOF when computing the inverse kinematics. Better than 6, but still an infinite number of solutions... So I went on an arguably successful quest to put the coolness factor of the Tachikoma's leg motions into equations.

All in all, the leg problem is still not solved (a lot of control software remains to be written before I get the Tachikoma feel I'm desperately looking for), but at least I have a robust platform with the potential to do it.

Hardware description

The legs are made mainly from Dynamixel servos and brackets. Aluminum plates and square tubing, along with ABS and polyamide parts, are added to correctly implement the kinematic model. A nearly complete CAD model of the beast was made to make sure everything would fit.

Each wheel is powered by a small gearmotor through a timing-belt drive, which keeps the overall leg volume compatible with the potential creation of a Tachikoma-looking shell. The wheels have magnetic encoders to sense their position, and I use those to drive them accurately in position and speed with a P position controller and a PD speed controller.
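A cascaded loop of that kind might look like the following sketch, where the P position loop produces a speed reference for the PD speed loop. The gains, units, and class structure are illustrative assumptions, not the values running on the robot:

```python
class WheelController:
    """Cascaded wheel control sketch: a P loop on encoder position feeding
    a PD loop on speed. All gains here are made up for illustration."""

    def __init__(self, kp_pos=4.0, kp_vel=0.8, kd_vel=0.05, dt=0.01):
        self.kp_pos, self.kp_vel, self.kd_vel = kp_pos, kp_vel, kd_vel
        self.dt = dt
        self.prev_vel_err = 0.0

    def step(self, target_pos, pos, vel):
        # P position loop: position error -> desired wheel speed.
        vel_ref = self.kp_pos * (target_pos - pos)
        # PD speed loop: speed error -> motor command (e.g. PWM duty).
        vel_err = vel_ref - vel
        d_err = (vel_err - self.prev_vel_err) / self.dt
        self.prev_vel_err = vel_err
        return self.kp_vel * vel_err + self.kd_vel * d_err
```

The nice property of the cascade is that the same controller serves both use cases: hold a position when the reference is fixed, track a speed when the position reference is ramped.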

The Beagleboard sits snugly between the servos inside the body. The PS Eye camera (without its case) is attached to the "head", which is kept level at all times, ensuring images are always taken from the same point of view whatever the body orientation. On top of the body lie the battery and power-management board on one side of the neck, and the USB hub on the other, with WiFi, Bluetooth, and USB2AX dongles plugged in. Under the body is bolted the LIDAR, which is used to locate obstacles when navigating autonomously. With Hash79 and a few others, we did most of the reverse engineering of the LIDAR's protocol to make it usable outside of its original body.
Another fun fact about the LIDAR: it sits between the legs, and obviously sees them, since they are far enough away to be within its range. By querying the servos for their positions and computing the robot's shape using FK, the software knows where the legs are relative to the LIDAR and can dynamically ignore only the legs. Even if you put an obstacle between the legs, it is detected properly, and the robot is still not afraid of its own limbs.
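The leg-masking idea boils down to a point filter in the LIDAR frame. This sketch assumes a simple (angle, range) scan layout and an invented clearance radius; the real filter works from FK-computed leg positions in the same way:

```python
import math

LEG_MASK_RADIUS = 0.05  # hypothetical clearance around each leg (meters)

def mask_legs(scan, leg_points):
    """Drop only the LIDAR returns that fall near the robot's own legs.
    `scan` is a list of (angle_rad, range_m) returns in the LIDAR frame;
    `leg_points` are leg positions in that same frame, as produced by FK
    from the servo positions."""
    kept = []
    for theta, r in scan:
        x, y = r * math.cos(theta), r * math.sin(theta)
        if any(math.hypot(x - lx, y - ly) < LEG_MASK_RADIUS
               for lx, ly in leg_points):
            continue  # a self-return from a leg: ignore it
        kept.append((theta, r))
    return kept
```

Because the mask follows the legs as they move, an obstacle sitting between the legs is still reported while the legs themselves never are.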

On the electronics front, this project was the occasion to create the USB2AX, a small USB-to-Dynamixel interface.
I also made custom Dynamixel devices out of modified Pololu Orangutan B-328 boards to control motors and hobby servos from the Dynamixel bus.
A great learning opportunity!

The rest of what I want to say about it (but for which I have no clever title)


As I said, I made it to participate in Eurobot 2011 (I needed help from sponsors), and the constraints of the contest had a big influence on the design.
Results: Creativity Award at the French qualifiers; Design Award at the international finals in Russia. And it still entertains the public at the occasional robot exhibition every now and then.


All in all a fun bot, and still a work in progress:
The software is undergoing a lot of performance optimization, so that servo communication and IK computations run fast enough to make it walk and, most importantly, dance like an artistic ice-skater...
With the performance boost from the latest USB2AX version, the snappier Linux install, and various code optimizations, I can't wait to show what it can REALLY do.

Still nursing the hope of finding the resources to make the next version, one that will move, look, and sound like the real thing... with 50+ actuators... an airsoft gun in place of the guns... one day...


(Maaaaaan, am I that long-winded? I feel there is still so much to say... :s )