Self-navigating rover, and how to ruin an AX-12A



jwatte
08-10-2017, 05:59 PM
Saturday, there's a race in Oakland, with a track marked by a dashed yellow centerline and solid white sidelines.
(You're allowed to go past the sidelines.)
This is indoors, and the track is small enough that GPS is not that useful, so vision it is.

I whacked at my Raspberry Pi code that previously knew how to detect orange cones, to make it detect yellow centerlines.
You can see that I probably need to tune my steering gains here:


https://goo.gl/photos/1cH5JVdPD78MssQ79
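For reference, the steering is basically a few gains on the detected line's offset and heading. A minimal sketch of what I mean (the names and gain values are placeholders for illustration, not my actual code):

import math

# Rough sketch of a proportional steering law; gains and names are made up.
STEER_GAIN = 1.5      # radians of steering per meter of lateral offset (needs tuning!)
HEADING_GAIN = 0.8    # radians of steering per radian of heading error
MAX_STEER = 0.6       # radians, mechanical steering limit

def steering_command(lateral_offset_m, heading_error_rad):
    """Map the detected centerline offset and heading error to a steering angle."""
    steer = STEER_GAIN * lateral_offset_m + HEADING_GAIN * heading_error_rad
    return max(-MAX_STEER, min(MAX_STEER, steer))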

Some example media:

Raspberry Pi camera input:

[attachment 7075]


Processed to find "yellow lines":

[attachment 7074]
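(Roughly how I'd pull the yellow pixels out with OpenCV; the HSV thresholds below are illustrative guesses, not my actual numbers:)

import cv2
import numpy as np

def find_yellow_mask(bgr_frame):
    """Pick out "yellow line" pixels; the HSV thresholds are guesses and need tuning."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    lower = np.array([20, 80, 80], dtype=np.uint8)    # lower hue/sat/val bound for "yellow"
    upper = np.array([35, 255, 255], dtype=np.uint8)  # upper bound
    mask = cv2.inRange(hsv, lower, upper)
    # a little morphology to knock out single-pixel noise
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))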

Re-projected into a virtual "top down" coordinate system:

[attachment 7073]
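(The re-projection is just a homography into a bird's-eye view. A sketch of the idea; the source points and output size are placeholders, since the real numbers depend on the camera mount:)

import cv2
import numpy as np

# Four points forming a trapezoid on the ground plane in the camera image,
# mapped to a rectangle in the top-down view. Placeholder coordinates.
SRC = np.float32([[100, 470], [540, 470], [420, 280], [220, 280]])
DST = np.float32([[220, 480], [420, 480], [420, 0], [220, 0]])
H = cv2.getPerspectiveTransform(SRC, DST)

def to_top_down(image):
    """Warp the camera image (or the yellow-line mask) into the virtual top-down view."""
    return cv2.warpPerspective(image, H, (640, 480))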


So, what about the AX-12A?

Well, when you see the bot slipping so much in the middle, you can kind-of catch the right rear wheel twisting, and then not twisting back.
It turns out, this was the plastic gears in the AX-12A slipping!
I don't know why this happens, because the friction load from the wheels shouldn't be that big: the wheels are only two inches wide and seven inches in diameter, with the steering axis straight above the contact point -- so, at most an inch of moment arm. The rover weighs less than 12 kilos. But perhaps with enough dynamic force -- the wheel catching on something while the other wheels slip -- it could push hard enough to slip the gears?
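Here's the back-of-the-envelope version of that reasoning (the even load split over six wheels and the friction coefficient are assumptions, and the stall torque figure is from memory):

g = 9.8                       # m/s^2
mass = 12.0                   # kg, upper bound on rover weight
wheels = 6
normal_force = mass * g / wheels         # ~20 N per wheel if evenly loaded
mu = 1.0                                 # generous rubber-on-smooth-floor friction
moment_arm = 0.0254                      # m -- at most an inch of scrub
steer_torque = mu * normal_force * moment_arm    # ~0.5 N*m
ax12a_stall = 1.5                        # N*m, roughly the AX-12A stall torque at 12 V
print(steer_torque, "vs", ax12a_stall)   # static case is well inside the servo's rating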

Anyway, I may have to take some MX-64 from Onyx, or perhaps treat this as an opportunity to test out some XM-430...
But not before Saturday! Wish me luck :-)

Hugh
08-13-2017, 08:54 AM
Could it be an angular momentum thing? Does the slipping happen if you spin the wheels in the air with no surface contact?

jwatte
08-13-2017, 11:55 AM
When the wheels spin/turn free, there's no problem with the gears. It needs significant torque to push the gears out of alignment.
It didn't happen yesterday at the races; the surface there was a polished concrete floor, so quite slick, which probably helped prevent the wheels from being pulled sideways.
Also, I had turned down the maximum allowed slew rate for steering, limiting it to 0.15 radians per 1/60th of a second; that might have helped, too!
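(The slew limit is just a clamp on how far the steering target may move per 60 Hz tick; a minimal sketch, with illustrative names rather than my actual code:)

MAX_STEP = 0.15  # radians of steering change allowed per 1/60 s tick

def limit_slew(previous_steer, requested_steer, max_step=MAX_STEP):
    """Move the steering target toward the request, but no faster than max_step per tick."""
    delta = requested_steer - previous_steer
    if delta > max_step:
        delta = max_step
    elif delta < -max_step:
        delta = -max_step
    return previous_steer + delta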

jwatte
10-08-2017, 10:38 PM
For what it's worth, in August, I placed second in the races!
The bot followed the path with pretty much perfect accuracy (except once when it lost sync and had to use its fall-back behavior to re-acquire the track.)
My limitation now ends up being the speed of the rover. The Pololu motors are already at their max speed, and lower gearing will burn out the motors if I drive too hard (I tried it -- uphill was bad!). Thus, I'd need to find or make significantly stronger motors with equally decent encoders; that's a rat hole full of stacked yaks, and luckily I stopped myself before I broke out the razor!

Instead, I got a 1/8-scale R/C car, with a fast sensored brushless motor, a cheap 150 A ESC, and a Pi3/Teensy 3.2 combo. It's quite fast enough! Not as cool as six independently sprung wheels, and it can't turn in place, but we have to make sacrifices for science.

And, to make life more fun, I'm now using a neural network to drive, rather than the traditional computer vision approach.
But, it turns out, I'm not very good at RC driving, so I can't provide good input data to train the network on.
Thus, I feed a recording of my crummy driving into the computer vision algorithm, and train the network on what the vision thinks should be done at each frame!
I also have the option of loading the trained network, and re-training / augmenting with a low learning rate, based on more captured data, which I may try next race day.
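The shape of the training idea, as a Keras-style sketch (not my actual code or model; the file names, image size, and layer sizes are placeholders -- frames are the recorded camera images, cv_steering is what the vision algorithm computed for each frame):

import numpy as np
import tensorflow as tf

frames = np.load("frames.npy")            # e.g. (N, 66, 200, 3) float32 images
cv_steering = np.load("cv_steering.npy")  # (N,) steering angles from the vision code

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 5, strides=2, activation="relu",
                           input_shape=frames.shape[1:]),
    tf.keras.layers.Conv2D(32, 5, strides=2, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),  # predicted steering command
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
model.fit(frames, cv_steering, epochs=10, batch_size=64)

# Re-training / augmenting later: recompile with a low learning rate and
# fit again on the newly captured data.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5), loss="mse")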

Here's a movie (too highly compressed) of what my computer vision algorithm thinks about the track:


https://youtu.be/XmPRetbyfy4

Note that this video is made from only one out of every four input frames, because I capture at 90 fps! (And I run three networks in parallel on three cores on the Pi3 to process at 90 fps, but with 30 ms latency for each.)
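The parallelism is nothing fancy -- roughly this shape, as a sketch with placeholder names (load_network/predict are stand-ins for the real model; each copy takes ~30 ms per frame, but three of them fed round-robin keep up with 90 fps):

import multiprocessing as mp

def load_network():
    # Placeholder: in reality this loads the trained model into this process.
    class Net:
        def predict(self, frame):
            return 0.0  # dummy steering value
    return Net()

def worker(in_q, out_q):
    net = load_network()  # each worker process gets its own copy of the network
    while True:
        item = in_q.get()
        if item is None:
            break
        idx, frame = item
        out_q.put((idx, net.predict(frame)))

if __name__ == "__main__":
    in_q, out_q = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=worker, args=(in_q, out_q)) for _ in range(3)]
    for w in workers:
        w.start()
    # Main loop pushes every captured frame; each result comes back roughly
    # three frame times later, which is the ~30 ms latency mentioned above.
    for idx in range(9):
        in_q.put((idx, "frame-%d" % idx))
    for _ in range(9):
        print(out_q.get())
    for _ in workers:
        in_q.put(None)
    for w in workers:
        w.join()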

Here's a picture of the car. For some reason, the lock nuts for the wheels keep falling off. I'm applying Loctite 242 now, and hoping it will do better:

[attachment 7092]

Using a Teensy 3.2, a Raspberry Pi 3, a 5" touchscreen display, a SainSmart fisheye camera, a Hobby King ESC, a Solar Servo D771 digital servo (quite noisy even when idle!) and a 4266 "OMG 1.6 kW!!!!1!1!!" sensored brushless motor. (Not actually 1.6 kW in any real sense.)
3S battery. The ESC is rated for 2S-3S; the motor is rated for 3S-4S; it's quite fast enough on 3S for now.

tician
10-08-2017, 11:08 PM
RE: locknuts. Have you already forgotten your lesson? They suck. A thick nylon patch on the bolt can work pretty well for one tightening, but nylon lined nuts just do not hold tight.

jwatte
10-11-2017, 11:01 PM
Yeah, I didn't really have a choice here, because I bought a kit made of solid Chinesium! (Actually, mostly aluminum and zinc, and perhaps some bronze bushings?) Those nuts are "lock nuts" only in the sense that they're supposed to lock the wheel onto the mounting hex; they have no nylon inserts or friction patches. They're also very thin, like M12 jam nuts with just a couple of threads of engagement.

Loctite works for now. I also bought spare nuts, because I'm pretty sure I'll lose some. (I've had to backtrack hundreds of feet to find the lost nut more than once already...)

The network trains well; now to actually make it run at speed on the Pi. I've mocked it up and think it should work, but the proof of the pudding is on race day...

TXBDan
11-21-2017, 10:02 PM
That's really cool. I've always been interested in making autonomous cars go fast. I worked on a DARPA Urban Challenge team back in the day, and we took it to VIR. It went a solid 10 mph... ha. I also have a real-life race car, so I'm very interested in vehicle dynamics and such.

The SparkFun AVC thing would be cool.