
View Full Version : Giskard: a low cost ROS powered robot



richardw347
05-21-2013, 10:26 AM
Hello everyone,

I wanted to introduce myself and my robot; I'm a PhD student at the University of Liverpool and am very lucky to be working with ROS and robots for my thesis. I've been working with robots as a hobby for a few years and I figured the best way to learn ROS was to build an awesome robot!

So, inspired by Maxwell and Pi Robot, I built Giskard:
http://forums.trossenrobotics.com/attachment.php?attachmentid=4749&d=1369147823

I tried to make it as cheaply as possible given that I'm a poor student, so the frame is just a few pieces of wood from my local hardware store :tongue:. The drive system consists of a pair of EMG30 gear motors and a 30A Roboclaw motor driver. For the low-level control I'm using a Red Back Spider Robot Controller (basically just an Arduino Mega) with rosserial to control everything from a netbook. For sensors I've got a Pololu LPY510AL gyro to help with odometry, a Neato laser for 2D localisation and mapping, and an Asus Xtion mounted on a pan-tilt head for 3D perception.
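Since the base is a differential drive, the core of the low-level control is just the conversion between a body velocity command and wheel speeds. A minimal sketch, with made-up placeholder values for the wheel radius and wheel base (not Giskard's real dimensions):

```python
# Differential-drive kinematics: convert a body velocity command
# (v, w) into left/right wheel angular velocities, and back.
# WHEEL_RADIUS and WHEEL_BASE are illustrative placeholders.
WHEEL_RADIUS = 0.05   # metres (assumed)
WHEEL_BASE = 0.30     # metres between wheel centres (assumed)

def body_to_wheels(v, w):
    """v: forward speed (m/s), w: yaw rate (rad/s) -> (wl, wr) in rad/s."""
    v_left = v - w * WHEEL_BASE / 2.0
    v_right = v + w * WHEEL_BASE / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

def wheels_to_body(wl, wr):
    """Inverse: wheel angular velocities back to (v, w)."""
    v_left, v_right = wl * WHEEL_RADIUS, wr * WHEEL_RADIUS
    return (v_left + v_right) / 2.0, (v_right - v_left) / WHEEL_BASE
```

The forward direction feeds the motor driver from a velocity command; the inverse direction is what the odometry uses when integrating encoder readings.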

Early attempts at mapping my bedroom haven't gone so well:
http://forums.trossenrobotics.com/attachment.php?attachmentid=4754&d=1369149490

This is down to the poor odometry performance, which I think is a combination of low encoder resolution (I only get 360 counts per wheel rotation) and the wheels being really stiff rubber, which means they slip a lot, especially on carpets. I've been playing around a lot with calibration to see if I can get any better results, but I haven't had much luck. I'm thinking of getting some Banebots wheels and some higher-resolution encoders to see if that helps.
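For what it's worth, 360 counts per rotation isn't hopeless on paper. A quick back-of-envelope check (the 100 mm wheel diameter here is a guess, not the actual EMG30 wheel size):

```python
import math

# Back-of-envelope linear resolution of a 360-count encoder.
# The 100 mm wheel diameter is an assumed value for illustration.
COUNTS_PER_REV = 360
WHEEL_DIAMETER = 0.100  # metres (assumed)

mm_per_count = math.pi * WHEEL_DIAMETER * 1000 / COUNTS_PER_REV
print(f"{mm_per_count:.2f} mm of travel per encoder count")
# prints: 0.87 mm of travel per encoder count
```

Sub-millimetre resolution per count suggests the wheel slip may be hurting the odometry more than the encoder resolution is.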

Any comments/input would be welcome!

jwatte
05-21-2013, 12:11 PM
Sounds great! I want to see pictures :-)

As a comment, it *should* be possible to get odometry from successive readings of the Neato and/or the Xtion. That's a more challenging SLAM application, of course, so it's probably smart to try grippier wheels first.

CasperH
05-28-2013, 12:47 AM
Could you filter out the wheel slips? Maybe use an accelerometer as an additional reference?

I would also like to see a picture. :veryhappy: You didn't mention the type of wheels you use.
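One crude way to act on the accelerometer idea: compare the acceleration implied by successive encoder speeds against the accelerometer's forward axis, and distrust the encoders when the two disagree. A sketch, with an invented threshold and function name:

```python
# Crude slip detector: if the acceleration implied by the encoders
# disagrees strongly with the accelerometer, flag the sample as slip.
# The 0.5 m/s^2 threshold is an invented tuning value.
SLIP_THRESHOLD = 0.5  # m/s^2

def detect_slip(encoder_speeds, accel_readings, dt):
    """encoder_speeds: wheel-odometry speeds (m/s), one per tick;
    accel_readings: matching forward accelerations (m/s^2);
    dt: time between ticks (s). Returns one slip flag per interval."""
    flags = []
    for i in range(1, len(encoder_speeds)):
        enc_accel = (encoder_speeds[i] - encoder_speeds[i - 1]) / dt
        flags.append(abs(enc_accel - accel_readings[i]) > SLIP_THRESHOLD)
    return flags
```

During flagged intervals, an odometry node could fall back on the gyro/accelerometer rather than integrating the (slipping) encoder counts.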

richardw347
06-05-2013, 04:32 AM
Apologies, my pictures didn't upload the first time; it's fixed now!

I originally had these wheels: http://www.robot-electronics.co.uk/htm/emg30.htm (bottom of the page). The tread is very hard rubber though, so it slips really easily on the carpets and hard-wood floors I've been driving it on. Really fun when driving manually, but not so much for autonomous control :tongue:

I picked up some Banebots wheels (50 Shore), which are really nice; I get loads more traction. I'm currently working on adding an arm to the robot and getting it working with MoveIt!, so I'll post some more pictures and maps when that's finished.

Pi Robot
06-06-2013, 10:51 AM
Awesome looking robot!

CasperH
06-19-2013, 07:28 AM
About the odometric accuracy: for rotation (or rotational velocity), a gyro might be the best way to go.

Given your education, you might like this paper: http://www.valentiniweb.com/Piermo/robotica/doc/borenstein/paper63.pdf It only covers straight-line driving, but it's a start.

Basically, I googled a bit for "encoder", "gyro" and "lateral drift".

martimorta
07-08-2013, 09:14 AM
Hi Richard,
I don't know if you have the odometry fixed at the moment; if not, seeing that you have a gyro, I would recommend integrating the angular velocity in yaw to get the heading and using the encoder information only for the translation.
I would also check the odometry along a measured straight line to see the error due to wheel slippage; if it is big, you could add some kind of tape to the wheels to help them grip the floor better.
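A sketch of that dead-reckoning update, taking the heading from the gyro and the translation from the encoders (the function and variable names are illustrative, not from any ROS package):

```python
import math

def dead_reckon(pose, enc_dist, gyro_rate, dt):
    """Update a 2D pose (x, y, theta): heading from the gyro only,
    distance from the encoders only.
    pose: (x, y, theta); enc_dist: metres travelled this step;
    gyro_rate: yaw rate (rad/s) from the gyro; dt: step time (s)."""
    x, y, theta = pose
    theta += gyro_rate * dt          # heading integrated from the gyro
    x += enc_dist * math.cos(theta)  # translation from encoder distance
    y += enc_dist * math.sin(theta)
    return (x, y, theta)
```

Because wheel slip corrupts heading much faster than distance, letting the gyro own theta tends to keep the map from smearing even when the encoders are imperfect.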

Which computer are you using?

richardw347
07-09-2013, 05:15 PM
Hi! Thanks for the reply. I was intending to use the gyro and still might, but I've moved to the Banebots wheels and they eliminate quite a lot of the slip I was getting, so the gyro will probably only be needed to get that last ounce of accuracy out of the platform.

I haven't had much time to work on the robot as I've had a lot of reviews and meetings related to my PhD. I've got a bit more time now, so hopefully I'll be able to post a more substantial update soon :D

Thanks everyone for the input :D :D

richardw347
07-12-2013, 10:40 AM
Another quick update with pictures :D

I've got the Banebots wheels on the bot and spent a bit of time making a better mount for the Neato laser and making the neck foldable for easier transport.

Here's a side view of the 3D-printed mount for the Neato laser, which gives it a nice, sturdier base:
[attachment 4862]

And a picture of the "neck" folded down:
[attachment 4863]

And a couple extra pictures just for hoots:
[attachments 4860, 4861]

On the software side of things I've been busy trying to lean everything out a bit so it doesn't chew up so much network bandwidth. I'm currently running my robot driver, robot_state_publisher, move_base and the Neato laser driver on a netbook located on the robot, and then I run gmapping on my laptop along with rviz for visualization and navigation goals. The netbook and laptop communicate via a WiFi router.
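For reference, the usual two-machine ROS setup along these lines exports ROS_MASTER_URI and ROS_HOSTNAME on both machines; the hostnames below are placeholders, so substitute your own (or raw IPs if name resolution is flaky):

```shell
# On the netbook (runs the master); "netbook.local" and "laptop.local"
# are placeholder hostnames, not real machine names.
export ROS_MASTER_URI=http://netbook.local:11311
export ROS_HOSTNAME=netbook.local
roscore &

# On the laptop, point at the netbook's master:
export ROS_MASTER_URI=http://netbook.local:11311
export ROS_HOSTNAME=laptop.local
rosrun rviz rviz
```

Both machines must be able to resolve each other's ROS_HOSTNAME, since nodes connect to each other directly, not just through the master.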

I'm able to get everything running together, but it chews up way too much bandwidth, to the point where giving the robot a navigation goal a metre in front of it results in a series of infrequent small movements towards the goal. move_base is complaining intermittently about not having a transform in the cache, which is why I think it's just a network latency issue. I'm in the process of trimming everything down to the essentials to see if I can get better performance out of it!

jwatte
07-12-2013, 11:20 AM
Are you sure you actually have a throughput issue, and not something else, like a latency or frequency issue?

Pi Robot
07-13-2013, 08:10 AM
I'm able to get everything running together, but it chews up way too much bandwidth, to the point where giving the robot a navigation goal a metre in front of it results in a series of infrequent small movements towards the goal. move_base is complaining intermittently about not having a transform in the cache, which is why I think it's just a network latency issue. I'm in the process of trimming everything down to the essentials to see if I can get better performance out of it!

What happens if you take the network out of the loop and send the move_base goal directly on the robot's netbook like this:


$ rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped '{ header: { frame_id: "base_link" }, pose: { position: { x: 1.0, y: 0, z: 0 }, orientation: { x: 0, y: 0, z: 0, w: 1 } } }'


--patrick