
View Full Version : Neato + ROS!



lnxfergy
12-14-2010, 01:42 PM
Ok, I've now checked the initial ROS driver for the Neato into our repository at SUNY Albany. This currently gives you control of the motors (through a cmd_vel Twist message) and access to the laser. I've also created a package with config files for the navigation stack (although additional work is going to be needed to get it fully stable).

EDIT: Helps if I include where our repository is! http://albany-ros-pkg.googlecode.com -- it's in the trunk/neato_robot stack.

What we have is:


neato_driver - package with a generic Python-based driver
neato_node - package with a node wrapper around the generic driver -- subscribes to cmd_vel, publishes laser and odometry information
neato_urdf - placeholder for a URDF model (still in progress).
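
Driving via cmd_vel just means the node turns a Twist's linear.x/angular.z into left and right wheel speeds. Roughly, the conversion looks like this (the function name and the wheel separation are illustrative placeholders, not the actual neato_driver code or geometry):

```python
def twist_to_wheel_speeds(linear_x, angular_z, base_width=0.25):
    """Convert a Twist (m/s, rad/s) into (left, right) wheel speeds in m/s.

    base_width is the wheel separation in meters -- 0.25 is a made-up
    placeholder, not the Neato's actual measurement.
    """
    left = linear_x - angular_z * base_width / 2.0
    right = linear_x + angular_z * base_width / 2.0
    return left, right
```

Pure forward motion gives equal wheel speeds; pure rotation gives equal and opposite ones.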

The original video showing this roll around is here:

http://www.youtube.com/watch?v=bJVEFlbuFO4
And here's a new short video showing navigation at work:

http://www.youtube.com/watch?v=bWwwssBjg6M

More complete documentation should be on the ROS wiki later this evening.

-Fergs

Psykoman
12-14-2010, 02:01 PM
Real cool. Nice work

lnxfergy
12-14-2010, 02:05 PM
I've posted some getting started info on the front page of our wiki: http://code.google.com/p/albany-ros-pkg/

Also, I've attached the map here that I used:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2241&stc=1&d=1292357008

This was generated using gmapping, with quicker updates and some more constrained odometry parameters than usual:



rosrun gmapping slam_gmapping scan:=base_scan _srr:=0.001 _srt:=0.001 _str:=0.000001 _stt:=0.000001 _linearUpdate:=0.5 _angularUpdate:=0.4

Eventually, I'd like to figure out what parameters tune the beam likelihood correctly, but I haven't found them yet....

-Fergs

lnxfergy
12-14-2010, 02:12 PM
And my official announcement to the ROS user-list (just in case I mentioned something there I forgot to here): http://www.ros.org/news/2010/12/neato-xv-11-driver-for-ros-albany-ros-pkg.html

-Fergs

Xevel
12-14-2010, 02:20 PM
Great work!
Looks like it could put this (http://robodynamics.com/2010/12/200-prize-for-the-first-person-to-use-microsoft-kinect-with-slam/) to an end :)

Pi Robot
12-14-2010, 03:38 PM
Dang, did Fergs just exceed the speed of light? ;) Really nice job!!

--patrick

RobotAtlas
12-14-2010, 04:30 PM
While Fergs is not an ordinary guy by any means,
I think most of the speed comes from the fact that
a lot of ROS stuff got reused on a different robot.
Instead of SSDD we have SRDR - Same ROS, Different Robot.

What do you say, Fergs?

lnxfergy
12-14-2010, 06:55 PM
Documentation is now mostly up to date on the ROS wiki: http://www.ros.org/wiki/neato_robot

-Fergs

Nammo
12-15-2010, 01:21 AM
I don't know how you do it. This is a really amazing accomplishment in such a short time.

I think this is a strong argument for learning ROS. Point and click navigation in an unstructured environment is a holy grail, and here it is for $399.

A million rep points to Fergs!

- Nammo

lnxfergy
12-23-2010, 01:33 AM
I've been playing with integrating CoreSLAM into ROS this afternoon. I've been intrigued by this approach because it uses quite a different form of likelihood for localization, which might prove to work a lot better for lower cost/range lasers. I've still got some work for localization and map saving, but here's an early render:

http://forums.trossenrobotics.com/attachment.php?attachmentid=2263&stc=1&d=1293089483

More details are in this blog post (http://www.showusyoursensors.com/2010/12/alternative-slam.html).

-Fergs

hash79
12-23-2010, 01:57 AM
Fergs,

I'm on the ROS website right now reading... Hopefully I will have my XV-11 up and running with it soon so I can post some results as well. Yours look very good!

-Hash

lnxfergy
12-25-2010, 04:38 PM
I've been working a bit more on some low-cost SLAM. I'm hoping to have the CoreSLAM wrappers completely done and uploaded to our subversion server in a few more days -- it's about 90% of the way there:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2299&stc=1&d=1293315622

CoreSLAM's map storage methods seem as though they may be better suited for localization using low-cost lasers. Also interesting, most of the code involves only integer calculations, so it runs in realtime on a typical netbook *using EVERY scan* (unlike gmapping, which uses only one scan per meter of travel). (to read more about CoreSLAM, see openslam.org)
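
The integer-only flavor of CoreSLAM's matching can be sketched with a toy scoring function: the map is a grid of 16-bit integers where low values mean "obstacle here," and a candidate pose is scored by summing the map values under each scan endpoint, lower being a better fit. This is a hedged illustration of the idea only, not CoreSLAM's actual data structures or API:

```python
import math

def score_pose(grid, pose, scan, resolution=1.0):
    """Score a candidate (x, y, theta) pose against an integer grid map.

    scan is a list of (bearing_rad, range) pairs; endpoints landing on
    low-valued (obstacle) cells lower the sum, so lower scores fit better.
    """
    x, y, theta = pose
    total = 0
    for angle, dist in scan:
        gx = int((x + dist * math.cos(theta + angle)) / resolution)
        gy = int((y + dist * math.sin(theta + angle)) / resolution)
        if 0 <= gy < len(grid) and 0 <= gx < len(grid[0]):
            total += grid[gy][gx]
        else:
            total += 65535  # off-map endpoints are penalized
    return total
```

Since the inner loop is all integer adds and lookups, evaluating many candidate poses per scan stays cheap, which is why every scan can be used in realtime.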

I also collected a new dataset the other day with the Neato XV-11 and laser, plus a second Hokuyo laser mounted directly on top (datafile is 2010-12-23-double-laser.bag). This means that the XV-11 laser and the Hokuyo are pretty well aligned, allowing analysis of when each laser is better, and testing SLAM algorithms against two types of laser (with the exact same odometry). Data collection platform looks like this:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2300&stc=1&d=1293315687

An example of the overlap, red is the neato laser, white is the Hokuyo laser:

http://forums.trossenrobotics.com/attachment.php?attachmentid=2301&stc=1&d=1293316547

Finally, I've uploaded a number of datasets I've collected over the past semester on the Neato, Create, and my Armadillo robot to http://www.fergy.me/slam/ (Please note that downloads might be a bit slow; the server is not the fastest link in the world). I've also uploaded a number of maps and notes on how the maps were generated (note that I've trimmed most of the maps for storage purposes). Everyone is free to use these datasets for whatever they want -- however, *please* post back maps and the algorithms/parameters used to create them. Over time I would like to develop a set of best-known algorithms/parameters for low-cost SLAM using our commonly available platforms.

-Fergs

hash79
12-29-2010, 10:34 PM
Fergs,

I have been reading about ROS and setting it up in an Ubuntu VM on my Mac laptop. I think I have everything running and I am able to get laser data plotted in Rviz on a grid. I am connected via USB right now so it should be identical to your setup. To move your XV-11 are you strictly using the playstation controller? I noticed the odometry plugin is red and says it has not received any data. Is there a way I can control the robot without using a joystick? Perhaps via keyboard input?

What is involved in interfacing the XV-11 with Rviz so we can control it via that interface? I am continuing to read the wiki and work on this, just thought I would ask before 3am rolls around! :) Forgive my lack of terminology, still trying to wrap my mind around it!

-Hash

hash79
12-30-2010, 03:34 AM
So I have learned quite a bit by reading the ROS documentation, but I must still be missing something. I have my static map loaded (currently your map) and have set up my pose. I followed the tutorial "Setting up rviz for the Navigation Stack" and created all of the entries shown in the video.

My XV-11 does not move though. I set the Pose Estimate, then set a Nav Goal, and nothing happens. I am assuming communication to the XV-11 is fine since I can receive the LIDAR data. The LIDAR data does not react to the environment as seen in the Navigation Stack tutorial video, so I am assuming something that ROS needs, or some link between the data, is not correct. Any ideas?

-Hash

hash79
12-30-2010, 05:44 AM
Current Status... Must sleep!! See the image, everything looks right, but my XV-11 does not move...

-Hash

lnxfergy
12-30-2010, 09:45 AM
I noticed the odometry plugin is red and says it has not received any data.
I have never used the odometry plug-in, I'll have to take a look at this. TF transforms should do 99% of what you want for visualizations.


Is there a way I can control the robot without using a joystick? Perhaps via keyboard input?

I'd recommend using the "teleop_twist_keyboard" package from the Brown-ros-pkg repository.


So I have learned quite a bit by reading the ROS documentation, but I must still be missing something. I have my static map loaded (currently your map) and have set up my pose. I followed the tutorial "Setting up rviz for the Navigation Stack" and created all of the entries shown in the video.

My XV-11 does not move though. I set the Pose Estimate, then set a Nav Goal, and nothing happens. I am assuming communication to the XV-11 is fine since I can receive the LIDAR data. The LIDAR data does not react to the environment as seen in the Navigation Stack tutorial video, so I am assuming something that ROS needs, or some link between the data, is not correct. Any ideas?

You'll have to use the move_base configurations and launch files from the neato_slam package to get navigation working -- note that I'll be moving this into a neato_2dnav package very shortly as I've recently found this to be poorly named.

Also, you probably need to create your own map. AMCL won't be able to localize your robot within my map, and will probably crash because of it. Without the /map->/odom transform from AMCL, move_base will stop working.
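
To make the transform chain concrete: AMCL supplies /map->/odom and the driver supplies /odom->/base_link, and composing the two gives the robot's pose in the map. A plain-math 2D sketch of that composition (no tf involved; the numbers are just an example):

```python
import math

def compose(a, b):
    """Compose two 2D transforms (x, y, theta): apply b in a's frame."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

# map->odom (what AMCL publishes) composed with odom->base_link (from the
# driver's odometry) yields map->base_link, the robot's pose in the map.
map_to_odom = (1.0, 0.0, math.pi / 2)
odom_to_base = (2.0, 0.0, 0.0)
map_to_base = compose(map_to_odom, odom_to_base)
```

When AMCL dies, the first transform disappears and move_base loses its global pose, which is exactly the failure described above.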

-Fergs

hash79
12-30-2010, 02:27 PM
Fergs,

Thanks for the tips. I have been controlling the XV-11 using teleop_twist_keyboard and if I slow down the initial speed settings the robot will move around. I then ran the "rosrun gmapping slam_gmapping scan:=base_scan _srr:=0.001 _srt:=0.001 _str:=0.000001 _stt:=0.000001 _linearUpdate:=0.5 _angularUpdate=0.4" command and started to create a map of a closed off hallway.

I get weird results though. It seems like the left and right turn are swapped in the software. I say this because if I move the XV-11 only forward and backward it seems to map an area properly. But once I turn the robot it wipes out any fixed walls and every area turns into one big open space. Driving around in a large area also makes it VERY obvious that I can not close my map, and am nowhere near closing it.

Think this is something related to the teleop keyboard, or just a setting I am confusing?

-Hash

lnxfergy
12-30-2010, 02:33 PM
Fergs,

Thanks for the tips. I have been controlling the XV-11 using teleop_twist_keyboard and if I slow down the initial speed settings the robot will move around. I then ran the "rosrun gmapping slam_gmapping scan:=base_scan _srr:=0.001 _srt:=0.001 _str:=0.000001 _stt:=0.000001 _linearUpdate:=0.5 _angularUpdate=0.4" command and started to create a map of a closed off hallway.

I get weird results though. It seems like the left and right turn are swapped in the software. I say this because if I move the XV-11 only forward and backward it seems to map an area properly. But once I turn the robot it wipes out any fixed walls and every area turns into one big open space. Driving around in a large area also makes it VERY obvious that I can not close my map, and am nowhere near closing it.

Think this is something related to the teleop keyboard, or just a setting I am confusing?

-Hash

Not sure how fast you were moving, but the gmapping scanmatcher has some issues if you turn quickly. All in all, gmapping has serious issues with low-end lasers (even the $1200 Hokuyo), and I don't think any of us has found a set of parameters that really works well for all situations.

-Fergs

hash79
12-30-2010, 05:13 PM
What would your suggestion be for creating a map of, say, a 12 x 12 foot space? Just measure it manually and create a base image to load in? Or should I just try to move forward and backward in a space and get the best map I can without turning? My turn speed was slow I thought, about one rotation every 15-20 seconds.

Are there other mapping utilities besides gmapping I should give a try? Or are we basically where no open source projects have gone before? :)

Definitely gives me an appreciation for what Neato has done...

-Hash

lnxfergy
12-30-2010, 07:50 PM
What would your suggestion be for creating a map of, say, a 12 x 12 foot space? Just measure it manually and create a base image to load in? Or should I just try to move forward and backward in a space and get the best map I can without turning? My turn speed was slow I thought, about one rotation every 15-20 seconds.

Are there other mapping utilities besides gmapping I should give a try? Or are we basically where no open source projects have gone before? :)

Definitely gives me an appreciation for what Neato has done...

-Hash

I assume you could create an image by hand, although, in such a small space you should be able to get a decent map from gmapping.

One of the problems with low-cost/short-range lasers is that most labs don't use them -- so available, off-the-shelf SLAM doesn't work so well on them. For instance, there are very few papers that reference the low end Hokuyo, the two that I'm aware of are:


Stachniss et al. 2008, How To Learn Accurate Grid Maps With a Humanoid -- discusses some improvements for gmapping that will especially help in long hallways and open spaces.
Steux et al. 2010, CoreSLAM: A SLAM Algorithm in less than 200 Lines of C Code -- presents an interesting approach to mapping, which is developed on the Hokuyo and shows good results with the Neato -- I'm still working on a few items for a ROS wrapper around CoreSLAM (see my blog for some more recent images of the CoreSLAM-generated maps).

Another consideration is that the beam likelihood models found in typical approaches (like gmapping/karto) assume a laser quite a bit different from the Neato's -- there's probably a lot of room for improvement right there in a new beam likelihood model for gmapping. I plan to eventually look at this -- but I just haven't found time.
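
For concreteness, the kind of beam likelihood being discussed is typically a Gaussian centered on the expected range plus a uniform noise floor; a hedged sketch, with parameters that are illustrative defaults rather than gmapping's actual values:

```python
import math

def beam_likelihood(measured, expected, sigma=0.05, z_hit=0.9,
                    z_rand=0.1, max_range=5.0):
    """Simplified two-component beam model: Gaussian 'hit' + uniform noise.

    A noisier, shorter-range laser like the Neato's would want a larger
    sigma and a heavier uniform component than values tuned for
    SICK/Hokuyo-class sensors.
    """
    hit = (math.exp(-0.5 * ((measured - expected) / sigma) ** 2)
           / (sigma * math.sqrt(2 * math.pi)))
    rand = 1.0 / max_range
    return z_hit * hit + z_rand * rand
```

With a tight sigma, a reading half a meter off the expected range is scored almost entirely by the noise floor, which is why a model tuned for precise lasers punishes the Neato's scatter so hard.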

-Fergs

hash79
12-30-2010, 10:28 PM
Fergs,

I think I know what is wrong with my setup, although not sure how to fix it yet. It appears that my laser data is in one orientation and the data returned from gmapping that is visualized in RVIZ is in another orientation. They look 180 degrees off. If I drive forward and back in an enclosed room I get a flipped version of the room on the screen. The moment I rotate while near a wall then the map of the room is inaccurate since it thinks a wall is suddenly an open space.

-Hash

lnxfergy
12-31-2010, 09:03 AM
Fergs,

I think I know what is wrong with my setup, although not sure how to fix it yet. It appears that my laser data is in one orientation and the data returned from gmapping that is visualized in RVIZ is in another orientation. They look 180 degrees off. If I drive forward and back in an enclosed room I get a flipped version of the room on the screen. The moment I rotate while near a wall then the map of the room is inaccurate since it thinks a wall is suddenly an open space.

-Hash

Could you record a short bag file of your drive ("rosbag record -a"), and then PM me for an email address to send it to? I can take a look at it then and see if something is off.

-Fergs

lnxfergy
12-31-2010, 02:21 PM
I've now got the CoreSLAM package building correctly, so it's now in full release. You can see the documentation here: http://www.ros.org/wiki/coreslam.

A few example maps:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2307&stc=1&d=1293826782

http://forums.trossenrobotics.com/attachment.php?attachmentid=2308&stc=1&d=1293826782

Peter_heim
12-31-2010, 05:27 PM
Hi Fergs
I tried the core slam package with PML it builds fine but when i turn on the PML i get this error

[ INFO] [1293837851.057421445]: Initialized with sigma_xy=0.100000, sigma_theta=0.350000, hole_width=0.600000, delta=0.050000
[ INFO] [1293837851.057600094]: Initialization complete
/opt/ros/cturtle/ros/bin/rosrun: line 35:  4274 Floating point exception $exepath "$@"

regards Peter

lnxfergy
12-31-2010, 06:13 PM
Hi Fergs
I tried the core slam package with PML it builds fine but when i turn on the PML i get this error

regards Peter

Peter, I just committed an update to patch a last minute bug that I must have introduced while doing updates for parameters to be in meters/radians. Do an "svn update" and re-make, and you should be in business.

Also, you may want to turn down the sigma_xy parameter, the 0.1 is the default from the authors of CoreSLAM, and it works pretty well with the Hokuyo URG-04LX-UG01, but with the PML, it might be too much. The maps from a PML are a bit better from CoreSLAM than gmapping, but I can't say they are that great yet:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2317&stc=1&d=1293839938

-Fergs

hash79
12-31-2010, 08:25 PM
Fergs,

Check out the attached image of when I initially started gmapping. You can see the white dots of the laser scanner and the black lines of the room look mirrored. I have messed with every setting I can find that looks user adjustable and same results.

Here is a link to my bag file. Drove forward then backward, then rotated to the left 180 degrees and went backward/forward again.

http://dl.dropbox.com/u/1828805/2010-12-31-18-13-03.zip

-Hash

lnxfergy
12-31-2010, 09:17 PM
Ok, the reason I previously didn't notice this was that my slam_gmapping had a small hack to work with the PML (which publishes scans which alternate in direction). I've now checked out the standard "cturtle" tag again, and found the differences.

For now, you can get gmapping to work by applying the following patch:

339c339
< gsp_laser_angle_increment_ = (angle_max - angle_min) / scan.ranges.size();
---
> gsp_laser_angle_increment_ = scan.angle_increment;

to slam_gmapping.cpp. This uses the angle_increment from the laser scan, rather than the computed one, which is incorrect for full-rotation scans. This would affect the scan-inversion check during scan processing -- as well as the scan matching. Actually, this also explains a number of problems I've had getting gmapping to scan match at all with the Neato laser.
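
The discrepancy is easy to see numerically. For a full-rotation scan the last beam sits one increment short of 2*pi, so dividing the (angle_max - angle_min) span by N under-estimates the per-beam increment, and the error accumulates to a full beam's worth over one scan (the 360-beam, 1-degree layout below is the Neato's nominal geometry; the exact angle_min/angle_max values are assumed for illustration):

```python
import math

n_beams = 360
angle_min = 0.0
angle_max = math.radians(359.0)       # assumed: last beam at 359 degrees
true_increment = math.radians(1.0)    # what scan.angle_increment reports

computed = (angle_max - angle_min) / n_beams       # the buggy formula
accumulated_error = n_beams * (true_increment - computed)
# accumulated_error works out to one full degree per revolution, enough
# to skew every scan the matcher sees.
```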

I'm not sure of a more permanent fix, but I'll see if there's anything we can do inside the neato laser specification. If not, we'll have to talk to gmapping developers about an upstream patch (however, it seems likely to be a very nasty fix, since it's very different use case).

-Fergs

hash79
12-31-2010, 11:09 PM
Fergs,

I made the first change you posted, changed it back, and now made this change. The orientation looks correct now, but performance during rotation is still poor, although not as bad. The room I am in is carpeted, so this may be the cause; I am going to test in a room with hardwood floors in a moment.

I still have the second issue of my 2D Nav Goal not working properly. I loaded in my new map of the room and set my 2D Pose Estimate, but when I set a 2D Nav Goal nothing happens. I get a red arrow on my screen but the robot just sits there...

-Hash

p.s. Thanks for all the help! This has been keeping me up all night lately, super excited to be playing around with some sweet robot software! :)

hash79
12-31-2010, 11:43 PM
Fergs,

I take it back... In the living room now with a lot more detail, and it is definitely still mirrored. Scanning through the gmapping code I saw comments about discarding short-range scans and keeping the scans above a certain distance. It seems with the Neato this should be reversed. Short scans will have greater accuracy; the farther I get from a wall, the more I can see gmapping struggle. Perhaps I should give coreslam a shot...

-Hash

lnxfergy
12-31-2010, 11:58 PM
Fergs,

I take it back... In the living room now with a lot more detail, and it is definitely still mirrored. Scanning through the gmapping code I saw comments about discarding short-range scans and keeping the scans above a certain distance. It seems with the Neato this should be reversed. Short scans will have greater accuracy; the farther I get from a wall, the more I can see gmapping struggle. Perhaps I should give coreslam a shot...

-Hash

I'm not sure, with the patch above applied on a fresh cturtle checkout of gmapping, the map is no longer mirrored here.

-Fergs

hash79
01-01-2011, 12:11 AM
I will check mine out again and start fresh... Might be slowly losing my mind! Going to test coreslam as well.

-Hash

hash79
01-01-2011, 01:59 AM
Fergs,

I installed and tested coreslam and it works great with the Neato! I was able to create a map of my living room by navigating the Neato around it. The only issue I have to resolve is why I can't use the 2D Nav Goal and make the robot navigate to a location. I will work on that next year... ;)

-Hash

lnxfergy
01-01-2011, 08:20 AM
Fergs,

I installed and tested coreslam and it works great with the Neato! I was able to create a map of my living room by navigating the Neato around it. The only issue I have to resolve is why I can't use the 2D Nav Goal and make the robot navigate to a location. I will work on that next year... ;)

-Hash

This is most likely an issue with AMCL -- it seems to crash frequently on me (there are open tickets with similar descriptions, but nobody seems to have found the cause/solution of the crashes).

I'm also not sure how well AMCL will work with CoreSLAM maps -- I'll hopefully be testing that later today and if I find it sucks, I'll try to create an AMCL-like wrapper around the CoreSLAM localization routines.

-Fergs

P.S. -- mind posting an image of your map + the parameters used with CoreSLAM?

hash79
01-01-2011, 03:24 PM
Fergs,

Gladly! I have been playing with parameters to see if I can get some better results. I see what you guys mean now about the range of the laser, accuracy, and existing SLAM algorithms. I am going to read the CoreSLAM paper on how their algorithm works to try and gain a better understanding.

I noticed when navigating around that as I got further from a wall, the inaccuracies of the XV-11 LIDAR caused the SLAM algo to screw up pretty badly. I tried limiting the scan range from 5 meters to 2.5 meters (neato.py scan.range_max) so all of my readings would be accurate like a higher-end laser scanner's. This seemed to work well in small places like the dining room image below. I then changed the hole_width from 0.6 to 0.3. This did not help with drift, but made a sharper picture at least.
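
The same range cutoff can also be done as a post-filter instead of editing the driver: readings past the cutoff get pushed beyond range_max, which downstream consumers treat as "no return." A rough sketch of the idea in plain Python (the function name is made up; this is not an actual ROS node):

```python
def clip_scan(ranges, cutoff, range_max):
    """Mark readings beyond `cutoff` as invalid by pushing them past
    range_max, since values outside [range_min, range_max] are ignored
    by consumers of a LaserScan."""
    return [r if r <= cutoff else range_max + 1.0 for r in ranges]
```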

It seems with CoreSLAM though that as you lose sight of a wall, everything starts to progressively drift and the whole room will be indistinguishable. If I just follow the wall closely and make a lap it looks nothing like the image below. Is this a result of poor odometry readings? How is this improved with higher-end systems, gyros?

-Hash

p.s. How do you feed parameters to coreslam via command line? I couldn't seem to get it to accept anything so I changed the cpp file and just recompiled.

2319
2320

lnxfergy
01-01-2011, 05:21 PM
The way to feed parameters is:

rosrun coreslam slam_coreslam scan:=base_scan _sigma_xy:=0.05 _sigma_theta:=0.1

You need the leading underscore on parameter names, and the := before your values.

Currently, the CoreSLAM wrapper doesn't do the "loop closing" they briefly discuss in the end of the paper, as it needs a cutoff in sensor readings (see the OpenSLAM repository, in the test directory for an example of how that works, currently we don't check out the test directory as it adds about 5 minutes to the build with the large datasets). Presumably, this loop closing step could help, but I'm not sure how you would trigger it within the ROS framework and the current use case.

I'll probably go back and add in the "maxRange" parameter to CoreSLAM (as found in gmapping/karto) so that you don't have to modify the Neato package. I was initially keeping the CoreSLAM wrapper as simple as possible until I knew it was going to work.

-Fergs

hash79
01-01-2011, 09:59 PM
Fergs,

Thanks, that will make testing a little simpler! :)

I plan on integrating my DustBin Computer (modified Chumby) into the neato driver you created. Instead of connecting to the XV-11 via USB, I will connect via wifi to the DustBin Computer that redirects all communication to the USB port. It is a raw connection on any port I choose, currently port 1234. It will definitely be easier to drive around my living room without following it around! :)

I would like to be able to set a flag upon launching the neato_node that selects either wifi or USB. Do you have any input on how I should proceed? My current plan is to just throw a conditional around all of the places you communicate via serial and add the necessary code, then read a flag on launch that selects either USB or wifi. Is there another method you would suggest?

-Hash

lnxfergy
01-01-2011, 10:09 PM
I plan on integrating my DustBin Computer (modified Chumby) into the neato driver you created. Instead of connecting to the XV-11 via USB, I will connect via wifi to the DustBin Computer that redirects all communication to the USB port. It is a raw connection on any port I choose, currently port 1234. It will definitely be easier to drive around my living room without following it around! :)

I would like to be able to set a flag upon launching the neato_node that selects either wifi or USB. Do you have any input on how I should proceed? My current plan is to just throw a conditional around all of the places you communicate via serial and add the necessary code, then read a flag on launch that selects either USB or wifi. Is there another method you would suggest?

What is the Chumby running, Linux right? Have you thought about getting the basic ROS communications working? It might be easier to set up the neato driver running on the Chumby, and leave the roscore/rviz/everything else on your laptop/PC. One reason this might be a good idea is that you won't have to keep re-hacking the Neato driver every time we push an update. It should also make the whole control/lidar-publish loops run much closer to realtime.

Edit: you'd only need rospy, pyserial (I assume you already have this), and the common_msgs stack (for laserscan and twist).

-Fergs

hash79
01-01-2011, 11:25 PM
Fergs,

That is a great idea! I am using the tcp redirector program from the pyserial webpage. I will look into how rospy and common_msgs work tonight.

I also got gmapping reinstalled. The laser data is correct, and performance of the data when rotating looks pretty good with the angular update set to 0.1. I think my odometry data is moving in the opposite direction of the robot however. When I move forward towards a wall I see the /base_link move towards the wall and the /odom move in the opposite direction. Perhaps this is something I can feed in via a parameter? srr or str?

-Hash

lnxfergy
01-01-2011, 11:45 PM
I also got gmapping reinstalled. The laser data is correct, and performance of the data when rotating looks pretty good with the angular update set to 0.1. I think my odometry data is moving in the opposite direction of the robot however. When I move forward towards a wall I see the /base_link move towards the wall and the /odom move in the opposite direction. Perhaps this is something I can feed in via a parameter? srr or str?

If you drive the robot forward, then your robot's base_link is now forward of where it started (which is denoted as the origin of the odom frame). So, it sounds like odometry is working correctly (unless I'm misunderstanding you).

Whichever frame is selected as your fixed frame in RViz (I typically use either /odom or /map) will define which other frames appear to be "moving" in RViz.
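
Numerically it looks like this: integrating the commanded velocity moves base_link steadily away from the odom origin, which is the expected behavior, not a bug. A quick dead-reckoning sketch (the function is a hypothetical illustration, not the driver's code):

```python
import math

def integrate_odometry(pose, v, w, dt):
    """One Euler step of differential-drive dead reckoning.

    pose = (x, y, theta) of base_link in the odom frame;
    v is linear velocity in m/s, w is angular velocity in rad/s.
    """
    x, y, theta = pose
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + w * dt)

pose = (0.0, 0.0, 0.0)            # base_link starts at the odom origin
for _ in range(100):              # drive forward at 0.2 m/s for 10 s
    pose = integrate_odometry(pose, 0.2, 0.0, 0.1)
# base_link is now ~2 m ahead of the odom origin
```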

-Fergs

hash79
01-02-2011, 01:54 AM
Fergs,

Here is a bag file of my test using gmapping. I ran a test using coreslam to test my understanding as well.

As you can see from the bag file, when I rotate left or right staying at the origin everything looks fine. When I start to move towards a wall, /base_link moves away from /map in one direction, and /odom moves from /map in the other direction. When I use coreslam, /map and /odom stay relatively close to each other.

Is /odom being processed incorrectly by gmapping? My Fixed Frame is set to /map and my Target Frame is set to <Fixed Frame>

http://dl.dropbox.com/u/1828805/2011-01-01-23-27-07.bag.gz

-Hash

lnxfergy
01-02-2011, 09:05 AM
I don't think the issue is in the neato driver -- looking at base_link->odom against cmd_vel, it looks pretty decent. The scanmatcher in gmapping is improperly determining the current pose, which is why you see a big divergence in the /odom->/map transform coming out of gmapping.

-Fergs

hash79
01-02-2011, 11:49 AM
It's odd that it works for you, and is backwards for me...

lnxfergy
01-02-2011, 12:55 PM
It's odd that it works for you, and is backwards for me...

What exactly do you think is "backwards"? The base_link->odom is correct (based on the commanded velocity from cmd_vel topic). Gmapping is misaligning the scans, causing the differences seen in map->odom. What are your parameters for gmapping -- I noticed on my machine that default parameters yield a similar issue (the map frame is jumping around as the most likely particle oscillates between one that is fairly correct, and one that seems to outright suck). You'll probably have to hack around with the parameters to yield any decent output with the Neato in such a confined space.

-Fergs

hash79
01-02-2011, 01:11 PM
Fergs,

Well, considering this is the first time I have ever used a SLAM algorithm and a laser scanner I could just not have any clue how this is supposed to work! :)

All I have as a baseline is CoreSLAM. When I use CoreSLAM and rotate, the map holds its shape. When I move forward and backward it also seems to hold its shape. Over time there is a slight rotation of the map and the issue with closing the map, but I attribute that to low-end sensors.

With gmapping I get the same results when rotating: the map looks good and I can discern the area. When I move forward and backward, though, it stretches the map out in that direction. Since I have no real understanding of how it works, it may just be that gmapping will not perform with the settings provided, or with this laser sensor. I have it in a rather large area right now and I get the same results.

-Hash

EDIT: I am currently reading your tutorial "An Introduction to Mapping And Localization" to further educate myself... ;)

hash79
01-03-2011, 01:13 PM
Fergs,

In regards to the Chumby, it is indeed running linux. I have python 2.6 and pyserial on it. You said I will need rospy and the common_msgs stack. I have been reading about this but can't quite get a picture of how everything will communicate.

Is there another project you are aware of I could take a look at to get an idea?
Will I need roslib as well, and does this need to be cross compiled for the ARM processor?

-Hash

lnxfergy
01-03-2011, 01:25 PM
Fergs,

In regards to the Chumby, it is indeed running linux. I have python 2.6 and pyserial on it. You said I will need rospy and the common_msgs stack. I have been reading about this but can't quite get a picture of how everything will communicate.

Is there another project you are aware of I could take a look at to get an idea?
Will I need roslib as well, and does this need to be cross compiled for the ARM processor?

-Hash

ROS is distributed -- you can run separate parts on different machines. I'm not really aware of a project doing specifically that -- but I commonly run things like hardware drivers on a netbook while the more computationally demanding things (move_base) are upstream on an i5 laptop.

You might look at "eros" which is the embedded ROS project for points on cross-compiling.

-Fergs

hash79
01-04-2011, 04:34 AM
Fergs,

I found a much easier route to establish a TCP link between the Chumby/XV-11 and Ubuntu. I run a python serial port redirector on the Chumby which is connected via USB to the XV-11. I then run socat on Ubuntu to redirect a TCP connection from the Chumby to a pty on Ubuntu. The only modification I make to your driver is to change /dev/ttyUSB0 in neato.py to /home/hash/dev/xv11 and it works like a charm!
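
For anyone wanting to replicate this without the pyserial redirector, the Chumby-side job is just shuttling raw bytes between two endpoints. A rough sketch using raw file descriptors (not the actual scripts in use here), so the same pump works whether the ends are a TCP socket, a pty, or a serial fd:

```python
import os
import threading

def pump(read_fd, write_fd, bufsize=4096):
    """Copy bytes from read_fd to write_fd until EOF."""
    while True:
        data = os.read(read_fd, bufsize)
        if not data:
            break
        os.write(write_fd, data)

def bridge(fd_a, fd_b):
    """Full-duplex bridge: pump each direction, one in a helper thread."""
    t = threading.Thread(target=pump, args=(fd_b, fd_a), daemon=True)
    t.start()
    pump(fd_a, fd_b)
```

On the Ubuntu side, socat then exposes the TCP stream as a pty (e.g. the /home/hash/dev/xv11 link above), so the unmodified driver just opens what looks like a serial port.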

I will post full details on my blog this week.

-Hash

lnxfergy
01-04-2011, 12:52 PM
The only modification I make to your driver is to change /dev/ttyUSB0 in neato.py to /home/hash/dev/xv11 and it works like a charm!

Just FYI, you can change the port using the parameter "port".

-Fergs

hash79
01-04-2011, 03:37 PM
Fergs,

Perfect...

When you are using higher end laser scanners how many points are generally being returned for a given view angle? With the Neato we have 360 points for the full 360 degree view. Are the SLAM algorithms being used usually accustomed to a denser point cloud?

Regarding wheel encoders is there a method I can use for calibration? Over time my odometry seems to drift, can I compensate for that in some way?

Do you have any experience interfacing accelerometer/gyro data into ROS to improve odometry data?

-Hash

EDIT: Would you mind posting the RViz configuration you are using? I would like to make sure I am testing with the same setup.

Nammo
01-05-2011, 02:39 PM
Regarding wheel encoders is there a method I can use for calibration? Over time my odometry seems to drift, can I compensate for that in some way?

Isn't that the "Simultaneous" part of SLAM? I'm not versed in ROS, but I thought SLAM was usually used to correct the inevitable odometry drift while mapping.

I suspect that an accelerometer alone can't really improve odometry performance when fused with encoders.

I believe with a gyroscope, accelerometer, and encoders it is possible to reduce drift somewhat. This especially compensates for the typical "slippery wheel" conditions that occur during sudden maneuvers such as turns and start/stops. They do not help much if you are continuously slipping for a long time, such as when the wheels have poor traction on deep carpet or whatever.

- Nammo

lnxfergy
01-05-2011, 03:37 PM
When you are using higher end laser scanners how many points are generally being returned for a given view angle? With the Neato we have 360 points for the full 360 degree view. Are the SLAM algorithms being used usually accustomed to a denser point cloud?

A more typical scanning laser, like a Hokuyo or SICK, gives 3-5 readings per degree. Certainly, fewer beams in a scan means that scans are less distinct, but I think a bigger issue is the accuracy of each beam. The error over the entire working range of a Hokuyo UTM-30 (as found in the PR2, ~$6k a pop) is about 1%, if I recall.
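Putting rough numbers on that (nominal spec-sheet figures, quoted from memory -- the UTM-30LX nominally returns 1080 beams over a 270-degree window):

```python
# Back-of-envelope angular resolution comparison; all figures are nominal.
neato_beams_per_deg = 360 / 360.0    # XV-11: ~1 reading per degree
utm30_beams_per_deg = 1080 / 270.0   # Hokuyo UTM-30LX: ~4 readings per degree
print(neato_beams_per_deg, utm30_beams_per_deg)
```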


Regarding wheel encoders is there a method I can use for calibration? Over time my odometry seems to drift, can I compensate for that in some way?

Effectively, the SLAM algorithms attempt to do just that: given a noisy odometry reading + noisy sensory, what is the most likely pose? The accuracy with which the pose can be ascertained leads to a good or a bad map.


Do you have any experience interfacing accelerometer/gyro data into ROS to improve odometry data?

It's not something I've done, but the "robot_pose_ekf" package can merge odom and imu data for a more reliable odometry output.


EDIT: Would you mind posting your RViz configuration you are using? I would like to make sure I am testing with the same setup.

I usually use "odom" as my fixed frame and load up LaserScan/Map/etc. displays as needed.

-Fergs

lnxfergy
01-05-2011, 03:39 PM
I suspect that an accelerometer alone can't really improve odometry performance when fused with encoders.

I believe with a gyroscope, accelerometer, and encoders it is possible to reduce drift somewhat. This especially compensates for the typical "slippery wheel" conditions that occur during sudden maneuvers such as turns and start/stops. They do not help much if you are continuously slipping for a long time, such as when the wheels have poor traction on deep carpet or whatever.

See also my above response, but I wanted to add that the main use of a gyro is to reduce the "rotational" error in the odometry, since that's typically far worse than translational error. The accelerometer is needed to cancel drift found in the gyro.
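As a toy illustration of that split (the gain and rates here are entirely made up): a complementary filter lets the gyro dominate short-term heading while a slower, unbiased source reins in its drift:

```python
def fuse_yaw(yaw, gyro_rate, other_yaw, dt, alpha=0.98):
    """Blend an integrated gyro rate with a second (slow but drift-free)
    yaw estimate; alpha near 1 trusts the gyro on short timescales."""
    predicted = yaw + gyro_rate * dt                     # short-term: gyro
    return alpha * predicted + (1 - alpha) * other_yaw   # long-term: other source
```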

-Fergs

hash79
01-05-2011, 08:04 PM
A more typical scanning laser, like a Hokuyo or SICK, typically gives 3-5 readings per degree. Certainly less beams in a scan means that scans are less distinct, but I think a bigger issue is the accuracy of each beam. The error over the entire working range of a Hokuyo UTM-30 (as found in the PR2, ~$6k a pop) is about 1% if I recall.

Do you think there is any benefit in taking multiple scans of the LIDAR and averaging the data before it is sent to the SLAM algorithm? Say, reducing the LIDAR to 1 Hz in an attempt to provide more stable data, or is that an unnecessary step already being handled by the SLAM algorithm?


See also my above response, but I wanted to add that the main use of a gyro is to reduce the "rotational" error in the odometry, since that's typically far worse than translational error. The accelerometer is needed to cancel drift found in the gyro.

I brought that up since I know accelerometers/gyros are frequently used in IMUs, like you mentioned, and gyros alone are used very effectively in RC helis.

I have an IMU coming in the mail from Trossen I will try to get added onto my setup. At least with that added on we will have a second data point regarding rotation.

Do both gmapping and coreslam have parameters we can adjust that compensate for rotation? Seems when rotating, the shift seen from the LIDAR data might be causing more error than any slippage in the wheels.

-Hash

SK.
01-06-2011, 04:26 AM
Isn't that the "Simultaneous" part of SLAM? I'm not versed in ROS, but I thought SLAM was usually used to correct the inevitable odometry drift while mapping.

I suspect that an accelerometer alone can't really improve odometry performance when fused with encoders.

I believe with a gyroscope, accelerometer, and encoders it is possible to reduce drift somewhat. This especially compensates for the typical "slippery wheel" conditions that occur during sudden maneuvers such as turns and start/stops. They do not help much if you are continuously slipping for a long time, such as when the wheels have poor traction on deep carpet or whatever.

- Nammo

Of course, SLAM is supposed to correct odometry errors, but most SLAM algorithms use odometry, and they work better the more accurate the provided odometry is.

A gyro around the vertical axis can often be used to improve odometry angular error by an order of magnitude or more. Beyond these sensor fusion/additional hardware approaches, there are quite a few papers out there about improving wheel odometry itself:

http://mapleleaf.csail.mit.edu/~nickroy/papers/icra99a.pdf
A relatively sophisticated approach to odometry parameter estimation: use SLAM, then optimize the odometry parameters so they best agree with the path estimated by SLAM.

http://www-personal.umich.edu/~johannb/Papers/umbmark.pdf
UMBmark is an often-cited approach for comparatively simple calibration of odometry, mostly for differential drive robots, by letting the robot drive a bi-directional square path and optimizing parameters from this experiment.

BTW: Is there a bag file available with Neato log data? I could use one with some sensor_msgs/LaserScan messages for testing a SLAM approach. Odometry or anything else is not needed.

SK.
01-06-2011, 04:32 AM
See also my above response, but I wanted to add that the main use of a gyro is to reduce the "rotational" error in the odometry, since that's typically far worse than translational error. The accelerometer is needed to cancel drift found in the gyro.

-Fergs
Well, that won't work for gyros around the yaw axis: correcting gyro bias using accelerometers works for the roll and pitch axes because you can measure gravity there. For yaw-axis gyro bias correction on our Hector UGV, we use orientation updates from the SLAM approach that are fed back into the attitude estimation filter.
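A toy version of that feedback loop (the gain here is hypothetical, not the one Hector uses) might look like:

```python
class YawEstimator:
    """Integrates a yaw gyro and re-estimates its bias whenever an
    absolute heading (e.g. from SLAM) arrives. Gain is illustrative."""
    def __init__(self, gain=0.1):
        self.yaw = 0.0
        self.bias = 0.0
        self.gain = gain

    def predict(self, gyro_rate, dt):
        """Dead-reckon heading from the bias-corrected gyro rate."""
        self.yaw += (gyro_rate - self.bias) * dt

    def correct(self, slam_yaw):
        """Attribute persistent disagreement with SLAM to gyro bias."""
        self.bias += self.gain * (self.yaw - slam_yaw)
        self.yaw = slam_yaw
```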

lnxfergy
01-06-2011, 10:29 AM
BTW: Is there a bag file available with Neato log data? I could use one with some sensor_msgs/LaserScan messages for testing a SLAM approach. Odometry or anything else is not needed.

I've uploaded a couple of bag files created on the Neato -- See this post for details: http://forums.trossenrobotics.com/showpost.php?p=44997&postcount=12

-Fergs

hash79
01-06-2011, 02:34 PM
SK.

Thank you for the links to the papers on odometry. If you have specific setups in mind for some bag files let me know, I will be more than happy to set it up.

Definitely looking forward to integrating this IMU unit into the robot.
http://www.trossenrobotics.com/store/p/6451-PhidgetSpatial-3-3-3-9-Axis-IMU.aspx

-Hash

hash79
01-11-2011, 02:15 AM
Fergs,

While using rostopic to troubleshoot my issues I noticed something interesting. When I echo base_scan under stamp: I see secs: and nsecs: and both have non-zero values that are incrementing. When I echo odom both of those values are 0. RViz was complaining of this. I'm not sure if it would also be causing other problems as well.

I added odom.header.stamp = rospy.Time.now() right below # prepare odometry (line 118). This seemed to cure the problem RViz was having with the odom data.

-Hash

hash79
01-11-2011, 03:09 AM
Fergs,

Since I am not very ROS savvy I missed something that might be obvious. In the global_costmap_params.yaml file under 2dnav_neato/params you set the update_frequency to 0. This is why I was unable to set a goal and have the robot navigate. I changed this to a non-zero value and now I can set a goal and the robot moves around, although not too well :)

Why did you default that to 0? Seems like anything other than zero would have saved me about 5 days of head scratching :)

-Hash

hash79
01-11-2011, 03:43 AM
Final update...

Seems that after I have everything setup, set my pose, then set a goal the robot will start to navigate towards the goal and then the following error appears.


Connectivity Error: Could not find a common time /base_link and /map.

Not sure if this is processing power related since I am running it in a virtual machine, or something else. Any ideas?

-Hash

lnxfergy
01-11-2011, 08:13 AM
Fergs,

While using rostopic to troubleshoot my issues I noticed something interesting. When I echo base_scan under stamp: I see secs: and nsecs: and both have non-zero values that are incrementing. When I echo odom both of those values are 0. RViz was complaining of this. I'm not sure if it would also be causing other problems as well.

I added odom.header.stamp = rospy.Time.now() right below # prepare odometry (line 118). This seemed to cure the problem RViz was having with the odom data.

-Hash

Thanks for the patch, I'll get that integrated this afternoon -- however, note that nothing in the navigation stack uses the odom topic (amcl/gmapping get their odometry from the TF transform between base_link and odom). The /odom topic is really only useful if you use the robot_pose_ekf filter.

-Fergs

lnxfergy
01-11-2011, 08:21 AM
Fergs,

Since I am not very ROS savvy I missed something that might be obvious. In the global_costmap_params.yaml file under 2dnav_neato/params you set the update_frequency to 0. This is why I was unable to set a goal and have the robot navigate. I changed this to a non-zero value and now I can set a goal and the robot moves around, although not too well :)

Why did you default that to 0? Seems like anything other than zero would have saved me about 5 days of head scratching :)

-Hash

Hmm, that is an error -- update_frequency should be set to something like 1.0; it's publish_frequency that should be 0.0. Why 0? Because that means publish as often as needed/possible. This is actually the default value that pretty much everyone is using. I've just committed an (untested) update.
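Spelled out, the corrected global_costmap_params.yaml entries would read something like this (the top-level key follows the usual costmap layout; values per the note above):

```yaml
global_costmap:
  update_frequency: 1.0    # recompute the costmap roughly once per second
  publish_frequency: 0.0   # 0 here means publish as often as needed/possible
```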


Connectivity Error: Could not find a common time /base_link and /map.

This probably means that AMCL has locked up, and stopped publishing the odom->map transform. (tf view_frames can confirm this). This is likely caused by one of two things: 1) AMCL can't localize properly against your map, or 2) you've run out of processing power.

-Fergs

hash79
01-11-2011, 11:42 AM
Hmm, that is an error -- it should be update_frequency is set to something like 1.0 -- the publish_frequency should be 0.0. Why 0? Because that means publish as often as needed/possible. This is actually the default value that pretty much everyone is using. I've just committed an (untested) update.

That could explain some of my issues, perhaps. I think I cranked up the publishing frequency, which probably overloaded my computer's capabilities.

It was definitely pretty sweet watching it all work and try to navigate around my living room! You can tell it has a hard time localizing with that laser scanner because even when it got to a location it seemed to hunt back and forth trying to get everything lined up.

I have been trying to get this all working so I can wipe out my Ubuntu VM, start from scratch and post a step by step guide on the xv11hacking wiki. Help out some completely clueless folks get ROS up and running! :happy:

-Hash

hash79
01-13-2011, 10:11 AM
Fergs,

In a previous post you said I could specify the port used to connect to the XV-11 using the "port" parameter. How would that be done?

Right now I run "roslaunch neato_node bringup.launch"

FYI on the ros.org page for 2dnav_neato your example for creating a map needs a ":=" for _angularUpdate

-Hash

lnxfergy
01-13-2011, 10:24 AM
Fergs,

In a previous post you said I could specify the port used to connect to the XV-11 using the "port" parameter. How would that be done?

Right now I run "roslaunch neato_node bringup.launch"

FYI on the ros.org page for 2dnav_neato your example for creating a map needs a ":=" for _angularUpdate

-Hash

You would add it as a parameter; in your bringup launch file you could change the neato_node entry to:

<node name="neato" pkg="neato_node" type="neato.py" output="screen" >
<param name="port" value="/dev/ttyUSB1" />
</node>

-Fergs

hash79
01-13-2011, 10:40 AM
Thanks, I thought you meant a command line parameter from the previous post and was scratching my head... :)

I am close to a "ROS for Dummies" type install guide assuming a Ubuntu system, hopefully some people with an XV-11 can follow it and give me some feedback.

-Hash

lnxfergy
01-13-2011, 10:59 AM
Thanks, I thought you meant a command line parameter from the previous post and was scratching my head... :)

I am close to a "ROS for Dummies" type install guide assuming a Ubuntu system, hopefully some people with an XV-11 can follow it and give me some feedback.

-Hash

You could also specify it as a command line parameter if you were loading only the neato_node. However, to get anything useful, we need a transform between base_link and base_laser, so that's why we use the bringup launch file.

Have you walked through all of the introductory tutorials on ROS.org? I know there are like 20-22 total, but they are really quite informative.

-Fergs

hash79
01-13-2011, 11:54 AM
I have gone through a majority of them... What I noticed was it seemed very overwhelming at the start. It was hard to figure out what I needed to install, and if I had installed it properly. Now that I actually have it up and running the tutorials make a lot more sense and I can experiment with what I am reading.

My goal is to help someone capable get everything they need installed so they can actually use it while reading the tutorials instead of reading it all to try and figure it out.

The RViz video was VERY helpful for setup, but if the other parts are not installed or working then you are left feeling lost and not sure where to troubleshoot. Hopefully this will serve as a baseline for getting a XV-11 and ROS working so the focus can be learning ROS/working on slam instead of trying to get ROS working.

-Hash

lnxfergy
01-13-2011, 12:02 PM
I have gone through a majority of them... What I noticed was it seemed very overwhelming at the start. It was hard to figure out what I needed to install, and if I had installed it properly. Now that I actually have it up and running the tutorials make a lot more sense and I can experiment with what I am reading.

My goal is to help someone capable get everything they need installed so they can actually use it while reading the tutorials instead of reading it all to try and figure it out.

The RViz video was VERY helpful for setup, but if the other parts are not installed or working then you are left feeling lost and not sure where to troubleshoot. Hopefully this will serve as a baseline for getting a XV-11 and ROS working so the focus can be learning ROS/working on slam instead of trying to get ROS working.

-Hash

What exactly wasn't installed? If you install "base" from this page: http://www.ros.org/wiki/cturtle/Installation/Ubuntu you should have everything you need except the neato drivers. An SVN checkout and build should have them ready to go.

-Fergs

hash79
01-13-2011, 12:27 PM
Fergs,

Let me start by saying my goal is for someone who has never used ROS to be up and running viewing data in RViz and moving their XV-11 around in the length of time it takes to install the software.

With that being said, things like



teleop_twist needs to be installed so you can test/move around
the environment variables need to be setup for the user account and root
you need to run rosmake as root, then as the user to view the packages
you should make sure you are in the proper directories when installing everything
you need to get the USB connection to your XV-11 working by modprobing the driver
if cdc_acm is not blacklisted it can be loaded instead of usbserial causing problems
if you are not using /dev/ttyUSB0 then you need to make changes

All of these are simple things, and just off the top of my head right now. Combined with no prior knowledge of ROS it means something that should have been relatively simple to get going took quite some time.

Of course now I could get ROS going in about 30 minutes on a bare system, but I'd like the next guy with an XV-11 to have THAT ROS experience :wink:

-Hash

lnxfergy
01-13-2011, 01:22 PM
A few thoughts:





teleop_twist needs to be installed so you can test/move around
the environment variables need to be setup for the user account and root
you need to run rosmake as root, then as the user to view the packages



I would avoid this -- create a ~/ros folder for your own installs -- don't put them in the /opt/ros directories. This avoids all the root issues entirely. In all honesty, you should never have to sudo for ROS unless you're dealing with restricted ports (for instance, when doing modprobe).

If you want to see an example of how we set up our machines at Albany: take a look at our front page: http://code.google.com/p/albany-ros-pkg/ and then look in the config directory in svn/branches for our install scripts. Everything not installed through apt-get goes into ~/ros/xyz-ros-pkg/





you need to get the USB connection to your XV-11 working by modprobing the driver
if cdc_acm is not blacklisted it can be loaded instead of usbserial causing problems
if you are not using /dev/ttyUSB0 then you need to make changes



The usbserial aspects are documented on the neato_node page -- could you elaborate on the cdc_acm issues? If there are issues there, we should try to document that right on the wiki page as well.

Are you intending to put this on the xv11 hacking site? I would also suggest it go on the ROS wiki.

-Fergs

hash79
01-13-2011, 02:00 PM
I would avoid this -- create a ~/ros folder for your own installs -- don't put them in the /opt/ros directories. This avoids all the root issues entirely. In all honesty, you should never have to sudo for ROS unless you're dealing with restricted ports (for instance, when doing modprobe).

Thanks Fergs, I will update my procedure to reflect this and avoid these issues. There was definitely some confusion around this at the beginning.


The usbserial aspects are documented on the neato_node page -- could you elaborate on the cdc_acm issues? If there are issues there, we should try to document that right on the wiki page as well.

I saw you added that to the neato_node page; very helpful. I did a fresh install of Ubuntu 10.10 on my Macbook, and when I plug in the XV-11 the cdc_acm driver grabs it and mounts it as /dev/ttyACM0. I did not try to connect to it, but read that other people's USB serial devices had issues with this driver.


echo "blacklist cdc_acm" >> /etc/modprobe.d/blacklist.conf

I did that to keep cdc_acm from being used.


Are you intending to put this on the xv11 hacking site? I would also suggest it go on the ROS wiki.

I plan on having it on the XV-11 hacking site since it will be specific to using ROS with the XV-11. If you feel the ROS wiki is a better place, we can post it there and then link to it from the xv-11 site. In that case, I just need to create it in a format suitable for posting on the ROS site.

-Hash

DaveC
01-13-2011, 09:38 PM
Fergs,

Of course now I could get ROS going in about 30 minutes on a bare system, but I'd like the next guy with an XV-11 to have THAT ROS experience :wink:

-Hash

oh, oh, *raises hand*, pick me!

I'm rounding up parts -- I should be able to alpha-test the draft of your notes Real Soon Now, using a Chumby hacker board (V1).

hash79
01-14-2011, 02:03 AM
I'm rounding up parts -- I should be able to alpha-test the draft of your notes Real Soon Now, using a Chumby hacker board (V1).

Awesome Dave! Only thing I am assuming at this point is that you already have Ubuntu linux installed on a system. Whether that is a Virtual machine, or a dedicated machine is up to you. My instructions will basically be copy/paste with explanations, so very simple.

Once I have this guide done I am looking forward to adding my Chumby on as well with a small guide written on the steps involved in using the Chumby to create the virtual serial port link.

-Hash

Pi Robot
01-15-2011, 09:11 AM
Although I don't yet have a Neato, I am really looking forward to your guide!

--patrick

hash79
01-16-2011, 04:29 AM
Thanks Patrick! I'm looking forward to getting some feedback and hearing stories of easy installs! ;)

Here is a preliminary guide.
http://xv11hacking.wikispaces.com/Connecting+to+ROS

I need to expand it to include creating a map and then navigating, but this is shaky territory right now. I am still learning how to use gmapping to create a decent map and then attempting to navigate within that map without crashing the software. As I hammer out these details I will add this procedure as well.

My BestBuy Infocast (Chumby) will be covered in a follow-up procedure, which will take everything else we have set up and make it all wireless. I have a suitable regulator for it now; it can accept 7V - 36V input. http://search.digikey.com/scripts/DkSearch/dksus.dll?vendor=0&keywords=%09811-2196-5-ND

-Hash

lnxfergy
01-16-2011, 08:46 AM
Thoughts:

8-11 in the second list add the paths to the root account -- this really isn't necessary, right? At the very least, though, I would move them to the end of the list -- so that a user doesn't miss the "exit" and end up compiling as root (which will cause headaches in the future).

I don't see any mention of starting a roscore process. If you're relying on the auto-master functionality with neato_bringup, that's probably a bad idea (if the driver crashes and you restart the launch file, you'll toast everything that's running and have to restart it).

I've added a bug report (http://code.google.com/p/albany-ros-pkg/issues/detail?id=2) for the "The default speed settings are too fast for the XV-11 so you need to reduce the speed before it will respond to any input" so that we can fix that in trunk.

Lastly, I notice this comment "Don't worry if you don't hear it, press CTRL-C to exit the driver then enter the command again. I have found it does not seem to start properly upon first load." -- could you elaborate on that here? I've had issues where the node doesn't shut down properly, but I haven't seen any issues where the node doesn't start properly.

-Fergs

DaveC
01-16-2011, 02:13 PM
So Hash,

I take it you installed some ARM build of Ubuntu on your Chumby? Which image did you start with? Can you post a direct link?

-dave

hash79
01-16-2011, 04:53 PM
8-11 in the second list add the paths to the root account -- this really isn't necessary, right? At the very least though, i would move them to the end of the list -- so that a user doesn't miss the "exit" and end up compiling as root (which will cause headaches in the future).


Moving it to the end is a good idea. I added it so the user can edit the gmapping source and recompile it since it sounded like that was not something that would be changed in the near future.


I don't see any mention of starting a roscore process. If you're relying on the auto-master functionality with neato_bringup, that's probably a bad idea (if the driver crashes and you restart the launch file, you'll toast everything that's running and have to restart it).


So that is exactly how I have been operating it since no mention was made of explicitly starting roscore. I figured that was why we had it in the launch file. Should the procedure be to start roscore on its own in a terminal window then launch everything else?


I've added a bug report (http://code.google.com/p/albany-ros-pkg/issues/detail?id=2) for the "The default speed settings are too fast for the XV-11 so you need to reduce the speed before it will respond to any input" so that we can fix that in trunk.


I like that so many things are getting taken care of! :)


Lastly, I notice this comment "Don't worry if you don't hear it, press CTRL-C to exit the driver then enter the command again. I have found it does not seem to start properly upon first load." -- could you elaborate on that here? I've had issues where the node doesn't shut down properly, but I haven't seen any issues where the node doesn't start properly.

I can reproduce this like clockwork. It happens every single time I have done a fresh install and used the driver for the first time. It also appears that upon restarting the computer it happens on the first try as well. Maybe the command to put it into testmode is being missed, or the command to start the LIDAR? Perhaps a wait is needed between sending the commands.

-Hash

hash79
01-16-2011, 04:57 PM
So Hash,

I take it you installed some ARM build of Ubuntu on your Chumby? Which image did you start with? Can you post a direct link?

-dave

Dave,

The Chumby has the OS on it that shipped with it, I have done nothing major to it. All of the ROS stuff is installed on a laptop. With the Chumby I am only installing python and a serial -> tcp redirection program so you can send data from the USB connection over wifi. Pretty simple and just makes use of existing software. I will post a procedure shortly.

-Hash

lnxfergy
01-16-2011, 05:47 PM
Moving it to the end is a good idea. I added it so the user can edit the gmapping source and recompile it since it sounded like that was not something that would be changed in the near future.

Ah yes, good call. I don't see any acceptable change for gmapping ending up in trunk anytime soon -- it's too big of an API breaker -- and it's way too late for API changes for diamondback (eturtle would push such a change to August or so).

I have some changes that I need to test (and test a lot, to make sure they work well) -- but we may be able to fix this within the Neato driver itself, by changing the format of the LaserScan published.


So that is exactly how I have been operating it, since no mention was made of explicitly starting roscore. I figured that was why we had it in the launch file. Should the procedure be to start roscore on its own in a terminal window, then launch everything else?

I would recommend it. It may not be totally necessary -- but as we currently don't have a way for neato_node to cleanly respawn itself, the launch file needs to be restarted whenever the driver crashes.


I can reproduce this like clockwork. It happens every single time I have done a fresh install and used the driver for the first time. It also appears that upon restarting the computer it happens on the first try as well. Maybe the command to put it into testmode is being missed, or the command to start the LIDAR? Perhaps a wait is needed between sending the commands.

Ok, I was wondering if that was the case. A wait is probably the easy fix -- but the more comprehensive one would be to determine that the getLDS is failing and check/reset the test mode. I've added another bug report (http://code.google.com/p/albany-ros-pkg/issues/detail?id=3) for this issue. (Issue #2, the speed stuff, should be updated later tonight; I just need to test it before committing.)
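If it helps anyone experimenting, a retry loop along those lines might look like this. The command strings follow the test-mode protocol discussed in this thread, but treat them as assumptions; FlakyPort just simulates a device that ignores the first attempt:

```python
import time

def start_lds(port, tries=3, delay=0.5):
    """Re-issue the test-mode/LDS commands until a scan line comes back."""
    for _ in range(tries):
        port.write(b"testmode on\n")
        port.write(b"setldsrotation on\n")
        time.sleep(delay)                 # give the firmware a moment
        port.write(b"getldsscan\n")
        if port.readline():               # any response means the LDS is up
            return True
    return False

class FlakyPort:
    """Simulated port that only answers on the second getldsscan."""
    def __init__(self):
        self.attempts = 0
    def write(self, data):
        pass
    def readline(self):
        self.attempts += 1
        return b"" if self.attempts < 2 else b"0,1234\n"
```

The same shape would work with a real pyserial port object in place of FlakyPort.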

-Fergs

DaveC
01-16-2011, 06:32 PM
Dave,

The Chumby has the OS on it that shipped with it, I have done nothing major to it. All of the ROS stuff is installed on a laptop. With the Chumby I am only installing python and a serial -> tcp redirection program so you can send data from the USB connection over wifi. Pretty simple and just makes use of existing software. I will post a procedure shortly.

-Hash

Ah OK... for some reason I thought you were shoe-horning ROS into the Chumby.

lnxfergy
01-16-2011, 07:55 PM
Ah OK... for some reason I thought you were shoe-horning ROS into the Chumby.

It's something I had initially suggested. I've actually now got a Chumby freed from its packaging and up and running. I was playing with Python on it last night, and with a little time I intend to shove the ros_comm and common_msgs stacks onto it later this week -- which should be just enough to connect to a remote master, open a connection to the neato, and publish laser scans.

If I do get it to work -- I'll post details.

-Fergs

lnxfergy
01-16-2011, 08:44 PM
I like that so many things are getting taken care of! :)

I've committed a change (r146) that should fix the out of range issues on cmd_vel. I've also committed some updates to neato_driver (r147) that should fix the crashes on shutdown (which leave the laser running).

-Fergs

ringo42
03-07-2011, 03:02 PM
Is there a tutorial anywhere on using the lidar sensor and ROS without the rest of the Neato robot? I have an existing robot I want to mount the lidar on. I've seen the thread on getting the lidar working by supplying a PWM signal to the motor, but what about the difference in the output of the lidar vs. the robot?
Ringo

lnxfergy
03-08-2011, 01:04 PM
Is there a tutorial anywhere on using the lidar sensor and ROS without the rest of the Neato robot? I have an existing robot I want to mount the lidar on. I've seen the thread on getting the lidar working by supplying a PWM signal to the motor, but what about the difference in the output of the lidar vs. the robot?
Ringo

You might be interested in http://www.ros.org/wiki/xv_11_laser_driver

-Fergs

Nammo
03-08-2011, 11:17 PM
You might be interested in http://www.ros.org/wiki/xv_11_laser_driver

This link seems to suggest that only the old protocol is supported:
http://www.ros.org/wiki/xv_11_laser_driver/Tutorials/The%20Underlying%20XV-11%20Protocol%20Explained

There is a new protocol out there too:
http://forums.trossenrobotics.com/showthread.php?t=4470&page=10

The above thread contains exhaustive details on both protocols. It might be a fun read, and hopefully inspire somebody to make a simple summary of the new protocol on ros.org. (Is there one already?)

- Nammo

ringo42
03-09-2011, 07:49 AM
Thanks for the info. I also found this.
http://www.ros.org/wiki/xv_11_laser_driver/Tutorials/Running%20the%20XV-11%20Node
does anybody know if this is for the new or old protocol?
Ringo

Peter_heim
04-16-2011, 12:51 AM
Hi Fergs
I just reloaded coreslam again; I had to change the SVN URL in makefile.coreslam to
openslam.informatik.uni-freiburg.de/data/svn/tinyslam/trunk/
to get it to build.


peter

tdeyle
07-17-2011, 11:54 AM
I am curious what nodes were set up to control the neato via the PS3 controller.

lnxfergy
07-17-2011, 01:57 PM
I am curious what nodes were set up to control the neato via the PS3 controller.

'ps3joy' publishes joystick messages, and 'pr2_teleop' with a modified launch file converts them to the 'cmd_vel' topic for the neato node. You can see the teleop launch file in the armadillo_2wd package (in my other repository: vanadium-ros-pkg).
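The heart of such a teleop node is just a linear mapping from joystick axes to a Twist. A sketch of that mapping (axis indices and scale values here are hypothetical; pr2_teleop configures its equivalents in the launch file):

```python
def joy_to_twist(axes, max_linear=0.3, max_angular=1.0):
    """Map joystick axes in [-1, 1] to (linear m/s, angular rad/s).
    Assumes axes[1] is forward/back and axes[0] is left/right."""
    return axes[1] * max_linear, axes[0] * max_angular
```

In a real node this tuple would fill in the linear.x and angular.z fields of a geometry_msgs/Twist published on cmd_vel.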

-Fergs

tdeyle
07-17-2011, 07:47 PM
ROS noob here. I am taking a look at the launch file and trying to figure out what to change on it to match the needs of the neato.

I have tried a few changes that did not work so a slight push would be a welcome thing.

***Well, after some persistence/stubbornness, I figured out that I needed to install and build the PR2 apps in the /opt/ros.... directory, instead of the home directory.***

lnxfergy
07-17-2011, 10:28 PM
ROS noob here. I am taking a look at the launch file and trying to figure out what to change on it to match the needs of the neato.

I have tried a few changes that did not work so a slight push would be a welcome thing.

***Well, after some persistence/stubbornness, I figured out that I needed to install and build the PR2 apps in the /opt/ros.... directory, instead of the home directory.***

Yeah, once you have PR2 apps, the launch file should just work -- the max speed and other characteristics are pretty similar between the neato and armadillo (I really did use exactly that launch file for the neato when I made the videos months ago).

-Fergs

thesa1nt01
07-29-2011, 09:54 PM
Hash or fergy,

Did you guys ever figure out what was wrong with the inverted/mirroring scans?
I'm using the LDS as a standalone and when I display the /scan in rviz, the data seems correct with respect to the frame but when I overlay the /map from gmapping it is inverted.

My LDS is firmware version 2.4, for which I wrote my own ROS driver; I'm publishing a zero translation vector and identity quaternion on /tf. I'm using Diamondback.

I've messed with the tf a lot and even inverted (put a negative sign in front of) my scan.angle_increment in my driver, and while it moves the /scan all over the place in rviz, the /map is always the inverse/mirror of it.


Rviz Screenshot (https://docs.google.com/leaf?id=0B8qFfkX7QbQtNjhlNGVhNDAtNmNkYy00ZWYzLTg3O DEtMzU5ZjVjZjNjZmNj&hl=en_US)

Bag file (https://docs.google.com/leaf?id=0B8qFfkX7QbQtNDc2MWVmYzEtNTk1Ny00YmM5LWE2N mQtZmU4NmQ0NzIwZjZi&hl=en_US)

Regards,

lnxfergy
07-29-2011, 10:21 PM
Have you applied my patch to gmapping: http://www.showusyoursensors.com/2010/12/neato-slam.html

I believe this patch made it into the Electric release which is currently in beta.

-Fergs

thesa1nt01
07-29-2011, 10:40 PM
I hadn't tried it (though I had seen it). Just tried it and replayed my bag file to no effect.
:(

(Btw, I messaged you on YouTube about a different topic)

Regards,

lnxfergy
07-30-2011, 12:19 AM
Ok, so I'm thinking you either A) didn't actually recompile gmapping after patching or B) didn't notice that you bagged a map and were actually seeing the new output and old output at the same time! Results of my tests on your bagfile:

1) Diamondback without patch -- FAILS
2) Diamondback with patch -- WORKS
3) Electric (patch included by default) -- WORKS

To actually see the output -- I suggest the following for starting gmapping:


rosrun gmapping slam_gmapping _temporalUpdate:=5.0 map:=map2 map_metadata:=map2_metadata _base_frame:=robot_base

That remaps to "map2" since "map" is already included in your bagfile. Switch your RVIZ to view "map2" and all should work.

A few other notes:
1) I really recommend always calling the robot base link "base_link" -- that way it lines up with all sorts of tools that ROS uses without having to remap frames.
2) You had separately asked about high error in the ~6m range readings -- honestly, the Neato is a 4-5m device, anything you get above that is of limited use. Playing with parameters for error estimation might help incorporating this data into gmapping -- but the sensor model in gmapping doesn't incorporate the fact that error for the Neato is not a flat function over range.
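(Purely as an illustration of that last point, a range-dependent noise model might look like the toy Python function below; every number in it is invented for the sketch, not a measured Neato characteristic.)

```python
def range_stddev(r, base=0.02, growth=0.015, knee=4.0):
    """Toy range-dependent noise model, for illustration only.

    Roughly constant noise (`base`, in meters) out to `knee` meters,
    then growing linearly beyond it -- unlike the flat error function
    a typical scan-matcher sensor model assumes. All parameters here
    are invented placeholders.
    """
    return base if r <= knee else base + growth * (r - knee)

# At 2 m the toy model gives ~2 cm of noise; at 6 m, ~5 cm.
print(range_stddev(2.0), range_stddev(6.0))
```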

-Fergs

thesa1nt01
07-30-2011, 01:00 AM
I double checked that I had changed the line:

gsp_laser_angle_increment_ = scan.angle_increment;

and remade using:

rosmake --pre-clean gmapping

then I tried your suggestion:

rosrun gmapping slam_gmapping _temporalUpdate:=5.0 map:=map2 map_metadata:=map2_metadata _base_frame:=robot_base

but rviz didn't like /map2 for the Fixed Frame so I remade the bagfiles, producing two bagfiles, one containing only /tf and the other containing only /scan, then reran gmapping without renaming the map parameter...

To see the same results :(

thesa1nt01
07-30-2011, 01:04 AM
Ok, I tried it again with:


rosmake --pre-clean slam_gmapping

Same results.

thesa1nt01
07-30-2011, 01:12 AM
Also, I'll rename the base frame as this seems like convention.
The reason for asking about the resolution and accuracy at 6 meters is that the white paper describes a 3 cm accuracy at 6 meters. As such my professor had us take empirical readings from the sensor which showed at 5-6 meters it was off by 60 cm, with a range of readings that spanned about 15 cm. He'd wanted me to check and see if others were getting similar results, or if there was a way to calibrate it (since I know such a function exists in the LDS menu).

lnxfergy
07-30-2011, 06:53 AM
but rviz didn't like /map2 for the Fixed Frame so I remade the bagfiles, producing two bagfiles, one containing only /tf and the other containing only /scan, then reran gmapping without renaming the map parameter...

To see the same results :(

/map2 is not a tf frame -- it is a topic name. I meant that you should change the name of the Map Display topic to "map2" -- your fixed frame should still be /map.


Also, I'll rename the base frame as this seems like convention.
The reason for asking about the resolution and accuracy at 6 meters is that the white paper describes a 3 cm accuracy at 6 meters. As such my professor had us take empirical readings from the sensor which showed at 5-6 meters it was off by 60 cm, with a range of readings that spanned about 15 cm. He'd wanted me to check and see if others were getting similar results, or if there was a way to calibrate it (since I know such a function exists in the LDS menu).

The paper also describes a 10 Hz rate and a metal frame, neither of which exists on the production model, as it is an even cheaper device than the paper described.

-Fergs

thesa1nt01
07-30-2011, 11:47 AM
Ok, so, I played the original bag once again, ran gmapping with the map2 parameters, and ran rviz with the Fixed Frame set to /map and the Map display topic set to /map2.

Same results.

Is there something that I didn't do to gmapping? I should only have to modify the slam_gmapping.cpp file under /opt/ros/diamondback/stacks/slam_gmapping/gmapping/src and only line 339 (or thereabouts) right?

Is pre-clean not enough?

We determined that the 5 Hz rate is a function of the baud rate. The paper says 115.2K, 10 Hz, and 2 bytes per reading. The LDS does 115.2K, 5 Hz, and 4 bytes per reading. I'm looking at increasing the baud rate so that we can up the speed.

If we can't calibrate it through the LDS, the error might be predictable enough to correct for in my driver. It obviously wouldn't be 100% accurate, but it would be better. The predictability of the error is what leads me to believe that it can be corrected in the LDS. For us, we would have to try materials with different reflectivity if we were going to apply an external correction.
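A per-range correction of the kind described above could be sketched in a driver roughly like this (plain Python; the calibration pairs below are invented placeholders standing in for whatever empirical measurements are actually taken):

```python
import bisect

# (range_m, measured_offset_m) pairs, sorted by range.
# HYPOTHETICAL data -- real values would come from bench measurements
# like the ones described above (e.g. the ~60 cm error seen at 5-6 m).
CALIBRATION = [(1.0, 0.00), (3.0, 0.05), (5.0, 0.30), (6.0, 0.60)]

def corrected_range(raw_m):
    """Linearly interpolate a correction between calibration points
    and subtract it from the raw reading."""
    ranges = [r for r, _ in CALIBRATION]
    i = bisect.bisect_left(ranges, raw_m)
    if i == 0:
        offset = CALIBRATION[0][1]       # below the table: use first offset
    elif i == len(CALIBRATION):
        offset = CALIBRATION[-1][1]      # above the table: use last offset
    else:
        (r0, o0), (r1, o1) = CALIBRATION[i - 1], CALIBRATION[i]
        t = (raw_m - r0) / (r1 - r0)
        offset = o0 + t * (o1 - o0)
    return raw_m - offset
```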

lnxfergy
07-30-2011, 12:13 PM
Is rosmake actually succeeding? /opt/ros is a system directory and so unless you change the permissions it will fail. I didn't do anything but:

roscd slam_gmapping
sudo chown -R mikef:mikef gmapping
cd gmapping
# edit src/slam_gmapping.cpp (line 339)
make

-Fergs

thesa1nt01
07-30-2011, 12:43 PM
YES! Awesome! I owe you some champagne. The orientation looks good now.
Why would make do something other than rosmake?
rosmake always reported that build succeeded and that there were no errors.
The only thing I noticed is that make downloaded something.

Btw, from the Robot Film Festival that just passed, Davin Lu from Washington University St. Louis
Researchin' (http://vimeo.com/25016146)
I actually cried.

SK.
08-04-2011, 01:52 PM
Finally, I've uploaded a number of datasets I've collected over the past semester on the Neato, Create, and my Armadillo robot to http://www.fergy.me/slam/ (Please note that downloads might be a bit slow; the server is not the fastest link in the world).
Hi Fergs, I wanted to try out the datasets again, unfortunately I can't reach your server at all. I hope it's a temporary issue, but if you know how to actively make it better I'd be grateful :)

lnxfergy
08-05-2011, 04:09 PM
Hi Fergs, I wanted to try out the datasets again, unfortunately I can't reach your server at all. I hope it's a temporary issue, but if you know how to actively make it better I'd be grateful :)

SK -

It appears my server has died! I'm currently uploading onto our lab server, files should be here: http://robotics.ils.albany.edu/slam/ in a few minutes.

-Fergs

lnxfergy
08-05-2011, 04:22 PM
And it turns out someone had unplugged the server from the uplink... both sites should now be working.

-Fergs

SK.
08-08-2011, 06:34 AM
Thanks, got them all and already tested some of them. Have tested only without using odometry at all so far, but that breaks the maps in some places. I hope I get around to testing some more (and adding proper odometry support) in the coming days.

lnxfergy
08-08-2011, 09:49 AM
Thanks, got them all and already tested some of them. Have tested only without using odometry at all so far, but that breaks the maps in some places. I hope I get around to testing some more (and adding proper odometry support) in the coming days.

Great! Could we get some pics of the maps generated so far?

-Fergs

SK.
08-08-2011, 01:33 PM
Great! Could we get some pics of the maps generated so far?

-Fergs

Yes, we can ;)
Obviously, there is quite a bit of error at loop closure, with roughly a 0.6 m offset. This is without using any odometry at all, though; I expect much better results when properly fusing odometry and LIDAR scan-matching estimates, as well as taking the much noisier (compared to Hokuyos) sensor data into account. I'll try to find some time for doing that, but can give no guarantees.

http://img263.imageshack.us/img263/5950/xv11test200205.png

hash79
08-20-2011, 09:31 PM
I'm looking forward to trying something new... I have my wireless link working with my XV-11 and it interfaces with ROS seamlessly but I have yet to generate a usable map. Gmapping just generates garbage, and something must have happened with the coreslam package because I couldn't get it to build.

I can manually navigate my XV-11 all around my house just watching the LIDAR in rviz using teleop_twist! :)

-Hash

lnxfergy
08-20-2011, 10:17 PM
Hash,

I just updated the Makefile (they apparently moved the coreslam repository). It should build -- however, there is also an SSL issue with their server: when prompted, enter "p" or "t" to accept the repository's SSL certificate, or the build will fail.

As for gmapping, are you under the Diamondback distribution or the new electric beta? Electric should fix several issues we had with the 360 laser.

-Fergs

hash79
08-20-2011, 10:27 PM
I am under Diamondback, I will download and install Electric right now and post my results.

-Hash

hash79
08-20-2011, 10:52 PM
Did something break with Teleop-twist? I can't control the robot anymore.

*EDIT* Nevermind, restarted everything and it works now, also hard booted XV-11

-Hash

hash79
08-20-2011, 11:25 PM
gmapping under electric does seem to scan match better than diamondback, but it must be the limited range of the lidar or the lack of unique features to track in my living room, because it still gets thoroughly lost creating a map...

Coreslam seg faults when I run it on my system, seemed to build fine though.

-Hash

ringo42
08-21-2011, 01:46 PM
What range are you getting from the lidar? I'm only getting 1.5 meters, then it drops off in rviz, even with a white cardboard box. It has the old firmware; does that matter?

hash79
08-21-2011, 02:48 PM
I get 5 meters which is the limit set in the neato ROS driver... Which driver are you using?

ringo42
08-21-2011, 02:51 PM
I'm using the ros xv-11 driver for the lidar, not the whole neato

lnxfergy
08-21-2011, 04:38 PM
I'm using the ros xv-11 driver for the lidar, not the whole neato

That's an entirely different driver, and I believe several people have said it is quite buggy.

-Fergs

lnxfergy
08-21-2011, 04:49 PM
Coreslam seg faults when I run it on my system, seemed to build fine though.

Can you get me either a) a gdb backtrace on the seg fault, or b) a bag file recorded on your Neato? I don't have a Neato available right now -- but running coreslam under electric generated a map no problem for me (using a slightly older bag file: http://robotics.ils.albany.edu/slam/bags/2010-12-12-neato-ils.bag). Also, you might try running coreslam on that bag file to see if it works....

-Fergs

hash79
08-21-2011, 05:32 PM
Fergs,

What do the z and w stand for under orientation? This is using rxplot on /odom.

child_frame_id: base_link
pose:
  pose:
    position:
      x: -1.04419702728
      y: -0.0288575943219
      z: 0.0
    orientation:
      x: 0.0
      y: 0.0
      z: -0.885383719505
      w: -0.46486091386
I ran coreslam inside gdb and got the following output; this was run while roscore/neato was active.

Starting program: /home/hash/ros/albany-ros-pkg/slam_coreslam/coreslam/bin/slam_coreslam scan:=base_scan _sigma_xy:=0.005 _sigma_theta:=0.045 _delta:=0.05 _hole_width:=600
[Thread debugging using libthread_db enabled]

Program received signal SIGSEGV, Segmentation fault.
main (argc=6, argv=0x7fffffffe0c8)
at /home/hash/ros/albany-ros-pkg/slam_coreslam/coreslam/src/main.cpp:25
25 {


As a side note, covering up the rear of the laser scanner so only the front ~270 degrees of the scanner is active generates MUCH better maps using gmapping. It would be nice if there were a way to weight the data from the laser scanner and odometry. The wheel odometry is much more accurate than the laser at measuring travel when only distant objects are visible to the laser scanner. It seems the maps get all twisted and destroyed when using gmapping, perhaps because the distances are constantly changing from the laser scanner.

I have been testing on hard wood floors with great traction, any testing on a carpeted surface is a complete waste of time so far.

-Hash

ringo42
08-21-2011, 06:43 PM
That's an entirely different driver, and I believe several people have said it is quite buggy.

-Fergs

I just tried it with the Python program that xevel wrote to display the data, and it goes away in that as well. I got out a tape measure, and at about 1 meter it goes away; it is rock solid before that. I must have some bad hardware or something. I'm running it at 3.3V; I wonder if I should try bumping up the voltage a bit. I seem to remember someone saying that some early units used 5V. I checked the connector on the robot, though, and it was putting out 3.3V.

Anybody have any ideas? Any idea if there is a way to ask neato a question like this or would they not care since the robot is taken apart?
Ringo

lnxfergy
08-21-2011, 06:46 PM
Orientations in ROS are always quaternions (so x,y,z form a rotation axis, and w is the amount of rotation about that axis).

The ROS version of gmapping is pretty much always going to suck on short range lasers as the major improvements of gmapping over previous SLAM algorithms depend on the benefits of long range lasers (~30m).

If you type "bt" after the seg fault, you can get more information, since I can't seem to replicate this error....

-Fergs

hash79
08-21-2011, 06:46 PM
Using my partially blocked laser scanner I created a decent map of my kitchen. This whole time I have been running 2dnav_neato at the same time as slam_gmapping. This way I could visualize the robot and drive it remotely. I could never create a decent map, so this last time I ran gmapping alone and it worked much better.

I thought you could run gmapping and 2dnav at the same time? If I do everything gets turned around and the map is useless.

-Hash

hash79
08-21-2011, 07:01 PM
Fergs,

while in gdb, issuing the command "bt" doesn't give any useful data either. I am running 64-bit Ubuntu; is that what you are running?

ringo42
08-22-2011, 08:48 PM
I tried contacting Neato; here is the reply I got:
"Unfortunately, no one in our department can assist you with your inquiry. We are not trained in the subject matter you’ve presented."
Guess it might be time to buy another one :-(

Xevel
08-22-2011, 09:13 PM
I tried contacting Neato; here is the reply I got:
"Unfortunately, no one in our department can assist you with your inquiry. We are not trained in the subject matter you’ve presented."
Guess it might be time to buy another one :-(

That's unfortunate :(
Before doing that, would you mind sending me a dump of what the lidar says when you move the box this way, just so I could have a look please?

hash79
08-22-2011, 09:14 PM
Ringo,

Do you still have the rest of your neato vacuum? Have you thought about plugging the LIDAR back into the main PCB and running the neato firmware update? Perhaps new firmware on your LIDAR may help... Doesn't sound like it could hurt at this point!

-Hash

lnxfergy
08-22-2011, 11:06 PM
Fergs,

while in gdb issuing command "bt" doesn't give any useful data either. I am running 64 bit Ubuntu, is that what you are running?

"bt" should give a backtrace after it dumps in gdb.... odd that it doesn't.

I am on 64-bit Ubuntu 10.10 on one machine, and 10.04 on the other.

-Fergs

ringo42
08-23-2011, 08:09 AM
I'll capture the data tonight and post it.
I still have the rest of the vac; I need to see if I can put it back together enough to update the firmware. I wonder if that updates the lidar firmware at the same time? It would sure be nice if it did, just so everybody could be at the same level.
Ringo

ringo42
08-23-2011, 05:06 PM
3411
Here is a file I captured in PuTTY; is this what you need? I move the box out to about 2 meters, then bring it back.
Ringo

Xevel
08-23-2011, 06:12 PM
Well, try with this script instead please: 3412
Edit it to set the right serial port, then run it (it will log every incoming character from the serial port); when you kill it, it should write a file dump_2.1.txt.

Also, we should maybe move this troubleshooting issue to another thread, since it seems to have nothing to do with ROS.

ringo42
08-24-2011, 10:09 AM
Moved to Neato Lidar Problem

ringo42
09-08-2011, 08:33 PM
I'm thinking of upgrading to Electric, but I heard there are issues with the neato code. Is this with the Neato robot code, or the laser code?
Ringo

lnxfergy
09-09-2011, 01:08 AM
I'm thinking of upgrading to Electric, but I heard there are issues with the neato code. Is this with the Neato robot code, or the laser code?
Ringo

Can you be more specific about the "issues"? I've not had any bug reports.

-Fergs

ringo42
09-09-2011, 10:34 AM
I don't have any details, I heard it third-hand. I asked PiRobot a question about upgrading to electric, and he said he heard that some of the HBRC guys were having issues with the neato laser and electric. I thought I would post here and ask before upgrading to electric.
Ringo

lnxfergy
09-09-2011, 03:26 PM
I don't have any details, I heard it third-hand. I asked PiRobot a question about upgrading to electric, and he said he heard that some of the HBRC guys were having issues with the neato laser and electric. I thought I would post here and ask before upgrading to electric.
Ringo

Ah. I know one of the HBRC guys had some issues, but that was due to an incorrect installation of ROS itself. Note, that this thread is primarily about the neato_robot drivers that I wrote (which reside in albany-ros-pkg), which are geared towards using the entire Neato robot and laser. I can't speak at all about the laser-only drivers (which appear to have issues in all distributions of ROS).

-Fergs

xen
12-15-2011, 06:42 PM
Newbie question. I've just bought a Neato and installed ROS on it following Hash's wonderful tutorial at http://xv11hacking.wikispaces.com/Connecting+to+ROS
[I did, however, replace cturtle with electric]

Rviz says "no map received" http://screencast.com/t/TouaJ1cr1
roslaunch 2dnav_neato move_base.launch keeps sending these messages http://screencast.com/t/V7TE6J1TR96R
I ran gmapping, but it didn't change anything http://screencast.com/t/nVnAxl42
My next troubleshooting step is reading ROS tutorials;-)

Any shortcuts you have in mind:-)?

Thanks and cheers.

lnxfergy
12-15-2011, 09:24 PM
Rviz says "no map received" http://screencast.com/t/TouaJ1cr1

The issue here is that you need to click in the box directly below that which says "Fill in topic here..." and select the topic name on which the map is being broadcast.



roslaunch 2dnav_neato move_base.launch keeps sending these messages http://screencast.com/t/V7TE6J1TR96R


You probably can't bring up move_base.launch at the same time as gmapping -- as you'll get conflicting TF data from AMCL and Gmapping. Rather than running move_base during mapping, I would just use a teleop program (the turtlebot_teleop package has a decent keyboard one, or you can get teleop_twist_keyboard from the brown repositories [sudo apt-get install ros-electric-brown-remotelab])



I ran gmapping, but it didn't change anything http://screencast.com/t/nVnAxl42


So, the issue here is that gmapping is listening on the "scan" topic, while you have "base_scan"; you need to remap by adding "scan:=base_scan" to your list of arguments.

My answers of course assume a bit of ROS knowledge (mainly, that you have read the tutorials), if nothing I said makes sense, you probably need to read the tutorials.

-Fergs

xen
12-17-2011, 07:03 PM
Thanks for the tip!
I tried it with and w/o roslaunch 2dnav_neato move_base.launch
In both cases rosrun gmapping slam_gmapping _temporalUpdate=5.0 _map=map2 _map2_metadata=map2_metadata _base_frame:=robot_base scan:=base_scan
gives me this:

[ WARN] [1324169967.803029090]: Failed to compute laser pose, aborting initialization (Frame id /robot_base does not exist! Frames (5): Frame /odom exists with parent /map.
Frame /map exists with parent NO_PARENT.
Frame /base_laser_link exists with parent /base_link.
Frame /base_link exists with parent /odom.
)

Is there some important command/config I missed?

Thanks.

xen
12-19-2011, 01:09 AM
I've read the tutorials, everything works fine now, the weekend project complete, time to sleep;-)

lnxfergy
12-19-2011, 01:14 AM
rosrun gmapping slam_gmapping _temporalUpdate=5.0 _map=map2 _map2_metadata=map2_metadata _base_frame:=robot_base scan:=base_scan

I'm not sure where you got all these parameters, but you should drop the _base_frame one, as the default is base_link, which happens to also be the name of the base link on the Neato (and most robots). You'll probably want to drop the map and map_metadata ones as well.

-Fergs

xen
12-19-2011, 01:27 PM
Thanks. At the time, I had taken the parameters from http://forums.trossenrobotics.com/showthread.php?4540-Neato-ROS!&p=48678#post48678
After reading the tutorials, I've applied a different set and it worked fine.

kapustiy-72
03-20-2012, 02:49 PM
Hello all,

Recently we bought a Neato robot for our laboratory, and I made some initial tests of it with ROS. Earlier in this thread it was said that gmapping doesn't work very well with this robot, but I have a feeling that my results are especially bad. Here are a photo of the setup and the map produced by gmapping.
https://documents.epfl.ch/_xy-3-1850894_1-1331370
https://documents.epfl.ch/_xy-3-1850893_1-1331369

Could someone tell, please, if this result is normal?

For gmapping I have the following command line that I found in this thread:
rosrun gmapping slam_gmapping scan:=base_scan _srr:=0.001 _srt:=0.001 _str:=0.000001 _stt:=0.000001 _linearUpdate:=0.5 _angularUpdate:=0.4

I also attach the bag file and the rviz file:
https://documents.epfl.ch/_xy-3-1850896_1-1331372
https://documents.epfl.ch/_xy-3-1850895_1-1331371

I use Electric under Ubuntu 11.10


Many thanks in advance!

hash79
05-12-2012, 01:45 AM
I realize this response is two months late, but I thought I would comment.

While watching YouTube videos of people using ROS on different high-end robotic platforms, I realized that most lidars being used did not provide more than a 270-degree view, with the 90 degrees behind the robot generally blind. I thought, what the hell, it's worth a shot. I put some tape over the back of the Neato's LIDAR to simulate a 270-degree view, and the results I got were WAY better!!!
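The same trick can be sketched in software without any tape (plain Python, no ROS; the function below loosely mimics the angle fields of a sensor_msgs/LaserScan but is purely illustrative, assuming angle 0 points straight ahead as on the Neato scan):

```python
import math

def mask_rear(ranges, angle_min, angle_increment, fov_deg=270.0):
    """Return a copy of `ranges` with readings outside the front
    `fov_deg` degrees replaced by 0.0 (which most ROS laser consumers
    treat as 'no return'). Assumes angle 0 is straight ahead."""
    half = math.radians(fov_deg) / 2.0
    out = []
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_increment
        a = math.atan2(math.sin(a), math.cos(a))  # normalize to [-pi, pi]
        out.append(r if -half <= a <= half else 0.0)
    return out

# 360 one-degree readings of 2.0 m: the rear ~90-degree wedge is blanked.
scan = mask_rear([2.0] * 360, angle_min=0.0,
                 angle_increment=math.radians(1.0))
masked = sum(1 for r in scan if r == 0.0)
```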

Give it a try and let us know if you get similar results.

-Hash