
View Full Version : ArbotiX ROS Package 0.3.0



lnxfergy
10-16-2010, 06:16 PM
I've released the 0.3.0 release of the vanadium_drivers stack. The vanadium-ros-pkg wiki has more information about this particular release: http://code.google.com/p/vanadium-ros-pkg/wiki/ReleaseNotes

The biggest change in this release is that base_controller is now complete, including easily adjustable PID parameters, and that the BigMotors library is now supported. Several bugs in the controllerGUI were fixed -- I also changed the default head joint names to head_pan_joint and head_tilt_joint to be more PR2-like. An experimental voltage monitor sensor is also included. This release depends on the newest RoboControllerLib release (0008) to build the firmware.

We'll have a more complete tutorial up shortly on using the motor driver with the ArbotiX, but, referring to the image at http://www.pololu.com/picture/view/0J121, the pinout is as follows (from top to bottom):
1. 1DIAG/EN -- connects to ArbotiX D3
2. 1INA -- connects to ArbotiX motor socket pin 2
3. 1INB -- connects to ArbotiX D2
4. 1PWM -- connects to ArbotiX motor socket pin 7
5. n/c
6. n/c
7. n/c
8. 5V -- should get connected to a 5V line of the ArbotiX
9. GND -- should get connected to a ground line of the ArbotiX
10. n/c
11. n/c
12. 2PWM -- connects to ArbotiX motor socket pin 15
13. 2INB -- connects to ArbotiX D1
14. 2INA -- connects to ArbotiX motor socket pin 10
15. 2DIAG/EN -- connects to ArbotiX D3

Moving forward, once documentation on 0.3.0 is finished, the PML updates will progress (and an 0.3.1 release should be out shortly with them). Several of the other controllers are also being laid out and should be available as experimental versions shortly.

-Fergs

P.S. For historical reference, see the 0.2.0 thread (http://forums.trossenrobotics.com/showthread.php?t=4304).

Peter_heim
10-17-2010, 01:27 AM
Hi Fergs
Just downloaded ver 0.3.0 and it's fixed my problems: the PML pans 180 deg with no read errors. Now I have to get some values out of the IR sensor (more reading).

regards Peter

lnxfergy
10-17-2010, 01:34 AM
Hi Fergs
Just downloaded ver 0.3.0 and it's fixed my problems: the PML pans 180 deg with no read errors. Now I have to get some values out of the IR sensor (more reading).

regards Peter

Great to hear! If you load up RViz and set up a LaserScan visualizer, you should see your scan being broadcast (http://www.ros.org/wiki/rviz/DisplayTypes/LaserScan).

-Fergs

Peter_heim
10-17-2010, 01:55 AM
Hi Fergs
Loaded up RViz but I get a laser scan error (no messages received). My IR is a GP2D12 so I will have to wait until my long range IR arrives.

regards peter

lnxfergy
10-17-2010, 10:04 AM
Hi Fergs
Loaded up RViz but I get a laser scan error (no messages received). My IR is a GP2D12 so I will have to wait until my long range IR arrives.

regards peter

Check that you've selected the right topic name for the LaserScan visualizer (the default in rviz is 'scan' but the default for the PML is 'base_scan'). Even with the wrong sensor, you should see data arriving (it just won't be the real range reading, it'll be much longer ranges than actually observed because it thinks it's reading a different sensor).
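If you want to rule out rviz settings entirely, a few lines of Python are enough to confirm scans are actually arriving -- a quick sketch, assuming the default 'base_scan' topic; this isn't part of the package:

import rospy
from sensor_msgs.msg import LaserScan

def on_scan(scan):
    # Print how many readings arrived and the closest one, just to prove data is flowing.
    if scan.ranges:
        rospy.loginfo("got %d ranges, min %.2f m" % (len(scan.ranges), min(scan.ranges)))

rospy.init_node('pml_check')
rospy.Subscriber('base_scan', LaserScan, on_scan)
rospy.spin()

(rostopic echo base_scan from the command line gives you the same sanity check.)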

-Fergs

Pi Robot
10-17-2010, 11:13 AM
I've released the 0.3.0 release of the vanadium_drivers stack. The vanadium-ros-pkg wiki has more information about this particular release: http://code.google.com/p/vanadium-ros-pkg/wiki/ReleaseNotes



Congrats Fergs on your 0.3.0 release. I look forward to giving it a try. Just so I don't make a mistake, which Pololu motor controller should I order to work with the ArbotiX? I would need something that can handle two 7.2V DC motors drawing around 3-5 amps per motor continuously with peak current a little higher (roughly 5-7 A).

--patrick

lnxfergy
10-17-2010, 11:46 AM
Congrats Fergs on your 0.3.0 release. I look forward to giving it a try. Just so I don't make a mistake, which Pololu motor controller should I order to work with the ArbotiX? I would need something that can handle two 7.2V DC motors drawing around 3-5 amps per motor continuously with peak current a little higher (roughly 5-7 A).

--patrick

I've been using this guy (http://www.pololu.com/catalog/product/707) on some very beefy motors for a ~35lb robot that I've got running around here. I realize that the specs are way north of the 5-7A you're talking about, but really, you're going to find that most motor drivers in that 5-7A range are $35-45, and for a few dollars more you really future-proof yourself.

-Fergs

Pi Robot
10-17-2010, 11:58 AM
I've been using this guy (http://www.pololu.com/catalog/product/707) on some very beefy motors for a ~35lb robot that I've got running around here. I realize that the specs are way north of the 5-7A you're talking about, but really, you're going to find that most motor drivers in that 5-7A range are $35-45, and for a few dollars more you really future-proof yourself.

-Fergs

Thanks Fergs. Would the slightly more expensive http://www.pololu.com/catalog/product/708 be worth the current sensing feature? In other words, would the current be readable via the ArbotiX?

--patrick

lnxfergy
10-17-2010, 12:06 PM
Thanks Fergs. Would the slightly more expensive http://www.pololu.com/catalog/product/708 be worth the current sensing feature? In other words, would the current be readable via the ArbotiX?

--patrick

You could definitely wire it up as such, and access the values without a firmware change -- although the current software makes no use of it. I suppose it could be used for logging and/or determining that the bot is stalled. Both of my motor drivers are the no-sense versions, so I probably wouldn't be adding this sort of functionality any time soon (but we welcome community created patches!).

I'm currently not taking such a fine-grained approach to current logging though. I have one of these 0-30A current sensors (http://www.pololu.com/catalog/product/1186) on my robot right now, on the main power line. I'm working on getting a complete Voltage/Current monitor running as an ArbotiX-ROS sensor, which will also log over time (and thus predict power usage and battery levels).
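For anyone wiring one of those up, turning the raw analog reading into amps is just a linear rescale; the offset and sensitivity below are placeholders to be replaced with the values from your particular sensor's datasheet (they are not values from the driver):

def adc_to_amps(raw, vref=5.0, adc_max=1023, zero_current_v=0.5, volts_per_amp=0.133):
    # raw: 10-bit ADC reading from the current-sense output.
    # zero_current_v and volts_per_amp are sensor-specific calibration constants --
    # check the datasheet for the sensor you actually bought.
    volts = raw * vref / adc_max
    return (volts - zero_current_v) / volts_per_amp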

-Fergs

Pi Robot
10-17-2010, 12:09 PM
OK, thanks. Maybe I'll get the current sensing version just to have the option down the road.

--patrick

lnxfergy
10-17-2010, 12:13 PM
OK, thanks. Maybe I'll get the current sensing version just to have the option down the road.

--patrick

Yeah, and if you're interested in the battery monitoring, that little $10 current sense wouldn't be bad to pick up now either. The battery_monitor sensor should be available in our next release (along with the fully operational PML).

-Fergs

Pi Robot
10-17-2010, 12:16 PM
Good idea. As it turns out, the current-sensing version of the motor controller is out of stock so I'll get the version without it. The only reason to get it was, as you guessed, stall-sensing, but of course with the ROS planner, Pi really shouldn't run into anything anyway. :veryhappy:

--p

Pi Robot
10-17-2010, 12:28 PM
Hi Fergs
Loaded up RViz but I get a laser scan error (no messages received). My IR is a GP2D12 so I will have to wait until my long range IR arrives.

regards peter

Hey Peter,

I can't tell you why, but I don't see the Laser Scan messges in rviz until I also run the mapping server and fake localization node. I have a blank map (just a 50x50 pixel white square) and after firing up the base scanning node, I launch the mapping stuff using the following launch file:


<launch>
<master auto="start"/>

<param name="robot_description" command="cat $(find pi_robot)/urdf/pi_robot_urdf.xml" />

<!-- Run the map server -->
<node name="map_server" pkg="map_server" type="map_server" args="$(find pi_robot)/maps/blank_map.yaml"/>

<!-- Run Fake Localization -->
<node pkg="fake_localization" type="fake_localization" name="fake_localization" output="screen" />

<!-- Create a static transform between the /map frame and /odom -->
<node pkg="tf" type="static_transform_publisher" name="world_map_broadcaster" args="0 0 0 0 0 0 /odom /map 100" />

<!-- Map the location of the panning IR sensor with a static transform relative to the base frame. -->
<node pkg="tf" type="static_transform_publisher" name="scan_base_broadcaster" args="0.18 0.0 0.18 0 0 0 /base_link /base_scan 100" />

<node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
<rosparam file="$(find pi_robot)/params/costmap_common_params.yaml" command="load" ns="global_costmap" />
<rosparam file="$(find pi_robot)/params/costmap_common_params.yaml" command="load" ns="local_costmap" />
<rosparam file="$(find pi_robot)/params/local_costmap_params.yaml" command="load" />
<rosparam file="$(find pi_robot)/params/global_costmap_params.yaml" command="load" />
<rosparam file="$(find pi_robot)/params/base_local_planner_params.yaml" command="load" />
</node>
</launch>

--patrick

lnxfergy
10-17-2010, 12:42 PM
Patrick & Peter -- you probably need to change the "Fixed frame" found at the top to the frame in which the laser is published (typically, "base_laser"). Alternatively, if you are publishing a base_link -> base_laser (either a static_tf_publisher, or via URDF+robot_state_publisher) you could have fixed frame set to base_link.

Patrick, the reason you currently need fake localization is probably because your fixed frame is set to "odom" and until fake_localization publishes an odom->base_link transform, the laser can't be displayed.

-Fergs

Pi Robot
10-17-2010, 12:52 PM
Yep, that was it. My fixed frame in rviz was set to /map so of course it wasn't defined until I started the map server. I just tried it with fixed frame set to /base_scan and now I see the IR ranges ("laser scan") without having to launch the map stuff.

Thanks!
patrick

Pi Robot
10-17-2010, 01:41 PM
Hey Fergs,

Do you know how to give the occupancy cell values a finite lifetime? In other words, if my PML IR sensor sees a cell as occupied on the current sweep, I'd like the cell to be set to "unoccupied" a few seconds later. The reason is that my IR sensor is reporting a fair bit of false data and so I'd like the robot to just use the most recent occupancy data as it moves around.

--patrick

lnxfergy
10-17-2010, 02:01 PM
Hey Fergs,

Do you know how to give the occupancy cell values a finite lifetime? In other words, if my PML IR sensor sees a cell as occupied on the current sweep, I'd like the cell to be set to "unoccupied" a few seconds later. The reason is that my IR sensor is reporting a fair bit of false data and so I'd like the robot to just use the most recent occupancy data as it moves around.

--patrick

I'm not sure this is currently possible with the ROS costmap_2d, simply because the costmap stores only the grid values, not the timestamp at which they were last touched.

I know some people have been doing the inverse of this by publishing a point cloud from a map (where, for instance, there is a glass wall the robot can't see with a laser scanner, they publish a point cloud from the map to make sure the wall always shows up).
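A bare-bones sketch of that trick -- a node that keeps publishing a handful of fixed points so the costmap always sees them -- could look like the following (the topic name and coordinates are made up for illustration; the costmap would still need this topic listed as one of its observation sources):

import rospy
from sensor_msgs.msg import PointCloud
from geometry_msgs.msg import Point32

rospy.init_node('static_obstacles')
pub = rospy.Publisher('map_obstacles', PointCloud)

cloud = PointCloud()
cloud.header.frame_id = 'map'
# Three points along an invisible (e.g. glass) wall, 1m out in the map frame.
cloud.points = [Point32(1.0, y, 0.0) for y in (0.0, 0.25, 0.5)]

rate = rospy.Rate(2)
while not rospy.is_shutdown():
    cloud.header.stamp = rospy.Time.now()
    pub.publish(cloud)
    rate.sleep()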

Is the robot moving when it picks up "false data"? Or is it static? From my experience, there shouldn't be much bad data when sitting still (I'm currently working on ROSalyn's PML in the lab this afternoon, hoping to have a much better PML release shortly which will handle movement)

-Fergs

Pi Robot
10-17-2010, 03:02 PM
OK thanks. Yeah, the bad data is mostly when moving but I also found this thread on the ROS users list:

https://code.ros.org/lurker/message/20100921.183301.35032936.en.html

which I'll try to digest later today. Right now I am using the GP2Y0A02 sensor which has a max range of only 1.5m, so perhaps if I map max-range data values into something larger (e.g. 5m) then the map will stay clearer longer. I think I also need to slow the robot down as the 1 second sweep is lagging behind the planner too much. I'll report back later today.

lnxfergy
10-17-2010, 03:13 PM
Yeah, there's a couple not-so-nice issues when moving due to delays -- the new PML driver should correct 90% of this and allow robots to move at higher speeds. Right now, slowing down the robot is the only easy fix until the new drivers are ready.

-Fergs

lnxfergy
10-17-2010, 08:38 PM
I've gotten several things done this afternoon, in particular, I added new sensor types. Unfortunately, I'm still working on a number of other items in the PML, and in the mean time, it's slightly less stable than before. I'm therefore posting the information here, so you can modify your own copy.

For a GP2D12, or equivalent 4-30" sensor, you can change lines 72-74 of src/arbotix_sensors/pml.py to read:

if k > 40:
    # TODO: add other sensor abilities
    ranges.append( (52.0/(k-12)) )

For a GP2Y0A02YK (8-60") you can change lines 72-74 to read as:

if k > 80:
    # TODO: add other sensor abilities
    ranges.append( (115.0/(k-12)) )

Be sure to keep the spaces all the same, just change the numbers (this forum ate all the formatting, yay!)
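If you find yourself switching between sensors, the same linearization can also be pulled out into a small helper -- hypothetical, not something in pml.py, it just mirrors the two edits above:

def ir_to_range(k, scale, offset=12.0, min_reading=40):
    # Range estimate ~ scale / (reading - offset).
    # Per the snippets above: scale=52.0, min_reading=40 for a GP2D12;
    # scale=115.0, min_reading=80 for a GP2Y0A02YK.
    if k > min_reading:
        return scale / (k - offset)
    return None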

-Fergs

Pi Robot
10-17-2010, 10:17 PM
Cool. In the meantime I've gotten sidetracked trying to fix my rviz, which keeps crashing with an OGRE EXCEPTION(7:InternalErrorException). Seems ROS (I'm using the cturtle release) has its own OGRE, as does Ubuntu 10.04, and neither of them likes the video card in my old HP Pavilion. Hopefully I'll get back to the PML stuff tomorrow.

--patrick

RobotAtlas
10-17-2010, 11:02 PM
I also had to upgrade my video card for rviz. The good news is you can get very decent NVidia videocard for around $60.

Peter_heim
10-18-2010, 06:40 AM
Hi Fergs
For some strange reason, when I started today the read errors came back. I reloaded ver 0.3.0 with the same result, and a fresh download from the web (tag 0.3.0) gave the same result. This time I can't move the servo when I set head pan id to 1, but if I load PyPose then I can move the servo.
Any clues what to try next?

regards peter

Pi Robot
10-18-2010, 10:40 AM
I also had to upgrade my video card for rviz. The good news is you can get very decent NVidia videocard for around $60.

Yeah, I'll pick up a new video card but in the meantime I discovered that I can get by by lowering the resolution from 1920x1200 to 1680x1050.

--patrick

Pi Robot
10-18-2010, 10:43 AM
Hi Fergs
For some strange reason, when I started today the read errors came back. I reloaded ver 0.3.0 with the same result, and a fresh download from the web (tag 0.3.0) gave the same result. This time I can't move the servo when I set head pan id to 1, but if I load PyPose then I can move the servo.
Any clues what to try next?

regards peter

Hey Peter,

I just upgraded to 0.3.0 and I'm not having any read errors. The only thing I can think of (and Fergs would know better) is that perhaps you have a flaky connection to your PML servo (either the servo cable or the communications connection) or even the servo itself. I'm guessing you have already tried it with a different servo? Or perhaps swapped out the cable? And perhaps take all the other servos off line to see if you can make it work with just the PML servo.

--patrick

lnxfergy
10-18-2010, 01:14 PM
Hi Fergs
For some strange reason, when I started today the read errors came back. I reloaded ver 0.3.0 with the same result, and a fresh download from the web (tag 0.3.0) gave the same result. This time I can't move the servo when I set head pan id to 1, but if I load PyPose then I can move the servo.
Any clues what to try next?

regards peter

Sorry, but could you be a bit more specific when posting issues? If I follow correctly, you've uploaded 0.3.0 and it initially worked, but now does not. What are the exact error messages coming out? Also, when you say "I can't move the servo" -- how are you trying to move the servo: via services, the joint_cmd topic, or using the controllerGUI?

-Fergs

Peter_heim
10-18-2010, 03:26 PM
Fergs
I'm trying to move the servo with the controllerGUI. This worked before but not now. The error message is as follows:
[INFO] 1286963159.510477: Started joint_controller 'joint_controller' controlling: []
[INFO] 1286963159.539447: Started pml sensor 'pml' using servo: 1
Fail Read
without use_sync: False I get 15 fail reads

Patrick
I only have 1 servo. I tried all 3 ports with no difference. I loaded PyPose with a terminal and listed the servos; the first run showed no servos, but the second and every run after that it showed up.

regards peter

lnxfergy
10-18-2010, 03:57 PM
Fergs
I'm trying to move the servo with the controllerGUI. This worked before but not now. The error message is as follows:
[INFO] 1286963159.510477: Started joint_controller 'joint_controller' controlling: []
[INFO] 1286963159.539447: Started pml sensor 'pml' using servo: 1
Fail Read
without use_sync: False I get 15 fail reads


Ok -- this looks like a YAML configuration issue. The joint_controller is controlling [] -- which is an empty list of servos (thus, the controllerGUI can't control any servos). If you have only a single servo, and want to use it both for PML and a neck, your YAML would look something like:


port: /dev/ttyUSB0
rate: 15
sensors: {
"pml": {type: pml, id: 1}
}
controllers: {
"j_con": {type:joint_controller,joints:["head_pan_joint"]}
}
dynamixels: {
head_pan_joint: {id: 1}
}
You just want to be careful not to run controllerGUI at the same time as the PML is enabled -- or the servo will start moving erratically.

-Fergs

lnxfergy
10-18-2010, 04:44 PM
The reason is that my IR sensor is reporting a fair bit of false data

Just a thought, how is your sensor mounted? Is it vertical like the PML shown below? If not, you'll want to orient it vertical for sure.

http://forums.trossenrobotics.com/gallery/files/1/7/6/8/pml.jpg

-Fergs

Peter_heim
10-19-2010, 05:12 AM
Hi fergs
I loaded the new YAML. I still get read_fails errors but I can control the servo with controllerGUI.
If I use use_sync false I get Read Failed: Servo ID = 1
Just some other questions: what voltage should I run the servos at? The arbotix manual says 12 V but the AX-12 manual says 7 to 10 V (I really don't need any more blown up parts :rolleyes:)
When I make changes to the YAML file it doesn't always work, i.e. I remove the PML line and on the next start PML still starts (no reboot, just Ctrl-C the arbotix thread and then restart).

regards peter

lnxfergy
10-19-2010, 06:50 AM
Hi fergs
I loaded the new YAML. I still get read_fails errors but I can control the servo with controllerGUI.
If I use use_sync false I get Read Failed: Servo ID = 1
In this case, I'd really check your wiring, in particular, make sure all the cables are seated and that jumper J1 on the ArbotiX isn't loose.



Just some other questions: what voltage should I run the servos at? The arbotix manual says 12 V but the AX-12 manual says 7 to 10 V (I really don't need any more blown up parts :rolleyes:)
When I make changes to the YAML file it doesn't always work, i.e. I remove the PML line and on the next start PML still starts (no reboot, just Ctrl-C the arbotix thread and then restart).
12V seems to be fine -- you'll note that the Bioloid power brick is 12V, and the new Bioloid Premium comes with an 11.1V LiPo, so Robotis doesn't even follow the 7-10V. You want to be careful not to go too far above 12V (a fully charged NiMH, LiPo, or SLA battery should still be fine).

As for changes not showing up: if you remove sections of a parameter (YAML) file, you need to clear the parameter server. On your next load, new parameters will be loaded in, but if you don't redefine "sensors", it keeps its old value. The easiest way to do this is to run "rosparam delete arbotix", which will remove all entries under the arbotix namespace (you'd need to change "arbotix" to the node name if you change the name of the node).

-Fergs

Pi Robot
10-19-2010, 09:09 AM
Just a thought, how is your sensor mounted? Is it vertical like the PML shown below? If not, you'll want to orient it vertical for sure.
-Fergs

Thanks for the tip. I'll do this when I try the long range IR sensor. In the meantime, here is a screenshot of RViz using the medium range sensor with a wall 1m in front of the robot, some clutter on the left, and an open room on the right. The edge of the forward wall is about 0.25m to the right of the robot. (The scale of the grid is 0.5 meter per square.)


http://forums.trossenrobotics.com/attachment.php?attachmentid=2155&stc=1&d=1287497216

--patrick

Pi Robot
10-19-2010, 09:15 AM
The next thing I'd like to try is to use the ArbotiX for the PML sensor while using the Serializer to control my drive motors and collect odometry data. I already have a ROS node all set up to do the base controller with the Serializer. However, both the ArbotiX node and the Serializer node need access to /dev/ttyUSB0 on my PC (which in turn is connected to my XBee explorer). Does anyone know how to share a single USB port under Ubuntu between two ROS nodes?

--patrick

lnxfergy
10-19-2010, 12:47 PM
Thanks for the tip. I'll do this when I try the long range IR sensor. In the meantime, here is a screenshot of RViz using the medium range sensor with a wall 1m in front of the robot, some clutter on the left, and an open room on the right. The edge of the forward wall is about 0.25m to the right of the robot. (The scale of the grid is 0.5 meter per square.)

That's a particularly messy looking output. You might consider adding a ~10uF cap right on the back of the sensor; those short rangers can have some terrible current spikes (I've hardly seen any issues with the large sensor, presumably because there is plenty of space in the case to add enough capacitors).

-Fergs

lnxfergy
10-19-2010, 12:51 PM
The next thing I'd like to try is to use the ArbotiX for the PML sensor while using the Serializer to control my drive motors and collect odometry data. I already have a ROS node all set up to do the base controller with the Serializer. However, both the ArbotiX node and the Serializer node need access to /dev/ttyUSB0 on my PC (which in turn is connected to my XBee explorer). Does anyone know how to share a single USB port under Ubuntu between two ROS nodes?

--patrick

Even if you can share the port between the two nodes -- how are the two hardware devices going to handle the commands from the other arriving (I can tell you now, getting a Serializer data stream thrown at the ArbotiX with the ROS sketch on it is a recipe for unreliability, if not complete failure).

You'll probably either want to add an additional XBEE on the PC, or move drive & odometry to the ArbotiX (I realize that the latter option will take some time as you're waiting on a motor driver to arrive).

-Fergs

Pi Robot
10-19-2010, 01:05 PM
That's a particularly messy looking output. You might consider adding a ~10uF cap right on the back of the sensor; those short rangers can have some terrible current spikes (I've hardly seen any issues with the large sensor, presumably because there is plenty of space in the case to add enough capacitors).

-Fergs

Thanks for the suggestion--I won't have the sensor in front of me until tonight at which point I might ask where exactly to connect the capacitor. Also, how can you tell the data is messy? At this early stage, any data looks good to me so I wouldn't have known this looks noisy.

--patrick

Pi Robot
10-19-2010, 01:11 PM
Even if you can share the port between the two nodes -- how are the two hardware devices going to handle the commands from the other arriving (I can tell you now, getting a Serializer data stream thrown at the ArbotiX with the ROS sketch on it is a recipe for unreliability, if not complete failure).

You'll probably either want to add an additional XBEE on the PC, or move drive & odometry to the ArbotiX (I realize that the latter option will take some time as you're waiting on a motor driver to arrive).

-Fergs

Yeah, I was wondering about that. I asked on another thread a few weeks ago about sending a command stream to two XBee devices simultaneously and the thinking seemed to be that each device would simply ignore the packets not meant for them. But I haven't tried it. I also have a Bluetooth option for the Serializer so I'll give that a try first to separate the streams. And yeah, I'm looking forward to getting everything going on one board (the ArbotiX of course), but I really want to see some basic navigation with the PML. (My laser scanner is still a few weeks away...) I actually got it working so-so using the Serializer on its own and a hobby servo for the PML.

Which raises one other question: would it be possible to use a hobby servo instead of a Dynamixel for the PML on the ArbotiX? Like you mentioned earlier, the Dynamixel noise can start to drive you nuts after awhile and I noticed that using the hobby servo was much quieter. I realize you don't have as good a position control as with the Dynamixel but maybe it would be good enough?

--patrick

lnxfergy
10-19-2010, 01:15 PM
Thanks for the suggestion--I won't have the sensor in front of me until tonight at which point I might ask where exactly to connect the capacitor. Also, how can you tell the data is messy? At this early stage, any data looks good to me so I wouldn't have known this looks noisy.

--patrick

Obviously I don't have a view of the room, but based on your description I would expect those points on the left side of your screen cap to be a bit more uniform (in depth from the robot). I also wouldn't expect the last two points on the right to be where they are (maybe you have a funky wall, but I would expect them to be in line with the other row of points that is clearly the wall in front of the robot). If that's actually what the environment looks like, disregard my comments....

-Fergs

lnxfergy
10-19-2010, 01:32 PM
Yeah, I was wondering about that. I asked on another thread a few weeks ago about sending a command stream to two XBee devices simultaneously and the thinking seemed to be that each device would simply ignore the packets not meant for them. But I haven't tried it. I also have a Bluetooth option for the Serializer so I'll give that a try first to separate the streams. And yeah, I'm looking forward to getting everything going on one board (the ArbotiX of course), but I really want to see some basic navigation with the PML. (My laser scanner is still a few weeks away...) I actually got it working so-so using the Serializer on its own and a hobby servo for the PML.
The ArbotiX can *probably* handle the extra data -- it'll just ignore it. I don't think there is any way in which the Serializer data could look like a Dynamixel packet format (the probability of having 2 0xff bytes and then forming the bytes required to have the right length and a proper checksum is pretty much 0). I'd more be worried about overrunning the buffer in the ArbotiX (although this could probably be patched in the firmware, and is likely something that should be done regardless).
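For reference, this is the packet framing being described, sketched in Python (standard Dynamixel 1.0 rules; the example payload is only illustrative):

def dxl_checksum(body):
    # body = [id, length, instruction, param1, ...]
    # Protocol 1.0 checksum: bitwise inverse of the byte sum, truncated to 8 bits.
    return (~sum(body)) & 0xFF

# A stray byte stream is only treated as a command if it starts with 0xFF 0xFF,
# carries a length byte consistent with what follows, AND ends with this checksum,
# which is why random Serializer traffic is very unlikely to parse as a packet.
body = [1, 5, 3, 30, 0, 2]                        # WRITE goal position 512 to servo 1
packet = [0xFF, 0xFF] + body + [dxl_checksum(body)]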



Which raises one other question: would it be possible to use a hobby servo instead of a Dynamixel for the PML on the ArbotiX? Like you mentioned earlier, the Dynamixel noise can start to drive you nuts after awhile and I noticed that using the hobby servo was much quieter. I realize you don't have as good a position control as with the Dynamixel but maybe it would be good enough?

You could switch out the AX-12 for a hobby servo -- with some firmware updates. I'm getting close to an 0.3.1 release which will include the new PML (practically a ground up re-write). Changes include:


The pml scanning and data storage has been split off into an Arduino library. This does 2 things: a) makes it possible to swap out the code more easily (as in your case, to move to a hobby servo) or to include the PML in your own Arduino program, b) makes it possible to have multiple PMLs running at the same time.
Many updates that improve the reliability of PML data within ROS. PML LaserScans now carry correct time information (and are correctly ordered) such that the costmap can do the interpolation taking into account odometry information. While still not perfect, the PML is now far more robust under movement.
Ability to adjust the scanning range of the PML (defaults to 180.0 degrees, but can be reduced to increase the speed at which a full scan is completed).
Example launch and configuration files that show how to use the PML with a costmap_2d. I've been running ROSalyn around the lab a lot the last few days, and tuning the performance of the PML and the costmap.

While the new PML code is still not going to help anyone with localization, I think it'll now be capable of doing serious obstacle avoidance, even on quicker moving robots (ROSalyn does about 40cm/s). With multiple PMLs running, one could probably boost the maximum speed by enabling multiple PMLs each scanning over a small range.

-Fergs

Pi Robot
10-19-2010, 03:44 PM
Obviously I don't have a view of the room, but based on your description I would expect those points on the left side of your screen cap to be a bit more uniform (in depth from the robot). I also wouldn't expect the last two points on the right to be where they are (maybe you have a funky wall, but I would expect them to be in line with the other row of points that is clearly the wall in front of the robot). If that's actually what the environment looks like, disregard my comments....

-Fergs

I'll pay closer attention to the object layout tonight and take another scan. There might have been a floor lamp partially visible around the right edge of the wall...

--patrick

Pi Robot
10-19-2010, 03:49 PM
You could switch out the AX-12 for a hobby servo -- with some firmware updates. I'm getting close to an 0.3.1 release which will include the new PML (practically a ground up re-write). Changes include:

(Lots of great stuff here...)

-Fergs

This is all great news! 40cm/s is quite impressive. And I was just about to ask for an adjustable scanning range. :happy: Don't know how I missed it before, but "ROSalyn" is a great choice of name. :cool:

--p

lnxfergy
10-19-2010, 04:15 PM
:happy: Don't know how I missed it before, but "ROSalyn" is a great choice of name. :cool:

Unfortunately, the name is more impressive than the bot: http://robotics.ils.albany.edu/wiki/ROSalyn (That picture is pre-PML installation).

-Fergs

Pi Robot
10-19-2010, 04:26 PM
Unfortunately, the name is more impressive than the bot: http://robotics.ils.albany.edu/wiki/ROSalyn (That picture is pre-PML installation).

-Fergs

Cool hairdo though! :veryhappy:

lnxfergy
10-19-2010, 05:25 PM
Just a heads up -- I've moved most of the documentation over to the ROS wiki:

http://www.ros.org/wiki/arbotix

There are still quite a few pieces to finish up though.

-Fergs

Pi Robot
10-19-2010, 06:02 PM
Very cool and very official looking. Congrats!

--patrick

RobotAtlas
10-19-2010, 09:00 PM
PML is Planar Meta-Laser now, ha. You didn't like Poor Man LiDAR? :)

lnxfergy
10-19-2010, 09:06 PM
PML is Planar Meta-Laser now, ha. You didn't like Poor Man LiDAR? :)

I believe the SVN commit says 'making this more politically correct for publications'. We re-branded when putting together a paper on one of the robots here, that uses the PML.

-Fergs

RobotAtlas
10-19-2010, 09:21 PM
I believe the SVN commit says 'making this more politically correct for publications'. We re-branded when putting together a paper on one of the robots here, that uses the PML.

-Fergs

That was very creative of you.

I'm looking to order that new Sharp IR GP2Y0A710YK. It's smaller and 10% longer range.
Is Acroname Robotics ($19.50) a good place to get it from?

Would ArbotiX be able to support 2 or 3 of these sensors? I remember the answer was "we are getting close to filling some table". Is it still the case?

lnxfergy
10-19-2010, 09:40 PM
I'm looking to order that new Sharp IR GP2Y0A710YK. It's smaller and 10% longer range. Is Acroname Robotics ($19.50) a good place to get it from?
They are *the* only place I know of to get these things. I have one of each of the old and new style, but I haven't yet calibrated the new one....


Would ArbotiX be able to support 2 or 3 of these sensors? I remember the answer was "we are getting close to filling some table". Is it still the case?

I've made some major changes in the protocol (which are still under revision) that should alleviate this issue (at the expense of what might be considered some semi-funky behavior).

-Fergs

RobotAtlas
10-19-2010, 10:01 PM
Ok, I think I went a little crazy though:
Order Items
Quantity  Part Number                     Item Total
1         Sharp GP2D120XJ00F IR Package   $14.50
1         Sharp GP2Y0A21YK0F IR Package   $14.50
2         Sharp GP2Y0A710YK0F Package     $39.00
Item Sub-Total: $68.00

Peter_heim
10-20-2010, 02:42 AM
Hi Fergs
What is the quick way to run the motors and read the encoder values so I can calibrate them?

regards peter

Peter_heim
10-20-2010, 06:52 AM
Hi Fergs
I started to play with the encoders, so I disabled the servos in the yaml file; now I get the following error:
process[arbotix-1]: started with pid [7811]
[INFO] 1287574993.402173: Starting ArbotiX-ROS on port /dev/ttyUSB0
[]
[]
Fail Read
[INFO] 1287574993.610039: Started base_controller 'b_con' for a base of 0.14m wide with 200.0 ticks per meter
Fail Read
Read Failed: Servo ID = 253
[ERROR] 1287574993.779378: Could not update encoders
Fail Read
Read Failed: Servo ID = 253
[ERROR] 1287574993.881250: Could not update encoders
Fail Read
Read Failed: Servo ID = 253
[ERROR] 1287574993.983306: Could not update encoders
Fail Read
This repeats about 12 times. When I look in RViz and move the wheels I get movement -- the base_link moves away from the odom frame.
Could this be related to my problems getting the PML servo to work?
here is a copy of my yaml file
port: /dev/ttyUSB0
rate: 15
use_sync: False
#sensors: {
#"pml": {type: pml, id: 1}
#}
controllers: {
"b_con": {type: base_controller,base_width: 0.14,ticks_meter: 200}
}
#dynamixels: {
# head_pan: {id: 1}
#}

regards Peter

lnxfergy
10-20-2010, 10:13 AM
Peter,

I apologize if I asked this before, but, how are you connected to the ArbotiX? It would appear that there is a delay in making a connection (or possibly in the ArbotiX first responding) that is causing failed reads at startup.

There are two cases in which I've seen this. It seems your problem is not persistent throughout runtime (as sometimes noticed when using XBEE radios in highly RF-noisy environments). Therefore, I'm guessing you might be using a SparkFun FTDI breakout? If so, try removing the reset-enable jumper -- opening the port resets the board and could cause this delay in response at startup. (This isn't an issue with the black FTDI cable, as it uses a different pin for reset.)

-Fergs

Pi Robot
10-20-2010, 10:58 AM
Obviously I don't have a view of the room, but based on your description I would expect those points on the left side of your screen cap to be a bit more uniform (in depth from the robot). I also wouldn't expect the last two points on the right to be where they are (maybe you have a funky wall, but I would expect them to be in line with the other row of points that is clearly the wall in front of the robot). If that's actually what the environment looks like, disregard my comments....

-Fergs

Yeah, there was definitely a floor lamp in view just to the right of the wall edge. In the meantime, here is another test under cleaner conditions. The robot is in a long hallway with its back against the wall and facing across the hallway to the center of a door way into another room. There are no objects in the hallway within range of the PML's IR sensor. (I am still using the medium range sensor with max range = 1.5m).

I made a short Youtube video so you can see how the points jump around. The video starts off slowly but then you will see the changes more quickly on each sweep.

YouTube - PML test 1 with medium range IR

What do you think?

--patrick

Peter_heim
10-20-2010, 03:20 PM
Hi Fergs
I'm using a SparkFun FTDI breakout. I removed the jumper and it works -- rostopic echo /base_scan returns data.
Thank you for your help.

regards peter

lnxfergy
10-20-2010, 03:38 PM
Hi Fergs
I'm using a SparkFun FTDI breakout. I removed the jumper and it works -- rostopic echo /base_scan returns data.
Thank you for your help.

regards peter

Great -- I'll be sure to add a note about this to the wiki.

-Fergs

lnxfergy
10-20-2010, 04:07 PM
Patrick, that scan doesn't look too bad, the new PML setup should fix that skewing where the up scan and the down scan aren't aligned.

Speaking of new releases, I just checked in a tag for 0.3.1 - which includes a completely overhauled PML. You'll need to update the firmware for sure (a number of changes were made in the register table), and compiling the firmware will require installing the new RoboControllerLib 0009 release (which includes the pml library).

A number of API changes for the PML were made, which is quite important:


Biggest change: "id" is now "servo_id" so you'll have to change existing YAMLs
The reason for this is to add a "sensor_id" for which analog port is being used.

There's also a lot of new parameters for the PML, including configuring the step size and range, and configuring which sensor type is used. Details on new parameters can be found on the ROS wiki: http://www.ros.org/wiki/arbotix/pml

There was also a patch to fix the no dynamixels + use_sync bug.

The ChangeList can be found here: http://www.ros.org/wiki/vanadium_drivers/ChangeList

-Fergs

Pi Robot
10-20-2010, 07:00 PM
Patrick, that scan doesn't look too bad, the new PML setup should fix that skewing where the up scan and the down scan aren't aligned.

Speaking of new releases, I just checked in a tag for 0.3.1 - which includes a completely overhauled PML. You'll need to update the firmware for sure (a number of changes were made in the register table), and compiling the firmware will require installing the new RoboControllerLib 0009 release (which includes the pml library).

A number of API changes for the PML were made, which is quite important:


Biggest change: "id" is now "servo_id" so you'll have to change existing YAMLs
The reason for this is to add a "sensor_id" for which analog port is being used.

There's also a lot of new parameters for the PML, including configuring the step size and range, and configuring which sensor type is used. Details on new parameters can be found on the ROS wiki: http://www.ros.org/wiki/arbotix/pml

There was also a patch to fix the no dynamixels + use_sync bug.

The ChangeList can be found here: http://www.ros.org/wiki/vanadium_drivers/ChangeList

-Fergs


Awesome Fergs! Release 0.3.1 compiled and uploaded fine on my end. Next I'll start playing with all the new parameters.

Many thanks!
patrick

Pi Robot
10-21-2010, 08:45 AM
It won't make the top-10 videos of the year, but here is a short demo on using Mike's PML code and ROS to drive Pi Robot to a few manually specified locations while avoiding obstacles. In this demo I am using the Sharp 2Y0A02 sensor which has a max range of 1.5m. Odometry data is collected from my Serializer board and published by a ROS node based on Mike's base_controller.py code and using my PySerializer library. The PML itself is controlled by the ArbotiX of course. Communication is over standard XBee radios from my Ubuntu PC. Note how the robot can get back fairly close to the starting position even without using localization (i.e. no SLAM yet). The orange dots are points detected by the scanning IR sensor. ROS then marks that cell as occupied (red square) plus a little buffer zone (grey squares) that is based on the dimensions of the robot.

(The video is best viewed in full screen mode.)

YouTube - PML Test 2 - Simple Navigation and Obstacle Avoidance using ROS

--patrick

RobotAtlas
10-21-2010, 10:14 AM
Patrick, this is a fantastic result.
Is the sweep 180 degrees? 30 points/sweep is not bad at all for a short-range sensor.

Pi Robot
10-21-2010, 01:06 PM
Patrick, this is a fantastic result.
Is the sweep 180 degrees? 30 points/sweep is not bad at all for a short-range sensor.

Thanks--though it only works as well as it does because of all of Mike's work. And yes, the sensor is currently sweeping through 180 degrees.

--patrick

lnxfergy
10-21-2010, 02:27 PM
Patrick, looking good. Might you post your costmap parameters? (specifically, what resolution the map is, so that we can start getting good #'s on a variety of sensor models).

And one more note -- if you don't need 180 degree sweep, a ~140 degree sweep makes the servo much quieter (which I know you mentioned before). Parameters for the PML (in the launch file) would include: {step_start: 280, step_value: 16, step_count: 30}.
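As a quick sanity check on those numbers (assuming the AX-12's roughly 0.29 degrees per position unit):

AX12_DEG_PER_UNIT = 300.0 / 1023      # AX-12: positions 0-1023 cover ~300 degrees
step_value, step_count = 16, 30       # the parameters suggested above
print(step_value * step_count * AX12_DEG_PER_UNIT)   # ~140.8 degrees of sweep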

-Fergs

P.S. - I'm totally stealing your idea of just bringing up a webcam view and doing desktop capture -- I've been trying to figure out how to do Picture-in-Picture of the view of the robot alongside the map... I'm hoping to put together a short promo video about the PML this weekend.

RobotAtlas
10-21-2010, 03:46 PM
I'm hoping to put together a short promo video about the PML this weekend.

PML on any walker would be really cool. I think it would be the first in its class too.
Congratulations Mike on a successful release. I see multiple PML support is coming in 0.3.2 too.

Pi Robot
10-21-2010, 04:06 PM
Patrick, looking good. Might you post your costmap parameters? (specifically, what resolution the map is, so that we can start getting good #'s on a variety of sensor models).


Here is the link to my current costmap parameters:

http://code.google.com/p/pi-robot-ros-pkg/source/browse/#svn/trunk/pi-robot-ros-stack/pi_robot/params

and this one is to my blank_map.yaml file:

http://code.google.com/p/pi-robot-ros-pkg/source/browse/trunk/pi-robot-ros-stack/pi_robot/maps/blank_map.yaml

The map itself is just a 200x200 pixel white square.

Anyone else should also feel free to browse through the SVN code though keep in mind it could break at any minute. At some point I'll start making some tagged releases for some of the pieces but for the moment it's just the way I sync the code between machines.




And one more note -- if you don't need 180 degree sweep, a ~140 degree sweep makes the servo much quieter (which I know you mentioned before). Parameters for the PML (in the launch file) would include: {step_start: 280, step_value: 16, step_count: 30}.


Cool, I'll give that a try.




P.S. - I'm totally stealing your idea of just bringing up a webcam view and doing desktop capture -- I've been trying to figure out how to do Picture-in-Picture of the view of the robot alongside the map... I'm hoping to put together a short promo video about the PML this weekend.


Great! I was actually just flailing about on Google last night trying to figure out how to do any kind of capture. I'm using Cheese for the webcam (http://projects.gnome.org/cheese/) and vnc2flv to do the screen capture (http://www.unixuser.org/~euske/python/vnc2flv/index.html (http://www.unixuser.org/%7Eeuske/python/vnc2flv/index.html)) though I would like to find a more GUI based utility for capture...

--patrick

P.S. I put in a feature request at ROS.org for a "radius" option in rviz for the Path and Polygon displays so that they will be more visible in screen captures.

RobotAtlas
10-21-2010, 07:35 PM
Patrick, if I see correctly, the IR sensor is positioned horizontally on your robot, while according to Sharp's IR spec recommendation it should be used vertically:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2158&stc=1&d=1287707550

I don't know how much difference it would make, but it would be interesting to find out.

lnxfergy
10-21-2010, 07:37 PM
RobotNV -- I had suggested that earlier in the discussion. It does make the readings quite a bit better in my experience (in particular I've seen this while doing wall following with an IR sensor). It'll be an especially important consideration for anyone using multiple PMLs (to minimize cross-talk).

-Fergs

RobotAtlas
10-21-2010, 08:11 PM
Sorry Fergs. I know I need to reread both this thread and the 0.2 one. I had a very busy 3 weeks with a big project and missed some stuff here. Talking about the ArbotiX, they are on backorder at Trossen - what's up with that? Are they too popular? :)

lnxfergy
10-21-2010, 08:48 PM
I don't yet have a tutorial up on how to add the Pololu motor driver to the ArbotiX, but I figured I'd at least get these pictures up. As you can see, I created a small breakout board that the motor drivers plug into. I then pulled the SN754410 motor driver, and connected the motor driver to 5V and GND with a 2-pin connection, the motor PWM/Direction lines using just raw wires plugged into the original motor driver socket, and a 3-pin connection for the second direction lines on each motor and the enable line that is shared between the two motors. I then looped the driver around to the back so that the rest of the IO is accessible. The motors & motor power connect to the Pololu motor driver. (Full details of the wiring hookup are in the first post in this thread, which announced 0.3.0).

http://forums.trossenrobotics.com/attachment.php?attachmentid=2162&stc=1&d=1287711595
http://forums.trossenrobotics.com/attachment.php?attachmentid=2163&stc=1&d=1287711595

lnxfergy
10-21-2010, 08:52 PM
Sorry Fergs. I know I need to reread both this thread and the 0.2 one. I had a very busy 3 weeks with a big project and missed some stuff here. Talking about the ArbotiX, they are on backorder at Trossen - what's up with that? Are they too popular? :)

No problem, I just wanted to point out that I would already *really* suggest doing this -- especially at the speed we're moving the sensor. As for ArbotiX and stock levels, they should be back in stock early next week.

-Fergs

Pi Robot
10-21-2010, 09:13 PM
Patrick, if I see correctly, the IR sensor is positioned horizontally on your robot, while according to Sharp's IR spec recommendation it should be used vertically:

I don't know how much difference it would make, but it would be interesting to find out.

Yeah, Fergs did mention this earlier but I appreciate the reminder. At first I had it mounted vertically with some sticky tape on the "bottom" but it kinda wobbled around so I went to the horizontal mount temporarily until I figure out a better way to go vertical.

--patrick

Pi Robot
10-21-2010, 09:14 PM
I don't yet have a tutorial up on how to add the Pololu motor driver to the ArbotiX, but I figured I'd at least get these pictures up. As you can see, I created a small breakout board that the motor drivers plug into. I then pulled the SN754410 motor driver, and connected the motor driver to 5V and GND with a 2-pin connection, the motor PWM/Direction lines using just raw wires plugged into the original motor driver socket, and a 3-pin connection for the second direction lines on each motor and the enable line that is shared between the two motors. I then looped the driver around to the back so that the rest of the IO is accessible. The motors & motor power connect to the Pololu motor driver. (Full details of the wiring hookup are in the first post in this thread, which announced 0.3.0).


This is great--thanks Fergs. I hope to try this out this weekend.

--patrick

Peter_heim
10-22-2010, 05:38 AM
Hi All
Trying to get my robot to move, I copied the teleop python code from Patrick's site (nice package -- I need to do a lot of reading :happy:) and started the arbotix node. The TK interface came up and the cmd_vel topic was displaying linear x .15 when I pressed forward, but there was no movement.
I am using arbotix ver 0.3.0 with some small motors and encoders hooked up to the onboard driver
my launch file is
<launch>
<master auto="start"/>
<node name="arbotix_teleop" pkg="arbotix" type="arbotix_teleop.py" output="screen" />


<node name="arbotix" pkg="arbotix" type="arbotix-node.py" output="screen">
<rosparam file="$(find arbotix)/default.yaml" command="load" />
<param name="robot_description" command="cat $(find arbotix)/urdf/robbie_urdf.xml" />
</node>
</launch>

regards peter

lnxfergy
10-22-2010, 12:48 PM
Peter,

I'm not sure what arbotix_teleop does -- and I can't find the source. The arbotix package comes with a program called controllerGUI.py -- this is a tele-operation program that allows you to drive the robot (by dragging the red dot inside the white box, up = drive forward, etc), and you can move the head with sliders. You can start it with "rosrun arbotix controllerGUI.py"

If that program doesn't work -- please post your updated YAML. You could also post the PDF output of the rxgraph tool.

-Fergs

lnxfergy
10-22-2010, 12:54 PM
I've been ironing out an 0.3.2 release feature list -- and wanted to point out a few major changes that will be included (and how they may affect your use of parameters, etc):


Add support for dynamic reconfigure of the scanning range of the PML. This will include several changes to the parameter API -- instead of being specified as a start/step/count parameter, it will become center angle and count parameters.
Add support for multiple PMLs. The pml_id parameter is already added to handle multiple PMLs in the hardware, just not actually doing anything yet. Each PML (like any other sensor/controller) already has a name to differentiate within ROS.
Add example launch and configuration files that show how to use the PML with a costmap_2d.

0.3.2 will also hopefully include first versions of the joint trajectory controller and battery monitor sensor (but depending on how long they take to work out... may not).

Again, feedback is appreciated as we move towards a stable 1.0 release (albeit slowly).

-Fergs

Pi Robot
10-22-2010, 03:03 PM
Just a note to all the ArbotiX / ROS fans here: I have removed the "arbotix" folder from my own Pi Robot SVN site (posted earlier in this thread) to reduce confusion with Mike's official source. I only needed my own copy in the beginning when learning from Mike's code. Henceforth, I'll be using Mike's releases straight from his repository.

--patrick

Peter_heim
10-22-2010, 03:48 PM
Hi Fergs
Here is a copy of the YAML file and a PNG from rosgraph -- I can't do it in PDF :sad:

port: /dev/ttyUSB0
rate: 15
#use_sync: False
sensors: {
"pml": {type: pml, id: 1}
}
controllers: {
"b_con": {type: base_controller,base_width: 0.14,ticks_meter: 189}

}
#dynamixels: {
# head_pan_joint: {id: 1}
#}

regards peter

Pi Robot
10-22-2010, 03:56 PM
Hi Peter,

I'm guessing you got the arbotix_teleop.py node from my "arbotix" folder before I deleted it, but it is not part of Mike's distribution. It is something I cobbled together some time ago to control the robot using the arrow keys. I would recommend using Mike's controllerGUI.py node instead.

--patrick

Peter_heim
10-22-2010, 04:05 PM
Hi Patrick
arbotix_teleop.py is from your site (just the name changed). I have tried both but neither moves the motors. If I use rostopic echo cmd_vel I can see the changes (both are similar).

regards peter

Pi Robot
10-22-2010, 08:58 PM
Hi Peter,

I'm afraid I only used the pi_robot_teleop.py node for moving my robot via the Serializer rather than the ArbotiX. I just got the Pololu motor controller in the mail so I can now try driving my motors using the ArbotiX, hopefully some time this next week. In the meantime, I can't test either the teleop code or controllerGUI.py directly on the ArbotiX.

--patrick

lnxfergy
10-22-2010, 09:04 PM
Peter, did you enable motor driver support when compiling and uploading the ROS firmware? (uncomment the #USE_BASE, and #USE_BIG_MOTORS if using the Pololu motor drivers)

-Fergs

lnxfergy
10-23-2010, 02:45 PM
FYI -- I'd recommend everyone re-checkout/update their 0.3.1 release. I apparently botched how I created the 0.3.1 tag, and it wasn't actually the 0.3.1 release most people would have gotten. It has now been fixed in SVN.

-Fergs

Peter_heim
10-23-2010, 05:28 PM
Hi Fergs
Just tried to upload the 0.3.1 firmware to the ArbotiX and received the following error
in
pml.setupStep(step_start,step_value,params[k]);


ros.cpp: In function 'unsigned char handleWrite()':
ros:248: error: 'class PML' has no member named 'setupStep'

I'm using robocontroller .009
0.3.1 i downloaded after your update

regards Peter

lnxfergy
10-23-2010, 09:12 PM
Hi Fergs
Just tried to upload the 0.3.1 firmware to the ArbotiX and received the following error
in
pml.setupStep(step_start,step_value,params[k]);


ros.cpp: In function 'unsigned char handleWrite()':
ros:248: error: 'class PML' has no member named 'setupStep'

I'm using robocontroller .009
0.3.1 i downloaded after your update

regards Peter

Whoops -- I had uploaded 0009 -- but I hadn't committed the last round of changes. I just posted an 0010 release which is what I've *actually* been using here and thought the 0009 release was. If you pull down the updated robocontrollerlib, you should be in business.

-Fergs

Peter_heim
10-24-2010, 07:19 AM
Hi Fergs


Whoops -- I had uploaded 0009 -- but I hadn't committed the last round of changes. I just posted an 0010 release which is what I've *actually* been using here and thought the 0009 release was. If you pull down the updated robocontrollerlib, you should be in business.

It works after the update; I had to put the reset-enable jumper back to upload the program. The other issue I have is my left motor is not moving forward but it is moving backwards; I have checked my wires and they are ok. Is there a way to read just the left or right encoder? rostopic echo odom is not very clear.

regards peter

lnxfergy
10-24-2010, 08:20 AM
Hi Fergs


It works after the update; I had to put the reset-enable jumper back to upload the program. The other issue I have is my left motor is not moving forward but it is moving backwards; I have checked my wires and they are ok. Is there a way to read just the left or right encoder? rostopic echo odom is not very clear.

regards peter

Ok, so eventually we'll have a tutorial that covers the following:

1. You can turn on debug output by changing the first lines of ArbotiX_ROS. You need to uncomment the line with log_level=rospy.DEBUG, and comment out the other line with rospy.init_node:



class ArbotiX_ROS(ArbotiX):

    def __init__(self):
        rospy.init_node('arbotix', log_level=rospy.DEBUG)
        #rospy.init_node('arbotix')


You'll then get lots of encoder output in the debugging system. Easiest way to view it is to open rxconsole. When rotating wheels such that the bot would drive forward, encoder counts should increase.

Alternatively, a possibly easier way is to just disconnect the motors and leave the encoders attached. Launch RViz, set the fixed frame to "odom", set your robot on the floor, push the bot forward on the floor, and the base_link TF frame should move forward from the odom frame (there's also a small terminal sketch for this after step 2 below).

2. To check motor wiring, disconnect encoders, and then use a teleop node to give the robot a forward movement command -- both leds on the Pololu motor driver should glow green (or you've got a wiring problem between the ArbotiX and motor driver), and the wheels should drive forward -- if one or both doesn't drive forward, reverse the way it's plugged into the pololu motor driver.
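And if you'd rather watch the odometry from a terminal than from RViz (the alternative in step 1 above), a small listener sketch like this works too -- assuming the standard tf API and the default frame names:

import rospy
import tf

rospy.init_node('odom_check')
listener = tf.TransformListener()
rate = rospy.Rate(2.0)
while not rospy.is_shutdown():
    try:
        # Where is base_link relative to odom right now?
        (trans, rot) = listener.lookupTransform('/odom', '/base_link', rospy.Time(0))
        print("base_link at x=%.3f y=%.3f in the odom frame" % (trans[0], trans[1]))
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        pass
    rate.sleep()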

-Fergs

Pi Robot
10-24-2010, 10:45 AM
Hey Fergs,

I have the ArbotiX joint_controller running along with controllerGUI.py and when I move the pan/tilt sliders on the GUI I get good movement of the servos and a corresponding update in RViz of my robot model. However, there is a noticeable delay between the real-world movement of the servos and the movement in RViz--I'd say as much as 0.5 second. My Update Interval for TF in RViz is set to 0 which I believe is as fast as possible but I haven't been able to find an update interval parameter for RViz itself. Any ideas?

--patrick

Pi Robot
10-24-2010, 11:01 AM
Ooops--my bad. My ArbotiX "rate" parameter was still set to 1 from my PML experiments. Setting it to 10 solved the delay issue.

--patrick

Peter_heim
10-25-2010, 05:28 AM
Hi Fergs
Thanks for the tip. Traced my problem to a crossed wire (10 and 15) -- no more checking wires after a glass of red :happy:. The robot now moves and I can see the PML data in RViz; now to set up tf and nav.
The original problem where I had no movement I traced to the ticks per meter being too low (189 ticks).

regards peter

lnxfergy
10-25-2010, 09:51 AM
Patrick --
I'd be interested in how well higher rate joint_state publication works for you. I've only been using a hard connection through USB thus far, since all my bots carry netbooks around on them. I'm interested to see how XBEE works, but haven't had time to really test it. The default rate is 15.0 -- and I've seen it work at up to about 60.0hz with a moderate number of servos and a hard connection.

Peter--
Make sure your ticks_meter parameter is correct for your encoders/wheel combo: you'll have to compute the wheel circumference in meters (C = pi*Diameter in meters) and then your ticks_meter is the # of ticks per rotation of the wheel divided by the circumference of the wheel.

The issue you may be having, is if your encoder count is very low, the PID parameters will not work. You'll note that our default PID parameters of 5,1,0,50 are for a ticks_meter of ~20k->60k.

-Fergs

Peter_heim
10-25-2010, 03:11 PM
Hi Fergs
Thanks for that. I changed my wheels and encoders; I now have 5189 clicks per meter. I'll save the fine tuning until the weekend.

The issue you may be having, is if your encoder count is very low, the PID parameters will not work. You'll note that our default PID parameters of 5,1,0,50 are for a ticks_meter of ~20k->60k.

-Fergs

As a rule of thumb, does the PID go lower or higher for lower ticks per meter?

regards peter

lnxfergy
10-25-2010, 03:40 PM
As a rule of thumb, does the PID go lower or higher for lower ticks per meter?

It's tough to say. The base_controller docs have the calculation that is used: http://www.ros.org/wiki/arbotix/base_controller

The Ko (output scale) probably should vary *with* the ticks per meter, so fewer ticks means a lower Ko. The other parameters are harder to give a rule of thumb for; however, too high of a Kp/Ko ratio will cause oscillation. This presentation by Larry Barello (of the Seattle Robotics Society) is about the best thing I've ever come across as far as understanding PID tuning: http://barello.net/Papers/Motor%20Control,%20SRS%2020080621.pdf
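
As a very rough first guess at what "varies with ticks_meter" might look like (this is only a starting-point heuristic, not something I've tested -- it just scales the documented default Ko linearly):

default_ko = 50                 # from the documented defaults (5, 1, 0, 50)
default_ticks_meter = 40000.0   # middle of the ~20k-60k range those defaults assume
my_ticks_meter = 5189.0         # example: Peter's value from above

ko_guess = max(1, int(default_ko * my_ticks_meter / default_ticks_meter))
print(ko_guess)                 # ~6 -- then hand-tune Kp/Ko from there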

I'm hoping to eventually have some heuristics that help calibrate the PID a bit better to start with -- Pi, the serializer has adjustable PID parameters, right? How have you dealt with that?

-Fergs

RobotAtlas
10-25-2010, 08:06 PM
It won't make the top-10 videos of the year, but here is a short demo on using Mike's PML code and ROS to drive Pi Robot to a few manually specified locations while avoiding obstacles.

One thing I noticed in the video around 1:26 is that when the robot is rotating, the scan is rotating with it. Thus the position of the couch is detected incorrectly.

After looking at how pml.py publishes LaserScan, I think I see why this is happening.
The whole scan of ranges (in one direction) gets the same timestamp and uses the same transform, which is fine when the robot is stationary, but creates a problem when the robot is rotating _while_ the PML is also scanning.

Am I thinking correctly that this can be addressed by posting as many LaserScan messages as there are samples, with different timestamps?
Then rviz will transform each sample using correct values of sensor frame and robot base frame.

I know ROS guys are making a new message for sonars/IR sensors, but it’s not available in cturtle.

Also I was looking at what ROS people did when making Lego NXT assisted_teleop.
I think it is as close an application to the PML as you can get, except they don't do panning and they use the NXT ultrasonic sensor.
Anyway, what they do is they take each range reading and then convert it to a PointCloud message (with many points for each range reading).
I wonder if the same approach could be used in PML, by converting each sample into a PointCloud message. Each sample would need to have its own x,y,z calculated using tf from robot position at the (approximate) time of each sample.

What do you guys think?

lnxfergy
10-25-2010, 09:02 PM
One thing I noticed in the video around 1:26 is when robot is rotating, the scan is rotating with it. Thus the position of couch is detected incorrectly.

After looking at how pml.py publishes LaserScan, I think I see why this is happening.
The whole scan of ranges (in one direction) get the same timestamp and uses the same transform. Which is fine when robot is stationary, but creates a problem when robot is rotating _while_ PML is also scanning.

False. RViz does not interpolate the laser scan, but other packages such as costmap_2d do (using the laser_geometry (http://www.ros.org/wiki/laser_geometry) package). Yes the LaserScan message has a timestamp, and this is the beginning of the scan -- but the time between points is also transmitted in a LaserScan message. In fact, the 0.3.1 PML actually switches the direction of scan to account for the left/right movement of the sensor head so that the interpolation is done in the right order.


Am I thinking correctly that this can be addressed by posting as many LaserScan messages as there are samples, with different timestamps?
Then rviz will transform each sample using correct values of sensor frame and robot base frame.
I know ROS guys are making a new message for sonars/IR sensors, but it’s not available in cturtle.

This would actually make the PML pretty much useless -- it would no longer work with the costmap_2d package, and thus would no longer work with the navigation stack.


Also I was looking at what ROS people did when making Lego NXT assisted_teleop. I think it is the closest application to PML as it can get, except they don't do panning and they use NXT ultrasonic sensor.
Anyway, what they do is they take each range reading and then convert it to a PointCloud message (with many points for each range reading).
I wonder if the same approach could be used in PML, by converting each sample into a PointCloud message. Each sample would need to have its own x,y,z calculated using tf from robot position at the (approximate) time of each sample.

My understanding is that all points in a cloud have the same timestamp -- you can't interpolate a point cloud because it's assumed that the underlying sampling method (camera) is globally synchronized. Thus, an interpolated LaserScan is what we want -- and what we have (it eventually gets converted to a point cloud inside costmap_2d using laser_geometry, saving the overhead of transmitting the much larger point cloud message).

-Fergs

RobotAtlas
10-25-2010, 09:28 PM
but the time between points is also transmitted in a LaserScan message.
Are you talking about the time_increment field in the LaserScan message? If yes, I couldn't find where it's set in pml.py. I looked there today. Can you point me to where it's set?

What do you think is the reason for the effect I described in my previous post?

lnxfergy
10-25-2010, 09:39 PM
Are you talking about time_increment field in LaserScan message? If yes, I couldn't find where it's set in pml.py.I looked there today. Can you point me to where it's set?

Scan_time is being set... although apparently, this might not be giving the best performance. I will investigate that.


What do you think is the reason for the effect I described in my previous post.

The primary reason is that RViz does not do the more complex interpolation (as far as I know). If you were to look at the costmap_2d output at the same time, it looks much better than the RViz display of the LaserScan.

-Fergs

RobotAtlas
10-25-2010, 09:45 PM
False. RViz does not interpolate the laser scan, but other packages such as costmap_2d do (using the laser_geometry (http://www.ros.org/wiki/laser_geometry) package).

Thanks for that laser_geometry reference. They actually explain how interpolation of data works.
With that better understanding, wouldn't these 3 fields in PML:

scan.angle_min = (self.step_start-512) * 0.00511
scan.angle_max = (self.step_start+self.step_value*(self.step_count-1)-512) * 0.00511
scan.angle_increment = self.step_value * 0.00511

be zeroes, and then transformLaserScanToPointCloud("base_link", ...) be used to account for "situations where the laser is moving and skew needs to be accounted for."

What I don't understand is how tilting angles of LiDAR are defined in PR2. Do they just get determined using tf and laser_link?

RobotAtlas
10-25-2010, 09:54 PM
Scan_time is being set... although apparently, this might not be giving the best performance. I will investigate that.

I see you have this:
PML_TIME_L = 89 # time offset from first reading

But I don't see it used anywhere.

On a related note, it looks like transformLaserScanToPointCloud () is not available in Python. But that's what we absolutely need for PML.
Do you have plans of using C++ for ArbotiX code? I guess the main (only?) downside is people would have to use rosmake now, right?

lnxfergy
10-25-2010, 10:07 PM
Thanks for that laser_geometry reference. They actually explain how interpolation of data works.
With that better understanding, wouldn't these 3 fields in PML:

scan.angle_min = (self.step_start-512) * 0.00511
scan.angle_max = (self.step_start+self.step_value*(self.step_count-1)-512) * 0.00511
scan.angle_increment = self.step_value * 0.00511

be zeroes, and then transformLaserScanToPointCloud("base_link", ...) be used to account for "situations where the laser is moving and skew needs to be accounted for."

What I don't understand is how tilting angles of LiDAR are defined in PR2. Do they just get determined using tf and laser_link?

No. You can't set angle_min/angle_max/etc to 0 -- these are the geometry of the laser scanner (where it takes samples) -- see the LaserScan message. The tilting angle is entirely external to the LaserScan - it's based on a tf transform of the laser frame.


I see you have this:
PML_TIME_L = 89 # time offset from first reading

But I don't see it used anywhere.

On a related note, it looks like transformLaserScanToPointCloud () is not available in Python. But that's what we absolutely need for PML.
Do you have plans of using C++ for ArbotiX code? I guess the main (only?) downside is people would have to use rosmake now, right?

That definition is a register definition -- in the firmware, we store the offset (in milliseconds) from when the first scan point was taken until the scan was "latched" to be read by the PC. We pass that address when we first read the laser_scan.

We don't need to do the transformation locally -- it's better to publish the laser scan and let those receiving the information decide how to use it (such as costmap_2d does using laser_geometry).

As for C++/Python -- the core arbotix package will continue to be Python based -- as the underlying non-ROS specific code we're using is Python based for better cross-platform portability. Note that you still have to use make to create the message definitions. Other packages in the vanadium_drivers stack may be C++ (for instance, the new branch includes a "simple_controllers" package that will include C++ implementations of action-based controllers that can sit on top of the arbotix package).

-Fergs

RobotAtlas
10-25-2010, 11:21 PM
No. You can't set angle_min/angle_max/etc to 0 -- these are the geometry of the laser scanner (where it takes samples) -- see the LaserScan message. The tilting angle is entirely external to the LaserScan - it's based on a tf transform of the laser frame.

So it looks like we are on the same page, except I was thinking, based on this from
http://www.ros.org/doc/api/sensor_msgs/html/msg/LaserScan.html:

Header header # timestamp in the header is the acquisition time of
# the first ray in the scan.

and this:
float32 scan_time # time between scans [seconds]

that they fit all "rays" of one scan from tilting laser in one LaserScan message.
So if I follow their "ray in the scan" terminology, then time_increment must be how long it takes to go from one point to another within one "ray". Is that right? The definitive answer would be in the code that uses the message. The question is how to find that code -- there's just so much stuff in ROS...

Also, do they use (angle_max - angle_min) /angle_increment to determine number of points in a "ray"? If yes, then PML would have 1 point in a "ray" (also instant) with panning being equivalent to laser tilting.
While this will make timing of PML conform to LiDAR, the orientation would be wrong.

Is that the reason you are using PML data the other way around?

lnxfergy
10-25-2010, 11:51 PM
So it looks like we are on the same page, except was thinking based on this from
http://www.ros.org/doc/api/sensor_msgs/html/msg/LaserScan.html:

Header header # timestamp in the header is the acquisition time of
# the first ray in the scan.

and this:
float32 scan_time # time between scans [seconds]

that they fit all "rays" of one scan from tilting laser in one LaserScan message.
So if I follow their "ray in the scan" terminology, then time_increment must be used for how long it takes to go from one point to another point in one "ray". Is it right? The definite answer would be in the code that uses message. The question is how to find that code? There's just so much stuff in ROS...

A LaserScan is just that -- a "scan" of measurements, where each measurement is based on a ray of light being projected from a laser at a given angle within the 2D plane. Thus, a scan is a 2D construct -- to describe it we need the minimum angle at which a ray is traced (angle_min), and the max angle (and, redundantly we can have the angle_increment between two adjacent rays).

There is a bug then in the PML, in that it currently posts scan_time -- which is effectively the time between complete scans -- and is apparently only used to determine if the laserscan publisher has gone down. There is then a second parameter, time_increment, which is actually the time between adjacent measurements. (I've checked the code for the costmap_2d package, and this is the case).
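
To make the distinction concrete, here's roughly how the two fields should be filled in for a PML-style scan (the 1.5 second sweep and 30 readings are just example numbers, and this isn't the exact pml.py code):

from sensor_msgs.msg import LaserScan

scan = LaserScan()
num_readings = 30        # e.g. 30 samples across the field of view
sweep_duration = 1.5     # e.g. seconds for one full sweep of the servo

# time between two adjacent range measurements within this scan:
scan.time_increment = sweep_duration / (num_readings - 1)
# time between the starts of two consecutive scans (for the PML this is
# roughly the sweep duration itself, since it scans back and forth):
scan.scan_time = sweep_duration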


Also, do they use (angle_max - angle_min) /angle_increment to determine number of points in a "ray"? If yes, then PML would have 1 point in a "ray" (also instant) with panning being equivalent to laser tilting.
While this will make timing of PML conform to LiDAR, the orientation would be wrong.

Is that the reason you are using PML data the other way around?
A tilting laser is an entirely different beast. You are assembling multiple 2D scans into a 3D cloud. Over time, the tilting assembly moves the 2D scanner through a 3rd dimension - tf and the laser_assembler package are then used to assemble 3d clouds.

The reason that we post LaserScan messages from the PML is that we are trying to emulate a low-cost laser scanner alternative. It's just a *really* slow 2D laser scanner that doesn't use a laser; it's not a 3D device (hence publishing 30-pt LaserScans, not 30 1-pt LaserScans). Hook it up -- try it out before you try to re-engineer it. A considerable amount of time has gone into the design, into refining it, and into getting it working. It clearly does the job of obstacle avoidance quite well for its cost.

As for localization, which has been a previous topic of discussion -- making robust localization work on any device that has only 30 point measurements per second is going to be quite difficult (if not impossible, as far as robustness goes). I've now gotten to the point that gmapping is able to semi-produce a map (until we start to track back over the area, then all hell breaks loose). For instance, the following 3 images are all of maps of our hallway -- the first is with a real Hokuyo laser, the second is with the PML but before the robot starts backtracking over space, and the third is the final (and totally useless) map generated by the PML:

http://forums.trossenrobotics.com/gallery/files/1/7/6/8/map-ils.png

http://forums.trossenrobotics.com/gallery/files/1/7/6/8/map_pml_partial.png

http://forums.trossenrobotics.com/gallery/files/1/7/6/8/map_pml_final.png

The PML maps are *after* the time bug is fixed (previously, it was completely useless 100% of the time). The PML maps also use scan doubling -- that is, we interpolate between each point, so there are a total of 59 points published (without doubling, the scan matcher completely fails, and maps immediately look like the last image).

-Fergs

RobotAtlas
10-26-2010, 01:12 AM
A LaserScan is just that -- a "scan" of measurements, where each measurement is based on a ray of light being projected from a laser at a given angle within the 2D plane. Thus, a scan is a 2D construct
Then this description in the LaserScan header is totally confusing:
> timestamp in the header is the acquisition time of the first ray in the scan.

I interpreted it as there are multiple rays in a scan. But I don't have Hokuyo nor enough experience with ROS, so I'll just trust you on that.


There is a bug then in the PML.
It would be interesting to see how Pi Bot is seeing the couch with the new code.



Hook it up -- try it out, before you try to re-engineer it.
I guess I would need ArbotiX for that, ha. I'm still waiting for that Interbotix kit to buy all together. It looks like my wait is almost over.

I was not trying to re-engineer anything. I'm just trying to understand how it works so I can make a PML out of Legos while I wait for the kit. I think I know how it sounds. But no, I'm not crazy, just inexperienced. :)


and the third is the final (and totally useless) map generated by the PML:
http://forums.trossenrobotics.com/gallery/files/1/7/6/8/map_pml_final.png


This is a very promising result. And I don't think it's useless at all. I can totally see how your robot drove forward, turned right, drove more, turned around, drove more, turned left, and drove more to return to where it came from. Right?
Maybe I'm just imagining things...

I think the odometry is not accurate enough and that's what's messing up the map.
That's why I'm going to see how gyro and accelerometer are going to help with navigation on NXT. I just got those two sensors today.



The PML maps also use scan doubling -- that is, we interpolate between each point, so there are a total of 59 points published (without doubling, the scan matcher completely fails, and maps immediately look like the last image).

Do you mean you created 1 imaginary point in between real points?
I'm guessing you didn't try creating 3 or 10 because it was really late where you are.

I would think reducing the servo step between samples (we have enough resolution in the AX-12, right?), taking longer for each scan, plus stopping until the scan is completely done, would improve that map significantly.

I have to say I'm really impressed with what you are able to do with such cheap hardware already. And all this in less than 2 months since you were saying "no mapping, guys". :)

SK.
10-26-2010, 02:03 AM
The PML maps are *after* the time bug is fixed (previously, it was completely useless 100% of the time). The PML maps also use scan doubling -- that is, we interpolate between each point, so there are a total of 59 points published (without doubling, the scan matcher completely fails, and maps immediately look like the last image).

-Fergs
I guess you want to keep using GMapping as a black box (e.g. keep feeding it laser scan topics) and don't want to write your own sensor model (the PML is probably somewhere in between ultrasonic sensors and commercial LIDARs in terms of noise and sampling density, from what I can see).
The distance samples themselves don't look too bad, if a little sparse. So what I'd try, if I had a PML and time, is extracting lines from the PML scan (some ideas are here: http://infoscience.epfl.ch/record/97571/files/nguyen_2005_a_comparison_of.pdf) and then republishing a dense laser scan message constructed by sampling the found line features in the PML scan.

Peter_heim
10-26-2010, 02:18 AM
Hi all

Do you mean you created 1 imaginary point in between real points?
I'm guessing you didn't try creating 3 or 10 because it's was really late where you are.

I would think reducing servo step between samples (we have enough resolution in that AX-12, right?) and taking longer to take each scan, plus stopping until scan is completely done would improve that map significantly.

I have to say I'm really impressed with what you are able to do with such cheap hardware already. And all this in less than 2 months since you were saying "no mapping, guys". :)

Is there any reason why you can't increase the number of sensors to, say, 10, spin the base, and read the speed with an encoder? Would that give you a high enough resolution to make mapping work?

As it is at the moment my robot can see the doorway with the PML, something it couldn't before using sonar and RoboRealm. Keep up the good work.

regards peter

lnxfergy
10-26-2010, 06:43 AM
Is there any reason why you can't increase the number of sensors to, say, 10, spin the base, and read the speed with an encoder? Would that give you a high enough resolution to make mapping work?

Sure, but at 10 servos + 10 long-range IR, you're almost half way to the lower-cost Hokuyo! (which would also be a whole lot quieter)


As it is at the moment my robot can see the doorway with the PML, something it couldn't before using sonar and RoboRealm. Keep up the good work.

Glad to hear!

-Fergs

lnxfergy
10-26-2010, 06:43 AM
I guess you want to keep using GMapping as a black box (e.g. keep feeding it laser scan topics) and don't want to write your own sensor model (the PML is probably somewhere inbetween ultrasonic sensors and commercial LIDARs in terms of noise and sampling density from what I can see).
I've actually had to apply several patches to gmapping to even get this far -- but I haven't yet written a custom sensor model.


The distances samples themselves don't look to bad, if only a little sparse. So what I'd try if I had a PML and time is extracting lines from the PML scan (some ideas are here: http://infoscience.epfl.ch/record/97571/files/nguyen_2005_a_comparison_of.pdf) and then republishing a dense laser scan message constructed by sampling the found line features in the PML scan.
I've been trying to avoid building too much extra "fake" information into each scan, particularly because we're using this sensor primarily for obstacle avoidance, and in the current environment there's a bunch of tables with pesky skinny legs (which also happen to be round, and black, and shiny, and cause problems for the lower-end Hokuyo as well).

-Fergs

RobotAtlas
10-26-2010, 10:25 AM
is there any reason why you can't increase the number of sensors to say 10

I'm thinking of trying 2 GPY0A710 sensors on one panning servo.
Each at 90 degrees rotation to the other, so the servo only has to rotate through 90 degrees, yet cover 180 degrees.
If that works, adding 2 more long range servos would increase the number of points to 120.
For a total cost of only $39 * 4 + $43 = $199.
Except the GPY0A710's min range is 100 cm. So once the robot approaches a doorway, it's not going to see it anymore. It's good our robots are relatively small in size. They should have no problem going through the door, even blindly. It's a good thing short range IR sensors are much cheaper.
Pololu has a good price on them. GP2YA21 (10-80 cm) is $9.95 http://www.pololu.com/catalog/product/136

RobotAtlas
10-26-2010, 10:30 AM
there's a bunch of tables with pesky skinny legs (which also happen to be round, and black, and shiny, and cause problems for the lower-end Hokuyo as well).

It couldn't get any worse, could it? Or wait, it could. They could've been made out of glass. :)

lnxfergy
10-26-2010, 12:31 PM
I had entered most of this in earlier this morning, but a browser crash lost it. Didn't have time to respond again, so here we go:


This is very promising result. And I don't think it's useless at all. I can totally see how your robot drove forward, turned right, drove more, turned around, drove more, turned left, drove more to return to where it came from. Right?

I think odometry is not accurate enough and that's what messing up the map.
That's why I'm going to see how gyro and accelerometer are going to help with navigation on NXT. I just got those two sensors today.
That map is pretty useless - you can't use it for global path planning, nor localization, which are the two main purposes for a static map in ROS. As for odometry, this was collected on an iRobot Create, going approximately 0.1m/s. While the odometry is not excellent, the overall odometry was within about 3ft (with nearly perfect heading) at the end of the approximately 130ft run.

The problem here is actually with scan matching. If you were to watch the gmapping output, you'd notice that the scan matcher fails about 50-60% of the time, and has to resort to odometry. For those who've read the paper on gmapping, the authors cite that they only saw their system "fail once" to match scans in the MIT map or the Freiburg map. The fact that both of these are much larger maps shows just how ineffective the PML is for scan matching -- and that's due to 2 things: first, the slow scan, and second, the sensor itself is not nearly as accurate as a typical lidar.


Do you mean you created 1 imaginary point in between real points?
I'm guessing you didn't try creating 3 or 10 because it's was really late where you are.

I would think reducing servo step between samples (we have enough resolution in that AX-12, right?) and taking longer to take each scan, plus stopping until scan is completely done would improve that map significantly.

I didn't try more points because introducing "fake" data could miss important things in the environment (see discussion on table legs above). As it is, I would not recommend scan doubling in general -- using it for obstacle avoidance with a costmap_2d could be disastrous -- I only turned it on for this data collection because a human was driving the robot and gmapping could almost never match scans when they had only 30 points.


I have to say I'm really impressed with what you are able to do with such cheap hardware already. And all this in less than 2 months since you were saying "no mapping, guys". :)

That last scan is almost identical to the results from 2 months ago (I just never posted an image). I believe my quote at that time was that it wouldn't work for mapping "using the existing ROS system". While it's possible that this could be made to work with some other map-generation, global planning, and localization system, that represents a huge effort. Seeing as the navigation stack developers are also contemplating a move to nodelet-based plugins down the road, it might be a particularly bad time to invest the effort in creating such extensions/replacements for the existing components.

-Fergs

SK.
10-26-2010, 03:46 PM
I've been trying to avoid building too much extra "fake" information into each scan, particularly because we're using this sensor primarily for obstacle avoidance, and in the current environment there's a bunch of tables with pesky skinny legs (which also happen to be round, and black, and shiny, and cause problems for the lower-end Hokuyo as well).
Well for that job, the PML seems to be the wrong tool to me, with the sparse readings it gives. So from the limited information I have, I'd try a dual sensor strategy:
-Use filtered PML data to feed GMapping and generate a map and pose estimate
-Use ultrasonic sensors with their wider measurement cone to do mapping with known poses (we have those from GMapping) and detect/map those pesky small objects.
-Fuse information from both maps in a costmap used for navigation.

That of course would probably require a lot of fiddling with the navigation stack.

lnxfergy
10-26-2010, 04:09 PM
Well for that job, the PML seems to be the wrong tool to me, with the sparse readings it gives. So from the limited information I have, I'd try a dual sensor strategy:
-Use filtered PML data to feed GMapping and generate a map and pose estimate
-Use ultrasonic sensors with their wider measurement cone to do mapping with known poses (we have those from GMapping) and detect/map those pesky small objects.
-Fuse information from both maps in a costmap used for navigation.

That of course would probably require a lot of fiddling with the navigation stack.

Actually, because the platform is moving, the coverage is pretty good out of the PML. Using a reasonable size costmap with ROS, we do pretty good at avoiding obstacles (I haven't seen much issue with missing an obstacle, we have more issues with extra obstacles appearing and limiting the planning space, however the ROS clearing behaviors do a good job of fixing that). What I don't want to have happen is that "fake" readings clear out real ones. I'd be worried that the addition of sonars would further exacerbate the issue of shrinking the available paths.

Just as a quick overview of how I'm currently using this:


Enable intelligent navigation on small/cheap platforms.
Replace laser scanner with PML for obstacle avoidance.
Replace laser scanner with visual markers for localization.

Most of the current work has been done on an iRobot Create, with a PML + pan/tilt head with webcam, and a small netbook for brains. I've been using the navigation stack as much as possible (to avoid creating too much custom code that has to be maintained). To that end, we use the navigation stack's planners and costmaps (we use rolling costmaps for both local and global planning, which does lead to sub-optimal global planning, but is a necessary side-effect of avoiding gmapping/amcl). We currently use the ROS global planner; however, down the road I'd like to replace it with something more suited to our localization/planning scheme. We've replaced the existing gmapping/amcl localization system with a simple pose adjustment based on visual markers. In the environment I'm currently using this robot (a campus lab), we have attached ar_pose markers under the University sign that marks each office. We've then created a map of the marker locations, and we can do our localization/planning at the topological/semantic level rather than at the pose level (which lends itself well to natural language instructions, assuming you have defined the map using natural language names for markers).

-Fergs

Pi Robot
10-26-2010, 04:22 PM
I'm hoping to eventually have some heuristics that help calibrate the PID a bit better to start with -- Pi, the serializer has adjustable PID parameters, right? How have you dealt with that?

-Fergs

Hey Fergs,

Yeah, it took me awhile to figure out the PID parameters for the Serializer. I'm not sure any of this will help with the ArbotiX, but I'll post it anyway. The documentation is pretty good and is found here:

http://www.roboticsconnection.com/multimedia/docs/Serializer_3.0_UserGuide.pdf

The PID explanation is toward the end of the manual. Using this as a guide, I use the following parameters for PID calculations in the PySerializer library:

UNITS = 0 # 1 is inches, 0 is metric (cm for sensors, meters for wheels measurements) and 2 is "raw"
WHEEL_DIAMETER = 0.132 # meters (5.0 inches) meters or inches depending on UNITS
WHEEL_TRACK = 0.3365 # meters (12.8 inches) meters or inches units depending on UNITS
ENCODER_RESOLUTION = 624 # encoder ticks per revolution of the wheel without external gears
GEAR_REDUCTION = 1.667 # This is for external gearing if you have any. In this case there is a 60/36 tooth gear ratio.

ENCODER_TYPE = 1 # 1 = quadrature, 0 = single
MOTORS_REVERSED = True # Multiplies encoder counts by -1 if the motor rotation direction is reversed.

VPID_P = 2 # Proportional
VPID_I = 0 # Integral
VPID_D = 5 # Derivative
VPID_L = 45 # Loop: this together with UNITS and WHEEL_DIAMETER determines real-world velocity

DPID_P = 1 # Proportional
DPID_I = 0 # Integral
DPID_D = 0 # Derivative
DPID_A = 5 # Acceleration
DPID_B = 10 # Dead band

MILLISECONDS_PER_PID_LOOP = 1.6 # Do not change this! It is a fixed property of the Serializer PID controller.
LOOP_INTERVAL = VPID_L * MILLISECONDS_PER_PID_LOOP / 1000 # in seconds

When the library is first imported, the following calculation is performed:

self.ticks_per_meter = int(self.encoder_resolution * self.gear_reduction / (self.wheel_diameter * math.pi))

Then, to move the robot a given distance and speed in meters and meters per second I use:


def travel_distance(self, dist, vel):
    ''' Move forward or backward 'dist' (inches or meters depending on units) at speed 'vel'.
        Use negative distances to move backward.
    '''
    revs_per_second = float(vel) / (self.wheel_diameter * math.pi)

    ticks_per_loop = revs_per_second * self.encoder_resolution / self.loop_interval
    vel = int(ticks_per_loop / self.VPID_P)

    revs = dist / (self.wheel_diameter * math.pi)
    ticks = revs * self.encoder_resolution * self.gear_reduction
    self.digo([1, 2], [ticks, ticks], [vel, vel])

The "digo" command is the Serializer's firmware command to move the motors a given number of encoder ticks at a speed related to the number of ticks per processing loop as related above.

Similarly, for a rotation through a given number of radians:


def rotate(self, angle, vel):
    ''' Rotate the robot through 'angle' degrees or radians at speed 'vel'. Use the ROS convention for right handed
        coordinate systems with the z-axis pointing up so that a positive angle is counterclockwise.
        Use negative angles to rotate clockwise.
    '''
    if self.units == 0:
        revs_per_second = float(vel) / (2.0 * math.pi)
        rotation_fraction = angle / (2.0 * math.pi)
        full_rotation_dist = self.wheel_track * math.pi
    elif self.units == 1:
        revs_per_second = float(vel) / 360.
        rotation_fraction = angle / 360.
        full_rotation_dist = self.wheel_track * 2.54 / 100 * math.pi

    vel = revs_per_second * full_rotation_dist # in m/s

    rotation_dist = rotation_fraction * full_rotation_dist
    revs = rotation_dist / (self.wheel_diameter * math.pi)

    ticks = revs * self.encoder_resolution * self.gear_reduction
    self.digo_m_per_s([1, 2], [ticks, -ticks], [vel, vel])

where


def digo_m_per_s(self, id, dist, vel):
    ''' Identical to digo but specifies velocities in meters per second.
    '''
    if type(id) == int: id = [id]
    if type(dist) == int: dist = [dist]
    if type(vel) == int: vel = [vel]

    spd = list()
    for v in vel:
        if self.units == 0:
            revs_per_second = float(v) / (self.wheel_diameter * math.pi)
        elif self.units == 1:
            revs_per_second = float(v) / (self.wheel_diameter * math.pi * 2.54 / 100)
        ticks_per_loop = revs_per_second * self.encoder_resolution * self.loop_interval * self.gear_reduction
        spd.append(int(ticks_per_loop))

    return self.execute('digo %s' %' '.join(map(lambda x: '%d:%d:%d' %x, zip(id, dist, spd))))

--patrick

Pi Robot
10-26-2010, 06:39 PM
FYI -- I'd recommend everyone re-checkout/update their 0.3.1 release. I apparently botched how I created the 0.3.1 tag, and it wasn't actually the 0.3.1 release most people would have gotten. It has now been fixed in SVN.

-Fergs

Hey Fergs,

I just did a fresh checkout of 0.3.1 and I am getting a failed dependency on trajectory_msgs. More specifically, this is what happens:

* check out 0.3.1
* cd arbotix
* rosdep install arbotix

Failed to find stack for package [trajectory_msgs]

Any thoughts?

--patrick

lnxfergy
10-26-2010, 07:20 PM
Hey Fergs,

I just did a fresh checkout of 0.3.1 and I am getting a failed dependency on trajectory_msgs. More specifically, this is what happens:

* check out 0.3.1
* cd arbotix
* rosdep install arbotix

Failed to find stack for package [trajectory_msgs]

Any thoughts?

--patrick

Wow.. who knew you could screw up a release so many ways. I think it's now correct (it was slightly too new). That trajectory_msgs stuff is a future release (which I'm now moving into a new arbotix_experimental package). Do another check out -- all should work this time. Although -- if you hang on about an hour, I'll have 0.3.2 out (with several patches to PML).

-Fergs

Pi Robot
10-26-2010, 07:49 PM
Hey Fergs,

No problem! I always knew we were riding the wave of the future on this forum but now I have positive proof. ;) I may wait for 0.3.2 but take your time--I just got my laser scanner in the mail. :veryhappy:


--patrick

Pi Robot
10-26-2010, 07:54 PM
P.S.

0.3.1 builds fine now--thanks!

lnxfergy
10-26-2010, 07:59 PM
P.S.

0.3.1 builds fine now--thanks!

Great -- I just uploaded 0.3.2 (which patches 2 bugs, one in the time_increment for the scan, and the other that is needed for actual min/max ranges for different sensors). I'd recommend anyone using a PML upgrade to 0.3.2.

-Fergs

Pi Robot
10-26-2010, 08:01 PM
Cool--is that a ROS upgrade only or firmware too?

--patrick

lnxfergy
10-26-2010, 08:05 PM
Cool--is that a ROS upgrade only or firmware too?

--patrick

No changes to firmware.

I should note that this does not implement most of the earlier 0.3.2 list (that will now be 0.3.3); instead, it implements several bug fixes and also creates a new arbotix_experimental package:

* [arbotix] fix time_increment bug in PML.
* [arbotix] fix min_range/max_range settings in PML.
* [arbotix] move action-based controllers to new simple_controllers package.
* [arbotix] move joint_traj_controller to experimental package.
* [arbotix_experimental] add joint_traj_controller.

The reason for arbotix_experimental is that it currently has numerous extra dependencies -- including trajectory_msgs which is in the pr2_controllers stack but is slated to be moved to a common stack for diamondback (at which point the joint_traj_controller stuff will be moved to the core arbotix packages).

-Fergs

Pi Robot
10-26-2010, 08:10 PM
Brilliant--just built and ran 0.3.2 and my PML is humming away nicely.

--p

lnxfergy
10-26-2010, 08:26 PM
Brilliant--just built and ran 0.3.2 and my PML is humming away nicely.

--p

I'd love to know how it works for you on the shorter IR ranger. I've only been testing on the 0.5-5.5m model. In particular, the new release should have a more appropriate min/max range (unless you had edited it already yourself in previous release).

The time_increment fix should also improve costmap_2d from what I've seen in the last few hours.

-Fergs

Pi Robot
10-26-2010, 09:12 PM
I'm going to shoot some more video tomorrow using the shorter ranger and I'll post back the results.

--patrick

RobotAtlas
10-27-2010, 09:47 AM
I'm thinking of trying 2 GPY0A710 sensors on one panning servo.
Each at 90 degrees rotation to each other, then servo can rotate for only 90 degrees, yet cover 180 degrees.
If that works, adding 2 more long range servos would increase number of points to 120.
For a total cost of only $39 * 4 + $43 = $199.

I just realized $39 was for 2 servos, not one.

So the total is $20 * 4 + $43 = $123 - much better.

Pi Robot
10-27-2010, 11:30 AM
I'm going to shoot some more video tomorrow using the shorter ranger and I'll post back the results.

--patrick

Hey Fergs,

I'm trying to do a new obstacle avoidance video with PML 0.3.2 and part way through the run I keep getting the following exception:


Exception in thread pml:
Traceback (most recent call last):
  File "/usr/lib/python2.6/threading.py", line 532, in __bootstrap_inner
    self.run()
  File "/home/patrick/Eclipse/ros/vanadium-0.3.2/arbotix/src/arbotix_sensors/pml.py", line 107, in run
    if self.device.read(253,self.PML_DIR,1)[0] == d:
TypeError: 'int' object is unsubscriptable

I'll wrap it in a try-except block for now but thought you'd probably want to see this.
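
For reference, this is roughly what I mean by wrapping it -- just a sketch around the read shown in the traceback (pml.py line 107), with my guess at a sensible fallback:

try:
    if self.device.read(253, self.PML_DIR, 1)[0] == d:
        # ... existing body of the loop, unchanged ...
        pass
except TypeError:
    # read() returned an int error code instead of a list of bytes,
    # so there's nothing to index -- log it and skip this cycle
    rospy.logwarn("PML: failed to read direction register")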

--patrick

lnxfergy
10-27-2010, 11:34 AM
Patrick, yes, try wrapping it in a try-except block (I'll have to do a similar thing here) -- but this looks like you're having trouble reading from the register table. Your firmware is up to date? Are you getting any Failed reads anywhere else?

-Fergs

Pi Robot
10-27-2010, 11:38 AM
Yes, firmware is up-to-date and no Failed reads as far as I have seen.

--patrick

lnxfergy
10-27-2010, 12:09 PM
Ok, a bit of further investigation. The first issue of course, is what individual components use for tf transformation of LaserScan messages:


costmap_2d -- uses the laserProjector transformLaserScanToPointCloud() method, as previously discussed.
gmapping and amcl -- do not use laserProjector methods, and as such cannot deal with the skew from a moving platform.

This also gives some motivation as to why gmapping performs so poorly -- and also leads to some possible experiments moving forward.

-Fergs

Pi Robot
10-27-2010, 12:31 PM
Fergs,

I'm having trouble getting the cost map to be cleared when the PML scans across a cell previously marked as occupied but is actually free. Have you found some set of parameters that allows cells to be more easily cleared? I'm reading up on mark_thresholds but I could spend all day trying different values.

--patrick

lnxfergy
10-27-2010, 12:39 PM
Fergs,

I'm having trouble with the cost map being cleared when the PML scans across a cell previously marked as occupied but is actually free. Have you found some set of parameters that allows cells to be more easily cleared? I'm reading up on mark_thresholds but I could spend all day trying different values.

--patrick

Not sure I understand which issue you're having: the cell is marked as occupied (because of a previous, poor sensor reading) and it's not getting cleared out? Or is the cell supposed to be occupied but it's not getting marked that way?

There's a fine balance as far as grid size. Too small and you'll never clear false obstacles. Too large and you'll never see small objects (because adjacent long-range readings wipe them out). I've been running around 0.05m for a costmap grid size.

-Fergs

Pi Robot
10-27-2010, 12:58 PM
Yeah, I didn't word that very well, but as it turns out, 0.05m was exactly what I needed. I've been using 0.1m all along and cutting it in half made all the difference.

Thanks!
patrick

lnxfergy
10-27-2010, 01:00 PM
Yeah, I didn't word that very well, but as it turns out, 0.05m was exactly what I needed. I've been using 0.1m all along and cutting it in half made all the difference.

Thanks!
patrick

If you're still using the smaller sensor, you may find you need to drop that even a bit more.

-Fergs

RobotAtlas
10-27-2010, 02:20 PM
If you're still using the smaller sensor, you may find you need to drop that even a bit more.

Fergs, can you elaborate on that more to help us understand why?

lnxfergy
10-27-2010, 04:26 PM
Fergs, can you elaborate on that more to help us understand why?

The resolution of the smaller sensors is going to be a lot better, and in theory, the robot/environment is a bit smaller -- so you'd like the better resolution. The reason finer grids don't work with the larger sensor has to do with how wide the "no sensing" beam becomes.

An example: If we have 30 readings over 180 degrees, that means it's 6 degrees between readings. Suppose our sensor returns two adjacent readings of 0.1m -- these two 0.1m sides trace out a triangle, and the third leg of that triangle is only 0.01m (basic trig). Say now that our readings are 2.5m -- that third leg of the triangle is now 0.25m! Now imagine we have a "bad" sensor reading that creates a false obstacle that has to be cleared by later readings -- if the grid is too fine, our chances of a later reading clearing it are too small -- and obstacles pop up all over the place. (The opposite situation, where the grid is too coarse, also causes problems -- in that "small" obstacles get cleared too easily.)
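
If you want to put numbers on that triangle (the chord between two adjacent rays), it's just:

import math

angle_between_readings = math.radians(6.0)   # 30 readings over 180 degrees

def chord(r):
    # straight-line gap between two adjacent hit points at range r
    return 2.0 * r * math.sin(angle_between_readings / 2.0)

print(chord(0.1))   # ~0.01 m at 0.1 m range
print(chord(2.5))   # ~0.26 m at 2.5 m range -- several costmap cells wide at 0.05 m resolution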

-Fergs

RobotAtlas
10-27-2010, 08:19 PM
So at 6 meters (for the long-range IR sensor), 6 degrees is about 0.62m, which is 12 times more than a costmap resolution of 0.05m. At that point the data is too sparse.

I don't understand why it takes so long to get a reading from those Sharp sensors and wonder if there are faster IR sensors out there.

I did notice that the Sharp sensor updates its data every 16.5ms (+-3.7ms):
http://forums.trossenrobotics.com/attachment.php?attachmentid=2181&stc=1&d=1288228049


Would ArbotiX be able to support faster reading of data?

lnxfergy
10-27-2010, 09:01 PM
If the 710 does support faster measurement, then yes, we could measure that fast (but we're only going to increase the density - 120 degrees/sec is already quite loud on the AX-12s). The reason the Sharps are so slow is that they basically have a small circuit that measures the reflectance and then outputs the analog voltage (it's actually a digitally latched output; it literally updates at X Hz when you watch it on a scope).

-Fergs

Peter_heim
10-27-2010, 09:25 PM
Hi Fergs

Is there any reason why you can't increase the number of sensors to, say, 10, spin the base, and read the speed with an encoder? Would that give you a high enough resolution to make mapping work?

Sure, but at 10 servos + 10 long-range IR, you're almost half way to the lower-cost Hokuyo! (which would also be a whole lot quieter)


Not 10 servos -- only 1 motor spinning the 10 IR sensors through 360 degrees (ignore the engineering problems).
That should only cost $250.

regards peter

RobotAtlas
10-27-2010, 10:22 PM
but we're only going to increase the density - 120 degrees/sec is already quite loud on the AX-12s

To help with this I was suggesting 2 sensors. Here's a drawing made in PPSP* of what I envision:
http://forums.trossenrobotics.com/attachment.php?attachmentid=2182&stc=1&d=1288235931


Edit: Since you only have one LaserScan message, it will be really easy to assemble it out of two sensors.

* PPSP = Pen, Paper, Scanner, Paint

Pi Robot
10-27-2010, 11:19 PM
I'm going to shoot some more video tomorrow using the shorter ranger and I'll post back the results.

--patrick

Well, it took the entire day and I finally had to go back to release 0.3.0 to get consistent results (details tomorrow). But here is a so-so video using the medium range IR sensor (0.2m-1.5m) as well as a 3d model (URDF) of Pi Robot in RViz mimicking the real world motion.

YouTube - PML Test 3

--patrick

Pi Robot
10-28-2010, 01:12 AM
I finally had to go back to release 0.3.0 to get consistent results (details tomorrow).

OK, I just did a fresh checkout of release 0.3.2 and upgraded the firmware accordingly. I also installed the RoboController version 0010 libraries. Running the navigation stack and watching the results in RViz, there seems to be a consistent delay of 2-3 seconds between the movement of the robot and the update of the IR readings. In other words, if I rotate the robot 90-degrees to the left at a rate of about 0.15 rad/s and there is an object now in range of the PML, I won't see an IR reading in RViz from the object for 2-3 seconds after the rotation. Could this have something to do with the timestamp adjustments on the LaserScan you were referring to earlier?

--patrick

Peter_heim
10-28-2010, 07:01 AM
Hi All
I'm still having some trouble moving my robot. Using the controllerGUI it works, but if I use the simple goal tutorial to move the robot forward 1 meter, the wheels start to pulse and they just continue until I stop the program. My PID values are 5 10 0 20; using the controllerGUI this runs smoothly. If I use 60 10 0 80, then using the controllerGUI I get surges, but with simple_navigation_goal the motors have a stronger pulse. I have adjusted the max_vel from .01 to 8 and the min_vel from .001 to 3 in base_local_planner_params.yaml; basically it changes, but no improvement. Using RViz and setting a goal is similar.
When I shut down the arbotix thread I get a warning that the control loop missed its desired rate of 5 Hz (the loop took .5418 seconds), and a similar one for the map update loop.

regards peter

Peter_heim
10-28-2010, 07:17 AM
Hi Fergs


1. You can turn on debug output by changing the first lines of ArbotiX_ROS. You need to uncomment the line with log_level=rospy.DEBUG, and comment out the other line with rospy.init_node:

Code:
class ArbotiX_ROS(ArbotiX):

    def __init__(self):
        rospy.init_node('arbotix', log_level=rospy.DEBUG)
        #rospy.init_node('arbotix')
You'll then get lots of encoder output in the debugging system. Easiest way to view it is to open rxconsole. When rotating wheels such that the bot would drive forward, encoder counts should increase.


Just a question on encoder resolution: my encoders are 2500 ticks per rev (US Digital E6). When I rotate the wheels 1 turn I get 5000 ticks -- is this normal? (The Serializer shows 2500.)

regards peter

RobotAtlas
10-28-2010, 07:33 AM
there seems to be a consistent delay of 2-3 seconds between the movement of the robot and the update of the IR readings. In other words, if I rotate the robot 90-degrees to the left at a rate of about 0.15 rad/s and there is an object now in range of the PML, I won't see an IR reading in RViz from the object for 2-3 seconds after the rotation.

I can probably explain 1.5 seconds of delay. Since we have one LaserScan message per scan in one direction (which takes 1.5 seconds, right?), ROS will not see this message until the scan is done.

I do see # 8 times oversampled in pml.py now. Fergs, can you explain how oversampling works?

lnxfergy
10-28-2010, 08:36 AM
Hi Fergs

Just a question on encoder resolution: my encoders are 2500 ticks per rev (US Digital E6). When I rotate the wheels 1 turn I get 5000 ticks -- is this normal? (The Serializer shows 2500.)

regards peter

Because we're using pin change interrupts, we get an interrupt on both the up and down side of the A line -- which makes our state machine more complex, but also means we capture the full resolution of the encoder -- so count will likely be 2x what a simple one-way interrupt counter would be.
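
A quick Python illustration of the 2x effect (purely illustrative -- the real counting happens in the firmware's pin change interrupt, not in Python):

# Simulate channel A of a 2500-cycle encoder over one wheel revolution.
cycles_per_rev = 2500
a_line = []
for _ in range(cycles_per_rev):
    a_line += [0, 1]   # one low period and one high period per cycle

rising_only = sum(1 for prev, cur in zip(a_line, a_line[1:]) if prev == 0 and cur == 1)
both_edges = sum(1 for prev, cur in zip(a_line, a_line[1:]) if prev != cur)

print(rising_only)   # 2500 -- what a simple rising-edge counter sees
print(both_edges)    # ~5000 -- what counting every change on the A line sees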

-Fergs

lnxfergy
10-28-2010, 08:41 AM
Hi All
I'm still having some trouble moving my robot. Using the controllerGUI it works, but if I use the simple goal tutorial to move the robot forward 1 meter, the wheels start to pulse and they just continue until I stop the program. My PID values are 5 10 0 20; using the controllerGUI this runs smoothly. If I use 60 10 0 80, then using the controllerGUI I get surges, but with simple_navigation_goal the motors have a stronger pulse. I have adjusted the max_vel from .01 to 8 and the min_vel from .001 to 3 in base_local_planner_params.yaml; basically it changes, but no improvement. Using RViz and setting a goal is similar.
When I shut down the arbotix thread I get a warning that the control loop missed its desired rate of 5 Hz (the loop took .5418 seconds), and a similar one for the map update loop.

regards peter

Um, the min_vel is set to 3 -- that's meters per second! I doubt your platform can do 3-8m/s. I'd revisit the min/max velocity settings and make them more realistic. As for shutting down the arbotix thread -- this means your PML is no longer publishing, which causes the costmap to not update, which eventually causes trouble for the trajectory planner -- so missed rates are expected.

-Fergs

lnxfergy
10-28-2010, 08:48 AM
OK, I just did a fresh checkout of release 0.3.2 and upgraded the firmware accordingly. I also installed the RoboController version 0010 libraries. Running the navigation stack and watching the results in RViz, there seems to be a consistent delay of 2-3 seconds between the movement of the robot and the update of the IR readings. In other words, if I rotate the robot 90-degrees to the left at a rate of about 0.15 rad/s and there is an object now in range of the PML, I won't see an IR reading in RViz from the object for 2-3 seconds after the rotation. Could this have something to do with the timestamp adjustments on the LaserScan you were referring to earlier?

--patrick

2-3s seems way too much. I would expect a bit of delay (as we need to complete a scan before we can publish it), but the new code should latch the scan immediately when it's done. What sort of computer are you on? Are we cycle bound and missing our rates?

I know you also mentioned errors being tossed earlier in trying to read the PML direction register -- is this still the case? Are we missing reads? I haven't tested too much over XBEE, because most of my bots carry onboard computers now.


I can probably explain 1.5 seconds of delay. Since we have one LaserScan message per scan in one direction (which takes 1.5 seconds, right?), ROS will not see this message until the scan is done.

I do see # 8 times oversampled in pml.py now. Fergs, can you explain how oversampling works?

Basically, here's the deal: the PML scans in the up direction and down direction (where up/down is defined by which way the servo # is going, from 0->1023 or vice versa). When the PC requests the first register of the PML data block, the arbotix code latches a variable stating which direction the last completed scan was in (up=0, down=1). So, we have the PC code check the direction frequently and wait for it to change; when the direction changes from our previous data, we then read the entire scan data. This data consists of: 1) the direction of the scan, 2) the time offset in milliseconds from the first measurement in the scan to the time the direction bit was latched, and 3) the 60 bytes of data.
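
In rough sketch form, the PC side looks something like this (simplified -- the block length and the publish helper here are illustrative, not the exact code in pml.py):

# rospy and self.device are already available inside the PML sensor class
last_direction = None
while not rospy.is_shutdown():
    # which direction was the last *completed* sweep? (latched by the firmware)
    direction = self.device.read(253, self.PML_DIR, 1)[0]
    if direction != last_direction:
        # a new sweep is ready: read direction + time offset + the 60 data bytes
        data = self.device.read(253, self.PML_DIR, 1 + 2 + 60)
        last_direction = direction
        self.publish_scan(data)   # hypothetical helper: build the LaserScan and publish it
    rospy.sleep(0.05)             # keep polling the direction register frequently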

In testing here, I saw that such a scheme dropped the delay from an average 2-3s down to almost exactly the 1.5s of the scan.

-Fergs

Pi Robot
10-28-2010, 09:25 AM
2-3s seems way too much. I would expect a bit of delay (as we need to complete a scan before we can publish it), but the new code should latch the scan immediately when it's done. What sort of computer are you on? Are we cycle bound and missing our rates?

I know you also mentioned errors being tossed earlier in trying to read the PML direction register -- is this still the case? Are we missing reads? I haven't tested too much over XBEE, because most of my bots carry onboard computers now.

-Fergs

Hey Fergs,

I started everything up again this morning and so far the weird delays have gone away. I'm starting to wonder if RViz gets a little messed up sometimes when displaying the scan data in between my test runs. I'll keep a closer eye on it and restart RViz when I suspect it might be confused.

--patrick

Pi Robot
10-28-2010, 09:47 AM
Looks like I spoke too soon. Here is a test run using 0.3.2 that is fairly typical. In this case, the problem seems to be false readings near the blue/red/yellow object. With 0.3.0, the robot has little trouble getting past this object.

Oh, and to answer your question about my computer: I'm running everything from a new Fujitsu Intel Core i5 tablet notebook which seems at least as snappy as my 3Ghz dual-core desktop.

YouTube - Testing PML release 0.3.2

Any thoughts?

--patrick

lnxfergy
10-28-2010, 09:58 AM
Looks like I spoke too soon. Here is a test run using 0.3.2 that is fairly typical. In this case, the problem seems to be false readings near the blue/red/yellow object. With 0.3.0, the robot has little trouble getting past this object.

Had you updated the range min/max when using 0.3.0 for your sensor? Also, what is the size of that visualized grid? What is the size of your costmap grid? I'm thinking we may need to fix the minimum range on that particular sensor model...


Oh, and to answer your question about my computer: I'm running everything from a new Fujitsu Intel Core i5 tablet notebook which seems at least as snappy as my 3GHz dual-core desktop.

Yeah, an i5 should be no problem -- my laptop has the same.

-Fergs

RobotAtlas
10-28-2010, 10:04 AM
2) the time offset in milliseconds from the first measurement in the scan to the time the direction bit was latched,

Thanks Fergs for explaining the code so well. I wonder if we should put that explanation inside pml.py for future/other people's use.

So offset in pml.py is scan_time + unknown_pull_delay_time (0 to 4 ms).
Then this is not exactly correct:
scan.scan_time = offset
scan.time_increment = offset/(self.step_count-1)

You know scan_time, right? Why not use it instead of the unknown/varied offset for those two?
4 ms is probably not a big deal though. I guess I'm just "spoiled" by working with RTOSes too much.

Still, since polling is not the best use of resources, it would be nice to have some type of sync between ArbotiX and pml (in that direction).
Is there a way for ArbotiX to notify pml about certain events, such as finishing of a scan?

Pi Robot
10-28-2010, 11:00 AM
Had you updated the range min/max when using 0.3.0 for your sensor? Also, what is the size of that visualized grid? What is the size of your costmap grid? I'm thinking we may need to fix the minimum range on that particular sensor model...
-Fergs

Yeah, that could be it. I played around with range_min in 0.3.0 but of course I forget what I ended up with. The visualized grid spacing in that last video is 0.25m. And my costmap parameters are as follows:


local_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: false
  rolling_window: true
  width: 3.0
  height: 3.0
  resolution: 0.05

global_costmap:
  global_frame: /map
  robot_base_frame: /base_link
  update_frequency: 5.0
  static_map: true
  rolling_window: false
  obstacle_range: 3.0
  raytrace_range: 3.0
  footprint: [[-0.16, 0.13], [0.16, 0.13], [0.16, -0.13], [-0.16, -0.13]]
  inflation_radius: 0.05
  transform_tolerance: 1.0
  observation_sources: base_scan
  base_scan: {sensor_frame: base_laser, data_type: LaserScan, topic: base_scan, marking: true, clearing: true}

TrajectoryPlannerROS:
  max_vel_x: 0.2
  min_vel_x: 0.05
  max_rotational_vel: 0.25
  min_in_place_rotational_vel: 0.1
  acc_lim_th: 5
  acc_lim_x: 3
  acc_lim_y: 3
  yaw_goal_tolerance: 0.2
  xy_goal_tolerance: 0.2
  controller_frequency: 5.0
  holonomic_robot: false
  path_distance_bias: 0.8
  goal_distance_bias: 0.3
  sim_time: 2.0

--patrick

lnxfergy
10-28-2010, 12:25 PM
Thank Fergs for explaining the code so well. I wounder if we should put that explanation inside of pml.py for future/other people use.

We're still at V0.3.2 -- so there's a number of revisions before 1.0.0. Things will get much more cleaned up before then.


So offset in pml.py is scan_time + unknown_pull_delay_time (0 to 4 ms).
Than this is not exactly correct:
scan.scan_time = offset
scan.time_increment = offset/(self.step_count-1)

You know scan_time, right? Why not use it instead of the unknown/varying offset for those two?
4 ms is probably not a big deal though. I guess I'm just "spoiled" by working with an RTOS too much.
It's the "seconds" that we're mostly concerned with here. Frankly, the difference in 4ms of travel is less than the absolute error of the sensor itself. However, given Pi's issues with very long delays, I may take a look into adjusting these to be the fixed values based on #of samples.


Still, since polling is not the best use of resources, it would be nice to have some type of sync between ArbotiX and pml (in that direction). Is there a way for the ArbotiX to notify pml about certain events, such as finishing a scan?

We can't change this without breaking the entire current driver structure. The communication protocol with the ArbotiX is effectively an extension of the Dynamixel protocol -- which has no asynchronous capabilities. The costs of breaking the protocol for the PML are just too high (we're using this same firmware to interact with PyPose, or any of a variety of other Dynamixel-protocol based programs).

-Fergs

Peter_heim
10-28-2010, 03:27 PM
Hi Fergs

Um, the min_vel is set to 3 -- that's meters per second! I doubt your platform can do 3-8m/s. I'd revisit the min/max velocity settings and make them more realistic.

I have tried from 0.002 for min and 0.02 for max as my lowest settings, then went up, with not much difference.

peter

lnxfergy
10-28-2010, 03:46 PM
Hi Fergs


I have tried from 0.002 for min and 0.02 for max as my lowest settings, then went up, with not much difference.

peter

Peter, can you post the following:
* Your complete YAML config file (be sure that ticks_meter is correct)
* What is the advertised RPM of your motors?
* What is the wheel diameter?

-Fergs

Peter_heim
10-29-2010, 02:21 AM
Hi fergs
here is my YAML file
port: /dev/ttyUSB0
rate: 15
#use_sync: False

sensors: {
"pml": {type: pml, servo_id: 1}
}
controllers: {
"b_con": {type: base_controller,base_width: 0.425,ticks_meter: 10063, kp: 2, kd: 1, ki: 0, ko: 6}

}
#dynamixels: {
# head_pan_joint: {servo_id: 1}
#}
my wheel dia is 15cm
motor RPM is 160
the encoders are mounted on the wheels and the motor has a 2 to 1 reduction to the wheels

regards Peter

Peter_heim
10-29-2010, 04:40 AM
Hi All


I started everything up again this morning and so far the weird delays have gone away. I'm starting to wonder if RViz gets a little messed up some times when displaying the scan data inbetween my test runs. I'll keep a closer eye on it and restart RViz when I suspect it might be confused.

--patrick

Patrick, you may be on to something. I had all kinds of problems with moving my robot yesterday; when I started today with yesterday's settings (that didn't work), the robot moved smoothly and with good speed -- it just didn't stop. I got a warning: topic = base_scan, target /odom /base_base_scan dropped 100% of messages
In RViz I can't have both the laser and odom working at the same time -- would this be a transform problem?
The other curious thing: when I stopped the arbotix thread the motors kept driving??
I'm using ver 0.3.2

regards peter

lnxfergy
10-29-2010, 10:01 AM
my wheel dia is 15cm
motor RPM is 160
the encoders are mounted on the wheels and the motor has a 2 to 1 reduction to the wheels

So then, wheel RPM is 80, right? At a circumference of Pi*15cm=~47cm, and 80RPM (80/60=1.33rev/s) we get max speed of about 47cm*1.33rev/s= 60cm/s = 0.6m/s. So, I'd expect that you'd want your navigation stack parameters to be something like 0.4m/s for the top speed (as you'll lose some speed under load). You can then experiment to see what the minimum speed is that the platform actually moves at (probably somewhere in the 0.05-0.1m/s range). But first, we have to work out your PID parameters:



here is my YAML file
port: /dev/ttyUSB0
rate: 15
#use_sync: False

sensors: {
"pml": {type: pml, servo_id: 1}
}
controllers: {
"b_con": {type: base_controller,base_width: 0.425,ticks_meter: 10063, kp: 2, kd: 1, ki: 0, ko: 6}

}

At 10063 counts per meter, at 0.4m/s, we expect to get 10063*0.4=4025 counts per second at max speed. Our PID loop currently runs at 30hz -- so we get 4025/30=134 counts per frame. Our motor output is +/- 255, so we want to pick PID parameters that scale the typical error in 134 counts/frame into the -255 to 255 range. For instance, say you actually have 124 counts in a frame, you'd want the robot to speed up a bit. 10 counts * 2 (kp) / 6 (ko) = +3 change in motor PWM value. That may be a bit high (and cause a lot of oscillation). The fact that your Kd is half your Kp could also lead to interesting issues.

On my Armadillo robot, I get 19414 ticks/meter, and max speed is about 0.9m/s. I tend to run it around 0.4m/s though. So, we see that 19414*0.4/30=258ticks/frame in average case. Here assume 10 ticks off (which is a much smaller error, 10/258 ~ 4% off), 10*5(kp)/50(ko) = +1 change in motor PWM value.

Since 10 counts is 10/124 ~ 8% error in your case (but near the top end of your speed/pwm values), I would expect to see an output of +1 or +2. You might try going to Kp=4, Kd=1, Ko=24. (you could also try doing Kd = 0, although I find that a bit of Kd is needed, especially in homes/offices that have door thresholds, it gives the bot a bit of extra kick to get over obstacles).
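
To spell out the arithmetic above, here is a back-of-the-envelope check in plain Python (just the numbers from this thread, nothing ArbotiX-specific):

from math import pi

wheel_dia = 0.15            # m
wheel_rpm = 160 / 2.0       # motor RPM through the 2:1 reduction -> 80 RPM at the wheel
max_speed = pi * wheel_dia * wheel_rpm / 60.0   # ~0.63 m/s unloaded top speed

ticks_meter = 10063
target_speed = 0.4          # m/s, a realistic navigation top speed under load
pid_rate = 30               # Hz, rate of the PID loop on the ArbotiX
ticks_per_frame = ticks_meter * target_speed / pid_rate   # ~134

error = 10                  # ticks short in one frame
print error * 2 / 6.0       # Kp=2, Ko=6  -> ~ +3 PWM correction (a bit aggressive)
print error * 4 / 24.0      # Kp=4, Ko=24 -> ~ +1.7 PWM correction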

-Fergs

lnxfergy
10-29-2010, 10:09 AM
Hi All

Patrick, you may be on to something. I had all kinds of problems with moving my robot yesterday; when I started today with yesterday's settings (that didn't work), the robot moved smoothly and with good speed -- it just didn't stop. I got a warning: topic = base_scan, target /odom /base_base_scan dropped 100% of messages
In RViz I can't have both the laser and odom working at the same time -- would this be a transform problem?

/base_base_scan? Did you copy that wrong? If not, you probably have a TF frame issue. It looks like RViz is trying to transform from odom to base_base_scan (which probably does not exist). If you do a "rosrun tf view_frames" you'll get a PDF output that shows your TF diagram.


The other curious thing: when I stopped the arbotix thread the motors kept driving??
I'm using ver 0.3.2

This is a known *issue* -- there's currently no watchdog on the serial communications. If serial drops out, the motors continue doing whatever they were last told to (so an ArbotiX node crash would be bad...). I've got it on my list of things to fix in the firmware (I had an arbotix node crash the other day, and noticed this glaring omission). I've just added it to our issues list to keep track of it: http://code.google.com/p/vanadium-ros-pkg/issues/list

-Fergs

Peter_heim
10-29-2010, 03:51 PM
Hi Fergs
That would be the best explanation for setting PID parameters -- don't lose it!

peter

RobotAtlas
10-29-2010, 08:14 PM
The problem here is actually with scan matching. If you were to watch the gmapping output, you'd notice that the scan matcher fails about 50-60% of the time, and has to resort on odometry.

Fergs, can you give me some pointers on what do I need to have to see gmapping output?
Also how do I know it fails. I'm asking about what can I read/do/try/etc to get me there?

As far as my background is concerned:
I've read and done all beginner ROS, tf and rviz tutorials, half of navigation tutorials (I'm planning to finish them this weekend).
I've read half of slam_gmapping docs.
I created a map from a bag file.
I can run nxt_assisted_teleop package (uses navigation).
I don't have AirbotiX (still!).

lnxfergy
10-29-2010, 08:42 PM
Fergs, can you give me some pointers on what do I need to have to see gmapping output?
Also how do I know it fails. I'm asking about what can I read/do/try/etc to get me there?

As far as my background is concerned:
I've read and done all beginner ROS, tf and rviz tutorials, half of navigation tutorials (I'm planning to finish them this weekend).
I've read half of slam_gmapping docs.
I created a map from a bag file.
I can run nxt_assisted_teleop package (uses navigation).
I don't have ArbotiX (still!).

You'd need to create (or find) a bagfile with a LaserScan topic and good TF for laser->base->odom. One you have this, the gmapping docs have a tutorial on how to run it.

-Fergs

Pi Robot
10-31-2010, 11:04 PM
Good news on the PML front. I discovered that my false IR readings were being caused by having the sensor just a little too low off the carpet. Flipping it end-over-end (I now have it mounted "vertically") was enough to get rid of the false readings. Here is the latest video of the results. Best viewed full screen as I made a fairly hi-res screen capture.

YouTube - Obstacle Avoidance using ROS + PML

--patrick

lnxfergy
10-31-2010, 11:56 PM
Looking good Pi!

On a somewhat different note, I wanted to test odometry on the Armadillo over wide areas. First I did a test run through the house, and created this map with gmapping (using my Hokuyo laser, not the PML):

http://forums.trossenrobotics.com/gallery/files/1/7/6/8/maphome.png

I then went for a larger run through the CS dept, which created the following map. On the left we have the map from gmapping (with some obvious issues), and on the right we have a view of the raw costmap that was generated, giving an idea of where the odometry issues were (mostly in turns):

http://forums.trossenrobotics.com/gallery/files/1/7/6/8/screenshot-11.png
I'm having a few issues with gmapping and scan matching on the shorter-range Hokuyo; it took quite a bit of parameter hacking to get that map as good as it is. You can see where gmapping falsely aligned the scans and "shortened" the left hallway of the map.

All in all, extremely good odometry on a $200 robot -- that's almost 100m of travel in the second map. I'm now looking into adding a gyro to the robot and using the ekf filter package to see if that helps angular odometry (which would probably help with loop closure). I also plan to take a complete run of the house, and a longer run in the CS department (with a bit more distance into the loop a second time to force closure).

-Fergs

RobotAtlas
11-01-2010, 09:56 AM
Here is the latest video of the results.

Patrick, I tried counting how many orange dots you have at one time and the average is about 10.
Why is it so low? Is it because your IR sensor is short range? 1.5 meters is about 5 ft. In your environment I was expecting more orange dots.

lnxfergy
11-01-2010, 10:02 AM
Patrick, I tried counting how many orange dots you have at one time and the average is about 10.
Why is it so low? Is it because your IR sensor is short range? 1.5 meters is about 5 ft. In your environment I was expecting more orange dots.

Yes -- if the sensor returns "all clear" within its range, you don't get a dot for that reading.

-Fergs

Pi Robot
11-01-2010, 11:00 AM
Hey Fergs,

Very cool mapping results! This week I finally get to start playing with my laser scanner so your timing is perfect. And to RobotNV, yes Fergs gave the answer I would have.

In the meantime, I finally had a chance to write up some of the PML + ROS stuff. You can find it here:

http://www.pirobot.org/blog/0014/

--patrick

Peter_heim
11-02-2010, 05:39 AM
Hi All
Nice write up Patrick

I'm working with the PML in rviz. I worked through the tutorial; the PML shows dots, but the obstacles and the inflated obstacles are not showing. The topics in rviz don't show an error and they are receiving data, but a rostopic echo shows no cells data (cell size is 0.5).
I'm getting this error message
[ WARN] [1288689452.082873337]: MessageNotifier [topic=base_scan, target=/map /base_scan ]: Dropped 100.00% of messages so far. Please turn the [ros.costmap_2d.message_notifier] rosconsole logger to DEBUG for more information
but there is no more information
I have read Patrick's previous posts and tried his cost maps and no luck
can anyone point me in the right direction?

regards Peter

lnxfergy
11-02-2010, 02:03 PM
Hi All
Nice write up Patrick

I'm working with the PML in rviz. I worked through the tutorial; the PML shows dots, but the obstacles and the inflated obstacles are not showing. The topics in rviz don't show an error and they are receiving data, but a rostopic echo shows no cells data (cell size is 0.5).
I'm getting this error message
[ WARN] [1288689452.082873337]: MessageNotifier [topic=base_scan, target=/map /base_scan ]: Dropped 100.00% of messages so far. Please turn the [ros.costmap_2d.message_notifier] rosconsole logger to DEBUG for more information
but there is no more information
I have read Patrick's previous posts and tried his cost maps and no luck
can anyone point me in the right direction?

regards Peter

Your costmap parameters appear to be wrong -- in particular, the map is defined in a /map frame, where you probably have only defined /odom or /base_link frames, and thus there is no transform from /base_laser -> /map (and the messages can't be overlaid onto the map).

-Fergs

Pi Robot
11-02-2010, 04:22 PM
Hey Peter,

Sounds like you are close! Just to elaborate on Fergs' reply, I ran into similar trouble until I made sure I had a transform for all the links in the tree. In particular, you need one between /base_link and /base_laser. My /base_link -> /base_laser transform is taken care of in my URDF file but you can also do it as a static transform in your launch file. Here's how it might look:


<node pkg="tf" type="static_transform_publisher" name="base_laser_broadcaster" args="0 0.18 0.10 0 0 0 /base_link /base_laser 100" />In this case, I have assumed the laser (PML) is 18 cm forward of the center of the base and 10 cm above it so adjust accordingly.

Also, in the launch file I use to bring up the (fake) mapping stuff, I include two more static transforms, one between /odom and /map and the other between /map and /world. Here is my move_base.launch file:


<launch>
<master auto="start"/>

<!-- Run the map server -->
<node name="map_server" pkg="map_server" type="map_server" args="$(find pi_robot)/maps/blank_map.yaml"/>

<!-- Run Fake Localization -->
<node pkg="fake_localization" type="fake_localization" name="fake_localization" output="screen" />

<!-- Create a static transform between the /odom frame and /map -->
<node pkg="tf" type="static_transform_publisher" name="odom_map_broadcaster" args="0 0 0 0 0 0 /odom /map 100" />
<node pkg="tf" type="static_transform_publisher" name="world_map_broadcaster" args="0 0 0 0 0 0 /map /world 100" />

<node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
<param name="controller_frequency" value="5.0" />
<rosparam file="$(find pi_robot)/params/costmap_common_params.yaml" command="load" ns="global_costmap" />
<rosparam file="$(find pi_robot)/params/costmap_common_params.yaml" command="load" ns="local_costmap" />
<rosparam file="$(find pi_robot)/params/local_costmap_params.yaml" command="load" />
<rosparam file="$(find pi_robot)/params/global_costmap_params.yaml" command="load" />
<rosparam file="$(find pi_robot)/params/base_local_planner_params.yaml" command="load" />
</node>
</launch>


So in summary, put the /base_link -> /base_laser transform in your robot launch file (or URDF file) and the other two in your map launch file. (Of course, you might have everything in one launch file.)

Hopefully Fergs will correct me if I missed something.

--patrick

Peter_heim
11-03-2010, 08:35 AM
Thanks Patrick and Fergs
I'm getting closer -- I can see obstacles and inflated obstacles. It doesn't look quite right yet: the PML scan seems to go about 240 degrees but the servo only goes 180 deg, and the distance doesn't seem right. But at least it's working. :happy:

regards Peter

Pi Robot
11-06-2010, 09:55 PM
OK, on to the next step: SLAM with the Hokuyo laser scanner. I've got the scanner up and working and displaying in RViz. I can also collect a bag file of scan/odom data by teleoping the robot around, and then create the map without trouble. The slightly odd thing is that there seems to be data missing in the map that I thought would have been marked as clear since I passed over it twice at very slow speed. I've attached a sample image with an example of what I am talking about circled in green. That area is completely free of obstacles but it was not marked as clear. Has anyone else run into this?

--patrick

http://forums.trossenrobotics.com/attachment.php?attachmentid=2188&stc=1&d=1289098368

lnxfergy
11-06-2010, 10:29 PM
Patrick,

In gmapping, lower the linear_update and angular_update values. This will cause more scans to be used in building the map.

-Fergs

Pi Robot
11-06-2010, 11:02 PM
Patrick,

In gmapping, lower the linear_update and angular_update values. This will cause more scans to be used in building the map.

-Fergs

OK, thanks Fergs. I'll give that a try tomorrow.

--patrick

Pi Robot
11-07-2010, 08:36 AM
Awesome! Once again Fergs has saved the day. Here is the map from gmapping using the *same* bag file as before but setting the following gmapping parameters:

<param name="linearUpdate" value="0.1" />
<param name="angularUpdate" value="0.05" />
<param name="xmin" value="-20" />
<param name="ymin" value="-20" />
<param name="xmax" value="20" />
<param name="ymax" value="20" />
<param name="maxUrange" value="6" />

The map represents a living room/dining room with piles of furniture and random stuff on the floor.

http://forums.trossenrobotics.com/attachment.php?attachmentid=2189&stc=1&d=1289140371

BTW, I noticed on the GMapping page of the OpenSLAM website the following statement:


Input Data
The approach takes raw laser range data and odometry. This version is optimized for long-range laser scanners like SICK LMS or PLS scanner. Short range lasers like Hokuyo scanner will not work that well with the standard parameter settings.

Fergs, besides the parameter adjustments made above, do you know of others that should be tweaked for use with the Hokuyo?

--patrick

lnxfergy
11-07-2010, 11:28 AM
Fergs, besides the parameter adjustments made above, do you know of others that should be tweaked for use with the Hokuyo?

There really aren't any other adjustments that add to the map quality. The comments about "shorter range lasers" aren't really about the default parameters -- but rather the overall design of gmapping. The shorter range Hokuyo should be fine in smaller environments (like homes) and environments with clutter (again homes). Where it starts to fail is in wide open spaces, or especially long, empty and uniform corridors.

The problem here is that the gmapping algorithm puts a heavy emphasis on the laser quality, and thus scan matching -- odometry is only used to seed the laser scan matcher, or as a fallback if scan matching fails. So, in long hallways, the longer range lasers can actually see the end of the hallway -- whereas the shorter range sensor cannot. The end of the hallway is very important for the scan matcher to be correct, and when it is missing, the scan matcher actually "shortens the hallway" by believing that the robot has not moved. I suppose that the parameters for the scan matcher could be altered to search within a much smaller window, but I couldn't seem to find any sweet spot that avoided the long hallway issues without completely killing the scan matcher.

-Fergs

Pi Robot
11-07-2010, 11:30 AM
Interesting, and thanks for the explanation!

--patrick

Pi Robot
11-07-2010, 06:19 PM
Hey Fergs,

I want to try using ROS to do some simple head tracking of a colored blob. The best algorithm I have used in the past requires that I adjust the speeds on pan and tilt dynamixels; however, I don't see a setSpeed service in the arbotix-node.py ROS node. I can add one myself, but I was wondering if there was a reason you don't have it there yet since I see you *do* have a setSpeed function in the arbotix.py driver code.

--patrick

Pi Robot
11-07-2010, 07:36 PM
Just following up on my own post--I see that joint_controller.py file is only sending position info to the servos. So I tried adding the line:

self.device.servos[joint].setSpeed( msg.velocity[msg.name.index(joint)] )

just before the line:

self.device.servos[joint].setAngle( msg.position[msg.name.index(joint)] )

to include speed as well. This seems to work if I test it in controllerGUI.py by adding a couple of lines like this:

speed = 1.0
j.velocity = [speed, speed, speed]

just below the line:

j.position = [-self.pan.GetValue()/100.0, self.tilt.GetValue()/100.0, self.tilt2.GetValue()/100.0]

and then play with the value set for speed on different runs. However, adding the speed info produces a big delay in the response of the servos to the slider controls. If I comment the lines out, the response is good again.

My yaml file is:


port: /dev/ttyUSB0
rate: 10
baud: 57600
use_sync: False

dynamixels: {
head_pan_joint: {id: 1, max_speed: 200},
head_tilt_joint: {id: 2, max_speed: 200}
# left_shoulder_up_joint: {id: 3, max_speed: 200},
# left_shoulder_out_joint: {id: 4, max_speed: 200},
# left_elbow_roll_joint: {id: 5, max_speed: 200},
# left_wrist_bend_joint: {id: 6, max_speed: 200},
# right_shoulder_up_joint: {id: 7, max_speed: 200},
# right_shoulder_out_joint: {id: 8, max_speed: 200},
# right_elbow_roll_joint: {id: 9, max_speed: 200},
# right_wrist_bend_joint: {id: 10, max_speed: 200},
# torso_joint: {id: 11, max_speed: 200}
}

controllers: {
#joint_controller: {type: joint_controller, joints: [head_pan_joint, head_tilt_joint, left_shoulder_up_joint, left_shoulder_forward_joint, left_elbow_joint, left_wrist_joint, right_shoulder_up_joint, right_shoulder_forward_joint, right_elbow_joint, right_wrist_joint, torso_joint] }
joint_controller: {type: joint_controller, joints: [head_pan_joint, head_tilt_joint] }
}


Any thoughts?

--patrick

lnxfergy
11-07-2010, 08:13 PM
So, one of the major problems with setting the speed is that it significantly reduces the available torque -- the way the AX-12 reduces speed is by limiting the max voltage output to the servo motor (lower voltage == lower torque). While this works OK for heads, it really falls short for things like arms and legs. Instead, the microcontroller/PC should do the interpolation from point to point itself -- this is why the ArbotiX NUKE system outputs positions at 30-50hz, giving smooth interpolation without sacrificing torque.
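
As a rough sketch of what that interpolation looks like (illustrative code only, not the NUKE or arbotix driver source -- the output call is a placeholder), a controller loop steps each servo a small amount toward its goal on every cycle:

import time

def interpolate_to(goal, current, rate=30.0, max_step=0.05):
    # Step the joint toward its goal a little at a time, at a fixed rate,
    # so the servo always moves at full voltage/torque between updates.
    dt = 1.0 / rate
    while abs(goal - current) > 1e-3:
        step = max(-max_step, min(max_step, goal - current))
        current += step
        # set_servo_position(servo_id, current)  # placeholder for the real write
        time.sleep(dt)
    return current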

So, the joint_controller is really just a very basic controller -- in fact, it will probably be phased out down the road (as using a JointStates message to *set* positions of servos is really poor form). I have plans to instead use the new joint_traj_controller -- which uses the more ROS-standard JointTrajectory message, which actually allows setting positions and velocities (and also accelerations, although not commonly used). This controller would then run a high-rate loop that interpolates the servo positions over time. The problem right now is that JointTrajectory is part of trajectory_msgs package -- which is currently in a PR2-related stack, but will be moved to common_msgs for ROS diamondback.

The joint_traj_controller is not yet 100% functional -- but should be later this week (along with a short video showing it in action).

You'll also notice another package -- simple_controllers. This will include several action-based controllers for doing things like moving arms and pointing heads. The point_head action will be nearly identical to the PR2 version, but without a dependency on a PR2-centric stack. The joint_trajectory_action will also be similar.

Why would I bother with all this advanced joint_traj_action and joint_traj_controller stuff? Because it will allow us to build ArbotiX/Dynamixel based arms which can be interfaced with the move_arm and arm_navigation packages! There's still quite a bit of work to be done here -- but I'm hoping to have a fully functional example by the 20th of November (I really want this demo working when I travel up to CNRG).

-Fergs

Pi Robot
11-07-2010, 08:41 PM
Ah, great stuff! Since controlling Pi Robot's arms is one of my biggest reasons to use ROS, I'm looking forward to the trajectory/action stuff. In the meantime, can you help me understand why adding the speed line to joint_controller.py introduces such a big delay in servo motion when using the slider control in controllerGUI.py? Does it have to do with not using a sync write? I know I was able to get smooth head tracking this way using the Forest Moon library and the USB2Dynamixel controller but the FM library has a sync write function (see line 470 in http://code.google.com/p/pydynamixel/source/browse/tags/0.1.0/src/dynamixel_network.py).

--patrick

lnxfergy
11-07-2010, 08:58 PM
Ah, great stuff! Since controlling Pi Robot's arms is one of my biggest reasons to use ROS, I'm looking forward to the trajectory/action stuff. In the meantime, can you help me understand why adding the speed line to joint_controller.py introduces such a big delay in servo motion when using the slider control in controllerGUI.py? Does it have to do with not using a sync write? I know I was able to get smooth head tracking this way using the Forest Moon library and the USB2Dynamixel controller but the FM library has a sync write function (see line 470 in http://code.google.com/p/pydynamixel/source/browse/tags/0.1.0/src/dynamixel_network.py).

--patrick

Are you setting the speed too low? It seems that you're setting it to 1.0 (out of 1024) -- that'd be almost no torque to move the servo?

-Fergs

Pi Robot
11-07-2010, 09:13 PM
Are you setting the speed too low? It seems that you're setting it to 1.0 (out of 1024) -- that'd be almost no torque to move the servo?
-Fergs

Well, you're right--that *could* have been it as I was mixing up speed in rad/s and ticks. So I cleaned that up, and I just tried it with a speed of 500 ticks (out of 1024) and the movement is quick but there is still a delay--sometimes as long as 5 or 6 seconds.

--patrick

Pi Robot
11-08-2010, 08:11 AM
So I cleaned that up, and I just tried it with a speed of 500 ticks (out of 1024) and the movement is quick but there is still a delay--sometimes as long as 5 or 6 seconds.

Good news on this front--seems I'm only getting the weird delay when operating the servos through controllerGUI. I just ran a test node that simply pans and tilts the head through random angles and speeds and everything seems to work A-OK.

--patrick

lnxfergy
11-08-2010, 08:42 AM
Good news on this front--seems I'm only getting the weird delay when operating the servos through controllerGUI. I just ran a test node that simply pans and tilts the head through random angles and speeds and everything seems to work A-OK.

--patrick

What is the publication rate of your test node? I'm wondering if you're saturating the XBEE when the controllerGUI is running....

-Fergs

Pi Robot
11-08-2010, 08:47 AM
Fergs,

I have a basic ROS question for you that I should know the answer to but I am stumped. I wrote a little test node to move the pan and tilt head servos through random angles and at random speeds using your joint_controller. Everything works OK except the following: If I replace the time.sleep(2.0) command in the while... loop below with rospy.sleep(2.0), the loop only executes a single time. If I go back to time.sleep(2.0), it executes indefinitely as intended (until I hit Ctrl-C). What am I missing here?



#!/usr/bin/env python
import roslib; roslib.load_manifest('pi_robot')
import rospy
import time
import random
from sensor_msgs.msg import JointState
from math import radians

rospy.init_node("head_track")
cmd_joints = rospy.Publisher("/cmd_joints", JointState)

head_cmd = JointState()
head_cmd.name = ["head_pan_joint", "head_tilt_joint"]
head_cmd.position = [0, 0]
head_cmd.velocity = [1, 1]
time.sleep(1)

def move_head():
    while not rospy.is_shutdown():
        rospy.loginfo("HELLO 1!")
        head_cmd.velocity = [random.randrange(50, 200), random.randrange(50, 200)]
        head_cmd.position = [radians(random.randrange(-20, 20)), radians(random.randrange(-20, 20))]
        cmd_joints.publish(head_cmd)
        rospy.loginfo("HELLO 2!")
        time.sleep(2.0)
        #rospy.sleep(2.0)

if __name__ == '__main__':
    try:
        move_head()
    except rospy.ROSInterruptException:
        rospy.loginfo("Shutting down head track node...")
        head_cmd.velocity = [1, 1]
        cmd_joints.publish(head_cmd)

Pi Robot
11-08-2010, 09:12 AM
What is the publication rate of your test node? I'm wondering if you're saturating the XBEE when the controllerGUI is running....

-Fergs

Hey Fergs,

I think you nailed it. I wasn't running a separate test node in that case but I noticed that the controllerGUI timer is running at 50 milliseconds:

self.timer.Start(50)

So I changed it to 500 milliseconds and now it is much better behaved when including speed info.

--patrick

Pi Robot
11-08-2010, 10:30 PM
Fergs,
I have a basic ROS question for you that I should know the answer to but I am stumped. I wrote a little test node to move the pan and tilt head servos through random angles and at random speeds using your joint_controller. Everything works OK except the following: If I replace the time.sleep(2.0) command in the while... loop below with rospy.sleep(2.0), the loop only executes a single time. If I go back to time.sleep(2.0), it executes indefinitely as intended (until I hit Ctrl-C). What am I missing here?


OK, forget this one--seems roscore was in some funky state. Once I restarted it, this weirdness went away.

--patrick

lnxfergy
11-08-2010, 11:45 PM
OK, forget this one--seems roscore was in some funky state. Once I restarted it, this weirdness went away.

--patrick

Might you have had use_sim_time set to True? (With use_sim_time set, rospy.sleep() follows the ROS clock, which doesn't advance unless something is publishing /clock -- so the sleep would block.) The core shouldn't affect time as far as I know -- other than possibly holding such a parameter as use_sim_time...

-Fergs

Pi Robot
11-09-2010, 08:06 AM
Might you have had use_sim_time set to True? The core shouldn't affect time as far as I know -- other than possibly holding such a parameter as use_sim_time...
-Fergs

That could easily have been it--I was playing with gmapping the day before and there is a good chance I left use_sim_time set to True...

--patrick

Pi Robot
11-09-2010, 08:30 AM
Hey Fergs,

I have my head tracking node working (but without the visual input yet) and I can't remember what the status is on getting sync writes to work. I see a note in your joint_controller.py file that says "TODO: sync this!". Does this mean sync writes are still in the queue?

Thanks!
patrick

Pi Robot
11-12-2010, 09:24 AM
I have my head tracking node working (but without the visual input yet) and I can't remember what the status is on getting sync writes to work. I see a note in your joint_controller.py file that says "TODO: sync this!". Does this mean sync writes are still in the queue?


Fergs,

Not sure if these mods will be useful to you, but I needed getVelocity and setVelocity functions similar to your getAngle and setAngle functions in arbotix-node.py. I've attached the modified arbotix-node.py and arbotix.py files. (I began with your 0.3.2 versions.) I was getting some sporadic readings from one of my servos from both getAngle and getVelocity, so I have a check in both of those to discard a value that does not make sense.
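
The check is nothing fancy -- roughly along these lines (this is just a sketch of the idea with made-up names, not the exact code in the attachment):

def sane_reading(raw, last, max_jump=300):
    # Discard obviously bogus servo readings (e.g. a stray 1023) or values
    # that jump implausibly far from the previous good reading.
    if raw < 0 or raw > 1023:
        return last
    if last is not None and abs(raw - last) > max_jump:
        return last
    return raw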

--patrick

2196

lnxfergy
11-12-2010, 09:31 AM
Fergs,

Not sure if these mods will be useful to you, but I needed getVelocity and setVelocity functions similar to your getAngle and setAngle functions in arbotix-node.py. I've attached the modified arbotix-node.py and arbotix.py files. (I began with your 0.3.2 versions). I was getting some sporadic readings from one of my servos from both getAngle and getVelocity so I have a check in both of those to discard a value that does not make sense.

--patrick

2196

Patrick,

I'm planning to integrate the velocity setting through cmd_joints (as you already have) -- and additionally get the joint_states publication to carry velocity info (which was really an oversight, but requires some gentle interaction with the controllers).

I'm trying to move away from having too many services, and instead having more featured topics. To this extent, the "getAngle/setAngle" services are turned off by default -- they may also disappear before we go V1.0.0, because using either a joint_controller or joint_traj_controller would provide better access (especially once sync_write is used). I'll of course put an API review notice before killing any existing services.

-Fergs

Pi Robot
11-12-2010, 09:39 AM
Sounds good! Also, I just noticed I had some left over debugging stuff in that last attachment. So please discard if you haven't already and here is the correct version just for reference. (Note that I have also included an ExtraJointState.msg that I was using as a hack to get isMoving info from the servos rather than using a service.)

--patrick

2197

Pi Robot
11-12-2010, 10:46 AM
BTW, I discovered a neat debugging feature of RViz. One of my servos was returning a random reading of 1023 for its current position. It was occurring perhaps 1 time in 500. I'm not sure I ever would have detected this except that in RViz, my virtual robot's right arm would periodically fling all the way over to one side and then back to where it started. Of course, RViz was faithfully visualizing the false reading!

--p

Pi Robot
11-12-2010, 06:02 PM
Fergs--have you tried your URG scanner outdoors? I just did a quick experiment here in late afternoon on an outside asphalt walkway and the scanner seemed to work basically OK. Just curious if you have done some actual mapping outside.

--patrick

Pi Robot
11-12-2010, 06:22 PM
OK, I've gone through the basic gmapping exercise using a recorded bag file. Next I'd like to do the "simultaneous" part of SLAM. Fergs (or anyone else working on this), can you tell me how I get the map server and gmapping and amcl to all play together? I know how to launch each of these individually in my launch file, but I'm guessing there is some magic incantation required to get them to all talk to each other. Just to be clear--I'm hoping to have the map built up on the fly as the robot begins moving about, not by first recording a bunch of data in a bag file.

--patrick

Pi Robot
11-12-2010, 06:35 PM
Ooops. I think I found the answer to my question on the ROS-users list. Sorry, should've gone there first. So it looks like I simply do not run the amcl node--just add gmapping to the launch file and subscribe to the map in RViz and voila, there it is.

--patrick

lnxfergy
11-12-2010, 09:38 PM
Patrick, I haven't run the URG outside yet. I had wanted to run some experiments, but I never had a decent day outside when I had the Armadillo with me.

As for gmapping, yeah, just don't run AMCL -- gmapping publishes /dynamic_map->/odom transforms. Honestly though, I would recommend still bagging up tf and laser scans so that you can re-run gmapping again later if the parameter set fails to work well (unless you're attempting to do continuous mapping...).

-Fergs

Pi Robot
11-13-2010, 08:59 AM
Yeah, I see the value of doing the mapping from recorded bag data--I was just curious to see how it did "in real time". Pretty cool to watch in RViz. BTW, is it possible to start with a map from earlier data and have it updated with new data? One thread I was reading on ros-users seems to suggest that gmapping always starts over from scratch.

--patrick

Pi Robot
11-14-2010, 08:06 PM
Just did my first short run through the house using the Hokuyo URG laser scanner. The attached image shows the result generated in real time by gmapping. The robot never once got hung up on an obstacle or door edge. Truly inspiring.

http://forums.trossenrobotics.com/attachment.php?attachmentid=2200&stc=1&d=1289786763

BTW, the beams flying outward on the right of the picture are penetrating a sliding glass door on that side of the room.

--patrick

sthmck
11-14-2010, 08:47 PM
how long did it take to generate this map?

Pi Robot
11-14-2010, 08:53 PM
how long did it take to generate this map?

The map was generated as fast as I drove the robot through the rooms. I started the robot in the dining room, then clicked in RViz into the middle of the grid where the living room is and the robot drove there, collecting more data and filling in the map. Then I did another click into the hallway, one at the doorway into the kitchen, then a final click back to where I started in the living room. I have the robot's speed throttled back to 0.2 m/s max which is considerably slower than it can go and almost certainly slower than how fast the laser scanner and gmapping could keep up. Even so, the whole process only took a few minutes.

--patrick

Pi Robot
11-14-2010, 08:56 PM
Speaking of laser scanners, does anyone know if there is an optimum height off the floor to mount the beam? Since it's a laser, I'm guessing it could be mounted quite low without getting false positives off the carpet or ground. And I figure the lower you can mount it, the more likely the beam could pick up low lying obstacles such as small toys indoors or pebbles on an outside asphalt walkway.

--patrick

lnxfergy
11-14-2010, 09:38 PM
Speaking of laser scanners, does anyone know if there is an optimum height off the floor to mount the beam? Since it's a laser, I'm guessing it could be mounted quite low without getting false positives off the carpet or ground. And I figure the lower you can mount it, the more likely the beam could pick up low lying obstacles such as small toys indoors or pebbles on an outside asphalt walkway.

--patrick

I think you can go all the way down to the ground, with the following caveats that I've noticed:


1. If the bot is prone to rocking forward, you have to keep it high enough that the costmap does not accumulate any "floor" spots that it sees.
2. The low-cost URG laser has some peculiarities with certain materials. In particular, that crappy black plastic trim often found in commercial office space doesn't show up at much of an angle. I have to make sure all our lasers are at least 5"+ off the floor to avoid the laser falling on that stuff.

-Fergs

Pi Robot
11-14-2010, 10:26 PM
Cool. That's all good to know.

--patrick

Pi Robot
11-15-2010, 11:27 PM
Here is a screen capture of RViz while running gmapping in real time with the Hokuyo URG. The robot starts in the dining room and is then sent into the living room, hallway and kitchen using mouse clicks in RViz.

YouTube - SLAM 1: Testing ROS with the gmapping package

--patrick

Pi Robot
11-18-2010, 08:48 PM
For a slightly more entertaining (and much shorter) version of the above video, check out this 6x speeded up version:

YouTube - SLAM 1: Viewed at 6x speed

--patrick

Peter_heim
11-19-2010, 03:27 PM
Hi Patrick
I just got my serializer robot up and running is there a thread to discuss using your ros package?

peter

Pi Robot
11-20-2010, 12:30 PM
Hi Patrick
I just got my serializer robot up and running is there a thread to discuss using your ros package?

peter

Peter,

I started a new thread on this topic here:

http://forums.trossenrobotics.com/showthread.php?t=4483

--patrick

Peter_heim
11-25-2010, 06:04 AM
Hi Fergs
I have the PML working with the map now and I can drive around in rviz, but the long range IR isn't very good in a tight location. Can sonars or shorter range IRs be used with the PML, or on their own in static locations?

regards Peter

lnxfergy
11-25-2010, 07:26 PM
Hi Fergs
I have the PML working with the map now and I can drive around in rviz, but the long range IR isn't very good in a tight location. Can sonars or shorter range IRs be used with the PML, or on their own in static locations?

regards Peter

The PML can work with several different IR sensors -- see the wiki page. As for sonar, there's currently no such support (mainly because there's no easy way to integrate this with any portion of the navigation stack).

-Fergs

Peter_heim
11-29-2010, 01:28 AM
Hi Fergs
Trying to get 2 hobby servos to move using the ArbotiX, but I can't find the right pin number or I have made a mistake in the yaml file. Will the GUI controller control hobby servos?
here is a copy of my yaml file


port: /dev/ttyUSB1
rate: 15

sensors: {
"pml": {type: pml, servo_id: 1, sensor_type: A710YK},
"vm": {type: v_monitor, id: 2}
}
servos: {
head_pan_joint: {id: 19, sync: false}
}

controllers: {
joint_controller: {type: joint_controller, joints: [head_pan_joint] }
}


regards peter

Peter_heim
11-29-2010, 05:10 AM
Hi Fergs
Just got the voltage monitor to work. Is there a way to get it to run continuously in a small display (resized terminal)?

peter

lnxfergy
11-29-2010, 10:14 AM
Hi Fergs
Trying to get 2 hobby servos to move using the ArbotiX, but I can't find the right pin number or I have made a mistake in the yaml file. Will the GUI controller control hobby servos?
here is a copy of my yaml file


regards peter

The ID should correspond to the digital pin # -- and only goes 0-9. If you've compiled with "HW_SERVO" then only 0 and 1 are valid (and the actual pins for HW servos are 12 & 13, the hobby servo headers, for ids 0 and 1).

-Fergs

lnxfergy
11-29-2010, 10:15 AM
Hi Fergs
Just got the voltage monitor to work. Is there a way to get it to run continuously in a small display (resized terminal)?

peter

No GUI currently, but it should be fairly easy for you to write a GUI that calls the service at 1hz and then displays the number.
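
Something as simple as a little polling node would do the trick -- for example (the package, service name, service type, and response field below are all placeholders, so check the arbotix node for the real ones):

#!/usr/bin/env python
# Sketch only: 'my_pkg', 'get_voltage' and GetVoltage are placeholder names,
# not the actual arbotix service.
import roslib; roslib.load_manifest('my_pkg')
import rospy
from my_pkg.srv import GetVoltage

rospy.init_node("voltage_display")
rospy.wait_for_service("get_voltage")
get_voltage = rospy.ServiceProxy("get_voltage", GetVoltage)

rate = rospy.Rate(1)   # poll once per second
while not rospy.is_shutdown():
    rospy.loginfo("Battery: %.2f V" % get_voltage().voltage)
    rate.sleep()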

-Fergs

Pi Robot
11-29-2010, 11:41 AM
Hey Fergs,

I am getting a periodic "Checksum ERROR" message from arbotix.py (version 0.3.0) when I have all 11 of my AX-12's connected to the bus. The frequency of the error message is about once every 15-20 seconds. I don't see this message when just connecting my head pan and tilt servos. I tried printing out the id with the error message and it always reads 255. Is this something to be concerned about? Is it likely caused by one servo or a bad cable?

Thanks,
patrick

lnxfergy
11-29-2010, 12:01 PM
Hey Fergs,

I am getting a periodic "Checksum ERROR" message from arbotix.py (version 0.3.0) when I have all 11 of my AX-12's connected to the bus. The frequency of the error message is about once every 15-20 seconds. I don't see this message when just connecting my head pan and tilt servos. I tried printing out the id with the error message and it always reads 255. Is this something to be concerned about? Is it likely caused by one servo or a bad cable?

Thanks,
patrick

Are you hardwired or over XBEE? If you're hardwired with an FTDI cable, it might be a bad cable. If you're on XBEE, it's most likely an XBEE issue -- occasional checksum issues aren't a huge problem (they should be caught and the packet should be ignored, let me know if it's actually causing glitches).

-Fergs

Pi Robot
11-29-2010, 04:41 PM
I'm hardwired over an FTDI cable but I was plugged into a USB hub which was then plugged into the PC running ROS. Plugging the FTDI cable directly into the PC reduced the number of checksum errors considerably. In the meantime, it doesn't seem to be affecting control of the servos so I guess it's OK.

--patrick

lnxfergy
11-29-2010, 04:46 PM
I'm hardwired over an FTDI cable but I was plugged into a USB hub which was then plugged into the PC running ROS. Plugging the FTDI cable directly into the PC reduced the number of checksum errors considerably. In the meantime, it doesn't seem to be affecting control of the servos so I guess it's OK.

--patrick

Yeah, you should be fine. I'd assume you're running at least 15hz for the rates -- at 11 servos, that's about 2475 servo updates per 15s of runtime -- losing 1 packet every 15s is an error rate of less than 0.05%

Furthermore, it's a checksum error, rather than a timeout -- so it's not slowing down the bus any.
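
The arithmetic, for anyone who wants to check it:

# 11 servos polled at 15Hz for 15 seconds:
updates = 15 * 11 * 15        # = 2475 position reads
error_rate = 1.0 / updates    # one bad packet in 2475 ~= 0.04%, i.e. under 0.05%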

-Fergs

Pi Robot
11-29-2010, 05:15 PM
Yes, 15hz exactly. Thanks for the assurances. Now back to arm navigation, which should only take me another 3 or 4 months...

--patrick

RobotAtlas
11-29-2010, 09:37 PM
Now back to arm navigation, which should only take me another 3 or 4 months...

--patrick

Only 3 to 4 months? Man, you are fast. But then something tells me you are not going to write it from scratch. :)

Pi Robot
11-30-2010, 01:28 PM
Only 3 to 4 months? Man, you are fast. But then something tells me you are not going to write it from scratch. :)

You got that right. In fact, it was the tf library in ROS that mostly motivated me to switch over. And a little prodding from Fergs. ;)

P.S. This is me doing it from scratch (http://www.pirobot.org/blog/0011/) for just two joints per arm. Now I am glad to let ROS/tf figure out the rest.

lnxfergy
11-30-2010, 01:34 PM
Yes, 15hz exactly. Thanks for the assurances. Now back to arm navigation, which should only take me another 3 or 4 months...

--patrick

Are you planning to use the arm_navigation stack? If so, you'll want to keep an eye on arbotix_experimental, as I'm currently developing joint_trajectory_controller in there (as it's dependent on trajectory_msgs, which is currently in a PR2 specific stack)-- I've also got a new setup for a tilting laser using the ArbotiX (but it's still a bit buggy). I'm working on a major overhaul to the joint_trajectory_controller over the next few weeks, which should really smooth things out -- at which point I should have a pretty good demo of arm_navigation working on an arbotix (the perceptual stuff is mostly ready, and I'm stealing the IK off something else).

-Fergs

Pi Robot
11-30-2010, 05:40 PM
Yeah, this weekend I started playing with David Lu's arm_navigation stack. So far all I've done is to get the forward kinematics to work on one of Pi's 4-jointed arms. Next I'll try the IK part. (Darn day job keeps getting in the way...) Good to know you are working on the trajectory stuff!

--patrick

Correction: That's the arm_kinematics stack by David Lu, not the arm_navigation stack.

Peter_heim
12-03-2010, 01:37 AM
Hi fergs

The ID should correspond to the digital pin # -- and only goes 0-9. If you've compiled with "HW_SERVO" then only 0 and 1 (and the actual pin for HW servos is 12 & 13, the hobby servo headers, for ids 0 and 1)

-Fergs

I still can't get the servos to respond. I have compiled with HW_SERVO and put a voltage regulator on the power supply to the servos to limit the voltage to 6 volts; when I apply power to the board the servos move. When I use the controller GUI nothing happens. I have used ids 0 & 1 and 12 & 13 with no difference.
When I use the controllerGUI, the PML servo (servo_id 1) moves even after I have removed the line.

peter

lnxfergy
12-03-2010, 04:32 PM
Hi fergs


I still can't get the servos to respond. I have compiled with HW_SERVO and put a voltage regulator on the power supply to the servos to limit the voltage to 6 volts; when I apply power to the board the servos move. When I use the controller GUI nothing happens. I have used ids 0 & 1 and 12 & 13 with no difference.
When I use the controllerGUI, the PML servo (servo_id 1) moves even after I have removed the line.

peter

Could you post your YAML? I'll try to test this out tonight and see if something got broken.

-Fergs

Peter_heim
12-03-2010, 04:57 PM
Hi Fergs
here is the yaml file

port: /dev/ttyUSB1
rate: 15

#sensors: {
#"pml": {type: pml, servo_id: 1, sensor_type: A710YK},
#"vm": {type: v_monitor, id: 1}
#}

servos: {
head_pan_joint: {id: 0},
head_tilt_joint: {id: 1}
}

controllers: {
joint_controller: {type: joint_controller, joints: [head_pan_joint, head_tilt_joint] }
}

Pi Robot
12-05-2010, 08:43 PM
Hey Fergs and other ROS users,

Can anyone give me some hints on how I can substitute for the ArbotiX in a Gazebo simulation? This is what I have done so far:


1. Created an xacro file for Pi Robot including Gazebo references and collision/inertial blocks.
2. Ran Gazebo with either the empty world or simple world.
3. Loaded the Pi Robot model in Gazebo and verified that the joints are all rotating in the correct manner by applying efforts using the gazebo/apply_joint_effort service from the command line.

Now I'd like to try moving the links in the simulated robot either through another node or by using the joint_state_publisher GUI sliders. So I need to somehow simulate the ArbotiX joint_controller in Gazebo so that it takes cmd_joint messages and applies the appropriate efforts in Gazebo. I've gone through the Gazebo documentation on the ROS wiki and so far I have no clue how to do this...

--patrick

lnxfergy
12-05-2010, 09:35 PM
Hey Fergs and other ROS users,

Can anyone give me some hints on how I can substitute for the ArbotiX in a Gazebo simulation? This is what I have done so far:


1. Created an xacro file for Pi Robot including Gazebo references and collision/inertial blocks.
2. Ran Gazebo with either the empty world or simple world.
3. Loaded the Pi Robot model in Gazebo and verified that the joints are all rotating in the correct manner by applying efforts using the gazebo/apply_joint_effort service from the command line.

Now I'd like to try moving the links in the simulated robot either through another node or by using the joint_state_publisher GUI sliders. So I need to somehow simulate the ArbotiX joint_controller in Gazebo so that it takes cmd_joint messages and applies the appropriate efforts in Gazebo. I've gone through the Gazebo documentation on the ROS wiki and so far I have no clue how to do this...

--patrick

Generally, you need a Gazebo plugin. I know some people have adapted the "differential drive" plugin to work on the Videre and Create platforms -- I'm sure it could be adapted to work as a replacement for the base controller. The arm controller is a bit more complex I imagine, and I've not seen a simple Gazebo plugin for that (although I imagine one *must* exist). You might take a look around the University of Arizona repositories -- they have a custom manipulator based on a videre+arm, and I know they use Gazebo with it....

-Fergs

Pi Robot
12-06-2010, 08:35 AM
Thanks Fergs--I'll take a look at the UA repository.

--patrick

Peter_heim
12-08-2010, 04:20 PM
Hi Fergs

Could you post your YAML? I'll try to test this out tonight and see if something got broken.

-Fergs
Have you had any luck with the problem of hobby servos not working?

peter

anton
12-13-2010, 03:13 PM
Folks,

Over the past few weeks I was trying to get the motion planning stack working with our custom arm. I think I was successful, and was able to plan a collision-free trajectory to an arbitrary point in Cartesian space. To document some of the steps I went through, I created a new tutorial which might be helpful to people trying to get their arms running with ROS' motion planning stack. Here's the link:

http://www.ros.org/wiki/arm_navigation/Tutorials/Running%20arm%20navigation%20on%20non-PR2%20arm

Thanks,
Anton

Pi Robot
12-13-2010, 07:02 PM
Thank you Anton--perfect timing and very nice!

--patrick

Peter_heim
12-14-2010, 04:34 AM
Hi All
I have added 4 dynamixel servos as the start of an arm. How can I move these servos? The controllerGUI only has 3 servo sliders, and the joint controller with rviz doesn't work.

peter

Peter_heim
01-05-2011, 05:42 AM
Hi Fergs
I have tried a short range IR sensor on the PML and the navigation in tight places works, but I think it also needs the longer range as well. Is there a plan to be able to use 2 IRs for a wider range in the future?
I think I will be using this until Neato brings out a cheap laser.

peter

lnxfergy
02-06-2011, 11:36 AM
Just a heads up to anyone using the "trunk" of vanadium_drivers. I'm going to be moving the new arbotix stack from a branch back into trunk over the next couple hours in preparation for an 0.4.0 release.

This also means our docs on the wiki are going to look messed up for a short time until I update them for the new stack/package structure.

I hope to have a tagged release mid-week.

-Fergs