View Full Version : ArbotiX ROS Package 0.2.0



lnxfergy
08-26-2010, 11:23 AM
There's been a lot of discussion lately about ROS and the ArbotiX -- I've now released v0.2.0 which is the first release stable enough for people to really start playing with. The vanadium-ros-pkg wiki has more information about this particular release: http://code.google.com/p/vanadium-ros-pkg/wiki/ReleaseNotes

I've also created a quick-start page: http://code.google.com/p/vanadium-ros-pkg/wiki/ArbotixGettingStarted, while the more complete documentation can be found on the package documentation page http://code.google.com/p/vanadium-ros-pkg/wiki/arbotix. Eventually as this is more refined, it will be moved over to the main ROS wiki.

There's still a lot of functionality in the works, and I've got a lot of work to do on documentation in the coming weeks.

-Fergs

Pi Robot
08-26-2010, 04:20 PM
Thanks for the announcement Fergs. This is terrific. I'm getting ready to place an order for an ArbotiX and I have a couple of questions about your dual 1A motor drivers and encoder headers. Are these compatible with the following motors from Robotics Connection?

http://www.roboticsconnection.com/p-51-dc-gearhead-robot-motor.aspx

These motors come with a very nice encoder harness and so I am hoping it is plug-and-play with the ArbotiX (I know that might be wishful thinking!).

The other question is that I routinely see greater than 1A draw on these motors when connected to a Serializer controller and my 12 lb robot. Is it possible to use an external H-Bridge such as the following with the ArbotiX?

http://www.roboticsconnection.com/p-77-10-amp-dual-h-bridge.aspx

Many thanks and congrats on getting release 0.2 out!

--patrick

lnxfergy
08-26-2010, 04:27 PM
I'm not sure about those motors/encoders -- the motors/encoders I've usually used are:

http://www.lynxmotion.com/p-653-gear-head-motor-12vdc-301-200rpm-6mm-shaft.aspx
http://www.lynxmotion.com/p-448-quadrature-motor-encoder-wcable.aspx

Note, the Lynxmotion motors are 12V -- which makes it quite easy to run AX-12s and the motors off the same 11.1V or 12V battery. The encoders have a wiring harness that plugs right into the encoder headers on the ArbotiX -- and you can buy little motor plugs that connect very easily to the larger motor driver mentioned below.

You can tie in a new motor driver pretty easily, but it will probably require writing a new library. I'm actually going to add standard support for this bigger motor driver: http://www.pololu.com/catalog/product/707 which is what I've been using on my bigger robots -- the library is aptly named "BigMotors" and has the same I/O interface as the existing Motors/Motors2 libraries that the ArbotiX environment currently has. If you've seen my Armadillo 2WD robot, it's getting an upgrade to this motor driver as we speak (so the library should be out in a few days).

-Fergs

Pi Robot
08-26-2010, 05:09 PM
Thanks--I just happen to have a set of those Lynxmotion motors and encoders. Just found your Armadillo 2WD project pictures--very nice!

--patrick

RobotAtlas
08-27-2010, 08:10 PM
Fergs, you used GP2Y0A700K for your PML, right?

Actually it looks like a combination of those will give almost perfect sensing ability:
GP2Y0A700K (100 - 500 cm) or GP2Y0A21YK (10 - 80 cm).

What I don't understand is how you decide how fast it can be moved.
Are you synchronizing the IR pulses with the movement?

lnxfergy
08-27-2010, 08:39 PM
Fergs, you used GP2Y0A700K for your PML, right?

Actually it looks like a combination of those will give almost perfect sensing ability:
GP2Y0A700K (100 - 500 cm) or GP2Y0A21YK (10 - 80 cm).

What I don't understand is how you decide how fast it can be moved.
Are you synchronizing the IR pulses with the movement?


Yes, it's the 700K. And yes, there's been discussion of adding a shorter range sensor (hanging off the side).

The PML code is undergoing a huge overhaul right now -- the first version had a lot of hard-coded stuff, while the newer version is highly customizable, both at start time and using dynamic_reconfigure. We can now change how many readings to take in a scan, what the min/max angle of a scan is, etc. -- and hopefully the interface will end up very close to that of the hokuyo_node package.

There will also be support for changing what kind of sensor is attached (the conversion from raw 10-bit readings to distance is already done in the PC-side code, but support for additional sensor types is yet to be implemented). This is mainly because a 700K is perfect for human-scale environments, while switching to a GP2D12 would be awesome for a small Trinity Fire Fighting robot. Support for multiple sensors may be implemented -- but it's going to take some major changes in the firmware, as currently we'd get dangerously close to filling the register table address space with more sensors.

As for timing: the IR sensors update at about a 30hz rate. Our scan loop runs at a slightly slower 20hz rate. It moves the sensor to a new position, waits until the next update, reads the current sensor value, and moves to the next position. As long as we don't make the transit too far, everything runs pretty well (there is a chance that the scan could be skewed by up to 6 degrees, but that's a fairly small trade-off for a cheap little device). All servo control and sensor reading is handled by the ArbotiX firmware; the PC just sets up the min/max angle and number of readings, and then starts and stops scanning. It can read back the current scan asynchronously from the scanning operation.
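The move/wait/read cycle described above can be sketched in a few lines of Python. This is an illustration of the timing scheme only -- `set_servo_angle` and `read_ir` are hypothetical callbacks standing in for the ArbotiX servo/ADC access, not the actual firmware API:

```python
import time

SCAN_POINTS = 30          # readings per scan (configurable in the real driver)
LOOP_HZ = 20.0            # sensor loop rate described above
MIN_ANGLE, MAX_ANGLE = -90.0, 90.0

def fake_read_ir():
    """Stand-in for the ADC read of the Sharp IR sensor."""
    return 512

def scan_once(set_servo_angle, read_ir):
    """One sweep: step the servo, wait one loop tick, then sample."""
    step = (MAX_ANGLE - MIN_ANGLE) / (SCAN_POINTS - 1)
    readings = []
    for i in range(SCAN_POINTS):
        set_servo_angle(MIN_ANGLE + i * step)
        time.sleep(1.0 / LOOP_HZ)   # wait for the next sensor update
        readings.append(read_ir())
    return readings
```

Note that 30 points at 20hz works out to a 1.5-second sweep, which is why a moving robot smears the scan.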

However, the major problem with the current PML implementation is this: it's really good sitting still, and sorta sucks once the robot starts to move (since the robot is not in the same position throughout the very long scan period). ROS has some ability (via tf) to deal with sensor data coming in over a long time, and I've got a number of items we're going to be implementing to try and remedy this, but on higher-speed platforms it may still not work well -- I really don't have much data to go on here yet, as typically the delay is between scans (in the case of building a point cloud from a tilting laser), not within the scan itself. Regardless, the quantity of data is much smaller (30 readings per second, as opposed to ~6000 on the low-end lidars; currently limited by the IR sensor itself).

-Fergs

RobotAtlas
08-27-2010, 09:16 PM
It would be nice to see a video of PML working. Also some rviz captures would be nice too. ;)
As far as 30 vs 6000 points go, I think it really depends on what you are trying to accomplish with the data.
If you just want to use it for navigation and assume that everything is completely vertical, then why would you need 6000 points/second?

How long does one scan take in your experiments?

lnxfergy
08-27-2010, 09:57 PM
It would be nice to see a video of PML working. Also some rviz captures would be nice too. ;)
As far as 30 vs 6000 points go, I think it really depends on what you are trying to accomplish with the data.
If you just want to use it for navigation and assume that everything is completely vertical, then why would you need 6000 points/second?

How long does one scan take in your experiments?

No video -- but basically, imagine an AX-12 swinging back and forth, making a 180-degree-per-second scan (it's quite loud... one major drawback). The blog post (http://www.showusyoursensors.com/2010/08/ros-and-other-ramblings.html) about it has some pics of the sensor, and also an rviz screen capture (http://3.bp.blogspot.com/_qmthA_fNJ-Y/TF-FJYnkVJI/AAAAAAAAAK0/f46bmen9fdU/s1600/Screenshot-RViz-2.png) of a costmap generated by driving the Create down the hallway and approaching the office doorways. The costmap generation is limited to 3m, although the PML goes out to 5m -- hence the dots in the distance (walls in the lab offices) where the costmap has not yet been generated.

So, the PML works for costmap generation on slower platforms -- that costmap was generated when going about 0.2m/s. One major issue is the map skewing, and false obstacles appearing, due to the very slow update rates (especially when turning). Hopefully some of the improvements in timing and the publication parameters can help this -- allowing slightly faster speeds.

However, where the PML really fails us is that it simply does not work for localization or map generation (using amcl or gmapping). I really doubt that any improvements are going to make the sensor good enough for localization -- the density of each scan (30 points vs 600) means that you just can't get enough information for localization from a scan. So, no global maps or localization with the PML -- but it is pretty decent for avoiding obstacles.

-Fergs

lnxfergy
08-28-2010, 01:14 PM
I just realized I misspoke a bit earlier -- the sensor loop runs at 20hz, and so, with 30 points in a scan, a scan takes 1.5 seconds. I did originally have the scan running at 30hz, but it was obnoxiously loud as it tried to move the servo that fast.

I'm going to be checking in a patch (as the last software update actually had the 30hz loop, instead of the correct 20), and releasing an 0.2.1 release for anyone planning to play with the PML before 0.3.0 comes out. I also forgot to check the NUKE firmware in to SVN, so that will be fixed in 0.2.1 as well.

-Fergs

RobotAtlas
08-28-2010, 05:21 PM
I saw somewhere that your IR sensor has a 36ms update rate, so that's no more than 27.77hz anyway.

RobotAtlas
08-28-2010, 10:38 PM
However, where the PML really fails us is that it simply does not work for localization or map generation (using amcl or gmapping). I really doubt that any improvements are going to make the sensor good enough for localization -- the density of each scan (30 points vs 600) means that you just can't get enough information for localization from a scan. So, no global maps or localization with the PML

Can't you just go slower or sit in one place for 30 seconds or so -> 20 * 30 / 1.5 = 400 points.
Would that be enough? If gmapping needs more, can you just create intermediate points yourself?

It sounds like you are giving up though...

lnxfergy
08-28-2010, 10:58 PM
Can't you just go slower or sit in one place for 30 seconds or so -> 20 * 30 / 1.5 = 400 points.
Would that be enough? If gmapping needs more, can you just create intermediate points yourself?

It sounds like you are giving up though...

You might be able to coax gmapping into working with the PML, but it'll take you several days to collect enough information for a human-scale map if you have to sit still for 30s at each pose. As for creating intermediate points -- it might help in tricking gmapping (it is on my list of experiments) -- but more likely it's going to just throw it off (as the interpolated points aren't independently generated from the original points, and don't fully represent the underlying environment). The biggest gain I see with interpolated points is that they will help with costmap clearing (although with an increased possibility of running into thin objects, such as table legs -- it's all about weighing the trade-offs; there really is no free lunch).
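For concreteness, linear interpolation between adjacent readings might look like this (a toy sketch -- as noted, the interpolated points add no new information about the environment):

```python
def interpolate_scan(ranges, factor=2):
    """Insert (factor - 1) linearly interpolated points between each
    pair of real readings, densifying a sparse PML scan."""
    if factor < 2 or len(ranges) < 2:
        return list(ranges)
    dense = []
    for a, b in zip(ranges, ranges[1:]):
        for k in range(factor):
            dense.append(a + (b - a) * k / factor)
    dense.append(ranges[-1])    # keep the final real reading
    return dense
```

A 30-point scan with factor=2 becomes 59 points -- denser, but every new point is just a guess about what lies between two samples.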

As for "giving up" -- um, it's open source software; you are more than welcome to pick up wherever I leave off and run with it. I have numerous projects on my plate at any given time, and I have to focus on where I think the most gains can be made. After overhauling tf and scan publication so that timing is correct for the costmap nodes, and probably implementing a simple scan interpolation, the PML will be pretty effective for obstacle avoidance on lower-speed platforms. Making it work (reliably) with gmapping and amcl is an entirely different level of performance, and something I currently don't think can be done without a very intensive amount of work (and even then, I'm not entirely sure it *can* be done -- there are already disclaimers that amcl doesn't like lower-cost, shorter-range lidars).

I should also point out though: you can still use the Navigation stack without amcl, gmapping, or a global map -- by using a rolling window costmap for both the local *and* global planner -- although since the global planner really doesn't know anything globally, long-distance plans will be sub-optimal (or possibly completely wrong).

-Fergs

Pi Robot
08-30-2010, 08:39 PM
I realize this would cost a little more, but how about using two or even three IR sensors pointing 90 or 60 degrees apart (assuming a 180 degree field of view) so you can get 2 or three times the data in the same sweep time?

--patrick

Pi Robot
09-03-2010, 11:06 AM
Hey Fergs,

I've been going through your ArbotiX ROS code and I can't tell you how helpful it is to have a working example to learn from. In the meantime, I have a simple ROS question: Why is it that the topics in the turtlesim tutorial are displayed prefixed by the node name like this:

/turtle1/color_sensor
/turtle1/command_velocity
/turtle1/pose

whereas the ArbotiX joint_states topic appears simply as:

/joint_states

rather than:

/arbotix/joint_states

I looked through the turtle.cpp file and, not being a C++ programmer, I couldn't see where the node name was being added to the topics.

--patrick

lnxfergy
09-03-2010, 11:23 AM
I'm not 100% certain about publishers, but I know that for reading parameters you have to explicitly add the node namespace in Python, whereas in C++ the namespace depends on which type of node handle is used. Part of the issue here is also that the NodeHandle is passed into the Turtle constructor. I notice that on line 150 of turtle_frame.cpp we have:



TurtlePtr t(new Turtle(ros::NodeHandle(real_name), turtle_images_[rand() % 3],
                       Vector2(x, y), angle));

And so, real_name corresponds to turtle1, and we have our final service/topic name.

A second note is that you could do such things by remapping /pose to /turtle1/pose in a launch file -- although this is not the case here (I think).

On a less technical note, the reason we publish to joint_states instead of arbotix/joint_states is because joint_states by itself is a common topic (like cmd_vel) -- it's really not dependent on the underlying hardware (any hardware *should* publish joint_states if we want to use standard tools such as robot_state_publisher).
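For readers new to ROS names, the resolution rules at work here can be modeled in plain Python. This is a simplified sketch of ROS1 graph-name resolution that ignores remapping -- not rospy's actual implementation:

```python
def resolve_name(name, node_name, ns="/"):
    """Resolve a ROS1 graph name (remapping ignored):
    /name  -> global, used as-is
    ~name  -> private, prefixed with the node's name
    name   -> relative, prefixed with the namespace only"""
    if name.startswith("/"):
        return name
    if name.startswith("~"):
        return ns.rstrip("/") + "/" + node_name + "/" + name[1:]
    return ns.rstrip("/") + "/" + name
```

So a bare "joint_states" published by any driver node lands on /joint_states, which is exactly why hardware-agnostic tools like robot_state_publisher can find it, while turtlesim's per-turtle NodeHandle pushes its topics down into /turtle1/.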

-Fergs

Pi Robot
09-03-2010, 11:41 AM
Many thanks--that makes perfect sense now.

--patrick

P.S. Just got my shiny new ArbotiX and can't wait to try it with your code.

lnxfergy
09-03-2010, 01:02 PM
Just a note -- I posted an 0.2.1 release to fix 2 bugs:


- Update PML firmware update rate to 20hz (30hz is too fast), as discussed above.
- Fix bugs in servo-level sync. You can now turn off the reading of servos (the joint state will just be the last position command sent). This is used on Nelson's HS-55 to AX-12 bridge board (Nelson just updated to 0.2.0, which is how I noticed the issue today).

-Fergs

Pi Robot
09-04-2010, 08:01 AM
Thanks for the update. I've been practicing ROS stuff with my Serializer board before taking on my new ArbotiX and I have a question about ROS parameters that I think is independent of the board. In short, I can't get the ~param construct to work properly.

My params.yaml file looks like this:

port: /dev/ttyUSB0
baud: 19200
rate: 10
gear_reduction: 2

and my serializer.launch file looks like this:


<launch>
<node name="serializer" pkg="serializer" type="serializer_node.py" output="screen">
<rosparam file="$(find serializer)/params.yaml" command="load" />
</node>
</launch>

Then in my ROS Serializer node file I have, for example:


class SerializerROS():
    def __init__(self):
        self.port = rospy.get_param("~port", "/dev/ttyUSB0")

However, if I change the port in my params.yaml file to the wrong port (e.g. /dev/ttyUSB1) and re-launch the Serializer node, it still works, as if it is not getting the value for ~port. On the other hand, if I use the fully qualified parameter name like this:


self.port = rospy.get_param("/serializer/port", "/dev/ttyUSB0")

then it *does* read the value for port.

I thought I was following your example on the vanadium-ros-pkg wiki but clearly I am missing something about the parameter server name space.

Thanks!
patrick

lnxfergy
09-04-2010, 11:29 AM
Do you have a call to rospy.init_node() before that? If not -- the node isn't initialized and so it typically can't communicate with the parameter server.

-Fergs

Pi Robot
09-04-2010, 08:46 PM
Oh for crying out loud--I had the call to rospy.init_node() after the get_param() lines...As always, thanks again.

--patrick

Pi Robot
09-06-2010, 08:09 PM
Hey Fergs,

What's the accepted way to read in a parameter file when using rosrun to launch a node instead of roslaunch? Should I just make a system call to "rosparam load" from within my node's Python file? Or is there a better way?

Thanks,
patrick

lnxfergy
09-06-2010, 08:15 PM
Hey Fergs,

What's the accepted way to read in a parameter file when using rosrun to launch a node instead of roslaunch? Should I just make a system call to "rosparam load" from within my node's Python file? Or is there a better way?

Thanks,
patrick

I think you can do rosparam load -- but really, the roslaunch file is the more accepted form within the ROS community. Any reason to avoid a launch file?

-Fergs

Pi Robot
09-06-2010, 08:27 PM
It's really just for testing when I am editing the node in Eclipse/PyDev. If I run it from within Eclipse, the parameters are not loaded, which got me thinking in general how to load the parameters for a single node without using a launch file. But yeah, I am happy to use the launch file for normal operation.

While on the subject, do you use Eclipse/PyDev to develop your ROS code? If so, I am wondering how to get PyDev to recognize rospy as a legitimate module so that (a) code completion works and (b) the editor doesn't red underline everything to do with ROS. I can run the file fine by making sure the path to rospy is in the PyDev environment variables, but rospy still seems to be considered somehow outside the editing environment.

--patrick

lnxfergy
09-06-2010, 08:34 PM
Ah, no, I don't use Eclipse. I know there has been discussion on ros-users about Eclipse issues recently (although I don't know whether it was related to Python development).

-Fergs

Pi Robot
09-06-2010, 08:35 PM
OK thanks. I'll check the ros-users list for more info.

Pi Robot
09-08-2010, 05:46 PM
P.S. Just got my shiny new ArbotiX and can't wait to try it with your code.

Hey Fergs,

I'm finally trying out my new ArbotiX for the first time. Since I won't be using the Arduino software, I ignored that part of the setup. Also, I am trying things with the AVR/ISP cable first before braving the Xbee radios.

I installed the Windows Pololu drivers for the AVR Programmer. With the Programmer plugged into a USB port on my PC and the other end plugged into the ISP port on the ArbotiX, I see the solid green light and blinking yellow light on the Programmer as expected. I am powering the ArbotiX with an 8.4V NiMH battery pack, and the ArbotiX shows a steady green "On" LED.

Then I connected two AX-12+'s to the ArbotiX Bioloid bus with IDs 1 and 2. No sparks -- good. Finally, I fired up your arbotix.py driver and set the COM port to "COM19", which Windows labels as "Pololu AVR USB Programmer TTL Serial Port". I tried different baud rates, starting with 38400 but also 1Mbps, 19200, and 9600. In all cases, I get the following error when trying to read the position of servo ID 1:

Fail Read
Read Failed: Servo ID = 1
-1

I checked that Jumper J1 is set. Also, everything works fine if I plug the two servos into a USB2Dynamixel controller. No doubt I have missed a critical step during the setup, or perhaps the AVR Programmer cannot be used as a serial port this way?

Thanks!
patrick

lnxfergy
09-08-2010, 05:53 PM
The Pololu programmer TTL signal is actually on the side (the unplugged header) -- not the ISP 2x3 pin header (which connects to SPI, not TTL port). Typically, one would use an FTDI breakout to connect the ArbotiX to the PC.

You'll need to put the ArbotiX-ROS firmware on the board also -- as the new ROS firmware is way more advanced than the PyPose firmware that the boards are currently shipping with. If you're not interested in installing Arduino IDE, I can try to put compiled hex files up on the SVN.

-Fergs

Pi Robot
09-08-2010, 05:58 PM
OK, I see. Since I don't have an FTDI breakout handy, I'll go for broke and try the Xbee's...

Pi Robot
09-08-2010, 11:02 PM
You'll need to put the ArbotiX-ROS firmware on the board also -- as the new ROS firmware is way more advanced than the PyPose firmware that the boards are currently shipping with. If you're not interested in installing Arduino IDE, I can try to put compiled hex files up on the SVN.
-Fergs

We have lift off! Just now got back to setting up the ArbotiX, plugged in the XBEEs, set the baud to 38400 and connected perfectly on the first try using arbotix.py. Tomorrow I'll upgrade the firmware and try it with your ROS package.

--patrick

Pi Robot
09-09-2010, 09:03 AM
Hey Fergs,

Things are going smoothly with the ArbotiX on Pi Robot. I haven't yet upgraded the firmware to try ROS 'cause I'm still testing all the basic functions in the Python driver. I was wondering if the syncWrite() function is still incomplete? I ask because there are two undefined variables--length and valsum. Also, and sorry for asking such a basic question, how do I use syncWrite() properly? I'd like to use it to send a bunch of goal positions to my 11 servos at the same time. Similarly for setting the speeds. Would you please give me an example to follow?

Thanks!
patrick

lnxfergy
09-09-2010, 11:14 AM
Hey Fergs,

Things are going smoothly with the ArbotiX on Pi Robot. I haven't yet upgraded the firmware to try ROS 'cause I'm still testing all the basic functions in the Python driver. I was wondering if the syncWrite() function is still incomplete? I ask because there are two undefined variables--length and valsum. Also, and sorry for asking such a basic question, how do I use syncWrite() properly? I'd like to use it to send a bunch of goal positions to my 11 servos at the same time. Similarly for setting the speeds. Would you please give me an example to follow?

Thanks!
patrick

So, there are actually a few things with SyncWrite. First -- the firmware support isn't there yet (it's on my list for the next release), as it wasn't needed for the old PyPose sketch (since we instead loaded final values and had the ArbotiX do interpolation, something I haven't been using much with ROS). Second, because I haven't been testing SyncWrite, I broke it in the Python program (you'll notice that between 0.1.0 and 0.2.0, SyncRead/SyncWrite were overhauled to use the execute() function, making the mutex usage cleaner and clearer -- SyncRead was tested, SyncWrite was not).
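For anyone curious what SyncWrite does on the wire, here is a hedged sketch of building a Dynamixel protocol-1.0 SYNC_WRITE packet in Python. The helper name and register constant are illustrative only -- this is the bus protocol, not the arbotix.py API:

```python
GOAL_POSITION_L = 30   # AX-12 goal-position register, low byte
SYNC_WRITE = 0x83      # protocol 1.0 SYNC_WRITE instruction
BROADCAST = 0xFE       # SYNC_WRITE is always broadcast

def sync_write_packet(addr, values_by_id):
    """Build one SYNC_WRITE packet writing the same registers on many
    servos. values_by_id maps servo_id -> list of data bytes; all lists
    must be the same length."""
    per_servo = len(next(iter(values_by_id.values())))
    params = [addr, per_servo]
    for sid, data in sorted(values_by_id.items()):
        assert len(data) == per_servo
        params += [sid] + list(data)
    length = len(params) + 2                       # + instruction + checksum
    body = [BROADCAST, length, SYNC_WRITE] + params
    checksum = (~sum(body)) & 0xFF                 # ones'-complement checksum
    return bytes([0xFF, 0xFF] + body + [checksum])
```

For example, sync_write_packet(GOAL_POSITION_L, {1: [0x00, 0x02], 2: [0x00, 0x02]}) commands servos 1 and 2 to position 512 in a single broadcast, so both start moving at the same time.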

I've gotten busy the last few days with conference submission deadlines -- but I'm hoping to have an 0.3.0 release out Monday or Tuesday -- SyncWrite should be operational in there.

-Fergs

Pi Robot
09-09-2010, 11:57 AM
...I've gotten busy the last few days with conference submission deadlines -- but I'm hoping to have an 0.3.0 release out Monday or Tuesday -- SyncWrite should be operational in there.

-Fergs

Thanks Fergs--I'm definitely not in a hurry.

--patrick

Pi Robot
09-10-2010, 08:13 AM
Yet another ROS question: I defined a message type called SensorState to mimic JointState that looks like this:

Header header

string[] name
float64[] value

After remaking my package, the SensorState works as expected *except* that header.seq does not auto-increment (it stays fixed at 0). I can increment it manually as my node publishes, but I thought it was supposed to auto-increment?

--patrick

Pi Robot
09-14-2010, 07:45 AM
Just a note -- I posted an 0.2.1 release to fix 2 bugs:


- Update PML firmware update rate to 20hz (30hz is too fast), as discussed above.
- Fix bugs in servo-level sync. You can now turn off the reading of servos (the joint state will just be the last position command sent). This is used on Nelson's HS-55 to AX-12 bridge board (Nelson just updated to 0.2.0, which is how I noticed the issue today).

-Fergs

Hi Mike,

Do you have a step-by-step guide to upgrading the firmware on the ArbotiX? I have the Arduino software loaded on my Windows machine and I'm looking at your Getting Started guide, but it's still not clear to me how to just upgrade the firmware. I have the Pololu ISP programmer.

Thanks!
patrick

Pi Robot
09-14-2010, 08:08 AM
The reason I was looking to upgrade the firmware is that I am trying to read the present speed and load from the servos so I can publish them as part of the JointState topic. I'm getting back some odd numbers (and read failures) so does this mean these two registers are not readable in the firmware shipped with the Arbotix?

Thanks again,
patrick

DresnerRobotics
09-14-2010, 10:44 AM
Hi Mike,

Do you have a step-by-step guide to upgrading the firmware on the ArbotiX? I have the Arduino software loaded on my Windows machine and I'm looking at your Getting Started guide, but it's still not clear to me how to just upgrade the firmware. I have the Pololu ISP programmer.

Thanks!
patrick


Follow all the steps in the Getting Started guide: make sure you have copied over the appropriate libraries and hardware folder, have the programmer selected as your comm port, and have modified your boards.txt file within your arduino0018\arbotix\ folder to be set up for an ISP. At that point, you should be able to just load the ROS .PDE file into the Arduino IDE and click upload.

Pi Robot
09-14-2010, 11:08 AM
Thanks Tyberius--I'll give that a try.

--patrick

Pi Robot
09-14-2010, 05:02 PM
OK, I'm stuck part way through the Upload process with the following error:


ros.cpp:59:26: error: EncodersAB.h: No such file or directory
In file included from ros.cpp:61:
/pid.h: In function 'void updatePID()':
pid.h:123: error: 'Encoders' was not declared in this scope
/pid.h: In function 'void clearAll()':
pid.h:149: error: 'Encoders' was not declared in this scope
ros.cpp: In function 'void setup()':
ros:80: error: 'Encoders' was not declared in this scope
ros.cpp: In function 'unsigned char handleWrite()':
ros:137: error: 'class Servo' has no member named 'writeMicroseconds'
ros.cpp: In function 'int handleRead()':
ros:250: error: 'Encoders' was not declared in this scope
ros:254: error: 'Encoders' was not declared in this scope
I am using version 0019 of the Arduino software, and everything seemed to work OK up to that point -- meaning it saw my ISP Programmer port and the ArbotiX board. The ros.pde file I am trying to upload is from version 0.2.1 of the SVN distribution.

One discrepancy I found in the Getting Started instructions: it refers to a "sketchbook" directory, which exists in neither the Arduino folder nor the Arbotix folder.

Any thoughts?
-patrick

DresnerRobotics
09-14-2010, 05:11 PM
Sketchbook is usually the Arduino folder created inside your My Documents. I've found it's best to keep your hardware folder in both the sketchbook and the root Arduino folder, because it's kind of wonky. Also, I'm not sure about 0019 support; it's brand new and thus not 'stable' nor tested.

Did you also install the Robocontroller libraries? Sounds like it's not finding the encoder library included in those.

Pi Robot
09-14-2010, 05:34 PM
Thanks for the quick reply--I was missing the robocontroller library! This time uploading completes, but I get the following ominous warning:

Binary sketch size: 11908 bytes (of a 65536 byte maximum)
avrdude: stk500v2_command(): command failed
avrdude: initialization failed, rc=-1
Double check connections and try again, or use -F to override
this check.

What do you think?

--patrick

lnxfergy
09-14-2010, 05:41 PM
Thanks for the quick reply--I was missing the robocontroller library! This time uploading completes, but I get the following ominous warning:

Binary sketch size: 11908 bytes (of a 65536 byte maximum)
avrdude: stk500v2_command(): command failed
avrdude: initialization failed, rc=-1
Double check connections and try again, or use -F to override
this check.

What do you think?

--patrick

This means the ISP isn't quite working (have you selected it? have you selected the right port? have you attached the cable the right way?)

-Fergs

lnxfergy
09-14-2010, 05:44 PM
Also, I'm not sure about 0019 support; it's brand new and thus not 'stable' nor tested.

I haven't updated any of our docs for 0019 -- but I am developing against it right now, and it doesn't appear to have any issues as not much changed in the IDE.

On a side note, I spent some time Saturday overhauling our arbotix core code to be more in line with new features in 0019, but we don't have an official release of that code yet (it is in SVN).

-Fergs

Pi Robot
09-14-2010, 06:10 PM
There were two COM ports listed for the ISP so I chose the other one and this time there was no error; however, the uploading status message has been stuck at "Uploading to I/O Board..." for about 10 minutes. Should it take that long, or should I start everything over?

--patrick

Pi Robot
09-14-2010, 06:34 PM
OK, don't laugh, but I did not realize you had to have the ArbotiX ON for the upload to work...

Problem solved. Nothing to see here folks. Move it along...

--patrick

Adrenalynn
09-15-2010, 02:13 AM
Sorry I laughed. Couldn't help it.

Pi Robot
09-15-2010, 09:18 AM
With my Arduino snafu behind me, things are going well with the ArbotiX and the 0.2.1 firmware, drivers and ROS code. FYI, I wanted to try getting the current speed and load from the servos, so I added the following two functions to my copy of arbotix.py:


def getSpeed(self, index):
    """ Returns speed in ticks """
    try:
        values = self.read(index, P_PRESENT_SPEED_L, 2)
        val = int(values[0]) + (int(values[1])<<8)
        if val & 0x400 != 0:
            return -(val & 0x3FF)
        else:
            return val
    except:
        return -1

def getLoad(self, index):
    """ Returns load in ticks """
    try:
        values = self.read(index, P_PRESENT_LOAD_L, 2)
        val = int(values[0]) + (int(values[1])<<8)
        if val & 0x400 != 0:
            return -(val & 0x3FF)
        else:
            return val
    except:
        return -1

--patrick
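The sign handling in both helpers is the AX-12's 10-bit sign-magnitude format for present speed and load: bits 0-9 are the magnitude and bit 10 is the direction flag. That decode can be factored into one function (a standalone sketch, not part of arbotix.py):

```python
def decode_signed10(raw):
    """Decode an AX-12 present-speed/load word: bit 10 set means the
    negative direction, bits 0-9 hold the magnitude."""
    return -(raw & 0x3FF) if raw & 0x400 else raw & 0x3FF
```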

Pi Robot
09-16-2010, 01:36 PM
Hey Fergs,

I am using your base_controller.py file with my Serializer by simply changing the cmdVel callback to execute the appropriate Serializer command. (I would use my ArbotiX but I still need the higher current capacity of the Serializer's H-bridges.) Everything works great except I run into a small glitch when running things at faster rates, say > 10Hz. I'm wondering if you have experienced something similar with the ArbotiX as the base controller.

More specifically, I am running two ROS nodes, one to query and publish the Serializer's sensor values (as well as launch your base_controller.py code in its own thread), plus another node that accepts keystrokes and publishes cmd_vel values when I hit the arrow keys. Using these two nodes I can very nicely drive the robot around with the arrow keys while collecting sensor data.

The glitch appears when I run the main Serializer node at rates greater than about 10Hz. (For now, I am always running the base controller thread at 10Hz.) What happens is that sometimes my keystrokes are ignored -- by that I mean the robot keeps moving in the direction of the last keystroke rather than responding to the new command. Using "rostopic echo /cmd_vel" I can see that ROS is responding perfectly, so I'm guessing I am running into either a thread-blocking issue on the serial port, or the Serializer itself doesn't like getting hit too often.

I don't expect you to answer anything specific to the Serializer board but I am just wondering if you have run into something similar with the ArbotiX?

Thanks,
patrick

lnxfergy
09-16-2010, 01:38 PM
I haven't had any issues like that -- and one of my robots here is running at 20hz cmd_vel updates (from the base_local_planner).

-Fergs

Pi Robot
09-16-2010, 05:51 PM
That's good to hear. I'll keep digging deeper with the Serializer but I am assured that when I move my base control over to the ArbotiX I won't have to worry.

--patrick

lnxfergy
09-17-2010, 09:09 AM
...but I am assured that when I move my base control over to the ArbotiX I won't have to worry....

Just a note (mainly for other readers), but we're still at 0.2.1 (0.3.0 should be out soon), and so this is still beta-level software. Although I've been trying to be very careful in adding new features only when others have been fully tested, the entire package still has only about 30 hours of test time on it....

-Fergs

Pi Robot
09-17-2010, 03:51 PM
The glitch appears when I run the main Serializer node at rates greater than about 10Hz. (For now, I am always running the base controller thread at 10Hz.) What happens is that sometimes my keystrokes are ignored. By that I mean the robot keeps moving in the direction of the last keystroke rather than responding to the new command. Using "rostopic echo /cmd_vel" I can see that ROS is responding perfectly so I'm guessing I am running into either a thread-blocking issue on the serial port or the Serializer itself doesn't like getting hit too often.

Looks like my problem was in the value I set for the timeout on the Serializer. I had set it to 5 seconds since I was still thinking like a C-sharper wherein timeouts and sleep values are in milliseconds...

--patrick

Pi Robot
09-20-2010, 08:54 PM
Hey Fergs,

I don't know if this affects you but it threw me for a loop so I'll point it out just in case. As I mentioned earlier in this thread, I am using your base_controller.py file (thanks for the head start!) with my Serializer by just changing the cmdVel callback to use the Serializer's functions. But I was puzzled by the odometry values I was getting on the ROS /odom topic until I realized that the calculations:

d_left = (left - self.enc_left)/self.ticks_meter
d_right = (right - self.enc_right)/self.ticks_meter

were returning 0 for d_left and d_right since all values are integers and my self.ticks_meter (1563) is bigger than the typical delta in encoder counts. So the division was returning zero. By changing the above lines to:

d_left = float((left - self.enc_left)) / self.ticks_meter
d_right = float((right - self.enc_right)) / self.ticks_meter

my problems seem to have gone away.

--patrick
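
For other readers: in Python 2, `/` on two ints truncates, which is exactly how a small encoder delta divided by a large ticks-per-meter count collapses to zero. A quick demonstration (the numbers are illustrative):

```python
delta = 40            # encoder ticks moved since the last update (example)
ticks_meter = 1563    # encoder ticks per meter of travel

d_truncated = delta // ticks_meter      # what Python 2's int/int "/" computes
d_correct = float(delta) / ticks_meter  # cast first: about 0.0256 meters

# d_truncated is 0 -- any update of fewer than 1563 ticks reports no motion.
```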

lnxfergy
09-20-2010, 10:52 PM
self.ticks_meter should be of type float (see the parameter loading code in the __init__ function of base_controller) -- so this shouldn't be an issue in the ArbotiX code -- I'm guessing you might have modified the __init__?

-Fergs

Pi Robot
09-20-2010, 11:44 PM
Of course--I see now. I am pulling in self.ticks_meter from my Serializer driver file rather than the parameter file and I wasn't typing it as float as you are in __init__.

Thanks for the clarification.

--patrick

Pi Robot
09-21-2010, 01:33 PM
I'm going through the ROS navigation and tf tutorials and I have to say, this stuff is brilliant. This will save soooo much time controlling the arms on my robot down the road. Thanks to Mike for pushing me in the right direction and providing so much useful code for getting started. :cool:

--patrick

lnxfergy
09-21-2010, 01:54 PM
I'm going through the ROS navigation and tf tutorials and I have to say, this stuff is brilliant. This will save soooo much time controlling the arms on my robot down the road. Thanks to Mike for pushing me in the right direction and providing so much useful code for getting started. :cool:

--patrick

Yep, the navigation stack is amazing -- as long as you can get adequate sensing tied into the system. Currently that means a lidar for SLAM. If you're only interested in obstacle avoidance and not localization, then a stereo camera setup or PML would work -- some sort of other localization routine would be needed though.

-Fergs

Pi Robot
09-21-2010, 06:07 PM
Yep, the navigation stack is amazing -- as long as you can get adequate sensing tied into the system. Currently that means a lidar for SLAM. If you're only interested in obstacle avoidance and not localization, then a stereo camera setup or PML would work -- some sort of other localization routine would be needed though.

-Fergs

I'm hoping to have the same Hokuyo lidar you have by mid-October, if not sooner. In the meantime, I might try a PML or just do things in simulation (but I'll have to learn Player/Stage first...)

With still some ways to go in the navigation tutorials, I am wondering if I can at least try out the move_base package by faking a point cloud node that shows all clear in all directions (i.e. no obstacles anywhere). Then I could at least test sending the robot a goal to go to point (x,y) and assume a particular pose...

--patrick

lnxfergy
09-21-2010, 07:59 PM
With still some ways to go in the navigation tutorials, I am wondering if I can at least try out the move_base package by faking a point cloud node that shows all clear in all directions (i.e. no obstacles anywhere). Then I could at least test sending the robot a goal to go to point (x,y) and assume a particular pose...

--patrick

You could hack the PML sensor to not request values from the ArbotiX and instead just post all 0's -- that would do the same thing.

-Fergs

Pi Robot
09-22-2010, 07:27 AM
You could hack the PML sensor to not request values from the ArbotiX and instead just post all 0's -- that would do the same thing.

-Fergs

Great idea--I'll give that a try.

Pi Robot
09-23-2010, 12:37 PM
I haven't updated any of our docs for 0019 -- but I am developing against it right now, and it doesn't appear to have any issues as not much changed in the IDE.
-Fergs

I just reloaded the ArbotiX ROS firmware after setting the baud rate to 57600 and I wanted to record what worked for me using the 0019 Arduino framework.

The key steps were:

1. Create a subfolder called "sketchbook" in the root Arduino folder (since it doesn't exist by default)
2. Create a "hardware" subfolder of the sketchbook folder.
3. Bring up the Arduino application and click on File->Preferences and set the Sketchbook location to the sketchbook folder created above.

The rest is the same as the instructions in the "Getting Started" section of the ArbotiX website; i.e.

4. From the ArbotiX release (I used 0013), copy the "arbotix" subfolder into the "sketchbook/hardware" folder created above.
5. From the ArbotiX release, copy the "libraries" subfolder into the "sketchbook" folder created above (not into the hardware subfolder).

etc.

--patrick

Pi Robot
09-24-2010, 06:27 PM
You could hack the PML sensor to not request values from the ArbotiX and instead just post all 0's -- that would do the same thing.

-Fergs

Hey Fergs,

I've gone through the navigation stack tutorial and now I'm ready to give it a try on my robot. Like you suggested, I'm running a fake PML sensor node with the ranges set to 0's. And I'm running my Serializer node (based on your base_controller) to publish/subscribe to the /odom and /cmd_vel topics. The navigation tutorial assumes you have a map file to work from which I don't have. Is it possible to do a test without a map? And if so, how would I modify the costmap files and/or launch file. My current files look like this:


global_costmap:
  global_frame: /map
  robot_base_frame: base_link
  update_frequency: 5.0
  static_map: true

local_costmap:
  global_frame: /odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: false
  rolling_window: true
  width: 6.0
  height: 6.0
  resolution: 0.05

base_local_planner_params.yaml:

TrajectoryPlannerROS:
  max_vel_x: 0.2
  min_vel_x: 0.07
  max_rotational_vel: 1.0
  min_in_place_rotational_vel: 0.4

  acc_lim_th: 3.2
  acc_lim_x: 2.5
  acc_lim_y: 2.5

  holonomic_robot: false

move_base.launch:

<launch>
  <master auto="start"/>

  <!-- Run the map server -->
  <node name="map_server" pkg="map_server" type="map_server" args="$(find my_map_package)/my_map.pgm my_map_resolution"/>

  <!-- Run AMCL -->
  <include file="$(find amcl)/examples/amcl_diff.launch" />

  <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
    <rosparam file="$(find pi_robot)/costmap_common_params.yaml" command="load" ns="global_costmap" />
    <rosparam file="$(find pi_robot)/costmap_common_params.yaml" command="load" ns="local_costmap" />
    <rosparam file="$(find pi_robot)/local_costmap_params.yaml" command="load" />
    <rosparam file="$(find pi_robot)/global_costmap_params.yaml" command="load" />
    <rosparam file="$(find pi_robot)/base_local_planner_params.yaml" command="load" />
  </node>
</launch>


Is there a way to create a simple map file that is just a big empty room? Or not run move_base without the map server at all?

Thanks,
patrick

Pi Robot
09-24-2010, 06:42 PM
OK, I just found the instructions for downloading a sample bag file and creating a map from it so I now have a map. But it would still be nice to run without a map, if that makes sense, or with a "blank" map that simply allows me to move the robot to coordinates <x, y> relative to its starting point. (Again, if that makes sense...)

--patrick

lnxfergy
09-24-2010, 07:54 PM
OK, I just found the instructions for downloading a sample bag file and creating a map from it so I now have a map. But it would still be nice to run without a map, if that makes sense, or with a "blank" map that simply allows me to move the robot to coordinates <x, y> relative to its starting point. (Again, if that makes sense...)

--patrick

Just configure the global costmap to be rolling_window=true, static_map=false (you may also have to play with the frames, and size parameters)

-Fergs
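
Spelled out, that suggestion would look something like this in global_costmap_params.yaml (a sketch -- the window size is whatever area you want tracked, and the global frame depends on your tf tree):

```yaml
global_costmap:
  global_frame: /odom        # no static /map needed without a map
  robot_base_frame: base_link
  update_frequency: 5.0
  static_map: false          # don't request a map from map_server
  rolling_window: true       # keep the costmap centered on the robot
  width: 10.0                # meters (example values)
  height: 10.0
  resolution: 0.05
```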

Pi Robot
09-24-2010, 08:59 PM
Awesome--that's exactly what I needed. The one missing element seems to be that I don't have a /world frame defined. The base_controller sets up a transform between /odom and /base_link but how or where do I introduce the /world frame?

--patrick

lnxfergy
09-24-2010, 09:02 PM
Awesome--that's exactly what I needed. The one missing element seems to be that I don't have a /world frame defined. The base_controller sets up a transform between /odom and /base_link but how or where do I introduce the /world frame?

--patrick

Your localization routine would typically broadcast the transform between /odom and /world (or /map). You can fake this with either fake_localization (or just a static_transform_publisher)

-Fergs

Pi Robot
09-24-2010, 09:07 PM
Ah, I never would have guessed there was a package called fake_localization. This should get me going. Thanks again!

--patrick

Pi Robot
09-25-2010, 08:45 PM
Your localization routine would typically broadcast the transform between /odom and /world (or /map). You can fake this with either fake_localization (or just a static_transform_publisher)

-Fergs

Hey Fergs,

It's been a long day of trial and error and reading the ROS wiki to get the nav stack *almost* working. I can now move my robot in rviz by setting a 2D Nav Goal with the mouse. However, even when I set the goal to simply move straight ahead a small distance, the robot does a bunch of swerving before eventually getting to the goal location.

I am using rolling_window = true and static_map = false since I couldn't get things working even with a blank map. I am using the PML node with all the ranges set to 5 and they show up in rviz as a nice green arc well out from the robot footprint. The only clue I have that something might be amiss is the following warning that I get continually from my move_base.launch window:

[ WARN] [1285464922.948809774]: Costmap2DROS transform timeout. Current time: 1285464922.9488, global_pose stamp: 1285464922.0486, tolerance: 0.3000

I am making /odom and /map children of /world via static transforms in my launch file which looks like this:


<launch>
  <master auto="start"/>

  <param name="robot_description" command="cat $(find pi_robot)/urdf/pi_robot_urdf.xml" />

  <!-- Run the map server -->
  <node name="map_server" pkg="map_server" type="map_server" args="$(find pi_robot)/blank_map.yaml"/>

  <!-- Run AMCL -->
  <include file="$(find amcl)/examples/amcl_diff.launch" />

  <!-- Create static transforms between /world and the /odom, /map and /base_scan frames -->
  <node pkg="tf" type="static_transform_publisher" name="odom_map_broadcaster" args="-5 -5 0 0 0 0 /world /odom 100" />
  <node pkg="tf" type="static_transform_publisher" name="world_map_broadcaster" args="-5 -5 0 0 0 0 /world /map 100" />
  <node pkg="tf" type="static_transform_publisher" name="scan_base_broadcaster" args="0 0 0 0 0 0 /base_link /base_scan 100" />

  <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
    <rosparam file="$(find pi_robot)/costmap_common_params.yaml" command="load" ns="global_costmap" />
    <rosparam file="$(find pi_robot)/costmap_common_params.yaml" command="load" ns="local_costmap" />
    <rosparam file="$(find pi_robot)/local_costmap_params.yaml" command="load" />
    <rosparam file="$(find pi_robot)/global_costmap_params.yaml" command="load" />
    <rosparam file="$(find pi_robot)/base_local_planner_params.yaml" command="load" />
  </node>
</launch>

Any ideas why the robot is doing such a dance on the way to its goal with no obstacles present?

--patrick

lnxfergy
09-26-2010, 01:24 AM
[ WARN] [1285464922.948809774]: Costmap2DROS transform timeout. Current time: 1285464922.9488, global_pose stamp: 1285464922.0486, tolerance: 0.3000



This is probably part of your culprit - you need to set tolerance for the costmap (see the wiki for the exact configuration) to either 0 or > 1s since the PML broadcasts laser scans slower than a laser scanner (1hz vs 10-40).

Additionally, you'll have to tune the parameters for TrajectoryPlannerROS -- to match the acceleration/decel capabilities of your platform (the defaults are for PR2, which is likely way higher than what Pi can do -- so the robot can't achieve the issued commands).

-Fergs
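
In costmap terms, that first point is the `transform_tolerance` parameter in the common costmap configuration. A sketch (1.2 s is just an example comfortably above the PML's ~1 Hz scan period; the warning above shows the default 0.3 s being exceeded):

```yaml
# costmap_common_params.yaml (fragment)
transform_tolerance: 1.2   # seconds of transform staleness to tolerate
```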

Pi Robot
09-26-2010, 01:04 PM
Thanks Fergs,

Alas, no combination of tolerance and acceleration parameters seems to fix the problem and the timeout warnings remain. I've also scoured the ROS users list and while some people have had similar problems, none of the posted suggestions worked for me. I have verified that my /cmd_vel topic is working correctly with my base_controller.py node by manually publishing velocity commands. And tf_monitor seems to indicate that all frames are being broadcast at decent rates and small delays. However, even the simple_navigation_goals node given in the wiki does not work to move the robot 1 meter forward in the correct way-- the robot shoots forward for about 1 meter, but then goes into a random wandering mode...

Regarding the PML publishing rate, since I am faking the ranges at the moment, can I not publish them as fast as I like?

I'll post to the ROS users list next, but the only other clue I have at the moment is the following message that appears repeatedly when looking at the cost map debug messages:

MessageNotifier [topic=base_scan, target=/odom /base_link ]: Successful Transforms: 4620, Failed Transforms: 29541, Discarded due to age: 0, Transform messages received: 7242, Messages received: 4622, Total dropped: 0

--patrick

Pi Robot
09-26-2010, 01:29 PM
Actually, setting the transform_tolerance to 1s can suppress the costmap timeout errors, but the erratic navigation behavior still remains.

--p

lnxfergy
09-26-2010, 05:03 PM
Actually, setting the transform_tolerance to 1s can suppress the costmap timeout errors, but the erratic navigation behavior still remains.

--p

Have you tried playing with the goal tolerances for the Base Local Planner? I had to increase the allotted tolerance on the Creates (I think most diff-drive bases probably need to -- otherwise, a small overshoot leads to crazy movements trying to correct, since you can't strafe).

-Fergs

Pi Robot
09-26-2010, 07:25 PM
Yeah, I even set the tolerances as high as 1.0 for both xy and yaw which should be equivalent to "get within a country mile of the target." What's more, my max_vel_x and max_rotational_vel parameters seem to be ignored since even if I set them to really small values (e.g. 0.05 and 0.3), the robot still shoots off after the target at about 0.5 m/s. The parameters themselves are getting set correctly as can be verified with rosparam dump. And I saw your post on the ROS users list about using smaller values for controller_frequency so I have that set to 5. Still no joy.

I'm not sure what else to try. Below are my latest param file settings in case something obvious jumps out at you:

base_local_planner_params.yaml:

max_vel_x: 0.05
min_vel_x: 0.005
max_rotational_vel: 0.3
min_in_place_rotational_vel: 0.1
controller_frequency: 5.0
acc_lim_th: 0.1
acc_lim_x: 0.1
acc_lim_y: 0.1
xy_goal_tolerance: 1.0
yaw_goal_tolerance: 1.0
holonomic_robot: false

costmap_common_params.yaml:

obstacle_range: 2.5
raytrace_range: 3.0
footprint: [[-0.13, 0.16], [0.13, 0.16], [0.13, -0.16], [-0.13, -0.16]]
#robot_radius: ir_of_robot
inflation_radius: 0.55
transform_tolerance: 1.0
observation_sources: base_scan
base_scan: {sensor_frame: base_link, data_type: LaserScan, topic: base_scan, marking: true, clearing: true}

global_costmap_params.yaml:

global_costmap:
  global_frame: /map
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 0
  static_map: true
  rolling_window: false

local_costmap_params.yaml:

local_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 0
  static_map: false
  rolling_window: true

and my launch file for move_base:

<launch>
  <master auto="start"/>

  <param name="robot_description" command="cat $(find pi_robot)/urdf/pi_robot_urdf.xml" />

  <!-- Run the map server -->
  <node name="map_server" pkg="map_server" type="map_server" args="$(find pi_robot)/blank_map.yaml"/>

  <!-- Run AMCL -->
  <include file="$(find amcl)/examples/amcl_diff.launch" />

  <!-- Create a static transform between the /map frame and /world -->
  <node pkg="tf" type="static_transform_publisher" name="odom_map_broadcaster" args="0 0 0 0 0 0 /world /odom 100" />
  <node pkg="tf" type="static_transform_publisher" name="world_map_broadcaster" args="0 0 0 0 0 0 /world /map 100" />

  <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
    <rosparam file="$(find pi_robot)/costmap_common_params.yaml" command="load" ns="global_costmap" />
    <rosparam file="$(find pi_robot)/costmap_common_params.yaml" command="load" ns="local_costmap" />
    <rosparam file="$(find pi_robot)/local_costmap_params.yaml" command="load" />
    <rosparam file="$(find pi_robot)/global_costmap_params.yaml" command="load" />
    <rosparam file="$(find pi_robot)/base_local_planner_params.yaml" command="load" />
  </node>
</launch>

Finally, my version of your PML node to simulate the laser scan:

import roslib; roslib.load_manifest('pi_robot')
import rospy
from sensor_msgs.msg import LaserScan
from tf.broadcaster import TransformBroadcaster

rospy.init_node("base_scan")
scanPub = rospy.Publisher('base_scan', LaserScan)
scanBroadcaster = TransformBroadcaster()

scan_rate = 1
rate = rospy.Rate(scan_rate)

rospy.loginfo("Started base scan at " + str(scan_rate) + " Hz")

while not rospy.is_shutdown():
    # scanBroadcaster.sendTransform(
    #     (0, 0, 0),
    #     (0, 0, 0, 1),
    #     rospy.Time.now(),
    #     "base_scan",
    #     "base_link"
    # )

    ranges = list()

    for i in range(30):
        ranges.append(5)

    scan = LaserScan()
    scan.header.stamp = rospy.Time.now()
    scan.header.frame_id = "base_link"
    scan.angle_min = -1.57
    scan.angle_max = 1.57
    scan.angle_increment = 0.108275862
    scan.scan_time = scan_rate
    scan.range_min = 0.5
    scan.range_max = 6.0
    scan.ranges = ranges
    scanPub.publish(scan)

    rate.sleep()

lnxfergy
09-26-2010, 07:55 PM
Have you done a "rostopic echo cmd_vel" to confirm that base_local_planner isn't listening to the settings? If it's actually publishing low velocities, there may be an issue in the base_controller....

-Fergs

Pi Robot
09-26-2010, 08:44 PM
Good idea. With max_vel_x set to 0.2, I am seeing linear x values as high as 0.499 on /cmd_vel. And I just now confirmed that my base_controller is getting these high values. I'm testing by publishing the following goal:

rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped '{ header: { frame_id: "odom" }, pose: { position: { x: 0.5, y: 0, z: 0 }, orientation: { x: 0, y: 0, z: 0, w: 1 } } }'

After publishing this goal, the first value I see appear on /cmd_vel is:

linear:
  x: 0.25
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0168421052632

which already exceeds my upper limit of 0.2. The linear x value then quickly jumps up to 0.499 before falling back down again.

--patrick
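
Until the parameter problem is understood, one defensive option is to clamp in the base controller's cmd_vel callback so a misbehaving planner can't overdrive the motors. A sketch (the limit values are examples, and this is a workaround, not a fix for the planner configuration):

```python
MAX_VEL_X = 0.2    # m/s, the platform's real limit (example)
MAX_VEL_TH = 1.0   # rad/s (example)

def clamp(value, limit):
    """Constrain value to the range [-limit, +limit]."""
    return max(-limit, min(limit, value))

# Applied to the surge observed above:
x = clamp(0.499, MAX_VEL_X)      # clamped down to 0.2
th = clamp(0.0168, MAX_VEL_TH)   # unchanged, already within range
```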

Pi Robot
09-26-2010, 10:02 PM
Well, don't ask me why, but the weird velocity surges have disappeared--for now at least. What's more, I can get the robot to move toward a goal straight ahead. But goals off to the side or behind it still send it off wandering. The only hint now that something is amiss is that I am repeatedly getting the warning message:

Control loop missed its desired rate of 5.0000Hz... the loop actually took 1.0779 seconds

even though I am running the controller_frequency at only 5.0 Hz as you can see.

--patrick

Pi Robot
09-26-2010, 10:58 PM
Hey Fergs,

A quick question about ROS frames: is a clockwise rotation in the plane considered a positive or negative rotation? Or put another way, does the quaternion {x: 0, y: 0, z: 0.5, w: 0} represent a clockwise or counter clockwise rotation of 90 degrees?

--patrick
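
For reference on the convention question: ROS follows the right-hand rule (REP 103), so positive rotation about +z is counterclockwise when viewed from above. A yaw-only quaternion can be computed by hand:

```python
import math

def quaternion_from_yaw(yaw):
    """Return the (x, y, z, w) quaternion for a rotation of `yaw`
    radians about the +z axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# A +90 degree (counterclockwise) turn gives z and w of about 0.707;
# a clockwise turn of the same size just flips the sign of z.
ccw = quaternion_from_yaw(math.pi / 2)
cw = quaternion_from_yaw(-math.pi / 2)
```

(Note that {x: 0, y: 0, z: 0.5, w: 0} is not a unit quaternion; a quarter turn about z is {z: 0.707, w: 0.707}.)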

Pi Robot
09-26-2010, 11:38 PM
Success!!! Pi Robot can now get to a goal I set in rviz or by way of the command line. However, I still have some fine tuning to do since he still does some random fits and starts along the way to a goal--including suddenly going backwards for no apparent reason. This behavior was actually reported on the ROS users group a couple of months ago and it was never figured out--it just went away when the user kept randomly tweaking the parameters. The parameters I am now using for the base planner are:


TrajectoryPlannerROS:
  max_vel_x: 0.2
  min_vel_x: 0.05
  max_rotational_vel: 0.2
  min_in_place_rotational_vel: 0.15
  acc_lim_th: 4.0
  acc_lim_x: 3.0
  acc_lim_y: 3.0
  yaw_goal_tolerance: 0.2
  xy_goal_tolerance: 0.2
  holonomic_robot: false

Note how I have the acceleration limits set fairly high--turns out setting them lower makes the jerkiness of the movement worse. Go figure.

--patrick

Pi Robot
09-27-2010, 11:09 AM
Fergs, have you ever run into the situation where the base_local_planner issues velocity commands well in excess of the max_vel_x and max_rotational_vel parameters? I have mine set at 0.2 and 0.15 respectively (confirmed with rosparam get), and I am seeing x velocities as high as 0.8 and angular velocities as high as 1.3! And this happens on virtually every run toward a goal. Consequently, the robot is going all over the place at crazy speeds and while it seems to eventually reach the goal, clearly something is amiss...

P.S. I posted this problem to the ROS users list but I wanted to see if you had ever witnessed this on any of your robots.

--patrick

lnxfergy
09-27-2010, 11:19 AM
Fergs, have you ever run into the situation where the base_local_planner issues velocity commands well in excess of the max_vel_x and max_rotational_vel parameters? I have mine set at 0.2 and 0.15 respectively (confirmed with rosparam get), and I am seeing x velocities as high as 0.8 and angular velocities as high as 1.3! And this happens on virtually every run toward a goal. Consequently, the robot is going all over the place at crazy speeds and while it seems to eventually reach the goal, clearly something is amiss...

P.S. I posted this problem to the ROS users list but I wanted to see if you had ever witnessed this on any of your robots.

--patrick

I've not seen such a situation, but I also didn't look at the raw cmd_vel -- just the output.

-Fergs

Pi Robot
09-27-2010, 11:56 AM
Thanks--I was afraid of that. I'm mildly suspicious of the encoder readings I'm getting from the Serializer and I'm wondering if that might be at the heart of the problem though I would think that the base_local_planner would respect the max velocity parameters regardless. We'll see if I find anything from the ROS users list...

--patrick

lnxfergy
09-27-2010, 12:08 PM
Thanks--I was afraid of that. I'm mildly suspicious of the encoder readings I'm getting from the Serializer and I'm wondering if that might be at the heart of the problem though I would think that the base_local_planner would respect the max velocity parameters regardless. We'll see if I find anything from the ROS users list...

--patrick

Well, that was exactly why I asked for the "rostopic echo" of cmd_vel. There's obviously some sort of issue with parameter loading in base_local_planner (be it your config files, or an actual bug). Once you sort that out, we'll have to take a look at the Serializer stuff.

-Fergs

Pi Robot
09-27-2010, 12:43 PM
In the meantime, I think I've tracked my Serializer problem down to the XBee communication link. Seems I am getting frequent timeouts when using the XBee radios at 57600 but I get zero timeouts when using a USB cable even when running encoder queries at 100Hz. I'm getting timeouts with the XBees even at a 1Hz query rate. Any tricks on configuring the XBees and PySerial to minimize timeouts? My radios are both the regular version and are about 6 feet apart. I've got the one on the PC set to broadcast and the one on the Serializer to send back only to the PC radio's ID.

--patrick

Pi Robot
09-27-2010, 01:17 PM
OK, that problem is solved--turns out setting the PC-based XBee radio to broadcast mode doesn't work very well. I just now tried setting it to beam only to the Serializer's radio ID and the timeouts have disappeared. I had hoped to use broadcast mode to be able to send to both the Serializer and the ArbotiX at the same time but I won't worry about that right now.

--patrick

Pi Robot
09-27-2010, 05:20 PM
Eureka! While it is still unknown why the base_local_planner ignores max velocity settings when it gets bad odometry, after fixing the XBee timeout issue and slowing my base_controller publishing rate down from 20 Hz to 10 Hz, Pi Robot is now behaving perfectly with goals set in rviz. Dang, that was a long haul and many thanks to Fergs for shortening the route considerably.

BTW, has anyone tried the PML with ROS yet? Since I won't have a laser scanner for a few more weeks, I'm tempted to give it a try just to see if I can get obstacle avoidance to work.

--patrick

Pi Robot
09-29-2010, 06:35 PM
Yes, it's the 700K. And yes, there's been discussion of adding a shorter range sensor (hanging off the side).

The PML code is undergoing a huge overhaul right now -- the first version had a lot of hard coded stuff, the newer version is highly customizable, both at start time and using dynamic_reconfigure. We can now change how many readings to take in a scan, what the min/max angle of a scan is, etc -- and hopefully, the interface will be very close to that of the hoyuko_node package.

There will also be support for changing what kind of sensor is attached (the conversion from raw 10-bit readings to distance is already done in the PC side code, but support for additional sensor types is yet to be implemented). This is mainly intended for the fact that a 700K is perfect for human-scale environments, but switching to a GP2D12 would be awesome for a small Trinity Fire Fighting robot. Support for multiple sensors may be implemented -- but it's going to take some major changes in the firmware, as currently, we'd get dangerously close to filling the register table address space with more sensors.

As for timing: the IR sensors update at about a 30hz rate. Ours runs at a slightly slower 20hz rate. It moves the sensor to a new position, waits until the next update, reads the current sensor value, and moves to the next position. As long as we don't make the transit too far, everything runs pretty well (there is a chance that the scan could be skewed by up to 6 degrees, but that's a fairly small trade-off for a cheap little device). All servo control and sensor reading is handled by the ArbotiX firmware; the PC just sets up the min/max angle and readings, and then starts and stops scanning. It can read back the current scan asynchronously from the scanning operation.

However, the major problem with the current PML implementation is this: it's really good sitting still, and sorta sucks when the robot starts to move (since the robot is not in the same position throughout the very long scan period). ROS has some ability (via tf) to deal with sensor data coming in over a long time, and I've got a number of items we're going to be implementing to try and remedy this, but on higher-speed platforms it may still not work well -- I really don't have much data to go on here yet, as typically the delay is between scans (in the case of building a point cloud from a tilting laser), not within the scan itself. Regardless, the quantity of data is much smaller (30 readings per second, as opposed to ~6000 on the low end lidars, currently limited by the IR sensor itself).

-Fergs

Hey Fergs,

Now that I have the basic ROS navigation stuff running, I was playing around with some PML code until I get my lidar unit next month. Is your PML driver already in the ArbotiX firmware? Or is that something you are still working on?

--patrick

lnxfergy
09-29-2010, 08:17 PM
Hey Fergs,

Now that I have the basic ROS navigation stuff running, I was playing around with some PML code until I get my lidar unit next month. Is your PML driver already in the ArbotiX firmware? Or is that something you are still working on?

--patrick

All of the PML stuff is in 0.2.1. There's going to be a number of major improvements shortly, but given your top speed, I think you shouldn't have many problems with the existing version.

-Fergs

Pi Robot
09-30-2010, 07:21 AM
Cool. I'll give it a try today.

--patrick

Pi Robot
09-30-2010, 05:35 PM
Just to see if it could be done, I followed Mike's lead and set up a PML in ROS using a Ping sonar sensor and a HiTec servo. Here are my first two scans as seen in rviz. The little axes in the figure indicate the position and orientation of the robot. In the first image, the robot is in front of a door leading to the outside with a wall in the distance. In the second (larger scale) image, there is an office trash bin directly in front of the robot. Like Mike says, the robot has to move rather slowly to make use of these properly with the base planner since each scan takes 1 second and only has 20 readings. But it ain't bad at least as a simple demonstration and an exercise in learning ROS.

http://forums.trossenrobotics.com/attachment.php?attachmentid=2135&stc=1&d=1285885726

http://forums.trossenrobotics.com/attachment.php?attachmentid=2136&stc=1&d=1285885944


--patrick

RobotAtlas
09-30-2010, 06:39 PM
Patrick,

One guy at ros-users pointed me to this resource about using limited sensors for SLAM:
http://www.cs.rpi.edu/~beevek/slam.html

It was written in 2006, so I wonder if there are any newer resources/code/etc.
I think using the method in the article and then accumulating more data in a bigger buffer and publishing the whole thing to gmapping might work, but 20 points/sec is really not that much.

For your bigger robot, having 8 fixed sensors all publishing at 30Hz would work better.
Sonars have that extra advantage over laser that they see glass and black objects.

If you don't have either of those, laser looks like the way to go, especially for your big(ger) robot.

I'm seriously considering purchasing a Neato vacuum instead of a Hokuyo though. You get a whole SLAM-based mobile robot for $400 instead of just one sensor for $1200. Hackability of the Neato is a big question though. I'll start a new thread on it.

Pi Robot
10-01-2010, 08:01 AM
Thanks for the link. As it turns out, I was fortunate enough to get some funding for a laser scanner and stereo camera, both of which should arrive in the next few weeks. Furthermore, I have zero hardware hacking skills so I leave it up to the electronics/data geniuses to dig into the Neato. But yeah, it would be a great platform if you could get the scan data out and send motor commands in.

--patrick

Peter_heim
10-13-2010, 04:54 AM
Hi All
Just received my ArbotiX controller. I loaded the 0.2.1 package; when I do a roslaunch
I get the following error:
[INFO] 1286963159.458463: Starting ArbotiX-ROS on port /dev/ttyUSB0
[]
[]
[INFO] 1286963159.510477: Started joint_controller 'joint_controller' controlling: []
[INFO] 1286963159.539447: Started pml sensor 'pml' using servo: 5
Fail Read
It's the same without the PML.
When the servo was ID 1, I could move it with controllerGUI.py.

regards peter

lnxfergy
10-13-2010, 07:22 AM
Hi All
Just received my ArbotiX controller. I loaded the 0.2.1 package; when I do a roslaunch
I get the following error:
[INFO] 1286963159.458463: Starting ArbotiX-ROS on port /dev/ttyUSB0
[]
[]
[INFO] 1286963159.510477: Started joint_controller 'joint_controller' controlling: []
[INFO] 1286963159.539447: Started pml sensor 'pml' using servo: 5
Fail Read
It's the same without the PML.
When the servo was ID 1, I could move it with controllerGUI.py.

regards peter

Peter,

A) Have you uploaded the ROS firmware (with PML support) onto the ArbotiX? The default PyPose sketch that ships on the board supports neither 1) PML nor 2) sync read. I think it's the sync read aspect that is failing. (You can disable sync_read by setting the parameter /arbotix/use_sync to False, which should stop the "Fail Read" but still won't allow you to use PML.)

B) If you're still having trouble after uploading the firmware, please post the contents of your YAML config file.

-Fergs

Peter_heim
10-13-2010, 07:54 AM
Hi Fergs
I uploaded the ROS firmware 0.2.1. The one in the trunk I can't compile (dynamixel_bus_config was not declared in this scope).
My latest YAML file:

port: /dev/ttyUSB0
rate: 15
sensors: {
"pml": {type: pml}
}
controllers: {
"j_con": {type: joint_controller,joints:["head_pan","head_tilt"]},
"b_con": {type: base_controller,base_width: 0.145,ticks_meter: 26145}
}
dynamixels: {
head_pan: {id: 5, invert: 1},
head_tilt: {id: 2, max_angle: 100, min_angle: -100}
}
I have tried with just:
port: /dev/ttyUSB0
rate: 15
sensors: {
"pml": {type: pml}
}

regards peter

lnxfergy
10-13-2010, 09:00 AM
Hi Fergs
I uploaded the ROS firmware 0.2.1. The one in the trunk I can't compile (dynamixel_bus_config was not declared in this scope).
My latest YAML file:

port: /dev/ttyUSB0
rate: 15
sensors: {
"pml": {type: pml}
}
controllers: {
"j_con": {type: joint_controller,joints:["head_pan","head_tilt"]},
"b_con": {type: base_controller,base_width: 0.145,ticks_meter: 26145}
}
dynamixels: {
head_pan: {id: 5, invert: 1},
head_tilt: {id: 2, max_angle: 100, min_angle: -100}
}
I have tried with just:
port: /dev/ttyUSB0
rate: 15
sensors: {
"pml": {type: pml}
}

regards peter

Yeah, the trunk currently depends on a newer version of the BioloidController library (which is in the trunk of the arbotix package).

The YAML looks fine (although you could probably take out the base_controller if you're not actually using it; if you are using it, you'll want to check the base width and ticks per meter, as they appear to still be the defaults).
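For reference, ticks_meter is just encoder ticks per wheel revolution divided by wheel circumference. A quick sketch with made-up numbers (substitute your own motor/encoder specs -- these are not defaults for any particular motor):

```python
import math

# Hypothetical example values -- check your own hardware.
counts_per_motor_rev = 100   # quadrature counts per motor-shaft revolution
gear_ratio = 30              # e.g. a 30:1 gearhead
wheel_diameter = 0.073       # meters

ticks_per_wheel_rev = counts_per_motor_rev * gear_ratio
ticks_meter = ticks_per_wheel_rev / (math.pi * wheel_diameter)
print(int(round(ticks_meter)))  # ~13081 for these numbers
```

If odometry reports the robot traveling farther than it really does, ticks_meter is too small, and vice versa; driving a measured straight line is an easy sanity check.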

How are you connected: XBee? FTDI cable?

-Fergs

Peter_heim
10-13-2010, 03:23 PM
Hi Fergs
I use an FTDI cable. Re the YAML file: it didn't work using cut-and-paste from the web page; it gave an error until I put some spaces after the colons.
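That colon-spacing requirement is standard YAML, for what it's worth: a colon only starts a key/value mapping when it's followed by whitespace. A quick check with PyYAML shows the difference:

```python
import yaml  # PyYAML

# With a space after the colon, you get a mapping...
print(yaml.safe_load("rate: 15"))  # {'rate': 15}

# ...without it, the whole line parses as one plain string.
print(yaml.safe_load("rate:15"))   # 'rate:15'
```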

The other thing: I can't find a reference for setting up sensors, like which sensor is on which port.

regards peter

lnxfergy
10-13-2010, 03:41 PM
Hi Fergs
I use an FTDI cable. Re the YAML file: it didn't work using cut-and-paste from the web page; it gave an error until I put some spaces after the colons.

The other thing: I can't find a reference for setting up sensors, like which sensor is on which port.

regards peter

Ok, so, what exactly is your robot setup? How many servos, and what are they used for? Just so we can be sure we aren't missing something here. It seems this should work...

As for sensor setup, the PML currently just reads Analog 0, and the analog->distance conversion function expects the ultra-long range IR sensor. Future releases will allow you to configure which port it uses (as well as sensor type), but this requires a firmware revision (which is currently in the works).

More generally, we don't have much support for sensors yet, except the ability to get/set digital and get analog readings using the GetDigital/Analog services. There is no YAML setup related to this; the services are always available. In the future, we'll have 2 major upgrades: a sensor type that broadcasts a port value at some frequency, and a sensor type for converting IR/sonar readings to range broadcasts (I'm waiting on the final ROS std_msgs range message definition before proceeding on this second item, as it appears to be very close to release).

-Fergs

Peter_heim
10-13-2010, 09:24 PM
Hi Fergs
My setup is one AX-12 servo (ID 5) for the PML, one IR sensor on port 0, and the FTDI cable (motors and encoders will be added on the weekend). I'm running Ubuntu 10.04; I loaded the firmware (release 0.2.1 for the controller) using Windows 7 32-bit.

regards peter

lnxfergy
10-13-2010, 11:12 PM
Hi Fergs
My setup is one AX-12 servo (ID 5) for the PML, one IR sensor on port 0, and the FTDI cable (motors and encoders will be added on the weekend). I'm running Ubuntu 10.04; I loaded the firmware (release 0.2.1 for the controller) using Windows 7 32-bit.

regards peter

Ok, here we go, try this for a YAML (you may need to adjust the port name of course):

port: /dev/ttyUSB0
use_sync: False
sensors: {
"pml": {type: pml, id: 5}
}

We need that use_sync to get around a small quirk (the main node will try to do a sync_read for an empty set of servos... obviously, such a read fails); if we turn off sync read, the list is empty, and it doesn't try to do any reads. (I've added a ticket for 0.3.0 to fix this.)
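The quirk is the classic empty-collection edge case. A sketch of the kind of guard that would fix it (a hypothetical helper, not the actual arbotix driver code):

```python
def read_servo_positions(device, servo_ids):
    """Batched servo read that tolerates an empty servo list.
    `device` is anything exposing a sync_read(ids) method."""
    if not servo_ids:
        return []  # nothing to read; skip the doomed sync_read entirely
    return device.sync_read(servo_ids)
```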

-Fergs

Peter_heim
10-14-2010, 02:19 AM
Hi Fergs,
Just tried the above; now I only get 1 read fail, not the 15 I had previously:

core service [/rosout] found
process[arbotix-1]: started with pid [2733]
[INFO] 1287040221.553928: Starting ArbotiX-ROS on port /dev/ttyUSB0
[]
[]
[INFO] 1287040221.610492: Started joint_controller 'joint_controller' controlling: []
[INFO] 1287040221.638934: Started pml sensor 'pml' using servo: 5
Fail Read

If I remove the PML part then I get no errors.

regards peter

Peter_heim
10-14-2010, 05:55 AM
Hi Fergs
I'm replacing the HB25 motor controllers with VNH2SP30 motor controllers. How do I wire them to the ArbotiX?

regards Peter

Pi Robot
10-14-2010, 08:51 AM
Hi Peter and Fergs,

I just got back on this thread and tried the YAML file Fergs provided above, and it works fine with my setup, meaning I get no read failures and I can turn the PML servo on and off using service calls to /EnablePML and /DisablePML. The one possible glitch I am seeing is that the panning servo seems to be moving only 90 degrees in total (about -45 degrees to +45 degrees) rather than the -90 to +90 that seems to be specified in pml.py. Does that make sense?

--patrick

Pi Robot
10-14-2010, 09:09 AM
Hey Fergs,

Would you happen to know the distance formulas for a Sharp 2Y0A02 and/or a Sharp GP2D12? These are the only two IR sensors I have at the moment, so I thought I might plug their distance formulas into pml.py if you have 'em handy.

--patrick

lnxfergy
10-14-2010, 09:25 AM
I just got back on this thread and tried the YAML file Fergs provided above, and it works fine with my setup, meaning I get no read failures and I can turn the PML servo on and off using service calls to /EnablePML and /DisablePML. The one possible glitch I am seeing is that the panning servo seems to be moving only 90 degrees in total (about -45 degrees to +45 degrees) rather than the -90 to +90 that seems to be specified in pml.py. Does that make sense?

It makes sense what you are saying -- I just have no idea why it's doing that. I'll test it out here later today to see if I can replicate the problem.


Would you happen to know the distance formulas for a Sharp 2Y0A02 and/or a Sharp GP2D12? These are the only two IR sensors I have at the moment, so I thought I might plug their distance formulas into pml.py if you have 'em handy.

Check the source code for the SharpIR library within our RoboControllerLib -- it has formulas for all the sensor models, in a similar format to the PML code. I plan to eventually integrate it, and a parameter to change sensor models -- it just hasn't happened yet. If you do integrate it (will require converting C code to Python) and test it out, please send over the patch, and I'll gladly integrate it into our next release (along with the parameter to change between models).
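In the meantime, here's the sort of conversion involved, assuming a 10-bit ADC with a 5V reference. The GP2D12 fit below is the widely circulated 6787/(raw - 3) - 4 linearization (popularized by Acroname); treat the constants as approximate, and pull the 2Y0A02 curve from the SharpIR source or its datasheet rather than from memory:

```python
def gp2d12_cm(raw):
    """Approximate range in cm for a Sharp GP2D12 (10-80cm sensor)
    from a raw 10-bit ADC reading at a 5V reference. Values outside
    the sensor's 10-80cm band are unreliable."""
    if raw <= 3:
        return float('inf')  # avoid division by zero on no-signal readings
    return 6787.0 / (raw - 3.0) - 4.0
```

Closer objects give larger voltages, so distance falls as the raw reading rises; gp2d12_cm(300) comes out around 19cm with this fit.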

-Fergs

Pi Robot
10-14-2010, 09:33 AM
Thanks Fergs -- I'll let you know how I make out with the formulas in RoboControllerLib. Probably won't get to it until this evening or tomorrow.

--patrick

lnxfergy
10-14-2010, 09:47 AM
Hi Fergs
I'm replacing the HB25 motor controllers with VNH2SP30 motor controllers. How do I wire them to the ArbotiX?

regards Peter

I'll get some documentation for this up shortly -- it's also dependent on the newer ROS sketch, so I really need to release 0.3.0, and also a new RoboControllerLib. I know that you, Pi, and my Armadillo robot are waiting on this functionality to be finished. Along those lines, I'm revising the scope of the 0.3.0 release to be:


Stable Controllers: base_controller (upload PID parameters, do ramping in driver, clear I/D accumulators on stop).
Support for BigMotors library (robocontrollerlib 0007)
New Experimental Sensors: v_monitor.
Actually include the NUKE sketch.
Other changes: GUI now uses head_pan_joint, etc (more PR2-like).

In particular, I'm going to leave the rest of the PML work for a later release (as it's progressing slowly). The head_action has also been moved back, as it turned out a number of the tf functions used in the C++ PR2 head action don't exist in the Python tf. The first 2 items in the list are what you guys need -- but I still have a small bit of tweaking left before it's ready (there's a particularly nasty bug in the PID that I think I've just figured out, but haven't had time to fix). I'm going to be working on it this afternoon, so either tonight or tomorrow I would hope to have the release out.
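For anyone wondering what "clear I/D accumulators on stop" in the list above means: a PID loop keeps an integral accumulator and a last-error value between updates, and if you don't zero them when the base stops, the stale accumulated error can kick the motors the moment motion resumes. A minimal sketch of the idea (not the actual firmware code):

```python
class PID:
    """Bare-bones PID loop illustrating why state must be cleared on stop."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_accum = 0.0      # integral of error over time
        self.last_error = 0.0   # previous error, for the derivative term

    def step(self, error, dt):
        self.i_accum += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.i_accum + self.kd * derivative

    def stop(self):
        # Without this, integral wound up during the last move would
        # produce a large output spike on the next step().
        self.i_accum = 0.0
        self.last_error = 0.0
```

After stop(), the controller behaves exactly like a freshly constructed one on its next update, which is the behavior you want when the base restarts from rest.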

-Fergs