Kinect Teleoperation of KHR1-HV using ROS



veltrop
12-29-2010, 09:25 AM
Hi everyone, check out this video I just put together of my KHR1-HV:


http://www.youtube.com/watch?v=GdSfLyZl4N0

darkback2
12-29-2010, 10:23 AM
You made Plastic Pals... good job!

http://www.plasticpals.com/?p=26382

Pi Robot
12-29-2010, 11:10 AM
Simply awesome! Though I think you forgot the link to the video in your post. (I saw it on the Willow Garage site.)

Congratulations!!

--patrick

RobotAtlas
12-29-2010, 01:07 PM
The most exciting part to me is that, theoretically, 90% of your project could be reused on the other Dynamixel-based robots a lot of people on this forum have. It should definitely give everybody a great start.

I've been following your projects, Taylor, since you announced porting your humanoid to ROS.
It's amazing how far you've come from Lego. :)

Zenta
12-29-2010, 01:45 PM
Wow, that's awesome!
Like you mentioned in the video, doing IK for each arm looks like it would be an even better solution.

veltrop
12-29-2010, 08:34 PM
Thanks for the flattering comments! Sorry for the technical difficulties on the YouTube link.

RobotAtlas: Yes, the code is very reusable! The Roboard layer of the control system can easily be replaced with whatever one needs for their microcontroller. You just need something that drives the servos in response to ROS joint state messages, and possibly broadcasts a few sensor topics, and you're set!
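Roughly, something like this is all the servo side needs (a minimal sketch in ROS 1 C++; setServoPosition() is a hypothetical stand-in for whatever your board's servo interface looks like):

// Minimal servo bridge: listen for ROS joint states and forward them to
// the servo hardware. setServoPosition() is a hypothetical placeholder.
#include <ros/ros.h>
#include <sensor_msgs/JointState.h>

void jointStateCallback(const sensor_msgs::JointState::ConstPtr& msg)
{
  for (size_t i = 0; i < msg->name.size(); ++i)
  {
    // Map each named joint to a servo command on your board here, e.g.:
    // setServoPosition(msg->name[i], msg->position[i]);
    ROS_DEBUG("joint %s -> %f rad", msg->name[i].c_str(), msg->position[i]);
  }
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "servo_bridge");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("joint_states", 10, jointStateCallback);
  ros::spin();
  return 0;
}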

If other people get involved in the code I'd love to see improvements in the motion scripting system, GUIs, etc.
I'd better get my next point release out soon! After I finish implementing a GUI for pose capturing on the robot over the next week or two, it'll be ready.

Wouldn't it be great to use the Kinect to script motions and poses for the Robot?

Zenta: Before I can use IK to its full benefit I think I need a better virtual model of the robot. Then I can do motion planning for the arm, with collision detection and everything.

I already get a pretty good collision map from the stereo camera; this is just the beginning!

Pi Robot: Btw, I've been meaning to look at your great use of the navigation stack to figure out how to integrate that stuff into Veltrobot.

Pi Robot
12-30-2010, 07:30 PM
Hi Taylor,

I am trying out your super cool Kinect ROS code and I ran into a small snag that will probably be obvious to you. I am following your directions from the Willow Garage site and I am using the latest from your SVN repository as well as the latest OpenNI code. My Kinect is up and running OK and I can view the point cloud in RViz. I am trying your instructions for the simulated humanoid. Steps 1-3 execute fine and I see your humanoid model in RViz. But when executing Step 4 I get:

$ roslaunch veltrobot_teleop kinect_teleop.launch

process[teleop_kinect-1]: started with pid [21982]
process[kinect_base_link-2]: started with pid [21983]
process[kinect_base_link1-3]: started with pid [21984]
process[kinect_base_link2-4]: started with pid [21989]
InitFromXml failed: File not found!
process[kinect_base_link3-5]: started with pid [22005]
[teleop_kinect-1] process has died [pid 21982, exit code -11].
log files: /home/patrick/.ros/log/997bae2a-11ec-11e0-8854-8c736e77238f/teleop_kinect-1*.log

Any idea what that InitFromXml error is about?

Thanks!

--patrick

veltrop
12-31-2010, 06:10 AM
Thank you for trying out the code and reporting this to me Patrick.

In KinectController.cpp, line 6, there is a hard-coded path...! I'll fix that when I get back from my New Year's trip. But for now, change
#define SAMPLE_XML_PATH "/home/space/ni/ni/openni/lib/SamplesConfig.xml"
to match your setup.
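A cleaner long-term fix would be reading the path from a ROS parameter instead of the #define. A sketch (the ~config_file parameter name and the fallback path are assumptions, not what's in the repository):

// Sketch: get the OpenNI SamplesConfig.xml path from a private ROS
// parameter. "~config_file" and the fallback path are assumptions.
#include <string>
#include <ros/ros.h>
#include <XnCppWrapper.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "teleop_kinect");
  ros::NodeHandle pnh("~");

  std::string samplePath;
  pnh.param<std::string>("config_file", samplePath,
                         "/usr/share/ni/SamplesConfig.xml");

  xn::Context context;
  XnStatus rc = context.InitFromXmlFile(samplePath.c_str());
  if (rc != XN_STATUS_OK)
  {
    ROS_FATAL("InitFromXml failed: %s", xnGetStatusString(rc));
    return 1;
  }
  // ...skeleton tracking setup would continue here...
  return 0;
}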

Pi Robot
12-31-2010, 11:42 AM
Thanks Taylor--that was it exactly. And now it works beautifully. Really nice!

--patrick

veltrop
01-01-2011, 09:01 AM
Glad it's working for you now. Fixed in the 0.2.1 release.

Pi Robot
01-01-2011, 07:05 PM
Great! Thanks to your brilliant code, I was able to very quickly tweak the joint names to control Pi Robot's virtual self in RViz as shown below. Next, to get it to work with the real robot...

YouTube - Veltrop Kinect Teleop of Pi Robot in RViz

veltrop
01-02-2011, 08:06 AM
That's awesome! I'm so glad that the code was easily reusable for you!

May I link to your video on my ROS contest entry page?

Please let me know if you put anything else on the web about this so I can link to it!


Next: movement of the robot's legs. There are three methods I want to use interchangeably; I think all three are needed.

1. Direct. Maybe rarely needed, since it doesn't make sense for most situations. Could be useful for experimenting with balancing on the robot, or maybe a karate kick ;)

2. Gesture based. I.e., walk in place to trigger the robot's walk-forward routine. Or maybe in your case it could rotate the robot's wheels. I'm going to first attack this by hacking up trackpad gesture drivers, but I'm not really sure what to do. Any other ideas or libraries?

3. Goal based. Take the XYZ position and orientation of the human's torso, and set that (with some scaling) as a goal position for the robot using the ROS navigation stack (see the sketch after this list). This has the limitation that you may walk off camera, but we can worry about that later. I don't have a navigation stack on Veltrobot yet. Would you be interested in working on this?
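For the goal-based method, the core loop could be as simple as this (a sketch, not working code from Veltrobot: it assumes the tracker broadcasts a "torso" TF frame, that a "map" frame exists, and uses an illustrative 0.5 scale factor):

// Sketch of method 3: republish the tracked torso pose, scaled, as a
// navigation goal. Frame names and the scale factor are assumptions.
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <tf/transform_datatypes.h>
#include <geometry_msgs/PoseStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "torso_goal_publisher");
  ros::NodeHandle nh;
  ros::Publisher goal_pub =
      nh.advertise<geometry_msgs::PoseStamped>("move_base_simple/goal", 1);
  tf::TransformListener listener;
  ros::Rate rate(2.0);  // don't flood the planner with goals

  while (ros::ok())
  {
    tf::StampedTransform torso;
    try
    {
      listener.lookupTransform("map", "torso", ros::Time(0), torso);
    }
    catch (tf::TransformException& ex)
    {
      rate.sleep();  // operator walked off camera, or no data yet
      continue;
    }

    geometry_msgs::PoseStamped goal;
    goal.header.frame_id = "map";
    goal.header.stamp = ros::Time::now();
    goal.pose.position.x = 0.5 * torso.getOrigin().x();  // scale human motion
    goal.pose.position.y = 0.5 * torso.getOrigin().y();
    tf::quaternionTFToMsg(torso.getRotation(), goal.pose.orientation);
    goal_pub.publish(goal);
    rate.sleep();
  }
  return 0;
}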

Pi Robot
01-02-2011, 11:02 AM
Hi Taylor,

First of all, I'd be honored to be linked into your ROS contest page--hopefully it will add to your points in the "Most Useful" category! I just now made a link to your contest entry page and the Pi Robot Youtube video on my home page at http://www.pirobot.org.

And yes, I am keen to work on the gesture stuff as well. If you can get your humanoid to do a karate kick, I see a whole new future in human-robot gladiator competitions where the world champion is a 5-year-old controlling a 10-foot, 5000-lb Transformer from his or her living room. ;)

Regarding leg gestures: perhaps this is what you meant by trackpad gestures, but I wonder if you could use something like this: step forward (one step) to go forward. Step back to a straight-up position to stop. Step backward to go backward. Step right to go right, and so on. Also, the NITE API document describes hand gestures: Click, Wave, Sweep Left, Sweep Right, Raise Hand Candidate, Hand Candidate Moved. I have no idea how to extract these yet.

I also like the Goal Based idea, especially when it comes to directing the robot to go somewhere rather than having it mimic your behavior. How about a simple pointing scheme where you use the pointing arm as a vector and extrapolate it to the ground plane to indicate the location you want the robot to go? Then use the ROS navigation stack to actually get the robot to that location.

--patrick

veltrop
01-02-2011, 12:09 PM
Pi Robot wrote:
> Regarding leg gestures: perhaps this is what you meant by trackpad gestures, but I wonder if you could use something like this: step forward (one step) to go forward. Step back to a straight-up position to stop. Step backward to go backward. Step right to go right, and so on.

Hmm, that's a good compromise. It also means I can use a simple offset of the operator's position instead of detecting complex movement.

Pi Robot wrote:
> Also, the NITE API document describes hand gestures: Click, Wave, Sweep Left, Sweep Right, Raise Hand Candidate, Hand Candidate Moved. I have no idea how to extract these yet.

I wish the documentation weren't so cryptic.
We could probably leverage this, but I don't know whether it can be used with OpenNI or not:
http://singularityhub.com/2010/12/10/mit-uses-xbox-kinect-to-create-cheap-minority-report-interface-video/

Pi Robot wrote:
> I also like the Goal Based idea, especially when it comes to directing the robot to go somewhere rather than having it mimic your behavior. How about a simple pointing scheme where you use the pointing arm as a vector and extrapolate it to the ground plane to indicate the location you want the robot to go? Then use the ROS navigation stack to actually get the robot to that location.

That is an -awesome- idea. Its usability may be limited to situations where the robot is in the same room as the operator, but that's not a problem. That idea is more human-robot interaction than teleoperation.

Which makes it the perfect thing to implement with my new NAO!
Starting the NAO ROS cross compile now.

For the pointing idea we need to know the robot's position relative to the operator's. On Pi Robot's wheeled base you can probably just use a starting point plus odometry; I'm not sure about the NAO though.
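The geometry for the pointing scheme itself is simple: extend the shoulder-to-hand ray until it crosses the floor. A sketch (assumes positions in a ground-referenced frame with z up; names are illustrative):

// Intersect the pointing arm's ray with the ground plane (z = 0).
// Returns false if the arm points at or above the horizon.
#include <tf/transform_datatypes.h>

bool pointedGoal(const tf::Vector3& shoulder, const tf::Vector3& hand,
                 tf::Vector3& goal)
{
  tf::Vector3 dir = hand - shoulder;
  if (dir.z() >= 0.0)                    // not pointing downward
    return false;
  double t = -shoulder.z() / dir.z();    // ray parameter where z hits 0
  goal = shoulder + t * dir;
  return true;
}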

veltrop
01-02-2011, 12:24 PM
There's a fighting robot competition here in Japan on the 9th. Entry is by the 5th. I'm thinking of using my Kinect teleoperation for it.

Patrick, do you think you could get your gesture teleoperation idea working for moving fwd/back/etc. by the 5th?

Pi Robot
01-02-2011, 07:06 PM
veltrop wrote:
> Patrick, do you think you could get your gesture teleoperation idea working for moving fwd/back/etc. by the 5th?

I'll definitely be working on it as much as I can--unfortunately I'm back to my day job tomorrow so robot time will be greatly reduced, but I'll post my results back here as I make progress.

--patrick

Pi Robot
01-02-2011, 09:17 PM
OK, I'm already making some progress on gesture-based forward/back/etc. movement. I should have something to show by the end of tomorrow.

--patrick

Pi Robot
01-03-2011, 10:25 PM
OK, here is a short demo of basic "navigation" of Pi Robot using Taylor's teleoperation code but modified to send ROS Twist commands to the Serializer PID controller based on the position of the operator's feet. (I had to use a wired USB connection between the robot and my desktop since for some reason my XBee connection to the Serializer has a huge delay no matter what baud rate or channel I use...)

YouTube

I've attached my modified version of Taylor's teleop_kinect.cpp file. The basic premise is to compute the vector pointing from the right foot to the left foot, then use the length of the vector for speed (Twist linear x component) and the relative angle of the vector (computed using atan2) to determine the angular velocity (Twist angular z component). At the moment, I am using just four motion commands (five if you include "stop"). If the right foot is in the forward upper half quadrant, go forward; if it is in the forward lower half quadrant rotate left; if it is in the rearward upper half quadrant rotate right; and if it is in the rearward lower quadrant, go backward. I've also put a top speed on the forward/backward movement of 0.2 m/s for safe testing. And since Pi is rather heavy and hard to turn on carpet, I have fixed the rotation speed at 0.4 rad/s.

Oh, and if both feet are more or less together, the robot stops.
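Condensed, the logic looks roughly like this (a sketch, not the full teleop_kinect.cpp; the right foot's offset from the neutral stance is assumed already extracted from the tracker, with x forward and z up, and the 0.15 m stop threshold is a guess):

// Foot-gesture sketch: command from the quadrant the right foot sits in,
// speed from the offset length. Thresholds and axes are assumptions.
#include <algorithm>
#include <cmath>
#include <geometry_msgs/Twist.h>

geometry_msgs::Twist footCommand(double foot_x, double foot_z)
{
  geometry_msgs::Twist cmd;              // zero-initialized, i.e. "stop"
  double len = std::sqrt(foot_x * foot_x + foot_z * foot_z);
  if (len < 0.15)                        // feet together: stop
    return cmd;

  if (foot_x > 0.0 && foot_z > 0.0)      // forward upper quadrant
    cmd.linear.x = std::min(0.2, len);   // capped at 0.2 m/s
  else if (foot_x > 0.0)                 // forward lower quadrant
    cmd.angular.z = 0.4;                 // fixed 0.4 rad/s rotation
  else if (foot_z > 0.0)                 // rearward upper quadrant
    cmd.angular.z = -0.4;
  else                                   // rearward lower quadrant
    cmd.linear.x = -std::min(0.2, len);
  return cmd;
}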

This is just a start of course, but it seems to work fairly well.

--patrick

veltrop
01-03-2011, 11:33 PM
Thanks for the awesome work!

I'll need to set up Veltrobot's movement_controller to process Twist messages and translate them into the walking and turning motion routines. It'll be limited because, as they stand, the routines can only walk or turn at a single speed, and not both at once. Veltrobot's motion system needs improvement...
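The bridge itself should be small; something like this (a sketch; the motion_name topic and routine names are hypothetical stand-ins for whatever the motion system actually exposes):

// Sketch: map continuous Twist commands onto discrete named motion
// routines. Topic and routine names are hypothetical.
#include <cmath>
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>
#include <std_msgs/String.h>

ros::Publisher motion_pub;

void twistCallback(const geometry_msgs::Twist::ConstPtr& cmd)
{
  std_msgs::String motion;
  if (std::fabs(cmd->angular.z) > 0.1)
    motion.data = cmd->angular.z > 0.0 ? "turn_left" : "turn_right";
  else if (std::fabs(cmd->linear.x) > 0.05)
    motion.data = cmd->linear.x > 0.0 ? "walk_forward" : "walk_backward";
  else
    motion.data = "stand";
  motion_pub.publish(motion);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "twist_to_motion");
  ros::NodeHandle nh;
  motion_pub = nh.advertise<std_msgs::String>("motion_name", 1);
  ros::Subscriber sub = nh.subscribe("cmd_vel", 1, twistCallback);
  ros::spin();
  return 0;
}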

But when I get this working on the NAO it'll be no problem. It has an omniwalk function that essentially takes a vector as input.
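On the NAO that mapping is nearly direct; a sketch against NaoQi's ALMotionProxy (the 0.2 m/s and 0.5 rad/s normalization factors are illustrative assumptions):

// Sketch: feed a ROS Twist into NaoQi's velocity-based walk. Inputs are
// normalized to [-1, 1] as setWalkTargetVelocity expects.
#include <algorithm>
#include <alproxies/almotionproxy.h>
#include <geometry_msgs/Twist.h>

void twistToNao(AL::ALMotionProxy& motion, const geometry_msgs::Twist& cmd)
{
  float x = std::max(-1.0f, std::min(1.0f, (float)(cmd.linear.x / 0.2)));
  float theta = std::max(-1.0f, std::min(1.0f, (float)(cmd.angular.z / 0.5)));
  motion.setWalkTargetVelocity(x, 0.0f, theta, 1.0f);  // full step frequency
}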

But in the meantime I am also giving 1:1 leg control a shot!

I'll report back when I can :)

RefugeZero
01-04-2011, 12:57 AM
Looking at this other (much more expensive) teleoperation system, I think gestures are a perfectly valid solution:

http://www.youtube.com/watch?v=TJmQqC1nHTU
http://www.youtube.com/watch?v=5N93QtVsyv8

...Scrub to 1:00 of the first vid and see how the arms have almost zero lag but the legs lag massively. Not sure what they're doing, but they're definitely processing the crap out of the leg data.

The second vid shows they certainly aren't mimicking the legs, just approximating them. If you could splice up the walking motion routines you could probably make gestures for walk_fwd_left_leg, walk_fwd_right_leg, turn_left, and turn_right. The walk_fwd gesture could just be lifting/bending your leg, and the turn gestures could be a straight leg lift to the side; the Kinect should be able to pick those motions up cleanly. (Although the lift/bend might also need to be to the side so the Kinect gets a profile view of your bent knee... you'd probably look like you're warming up for sumo while you control Veltrobot :D)

RefugeZero
01-04-2011, 01:09 AM
I might finally break down and get an Xbox now, after so many years of laughing at the idea... damn you Microsoft, you've finally made something useful!!

SK.
01-04-2011, 01:37 AM
Beware: AFAIK you only get the custom Kinect-to-Xbox connector on the Kinect itself if you buy the bundle. If you buy a standalone Kinect, you get the USB adaptor too (apparently because older Xboxes only have USB jacks and not the new custom jacks).

RefugeZero
01-04-2011, 01:43 AM
Ah ok then, thanks for the warning!

veltrop
01-23-2011, 10:17 PM
Finished up my ROS contest entry: http://www.ros.org/wiki/openni/Contests/ROS%203D/Humanoid%20Teleoperation

Also have two new vids out:


http://www.youtube.com/watch?v=TmTW61MLm68


http://www.youtube.com/watch?v=FwFO6Vh1990

And Patrick, I saw your new vid, nice work!

DresnerRobotics
01-24-2011, 01:08 AM
veltrop wrote:
> Finished up my ROS contest entry: http://www.ros.org/wiki/openni/Contests/ROS%203D/Humanoid%20Teleoperation

This is fantastic stuff Taylor, thanks for sharing!

Pi Robot
01-24-2011, 09:01 AM
Simply awesome Taylor! And good luck in the ROS contest. (I remain about 100 steps behind you but it was a fun exercise to put together an entry.)

--patrick

veltrop
01-24-2011, 10:21 AM
Thanks Patrick. Good luck on your ROS speech!

darkback2
01-24-2011, 11:43 AM
I am going to share these videos with my students today. I'm hoping to have ROS running on a Roboard on Hikari as part of my robotics class next semester. Great work!

veltrop
01-24-2011, 12:43 PM
That's awesome!

The Roboard is a great platform for ROS.

CeeJay
01-26-2011, 08:22 AM
Wow, awesome!