View Full Version : [Project] Robotic Eye-Hand Coordination



Pi Robot
03-02-2010, 06:15 PM
Hello,

I just finished up some work on using RoboRealm to guide my robot as it reaches toward a target object. The ultimate goal is for the robot to be able to pick up the object from a random location or take it from someone's hands. For now, I simply wanted to work out the coordinate transformations from visual space to arm space to get the two hands to point in the right direction as the target is moved about. The following video shows the results so far:


http://www.youtube.com/watch?v=m8GthdGApz8

I don't have a full write-up yet on how I did this but it basically just uses 3-d coordinate transformations from the head angles and distance to the target (as measured by sonar and IR sensors mounted near the camera lens) to a frame of reference attached to each shoulder joint. The Dynamixel AX-12 servos are nice for this application since they can be queried for their current position info. The distance to the balloon as measured by the sonar and IR sensors is a little hit and miss and I think I'd get better performance using stereo vision instead.
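My actual code is in C#, but a rough Python/numpy sketch of the kind of transformation involved would look something like the following. The axis conventions, offsets, and numbers here are placeholders rather than the robot's real geometry, and it assumes the target is roughly centered in the image so it lies on the camera's optical axis:

import numpy as np

def target_in_shoulder_frame(pan, tilt, dist, cam_origin, shoulder_origin):
    # pan/tilt are the head servo angles in radians, dist is the sonar/IR range in meters.
    # Torso frame (placeholder convention): x forward, y left, z up.
    # Unit vector along the camera's optical axis, expressed in the torso frame:
    direction = np.array([np.cos(tilt) * np.cos(pan),
                          np.cos(tilt) * np.sin(pan),
                          np.sin(tilt)])
    # Target position in the torso frame, assuming it sits on the optical axis.
    target_torso = np.asarray(cam_origin) + dist * direction
    # Shoulder frame assumed parallel to the torso frame, just offset in position.
    return target_torso - np.asarray(shoulder_origin)

# e.g. camera 0.25 m above the torso origin, left shoulder offset to the side
target = target_in_shoulder_frame(0.1, -0.2, 0.8,
                                  cam_origin=[0.0, 0.0, 0.25],
                                  shoulder_origin=[0.0, 0.10, 0.15])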

--patrick

http://www.pirobot.org

Orac
03-03-2010, 11:30 AM
Wow, very good. I have been playing with RoboRealm 3D positioning, but nowhere near as good as that. Kind of spooky to watch -- you'd think it might cry if you don't give it that balloon soon :)

Pi Robot
03-03-2010, 12:02 PM
Thanks Orac! Right now I am using RoboRealm only to get the x-y coordinates of the balloon in the camera frame of reference. I then use a sonar and IR sensor mounted near the lens to get the z-coordinate or distance to the balloon. One could also use the width of the balloon's blob in RoboRealm if you assume it has a fixed size, but I wanted to allow for different sized objects. When STeven and company get their Stereo module completed, I think that will be a good way to go, but for now this seems to get the job done.
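Just to illustrate the fixed-size alternative: with a pinhole-camera approximation, the distance falls straight out of the blob width. The numbers below are made up for illustration, not measured from my setup:

def distance_from_blob_width(blob_width_px, real_width_m, focal_length_px):
    # Pinhole approximation: Z = f * W / w, where f is the focal length in pixels
    # (roughly image_width / (2 * tan(horizontal_fov / 2))).
    return focal_length_px * real_width_m / blob_width_px

# e.g. a 0.25 m balloon that shows up 60 px wide through a ~400 px focal length
z = distance_from_blob_width(60, 0.25, 400.0)   # about 1.7 m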

And yes, the next step is to actually give the balloon to the robot before it comes after me flailing those arms. The AX-12's can pack quite a punch. :eek:

--patrick

bonmot
03-08-2010, 10:36 AM
Wow, I am looking forward to seeing your stereo vision logic working. I guess the two hands and the ball would then be the three visual objects in the bot's brain used to control it grabbing the ball. Am I right about your plan?
This is what humans do, isn't it? We don't calculate the xyz of our hand to grab a cup :-D

bonmot
03-08-2010, 10:39 AM
$89 for this software is really good. The STeven guy you mentioned is from RoboRealm?
If $89 can buy a stereo vision SDK, that would be a steal :-)
Thanks

Pi Robot
03-09-2010, 06:59 PM
Actually, I imagine that the human brain (as well as other animal brains) does do some form of calculation without having to look at the hands. After all, you can reach for a cup with your eyes closed (though you might be slightly off target). Also, whenever you begin to reach for an object, your hands are not necessarily in view until they are part way to the goal. So you have to get the hands moving in the right direction before you see them.

Having said that, yes, at some point I was thinking of putting some small fiducials on each hand and these could be used to visually guide the hands to the target.

--patrick

bonmot
03-09-2010, 10:10 PM
Agreed. A rough calculation can be done to guide the hand to the area, then the vision system can do the "pick up".
The non-linearity of the lenses, servo position errors, mechanical errors, etc. all add up.
Calibration will solve only a small part of it.

The "brain calculation" you mentioned is not an actual "calculation" but "imagination". With eyes closed, the brain still uses the "vision system" to guide your action via an image created by "imagination".
"Simulation" may be another word for it.

Pi Robot
03-09-2010, 10:46 PM
Yes, "simulation" seems like a better word than "calculation". Or perhaps simply "neural computation". As you probably know, much work has been done on this kind of thing in the fields of psychology and neurophysiology. I found this article (abstract) particularly interesting:

http://www.ncbi.nlm.nih.gov/pubmed/18217844?ordinalpos=2&itool=EntrezSystem2.PEntrez.Pubmed.Pubmed_ResultsPanel.Pubmed_RVDocSum

billyzelsnack
03-09-2010, 10:55 PM
Cool article. Thanks.

Pi Robot
03-09-2010, 11:03 PM
Try Googling something like "head shoulder transformation reaching" and you'll find a bunch more interesting stuff. :happy:

--patrick

http://www.pirobot.org

bonmot
03-10-2010, 06:56 AM
This is encouraging me to get a fit-PC2 for my future plans.
By the time I am up to speed, the fit-PC3 may be available :-D

Suicidal.Banana
03-13-2010, 09:24 AM
Very, very awesome. To me (a beginner/noob) this is somewhat of a holy grail in robotics. I hope to one day manage the same: have stereo vision determine the location of an object relative to the robot, and have the robot somehow understand where it's supposed to move its servos to reach for the object.
Keep it up!

Pi Robot
03-13-2010, 06:51 PM
Hey Suicidal Banana,

Thanks for the encouragement--it means a lot! I've almost got the next step ready which is for the robot to take the balloon from my hands and place it somewhere else. After that, I'm hoping it will tidy up my apartment. :wink: Progress would be so much faster if I didn't need my day job to pay the bills...

--patrick

http://www.pirobot.org

Sigma X
03-13-2010, 07:07 PM
Amazing work! This does give robotics a good push. I would like to see if I can experiment with something like this one day.

Pi Robot
03-23-2010, 01:02 PM
Here is the next video demonstrating visually-guided grasping behavior. In this video, a number of independent behavioral threads are running to enable the robot to track and grasp the green balloon. Whenever the green balloon is grasped, the robot turns its attention to the red balloon. When the green balloon is released, tracking returns to it and the red balloon is ignored. I use RoboRealm to do the green/red tracking. There is a sonar sensor on the inside of the left hand that tells the robot when something is ready to be grasped. The robot can also do this using vision alone along with some trigonometry, but the result is more reliable when using the sensor.


http://www.youtube.com/watch?v=319eVBIYfec

--patrick

http://www.pirobot.org

bonmot
03-24-2010, 12:22 AM
Very nice! so cute :-D

Suicidal.Banana
03-24-2010, 03:49 PM
Very cool man! Now to make it drive towards the balloons (and drop them all in a red basket), I guess? :veryhappy:

Pi Robot
03-25-2010, 12:42 AM
Dang! You guessed it!! :cool:

Zenta
03-30-2010, 01:24 AM
I'm also impressed by how fast and how well the tracking works. Excellent!
I loved your last video. Your robot reminds me a bit of Wall-E, a robot with personality.


-Zenta

Pi Robot
03-30-2010, 12:21 PM
Thanks Zenta! I finally figured out--after years of getting it wrong--that the tracking should be based on speed rather than position. For example, the head is always moving, but with a speed that is proportional to the displacement of the target from the center of the image. (So when the target is exactly centered, the speed is 0.) I wrote up a little tutorial on this at http://www.pirobot.org/blog/0008/. I use the same idea for the arm tracking.
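My code is C#, but the idea boils down to a simple proportional controller. A bare-bones Python sketch, where the gain and speed limit are placeholders rather than my actual values:

def clamp(value, limit):
    return max(-limit, min(limit, value))

def tracking_speeds(blob_x, blob_y, img_w=320, img_h=240, gain=0.5, max_speed=100):
    # Error = displacement of the blob from the image center, in pixels.
    err_x = blob_x - img_w / 2.0
    err_y = blob_y - img_h / 2.0
    # Servo speed proportional to the error; exactly zero when the target is centered.
    pan_speed = clamp(gain * err_x, max_speed)
    tilt_speed = clamp(gain * err_y, max_speed)
    return pan_speed, tilt_speed

# e.g. a blob detected at (200, 90) in a 320x240 image
pan, tilt = tracking_speeds(200, 90)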

--patrick

shimniok
03-30-2010, 09:33 PM
Wow, that is incredibly cool and inspiring! I am hoping my attempts at object detection are a fraction as successful! :D

kanda
03-31-2010, 03:46 AM
Nice work! And it's nice of you to share it with the community!

I have a few questions:
- Did you try doing the same thing outside? I am curious about the camera recording and lighting settings you would have to change -- would it work as well, or would it require a lot of tweaking?
- How do you grab the picture from the camera? Is it used like a USB camera? I'd like to use a good quality camera with OpenCV -- do you recommend this one? I have stuck with my QuickCam Pro 4000 until now, but the picture is a bit too noisy from frame to frame; though the noise can be reduced with filters, I was wondering if you face the same problems with this WiFi camera. Thanks in advance for sharing.

DresnerRobotics
03-31-2010, 11:23 AM
Congratz! http://www.botjunkie.com/2010/03/31/peppy-wants-both-balloons-only-gets-one/

Pi Robot
04-01-2010, 12:27 PM
Hi Kanda,

I haven't tried the D-Link camera outside yet but I will later today and let you know how it goes. I use it indoors on the "Auto" setting which seems to do a good job of adjusting to lighting changes but I'll see if it can also handle bright sun.

I am using RoboRealm to grab the image. RoboRealm has a module specifically for this D-Link camera. However, RoboRealm can also grab most video streams sent over HTTP. I haven't tried OpenCV with an HTTP video source but maybe someone else here can comment. As for the quality of the image, I am using it at 320x240 resolution and I have no noise issues. If I run it wirelessly instead of wired, I will get dropped frames fairly often. So mostly I run it with a direct ethernet connection to my laptop or onboard mini-ITX.

--patrick

Pi Robot
04-01-2010, 12:29 PM
Hey thanks Tyberius! I never would have seen that. (I need to get out more on the web...)

--patrick

kanda
04-01-2010, 01:05 PM
Thanks a lot for the information.

Pi Robot
04-03-2010, 06:47 PM
Hi Kanda,

The rain finally stopped long enough for me to try the D-Link camera outdoors and it works very well--still getting 30 frames per second at 320x240 resolution and no noise either wireless or wired, although, as always, you can get periodically dropped frames in wireless mode. Using the "Auto" setting seems to take care of lighting changes. However, if the camera is pointed too directly toward the sun, the image will wash out. Here are two images demonstrating how a red balloon can be picked out by RoboRealm even against the red surface. The second image is taken more toward the sun which is fairly low in the sky.

--patrick

kanda
04-04-2010, 05:26 AM
Thanks so much for the feedback Patrick! From what I have read on your website, you're working with RGB colors? I wonder if you have tried working in the HSV color space instead of RGB? I worry about following items that aren't vividly colored across so many different lighting contexts. I'll give it a try when my robot is ready for it (long way to go, I'm still in the process of testing sensors such as a gyro to be interfaced with Phidgets, plus various other electronic issues... I have most of the materials but have to put everything together despite my lack of electronics knowledge ^_^)

Thanks again for testing outside. I'll see whether I can interface this camera with OpenCV, or I'll have to find another good one to replace my good old QuickCam 4000.

regards

Pi Robot
04-04-2010, 09:10 AM
Hey Kanda,

Yes, I have been playing with the HSV color space in another project where I need to categorize colors across a greater range of values. For the balloon tracking videos, I am using very bright colors so that using RGB in RoboRealm does the job. However, instead of using the simple RGB_Filter module, I use the Color_Filter module which allows you to filter on more than one RGB triplet at a time. Even better, you can use your mouse to pick colors directly off the image of the target under different lighting conditions and add those to the filter.
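If you do end up going the HSV route in OpenCV rather than RoboRealm, a rough Python sketch of the same color-filter-plus-blob idea would look something like this. The hue/saturation/value bounds and the file name are placeholders you would sample from your own target:

import cv2
import numpy as np

frame = cv2.imread("frame.png")               # or a frame grabbed from a camera
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# OpenCV stores hue on a 0-179 scale; roughly 35-85 covers greens.
lower = np.array([35, 80, 80])
upper = np.array([85, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Keep only the largest blob, similar in spirit to RoboRealm's Blob Filter.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
if contours:
    biggest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(biggest)
    cx, cy = x + w // 2, y + h // 2            # blob center to feed the tracking loop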

By the way, I should have mentioned this earlier on about the D-Link IP camera: unless you are looking for a wireless option, I wouldn't necessarily recommend this camera. For wired operation I prefer a USB camera such as the Philips SPC-1300NC, which can process up to 90 frames per second. The Logitech Fusion and 9000 models also seem to work nicely. Also, my understanding is that OpenCV does not have the ability to directly access an IP camera or any HTTP-based video stream. I was playing with this yesterday using the D-Link camera and RoboRealm and was able to get it to work using RoboRealm's Virtual Camera mode, which makes the IP camera look like a locally-attached video device. Then OpenCV can get at the images. But a USB camera can be read directly by OpenCV.
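For what it's worth, once the camera shows up as a local device (a real USB cam or a RoboRealm Virtual Camera), grabbing frames from the Python cv2 bindings looks roughly like this -- the device index is an assumption, and your index may differ:

import cv2

cap = cv2.VideoCapture(0)          # device index 0: USB cam or RoboRealm Virtual Camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("frame", frame)     # color filtering / blob tracking would go here
    if cv2.waitKey(1) == 27:       # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()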

--patrick

kanda
04-04-2010, 09:44 AM
Thanks for the advice, I'll go for a USB cam, as I plan to plug it into a micro or nano ITX based brain. That way I can avoid paying the extra cost of a remote camera and buy a better USB one. I had a look at the Philips SPC-1300NC, but it sounds like there is no autofocus on this cam and it's not good in low light. I think I'll go for Logitech again; a good point for this brand is the integrated mic, which is nice for sound localization in addition to the ITX mic plug.

What kind of AI architecture are you using for Pi? (Without giving away any secrets you'd like to keep, of course :) ) Anything close to game AI technology such as behavior trees or something similar? How do you separate sensing from decision making? From what is written on the video, you can activate or deactivate either arm, so it doesn't sound like hardcoded stuff, but rather like a well-thought-out system -- quite interesting to talk about, if you don't mind. (Sorry for my French-English lol)

Regards

Pi Robot
04-05-2010, 10:47 AM
Hey Kanda,

One note about the nano-ITX or similar motherboards. I originally began with a 1 GHz Via mini-ITX and found that the CPU and onboard video chipset were barely able to process 160x120 video at 20 fps. A few weeks ago I picked up a Zotac dual-core mini-ITX with the NVIDIA chipset and it does the video processing much better. (I'm running RoboRealm + Windows XP with 4 GB RAM.) The one I got is:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813500036

If you're planning on using Linux and OpenCV, you might be able to get away with less powerful hardware. Other forum readers might have alternative suggestions. In the arm tracking video, I'm just using a laptop with USB cables running to the robot. Unless I specifically need mobility for the robot, I find this is easier than doing development directly on the mini-ITX.

As for AI, I have no secrets since I've barely gotten started on a higher level task planning architecture. At the moment, I code each behavior in its own thread and use simple message passing to block or continue a thread as needed. So for the arm tracking video, I have a head tracking thread and two arm tracking threads, one for each arm. In the meantime, I have a thread that reads and sets RoboRealm variables using the C# API. When the green balloon is grasped, I set a variable in RoboRealm that changes tracking from green to red. I also block the arm tracking threads while grasping is taking place. When the balloon is released, the RoboRealm target variable is set back to track green and the arm tracking threads are un-blocked. I've attached the RoboRealm file in case it helps.
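My actual code is C#, but a bare-bones Python sketch of the same block/unblock idea, with the RoboRealm call stubbed out as a placeholder, would look something like:

import threading
import time

arm_tracking_enabled = threading.Event()
arm_tracking_enabled.set()                 # arms track the target by default

def set_tracking_color(color):
    # Placeholder: in the real system this sets a RoboRealm variable through its API
    # so that the pipeline switches between the green and red color filters.
    print("now tracking:", color)

def arm_tracking_loop(side):
    while True:
        arm_tracking_enabled.wait()        # blocks while a grasp is in progress
        # ... read the target position and command this arm's servos here ...
        time.sleep(0.05)

def grasp_and_release():
    arm_tracking_enabled.clear()           # pause both arm-tracking threads
    set_tracking_color("red")              # attention shifts to the other balloon
    time.sleep(2.0)                        # stand-in for the carry/release behavior
    set_tracking_color("green")
    arm_tracking_enabled.set()             # resume arm tracking

for side in ("left", "right"):
    threading.Thread(target=arm_tracking_loop, args=(side,), daemon=True).start()

grasp_and_release()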

If I were starting from scratch, I'd probably use Linux and take a good look at the Robot Operating System (ROS) from Willow Garage (http://www.ros.org/wiki/). For example, all the matrix operations I had to figure out to translate head coordinates to arm coordinates for the arm tracking can be handled with a simple call to the "transform frame (tf)" library in ROS. Similarly, high level navigation and task planning libraries are already included so you don't have to reinvent the wheel.
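For example, in ROS 1 (rospy + tf) the whole head-to-shoulder transform collapses into a couple of calls like the following. The frame names and coordinates here are made up, and it assumes a running ROS system that is already publishing those frames:

import rospy
import tf
from geometry_msgs.msg import PointStamped

rospy.init_node("reach_demo")
listener = tf.TransformListener()

target = PointStamped()
target.header.frame_id = "camera_link"       # placeholder frame names
target.header.stamp = rospy.Time(0)          # "latest available" transform
target.point.x, target.point.y, target.point.z = 0.6, 0.0, 0.0   # e.g. 60 cm in front of the camera

listener.waitForTransform("left_shoulder", "camera_link", rospy.Time(0), rospy.Duration(1.0))
in_shoulder = listener.transformPoint("left_shoulder", target)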

--patrick

kanda
04-05-2010, 02:04 PM
Hello Patrick,

Thanks for the tips. Indeed, I am also planning to use my old laptop (or buy a newer netbook based on an Atom) to test and program the robot software; I installed a clean XP last week for this purpose. What you mentioned about the Via ITX is quite helpful; I'll probably go for the one in the link you provided. However, I am planning to make everything from scratch on the programming side (which fits my skills much better than the electronic parts => thanks to Phidgets!!!), so I'll probably have straight access to the data I need from the sensors, including the camera (I'll use OpenCV or go straight with the Direct Media APIs), so my system will probably be a bit less CPU consuming.
About the OS, I'll stick with XP, as I have no knowledge of Linux (I wish I had some!) and don't want to struggle with driver issues and stuff like that. Also, I know the Visual Studio environment perfectly (I use it for my job) and prefer to keep using it.


By the way, since you kindly provided the RoboRealm script, I ran it with my QuickCam: I sometimes lost the red target (a small object, I admit) while moving it. I think I really have to change to a better cam.

What are your next plans for Pi? Have you thought about using some FlexForce pressure sensors for the hands?

kanda
04-05-2010, 02:24 PM
Forgot to ask: what do you use to power the ITX board? I'd like to power it with a basic 12V battery...
Thanks.

Pi Robot
04-06-2010, 09:59 AM
Hi Kanda,

If the truth be told, I'm very happy using Windows XP and Visual Studio for development. And I've been a Unix/Linux guy for 20 years. However, much of the work done in robotics in academia uses Linux and C++ or Python. I will be experimenting with ROS on a different machine, but like you said, hardware support tends to be less painful under Windows. I also like the ease of using speech recognition and text-to-speech. The Phidgets components are great. And yes, I use the FlexForce sensors on my larger robot (Pi) to measure the grasping pressure against the hands. (See video below.) On the smaller robot (Peppy), I am using the Robotis Dynamixel servos (AX-12+) which allow you to query the current load or torque. So I can get a sense of hand pressure by measuring the inward force of the shoulder joints. Down the road, I will use FlexForce sensors in the hands to measure slippage of the object being grasped.
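If it helps, here is a rough sketch of that load query using the current Python dynamixel_sdk -- not the library I actually use, and the serial port, baud rate, and servo ID below are placeholders:

from dynamixel_sdk import PortHandler, PacketHandler

AX12_PRESENT_LOAD = 40                 # AX-12 control-table address (2 bytes)

port = PortHandler("/dev/ttyUSB0")     # placeholder serial port
port.openPort()
port.setBaudRate(1000000)
packet = PacketHandler(1.0)            # AX-12s speak Dynamixel protocol 1.0

load, comm_result, error = packet.read2ByteTxRx(port, 3, AX12_PRESENT_LOAD)  # servo ID 3
direction = "CW" if load & 0x400 else "CCW"   # bit 10 gives the load direction
magnitude = load & 0x3FF                      # 0-1023, a rough (uncalibrated) torque reading
print(direction, magnitude)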

Regarding the RoboRealm script in my last post: I optimized the pipeline for use with the large brightly colored balloons. In particular, the Blob Filter eliminates smaller blobs which are often part of the background. So you might play with the settings in that module or disable it altogether if you are trying to track smaller objects. Also, the two Color Filter modules use RGB triplets specific to my balloon colors and lighting. So you might try clearing those then use the color picker to sample your own object.

As for the power requirements of the Zotac board: I actually mistakenly ordered the model with the built-in power supply thinking I could run it on a 12V battery, and it turns out it needs 19V! So I use a step-up converter from 12V to 19V. The model I linked to in my earlier post requires a 20-pin ATX power supply, but no matter what specs I look at online (even from the manufacturer), I can't find the voltage. You might try sending a query to the Newegg support folks for an answer.

For the immediate future of the Pi Robot project, I am adding speech recognition (using the SAPI 5 SDK from Microsoft) and face tracking (using EmguCV, the .NET wrapper to OpenCV). Then I'd like Peppy to do something useful, like tidy up the living room. ;)

--patrick

http://www.pirobot.org/


http://www.youtube.com/watch?v=WqyDY7ZgvXs

JonHylands
04-06-2010, 10:05 AM
Any motherboard requiring a 20-pin ATX connector uses a standard PC-type power supply.

I use this one on Brainbot - it can take anywhere from 6 to 24 volts input:

http://www.mini-box.com/M3-ATX-DC-DC-ATX-Automotive-Computer-car-PC-Power-Supply?sc=8&category=981

- Jon

kanda
04-06-2010, 10:23 AM
Thanks for the help. After searching the internet yesterday, I finally found this one on eBay; the seller also has the same product as in the link you provided:

http://cgi.ebay.fr/ws/eBayISAPI.dll?ViewItem&item=290373885677&ssPageName=STRK:MEWAX:IT

JonHylands
04-06-2010, 11:01 AM
It's close, but not quite.

The only issue with that power supply is the lower limit, which is 12 volts. What that means in practice is that once your battery drops below about 11.7 volts, the CPU will be shut down. Whether you are using 3-cell LiPos or 10-cell NiMH, 11.7 volts is nowhere near the lower limit of the battery, so you will end up with much shorter runtimes, unless you have a step-up converter that can provide a steady 12 volts from a lower source.

The blue one I use now has a lower limit of 6 volts, which means I can use the full range of either a NiMH pack or a LiPo pack.

- Jon

Pi Robot
04-06-2010, 11:11 AM
Hi Jon,

Seeing as we have your expertise on the line...the Zotac board I ordered by mistake with the built-in 19V 90W power supply is:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813500027

As you can tell from the pictures, there are solder holes for the 20-pin ATX connector that the PSU would plug into. Can you tell from the specs on this board whether or not I could solder in the connector and use the PSU you linked to? And if so, where could I get such a connector to solder to the motherboard? (I Googled my brains out but struck out...)

Thanks!
patrick

JonHylands
04-06-2010, 11:34 AM
According to the user manual (http://downloads.zotac.com/mediadrivers/mb/man/a108.pdf), on page 14, you can use either the built-in power supply, or the 20-pin ATX connector.

The connector is Digikey part# WM3851-ND.

- Jon

Pi Robot
04-06-2010, 11:39 AM
This is great. Thanks Jon!

--patrick

kanda
04-06-2010, 03:25 PM
You're totally right, thanks. I also found this one, which is cheaper than the blue one: http://cgi.ebay.fr/B93-8-28-VDC-input-120W-DC-DC-Mini-ITX-Power-Supply-PSU_W0QQitemZ390178539614QQcmdZViewItemQQptZAU_Components?hash=item5ad874085e

Regards

kanda
04-06-2010, 04:07 PM
Thanks again for sharing! It's always so nice to see your videos, and seeing your work is great motivation for me!

I am glad you tried FlexForce too; I ordered some a few days ago and it sounds like that was not a bad idea :happy:

What do you use to drive the motors? I bought the Phidgets dual motor board, and I'm planning to use it with the help of a gyro (I got one from SparkFun that I can plug into the Phidgets analog input) and the Phidgets accelerometer to make it drive straight. That is easier for me than building a wheel counter. I am about to put all this together (on a 40 cm diameter aluminium base).

I will probably create a thread about it as soon as it looks like a robot platform (at least) :rolleyes:

Pi Robot
04-07-2010, 12:58 PM
Hey Kanda,

I am using the Serializer controller board from RoboticsConnection.com (also available here from Trossen (http://www.trossenrobotics.com/store/p/5196-Robotics-Connection-Serializer-WL.aspx)). It has onboard H-bridges to drive a pair of DC motors as well as a PID controller, so you can do neat stuff like command the robot to "rotate 45 degrees" and it just figures it all out for you. I also use the 7.2V motors sold by RoboticsConnection (http://www.roboticsconnection.com/p-51-dc-gearhead-robot-motor.aspx) since they come with integrated encoders and a nice wiring harness that is designed for the Serializer connection headers.

Looking forward to pictures of your 'bot when you start putting it together.

--patrick

kanda
04-08-2010, 03:01 AM
This motor looks good, especially with the included encoder. Crazy me, I didn't think of looking for a motor with an embedded encoder. So I finally had a look last night and found these (sorry, it's a French website):
http://www.lextronic.fr/R2084-moteurs-avec-encodeur.html

Maybe if I fail to drive the robot with the gyro, I'll give them a try. But for now I'll stick with the two geared DC motors I already bought. Maybe I should say that I have a different approach to my robot control & AI: I'll try to use iterative, learning systems for most actuators instead of hand-coded control. Too early to know if I'll be successful, though.

kanda
04-08-2010, 03:16 AM
Patrick, forgot to ask about Pi's arms: they look like Lynxmotion tubes? I was planning to use those for another arm and ordered some to make tests. However, while the tubes are cheap, I think the parts that attach the tubes to brackets are a bit expensive: http://www.lextronic.fr/P2143-adaptateur-pour-tube-hub08.html
Is that what you used for Pi's arms? Do they fit the Dynamixel brackets well? I also ordered square tube (http://www.lextronic.fr/P2158-barre-carree-de-162-cm-de-long.html) to make more tests; with some luck I can avoid the expensive round-tube-to-bracket part. If not, I'll stick to the round ones.

kanda
04-08-2010, 06:04 AM
About the battery & ITX board/PSU, I found this: http://www.powerstream.com/BP-23.htm
It includes a regulator!!!

Pi Robot
04-09-2010, 12:39 PM
Yes, those tubes are from Lynxmotion. And the brackets have holes that match the Dynamixel holes perfectly.

--patrick



Pi Robot
04-30-2010, 04:26 PM
I finally had a chance to write up the math behind the Pi Robot arm tracking video. Keep in mind that I am only using the two shoulder joints in each arm--the elbow and wrist servos are fixed--so the inverse kinematics is fairly straightforward. Later on I'll have to deal with the other joints...

Here is the link to the write-up:

http://www.pirobot.org/blog/0011/
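The write-up has the actual derivation and conventions. Just as a bare-bones illustration of the pointing case in Python (the axis conventions and zero offsets here are placeholders, not necessarily the ones in the article):

import math

def shoulder_angles(x, y, z):
    # Target coordinates in a shoulder-centered frame: x forward, y sideways, z up.
    # With the elbow and wrist fixed, pointing the straight arm at the target
    # only needs the two shoulder angles.
    yaw = math.atan2(y, x)                    # rotation about the vertical axis
    pitch = math.atan2(z, math.hypot(x, y))   # elevation above the horizontal plane
    return yaw, pitch

yaw, pitch = shoulder_angles(0.4, 0.1, -0.2)  # e.g. a target 40 cm ahead and slightly low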

--patrick