
View Full Version : [Project] Argos pr1



Connor
03-17-2008, 10:12 PM
Hey folks,


Thought I would show everyone what I've been working on as of late. I call him ARGOS PR1 (Autonomous Robotic Guidance and Orientation System, Personal Robot 1). I'm almost done with Phase 1, which includes the base iRobot Create chassis, an onboard Mini-ITX main board, a Logitech Orbit AF, and a Phidgets 8/8/8 with voltage and amperage sensors. Currently stuck on the power and charging system.. I was planning on 4 3800mAh Sub-C battery packs (each with 12 cells) wired in parallel, but the charging circuit is the issue there. I plan on using a Roomba Home Base for the docking station and modifying it with extra contacts for the separate battery packs.


Phase 2 will include a 2 DOF arm with a 3 DOF gripper. I'll be using massive geared servos from RobotZone (ServoCity) for the shoulder and elbow.. I hope to be able to lift around 7-8lbs with the arm.. (Not necessarily attached to Argos..)

My current road blocks are the charging circuits (planning on using MAX712's) and how to power the servos, which require 7.2v to operate at max power and draw over 5 amps each when stalled. I'll also be adding a few sonar sensors as well.. Still debating on where to place them.. On the bottom platform (on the underside) or on the top platform. Any suggestions?

Thanks, Connor

archcvd
03-17-2008, 11:25 PM
Your bot is looking great! Where were you thinking of installing the arm? I hope it's not big enough to topple him over with a sudden move. If you can't find something to support your servos' current draws you could maybe have them connected to a relay board and have them on their own separate circuit so as to not fry anything ;)

Have you ever thought of consolidating all of your power sources into one? Maybe create your own little power management unit board so that you can run off a single battery. There may be schematics on Google to help you out with that.

As for the sonar, I think you could get away with putting them on the bottom platform just in case something shorter than ARGOS doesn't register as an obstacle.

I look forward to seeing more!

Connor
03-18-2008, 12:42 AM
Your bot is looking great! Where were you thinking of installing the arm? I hope it's not big enough to topple him over with a sudden move.

The arm will be mounted on the right hand side of him (left hand side facing him). It'll be made up of 2 8" rounded sections with the gripper at the end, obviously.



If you can't find something to support your servos' current draws you could maybe have them connected to a relay board and have them on their own separate circuit so as to not fry anything

Have you ever thought of consolidating all of your power sources into one? Maybe create your own little power management unit board so that you can run off a single battery. There may be schematics on Google to help you out with that.

I have a few different thoughts on battery power and management. The idea with the 4 packs was to give me a total of 15200mAh, and with the Mini-ITX board pulling around 1.7amps, that'll give me a really good long runtime without using the arm. I then thought about just using 3 packs to power the Mini-ITX, and turning the other into a 7.2v pack at 7600mAh to run just the arm (run it straight into the Lynxmotion SSC board I'm planning to use). I also thought about using 12 10000mAh D's or 12 14000mAh F's.. Like I said, I'm a tad stuck. The one thing I'm sure of is I'm going to let the Create handle its own power via the 14.4v pack that's on the underside of him. I did think about maybe hooking up a relay system to toggle the Create over to the Mini-ITX pack so that he could run off of it in case the Create's pack started running low. Another question.. The power supply for the Mini-ITX is rated for 12-25v. I'm planning on 14.4v; I could take it down to 12v by using 10 cells vs 12, but I think 14.4v might be better in the long run.
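A quick sanity check on the capacity arithmetic above, sketched in Python. These are idealized figures; real runtime will be shorter since usable capacity drops with discharge rate and cutoff voltage:

```python
# Idealized runtime estimates for the battery options discussed above.
def runtime_hours(capacity_mah, load_amps):
    """Runtime in hours = capacity (Ah) / draw (A), ignoring losses."""
    return (capacity_mah / 1000.0) / load_amps

# Four 3800 mAh Sub-C packs in parallel feeding the Mini-ITX at ~1.7 A:
print(round(runtime_hours(4 * 3800, 1.7), 1))  # 8.9

# Three packs on the Mini-ITX, the fourth rewired as a 7.2 v / 7600 mAh arm pack:
print(round(runtime_hours(3 * 3800, 1.7), 1))  # 6.7
```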



As for the sonar, I think you could get away with putting them on the bottom platform just in case something shorter than ARGOS doesn't register as an obstacle.

I could do both! :happy: Put 4 on the bottom (1 forward, 1 on each side, 1 in back) and 1 on top facing forward. Would I want the ones on the sides angled or just straight out?


I look forward to seeing more!
I hope to be able to post more about him, just have to figure out these little road blocks!

Thanks, Connor

lunarnexus
03-18-2008, 08:27 AM
Nice bot! I recently started a robot project and was debating between a Lynxmotion crawler and an iRobot Create. I went with the crawler because of the extra "walking fun", but the iRobot Create had some really appealing features, including the docking station that I now have to design and build for the crawler.

I'd really love to see how your charging/power system is put together. I'm using 2x 2600mAh 12V packs in parallel powering a Lynxmotion SSC-32 and a pico-itx PSU (120 watt), both in parallel, and I'm having power issues. I put a voltage regulator in line to drop the voltage to the SSC-32 to 9V, but moving more than one leg at a time reboots everything. I think the answer is to have a separate power rail for the servos (easily done with the SSC-32).

The part of your power system I'm interested in is the charging circuit. I want to be able to "dock" the bot, but I'm not sure how to do it correctly. I'm sure you can't just slap a charger in parallel somewhere and I'm not sure how to correctly isolate the battery.

Thanks,
James

Matt
03-18-2008, 10:38 AM
This is a great design. I never thought of putting a computer on a Create, but it looks like a good fit.

PS: Is that a salad mixing bowl for a head? LOL, Awesome.

Connor
03-18-2008, 07:40 PM
This is a great design. I never thought of putting a computer on a Create, but it looks like a good fit.

Yea, a similar design won a contest. I based mine off of his.. http://www.instructables.com/id/iRobot-Create-Personal-Home-Robot/ Making changes of course. Like I used a pan and tilt camera, I'm using a Phidgets 8/8/8 for functions, planning to add sonar... and my arm will be way more useful!



PS: Is that a salad mixing bowl for a head? LOL, Awesome.

Umm, I'll never say... :happy: Yea, I found them at Wal-Mart; I have a teal colored one also. Was planning on spray painting it white to match the Create, but my wife likes it blue.. What's a guy to do?

Thanks, Connor

Alex
04-23-2008, 03:50 PM
Have any updates on your bot?

Have you considered entering it in our contest (http://www.trossenrobotics.com/contest.aspx)? I could always move this thread over to the Project Showcase forum if you'd like and you can beef up the description to better fit an entry:D

Connor
01-08-2009, 06:56 PM
Here are some new pictures and info on Argos. He now has his compass, 6 sonar sensors, a USB-I2C interface board, a microphone, and charging plates on his front bumper.

Hardware:
Via Epia SN1800G Mini-ITX Main board with 1.8 GHz Processor and 800 MHz FSB
2 GB RAM
Toshiba 120GB SATA Laptop Hard Drive
MiniPCI 802.11b/g WiFi Card
12-25v Micro PSU
iRobot Create
Logitech Orbit AF USB Pan/Tilt Camera
Phidgets 8/8/8 Digital/Analog I/O Board
Phidgets Voltage Sensor
Phidgets Current Sensor
Phidgets Voltage Divider
Devantech CMPS03
5 Maxbotix LV1 Sonars (Front and Sides)
1 Maxbotix LV0 Sonar (Rear)
Devantech USB-I2C converter
RoombaDevTools RooStick (USB to TTL converter) for the iRobot Create
Old CreativeLabs Desktop Microphone
Custom built 5v Regulator
Cheap USB powered speakers (now powered via the custom built 5v Regulator)
X10 Firecracker

Software:
Windows XP SP2 (Manually stripped down of all unnecessary services)
RoboRealm for Vision
Python v2.5
pySerial Python module
Phidgets Python API
pyCreate iRobot Create API
RoboRealm Python API
pyTTS
pywin32_system32
win32com
x10.py
Microsoft TTS
Microsoft Speech SAPI v5.1
nircmd (command line package that lets me toggle the mute on the microphone)
AT&T Voice pack, Mike and Crystal (I'm using Mike)

Current Capabilities:
Voice Recognition based on a specific vocabulary
Text-to-Speech
Sonar Ranging
Compass Readings
Basic movement via Voice commands (Forward 4 units, left 2 units, etc..)
X10 Control of lights in house
Tracking Color with Pan/Tilt
Guard mode (Alerts when he notices movement)
Remote Operation via Terminal Services (With camera feed)

Future Upgrades:
Li-Ion Battery pack and charging system
Possibly IR Sensors on front bumper or lower deck
Path Finding based on A-Star algorithm
5DOF Right Arm

Thanks, Connor

Adrenalynn
01-08-2009, 08:18 PM
He's come a long way the last few months! Grats! Nice job!

I'll be picking your brain on your SONAR array for another couple projects pretty soon. :)

Connor
01-08-2009, 08:42 PM
He's come a long way the last few months! Grats! Nice job!

I'll be picking your brain on your SONAR array for another couple projects pretty soon. :)

Most of the upgrades since last time were done over the holidays. He had been put away for a long time, I just finally got around to working on him again.

As for the sonar, that was a PITA... I completely re-wired it after I first had it in a chain using the Phidgets 8/8/8 with the analog inputs.. It's now wired in a chain using serial output; see this thread: http://forums.trossenrobotics.com/showthread.php?t=885&page=3 I've since been told by Maxbotix that this isn't 100% ideal, because chaining the TX to the RX directly without the BW pin being in pulse mode can introduce noise on the serial output (and you can't put it in pulse mode, because then you don't get your serial data). They told me to put a 10K resistor between each TX and RX instead of a wire to help reduce noise.. I haven't done that, but so far I'm not having any issues with it either.... However, I did have issues trying to do it this way when powered with anything less than 5v (within a 5% margin). I was getting some strange issues with the lead sonar.. Hence the reason I was asking about switching a 5v supply with 3v logic. The other thing you have to look at is timing on the reads and trigger.. If you don't put enough delay between your trigger pulse and your read, then you could have overlapping readings, which DID introduce noise and shift the readings etc etc.. Anyway, it's working now.. and I'm happy.. I just need to get the optocoupler installed so I can have Argos re-calibrate his sonar whenever he needs to.
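For what it's worth, the trigger-then-wait-then-read cycle described above can be sketched with pySerial. The trigger byte, port handling, and settle delay here are placeholder assumptions, not Connor's actual wiring; the frame parser follows the Maxbotix LV serial format of an 'R' followed by the range value and a carriage return:

```python
import time

def parse_maxbotix(frame):
    """Parse one Maxbotix LV serial frame: an 'R', digits, then a CR."""
    frame = frame.strip()
    if frame.startswith(b'R') and frame[1:].isdigit():
        return int(frame[1:])
    return None  # noise or a garbled frame

def read_chain(port, count, settle_s=0.05):
    """Pulse the chain's trigger, wait for the pings to finish, then read
    one frame per sonar. Too short a settle delay overlaps readings and
    shifts/garbles the values, as described above."""
    port.write(b'\x20')          # hypothetical trigger pulse for this setup
    time.sleep(settle_s)
    return [parse_maxbotix(port.readline()) for _ in range(count)]
```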

Thanks, Connor

ooops
01-09-2009, 07:05 AM
Great looking bot! Do you still have aspirations for adding the arm?
Keep the progress and pictures coming!!!

Connor
01-09-2009, 08:58 AM
Great looking bot! Do you still have aspirations for adding the arm?
Keep the progress and pictures coming!!!

Thanks! Oh yea... I just need to save up the money to purchase the servos and servo controller.. I plan on using one of these for the shoulder: http://www.servocity.com/html/top_mount_servo_power_gearboxe.html

Also, still working out the design of the arm, but, I think I've got it figured out.. I just need to send off the drawings and get them cut out on a water table.

Thanks, Connor

ooops
01-09-2009, 09:16 AM
Those servo gearboxes have a pretty impressive torque/$$$ ratio, good find!
I am thinking/working on arms myself lately, and looking for a similar solution, but with all my stuff I tend to build big. Those could work:)

Connor
01-09-2009, 09:29 AM
Those servo gearboxes have a pretty impressive torque/$$$ ratio, good find!
I am thinking/working on arms myself lately, and looking for a similar solution, but with all my stuff I tend to build big. Those could work:)

ServoCity has upgraded those; they used to be made of ABS or something, and they've now gone to aluminum. Also, I don't see the servo I wanted to use.. I was wanting to use the HSR-5990TG servo: 417 oz/inch of torque @ 7.4v. Put that in a 1:5 gearbox, and you get 2085 oz/inch, or around 130 lbs/inch of torque!!! I also contemplated building my own servo using a small DC motor and worm gear setup with a 360 degree pot and some sort of microcontroller that could be used to monitor the pot, stop the motor, and read PWM.. But, I think that's a little out of my league at this point.
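The torque arithmetic above checks out; as a worked example (real gearbox losses ignored):

```python
# Servo torque scales by the gear ratio, minus real-world gearbox losses.
OZ_IN_PER_LB_IN = 16  # 16 oz-in per lb-in

def geared_torque_oz_in(servo_oz_in, ratio):
    return servo_oz_in * ratio

out = geared_torque_oz_in(417, 5)            # HSR-5990TG at 7.4 v in a 1:5 box
print(out, round(out / OZ_IN_PER_LB_IN, 1))  # 2085 130.3
```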

Thanks, Connor

ooops
01-09-2009, 10:12 AM
I also contemplated building my own servo using a small DC motor and worm gear setup with a 360 degree pot and some sort of microcontroller that could be used to monitor the pot, stop the motor, and read PWM.. But, I think that's a little out of my league at this point.

Yes, at some point the projects within the project get overly complicated. Currently on Tramp, I am going with linear actuators since I don't have any idea what (how much weight) might get added to the end of the boom/arm; this way it is pretty much a non-issue. But speed and $$$ make the linear actuators not very efficient.

Connor
01-10-2009, 04:31 PM
Okay, I've been playing around with RoboRealm's Image Match module.. I drove Argos around the house, took a few strategic pictures with his webcam, named them, and then trained those images using the Image Match module. I then quickly bashed out some simple code that breaks the picture names down into meaningful data for Argos.

Examples:
LIVINGROOM-CHAIR.jpg
LIVINGROOM-to-HALL.jpg
HALL-to-LIVINGROOM.jpg

I split the filenames on the '-' and strip the .jpg from them, then run them through a simple translator that lowercases them and puts spaces where they need to be. I ask Argos where he is, he turns on the Image Match module for 1 to 2 seconds, and then reports back where he thinks he's at within a 30% confidence level of the image (anything less and he says he's not sure where he's at).

My next step is to take more pictures of those landmarks from different angles and label them in this way:

LIVING_ROOM-CHAIR-1.jpg
LIVING_ROOM-CHAIR-2.jpg
LIVING_ROOM-to-HALL-1.jpg
LIVING_ROOM-to-HALL-2.jpg
LIVING_ROOM-to-HALL-3.jpg

I'll split on the '-' to give me the 3 or 4 key elements (if it's 3, he's looking at an object; if it's 4 elements, then he's transitioning from one room to another). I'll ignore the number as it's not needed; however, I might be able to use that as well in navigation, since I would be able to determine which picture in the series it is and roughly determine where he's at. I'll also split on the '_' this time around, thus eliminating the need for most of the translator.. However, in some cases I still need to do some translating, as he needs to know when to use the word "the". Example: I'm in THE living room leading to THE hall. vs. I'm in THE hall leading to Connor's office.
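A minimal sketch of that decoding scheme in Python. The phrasing and the blanket "the" handling are my guesses at the translator's behavior; special cases like "Connor's Office" would still need the lookup described above:

```python
import os

def decode(filename):
    """Decode ROOM-LANDMARK-N.jpg or ROOM-to-ROOM-N.jpg into a phrase
    plus the shot number in the series."""
    parts = os.path.splitext(filename)[0].split('-')
    shot = int(parts[-1])                       # picture number in the series
    words = lambda name: name.replace('_', ' ').lower()
    if len(parts) == 3:                         # looking at an object in a room
        return f"I'm in the {words(parts[0])} looking at the {words(parts[1])}", shot
    if len(parts) == 4 and parts[1] == 'to':    # transitioning between rooms
        return f"I'm in the {words(parts[0])} leading to the {words(parts[2])}", shot
    raise ValueError(f"unrecognized filename: {filename}")

print(decode('LIVING_ROOM-to-HALL-3.jpg'))
# ("I'm in the living room leading to the hall", 3)
```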

If anyone has any suggestions on how to improve upon this, or other ideas, let me know!

Thanks, Connor

Adrenalynn
01-10-2009, 05:46 PM
That's awesome! That's just what we were chatting about with landmarks. You nailed it!

Connor
01-10-2009, 05:58 PM
That's awesome! That's just what we were chatting about with landmarks. You nailed it!

I wouldn't necessarily say I've nailed it, but it's a start. He just knows what room he's in and what landmark he's looking at, not WHERE in the room he is, or what relation he is in to the landmark he's looking at. However, hopefully I can improve on this and make him more "aware".

Thanks, Connor..

Amp
01-11-2009, 05:30 AM
Hi Connor,

Really nice robot! It looks like it's coming on well, and with lots of sensors to boot!

You may want to consider adding an Nvidia graphics card and then using CUDA (Nvidia's library for writing your own programs to run on the card) to program it. This would allow some serious number crunching in real-time. It's a really exciting concept that I'd love to try and put on my robot, but I use an embedded micro, so the PCI interface would be hard to set up.

I found a link to a selection of computer vision libraries written with the CUDA library:
http://openvidia.sourceforge.net/papers.shtml#toolkits

The main issue with this would probably be the power consumption. Does anyone know what range of power graphics cards draw?

Nvidia do support a largish range of their cards with CUDA, so you could use a cheap one to save power. The high end cards are said to give you access to 128 parallel thread processors, each with 1024 local registers, plus a whacking load of shared memory in GDDR3/4.

It might be interesting to see what you can do with this. I can envisage fast SIFT (scale invariant feature transform) used for object recognition, and fast stereoscopic depth map calculations. It would also speed up parallel search loops when doing object recognition against a known library, especially if complex volumetric representations were used. It could prove to be a very powerful tool in your robot.

Connor
01-11-2009, 02:00 PM
Hi Connor,

Really nice robot! It looks like it's coming on well, and with lots of sensors to boot!

You may want to consider adding an Nvidia graphics card and then using CUDA (Nvidia's library for writing your own programs to run on the card) to program it. This would allow some serious number crunching in real-time. It's a really exciting concept that I'd love to try and put on my robot, but I use an embedded micro, so the PCI interface would be hard to set up.

I found a link to a selection of computer vision libraries written with the CUDA library:
http://openvidia.sourceforge.net/papers.shtml#toolkits

The main issue with this would probably be the power consumption. Does anyone know what range of power graphics cards draw?

Nvidia do support a largish range of their cards with CUDA, so you could use a cheap one to save power. The high end cards are said to give you access to 128 parallel thread processors, each with 1024 local registers, plus a whacking load of shared memory in GDDR3/4.

It might be interesting to see what you can do with this. I can envisage fast SIFT (scale invariant feature transform) used for object recognition, and fast stereoscopic depth map calculations. It would also speed up parallel search loops when doing object recognition against a known library, especially if complex volumetric representations were used. It could prove to be a very powerful tool in your robot.

This is intriguing and would be really cool to do, however, it's way out of my league. That looks fairly complex and heavy on math and such.. Sounds like something for Adrenalynn to play with.. But.. not me. :)

Thanks, Connor

Adrenalynn
01-11-2009, 02:26 PM
I do a lot of coprocessing on nVidia cards (my excuse for having dual SLI 9800GTX' in several machines since I don't game much [at all]). I totally agree that it is amazing the throughput that they can handle on some tasks. I haven't played that much with CUDA, most of my code is still old-school DirectX hacking...

Actually, someone might want to drop a bug in Stephen's ear about CUDA over at RoboRealm's forum. He's a _great_ guy, always open to suggestions, and obviously really cares about his labor of love. He'd be the right person to implement it, but I wouldn't expect it quickly. CUDA is *very* dense and even more complex. It's a paradigm shift for most of us, and it takes some time just to wrap one's head around it.

Awesome suggestion though!

On a related note, Apple is really pimping OpenCL, and have got the Khronos Group onboard with a huge partner list. Open Computing Language will allow you to write code that looks very similar to C99 and take advantage of the new opening in GPU processing.

I'm not a big fan. The API is hideous. Microsoft decided not to join up and is adding the functionality into DirectX 11. The Linux kernel world is neck deep in porting OpenCL now. The partner list looks like: 3DLABS, Activision Blizzard, AMD, Apple, ARM, Barco, Broadcom, Codeplay, Electronic Arts, Ericsson, Freescale, HI, IBM, Intel, Imagination Technologies, Kestrel Institute, Motorola, Movidia, Nokia, NVIDIA, QNX, RapidMind, Samsung, Seaweed, TAKUMI, Texas Instruments and Umeå University.

You can read more here: http://www.khronos.org/news/press/releases/khronos_launches_heterogeneous_computing_initiative/

Connor
01-11-2009, 04:31 PM
I do a lot of coprocessing on nVidia cards (my excuse for having dual SLI 9800GTX' in several machines since I don't game much [at all]). I totally agree that it is amazing the throughput that they can handle on some tasks. I haven't played that much with CUDA, most of my code is still old-school DirectX hacking...

See, I just knew you would be into that kind of thing... :)

Adrenalynn
01-11-2009, 04:59 PM
Yeah, that actually squeezes in on what I do "for a living" (if you can call it that). My security camera monitoring stuff is all accelerated in GPU, as is decoding. Right now my encoders are still FPGA, but at some point we'll move over to GPU entirely.

Connor
01-12-2009, 12:25 PM
OKay, so here is my next question.. How many pictures do I take of each landmark? Do I need to be consistent with the angles in which I take them?
I had thought about taking "head on" pictures and naming them filename_1.jpg, then slightly off to the left as filename_2.jpg and off to the right as filename_3.jpg.. (evens to the left, odds to the right, in other words, with the exception of filename_1.jpg as stated above). I can then use the #'s from the filename to determine which side of the landmark I'm on, and possibly which way I need to navigate to either clear the landmark or head toward it, depending on the goal. Also, I had thought about taking just general pictures of each room from different places, so I could at least recognize what room I'm in regardless of the landmarks.. But, I'm not sure..

Thanks, Connor

Connor
01-14-2009, 11:02 PM
Update: I decided to experiment with diodes and power today. I put a 3amp rectifier diode inline with Argos's incoming line power, and one on his battery (up till this point, I've had to plug/unplug and shut him down when switching between the two). From everything I've read and tested, this is a safe way to give yourself "battery" backup.. However, in this case, it's an easy way to be able to switch between two power sources. Now, my next question.. Is there a way to calculate battery power remaining based on knowing the current voltage and amperage being used? Also, how would one go about hooking up a charger to this? At this point, I plan on using a commercially available charger for my Li-Ion (when I get it). Argos currently has two charge plates on his front bumper.. I can add a third in the middle and use a common ground.. That way, I can use an external power supply to keep Argos running, and use the charger to charge his battery. Question is, will that work? And how do I disconnect the battery from the diode so that the charger doesn't start dumping power into the main power bus?
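On the "remaining battery power" question: voltage alone is a poor gauge on Li-Ion because the discharge curve is flat through the middle, but a current sensor like the Phidgets one makes coulomb counting possible: integrate the measured draw over time and subtract it from the pack's rated capacity. A rough sketch (the pack size and sampling loop below are illustrative, not Argos's actual numbers):

```python
class CoulombCounter:
    """Track charge used by integrating current-sensor samples over time."""

    def __init__(self, capacity_mah):
        self.capacity_mah = capacity_mah
        self.used_mah = 0.0

    def sample(self, amps, dt_seconds):
        """Record one current reading taken dt_seconds after the last one."""
        self.used_mah += amps * 1000.0 * dt_seconds / 3600.0

    def remaining_fraction(self):
        return max(0.0, 1.0 - self.used_mah / self.capacity_mah)

cc = CoulombCounter(capacity_mah=7600)
for _ in range(3600):            # an hour of once-a-second samples at 1.9 A
    cc.sample(1.9, 1.0)
print(round(cc.remaining_fraction(), 2))  # 0.75
```

In practice you would also re-zero the counter whenever the charger reports a full charge, since the integration error drifts over time.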

Thanks, Connor

lnxfergy
01-15-2009, 09:02 AM
Update: I decided to experiment with diodes and power today. I put a 3amp rectifier diode inline with Argos's incoming line power, and one on his battery (up till this point, I've had to plug/unplug and shut him down when switching between the two). From everything I've read and tested, this is a safe way to give yourself "battery" backup.. However, in this case, it's an easy way to be able to switch between two power sources. Now, my next question.. Is there a way to calculate battery power remaining based on knowing the current voltage and amperage being used? Also, how would one go about hooking up a charger to this? At this point, I plan on using a commercially available charger for my Li-Ion (when I get it). Argos currently has two charge plates on his front bumper.. I can add a third in the middle and use a common ground.. That way, I can use an external power supply to keep Argos running, and use the charger to charge his battery. Question is, will that work? And how do I disconnect the battery from the diode so that the charger doesn't start dumping power into the main power bus?

Thanks, Connor

Connor,

You might want to look at the Open Automaton Project. One of the few things they ever finished was the power supply and automatic recharger....

http://oap.sourceforge.net/

-Fergs

Connor
01-15-2009, 07:18 PM
Connor,

You might want to look at the Open Automaton Project. One of the few things they ever finished was the power supply and automatic recharger....

http://oap.sourceforge.net/

-Fergs

I looked over the design of his base/charger setup. Like me, he decided to go with a separate source for powering the bot while the charger is charging the battery. Looks like he's using microcontrollers on both the base and the bot, with IR to communicate with each other. I don't think I can use IR, as I'm using the home base for the iRobot Roomba, which uses IR to help guide the Roomba/Create into the cradle. I'm afraid it'll interfere with any IR devices I use when it's up close to the base.

He's also using relays; is there not something else that can be used for that?

I do like the fact that he turns the charger and line source on AFTER the bot docks; that ensures the contacts aren't hot while it's trying to dock.. This is something I really would like to do. I had thought about using X10 to do that; however, X10 isn't 100% reliable, and that would require the PC part of the bot to be up and running.

I had a thought a while back.. What about adding in the Command Module for the Create, setting it up so that it can relay commands back and forth between the Create and the PC, but give me some low level intelligence, i.e. an artificial instinct that can kick in if the higher functions don't work? Also, maybe run the sonars through it so it has access to the serial data as well (it could even pulse the serial line if the main CPU died). This would allow me to set up a safeguard and have him seek his home base in case the PC side crashed or its battery was too drained.

Thanks, Connor

Connor
01-16-2009, 12:15 AM
Beginnings of my charging setup.. Let me know what you think.. I've not added in the components to trigger the 2 relays. On the charger side, I'm thinking of using an LED on the iRobot Home Base that lights up when the Create has docked. On the bot side, I'm going to use a digital output from the Phidgets 8/8/8 to trigger it. (Might use one of Phidgets' pre-made relay boards??)

Let me know how this looks; it's been a long time since I've done schematics, and I've never really done them with Visio, always by hand with stencils or freehand.

Connor
03-08-2009, 01:11 PM
This isn't the arm that I designed; it's temporary, but, hey, I had it and made it work! I turned the Vex motors into servos by cannibalizing a 1/4 scale servo I had gotten for the shoulder, and later on I ordered some Hitec HS-5745 1/2 scale digital servo control boards to use for other custom servos. The arm can pick up around 2-4lbs. The gripper is still being worked on; I'm not satisfied with its gripping power.

YouTube - Broadcast Yourself.

Thanks, Connor

kanda
03-09-2009, 04:56 AM
Isn't the Epia too slow to handle all these features, or is it running perfectly?

Nice work anyway ! Would be so nice to see some videos of Argos moving around :-)

Connor
03-09-2009, 09:27 AM
Isn't the Epia too slow to handle all these features, or is it running perfectly?

Nice work anyway ! Would be so nice to see some videos of Argos moving around :-)

Thanks! I hope to get more video up sometime; I've been concentrating on my mech and haven't done anything with Argos in a few weeks.

So far, the 1.8 GHz board is doing okay.. When it's processing video, it can chew up all the CPU depending on what it's doing. That's why I only process the video when the situation calls for it. Example: "Argos, track green". The Python control code then turns on the camera and RoboRealm, and the RoboRealm code then tracks green and moves the pan/tilt. So, in a nutshell, I'm not going to continually process video. Voice recognition, TTS, and everything else work very well and don't bog the machine down very much.

Thanks, Connor

Toymaker
03-09-2009, 12:08 PM
Hi Connor

Nice robot! The new Atom 330 Dual Core CPU looks very good for a robot's onboard PC system. I have left all major processing (speech recognition, TTS, and soon face and object recognition) off-robot with just PICs at the robot end; it was really the vision stuff that forced this. Anyway, keep up the cool work!

Tony

Connor
03-09-2009, 12:26 PM
Hi Connor

Nice robot! The new Atom 330 Dual Core CPU looks very good for a robot's onboard PC system. I have left all major processing (speech recognition, TTS, and soon face and object recognition) off-robot with just PICs at the robot end; it was really the vision stuff that forced this. Anyway, keep up the cool work!

Tony

Thanks!

I wanted mine to be somewhat self sufficient, and with Mini-ITX's available, it just made sense.. I can move video processing off-robot if I want; I had even thought about adding a secondary video processing CPU to mine. I am more interested in that facial recognition software.. I really want Argos to be able to recognize people. RoboRealm doesn't do it yet, so I'm waiting...

Thanks, Connor

kanda
03-10-2009, 03:16 AM
Connor, would you mind giving more information about the product that you purchased (or built) to power the Mini-ITX board? A link or any information would be welcome... Thanks

Connor
03-10-2009, 09:23 AM
Connor, would you mind giving more information about the product that you purchased (or built) to power the Mini-ITX board? A link or any information would be welcome... Thanks

I purchased my pico PSU from Logic Supply. I purchased the 120W Wide Input model.

http://www.logicsupply.com/products/pico120wi_25

Here is a list of all of the PSUs:

http://www.logicsupply.com/categories/power_supplies/power_supplies

You can choose any of the ones ending with WI (wide input); otherwise, you'll still need a 12v regulated supply.

Thanks, Connor

kanda
03-10-2009, 02:30 PM
So basically, is it as simple as plugging the (+) and (-) of your 12V (or more) battery into this small PSU??

Have you measured how many watts Argos needs in the worst case??

That sounds quite nice! Thanks a lot Connor!

Adrenalynn
03-10-2009, 02:33 PM
And a switch. :)

Connor
03-10-2009, 03:35 PM
So basically, is it as simple as plugging the (+) and (-) of your 12V (or more) battery into this small PSU??

Have you measured how many watts Argos needs in the worst case??

That sounds quite nice! Thanks a lot Connor!

Pretty much that simple.. I didn't bother with watts.. I just measure the amperage.. With everything running (sensors, IRs, mobo, camera and laptop HD), it pulls around 1.3amps at 19v and around 2amps at 14.4v, so that's around 25-30 watts.. The 120W PSU is way overkill, but I wanted plenty of extra headroom. I didn't add a switch to the power, just to the mobo.. I probably should do that too. :)

Thanks, Connor

kanda
03-10-2009, 03:43 PM
Thanks Connor, this information is useful!

Hope we can see more videos of Argos soon !!!

Best regards

Connor
10-28-2009, 08:58 PM
Okay, here is some video of Argos wandering around my living room.

YouTube - Argos

kanda
10-29-2009, 04:12 AM
Nice work! I would prefer a different voice, maybe less "human"-like. Anyway, nice robot! Can't wait to see more!

darkback2
10-29-2009, 07:46 AM
Seems like Argos is pretty good at not getting stuck... but maybe that is the editing job.

As for mapping (discussion from yesterday), could you use RFID tags by the various doors, and have Argos follow the wall on the right side all the time? Seems like a place to start, anyway.

DB

Connor
10-29-2009, 10:13 AM
Seems like Argos is pretty good at not getting stuck... but maybe that is the editing job.

He did get stuck twice while doing the video. However, it wasn't his fault.. he got hung up on his arm. He almost got stuck between the couch and coffee table once. He kept going in circles, and as I turned the camera off so I could fix him, he figured it out and got unstuck.


As for mapping (discussion from yesterday), could you use RFID tags by the various doors, and have Argos follow the wall on the right side all the time? Seems like a place to start, anyway.

DB

I've researched RFID.. I've read that people who tried it had mixed results.. The biggest issue is you have to get too close to the tag. People have had far better results with IR beacons, or using vision with RoboRealm and fiducials. However, the end goal is to have him navigating without having to modify the environment for him.

Thanks, Connor

darkback2
10-29-2009, 12:14 PM
What if instead you did a sort of deal where periodically... as in at each node... Argos took a picture, then used that to develop a "fiducial", and later tried to match up that fiducial based on movement records?

DB

Connor
10-29-2009, 04:03 PM
What if instead you did a sort of deal where periodically... as in at each node... Argos took a picture, then used that to develop a "fiducial", and later tried to match up that fiducial based on movement records?

DB

I thought fiducials had to be black and white, or in a white-black box or something... I did do something like this with basic image matching.. but lighting affected it a great deal. It would work for him knowing what room he was in, but how does he navigate safely between rooms and maintain a course? What's needed is both a metric and a topological map: a way to dynamically update the map based on IR, sonar, and bump sensors (SLAM or HIMM), and then a path finding algorithm such as A-Star or Wavefront. But that's where I hit the wall.. I know nothing about metric and topological maps, or how to get Argos to build and/or update them.. and the path finding algorithms, SLAM, and HIMM are just as much a mystery to me. I also need to install my motor controller and encoder into Argos so I can have more accurate data from the encoders. The Create just isn't accurate enough.
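For the path-finding half, A-Star itself is more approachable than it sounds. Here's a minimal sketch on a hand-made occupancy grid (0 = free, 1 = occupied, 4-connected moves, Manhattan distance heuristic); a real map would of course come from the SLAM/HIMM side:

```python
import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]              # (f = g + heuristic, cell)
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                  # walk back to rebuild the path
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if ng < g.get(nxt, float('inf')):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    h = abs(nr - goal[0]) + abs(nc - goal[1])  # Manhattan
                    heapq.heappush(open_set, (ng + h, nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

The same grid structure works for Wavefront too; the difference is that Wavefront floods costs outward from the goal instead of searching toward it.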

Thanks, Connor

lnxfergy
10-29-2009, 04:11 PM
I thought fiducials had to be black and white, or in a white-black box or something... I did do something like this with basic image matching..

A fiducial is simply a marker or point of reference. The key part, though, is that you have to be able to reliably recognize it, hence why many of them are black on white, organic looking things.

-Fergs