Beta Tester Roll Call



DresnerRobotics
06-03-2015, 02:15 PM
Hey guys,

Would appreciate some feedback on everyone's current status on their robot.

I would say the current goal for people is to validate their build using the ps3_demo.

So where's everyone at? Are we running into any missing or incorrect info in the ps3_demo wiki article?

How is your robot walking? Can your robot walk for 5-10 minutes without falling? Any overheating issues? We have a walk_tuner tutorial and a general walking information article/video in the works, which should hopefully be released by next week. I should also have the next revision of walk tuning published in the next 2 weeks, which should address some efficiency/heat issues (currently only experienced when the robot is heavily loaded or walking for 10+ minutes, in my experience).

A few things to keep in mind:

ps3_demo gives direct control over the walking gait input. Smoother, slower joystick input is going to give you better results than jerking it around. There is some code to prevent rapid acceleration/deceleration due to bad input, but it's not going to solve everything.
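For the curious, that accel/decel guard is conceptually just slew-rate limiting on the commanded walk velocity. A minimal sketch of the idea (illustrative only, not the actual framework code):

def slew_limit(target, current, max_step):
    # Limit how far the commanded velocity may move toward the target per tick
    delta = target - current
    if delta > max_step:
        return current + max_step
    if delta < -max_step:
        return current - max_step
    return target

# Called every control tick with the raw joystick value, e.g.:
# x_cmd = slew_limit(joystick_x, x_cmd, MAX_STEP_PER_TICK)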

The robots are tuned to walk on low-pile carpet (think office carpet). If you are trying to walk on a smooth surface such as linoleum, tables, tile, etc., you will need to add some grip to the feet. I've had success using masking tape for testing, on the inner sole of the foot and on the outside corners. The robots are not going to walk well on medium-to-thick carpet; that's a lot like walking in sinking sand at their scale, and I don't think the AX-12s are physically fast/strong enough to compensate.

We found that some of the Edisons may have shipped with BlueZ 5 instead of our custom BlueZ 4 image. You can either uninstall BlueZ 5 and install BlueZ 4, or keep BlueZ 5 and update your start_edison script to replace the 'killall bluetoothd' line with 'hciconfig hci0 up'. You will still need to issue an 'rfkill unblock bluetooth' command at startup regardless. More info on this is in the wiki.

Would love any and all feedback. If you are stuck or are having issues getting your robot to walk, let's talk and get you a solution!

FYI- I am available via Google Hangouts if that's easier for you guys. PM/email me your gmail address and I will add you to my contacts. andrew (at) trossenrobotics.com

bstag
06-03-2015, 03:20 PM
I have it walking fine on tile. It does not like my spongy carpet. I have the feet taped to walk on tile. I am currently working on adding sound for talking and a microphone for responding to voice commands. I still have not printed out the shells, as I will be modifying them to include the speaker, mic, and a small video screen in the chest or back.

LloydF
06-06-2015, 10:56 AM
Hi, I have it fully skinned and am playing with vision on the RPi 2; thinking about maybe going over to an ODROID-U3, not sure yet. Just got ubuntu-mate running with the HROS1-Framework, and guvcview works as well, yay! It seems that no matter what I do, at least one more USB port is always needed, so I attached a 4-port hub, and it seems to fit in pretty nicely. I'll put up a picture.
(Notice the mic for speech :rolleyes:)

KurtEck
06-06-2015, 11:06 AM
I am a bit embarrassed :o here. I have mine assembled and have had it move some, but I've had too many diversions lately, including disassembling my PhantomX and being in the process of figuring out how to put all the parts back into one piece...

But hopefully I will soon be giving the HR-OS1 more attention.

jveejay
06-11-2015, 07:37 AM
Hi, I have it fully skinned and am playing with vision on the RPi 2; thinking about maybe going over to an ODROID-U3, not sure yet. Just got ubuntu-mate running with the HROS1-Framework, and guvcview works as well, yay! It seems that no matter what I do, at least one more USB port is always needed, so I attached a 4-port hub, and it seems to fit in pretty nicely. I'll put up a picture.
(Notice the mic for speech :rolleyes:)

Looks great! Wow, you really managed to get it to walk with that USB hub, huh?!

LloydF
06-11-2015, 09:37 AM
Yes, I hope they will come out with an internal hub that we can bolt down, but if not, I can live with very good double-sided tape for now.
:wink:

AlSierra
06-13-2015, 09:30 PM
I finally got mine fully set up using the Raspberry Pi. I had some issues with it leaning/falling forward, but then I realized it was balanced for use with the battery (I hadn't added that; I was just using a power cable). Once I put that in, it was able to stand up straight.

I finally got the hang of the different poses with the rme, but I didn't back up the original file, so I will need to pull it down again from the git repository.

I tried the ps3_demo briefly and it seemed to work just fine. I hope to have more time to work on it in the next few weeks. I just downloaded the STL files for the Orion-Exo, so I will try a quick hand print. =)

LloydF
06-16-2015, 09:06 AM
Is it me or does it feel like only half a dozen or so kits got built? So where did the rest go?

KurtEck
06-16-2015, 09:40 AM
Some may be like me :o, with theirs built and sitting right next to the monitor, but I got distracted with finally trying to get a slight grip on ROS and am in the process of making my PhantomX work using ROS, with the help of r3n33 and KevinO...

But I should soon be getting back to the HR-OS1... I also need to figure out how I can best contribute to this. For me, I am more interested in helping out with the architecture and the like than in working or playing with poses. Should be fun!

I am actually hoping that my current diversion will help out in the long run, as I can imagine that some/all of this may migrate to something like ROS...

LloydF
06-16-2015, 03:31 PM
The only black hole so far in the HR-OS1 is the vision.

DresnerRobotics
06-16-2015, 07:28 PM
Is it me or does it feel like only half a dozen or so kits got built? So where did the rest go?

The forums don't accurately represent the number of robots in the wild. 98% of our customers never touch these forums. :) I know of about two dozen that are actively being used between two universities alone, none of which are being posted about on here. Plenty of other lone wolves out there doing their own thing.

DresnerRobotics
06-16-2015, 07:29 PM
The only black hole so far in the HR-OS1 is the vision.

This is mostly due to the nature of the vision included in the Darwin package. It doesn't really do a whole lot by itself, blob tracking is a neat party trick but isn't useful without localization/navigation of some sort. We're looking at other options during development to add more features on the vision side of things, but that will come once the physical hardware and control are in a more final state.
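For context, the blob tracking in question boils down to color thresholding plus centroid finding. A toy OpenCV equivalent (not the framework's actual C++ code):

import cv2

def find_blob_center(frame_bgr, lower_hsv, upper_hsv):
    # Threshold in HSV and return the centroid of the matching pixels, or None
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

Handy for pointing the head at a ball, but it tells the robot nothing about where it is, which is why localization/navigation is the missing piece.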

LloydF
06-17-2015, 08:49 AM
Cool, I sorta thought some progressive schools would be stashing these little jewels away, but I do not think they will help with SLAM or anything else. I do like the fact that you want this to be (the best robot ever!), and I'm here to help if I can :veryhappy:.

tician
06-17-2015, 09:03 AM
Hmm, reminds me of yet another project I should try to finish: Environmental Text Mixture Monte Carlo Localization. It was supposed to use modified text_detect code from the 'literate pr2' project to extract environmental text from a webcam + distance sensor, or just a Kinect, and then use the text size and wording (door numbers, emergency exit signs, lab/professor names, project posters, fliers, etc.) to map and localize within a research/office building. I still really wish I had finished it in time for class project presentations two years ago...
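The localization half was going to be plain Monte Carlo Localization with recognized text strings as range landmarks; the measurement update is conceptually something like this (rough sketch, assuming a pre-built map of label -> position and a range estimated from the apparent text height):

import math
import random

def mcl_measurement_update(particles, label, measured_dist, landmarks, sigma=0.5):
    # particles: list of (x, y, theta) poses; landmarks: {'ROOM 214': (x, y), ...}
    if label not in landmarks:
        return particles  # unrecognized sign carries no information
    lx, ly = landmarks[label]
    weights = [math.exp(-((math.hypot(lx - x, ly - y) - measured_dist) ** 2)
                        / (2 * sigma ** 2))
               for (x, y, theta) in particles]
    if sum(weights) == 0:
        return particles  # all weights underflowed; keep the old particle set
    # Resample with replacement, proportional to weight
    return random.choices(particles, weights=weights, k=len(particles))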

LloydF
06-17-2015, 10:37 AM
Hmmm, visual SLAM seems the way to go, unless someone makes a smaller, lighter laser LIDAR that can run 24/7; almost all are too big and heavy for this little guy. Your LIDAR-Lite might work out if ROS 2 and Windows 10 pan out this year. Wouldn't next month be oh-so-nice timing.

jveejay
06-17-2015, 10:48 AM
This is mostly due to the nature of the vision included in the Darwin package. It doesn't really do a whole lot by itself, blob tracking is a neat party trick but isn't useful without localization/navigation of some sort. We're looking at other options during development to add more features on the vision side of things, but that will come once the physical hardware and control are in a more final state.


The only black hole so far in the HR-OS1 is the vision.

I've got vision processing through the PiCamera. With the Edison it was a black hole; I had no idea what to do, so I took it out and went with the Pi. So now Harvey is able to detect faces. But I've been using Python and doing OpenCV-based vision tracking instead of relying on the Darwin framework.
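The core of it is the standard Haar cascade recipe, something like this (simplified sketch; the cascade file path depends on your OpenCV install):

import cv2

cascade = cv2.CascadeClassifier("haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # PiCamera shows up as /dev/video0 with the V4L2 driver

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break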

Is anyone else doing the same?

Overall, I have the robot built and am working on it slowly, amongst the floods in Austin and my other research work. Unfortunately, he took a face plant recently that broke his neck plate, but the Trossen folks were superb and sent me replacement parts immediately. I have to fix his neck before I can get his motion engine back on. After that, it's coupling the motion with vision using Python/JavaScript.

LloydF
06-17-2015, 11:04 AM
I am finally having some success with PocketSphinx and voice recognition. I just need to figure out some sort of speaker arrangement for this guy; I did find a 90-degree audio plug in my parts bin ;-)
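If anyone else wants to play with it, the Python bindings boil the basic loop down to almost nothing (sketch, assuming the default US English model bundled with the pocketsphinx package):

from pocketsphinx import LiveSpeech

# Decode continuously from the default microphone using the bundled model
for phrase in LiveSpeech():
    print("heard:", phrase)

From there it's just matching the decoded text against your command words.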

Weebo
06-17-2015, 02:45 PM
We've already done some work with OpenCV and added some interactions, so now he is able to detect faces and say hello using espeak, and then offers to shake hands. We also added a push button using the GPIO pins so we can interact with him.
The problem was that the frame rate was super slow, one frame every 2-3 seconds, and I haven't worked on RPi optimizations yet, since I jumped to working on kinematics to make him able to write letters on a plain board. So far, no luck.
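The interaction glue itself is simple; a trimmed-down sketch of the idea (the pin number is just an example, and the face detection part is omitted):

import subprocess
import RPi.GPIO as GPIO

BUTTON_PIN = 17  # example BCM pin; use whichever GPIO your button is wired to

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def say(text):
    # Speak through the robot's speaker via espeak
    subprocess.call(["espeak", text])

say("Hello! Would you like to shake hands?")
# Wait for the visitor to press the button (falling edge, internal pull-up)
GPIO.wait_for_edge(BUTTON_PIN, GPIO.FALLING)
say("Nice to meet you!")
GPIO.cleanup()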

LloydF
06-17-2015, 05:52 PM
Neat, any chance of getting some help with Raspberry Pi 2 drivers for the XtionPro Live? It is detected but needs ARM drivers. :cool:

KevinO
06-17-2015, 08:20 PM
Neat, any chance of getting some help with Raspberry Pi 2 drivers for the XtionPro Live? It is detected but needs ARM drivers. :cool:

The Xtion is basically a PrimeSense sensor. Do a quick search for OpenNI2; that is what you need to see the color, IR, and depth. I used it on several projects on the RPi.
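If you go that route, the Python bindings from the primesense package get you depth frames in a few lines (sketch; assumes the OpenNI2 runtime is installed and built for ARM):

from primesense import openni2

openni2.initialize()             # finds the OpenNI2 runtime libraries
dev = openni2.Device.open_any()  # the Xtion enumerates as a PrimeSense device
depth = dev.create_depth_stream()
depth.start()

frame = depth.read_frame()       # use frame.get_buffer_as_uint16() for the raw data
print("depth frame: %dx%d" % (frame.width, frame.height))

depth.stop()
openni2.unload()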

LloydF
06-17-2015, 09:01 PM
OK, but I'm going back to his camera; LOL, need to walk before I run :happy:. The Xtion is ideal, but it is just a hair too big for this project. :sad:

DresnerRobotics
06-17-2015, 09:04 PM
OK, but I'm going back to his camera; LOL, need to walk before I run :happy:. The Xtion is ideal, but it is just a hair too big for this project. :sad:


https://software.intel.com/en-us/realsense/devkit

Specifications

- Longer range (up to 3-4 meters indoors, longer range outdoors) (https://software.intel.com/en-us/articles/intel-realsense-data-ranges)
- Depth/IR: 640x480 resolution at 60fps
- RGB: 1080p at 30fps
- USB 3.0 required
- Developer Kit Dimensions: 130mm x 20mm x 7mm

Smaller once you remove the enclosure, too :)
The challenge will be finding a compact enough SBC that has USB 3.0 and enough CPU to do anything with the data. The ODROID-XU3 or MinnowBoard MAX are at the top of my 'to test' list.

LloydF
06-18-2015, 08:40 AM
Sweet:veryhappy:

jveejay
06-19-2015, 09:38 AM
Sweet:veryhappy:

Indeed -- that's what I'm picking up today from Intel ;)

Kempelen
10-14-2015, 08:22 PM
Hey,

just a status update on my HR-OS1 from the 2nd batch.

Assembled it in about 1 day using an RPi 2, and everything works fine (I haven't tested the USB cam yet, though). The PS3 demo works flawlessly: the robot doesn't fall down and can walk both forwards and backwards at full speed, including taking turns. So overall I am pretty happy :-) It's a very good start!

There were some small issues with the assembly guide which I communicated back so maybe they will be fixed at some point.

In addition, I glued (with double-sided tape) a small 1 W speaker on the top/front of actuator No. 19 so it looks like a mouth (and is audible enough for me to recognize words clearly with a maxed-out alsamixer), put a very small USB sound card, a microphone, and a 4-port USB hub inside the chest, and installed some voice synthesis. I hope to get CMU Sphinx running and recognizing commands; then I can hopefully extend ps3_demo and control the robot via voice (if the microphone is able to "denoise" actuator movements).

With the camera, I hope to port my JavaScript-based CLM tracker (for tracking facial features) to C, make the robot's head mirror the movement of my head, and maybe teach it to recognize the people around it with some added SVM. Later, if I have a lot of time, maybe a PGM for recognizing objects it points the camera at would be possible, and autonomous walking without additional sensors (possibly an RPi 3 or Core M stick would be needed for that).
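Conceptually, the mirroring part should just be proportional control on the tracked face offset; in Python-ish pseudocode (set_head_angles is a made-up placeholder for whatever head API the framework exposes):

FRAME_W, FRAME_H = 320, 240  # capture resolution
GAIN = 0.05                  # degrees of head motion per pixel of error

pan_deg, tilt_deg = 0.0, 0.0

def set_head_angles(pan, tilt):
    # Placeholder: send pan/tilt targets to the head servos
    print("head -> pan %.1f deg, tilt %.1f deg" % (pan, tilt))

def track_face(face_x, face_y):
    global pan_deg, tilt_deg
    err_x = face_x - FRAME_W / 2   # face offset from image center, in pixels
    err_y = face_y - FRAME_H / 2
    pan_deg -= GAIN * err_x        # turn toward the face
    tilt_deg -= GAIN * err_y
    set_head_angles(pan_deg, tilt_deg)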

I am currently taking MITx's 6.832x course on underactuated robotics and hope to understand it well enough to use it for programming some clever movements once I familiarize myself with the techniques and the robot's API.

There are more crazy ideas, like testing out the Parallella board for much faster image processing, playing with DNNs, and testing out algorithmic ideas in the real world.

Thanks for a very nice toy! 8-)