
Thread: Grinder skating hexapod robot

  1. #11

    Re: Grinder skating hexapod robot

    Quote Originally Posted by psyclops
    Does the PrimeSense camera have any libraries that you are working with? What language? Looks pretty interesting; I had considered putting a Kinect on the robot, but it's a bit big; this looks like a better bet. I might put a Neato LDS on it too (which would be cool, as I designed the mechanicals and optics on it).
    I'm playing around with the OpenNI SDK and some open-source middleware you can download. I'm still very early in my development. I'm currently finishing up speech integration, using Kurt's code base as a guide.

    Regarding IK, have you looked at Kurt's Phoenix code port? It's C/C++ and has all the IK options we have all come to know and love. :P

  2. #12

    Re: Grinder skating hexapod robot

    I'm developing in C/C++. I looked at OpenCV but it seemed too academic (in the sense of "only just barely working to pass the semester project presentation") to actually use for real.

  3. #13

    Re: Grinder skating hexapod robot

    Did you look at OpenNI? I didn't mention OpenCV.

  4. #14

    Re: Grinder skating hexapod robot

    Nope, I haven't yet looked at OpenNI. I was just reporting what I had looked at :-)

    That being said, I looked at the reference, and it seems to give me only raw device access. I'm already doing that with video4linux. My concern is more with high-level things, like constructing a scene representation from two correlated images, classifying image fields, shape recognition, etc.
    Does OpenNI support plug-in "software sensors" that can provide this kind of functionality? I didn't see anything on the site.
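
    (For context, this is roughly the level of access I mean by "raw": a bare-bones video4linux2 capture sketch. The /dev/video0 path, the frame size, and the YUYV format are just placeholder assumptions, and error handling is mostly omitted.)

    // Bare-bones V4L2 capture sketch: open the device, map one buffer, grab one frame.
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/videodev2.h>
    #include <cstring>
    #include <cstdio>

    int main() {
        int fd = open("/dev/video0", O_RDWR);          // device path is an assumption
        if (fd < 0) { std::perror("open"); return 1; }

        v4l2_format fmt;
        std::memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 640;
        fmt.fmt.pix.height = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;   // whatever the camera offers
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        ioctl(fd, VIDIOC_S_FMT, &fmt);

        v4l2_requestbuffers req;
        std::memset(&req, 0, sizeof(req));
        req.count = 1;
        req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_REQBUFS, &req);

        v4l2_buffer buf;
        std::memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = 0;
        ioctl(fd, VIDIOC_QUERYBUF, &buf);
        void* mem = mmap(nullptr, buf.length, PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, buf.m.offset);

        ioctl(fd, VIDIOC_QBUF, &buf);                  // queue the buffer
        int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ioctl(fd, VIDIOC_STREAMON, &type);             // start streaming

        ioctl(fd, VIDIOC_DQBUF, &buf);                 // blocks until a frame arrives
        std::printf("got %u bytes of raw %ux%u YUYV\n",
                    buf.bytesused, fmt.fmt.pix.width, fmt.fmt.pix.height);
        // ...and this is exactly where the interesting high-level work would have to start.

        ioctl(fd, VIDIOC_STREAMOFF, &type);
        munmap(mem, buf.length);
        close(fd);
        return 0;
    }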

  5. #15

    Re: Grinder skating hexapod robot

    Ah my bad then.

  6. #16

    Re: Grinder skating hexapod robot

    I like the 5-bar design! 4- and 5-bar mechanisms are crazy useful, and I'm surprised I haven't seen more of them in the robotics community.

    OpenNI does support plug-in "middleware" libraries like NiTE2, but most of them involve skeletal tracking or gesture recognition.
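
    To give a flavour, a user/skeleton tracking loop with NiTE2 looks roughly like this. It's sketched from memory of the NiTE2 sample code, so treat the exact class and call names as approximate:

    // Rough NiTE2 user/skeleton tracking sketch, based on the NiTE2 samples.
    #include <NiTE.h>
    #include <cstdio>

    int main() {
        if (nite::NiTE::initialize() != nite::STATUS_OK) return 1;

        nite::UserTracker tracker;
        if (tracker.create() != nite::STATUS_OK) return 1;   // uses the default device

        for (int i = 0; i < 300; ++i) {                       // ~10 seconds of frames
            nite::UserTrackerFrameRef frame;
            if (tracker.readFrame(&frame) != nite::STATUS_OK) continue;

            const nite::Array<nite::UserData>& users = frame.getUsers();
            for (int u = 0; u < users.getSize(); ++u) {
                const nite::UserData& user = users[u];
                if (user.isNew()) {
                    tracker.startSkeletonTracking(user.getId());
                } else if (user.getSkeleton().getState() == nite::SKELETON_TRACKED) {
                    const nite::SkeletonJoint& head =
                        user.getSkeleton().getJoint(nite::JOINT_HEAD);
                    if (head.getPositionConfidence() > 0.5f)
                        std::printf("user %d head at (%.0f, %.0f, %.0f) mm\n",
                                    user.getId(),
                                    head.getPosition().x,
                                    head.getPosition().y,
                                    head.getPosition().z);
                }
            }
        }
        nite::NiTE::shutdown();
        return 0;
    }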

  7. #17

    Re: Grinder skating hexapod robot

    Very nice design, especially the linkages. What's the total weight?

  8. #18

    Re: Grinder skating hexapod robot

    Awesome robot.

    OpenNI is mostly for Kinect/Xtion/PrimeSense devices, but the hope was that other device manufacturers would produce devices and drivers to work with their 'standard' software interface (apparently, the folks at SoftKinetic did not want their TOF camera to play nice with OpenNI). The basic driver/low-level stuff provides a standard method to retrieve depth and registered color images. The NiTE middleware processes the depth and color images to provide skeleton/hand tracking and some gesture recognition, and outputs that through the high-level OpenNI interfaces/functions.

    There is also a completely open-source driver for the Kinect that does mostly the same as the low-level OpenNI drivers, but beyond that it is pretty much roll-your-own (the most basic approach would be to segment objects based on depth data and trim the registered color image to the same area/blob). There was also an OpenNI wrapper for the Kinect4Windows driver, which allows the nicer K4W drivers without the Windows Kinect SDK, but I have not kept up to date on its progress.
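
    Something like this is what I mean by roll-your-own. A rough sketch, assuming the OpenNI 2 C++ API plus OpenCV for the blob work; the 600-1200 mm depth band and the stream setup are arbitrary choices for illustration:

    // Grab a registered depth+color pair via OpenNI 2, segment a depth band,
    // and trim the registered color image to the biggest blob.
    #include <OpenNI.h>
    #include <opencv2/opencv.hpp>
    #include <cstdio>
    #include <vector>

    int main() {
        if (openni::OpenNI::initialize() != openni::STATUS_OK) return 1;

        openni::Device device;
        if (device.open(openni::ANY_DEVICE) != openni::STATUS_OK) return 1;

        openni::VideoStream depth, color;
        depth.create(device, openni::SENSOR_DEPTH);
        color.create(device, openni::SENSOR_COLOR);
        device.setImageRegistrationMode(openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR);
        depth.start();
        color.start();

        openni::VideoFrameRef depthFrame, colorFrame;
        depth.readFrame(&depthFrame);   // blocks until frames arrive
        color.readFrame(&colorFrame);

        // Wrap the raw buffers in cv::Mat headers (no copy).
        cv::Mat depthMat(depthFrame.getHeight(), depthFrame.getWidth(), CV_16UC1,
                         (void*)depthFrame.getData(), depthFrame.getStrideInBytes());
        cv::Mat colorMat(colorFrame.getHeight(), colorFrame.getWidth(), CV_8UC3,
                         (void*)colorFrame.getData(), colorFrame.getStrideInBytes());

        // Keep only pixels in an arbitrary 600-1200 mm band, then take the biggest blob.
        cv::Mat mask;
        cv::inRange(depthMat, cv::Scalar(600), cv::Scalar(1200), mask);
        std::vector<std::vector<cv::Point> > contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

        int best = -1;
        double bestArea = 0;
        for (size_t i = 0; i < contours.size(); ++i) {
            double a = cv::contourArea(contours[i]);
            if (a > bestArea) { bestArea = a; best = (int)i; }
        }
        if (best >= 0) {
            cv::Rect box = cv::boundingRect(contours[best]);
            cv::Mat object = colorMat(box).clone();   // registered color trimmed to the blob
            cv::cvtColor(object, object, cv::COLOR_RGB2BGR);  // OpenNI color is RGB, OpenCV wants BGR
            std::printf("blob at %d,%d  %dx%d\n", box.x, box.y, box.width, box.height);
            cv::imwrite("object.png", object);
        }

        depth.stop(); color.stop();
        depth.destroy(); color.destroy();
        device.close();
        openni::OpenNI::shutdown();
        return 0;
    }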

    OpenCV is still a bit "not great" on stereo depth map generation, but passive stereo matching is pretty difficult even with really high-quality, high-resolution cameras. The 'full-package' PR2 uses an IR projector to help create 'texture' on plain/flat surfaces (as a form of structured lighting) to improve block matching for its two stereo camera pairs (near and far). There was a pretty cool paper from Willow Garage on their texture projector and how it greatly improved their stereo depth maps. The cheaper version just uses a Kinect or another PrimeSense device (lots of support in ROS, easy to use, readily available, and very inexpensive).
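
    For reference, the basic block-matching pipeline in OpenCV boils down to something like this. This is a sketch against the newer cv::StereoBM interface, with already-rectified inputs assumed and tuning values that are just starting points:

    // Minimal OpenCV stereo block-matching sketch.
    #include <opencv2/opencv.hpp>

    int main() {
        cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
        cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
        if (left.empty() || right.empty()) return 1;

        // numDisparities must be a multiple of 16; blockSize must be odd.
        cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(96, 15);
        bm->setUniquenessRatio(10);        // reject ambiguous matches
        bm->setSpeckleWindowSize(100);     // clean up small speckles
        bm->setSpeckleRange(32);

        cv::Mat disp16, disp8;
        bm->compute(left, right, disp16);                     // fixed-point disparity (CV_16S)
        disp16.convertTo(disp8, CV_8U, 255.0 / (96 * 16.0));  // scale for viewing
        cv::imwrite("disparity.png", disp8);
        return 0;
    }

    Even with the speckle filtering, expect big holes on textureless walls and floors, which is exactly why the PR2 projects texture onto the scene.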

    I really should be working on my final class project instead of typing this, but I just cannot seem to stop myself. "Environmental Text Mixture Monte Carlo Localization" - yay... ಠ_ಠ

  9. #19

    Re: Grinder skating hexapod robot

    Does PrimeSense work outdoors now? I'm especially interested in identifying/localizing lawns, concrete paths, trees, picnic tables, and people...

  10. #20

    Re: Grinder skating hexapod robot

    Quote Originally Posted by jwatte
    Does PrimeSense work outdoors now? I'm especially interested in identifying/localizing lawns, concrete paths, trees, picnic tables, and people...
    Since it uses a low-ish power IR projector, not really... It can work in the dark, and in the shade up to a point, but direct sunlight and other IR light sources tend to blind the depth sensor. The PrimeSense sensor is standard structured lighting: it measures the distortion of a projected pattern with a single camera to estimate depth. Without the pattern, it is nothing but a really expensive webcam.

    You probably could try using the B&W IR image and the color image for basic stereo depth map generation, but I'm not sure how well it would work (IIRC, you can get either the depth map or the B&W IR image, but not both - if the usual depth is kaput, why not try stereo with IR?).
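
    If you wanted to try that, grabbing the IR stream instead of the depth stream would look roughly like this. A sketch assuming the OpenNI 2 C++ API and that the IR stream comes out as 16-bit grayscale; check the actual pixel format in practice:

    // Grab the B&W IR image (instead of the depth map) via OpenNI 2,
    // for experimenting with IR+color stereo when sunlight kills the depth sensor.
    #include <OpenNI.h>
    #include <opencv2/opencv.hpp>

    int main() {
        if (openni::OpenNI::initialize() != openni::STATUS_OK) return 1;

        openni::Device device;
        if (device.open(openni::ANY_DEVICE) != openni::STATUS_OK) return 1;

        openni::VideoStream ir;
        if (ir.create(device, openni::SENSOR_IR) != openni::STATUS_OK) return 1;
        ir.start();   // with the depth stream left stopped, per the either/or limitation

        openni::VideoFrameRef frame;
        ir.readFrame(&frame);

        cv::Mat ir16(frame.getHeight(), frame.getWidth(), CV_16UC1,
                     (void*)frame.getData(), frame.getStrideInBytes());
        cv::Mat ir8;
        ir16.convertTo(ir8, CV_8U, 1.0 / 4.0);   // crude scale just for inspection
        cv::imwrite("ir.png", ir8);
        // Calibrate/rectify this against the color camera, then feed both into a
        // stereo matcher (e.g. the StereoBM sketch a few posts up) and see what you get.

        ir.stop(); ir.destroy();
        device.close();
        openni::OpenNI::shutdown();
        return 0;
    }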
