Results 21 to 30 of 49

Thread: Autonomous robot's navigation

  1. #21

    Re: Autonomous robot's navigation

    Awesome work, EDV, so interesting. Keep up the good work.
    People yearn after this robotic dream, but you can't strip your life of all meaning, emotion and feeling and expect to function.

  2. #22

    Re: Autonomous robot's navigation

    Thanks, I will keep pushing further, and maybe one day machines will see the light and the T-1000 will cease to be a fantasy

  3. #23

    Re: Autonomous robot's navigation

    The first iteration of porting the AVM v0.5 SDK to C# is done:

    This package uses the C# port of OpenCV v2.0 provided by the "Emgu CV" library.

    Emgu CV download

    About OpenCV v2.0

    OpenCV v2.0 download

    The test video ".\bin\RcgTraining4s.avi" uses the XviD codec.
    Make sure that it is installed on your PC.
    Last edited by EDV; 02-19-2010 at 03:11 AM.

  4. #24

    Re: Autonomous robot's navigation

    Awesome, I love seeing this type of work; it may help me understand your previous work better! Thanks! +rep for sharing!

  5. #25

    Re: Autonomous robot's navigation

    I made a more convenient implementation of the AVM algorithm for C#, and I also prepared detailed documentation, "Library AVM SDK simple.NET".

    Just print the file .\bin\Go.png and show it to the camera for recognition in example 2.
    Last edited by EDV; 08-05-2010 at 01:26 AM.

  6. #26

    Re: Autonomous robot's navigation

    The RoboRealm company has begun distributing the "Navigator" plugin within the RoboRealm software.

  7. #27

    Re: Autonomous robot's navigation

    Dr. Bruce shared his successful experience with AVM Navigator plugin:

    Leaf can look for a fiducial. Then he can calculate angle and distance using x-rotation and size.
    Next, because he knows where in the room the fiducial is located, he can apply some trigonometry
    to calculate his exact pose in the room (X Y Orientation). He then goes to a specific place
    in the room about 3 feet in front of the door. Next, switching to AVM, he can navigate through
    the doorway successfully!

    In fact, this was so successful, I then decided to have him try the hallway. He goes down the hallway
    stopping when his sonar sensors detect the wall. I then have him programmed to turn clockwise 90 degrees.
    And he then finishes by going down a short narrow irregular hallway (previously a difficult challenge).

    Here are links to 2 videos:

    I have more to do - but this is a great start...
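    The trigonometry Dr. Bruce describes (bearing from the fiducial's x-position in the image, distance from its apparent size, fiducial at a known room location) can be sketched as follows. This is a hypothetical reconstruction, not Leaf's actual code: the function name, the sign conventions, and the assumption that the fiducial's facing direction in the room is known are all mine.

```python
import math

def robot_pose(fid_x, fid_y, fid_facing, x_rotation, bearing, distance):
    """Estimate robot pose (X, Y, orientation) from one known fiducial.

    fid_x, fid_y -- fiducial's known room coordinates
    fid_facing   -- world angle of the fiducial's outward normal (radians)
    x_rotation   -- fiducial's yaw relative to the camera axis (radians),
                    recovered from the marker's perspective distortion
    bearing      -- angle of the fiducial off the camera axis (radians),
                    recovered from its x-position in the image
    distance     -- range to the fiducial, recovered from its apparent size
    """
    # World direction from the fiducial out toward the robot.
    out = fid_facing + x_rotation
    x = fid_x + distance * math.cos(out)
    y = fid_y + distance * math.sin(out)
    # The robot looks back along 'out', offset by the in-image bearing.
    orientation = (out + math.pi - bearing) % (2 * math.pi)
    return x, y, orientation
```

    With the pose known, driving to a goal such as "3 feet in front of the door" reduces to simple vector arithmetic before handing control over to AVM for the doorway itself.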

  8. #28

    Re: Autonomous robot's navigation

    I am now developing a new mode for the "AVM Navigator" plugin that will be completely autonomous. This mode will be named "Walking mode", and in it the robot will be able to explore an unknown room environment.

    And the first stage of this development (obstacle avoidance) has now been passed successfully:

    Visual odometry and autonomous mapping will follow...

  9. #29

    Re: Autonomous robot's navigation

    -= Walking mode - how does it work? =-
    *Obstacle avoidance

    1. Scan for vertical contours (the legs of chairs, corners of walls, doors and jambs, etc.).
    If there are vertical contours in front of the robot, they are treated as an obstacle
    (yellow indicator in the form of a trapezoid), and the robot begins to quietly go around them.

    2. The camera has infrared illumination (a headlight), and if the robot comes close to a wall,
    the lamp produces illumination in the form of white spots on the screen.
    Accordingly, if a large white spot appears in the center of the screen, the robot
    rolls back while writing the obstacle image to AVM (this case is indicated by a large
    red crossed rectangle). Marked obstructions (which are stored in AVM) are indicated
    by blue rectangles with the inscription "Obstacle". If the robot sees a marked obstacle
    in front of it, it simply goes around it at a distance (this avoids having to approach
    the wall a second time).

    3. The robot compares the input image against its motion history from one second earlier.
    If the robot was given the command "forward" but this caused no change in the input image
    (nothing happens), then the robot is stuck (its front end has run into the wall). The robot
    then commands itself backwards while writing the obstacle to AVM (which can prevent a
    collision with the wall a second time). This case is indicated by a red rectangle with
    a circle in the center.
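    Step 1 above, treating a strong vertical contour ahead as an obstacle, can be sketched with a simple column-wise gradient test. This is an illustrative stand-in for whatever edge detector AVM actually uses; the function name and thresholds are invented.

```python
def has_vertical_contour(gray, grad_thresh=40, min_cover=0.8):
    """Return True if some image column contains a strong, nearly
    full-height horizontal intensity step, i.e. a vertical edge such
    as a chair leg, wall corner, or door jamb.

    gray -- 2D list of grayscale pixel values (rows of ints)
    """
    rows, cols = len(gray), len(gray[0])
    for x in range(cols - 1):
        # Count rows where the step from column x to x+1 is strong.
        strong = sum(1 for y in range(rows)
                     if abs(gray[y][x + 1] - gray[y][x]) >= grad_thresh)
        if strong >= min_cover * rows:
            return True
    return False
```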
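    Step 3, the "robot got stuck" test, amounts to comparing the current frame with the one from a second earlier while a forward command is active. A minimal sketch, with an invented mean-absolute-difference threshold:

```python
def is_stuck(frame_now, frame_prev, forward_commanded, diff_thresh=5.0):
    """Return True if 'forward' is being commanded but the image has
    barely changed over the last second, i.e. the robot's front end
    is pressed against a wall.

    frame_now, frame_prev -- same-size 2D lists of grayscale values
    """
    if not forward_commanded:
        return False
    rows, cols = len(frame_now), len(frame_now[0])
    total = sum(abs(frame_now[y][x] - frame_prev[y][x])
                for y in range(rows) for x in range(cols))
    return total / (rows * cols) < diff_thresh
```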


    The robot sets marks (it writes the central part of the screen image, with associated data,
    to the AVM tree). The marker data (inside AVM) contains the horizon angle (azimuth), the path
    length from the start, and the X,Y position (relative to the start position). The marker data
    is derived from mark tracking (horizontal shift for azimuth, and change of mark scale for
    path-length measurement). Generalizing over all marks recognized in the input image gives the
    current azimuth and path length. Given the motion direction, the path length from the previous
    position, and the x,y coordinates of the previous position, we can calculate the coordinates
    of the current position. This information is written into each new mark (inside AVM) when it
    is created, and so forth.

    The set of marks inside AVM forms a map of locations (the robot sees the marks and recognizes its location).
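    The position update described above (known heading from mark tracking, known path length since the previous mark, known previous coordinates) is plain dead reckoning. A sketch, with the names and the degree convention assumed by me rather than taken from AVM:

```python
import math

def next_position(x_prev, y_prev, azimuth_deg, path_delta):
    """Advance an (x, y) estimate by path_delta along azimuth_deg.

    azimuth_deg -- heading from mark tracking (0 = +x axis, CCW positive)
    path_delta  -- path length travelled since the previous mark,
                   from the change in mark scale
    """
    a = math.radians(azimuth_deg)
    return (x_prev + path_delta * math.cos(a),
            y_prev + path_delta * math.sin(a))
```

    Each new mark stores the result, so the chain of marks doubles as the map.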

    I finished upgrading my robot (it is now a nettop + motor/servo controller + local camera + tracked platform):

    And this allows me to proceed with developing "Walking mode".

    >> What is this kaleidoscope in the center of the screen?

    This is an indicator of marker density on the map (the gradation from blue to red means:
    blue - no markers in that direction, red - the maximum density of markers in that direction).
    It works like a radar, indicating the ratio between the numbers of markers per sector in each
    direction, in increments of 5 degrees (72 partitions in a circle). The "blue arrow" shows the
    direction of least marker density, and the robot will try to turn in that direction to explore
    new, uncharted territory.
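    The 72-sector density radar can be sketched as a simple circular histogram. The real plugin's weighting and tie-breaking are unknown to me, so treat the bin layout and the "first emptiest sector wins" rule here as assumptions.

```python
def least_density_direction(marker_bearings_deg, sectors=72):
    """Bin marker bearings into 5-degree sectors (72 per circle) and
    return the center bearing of the emptiest sector, i.e. the 'blue
    arrow' direction toward unexplored territory."""
    width = 360 // sectors                  # 5 degrees per sector
    counts = [0] * sectors
    for b in marker_bearings_deg:
        counts[int(b % 360) // width] += 1
    emptiest = min(range(sectors), key=lambda i: counts[i])
    return emptiest * width + width / 2     # sector center bearing
```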

    >> What's going on there?

    The robot looks around and marks the territory with markers along an arc, then moves forward
    and inspects again, and so on. If it sees an obstacle, the robot marks the corresponding
    marker on the map as "Obstacle". From then on, the robot, guided by the map marker, will not
    touch the obstacle again (red light in the center of the green ellipse).

  10. #30

    Re: Autonomous robot's navigation

    Especially to improve the navigation algorithm in "Walking mode", a mod for Quake 3 was
    developed that provides environment emulation and can be used instead of a real robot.
