View Full Version : [Contest Entry] The eyeRobot: Robot Blind Aid

02-10-2008, 12:54 PM
I have had the idea of building a robot blind aid floating around in my head for a while; it is best described as a cross between a white cane and a seeing-eye dog, trying to integrate the best of both worlds through robotics. When I was offered a free Roomba Create by a New York Times reporter and found out about this contest, it seemed the perfect way to prototype the project using the Roomba and get some money for building a full-scale model. Having to use the Roomba as a base was rather constraining; I prefer to build my own robots from scratch, but the sensor arrays I wanted for the real-world version aren't cheap. Unfortunately, other robots entered the contest (who could have predicted that?) and I didn't win. I ended up about breaking even, so I'm still looking for the money to bring it to the next level. Enjoy this lengthy walkthrough:

Using the iRobot Roomba Create, I have prototyped a device called eyeRobot. It will guide blind and visually impaired users through cluttered and populated environments by using the Roomba as a base to marry the simplicity of the traditional white cane with the instincts of a seeing-eye dog. The user indicates his/her desired motion by intuitively pushing on and twisting the handle. The robot takes this information and finds a clear path down a hallway or across a room, using sonar to steer the user in a suitable direction around static and dynamic obstacles. The user then follows behind the robot as it guides the user in the desired direction by the noticeable force felt through the handle. This robotic option requires little training: push to go, pull to stop, twist to turn. The foresight the rangefinders provide is similar to a seeing eye dog, and is a considerable advantage over the constant trial and error that marks the use of the white cane. Yet eyeRobot still provides a much cheaper alternative than guide dogs, which cost over $12,000 and are useful for only 5 years, while the prototype was built for well under $400. It is also a relatively simple machine, requiring a few inexpensive sensors, various potentiometers, some hardware, and of course, a Roomba Create.

Video Demonstration

High Quality Version (http://www.barshaylego.com/eyeRobot.zip)

Operation Overview
User Control:
The operation of eyeRobot is designed to be as intuitive as possible, to greatly reduce or eliminate training. To begin moving, the user simply starts walking forward; a linear sensor at the base of the stick picks up this motion and starts the robot moving forward. Using this linear sensor, the robot then matches its speed to the desired speed of the user: eyeRobot will move as fast as the user wants to go. To indicate that a turn is desired, the user simply twists the handle, and if a turn is possible, the robot will respond accordingly.
Robot Navigation:
When traveling in open space, eyeRobot will attempt to keep a straight path, detecting any obstacle that may impede the user, and guiding the user around that object and back onto the original path. In practice the user can naturally follow behind the robot with little conscious thought.
To navigate a hallway, the user should push the robot into one of the walls on either side; upon acquiring a wall, the robot will begin to follow it, guiding the user down the hallway. When an intersection is reached, the user will feel the robot begin to turn and can choose, by twisting the handle, whether to turn down the new offshoot or continue on a straight path. In this way the robot is very much like the white cane: the user can feel the environment with the robot and use this information for global navigation.

Range Sensors
Ultrasonic Rangefinders:
The eyeRobot carries four ultrasonic rangefinders (MaxSonar EZ1), positioned in an arc at the front of the robot to provide information about objects in front of and to the sides of the robot. They inform the robot of the range to an object and help it find an open route around that object and back onto its original path.
IR Rangefinders:
The eyeRobot also carries two IR sensors (GP2Y0A02YK). The IR rangefinders are positioned to face out 90 degrees to the right and left to aid the robot in wall following. They can also alert the robot to objects too close to its sides that the user might walk into.
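As an aside, the GP2Y0A02YK's analog output falls off roughly inversely with distance over its rated 20-150 cm range. A minimal sketch of the conversion (in Python for illustration; the scale constant is an assumption, not a datasheet value, and would need calibration against known distances):

```python
# Sketch: converting a Sharp GP2Y0A02YK analog reading to distance.
# The output voltage falls off roughly as 1/distance over the sensor's
# rated range. K_CM is an illustrative constant, not from the datasheet
# or the eyeRobot code -- calibrate it against known distances.

K_CM = 60.0                 # illustrative scale constant
MIN_CM, MAX_CM = 20, 150    # rated range of the GP2Y0A02YK

def ir_distance_cm(voltage: float) -> float:
    """Approximate distance (cm) from the sensor's output voltage."""
    if voltage <= 0.1:                      # below noise floor: nothing in range
        return MAX_CM
    d = K_CM / voltage                      # inverse-response approximation
    return max(MIN_CM, min(MAX_CM, d))      # clamp to the rated range
```

Readings outside the rated range are clamped rather than trusted, since the sensor's response curve folds back below 20 cm.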

Cane Position Sensors
Linear Sensor:
In order for the eyeRobot to match its speed to that of the user, the eyeRobot senses whether the user is pushing or retarding its forward motion. This is achieved by sliding the base of the cane along a track, while a potentiometer senses the cane's position. The eyeRobot uses this input to regulate the speed of the robot. The idea of the eyeRobot adapting to the speed of the user through a linear sensor was actually inspired by the family lawnmower.
The base of the cane is connected to a guide block moving along a rail. Attached to the guide block is a slide potentiometer that reads the position of the guide block and reports it to the processor. In order to allow the stick to rotate relative to the robot there is a rod running up through a block of wood, forming a rotating bearing. This bearing is then attached to a hinge to allow the stick to adjust to the height of the user.
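The speed-matching idea above can be sketched as a simple proportional map from pot displacement to a speed command (a Python illustration; the constants and names are my own assumptions, not the actual ZBasic code):

```python
# Sketch of speed matching from the slide potentiometer. POT_CENTER is
# the ADC reading with the cane at the middle of its track; GAIN converts
# displacement into a speed command. Both values are illustrative.

POT_CENTER = 512    # ADC reading at the track midpoint (10-bit ADC assumed)
GAIN = 2.0          # mm/s of speed per ADC count of displacement
MAX_SPEED = 500     # Roomba Create's top speed in mm/s

def speed_from_pot(pot_reading: int) -> float:
    """Map cane displacement to forward speed: push ahead -> speed up,
    drag behind -> slow down, centered -> no change demanded."""
    displacement = pot_reading - POT_CENTER     # positive when pushed forward
    speed = GAIN * displacement
    return max(-MAX_SPEED, min(MAX_SPEED, speed))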
Twist Sensor:
The twist sensor allows the user to twist the handle to turn the robot. A potentiometer is attached to the end of the wooden shaft, and its knob is inserted and glued into the upper part of the handle. The wires run down the dowel and feed the twist information to the processor.


Processor
The robot is controlled by a ZBasic ZX-24a sitting on a Robodyssey Advanced Motherboard II. The processor was chosen for its speed, ease of use, affordable cost, and eight analog inputs. It is connected to a large prototyping breadboard to allow quick and easy changes. All power for the robot comes from the power supply on the motherboard. The ZBasic communicates with the Roomba through the cargo bay port and has full control over the Roomba's sensors and motors.
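For reference, the Create's published Open Interface is a simple serial byte protocol, so "full control over the Roomba's sensors and motors" amounts to sending packets like the Drive command below. This is a Python sketch (the firmware was ZBasic) with the actual serial plumbing omitted; the opcodes are from the published Create Open Interface specification:

```python
# Sketch: building an iRobot Create Open Interface "Drive" packet.
# Opcodes per the published OI spec; serial transmission is omitted,
# we only construct the byte string that would go out the cargo bay port.

import struct

OI_START, OI_FULL, OI_DRIVE = 128, 132, 137
STRAIGHT = -0x8000   # radius 0x8000 means "drive straight" in the OI

def drive_packet(velocity_mm_s: int, radius_mm: int) -> bytes:
    """Drive command: opcode 137 followed by signed 16-bit big-endian
    velocity (mm/s) and turn radius (mm)."""
    return bytes([OI_DRIVE]) + struct.pack(">hh", velocity_mm_s, radius_mm)

# e.g. enter full mode, then drive straight at 200 mm/s
startup = bytes([OI_START, OI_FULL]) + drive_packet(200, STRAIGHT)
```

Because velocity and radius are just two signed words, steering corrections from the navigation code reduce to recomputing one small packet each control cycle.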


Code Overview
Obstacle avoidance:
For obstacle avoidance the eyeRobot uses a method where objects near the robot exert a virtual force on the robot moving it away from the object. In other words, objects push the robot away from themselves. In my implementation, the virtual force exerted by an object is inversely proportional to distance squared, so the strength of the push increases as the object gets closer and creates a nonlinear response curve:
PushForce = ResponseMagnitudeConstant / Distance²
The pushes coming from each sensor are added together (sensors on the left side push right, and vice versa) to get a vector for the robot's travel. Wheel speeds are then changed so the robot turns toward this vector. To ensure that an object dead ahead does not produce no response at all (because the forces on both sides balance), objects dead ahead push the robot toward the more open side. Once the robot has passed the object, it uses the Roomba's encoders to correct for the change and get back onto the original vector.
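The inverse-square force summation described above can be sketched as follows (a Python illustration; the sensor bearings and response constant are assumptions, and the dead-ahead tie-break toward the more open side is noted but omitted for brevity):

```python
# Sketch of the "virtual force" steering: each sonar's push is
# k / d^2, directed away from the detected object, and the pushes are
# summed into one steering value. Constants are illustrative.

import math

RESPONSE_K = 10000.0   # the ResponseMagnitudeConstant (illustrative)

# four sonars in an arc; bearing in degrees, negative = left of center
SONAR_BEARINGS = [-60, -20, 20, 60]

def steering_from_sonar(ranges_cm):
    """Return a steering value: negative = steer left, positive = right.
    (The described dead-ahead tie-break is omitted in this sketch.)"""
    steer = 0.0
    for bearing, d in zip(SONAR_BEARINGS, ranges_cm):
        if d <= 0:
            continue                          # no valid echo from this sensor
        push = RESPONSE_K / (d * d)           # inverse-square response
        # an object on the left (bearing < 0) pushes the robot right (+)
        steer += push * -math.copysign(1.0, bearing)
    return steer
```

The nonlinearity is the key point from the write-up: a distant object barely nudges the heading, while a close one dominates the sum.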
Wall Following:
The principle of wall following is to maintain a desired distance from, and a parallel angle to, a wall. Issues arise when the robot is turned relative to the wall, because a single sensor yields ambiguous range readings: they are affected as much by the robot's angle to the wall as by the actual distance to the wall. To determine the angle and thus eliminate this variable, the robot must have two points of reference that can be compared. Because the eyeRobot has only one side-facing IR rangefinder on each side, it obtains these two points by comparing readings from the rangefinder over time as the robot moves, determining its angle from the difference between the two readings. It then uses this information to correct improper positioning. The robot enters wall-following mode whenever it has had a wall alongside it for a certain amount of time, and exits whenever an obstacle in its path pushes it off course or the user twists the handle to bring the robot away from the wall.
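The two-readings-over-time angle estimate reduces to simple trigonometry; a hedged Python sketch (function names, sign conventions, and gains are my own illustration, not the eyeRobot code):

```python
# Sketch: estimating the angle to a wall from two side-facing IR
# readings taken a known travel distance apart, then computing a
# proportional steering correction. All constants are illustrative.

import math

def wall_angle(d1_cm, d2_cm, travel_cm):
    """Angle to the wall in radians: positive means the side range grew
    as the robot moved forward, i.e. the robot is angled away from it."""
    return math.atan2(d2_cm - d1_cm, travel_cm)

def correction(d_now_cm, angle_rad, desired_cm, k_dist=0.02, k_ang=1.0):
    """Proportional correction, positive = steer away from the wall:
    steer away when too close, steer back when drifting outward."""
    return k_dist * (desired_cm - d_now_cm) - k_ang * angle_rad
```

With two spaced sensors (improvement 4 below in the write-up), `wall_angle` could use a single simultaneous pair of readings instead of readings separated by travel, removing the dependence on odometry between samples.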


Motivation and Improvement
This robot was designed to fill the obvious gap between the capable but expensive guide dog and the inexpensive but limited white cane. In the development of a marketable and more capable Robotic White Cane, the Roomba Create was the perfect vehicle for designing a quick prototype to see if the concept worked. In addition, the prizes would provide economic backing for the considerable expense of building a more capable robot.
I learned a substantial amount building this robot, and here I will attempt to lay out what I have learned as I move on to attempt a second-generation robot:
1) Obstacle Avoidance - I have learned a lot about real-time obstacle avoidance. In the process of building this robot I went through two completely different obstacle avoidance schemes, starting with the original object-force idea, then moving to the principle of finding and seeking the most open vector, and then moving back to the object-force idea with the key realization that the object response should be non-linear. In the future I will correct my mistake of not doing any online research into previously used methods before embarking on a project; I'm now learning that a quick Google search would have yielded numerous great papers on the subject.
2) Design of the stick sensors - Beginning this project, I thought my only option for a linear sensor was a slide pot and some sort of linear bearing. I now realize that a much simpler option would have been to attach the top of the rod to a joystick, such that pushing the stick forward would also push the joystick forward. In addition, a simple universal joint would allow the twist of the stick to be translated into the twist axis of many modern joysticks. That implementation would have been much simpler than the one I currently use.
3) Free-turning wheels - Although this would have been impossible with the Roomba, it now seems obvious that a robot with free-turning wheels would be ideal for this task. A robot that rolls passively would require no motors and a smaller battery, and thus be lighter. In addition, such a system requires no linear sensor to detect the user's push; the robot would simply roll at the user's speed. The robot could be turned by steering the wheels like a car, and if the user needed to be stopped, brakes could be added. For the next-generation eyeRobot I will certainly use this very different approach.
4) Two spaced sensors for wall following - As discussed earlier, problems arose when trying to follow a wall with only one side-facing sensor; it was therefore necessary to move the robot between readings to obtain different points of reference. Two sensors with a known distance between them would simplify wall following greatly.
5) More sensors - Although this would have cost more money, it was difficult trying to code this robot with so few windows on the world outside the processor. A more complete sonar array would have made the navigation code much more powerful (but of course sensors cost money, which I didn't have at the time).

The iRobot Create proved an ideal prototyping platform for experimenting with the concept of a robotic white cane. From the results of this prototype it is apparent that a robot of this type is indeed viable. I hope to develop a second-generation robot from the lessons I have learned using the Roomba Create. In future versions of eyeRobot I envision a device capable of doing more than just guiding a person down a hallway: a robot that can be put in the hands of the blind for use in everyday life. With this robot, the user would simply speak their destination and the robot would guide them there without conscious effort. This robot would be light and compact enough to be easily carried up stairs and tucked away in a closet. It would be able to do global navigation in addition to local, guiding the user from start to destination without the user's prior knowledge or experience. This capability would go well beyond even the guide dog, with GPS and more advanced sensors allowing the blind to freely navigate the world.
Nathaniel Barshay,
(Entered by Stephen Barshay)
(Special thanks to Jack Hitt for the Roomba Create)


02-11-2008, 12:25 PM
Cool. I guess the next step would be to make something that could navigate down a sidewalk, across streets, etc.

02-11-2008, 12:31 PM
Really nice project; it shows how a fairly simple concept can be applied to a real-life application that helps people. Applying it to outdoor use is going to be a challenge, though.

Very well done, however. I love how smooth the navigation is; it doesn't even stutter.

02-11-2008, 01:32 PM
Thanks for posting this awesome project. I too love how this is something with serious real world application. I wanted to ask what sparked this idea in the first place? Was this an aha! moment or do you know someone that you were trying to solve a problem for? It's always interesting to hear the genesis of ideas.

02-13-2008, 10:24 AM

Great project! Have you let any actual blind person try it, and what was their feedback/opinion of the eyeRobot? I have seen many blind people walking with a cane, and it's quite impressive how well they do with only a stick!

02-13-2008, 01:38 PM
Now here's a project that isn't just 'technological masturbation'. (Nuthin wrong with that, tho...)
I don't personally know any blind people, so I can't judge how real-world useful it might be, but it certainly is a productive avenue for research and development. With truly innovative ideas like this roaming about (pun intended), the possibility of practical robot use is becoming more real. Vacuuming a room is pretty good, but this is taking it to the next level.
This also makes me think that, in the near future, roboticists (especially hobbyists) will make some truly amazing things possible. You should think about patenting some of these great ideas before some company rips off your creation(s).
Damn, I do love the practical side of things!

02-13-2008, 02:00 PM
Thanks for all your praise...
This is definitely one of the better ideas I've had, but I still haven't gotten my head around the complexity of taking it to the next level. Hopefully I'll get back to this project after the Trinity fire-fighting contest in April, and maybe I'll even have some money to put into it. I'm sure everyone here knows the difficulty of building complicated robots, particularly as a broke high school student.

My first step when I find some time is to go to a passive drive system, hopefully with independently turning wheels to allow far more freedom than a differential drive. I'm still trying to decide how to proceed in terms of sensor arrays. The obvious avenues are vision, sonar arrays, or a laser rangefinding system. Vision is the most appealing, since it would provide all the information I could possibly need, but extracting it is complex. Laser rangefinders can provide a huge amount of information, but start at about $1,000.

Anyway, the idea came about as I browsed through robotics projects and noted that there are a multitude of projects for people who are deaf, paralyzed, etc., but effective systems for people who are blind are practically nonexistent; that role is largely filled by the guide dog, whose use is restricted because... it's a dog (however highly trained). The white cane is the other prevalent solution, and a system based on that cane, made by sticking a robot on the end, seemed a decent compromise.

In response to kdwyer's comments on patenting, I was under the (probably optimistic) impression that if a company tried that, I could win the patent through evidence like this thread. If that is an optimistic outlook, I would consider taking some steps toward protecting the idea.

02-20-2008, 02:49 PM
Very nice idea, and cost-efficient as well. Perhaps another step would be to make the robot able to move over various surfaces like grass and dirt with ease. You would have to make some serious modifications to the Roomba, however, by creating some sort of wheel system that coordinates with the cane.

02-20-2008, 02:53 PM
Hey, great project. I think the concept of your robot is great. Getting the robot to work outside might be hard, but all in all, great job.

Also, have you seen the iBOT? It's a wheelchair that is pretty impressive; if you combined that design and yours, you would have a great invention on your hands!

06-06-2008, 09:50 PM
Add some sort of GPS so it can easily be programmed to go wherever the blind user needs to go.

06-06-2008, 09:52 PM
Are you familiar with the error involved in GPS navigation? 60 feet with every step wouldn't be uncommon.

robot maker
08-31-2008, 11:40 AM
I am building something close to yours with the same sensors (no eye cam yet), but with a RoboRealm Create interface. I was going to use the Parallax Propeller chip, which someone was able to interface to the iRobot Create with code, but maybe I'll try your processor. Where did you get the board, and can you share the code?

Yes, it's very common for GPS to be off by nearly 60 feet on some units; some are a little better. I have three GPS units at home to try adding to my robots later on, but it seems every sensor has some problems or drawbacks. A fix for this would be using all the sensors together (GPS, compass, sonar, PIR, PID, and webcam) plus a lot of good software coding.
I read somewhere in Servo magazine about someone using GPS in a toy car robot.
GPS only works outdoors, but I've heard there is a design for indoors too.

Are you familiar with the error involved in GPS navigation? 60 feet with every step wouldn't be uncommon.

08-31-2008, 12:10 PM
GPS works fine indoors if you have enough receiver sensitivity. The satellites don't require you fill-out a survey of where you plan to be receiving their signals...

robot maker
08-31-2008, 12:28 PM
I am thinking you are very close to being right, since I have one that works indoors and one that doesn't.

GPS works fine indoors if you have enough receiver sensitivity. The satellites don't require you fill-out a survey of where you plan to be receiving their signals...

08-31-2008, 12:54 PM
"very close" as in "exactly" "dead on" "know what I'm talking about"

02-07-2011, 09:20 AM
Hi dude... Nice idea and great work.
If you don't have any objection, can you please forward me the circuit diagram and also the image processing standards that you have used? Please, dude.
Mail me @ [email protected]
Hoping for a positive response...