View Full Version : [Contest Entry] Gepetto

02-18-2008, 11:15 AM
Specification / Dimension
Style: Phidget Servo or RC-based robot
Motors: 4 × DC gearhead HG37F260-031
ESC: Vantec RDFR21
Controller: Phidget 4-servo controller
Software: MAX/MSP, Jitter, Cyclops
Wheels: Colson 4" × 2"
Height in inches: 29 (34 with arm**)
Length in inches: 22 (33 with arm**)
Width in inches: 29
Frame materials: Oak, poplar, steel, paint


Gepetto (named after the old man who makes Pinocchio, the puppet who wants to be a real boy) was conceived in response to two things. Most importantly, looking around I have noticed that a lot of things people make these days seem to be made to look modern, but not to last. For example, look at cars today. So much of what goes onto and into them is plastic crap that is going to break off the first time you touch it. (I know this isn’t actually true...plastics have come a long way and are much more durable than they were in the past.) I wanted to make something that looked and felt as though it wasn’t going anywhere. I also wanted a sort of timeless sense: here you have this thing that seems like it could have been made forty years ago, but inside of it you have some technologies that are pretty modern.

The second reason is that I make robots as an art form. In going around to different galleries near where I live, I was always being told that my work was too radical for people to accept as art. People would ask, “What would someone do with that thing?”
So I decided to see if I could make something sellable.

Mounting a gun on its back was an afterthought.


This really hurts.


The idea behind this robot was to create a stable mobile platform upon which to base an artificially intelligent robot. This has always been a problem for me. I know that we aren’t exactly there yet, but it seems to me that a lot of the robots that I have come across focus solely on mobility. That is, they have programs that allow them to climb stairs, go over big rocks and boxes, or avoid them altogether. And while some of these robots can do things like find their way from point A to point B and avoid obstacles along the way, I have not seen a lot in the way of a robot that would “decide” to go from point A to point B. Nor am I sure that I know what that might look like...but nonetheless, that is my overall goal.

The Platform:

In the past I worked primarily with two robotic platforms: the Lego Mindstorms system, and the ER-1 by Evolution Robotics (www.evolution.com). The ER-1 was discontinued several years ago, and I decided then that I would not subject my work to the limitations of any one company. I began looking for other ways to build a successful platform, and settled on using the Phidget servo controller in conjunction with MAX/MSP.

Continuing the ER-1’s concept of a robot that carries its mother (the computer) on its back instead of trailing an umbilicus, I have built three Phidget-servo-controlled robots which carry computers around with them.
In addition to being able to carry around a computer, I wanted this robot to be fairly stable. The frame design allows individual wheels to move independently, with a wheel travel of almost 3 inches. Given that the wheel diameter is only five inches, that is quite a bit of movement.
Gepetto’s motors, while much stronger than I thought they would be, are still undersized given the weight they have to carry. Gepetto won’t get stuck on carpet edges or the like, but I don’t see him climbing any mountains...or even doing well in grass.

The video screen on Gepetto’s head does actually work, and using MAX/MSP I can display different pictures or video on the screen. I plan on using it for video conferencing while doing telepresence work in the future. I also wanted to make the robot capable of displaying its “mood”.

I plan on making a much more capable model this summer using many of the same concepts.

The Software:

To be honest, I haven’t done nearly as much work on the software end of things as I should have at this point. I guess that is in part because the programming is not as much fun for me as the building.

In any case, I have worked up some Max patches for doing things like changing the probability of an event based on virtual rewards or punishments. Really it is just a random number generator spitting numbers through a gate. Say the numbers run from 0-9, and the gate begins open up to 8 (only 9 gets rejected). A number making it through the gate triggers the behavior. So to start, the gate is almost fully open and most numbers get through. If I then “punish” the robot, the gate is now halfway closed and will cut off any number over 5. Further punishments will work the gate down to 2. There will still be some chance that the robot will perform the behavior, but it becomes less and less likely with each punishment. Rewards work the opposite way: they keep the gate mostly open. These gates are the way in which behaviors are triggered, so a robot can simply “choose” not to do something you ask it to do.
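For anyone who doesn’t use Max, here is a rough Python sketch of the gate idea (the class and method names are mine; the real thing is a Max patch, not Python):

```python
import random

class Gate:
    """Behavior gate: a behavior fires only when a random draw falls
    below the gate threshold. Punishments close the gate toward its
    floor of 2; rewards reopen it toward 9, so some chance of the
    behavior always remains."""

    def __init__(self, threshold=9, floor=2, ceiling=9):
        self.threshold = threshold  # draws 0..threshold-1 pass the gate
        self.floor = floor
        self.ceiling = ceiling

    def punish(self):
        self.threshold = max(self.floor, self.threshold - 1)

    def reward(self):
        self.threshold = min(self.ceiling, self.threshold + 1)

    def triggers(self):
        return random.randint(0, 9) < self.threshold


gate = Gate()          # starts open up to 8: only a draw of 9 is rejected
for _ in range(3):
    gate.punish()      # three punishments: now only draws 0-5 pass
```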

Similarly the next iteration of this robot will have “see far” devices mounted on all 4 sides. In this way it will be able to respond to new things in its environment, or choose not to respond. Again, I will be able to encourage it either way with rewards and punishments.

“Mood”, displayed on the screen on its head, will be based on a combination of all of its gates: a sort of average, coupled with how recently it has been punished. The more recent the punishment, the more negative the mood. Mood will have the effect of making behaviors more or less likely to be performed, and also more or less erratic.
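One way that mood average could be sketched in Python (everything here — the names, the 30-second recency window, the 0.5 weighting — is my guess at one possible realization, not the actual patch):

```python
import time

def mood(gate_thresholds, last_punished, now=None,
         max_threshold=9, recency_window=30.0):
    """Return a mood score in 0..1: the average openness of all the
    behavior gates, discounted by how recently the robot was punished.
    The recency window and weighting constant are assumed values."""
    if now is None:
        now = time.time()
    openness = sum(gate_thresholds) / (len(gate_thresholds) * max_threshold)
    since = now - last_punished
    penalty = max(0.0, 1.0 - since / recency_window)  # fades over the window
    return max(0.0, openness - 0.5 * penalty)
```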

I have also worked a bit with the Cyclops patch for MAX/MSP. This is a smaller program that triggers events based on visual data. Cyclops is not quite as advanced as I would like it to be, but either with work, or with other people getting involved, I’m hoping to be able to create behaviors based on visual information. As it currently stands, I can trigger events based on either a specific color, or movement in different zones on the screen.


Given that this is really just a platform, I have run into very few major problems.

One problem was making room around the motors so the shocks would fit. While this was a problem I should have foreseen during the CAD drawings, I did the drawings before actually purchasing the motors, and I didn’t realize that they would stick out as far as they did.

With the first Gepetto I also made the mistake of reconsidering the gap between the two frame rails. This meant the battery didn’t fit the way it was supposed to, and I had to add the dropped shelf.

Finally, the original wheels that I chose were not designed for a robot this heavy. I switched them out for the Colson wheels. The problem with using Colson wheels is that I had to find a creative way to mount them on the 6mm output shafts of the motors. I used 1-3/8" oak dowels, drilled out the ends using a 7/8" bit, and then used epoxy to glue aluminum hubs into the wooden shafts. I drilled two holes on opposite sides of the wheels and dropped in wood screws that go into the shaft.

The software has also been a problem. The phidgetservo object for MAX/MSP does not work in MAX version 4.5. I can upgrade to 4.6, but for that I would have to buy an upgrade for Jitter (the video software). Given that I do all of this on a schoolteacher’s salary, Gepetto will have to wait.

The Future:

Included with this post should be drawings of the next version of this robot. It will be similar in concept, and capable of running the same software, but the computer used will be my Sony VAIO VGN-UX180P, a micro PC. This unfortunately means no Cyclops (the video software that Gepetto is capable of running). Instead I am going to use the I-CubeX, a box that translates sensor data into MIDI events. Through these sensors Gepetto II will be able to create a model of her environment and respond to it.

Front View

Side View

Top View










02-18-2008, 11:25 AM
The suspension and movement are very impressive on that gravel drive.

Keep it up!

02-25-2008, 03:51 PM
Thanks for the post. This is an impressive project. I love the simplicity of the trigger servo. I have to give you kudos too for sacrificing your body for the entertainment of the kids. I can't believe you were taking shots that close. I've gone paintballing and that is close range!

02-28-2008, 01:15 AM
This patch makes use of Aka.wiiremote by Masayuki Akamatsu, and the phidgetservo object.

For the sake of this discussion I will use words which anthropomorphize Gepetto. I may at times refer to Gepetto “wanting to” or “choosing to” do something. “Wanting to”, for example, refers to an increase in the potential that an event will occur. Similarly, “choosing to” refers to a random number generator producing a number which triggers the event. While I understand using these words may cause some confusion, I personally believe they best express the manner in which the simulation works.


While this patch is far from representing true artificial intelligence, the lessons learned from its creation can be applied in a myriad of situations. This patch simulates an animal learning to respond to commands. The robot has the option of responding to stimuli in a myriad of ways. Through the use of punishments and rewards the robot can be trained to respond properly to a command that it is given. The same robot can be retrained later to respond differently to the very same set of commands.

How it Works

The patch can be broken up into three parts. First, data is captured from the Wii controller. This data is incredibly dirty, and needs to be cleaned up so that commands can be given in a predictable manner.
The commands can be summarized as follows. Holding the remote level on both the x and y axes results in a command of 0 (or no command). Pointing the remote downward results in a command of 1. Tilting the remote up results in a command of 2. Twisting the remote clockwise while it is pointing away from you results in a command of 3, and twisting the remote counter-clockwise results in a command of 4. Returning the remote to the level position returns it to a 0 command. The 0 command position is not connected to any output, and simply registers the time in between commands.
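As a sketch, that tilt-to-command mapping might look like this in Python (the `pitch`/`roll` names, the -1..1 normalization, and the dead-zone value are my assumptions about the cleaned-up remote data, not the actual aka.wiiremote output):

```python
def classify_command(pitch, roll, dead_zone=0.25):
    """Map remote orientation to the command numbers described above.
    pitch/roll are assumed normalized to roughly -1..1.
    Level -> 0, down -> 1, up -> 2, clockwise -> 3, counter-clockwise -> 4."""
    if pitch < -dead_zone:
        return 1   # pointing downward
    if pitch > dead_zone:
        return 2   # tilted up
    if roll > dead_zone:
        return 3   # twisted clockwise
    if roll < -dead_zone:
        return 4   # twisted counter-clockwise
    return 0       # level: no command
```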
The second part is the most difficult to express. In it, the robot is given the commands and a series of processes are run; these processes result in the robot choosing a response.

For the purposes of this project the robot is given 4 response banks. Each bank is associated with 5 possible responses: move forward for 1 second, move backward for 1 second, turn left for 1 second, turn right for 1 second, or do nothing.

When a command is received, the software routes the command to one of the 4 response banks. There the command triggers a random number generator to pass a number on to a bank of 4 ranges: 0-24, 26-49, 51-74, 76-98. If the number falls into one of these ranges then the robot will respond to the command by moving forward, moving backward, turning right, or turning left respectively. If the number is not in any of these ranges, it is simply dumped.

The third part of the software formulates commands and outputs them in a meaningful way to the phidgetservo object.

The robot will then perform the chosen response.
Within 3 seconds of the robot performing its response, the operator can do one of three things. Doing nothing results in no change in behavior. Pressing the trigger on the Wii remote (punishment) decreases the range of numbers available for triggering the chosen response. Pressing the A button (reward) does two things: it increases the range of numbers available to trigger the chosen response, and simultaneously decreases the range of available numbers for triggering every other response. The software will not reduce the range for any possible response below 4%. Similarly, the robot is capped in the percent chance of producing the desired response. This means that even as the robot becomes more and more trained, it never becomes perfectly responsive.
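A rough Python sketch of one response bank, with reward and punishment adjusting the trigger ranges (the starting widths and the floor/cap values here are my guesses based on the description above, not numbers pulled from the actual patch):

```python
import random

RESPONSES = ["forward", "backward", "right", "left"]
MIN_PCT, MAX_PCT = 4, 24   # floor and cap described above (values assumed)

class ResponseBank:
    """One bank per command. Each response owns a slice of the 0-99
    draw; a draw that lands in none of the slices means 'do nothing'."""

    def __init__(self):
        self.width = {r: 24 for r in RESPONSES}  # roughly the 0-24 etc. ranges

    def choose(self):
        n = random.randint(0, 99)
        start = 0
        for r in RESPONSES:
            if start <= n < start + self.width[r]:
                return r
            start += self.width[r] + 1   # +1 leaves gaps like 25, 50, 75
        return None                      # draw fell in a gap: do nothing

    def punish(self, response):
        self.width[response] = max(MIN_PCT, self.width[response] - 1)

    def reward(self, response):
        self.width[response] = min(MAX_PCT, self.width[response] + 1)
        for r in RESPONSES:              # reward one, shrink the others
            if r != response:
                self.width[r] = max(MIN_PCT, self.width[r] - 1)
```

Because every width is clamped between the floor and the cap, a fully trained bank still sometimes does nothing, which matches the "never perfectly responsive" behavior described above.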

All 5 responses remain possible for several reasons. I could have written the patch so that as one potential increased the others decreased with no limits. I chose not to do this because once the robot learned a set of responses it would become immutable. The inherent flexibility built into this system means that you can retrain the robot to respond differently to a given input.

I also took my dog into consideration while assembling this patch. My dog doesn’t respond to given commands very often. While I know there are people in the world who have dogs that respond instantly to commands, if mine sat without repeated commands I would be incredibly surprised. Most of the time he simply looks at me and smiles.

Future Implementation

The underlying principles of this program can be applied in a myriad of places. In another discussion I will show how Gepetto’s camera and infrared sensors work, and how the data from the camera is analyzed. The learning aspect of this software will be applied to the camera and infrared output analysis. In this way Gepetto can respond to visual stimuli with a greater degree of realism. Gepetto can also be set with “moods”. These moods will change the starting potentials of responses.

Software Problems Encountered


This patch is relatively stable once it is running, but it was incredibly hard to get to that state. The computer and remote would either not connect, partially connect, or properly connect. When the remote and computer failed to connect, I had to restart the software before they would connect. At times I had to restart the computer. I also often had to pull the batteries out of the remote before it would work. I’m not sure if this problem stems from Bluetooth interference or not. Changing the environment and electronics in the area seemed to have little effect on successful pairing.

Partial connections didn’t occur as often as no connection, but were a lot more annoying. The data stream from the Wii would either be random or drop to the lowest possible value. This would trigger the robot to begin dancing about as each jump in values triggered a new and often different response. This made it impossible to even begin training the robot because of its erratic movements.

Stack Errors:

Because this program requires feedback loops, I kept getting stack errors. This is one of the reasons that disciplining the robot does not increase the potentials of the other options; instead, increasing one option simultaneously decreases all the others. The alternative created really bad feedback loops which crashed the program.


It was really difficult to ensure that incoming discipline went to the right place. MAX/MSP evaluates a patch from right to left and top to bottom. This means that an event at the bottom of a page happens after an event at the top. Events passed from the bottom of the page to the top of the page can therefore never occur, because the event at the bottom of the page does not send its message until after it is required. I had to add pauses in several different places.

This also caused the program to crash.


The nature of this concept is to add unpredictability to a robot’s software. This causes the robot to behave erratically, and I did some damage to both the robot and my house in the process.

I originally programmed the patch to reverse incorrect behaviors, and this led to a whole host of problems. Gepetto would make a mistake, I would correct him, and he would do the opposite of the command that I had intended for him to perform, not the opposite of the one that he did perform. Similarly, even after that problem was fixed, erroneous inputs (holding the remote improperly and sending commands by accident) meant Gepetto would sometimes receive a new command before being issued the correction. This also led to Gepetto reversing the wrong command.

Gepetto II will include infrared sensors. I’m hoping to use these to make Gepetto choose not to follow interpreted commands that would lead to contact with a wall.

Here is the video that shows the training of Gepetto using the Wii remote.


I look forward to any help.

02-28-2008, 03:32 PM
That's really cool. I'm a big fan of paintball. You might want to look into getting a remote cord for the CO2 tank. They run pretty cheap on eBay (about $20), and would give more stability to your arm and gun. But overall it looks really good.

02-29-2008, 02:43 AM
Gepetto Vision System.


The main purpose of this vision system is to add to the overall functionality of Gepetto, and to test out various systems for Gepetto II (which I have gotten permission from the wife to begin working on tomorrow, provided I scavenge all of the electronics, and a lot of the other parts, from the current Gepetto). I am also working to create a viable vision system that does not rely on specific colors or close matches. Gepetto identifies potential obstacles by comparing different parts of what he sees.

I also wanted to make it so that Gepetto can be more responsive to his environment, as part of the overall artificial intelligence concept.

That is not to say that there is any “learning” going on during this demonstration.

How It Works:
This patch uses the Cyclops object to interpret video signals coming from one of the two cameras. For the purposes of this robot I am using an external iSight camera which is mounted in a downward position on Gepetto’s head. I have made several assumptions to help make this plan work.

1) Gepetto starts far enough away from a wall that his entire field of vision is taken up by a drivable surface.
2) The bottom center of Gepetto’s field of vision is not an obstacle.
3) The floor is a relatively uniform color.
4) The walls and other obstacles are not the same color as the floor. (an invalid assumption in my house.)

The video feed is divided up into 144 squares. Each square outputs an average color, in RGB format, for all of its pixels. This information is then passed on to the processing section of the software.
Video is processed in 2 ways. First, predetermined squares look for changes in brightness, and output values when changes are detected. This is good for alarm systems and the like, where the camera can remain in a stable place. I also found this to work well for defining obstacles, provided the difference threshold was set high enough to keep the robot’s own movement from triggering the sensors.

Second, chosen squares are compared to each other in order to find differences between them. A threshold is set for the difference in any of the three color channels. So for example, if Gepetto is looking at a blue floor and comes across a slightly different shade of blue floor, Gepetto would recognize that different shade as a separate object, provided the difference exceeded the threshold. Gepetto would also recognize a red section of floor; there the difference would just be in the red channel. Exceeding the threshold indicates that an obstacle is present.
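Here is roughly what that grid comparison amounts to, sketched in Python (the 12×12 layout matches the 144 squares above; the 0-255 color scale and the default threshold value are my assumptions, not numbers from the Cyclops patch):

```python
def obstacle_cells(grid, reference, threshold=40):
    """Scan a 12x12 grid of per-cell average (r, g, b) colors and flag
    any cell whose color differs from the reference color (e.g. the
    known-drivable bottom-center cell) by more than the threshold in
    any one channel. Returns a list of (row, column) obstacle cells."""
    obstacles = []
    for row_i, row in enumerate(grid):
        for col_i, (r, g, b) in enumerate(row):
            diffs = (abs(r - reference[0]),
                     abs(g - reference[1]),
                     abs(b - reference[2]))
            if max(diffs) > threshold:
                obstacles.append((row_i, col_i))
    return obstacles
```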

The software then passes that information on to the movement section of the software, which turns the motors on and off in response to stimulation. I believe that this, coupled with an IR system, would make the robot incredibly capable in a myriad of situations.


First of all, my wife wants to kill me, because Gepetto is relatively heavy and made mostly of steel. He did a number on our dishwasher while I was trying to get this software to work properly. (Put pillows in front of the cabinets.)

I had planned on adding a failsafe using the Wii remote, but left my remote at school. So I had to run around the room chasing him. (Add the failsafe.)

The current version of Gepetto is simply too wide for a video camera with this narrow a field of view to be very useful. Obstacles will often fall outside of his field of view. (The new version, Gepetto II, will use a much narrower frame and a much higher camera mount.)

The Cyclops software uses a lot of memory. This resulted in changes made to the software not saving, and slow response times from the software allowing various commands to sneak through to the phidgetservo object.

If the robot is too close to the wall, it cannot see the wall as an obstacle, because its entire field of view is occupied by the wall. This allows the robot to repeatedly slam itself into the wall...or refrigerator...or dishwasher. (Gepetto II will use video and infrared in conjunction.)

The Future:

Gepetto II will definitely combine all of the above ideas into one robot. I’m planning on setting up modes: in certain modes he will be able to patrol an area, and in other modes he will be remote controlled and capable of basic telepresence. I have access to a house under construction. I’m planning on taking Gepetto II over there once he’s completed to further test all of these different systems in conjunction with each other, and alongside IR.

I have a set of USB IR sensors, and also a claw with IR sensors in its gripper. I plan on mounting this and the gun interchangeably.

I would also like to implement the concept of moods. Moods will work similarly to modes, different moods will set various parameters such as speed of learning, aggressive behavior, attention span (when training with the remote, if you give the same command too many times in a row, or not enough commands in a given amount of time, the robot will switch to “wander mode”), and responsiveness.



02-29-2008, 11:47 AM
This is all very impressive work. Sorry to hear about the dishwasher though! The first autonomous bot I built was a converted RC car chassis with an ITX mounted on it. Here it is (http://forums.trossenrobotics.com/showthread.php?t=1070) when it was a tethered bot. So I finally get this thing built after a month and turn it on for the first real trial and it instantly takes off across the room at full speed and I go sprinting after it, but there's no chance since it's too fast. I got to watch my $1,500 bot slam into the wall at full speed. LOL.

>>>>I would also like to implement the concept of moods. Moods will work similarly to modes, different moods will set various parameters such as speed of learning, aggressive behavior, attention span (when training with the remote, if you give the same command too many times in a row, or not enough commands in a given amount of time, the robot will switch to “wander mode”), and responsiveness.

Speaking of behavior code, Kdwyer and I were chatting about that recently. Here's a quick repost of my comments:

"An experiment I would LOVE to see done is a behavior program based on Freud's Id, Ego, and Super Ego or something similar. Where the main idea is that different "needs" and "desires" compete for the attention of the "conscious" decision making side of the brain. There are two main objectives of the robot 1) to stay alive 2) to be as happy as possible. The bot dies when it runs out of juice or when it's "depressed for too long."

The Id influences by being happy when there is plenty of "food" (battery charge) or knocking down "happy points" when he's hungry too long. The Super Ego could be mimicked by creating something psychological the robot prefers and wants to keep in order. Making the robot obsessive compulsive about his/her environment for instance is an easy way to implement some competing behavior to instinctive urges like needing to eat and sleep. An experimenter could make the robot prefer red walls and the walls would occasionally change color (backlit by lights) and to change it back the robot has to bump the wall 20 times or something. (burning energy)

This whole concept can be expanded out by adding social behavior with many bots and the need for social interaction (trading friendly IR pulses). There could be 5 bots and only two food supplies (outlets), creating competition and the need for bots to learn when to be aggressive etc... There would be many fun ways to add complexity to the competing decisions a robot has to make.

I think this would be a fascinating way to learn about building decision making algorithms and learning about actual animal behavior as well. I have a half written white paper on it somewhere. I should look for it..."

I wonder if the MAX/MSP software would be a good platform for something like this? Not my idea exactly per se, but creating those layered structures. I don't know much about that platform.

02-29-2008, 11:32 PM
MAX is basically C++ with pictures. I've been using it for about 15 years now...actually, it will be 15 years next September. (MAX/MSP was originally an art program. There is a free version called PD (Pure Data) which isn't quite as developed, but is free.) I'm dyslexic, so the idea of looking at line after line of code is about as inviting as oh...I don't know...something really bad. One thing that I love about MAX is that the learning curve is very shallow, but the program is incredibly capable.

I'll start working on your suggestion for Gepetto II. I think I'll start out with something simple though, such as competing tasks. A guy I took a seminar with at Stanford assembled a patch that did something similar. It had different entities that voted on randomly created music. Some of the entities liked random sounds, while others liked some sort of structure. Another entity created the music using random number generators and Markovian analysis of Schoenberg, and some of Stravinsky's later work.

I'm thinking along the lines of a patch that uses the camera to find and collect colored paper scraps off of the floor, competing with a patch that seeks the darkest part of the room. I will be able to implement IR though, so the bumping into walls and appliances should be a lot less likely.

I'm thinking the amount of discipline, as well as the time intervals between paper finds, will increase the intensity of the dark-seeking patch, while the paper-seeking patch will respond to periods of inactivity and the reward of finding paper (a scrap of paper found results in the robot continuing to seek paper).

Oh...BTW, I know I said I wouldn't start working on Gepetto II until this summer, but here is a picture of the computer case. I made it while the wife was getting her hair done.:p

I'll keep everyone posted as to my progress.



03-05-2008, 12:39 AM
Hey guys, thanks for the recognition...Awesome.

03-05-2008, 10:54 AM
Thanks for the awesome project and extensive posting. I think a lot of people are going to find this thread and learn a lot from it.

>>>Oh...BTW, I know I said I wouldn't start working on Gepetto II until this summer, but here is a picture of the computer case. I made it while the wife was getting her hair done.:p

This reminds me of a great quote from the show "The West Wing"
"Don't marry a genius, they never want to sleep."

03-06-2008, 01:45 PM
Grats on getting a story on Gizmodo http://gizmodo.com/364541/gepetto-robot-good-at-paintball-bad-at-wii

03-13-2008, 08:34 AM
Ok...so here are some pics of Gepetto II's frame before paint. I haven't decided on a color scheme, but I know I need three colors. I was thinking of doing a camo sort of thing, but then figured that would be cheesy. I also considered black and yellow...but then what should the third color be? The wood is going to be a gold hue. Gepetto I was a bit too dark for my liking; it didn't show off the grain of the wood.
You can see that Gepetto II will actually have greater wheel travel than Gepetto I. These motors are also a lot stronger.
I'll post more about this later.


03-13-2008, 01:16 PM
Ok...what's this I hear about this robot being on G4 TV? I want to be on G4 TV! That's freaking awesome!

03-13-2008, 01:26 PM
>>>Ok...what's this I hear about this robot being on G4 TV?

What?!? That's awesome, man!

EDIT: I just found some links:



congrats DB!

03-13-2008, 03:43 PM
Congrats on a really awesome bot dark, I'm glad to see it got the recognition it deserved!

03-18-2008, 12:39 AM
Ok...minor setback.

I somehow managed to forget how thick paint is, and now, post-paint, the computer no longer fits in the fiberglass tray. I spent the afternoon cutting out the original tray, grinding off the paint, and welding in a new tray I made at work today. Yeah, I know...I'm supposed to be teaching the kids something.

In any case, Gepetto II should be done and ready for testing by Friday.

03-19-2008, 11:44 PM
Gepetto II

Today I had a really pleasant experience. Two, really...one was putting the final screw in Gepetto II’s base, and the other was my wife looking at it and saying, “You’re going to have to give it a girl’s name,” and “You’re getting pretty good at that.”

Really, this robot has been several years in the making. My dad got a Sony VAIO a few years ago. I took one look at it and thought, that would be a cool computer to base a robot around. A year and a half ago I got myself a Sony VAIO, even smaller than my dad’s mind you, and finally today put the last screw into a robot built around that computer.

Gepetto II (Vivian)
Specification / Dimension
Style: Phidget Servo or RC-based robot
Motors: 4 × DC gearhead Denso 730556-7030
ESC: Vantec RDFR21
Controller: Phidget 4-servo controller
Software: MAX/MSP, Jitter, Cyclops
Wheels: Colson 5" × 2"
Height in inches: 21
Length in inches: 18
Width in inches: 22.75
Frame materials: Oak, poplar, steel, aluminum, paint


With Vivian my intention was to create a really capable and mobile platform on which my students and I could test various sensors. Ok, she can’t go down stairs, but if you check out the video, you’ll see her cross a pretty big gap and run just fine in relatively tall grass.
For a long time I’ve been building robots with artistic purposes in mind, showing them in galleries or in a series of short films that I made about them. This is one of the first robots I’ve made in about 3 or so years now that hasn’t ultimately been for that purpose.
To be honest, it feels really good. I have a lot of my other robots sitting in my shed, or on display in my office, and while yes, Vivian is going to spend a lot of time at work, she is mine, and I’m really looking forward to growing her and fleshing out her software. (Hence the girl’s name.)

I should have some pics and video of a live fire test this weekend, and I’ll have more info about how big she is when she’s got her gun mounted.


I have an ER-1 arm that fits on her, but I think I’m going to switch that out for a new servo arm.

I got a Wii remote to control a servo today using the Mouse State object in MAX/MSP and a small program that controls the mouse through a Wii remote. I think I can get Vivian to respond much the same way Gepetto did to the Wii remote. Similarly, Vivian can have her computer and gun mounted at the same time, so I can control the gun through the Wii remote as well.

I’m going to work really hard to get the camera to work. There is a problem with another program, which I can’t find, taking over the hardware, so that Vivian can’t see.

I also want to add my I-CubeX. It’s a MIDI controller that uses real-world input from sensors to output MIDI. Through it I can connect 32 sensors.

Ok...enough for now.


03-20-2008, 01:45 AM

Great work with her suspension. At first I thought that your new wheels seemed to be a bit slick for off-road driving, but your movie proved me wrong ;)

03-20-2008, 07:56 AM
Ok...so I sort of cheated with that last part. Vivian went down the hill just fine. When I turned her around, she got stuck in some dog mess and the wheels just sat there spinning. Note to self, clean up after the dog before running the robot into woods where you can't see.

Then again, I just hosed off the wheels and underside of the frame. All is well.


03-20-2008, 09:32 AM
Haha cheating or not, Vivian is a great improvement over Gepetto!

What kind of sensors were you going to provide her?

03-20-2008, 11:26 AM
Ok...for now (budget totally blown, wasn't supposed to start this one until june.) I'm just going to add three IR sensors that I already have, and try to get the camera running. This summer I plan on getting a motorcycle so I have to behave, but I want to get a 3D orient sensor (uses a 3D compass), a couple of tilt sensors, 3 far reach Ultrasonic sensors, and perhaps in the very far future some Biofeedback sensors so that Vivian can tell what I'm doing. Ok...so thats a lot...

Oh...did I mention the light sensors?

03-20-2008, 11:39 AM
I'm also impressed by how much that flexible suspension matters. Very rough ground, and she was having no problems. How come you're not using some thick-tread air-filled tires? Is it a half-and-half bot for indoor/outdoor?

It is nice to win other people over with robots, isn't it? I brought a Pleo to the Super Bowl party and everyone loved him. They have NO idea what I do during the day or what TR is about, so it's nice to show them a little.

Congrats on the new family member! :)

03-20-2008, 11:41 AM

03-20-2008, 04:35 PM
I'm also impressed by how much that flexible suspension matters. Very rough ground, and she was having no problems. How come you're not using some thick-tread air-filled tires? Is it a half-and-half bot for indoor/outdoor?

The easy answer is I used these wheels because I know how to mount them on these motors without too much of a fuss... The hard answer is that I tried using air-filled wheels with Gepetto.


He basically just turned the hubs out of the wheels, and then got stuck on them.


03-24-2008, 04:39 PM
It's good to know those units have that issue. I'm guessing the weight of the bot has something to do with it. There are probably more robust air-filled tires out there, but your bot is already set, of course, in the wheel department.

03-24-2008, 10:32 PM
I have used wheelbarrow wheels in a pinch, but then you have to deal with the huge hassle of connecting them to a motor. I really prefer direct drive, given that I make everything by hand.

The other issue is that when you are using direct drive, the amount of torque available directly affects the size of wheels that you can use. I think the first Gepetto got away with using smaller motors because the wheels were smaller.


03-25-2008, 12:41 AM
Ok...so I finally got Vivian working reliably with the Wii remote. I'll try to post the software and links tomorrow. Basically I used WiinRemote, downloaded from http://onakasuita.org/wii/.

This program allows the wii to control the mouse. It also maps the wii buttons to various keys. This allows me to map the B button or trigger to the trigger of the gun.

The MAX/MSP objects that I used are MouseState and PhidgetServo. MouseState takes the input from the mouse and outputs its location as pixel values in X and Y coordinates. I had to remap the values so that they would match the values required by my Vantec ESC.
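
The remapping itself is just a linear rescale. A rough sketch in Python (the actual patch does this with MAX objects; the 1024-pixel screen width and 0-255 output range here are example numbers, not the values from my setup):

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Linearly rescale a value from one range to another."""
    span_in = in_max - in_min
    span_out = out_max - out_min
    return out_min + (value - in_min) * span_out / span_in

# A mouse at pixel x=512 on a 1024-wide screen maps to mid-range:
print(remap(512, 0, 1024, 0, 255))  # 127.5
```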

I have found that a joystick works a lot better than a mouse for controlling a robot, because a joystick returns to a 0 value automatically. With a mouse you don't have this advantage, which means that you often end up oversteering.

There is a MAX/MSP object that I may try to incorporate that can take control of the mouse. I may use this to automatically move the mouse back to a 0 position after some period of time.
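
The auto-recenter idea, in ordinary code, would look something like this (Python sketch, since the real thing would be a MAX/MSP object; the timeout and center coordinates are made up):

```python
import time

RECENTER_AFTER_S = 2.0   # seconds of inactivity before snapping back (assumed)
CENTER = (512, 384)      # assumed screen center

class AutoRecenter:
    """Snap the mouse back to a neutral center after a period of no movement."""
    def __init__(self):
        self.pos = CENTER
        self.last_move = time.monotonic()

    def on_mouse_move(self, x, y):
        self.pos = (x, y)
        self.last_move = time.monotonic()

    def tick(self):
        """Poll periodically; returns where the mouse should now be."""
        if time.monotonic() - self.last_move >= RECENTER_AFTER_S:
            self.pos = CENTER
        return self.pos
```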

A joystick also has some form of physical feedback: you can feel where the stick is, which you can't do with a mouse.

I added a few things to make the mouse easier to control. First I added thresholds, which greatly widened the dead zone around the mouse's center position. I also added a pitch tied to the horizontal location of the mouse. The tone can only be heard once the mouse is away from the centered position, and it increases in frequency as the mouse moves across the screen. This makes controlling the robot a lot easier, because you can hear where the mouse is and correct mistakes before you make them.
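
In plain code, the threshold and pitch ideas look something like this (Python sketch; the dead-zone width, center pixel, and frequencies are example numbers, not the ones in my patch):

```python
DEAD_ZONE = 100   # pixels either side of center that count as "centered"
CENTER_X = 512    # assumed screen center

def steering(mouse_x):
    """Return 0 inside the dead zone, otherwise the offset from center."""
    offset = mouse_x - CENTER_X
    if abs(offset) <= DEAD_ZONE:
        return 0
    return offset

def feedback_pitch(mouse_x, base_hz=220.0, hz_per_pixel=1.0):
    """Map horizontal offset to a tone frequency; silent while centered."""
    offset = steering(mouse_x)
    if offset == 0:
        return None  # no tone inside the dead zone
    return base_hz + abs(offset) * hz_per_pixel
```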

Here is a link to the video.


Again, I'll upload the MAX/MSP patch tomorrow.


03-26-2008, 04:32 PM
That's wild db, great work!

I downloaded that application (I think it was the right one (http://onakasuita.org/wii/WiinRemote_v2007.1.13.zip)), but I can't seem to figure out how to map the wii buttons to keys on my keyboard. I'm just checking out the app and don't have a bluetooth module to connect to my wiimote, though. Do I have to have one connected just to check out the mapping feature?

03-26-2008, 10:05 PM
I'm sorry,

I haven't uploaded the new software yet. My parents came to town, and everything has been in a mush since then. As for mapping the wii buttons to your keyboard: in the wii/mouse software that I downloaded, just check under the Options menu. I'll try to post a video of this later...as in next week...


03-26-2008, 11:55 PM
Ok...so if you download WiinRemote:

Once you run the application, select Options/preferences.

In the upper right hand of the preferences box you can assign the various button functions.

Select the button you want to map from the drop-down menu. Then under Assign (the menu just below that), select Keys.

Below that there is a "press keys here" window, and below that is a dialogue box. Type the key you want to map the button to, and press the Set button just to the right of the dialogue box.

That should do it. I set the MAX patch to look for a lowercase h. I figured I was probably least likely to hit that key by accident.

Hope this helps.


03-27-2008, 02:23 PM
Ah, I get it now:)

My monitor at home must not have been all that great, because I didn't see the "Options" menu till just now on my work monitor.

Either way though, it's not enabled, so I can't really check it out till I get a bluetooth module to interface with my wiimote:(

Back to your regularly scheduled program, I'll stop hijacking this thread now;)

03-27-2008, 10:59 PM
To be honest...it's cool to see that someone is using my stuff. I'll do this...I'll set the mouse button to control phidget servo channel 4, and the letter h for channel 3...just for you.


03-28-2008, 10:36 AM
not as cool as using a Wii, but thanks;) I'd like to check it out!

04-20-2008, 05:49 PM
The easy answer is I used these wheels because I know how to mount them on these motors without too much of a fuss...

I am working on a project with the same motor; however, I'm not sure how to mount the tire to it. What did you use, and how did you get yours to work?


04-20-2008, 10:20 PM
Ok...so first off...these are great motors...but they aren't really made for this. I'm using Colson 2x5 wheels. They have a 1-3/16" bore. I went down to the local Home Depot and got wooden dowels that are about that diameter and cut them off at about 4-inch lengths. I then drilled an 11/32" hole in about the center of each dowel. Next, take an angle grinder and grind the shaft on the motor to roughen it up a bit, and coat the tip of it with epoxy. Use a hammer to bang the wooden dowel onto the motor shaft. Drill small holes down through the wheels, and put wood screws into them, tying the wheels to the dowel. The epoxy should take care of the rest.

I also tried taking the motors apart, heat treating the shaft, drilling a hole in it, and then putting the motors back together. The gears are nylon so don't try to heat treat without taking them apart. Also there is this black greasy glue that they use that just gets all over the place.

A better solution would be to have someone turn a shaft coupler out of aluminum on a lathe. But that would be very expensive, and the wooden ones haven't failed me yet.

Hope this helps.


04-25-2008, 04:28 PM
Any new developments with Vivian you would like to show off? :D

04-25-2008, 07:31 PM
Actually, sort of. I was just working on the arm I'm adding to her. I can't really seem to get things working correctly, though. For one, I don't have powerful enough servo motors. For two, when the arm moves up and down it becomes longer or shorter. I'm trying to find a way to keep everything level, but given the first problem, the servo behaviors are not all that predictable, and the batteries go dead very quickly because the servos are just about stalled all the time. That makes it harder to program the arm motions, because nothing does what it should. Have I whined enough?

The arm isn't and won't be mounted for a little while...middle of next week or so...

I'm going to try to get what I do have running well enough to show how it can be controlled using a wii controller by the end of the weekend.

Thanks for asking though.


04-26-2008, 01:02 AM
Have you tried counter balancing with springs or weights yet? That will take a lot of strain and power draw off the servos. See the lynxmotion arms for an example.
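
A quick back-of-envelope shows why counterbalancing matters (all numbers here are guesses for illustration, not measurements from Vivian's actual arm):

```python
# Rough holding-torque estimate for a horizontal arm.
ARM_MASS_KG = 0.5     # mass of the arm segment (assumed)
LOAD_MASS_KG = 0.3    # payload at the tip (assumed)
ARM_LENGTH_M = 0.30   # pivot-to-tip length (assumed)
G = 9.81              # gravity, m/s^2

# The arm's own weight acts at its midpoint; the payload acts at the tip.
torque_nm = ARM_MASS_KG * G * (ARM_LENGTH_M / 2) + LOAD_MASS_KG * G * ARM_LENGTH_M
torque_kg_cm = torque_nm * 100 / G   # the units servo specs usually quote

print(f"holding torque needed: {torque_kg_cm:.1f} kg-cm")
# A spring or counterweight supplying even half of this at the pivot halves
# what the servo must hold, cutting the near-stall current draw accordingly.
```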

03-04-2010, 11:20 PM

I hate posting "not started" projects, and this one will probably have to wait till at least this summer. I've had to cut way back on my robotics budget, and my wife and I have taken a few financial hits...

So...Thinking doesn't cost me anything, and this one should be startable for under $200...which as of late is relatively cheap.

I'm using:

An aluminum frame, most of which I already have or can find at the scrap yard
4.75-inch off-road wheels I have lying around
4 DC gearmotors hg37F269-031, of which I have 2
An SSC-32
A phidget 8/8/8
2 See Near IR sensors
2 See Far IR sensors
A Quickcam
and my wibrain computer...

I've been really inspired by RC rock crawlers...This is also kind of something I'm considering using as a rover platform for the book. Haven't gotten to work on that in a while.

I'm thinking of making a roborockcrawler version of Vivian...need a new name of course.

Here are the rather lame drawings so far...I only have 2D TurboCAD. I still have to add the arm and the shocks. I'm looking at about 4-6 inches of wheel travel if I can get everything working right....


03-05-2010, 07:39 AM
If you're planning on having those wheel support arms pivot, you need to pay attention to what the motors will hit as that happens...

- Jon

03-05-2010, 11:09 AM
If you're planning on having those wheel support arms pivot, you need to pay attention to what the motors will hit as that happens...

- Jon

Yeah...the idea is that each arm will swing as a whole unit, the same as in Vivian 1 and Gepetto...so the body will move a bit with it. I may have to move things out to the sides a bit more to make room for the motors, but there is still a bit of room before anything makes contact. I currently have 2 of the motors, so I may mock up one side to see how much room I actually have, and to figure out the geometry for mounting the spring shocks. 4 inches of travel would be 2 inches up and 2 inches down...so it isn't as big a movement as it may first sound. The flip side is that moving the front wheel up by 2 inches will move the back wheel down by the same distance.
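
For the curious, the arm swing needed for that travel is actually small. A quick sketch (the pivot-to-wheel distance is a guess, not a measurement from the drawings):

```python
import math

HALF_ARM_IN = 11.0   # pivot-to-wheel distance, assumed
TRAVEL_IN = 2.0      # per-wheel travel up (or down) from level

# How far the arm has to swing for 2" of travel at each wheel:
swing_deg = math.degrees(math.asin(TRAVEL_IN / HALF_ARM_IN))
print(f"arm swing: +/-{swing_deg:.1f} degrees")

# With a center pivot the wheels move equal and opposite, so the total
# front-to-rear articulation is twice the per-wheel travel:
total_articulation = 2 * TRAVEL_IN
```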