
on encouraging autonomous mechs



societyofrobots
07-11-2009, 06:51 PM
I think we should put some effort into encouraging autonomous mechs, or at least mechs with auto-targeting:

Perhaps make the arena more camera friendly, like better lighting (brighter, more balanced).

Require targets to be bright red, and ban the color red for everything else.

Bonus HP for fully autonomous bots.


I think that, with the above implemented, autonomous bots would have a chance, and non-autonomous bots wouldn't be penalized. Otherwise I don't see anyone ever doing autonomous . . .

WGhost9
07-11-2009, 07:23 PM
I think this is a very neat idea. Autonomous robots are always very impressive to me. Perhaps if open source mech software were available using the ever-popular Trendnet camera, more people would make their robots autonomous for this event.

lnxfergy
07-11-2009, 09:46 PM
I think we should put some effort into encouraging autonomous mechs, or at least mechs with auto-targeting:

Perhaps make the arena more camera friendly, like better lighting (brighter, more balanced).

Require targets to be bright red, and ban the color red for everything else.

Bonus HP for fully autonomous bots.


I think that, with the above implemented, autonomous bots would have a chance, and non-autonomous bots wouldn't be penalized. Otherwise I don't see anyone ever doing autonomous . . .

Frankly, I think we need to see more actual competitors before we worry about any crazy ideas like autonomous mechs.

Going fully autonomous would probably lighten a bot up (since something like the CMUCam + several ranging sensors would be lighter than the Trendnet). Adding autonomous targeting means you have to add a second camera (as the visual tracking feedback loop through an IP cam would be impossible to tune -- especially when dropouts happen), a second camera means more weight, and as we've already discussed a dozen times over, weight is an issue.

Personally, my experience is that the crowd doesn't understand the difference between autonomous and remote controlled, nor do they care. In fact, since the autonomous bots are almost certainly going to be slower, the audience will only think "wow, that sucks" -- take a look at the response to fire-fighting bots... Along the same lines, the level of experience required is significantly higher for autonomous: out of 15 entries for the fire-fighting contest (which has a much lower technical requirement), only 4 worked reliably enough to complete 3 runs. Half or more crashed quite quickly -- that ends a match for us. And regardless of how anyone feels, we have to care about the crowd -- they foot the bill for RG.

On the subject of "Make the arena more camera friendly". Better lighting entails us setting up special lighting, blocking out outside lights, etc. As we have nobody "local" who was an event coordinator, I hope you plan to lug said lighting equipment out (and purchase it) -- and find some way to plug it in, as there was a serious lack of power distribution in the spot we were set up.

As for coloring the targets, it's been discussed, and will someday be implemented I'm sure, but not sure when. As for banning a color, I think you just have to deal with false positives (as will certainly happen when small children wearing red shirts are looking through the front wall openings).

As for bonus for autonomous, well, if you actually build something that doesn't suck, it probably already has a huge advantage in that it doesn't rely on a cruddy wireless camera connection that gets real jumpy. If you build something that sucks, no HP bonus in the world is gonna save you.

All in all, I'm not sure we really have to worry about autonomous mechs -- if people want to go that route, more power to them, but I don't think we need to tailor the game to them since we really aren't in dire need of autonomous bots. The challenge in getting a reliable remote control bot is quite enough for most people....

-Fergs
(and yes, I'm a PhD student in CS, who works on dialogue systems and autonomous robots, and I still feel this way about this particular venue)

societyofrobots
07-11-2009, 10:40 PM
Fergs, the point of my thread was to encourage autonomous bots, *not* to require them or make major rule changes to support them. The rules allow them, but you and I both know how difficult it can be in such an uncontrolled arena. I suggested simple/easy rule additions that in no way hurt the game-play as is.



Personally, my experience is that the crowd doesn't understand the difference between autonomous and remote controlled, nor do they care. In fact, since the autonomous bots are almost certainly going to be slower, the audience will only think "wow, that sucks" -- take a look at the response to fire-fighting bots... Along the same lines, the level of experience required is significantly higher for autonomous: out of 15 entries for the fire-fighting contest (which has a much lower technical requirement), only 4 worked reliably enough to complete 3 runs. Half or more crashed quite quickly -- that ends a match for us. And regardless of how anyone feels, we have to care about the crowd -- they foot the bill for RG.

Good point. But under that reasoning, we should all just do battlebots with lots of flame throwers and loud noises =P



On the subject of "Make the arena more camera friendly". Better lighting entails us setting up special lighting, blocking out outside lights, etc. As we have nobody "local" who was an event coordinator, I hope you plan to lug said lighting equipment out (and purchase it) -- and find some way to plug it in, as there was a serious lack of power distribution in the spot we were set up.

If someone requires controlled lighting, I don't see any reason to stop them from setting up their own special lighting. Cheap studio lighting can be bought for ~$100. My point being, don't use a single spotlight to light the arena, as that'll kill any autonomous bot.



As for coloring the targets, it's been discussed, and will someday be implemented I'm sure, but not sure when. As for banning a color, I think you just have to deal with false positives (as will certainly happen when small children wearing red shirts are looking through the front wall openings).

Such a simple easy rule to add. =P
Heck, only one robot at the last competition even had red on it, and it was just decorative paint. Reducing false positives to encourage autonomous bots!

(and yes, I've made a half dozen autonomous robots that use camera vision) =P

lnxfergy
07-12-2009, 05:50 AM
I know you weren't gonna require autonomous mechs, I'm just saying, is it really even necessary to encourage them (read the original purpose of the game, it doesn't even allude to autonomy). As for colors, would you buy me a new Bioloid frame kit then, cause I already dyed mine red!



If someone requires controlled lighting, I don't see any reason to stop them from setting up their own special lighting. Cheap studio lighting can be bought for ~$100. My point being, don't use a single spotlight to light the arena, as that'll kill any autonomous bot.

Um, what happens when 2 autonomous mechs go up against each other, each has its own lighting, and the two opposing bots want different lighting?

-Fergs

societyofrobots
07-12-2009, 08:16 AM
Are you just against autonomous robots, or tryin to yank my strings? :tongue:


As for colors, would you buy me a new Bioloid frame kit then, cause I already dyed mine red!

Consider it primer and then do another color over it :wink:

Ok in that case . . . I just read that targets will be standardized. Another option would be that targets can be either blue or red, and that the rest of the robot can't be the target color? Point being, autonomous bots can pick out solid bright colors easily, and it in no way affects the non-autonomous players.


Um, what happens when 2 autonomous mechs go up against each other, each has its own lighting, and the two opposing bots want different lighting?

You mean like one mech wants balanced lighting, and the other unbalanced lighting? Not sure what your point is here . . . :tongue:


But seriously, I think it'd be cool to have autonomous bots, and I don't see why you're against simple mods that don't take away from gameplay and could encourage people to add autonomy to their bots!? :confused:

darkback2
07-12-2009, 08:41 AM
I'm not sure what the real argument is here...

I think autonomy is to some degree encouraged...as in, it means less for the pilot to worry about. (We are calling them pilots, right?) I would think of it more as an exercise for you as a builder.

What can you do to make it so your bot can be autonomous given the constraints of the game as it currently stands? I would not necessarily think vision, but maybe highly constrained vision...it might mean that your robot is always shooting at a kid standing by the window in a blue shirt, but hey, right...

How about you make markers for your bot, or waypoints, so that it can know where it is on a map of the arena? I don't think anyone would mind if you hung something off of the top of the arena to help direct your bot. Actually...what if you had your bot detect those "I Build Robot" stickers! So long as they weren't tripping up the other bots, you could make a grid out of those on the floor...have your bot detect them and you're good as gold!

Then it can just shoot at anything that moves inside of the tank. If you used the defender guns with a limited burst setting, seems that would be pretty effective.

Heck, an autonomous mech that can walk around and get a hit or two in would be something to be proud of!

DB

societyofrobots
07-12-2009, 08:52 AM
I'm mostly convinced, with the given rules, autonomous bots are impossible. Except for maybe yours that can handle a lot of payload like a mini-PC thingy =P

Basically, no portable realtime camera system can detect those moving targets if the color isn't predictable, standardized, and stands out from the background. Knowledge that the targets are square-shaped and 3"x3" helps tho!

(and don't those 'I make robot' stickers change sensor sensitivity?)

darkback2
07-12-2009, 09:26 AM
Funny,

I wasn't suggesting you use video to track the targets...just the movement. If you had the robot start at a certain height, and spray everything in that area when it detected something moving? Not really aiming for the targets...not really aiming at all.

As for the I make robot stickers, what if you stuck them on the ground in each of the corners, and in the middle? You could use an IR sensor to detect the shiny stickers, and maybe get navigation working that way?

societyofrobots
07-12-2009, 09:47 AM
I wasn't suggesting you use video to track the targets...just the movement. If you had the robot start at a certain height, and spray everything in that area when it detected something moving? Not really aiming for the targets...not really aiming at all.

Well, it won't work if you want an auto-targeting system while your mech is moving. But I guess who needs to aim when you do the shotgun approach! =P


As for the I make robot stickers, what if you stuck them on the ground in each of the corners, and in the middle? You could use an IR sensor to detect the shiny stickers, and maybe get navigation working that way?

IR sensors, for detecting color/reflectivity, are incredibly unreliable . . .

What about a neon green color? Anyone already planning to make their bot neon green?

(fyi, I don't plan to have auto-targeting, but I still want to encourage it)

DresnerRobotics
07-12-2009, 11:38 AM
I'm more than willing to make changes to accommodate autonomous bots in this competition. Things like brightly colored tape on target plates are totally feasible... enhanced lighting setups add to the cost considerably, however. Gotta realize, I'm having to ship everything for the competition across the country, and already have a considerable investment each year to make it happen.

John, if you want to take lead of organizing the autonomous side of things, I'll certainly be an open ear to your recommendations, but I can't really spend much in resources towards purchasing/shipping lighting, etc.

societyofrobots
07-12-2009, 12:54 PM
I think there was a misunderstanding about my lighting suggestion.


Perhaps make the arena more camera friendly, like better lighting (brighter, more balanced).

These suggestions don't require a single penny spent:
- ban flash photography if an autonomous robot is competing (make a simple sign)
- schedule autonomous rounds at ~12pm, when sun isn't shining through windows
- avoid setting up arena near windows (as best as possible)
- don't use overhead lights (to avoid spotlight effect)
- describe on the website the expected lighting situation (so people can test at home)
- allow competitors to bring and setup their own lighting



John, if you want to take lead of organizing the autonomous side of things, I'll certainly be an open ear to your recommendations
Count me in!

Adrenalynn
07-12-2009, 01:59 PM
Bring and set up their own lighting: The official video crew says "no". We can more evenly light it - we have *faaaar* better lighting gear than you do, but the 'bot needs to be designed for the event, not the event designed for the 'bot.

- avoid setting up arena near windows (as best as possible)
- don't use overhead lights (to avoid spotlight effect)

We're where we're put. Design around it.

societyofrobots
07-12-2009, 03:18 PM
Bring and set up their own lighting: The official video crew says "no". We can more evenly light it - we have *faaaar* better lighting gear than you do

Professional lighting without a penny spent, and no one has to bring their own? ummmm well then lighting problem solved! =P

(my memory kinda sucks, I don't remember lighting at the last event)



but the 'bot needs to be designed for the event, not the event designed for the 'bot.

- avoid setting up arena near windows (as best as possible)
- don't use overhead lights (to avoid spotlight effect)

We're where we're put. Design around it.
I'll see what I can do during the event to help out anyone who wants to do autonomous stuff. I don't like telling autonomous builders 'we'll shine the lights at your camera when we don't have to' or 'we could request to be away from windows before area assignment, but would rather not request it' =P

(and just a reminder, as I said before, I am *NOT* using autonomous camera functions on my bot nor do I need any special lighting or target colors - I am doing this to support the event and encourage autonomy for anyone else)

Adrenalynn
07-12-2009, 06:19 PM
No lighting at the event because there's no power. Steve and I were going crazy just to try to get enough power to power the small amount of video gear we had there. We had video gear turning itself off because power-supplies for vacuum cleaner chargers were spilling over and lapping at the switch. We designed the videography around the limitations of the event, rather than ordering the building knocked to the ground and rebuilt because it had inadequate power, or ordering the San Francisco Bay drained because the humidity was too high... ;)

lnxfergy
07-12-2009, 07:30 PM
Are you just against autonomous robots, or tryin to yank my strings? :tongue:

You mean like one mech wants balanced lighting, and the other unbalanced lighting? Not sure what your point is here . . . :tongue:

These suggestions don't require a single penny spent:
- ban flash photography if an autonomous robot is competing (make a simple sign)
- schedule autonomous rounds at ~12pm, when sun isn't shining through windows
- avoid setting up arena near windows (as best as possible)
- don't use overhead lights (to avoid spotlight effect)
- describe on the website the expected lighting situation (so people can test at home)
- allow competitors to bring and setup their own lighting


Am I against autonomous robots? No. Am I against adding more rules just to encourage the "hypothetical development" of an autonomous mech? Yes. You've already stated you don't intend to build one. Out of everyone on this board, I am unaware of ANY COMPETITOR WHO HAS BOTH THE SKILLS AND DESIRE TO BUILD AN AUTONOMOUS MECH. If that's untrue, speak up now please -- but frankly, if you can build an autonomous mech that works really well, you probably have a good shot at a high-paying job somewhere building robots to hunt terrorists.

A mech that wants imbalanced lighting? No, I'm talking about the fact that if we have 3 people give us their idea of "perfect lighting", we'll get 7 answers. Little changes really affect color-based vision. I don't think you'll be able to get a light enough camera with enough processing power to do truly dynamic region segmentation, especially over longer distances. You think 3"x3" target plates are "too big" for your mech, how about recognizing said target plate at 5' out... I say near impossible given the available camera/processing platforms that would actually fit on most mechs.
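To put a very rough number on that last point, here's a quick back-of-envelope sketch (Python; the field of view and resolution below are assumptions chosen for illustration, not anyone's actual camera):

```python
import math

# Back-of-envelope: how many pixels does a 3"x3" target plate cover at
# 5 feet?  Both camera numbers are assumptions for illustration only.
fov_deg = 45.0          # assumed horizontal field of view
image_width_px = 160    # assumed horizontal resolution
target_in = 3.0         # 3" target plate
range_in = 60.0         # 5 feet

scene_width_in = 2.0 * range_in * math.tan(math.radians(fov_deg / 2.0))
plate_px = target_in / scene_width_in * image_width_px

print(f"scene width at 5 ft: {scene_width_in:.1f} in")  # ~49.7 in
print(f"plate spans roughly {plate_px:.1f} px")         # ~9.7 px
```

Under those assumed numbers the plate covers only about ten pixels across, which is why picking it out reliably at range is such a stretch.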

Ban flash.. HA.. we made multiple comments near fire fighting.. people don't care. You have to deal with it. It's a fact of life. Flash will go off. If your mech can't handle it, you should (by driving it).

Avoid windows.. yeah, the whole place has windows around the top edge, and we end up where we end up (as Adrenalynn already pointed out).

Don't use overhead lights... um, you want me to go shut off the whole lighting system for the whole building then? We don't control the ceiling lights.

Describe the situation.. they can just read this, and the dozens of other threads -- the lighting sucks, it will change throughout the day. Best approximation, take your mech to an Owens Corning Sun Room Demo on a day with passing clouds, and see if it works.

Allowing competitors their own lights means finding power for them.. and dealing with the problem of two mechs who want different lights. And dealing with setup/teardown, frames to hold the lights, eesh. If there is not a regulating body governing lights it will be a mess. If you really feel this is so necessary, step up and run the lighting department, but bring at least 400' of extension cords so you can find an outlet in the next building over.

-Fergs

Adrenalynn
07-12-2009, 08:08 PM
We brought over a thousand feet of extension cords. The issue there is traffic safety. We also had several 500' rolls of Gaffer's Tape to tape 'em down with, but even so - you have to consider the crowd safety (and potential damage to equipment) when you start trying to run cordage. The environment in there just can't be controlled well enough. Another issue is running cords along fire doors. Little things like that can get the event yellow or even red tagged. With the number of people in there, fire marshals watch those things like a hawk, especially with all the noxious chemicals (various battery chemistries, welding gasses, and fuels)

As it was, I was cringing at the clear violations of power-strips-into-power-strips. Any time I found an opportunity, I pulled those and ran another cord. But eventually the two two-gang boxes we could safely reach with cords were over-flowing. And I was holding my breath for load on a couple of those power strips popping their integrated breakers in the middle of a match. I had the clamp ammeter on 'em several times, moving stuff around trying to balance the power.

My small softboxes are 500-750watts x 2. My large soft boxes are 1500 watts x 2. My large reflectors are 1000 watts x 4. THAT doesn't get run off of a power strip...

Robot Dude
10-14-2009, 11:05 AM
I am unaware of ANY COMPETITOR WHO HAS BOTH THE SKILLS AND DESIRE TO BUILD AN AUTONOMOUS MECH.

Hi there! :veryhappy:

If you could throw us out-of-the-box dreamers a bone here... Give me lime green targets. That's all I'm asking for. We will have an autonomous biped mech at the next event. Let it run with human pilot competitors, we don't care. I know you guys are experts on what can't be done! So please let us worry about the technical issues. Not only do I think we can do it, I think we can do it for under $1000.00. No embedded PC microcontrollers or expensive digital servos required.

lnxfergy
10-14-2009, 11:18 AM
Hi there! :veryhappy:

If you could throw us out-of-the-box dreamers a bone here... Give me lime green targets. That's all I'm asking for. We will have an autonomous biped mech at the next event. Let it run with human pilot competitors, we don't care. I know you guys are experts on what can't be done! So please let us worry about the technical issues. Not only do I think we can do it, I think we can do it for under $1000.00. No embedded PC microcontrollers or expensive digital servos required.

That's fine, you can even bring the post-it notes to put on my scoring plates (since your idea of lime green may not be the same as mine). But can I please ask, how are you going to overcome the technical issues? Lime green does not always equal lime green; a pure color blob tracking algorithm will probably fail miserably given the horrendous lighting at RG (heck, I found it hard to drive at certain times of day cause the sun washed out the camera).

I'm not being sarcastic or mean here, I'm very serious when I ask: how? Any video algorithm that's more than RLE + blob tracking typically won't fit on a PIC/AVR. And I can't think of many reliable non-visual algorithms that can detect the difference between a mech, a wall, or a building (especially on a biped that can't localize itself)...

-Fergs

P.S. As a side note, when I advocate for "expensive digital servos" (AX-12?), it's more an ease-of-use thing. Pose and capture really does speed up prototyping when you aren't just building a kit robot that already has all its walking/gaits developed.

Robot Dude
10-14-2009, 11:48 AM
That's fine, you can even bring the post-it notes to put on my scoring plates (since your idea of lime green may not be the same as mine). But can I please ask, how are you going to overcome the technical issues? Lime green does not always equal lime green; a pure color blob tracking algorithm will probably fail miserably given the horrendous lighting at RG (heck, I found it hard to drive at certain times of day cause the sun washed out the camera).

I'm not being sarcastic or mean here, I'm very serious when I ask: how? Any video algorithm that's more than RLE + blob tracking typically won't fit on a PIC/AVR. And I can't think of many reliable non-visual algorithms that can detect the difference between a mech, a wall, or a building (especially on a biped that can't localize itself)...

-Fergs

P.S. As a side note, when I advocate for "expensive digital servos" (AX-12?), it's more an ease-of-use thing. Pose and capture really does speed up prototyping when you aren't just building a kit robot that already has all its walking/gaits developed.

My comments on expensive digital servos are directed towards all the nay-sayers from all the countless posts concerning biped mechs. I even remember you saying...


It all is gonna come down to engineering lower-power leg designs and gaits in order to make a biped work well at lower budgets.

That's what the BRAT is, man! A walking platform that doesn't put the hip and knee servos into more than +/- 15° out of lined up -- low power. It's been there from the beginning, but for some reason the misinformation is still widespread here. lol Glad you guys aren't heading up medical research or we would all be attaching leeches for bloodletting. :)

As for the lime green color tracking question. I'm not going to give away all my tricks. Perhaps a hint. The perceived color is solely dependent on the light shined on the object. And it doesn't involve a single 4000w lighting fixture... Think about it. ;)

Adrenalynn
10-14-2009, 12:41 PM
>> for some reason the misinformation is still widespread here. lol Glad you guys aren't heading up medical research or we would all be attaching leeches for bloodletting.

Please don't be so insistent with the fight-picking, Jim. There's a handful of really smart people in here and they're largely biting their collective tongues to maintain the tone traditionally maintained on the TRC. I suspect you don't want to take excessive advantage of their good will and hospitality...

xdream
10-14-2009, 01:10 PM
Hi all,

I am new to this community but not to robotics. I have made several autonomous robots and think that it would be possible to enter a fully autonomous mech.

I agree that we need to design around the arena conditions but hope at least we could have uniquely colored targets to shoot at.

I'm surprised there are so many nay-sayers for autonomy....should be an interesting competition if autonomous robots are involved and even mixed with non-autonomous robots.....

Based on the videos I have seen and what I have heard about the last competition I would be doing well to just have a functioning autonomous robot even if it does not do too well.

xdream

xdream
10-14-2009, 01:15 PM
I just want to say that I like the color red idea from the original post below.....I don't think we need to change the arena lighting too much..I agree that we can't engineer the arena...but this post hit the spirit of a rule that would make it possible to do autonomous robots...

xdream


on encouraging autonomous mechs
I think we should put some effort into encouraging autonomous mechs, or at least mechs with auto-targeting:

Perhaps make the arena more camera friendly, like better lighting (brighter, more balanced).

Require targets to be bright red, and ban the color red for everything else.

Bonus HP for fully autonomous bots.


I think that, with the above implemented, autonomous bots would have a chance, and non-autonomous bots wouldn't be penalized. Otherwise I don't see anyone ever doing autonomous . . .

Robot Dude
10-14-2009, 01:23 PM
After further research on this, yes it's happening this fast. Red has proven to be much easier to track. Not sure why...

lnxfergy
10-14-2009, 01:31 PM
The problem with just saying "red" target plates is: how do you define "red"? Depending on color space, camera, lighting, "red" is many different things. Simple color blob tracking has serious limitations.

I'm not against autonomy. I've participated in (and won) numerous autonomous fire fighting contests. I also work in a university research lab that works with autonomous bots. However, I think that building a working autonomous mech warrior is significantly harder than even, say, the Trinity Fire Fighting contest. In a typical year at Trinity, there might be 100 entries, of which less than 15 can routinely navigate a known maze and extinguish a candle. Now, in addition to autonomous navigation, you're tracking a moving object, and you have to do it visually -- and all your processing power has to stand on two little legs.
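To make the "how do you define red" problem concrete, here is a minimal color-thresholding sketch (Python/OpenCV; the HSV ranges are arbitrary example values, not anything tuned for the arena -- note that red wraps around the hue axis, so it takes two ranges, and a change in lighting or white balance shifts all of these numbers):

```python
import cv2
import numpy as np

def find_red_blob(frame_bgr):
    """Return (cx, cy, area) of the largest 'red' blob, or None.

    The HSV ranges are example values only; what counts as 'red' moves
    with the camera, white balance, and lighting, which is the point.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0 on OpenCV's 0-179 hue scale, so two masks.
    lower = cv2.inRange(hsv, np.array((0, 120, 80)), np.array((10, 255, 255)))
    upper = cv2.inRange(hsv, np.array((170, 120, 80)), np.array((179, 255, 255)))
    mask = cv2.bitwise_or(lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"], cv2.contourArea(blob))
```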

-Fergs

Robot Dude
10-14-2009, 01:32 PM
Please don't be so insistent with the fight-picking, Jim. There's a handful of really smart people in here and they're largely biting their collective tongues to maintain the tone traditionally maintained on the TRC. I suspect you don't want to take excessive advantage of their good will and hospitality...


To basically say we are living in the dark ages is ludicrous... we actually built bots and showed up. And we're continuing to build better bots. I'd say that's quite a bit of development over the rest of the general population.....

I honestly thought it was ok to throw a little sarcasm in here. I've seen enough of it from some other members. I didn't intend to insult or overstep some boundary. I have seen less than encouraging comments on bipeds even lately. But this is just going way off topic... My bad... :(

Adrenalynn
10-14-2009, 01:35 PM
Sorry - can't ban the color red. We already have red mechs.

Pure green (255/255/255) is the most sensitive color for traditional CCDs. It's also the least saturated color in human skin tone - which is why it's used for chromakey. It will also get washed out in the natural lighting.

Be aware that blinding the opposing mechs' cameras with any lighting added gives you walking papers instead of awards - just as a heads-up. I've already had my IR illuminators denied.

xdream
10-14-2009, 01:45 PM
I'm confused...isn't 255/255/255 white? I thought green was 0/255/0, unless we have our terminology confused. As far as red goes...ignoring the banning topic...it doesn't have to be pure red...if we just agree to a particular color red tape or paint or maybe even a post-it there shouldn't be a problem.

Green is difficult to track since it is the most sensitive color. Blue may be an alternative but is harder to detect than red.... just some thoughts... If we are going to have autonomous bots we just would need to agree on a fair standard color and then us dreamer/designers can work with that.

xdream

Adrenalynn
10-14-2009, 01:58 PM
Sorry - you're correct. I was just working on a product shot as I wrote that. 0/255/0. Pure saturated green.

The fair standard is: "plan for anything. Don't get in the way of the people that have already invested tens of thousands of dollars in their already working, walking, shooting, video-transmitting mechs. Don't try to redesign the event to fit your desires, make your desires fit into the realm of reality - ie. the competition as-written"

I have six plus gallons of this (http://www.markertek.com/Lighting-Background-Effects/Backgrounds-Stands/Chroma-Key-Products/Rosco-Laboratories/150-05711-0640.xhtml) if anyone needs to paint anything on-site. ;)

darkback2
10-14-2009, 02:04 PM
Hey,

I have sort of stopped posting because of the negative atmosphere that seems to have grown in the forums over the past while. There are a lot of smart people in here, and out there...and really...

I guess the point is that it seems like it used to be about celebrating the cool things people were doing, and learning how they did them, and now it's about arguing with each other about how wrong we all are.

I'm not trying to pick any fights, and I've tried to contribute where I can, but until we all start to value each other's ideas again, and think twice about all of the things we post, then this community really isn't something that I'm going to be too much a part of. Which makes me sad because I really look forward to seeing what is up at the Trossen forums every day.

Maybe I shouldn't have posted this, and maybe I'm just as much to blame...but oh well.

DB

DresnerRobotics
10-14-2009, 02:35 PM
I have seen less than encouraging comments on bipeds even lately. But this is just going way off topic... My bad... :(

If you're referring to the recent topic on the pan/tilt camera, my comments were more to the point that the idea of using a 500g+ pan/tilt camera was ridiculous when less expensive, lighter options were widely available. Payload is a consideration with any biped, and even with the 6 grand in servos that are sitting in my bot, I wouldn't use a 500g camera.

I'm all for bipeds, and I'm all for there being a low cost entry such as the BRAT (Thank you for making that a reality). People just seem to think sometimes that payload is a non-issue, when it's one of the biggest issues in this competition.

Robot Dude
10-14-2009, 02:37 PM
The problem with just saying "red" target plates is: how do you define "red"?

A good place to start is with Pantone colors. They can be accurately reproduced by any print shop. Maybe even find the most reliably tracked color with, say, a CMUcam, then cross it to the Pantone equivalent.

I'm not really about hiding ideas. So I will reveal the lighting idea...

Of course there may be a rule against it... Why rely on big expensive lights to flood the arena? Why not allow the bots to carry the equivalent of a precision white light source (LED flashlight)? It's not likely to disable a camera. It would allow color tracking. It may not even be necessary, but it's hardly intended as a way to disable other bots. Well there it is... :rolleyes:

You do allow laser pointers, yes? They could disable a camera if pointed directly into it. So if laser pointers are ok, then why not LED flashlights?

lnxfergy
10-14-2009, 02:41 PM
XDream/Jim - As for colors - both Tyberius and I have said you can bring any color post it note you want to slap onto the opponent's scoring plate. We won't be dictating a particular bot's colors (so it is possible that the color of your post-it note may be the same as a particular bot, it's just something you'll have to live with) or the lighting of the arena (it's just too hard to actually control that at RG).

(A note for 2010 competitors, the new scoring plates will not be transparent like the 2009 ones! Transparent FSRs just don't exist. Something to take into account as you design your bots, since I know a number of people were thinking of using the target plate as a camera cover, and I'm not sure this has really been mentioned)

All - as for autonomy, if you can do it with above restrictions, go for it! My previous comments basically boil down to: I'd love to see an autonomous mech, it's just not gonna be easy, even RC mechs were quite hard to get working reliably.

-Fergs

Adrenalynn
10-14-2009, 02:44 PM
As I said before - I'm happy to light the arena. More/better power options than we had last year would be a requirement. I'll need 20A dedicated 110v or 9A dedicated 240v.

xdream
10-14-2009, 03:15 PM
All of this sounds great...I like the idea of a colored post-it note or just a simple square of red (or some other color like blue) packing-type tape on the scoring transponders. I agree we should not significantly restrict the rules of others.

I plan to enter an autonomous mech but we will see if I am successful. I plan to use Jim's BRAT platform with custom autonomous code...quite a challenge. If all else fails I can always drop back to human piloting.

Yes...autonomy has many huge challenges to overcome but I believe it is achievable with current technology on a reasonable budget.

Thanks for the flexibility....let's hope I don't end up making a free-roaming 'target' for you guys!

xdream

Adrenalynn
10-14-2009, 03:49 PM
As I said before - I'm happy to light the arena. More/better power options than we had last year would be a requirement. I'll need 20A dedicated 110v or 9A dedicated 240v.

Per my offer: http://forums.trossenrobotics.com/showthread.php?p=35138

And I need to edit that above requirement. I'd need [email protected] or [email protected] as a minimum. Just giving some thought to the environment. If we enclosed it, it'd be easier, but since we can't, it's going to take something similar to my home chromakey stage, as in the link. My home version has 60A @ 240v then split off into legs, and that's dedicated for lighting...

sam
10-14-2009, 06:05 PM
whew, this developed quite fast... Had to read the whole thread.

So, all the processing power of the robot needs to be on the bot itself? If I may ask, what kind of processor are you going to use, Jim and xdream? Something like the XMOS or an ARM processor? But could that cut it for blob tracking?

Sam

mannyr7
10-14-2009, 06:16 PM
I don't think the processing has to be completely housed on the mech. You could still have control be offloaded to a wireless PC link. Just replace the human pulling the trigger with some AI of your own design. Save on weight this way.

lnxfergy
10-14-2009, 06:21 PM
I don't think the processing has to be completely housed on the mech. You could still have control be offloaded to a wireless PC link. Just replace the human pulling the trigger with some AI of your own design. Save on weight this way.

If we think of our gun homing in on an opponent, it's really a visual servoing task. The one issue to be careful of when offloading video for processing is the introduction of non-deterministic network lag (that's the sort of stuff that makes servo-loop tuning nearly impossible, and typically unreliable). Effectively, you'll have to stop and wait to aim, and hope your opponent stops for you to aim at...
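As a toy illustration of why that lag matters (pure simulation with made-up numbers, not anyone's real control code): a simple proportional aim loop that only ever sees stale target bearings starts to overshoot, then oscillate or diverge, as the feedback gets older.

```python
# Toy proportional aim loop with delayed feedback.  All numbers are
# invented for illustration.
def worst_excursion(delay_frames, gain=0.5, steps=40):
    aim = 20.0                            # start 20 degrees off target (target at 0)
    history = [aim] * (delay_frames + 1)  # pipeline of stale measurements
    worst = 0.0
    for _ in range(steps):
        stale_error = history[0]          # controller only sees an old error
        aim -= gain * stale_error         # proportional correction
        history = history[1:] + [aim]
        worst = max(worst, abs(aim))
    return worst

for delay in (0, 2, 5):
    print(f"delay={delay} frames -> worst excursion {worst_excursion(delay):.1f} deg")
# With no delay the error simply decays; a few frames of lag and the same
# gain overshoots and oscillates; more lag and it diverges outright.
```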

-Fergs

Adrenalynn
10-14-2009, 06:21 PM
So what happens when your packet transit times go from 80ms to 1500ms to 150ms to 1200ms to 75ms? How are you going to address realtime processing from non-deterministic systems? How are you going to get any TTF from your range sensors? How are you going to keep it from flopping on its back? Where is your algorithm tuning going to get the necessary data? Infinitely recursive Kalman filtering?

I'm not saying you can't do it. I just would like for you to spell out how specifically you're planning on overcoming non-determinism?

[edit] Sorry - Fergs and I were posting at the same time - he just beat me in there by a little bit.

sam
10-14-2009, 06:23 PM
However, I think that building a working autonomous mech warrior is significantly harder than even, say, the Trinity Fire Fighting contest. In a typical year at Trinity, there might be 100 entries, of which less than 15 can routinely navigate a known maze and extinguish a candle. Now, in addition to autonomous navigation, you're tracking a moving object, and you have to do it visually -- and all your processing power has to stand on two little legs.

-Fergs

Fergs is probably right,

but I don't see anything specific on the matter in the rules.

EDIT: Ok I get it. So it's not necessary but useful?

xdream
10-14-2009, 06:28 PM
I will be doing all of my processing on the robot itself. As for the processor, the bot will use an Atom Pro...the vision system is TBD.
xdream

Adrenalynn
10-14-2009, 06:36 PM
>> will use an Atom Pro..

Awesome!

What will you be doing for field memory/frame buffering?

lnxfergy
10-14-2009, 06:40 PM
Fergs is probably right,

but I don't see anything specific on the matter in the rules.

EDIT: Ok I get it. So it's not necessary but useful?

Ok, yeah, I see what you were getting at, there is no Mech Warfare rule against off-board processing. Heck, if you wanna haul a giant cluster of PCs down there, it's ok (although finding enough power might be a problem). However, visual servoing is typically one of those things that's best done on the local processor. A better task to offload would be something like mapping and localization: there's very sparse data to offload each frame, the timing isn't super critical, but the processing power and data storage can be huge, so it's a great candidate to offload (as would be high-level reasoning, long-term planning, etc.). Generally, I'd say offload long-term processes, keep the fast/servo-like stuff onboard.
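A rough sketch of that split (Python, purely illustrative; read_camera_error() and move_turret() are invented placeholders stubbed out so it runs, not anything from a real mech):

```python
import queue
import threading
import time

pose_updates = queue.Queue()        # sparse data shipped off-board

def read_camera_error():
    return 0.0                      # placeholder for onboard vision error

def move_turret(correction):
    pass                            # placeholder for a servo command

def onboard_servo_loop():
    """Fast loop: visual servoing stays local, no network in the path."""
    for step in range(100):
        error = read_camera_error()
        move_turret(0.5 * error)               # tight, deterministic timing
        if step % 10 == 0:
            pose_updates.put(("pose", step))   # occasional sparse update
        time.sleep(0.01)

def offboard_planner_loop():
    """Slow loop: mapping/planning can tolerate network lag."""
    while True:
        try:
            msg = pose_updates.get(timeout=0.5)
        except queue.Empty:
            break
        # ...update a map, plan a route, send back a high-level goal...
        print("planner got", msg)

threading.Thread(target=onboard_servo_loop, daemon=True).start()
offboard_planner_loop()
```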

-Fergs

xdream
10-14-2009, 07:27 PM
Adrenalynn- most likely an ARM processor for the on board image processing but I haven't decided yet.

Back to the color discussion...would anyone object to using a hot pink post it note for a standard color? It is close to red but us autonomous guys need to standardize on something and I find it works well...we will have to live with the red bot.

xdream

Adrenalynn
10-14-2009, 07:41 PM
You can make it any color you want. Just bring post-it notes sized appropriately for the target sensors. I suggest being ready to deal with it being black with lots of speckles or white with lots of bloom depending on the time of day, and also be ready to deal with someone's hot-pink mech that they built just 'cause.

So you're building an ARM board? Going the SDRAM route or dual ported/triple ported memory or ND or ... ? I'm in the middle of the design for a couple XMOS-based cameras which is why I ask... Going for your low-end CMOS/YUV camera type? They're not terribly light responsive - 1lux is typically the best you'd get, which would be completely black in the arena, or NTSC? ATSC? From my experience last year, I'd suggest that 0.5lux be your absolute minimum, but for autonomy you're probably wanting something at least 0.01 or even ideally a 0.001 lux camera. So maybe mate the ARM to one of the Philips VIPs to make life easier?

xdream
10-14-2009, 07:47 PM
I think the best idea is that red reflective tape...just cover the target with it...maybe we have a common source to buy samples to experiment with.....

I would really love to do an autonomous bot and any help from the community would be greatly appreciated. All efforts will be made to not impose unnecessary constraints on the contest rules.

If people agree I can find a source for reflective tape and send it to whoever will be distributing the targeting sensors.

Sorry, I'm still new here and don't even know who to address the question to yet.

xdream

xdream
10-14-2009, 07:51 PM
Like another poster said...I would be very happy if an autonomous bot could survive the competition and get a few hits in....but it would be inspiration for next-gen robotics...what would be great is a fully autonomous match but I doubt we will have enough participants...

lnxfergy
10-14-2009, 07:55 PM
If people agree I can find a source for reflective tape and send it to whoever will be distributing the targeting sensors.

Xdream,

The idea is "there is no standard" color. post-it notes are 3"x3" (same size as our scoring boards). The suggestion has been to get something like a post-it note, that you like. Whatever you show up with is fine, we'll attach it to your opponent before the match. That way, different people can have the different color they want. The one thing I'd request is: whatever it is you bring, make sure it's not SO STICKY that we can't get it off easily at the end of a match.

-Fergs

societyofrobots
10-14-2009, 07:56 PM
Its time for me to chime in . . . hey, I started this thread!

First, I 100% agree with Robot Dude. What he plans to do I know can be done. In fact, the biped I'm designing works by the same principles.

For anyone who thinks you can't track the color white on a grey background outside in bright daylight using small microcontrollers, I say you haven't seen my ERP robot:
YouTube - ERP at Mobot 2008
btw, I tracked both blue and green, as red was too flooded by sunlight. As long as the colors are predictable, a robot can track anything. That was the purpose of this thread, to encourage some form of predictability that can be programmed into a bot.

As for darkback's comments, I also 100% agree. I always feel spoken down to, insulted, told that I'm wrong, and generally feel a constant air of negativity from multiple core TR forum members. It's always important, as an adult, to be humble and accept the possibility of sometimes being wrong despite your beliefs. As the TR forum isn't conducive to my happiness, I rarely participate. =P

ps - If someone supplies me with a color to put on my target board, I'd be more than happy to do it. I'd even help you get it working, despite it reducing my chance of winning. For me its about having fun, making friends, and learning - not winning! =)

Adrenalynn
10-14-2009, 07:57 PM
>> All efforts will be made to not impose uncessary constraints on the contest rules.

You keep mentioning "imposing unnecessary constraints on the contest rules" - I'm afraid I don't understand that.

When we design Sci Oly bots, for example, we don't get to walk in and tell them we're changing their rules. They'll just laugh and show us the door. Actually, most of the coordinators/proctors would frown and show us the door, but for sake of argument...

As a proctor for Science Olympiad myself, I bring copious copies of the rules, a highlighter, and a pretty purple pen. I get into a nice swing of highlighting the rule in question, and then writing "Next year, please read the rules. Disqualified." Here I always kinda thought that's what rules were about, right? Everyone has a set-in-stone target. I try to design my bots to fit the rules instead of trying to redesign a competition's rules to fit my bot.

I guess I never really considered that I could just impose whatever rule changes I felt were necessary on competition organizers.

Adrenalynn
10-14-2009, 08:03 PM
>> to be humble and accept the possibility of sometimes being wrong despite your beliefs.

Show me. That's all I ask. The rampant fantasy on this forum has gotten to the point that I'm having trouble accepting it. This place is becoming worse than ... naw, no reason to drag another forum into it...

Whatever. The community has set its course and decided what it wants to become - I wish it all the best in that decision.

Robot Dude
10-15-2009, 08:58 AM
I guess I never really considered that I could just impose whatever rule changes I felt were necessary on competition organizers.

I think this situation differs in one major way. We aren't necessarily asking for rules to be changed for the event. We are asking for clarification of rules, or standardization of concepts so autonomy can be tried. It would be fun. Really asking for Andrew to chime in here. Let's face it. If Andy wants autonomous bots at his event then he needs to regulate it, or delegate the regulation.

lnxfergy
10-15-2009, 10:55 AM
Jim (and others),

I'm going to specifically add the 'you can bring any 3"x3" colored dot/post-it note' rule to the next rules draft. I'd think you'd be happier with getting the option of any color, rather than us dictating a particular one. Beyond that, I'm not sure we really can regulate it that much more. We just have to deal with whatever lighting comes in (even if Jodie sets up awesome lights, assuming she can find power for them, they may still get overpowered by the ambient lighting coming in; we just can't seal it up and run in a dark box).

As for promoting/discouraging autonomous mechs, we've done neither. Frankly, autonomous robots are typically gonna be slower than those that have a human controller; if you look at the videos from 2009, the human-piloted mechs were already slow/clunky enough for most people. We have to remember: Robogames needs to make money off admissions to pay for the event, your entry fee is pretty small (minuscule compared to many robotics competitions). Therefore, we also want to be somewhat crowd-pleasing. The general population just doesn't understand autonomy and its limitations. I can't tell you how many times I've watched fire-fighting robots run into a corner and get stuck. Imagine watching a mech do the same, people tend to walk away pretty fast....

Building an autonomous mech is a choice, just like building a biped rather than a quad: it's a harder route to go, and you do it because you have a desire to do so, not because the rules are designed to make it advantageous (and I know Andrew/Tyberius agrees with me here). Also, look at the spirit/intention of the game: we wanted to recreate mini-scale Mech Warrior, a game where humans piloted monstrosities; autonomy is an afterthought, and is really only going to happen if a particular builder wants to do it. I think we will see more "fly-by-wire" applications though, not so much full autonomy. And that would also be in line with the spirit/intention of the game.

-Fergs

DresnerRobotics
10-15-2009, 11:33 AM
I think this situation differs in one major way. We aren't necessarily asking for rules to be changed for the event. We are asking for clarification of rules, or standardization of concepts so autonomy can be tried. It would be fun. Really asking for Andrew to chime in here. Let's face it. If Andy wants autonomous bots at his event then he needs to regulate it, or delegate the regulation.

I agree with what Fergs said here. The original intention of Mech Warfare was to have the robots be remotely piloted, however I do not see an issue with autonomous robots being involved whatsoever. That said, running the competition in its current form is a massive drain of my own resources and time, so I can't make any promises to go out of my way and bend over backwards to accommodate autonomous robots.

I will however, within the best of my abilities, try to foster a competition where autonomous robots are possible. The new venue is going to have much better lighting, and won't have giant 80+ year old windows shining strange spectrums of varying light throughout the day on us. Everyone has also agreed to place colored markers on their robots to assist in machine vision. Past that, there isn't a ton we can do... the environment is going to be the biggest challenge to overcome, and that is about as controlled as it can be.

If we end up with a significant population of autonomous robots, then we can always look at making a specific league or ruleset for them, but we'll cross that bridge when we get to it.

Robot Dude
10-15-2009, 11:56 AM
Hi guys,

First off Fergs,

If an autonomous mech were to appear at the next event, and its builder says it can track these post-its, great, we're good to go. But if two autonomous mechs were to appear (it could happen) and one has one color and the other has a different one... which one is used? All I'm asking for is a standard. I will even offer a suggestion: hot pink post-it notes. Make it standard and be done with it.

Autonomy does not have to equal slow. Look at any Japanese 3kg sumo. There is no way an RC sumo can compete with autonomous ones. I submit to you an intelligently designed autonomous mech could far outperform a human pilot. I know the spiel on creating interest for the spectators. Just don't get why you immediately assume they will suck. :)

It appears to me that many have been discouraged, simply due to the attitude of "don't bother us with it." I'm really just trying to be clear. I'm not trying to pick a fight.

Andy,

I appreciate your input on this. Thanks! Jim

xdream
10-15-2009, 11:59 AM
Sounds great....the colored targets are all we (the people interested in autonomy) are really asking for, and it seems there is no problem with this as most people agree to that concession. Without that, even auto-targeting will be close to impossible.

I would think that it would even be a benefit to human pilots to have the targets they are shooting at stand out in the video field...

xdream

xdream
10-15-2009, 12:01 PM
I agree with Jim...and am happy with the hot pink as well...
xdream

lnxfergy
10-15-2009, 12:13 PM
If an autonomous mech were to appear at the next event, and its builder says it can track these post-its, great, we're good to go. But if two autonomous mechs were to appear (it could happen) and one has one color and the other has a different one... which one is used? All I'm asking for is a standard. I will even offer a suggestion: hot pink post-it notes. Make it standard and be done with it.

Autonomy does not have to equal slow. Look at any Japanese 3kg sumo. There is no way an RC sumo can compete with autonomous ones. I submit to you an intelligently designed autonomous mech could far outperform a human pilot. I know the spiel on creating interest for the spectators. Just don't get why you immediately assume they will suck. :)

Jim, the idea is that we put the post-it notes on for each match depending on what competitors need (see my earlier posts). Thus, your opponent always has the correct post-it notes on their targets (it really isn't much work, and it ensures that you have been able to test/train on the exact color you choose). One reason we don't want to pick a standard: everybody isn't going to agree on a standard, there's always gonna be someone who wants something different.

As for speed, sumo is a very simple task: find the opponent, blow it out of the ring, and don't go out yourself. A few ranging sensors, some reflective ones, etc. Once you are forced towards a video solution, I think it's a lot harder. I've done some work in machine vision, not a lot, but enough that I know I wouldn't want to deal with the very variable light in our arena. I'm not saying that the really bright light is the issue, I'm more worried about shadows. Shadows change color a lot. SOR posted a video of an outdoor line follower as an example that bright lighting can be overcome, but even there, there aren't a lot of shadows, or even a lot of moving shadows.

A top autonomous mech probably could navigate faster than a purely human-piloted one (if we throw out any desire to localize or keep track of our travel), thus my reason for suggesting "fly-by-wire" will become commonplace (Issy2 will be fly-by-wire, where the onboard processor is handling obstacle avoidance, and the pilot is just steering it in the right direction).

But targeting... targeting is the hard part, I think. We also saw humans having a hard time (the non-deterministic lag in the network made it really hard to visually target). On one hand, autonomous mechs might have an easier time targeting since their lag is deterministic; on the other hand, color tracking is gonna be tough given those shadows and variable lighting.
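One common trick for the shadow side of that problem (an illustrative sketch with made-up pixel values, not a claim that it solves the arena): divide out brightness and compare chromaticity, so a plate that falls into plain shadow still lands near the same color values. Colored shadows and washed-out highlights will still break it.

```python
def chromaticity(r, g, b):
    """Normalized (r, g) chromaticity: brightness divides out, so a
    dimmer pixel of the same surface keeps roughly the same values."""
    total = r + g + b
    if total == 0:
        return (0.0, 0.0)
    return (r / total, g / total)

# Made-up example: a hot-pink plate in sun vs. in plain (gray) shadow.
sunlit = (230, 60, 150)
shadow = (115, 30, 75)        # roughly half the brightness

print(chromaticity(*sunlit))  # ~(0.52, 0.14)
print(chromaticity(*shadow))  # ~(0.52, 0.14) -- same chromaticity
# A raw RGB threshold would treat these as different colors.
```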

Either way, if you think "an intelligently designed autonomous mech could far outperform a human pilot", why are we having this discussion? Maybe we should be discussing giving human-piloted bots a bonus cause they still have the puny human?

-Fergs

Connor
10-15-2009, 12:17 PM
Hi guys,

First off Fergs,

If an autonomous mech were to appear at the next event, and its builder says it can track these post-its, great, we're good to go. But if two autonomous mechs were to appear (it could happen) and one has one color and the other has a different one... which one is used? All I'm asking for is a standard. I will even offer a suggestion: hot pink post-it notes. Make it standard and be done with it.

Since we're only doing 1-on-1 right now, I don't see that as an issue. Just change out the notes and you're good to go. In the future, I could see it being a slight issue with 2-on-2, but IF I were building a bot like this.. I would make sure I could change the target color before the match, or even on the fly. Also, what happens if we set a color for a target, and someone shows up with a mech that is painted that color?

Thanks, Connor

lnxfergy
10-15-2009, 12:20 PM
Since we're only doing 1-on-1 right now, I don't see that as an issue. Just change out the notes and you're good to go. In the future, I could see it being a slight issue with 2-on-2, but IF I were building a bot like this.. I would make sure I could change the target color before the match, or even on the fly. Also, what happens if we set a color for a target, and someone shows up with a mech that is painted that color?

Thanks, Connor

We'd have to have multiple colors for 2 on 2 anyways! Otherwise, how do you tell if you are shooting your own team or a competitor?

As for a mech being a particular color, that's just a fact of life (as I said before). It's also a fact of life that if you are targeting hot pink, a child with a t-shirt with a hot-pink 3"x3" square on it may appear. That's the trouble with autonomy: we aren't in a totally artificial environment, and the real world is really dirty for vision.

[Edit: for that reason, you might train on multiple colors, if you can choose to flip between green or pink targets, you can switch depending on what your competitor looks like, etc]

-Fergs

xdream
10-15-2009, 12:33 PM
Then the autonomous bot needs to select a different color...or live with that color.

sam
10-15-2009, 12:35 PM
[Edit: for that reason, you might train on multiple colors, if you can choose to flip between green or pink targets, you can switch depending on what your competitor looks like, etc]

-Fergs

Yeah, that's what I was going to suggest. Just "train" the 'bot to see with green and blue, for example. A simple jumper could let the bot determine if you decide to go with one color or the other (low = color 1, high = color 2). So there's no need to change the actual program.

That lets you have a backup.
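A tiny sketch of that jumper idea (illustrative only; read_jumper() is an invented stand-in for whatever digital input the bot actually has, and the threshold values are arbitrary):

```python
# Two pre-trained HSV-style threshold sets; values are example numbers,
# not calibrated for any real camera or lighting.
COLOR_PROFILES = {
    0: {"name": "green", "low": (40, 80, 80),  "high": (85, 255, 255)},
    1: {"name": "pink",  "low": (150, 80, 80), "high": (175, 255, 255)},
}

def read_jumper():
    # Placeholder: on real hardware this would read a digital input pin
    # (low = color 1, high = color 2).  Hard-coded so the sketch runs.
    return 1

profile = COLOR_PROFILES[read_jumper()]
print("tracking", profile["name"], profile["low"], "to", profile["high"])
```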

Sam

DresnerRobotics
10-15-2009, 12:36 PM
I think the point here is that if you're coming with an autonomous mech, try to make it versatile as we can't always control the environment. You probably do want to shoot for at least two sets of colors for it to track, that way you have a backup if for some reason the first doesn't work out.

Hell, I'm bringing a backup mech in case my first one doesn't work out ;)

sam
10-15-2009, 12:42 PM
Hell, I'm bringing a backup mech in case my first one doesn't work out ;)

And some of us have trouble affording our first Mech... Sheesh ;)

Robot Dude
10-15-2009, 12:50 PM
Doh! Slaps own forehead! Whap! :eek: I didn't know it was one on one! That explains a lot. Sorry... Man I gotta get some work done. lol

DresnerRobotics
10-15-2009, 12:53 PM
Doh! Slaps own forehead! Whap! :eek: I didn't know it was one on one! That explains a lot. Sorry... Man I gotta get some work done. lol

Well, the goal is to move towards 2v2 matches as well... but for this upcoming year they will likely be just exhibition matches.

xdream
10-15-2009, 01:45 PM
Based on some feedback I got via PM's with various people I figured I had better introduce myself.

My name is Mark and I am a Principal Engineer at Intel Corporation. I have 25 years of experience in the semiconductor industry. Currently I'm in charge of advanced process technology development for Intel's Atom-based microprocessors.

I have been building robots for over 15 years and would like this forum to know that I do have the experience to back up what I post here. I will always admit when I am wrong, enjoy learning from other people, and would also enjoy contributing to help others.

My current plan is to build a Lynxmotion BRAT as the base platform. There is a lot of work to go here but nothing I consider a show stopper.

I hope this helps others understand my experience level. I'm trying hard to keep my ego in check and keep a low profile, but I have been told people aren't taking me seriously due to a lack of credentials.

I hope the above credentials put peoples concerns to rest.

Mark

sam
10-15-2009, 02:50 PM
Ok, so I guess I'll break the ice. Welcome to the forum.

Hum... Hope you can bring a lot to us... I'm an Intel fan, woohoo!

Just kidding. ;)

But more seriously, I can't wait to see what you'll manage to do!

Sam

xdream
10-15-2009, 02:53 PM
Thanks for the encouragement! :D

xdream
10-15-2009, 04:05 PM
Here is a link to Rover... my robot color-tracking dog. The video is an older prototype whose tracking was not finished, but you can see it tracking the RC car. I have since improved it and continue to work on the problem. You can read about it and see some of my previous revs...

I'm only posting this in hopes it takes some pressure off of what can be done with autonomous robots...

Before I get slammed... I do fully understand all of the challenges I face for this project and agree it will be extremely difficult to pull off, but it's worth trying. I figure if I fail I could always enter it with me as the human pilot...

Mark

YouTube - Lynxmotion Rover Robot autonomous tracking RC car

lnxfergy
10-15-2009, 04:20 PM
One of the things is, even if you don't have a fully autonomous bot, your bot can probably help drive itself -- that's huge! Controlling a walking robot through a cruddy video link is really tough. Just the ability to push the stick forward and have the bot walk forward without hitting a wall is HUGE.

-Fergs

xdream
10-15-2009, 04:24 PM
Yes that is my backup plan...but I would really like it to be fully autonomous...with all of the technical and tactical challenges

Adrenalynn
10-16-2009, 02:02 PM
Hold the presses. I believe I just solved the problem. I'll be back shortly with brief documentation.

xdream
10-16-2009, 02:15 PM
Can't wait!

Adrenalynn
10-16-2009, 02:53 PM
Ok - we can turn this into a simple high-speed machine vision recognition process. This is machine-vision 101.

The issues that we run into will be low resolution and worse - DCT block compression artifacts.

So rather than the traditional fiducials, which are coded in square blocks that will give the DCT nightmares, I chose a more organic form. These are quickly hand-drawn, but I think it will become apparent that they require exceptionally low resolution and low light, are IR friendly, and DCTs won't have much if any effect. We could literally do any number, but we only need two for two-team autonomous. Here are two for discussion [again, quickly hand-drawn to illustrate the concept]. You can double-click on them to get basically full size. I sized them to the regulation 3"x3" @ 600 DPI.


[Note: I believe the forum resized the uploads. Full-res here: http://www.jlrdesigns.com/fudicials-v1]

Also - I suspect you may not even need a camera to recognize them - but I need to give that a little more thought. In fact, I'm smelling patent on the "little more thought"... ;)


I open the floor to SERIOUS discussion on this idea, please - serious discussion.
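To make the idea concrete, here's a rough present-day OpenCV (3+) sketch of one plausible way to recognize a marker like this in a 1-bit space -- the template filename, thresholds, and shape-matching approach are placeholders for discussion, not part of the proposal:

import cv2

# assumes a single bright marker shape on a dark background in the template image
TEMPLATE = cv2.imread("fiducial_template.png", cv2.IMREAD_GRAYSCALE)
_, tmpl_bin = cv2.threshold(TEMPLATE, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
tmpl_contour = cv2.findContours(tmpl_bin, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2][0]

def find_fiducial(frame_gray, max_score=0.15):
    # binarize the frame (the "1-bit space") and look for a contour
    # shaped like the stored marker
    _, binary = cv2.threshold(frame_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    best = None
    for c in contours:
        if cv2.contourArea(c) < 50:      # ignore compression-noise speckle
            continue
        score = cv2.matchShapes(tmpl_contour, c, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < max_score and (best is None or score < best[0]):
            best = (score, cv2.boundingRect(c))
    return best                          # (score, (x, y, w, h)) or None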

xdream
10-16-2009, 03:25 PM
Adrenalynn says...."Ok - we can turn this into a simple highspeed machine vision recognition process. This is machine-vision 101"

That is a great idea for anyone who wants to go that route!

My preference is to stick with color blob tracking for now, since I would rather pull a color 'blob' out of a noisy environment than do full-blown pattern recognition.

The great thing about Adrenalynn's idea is that since we are loosely agreeing that the autonomous bot's designer gets to pick the target color for the opponent... I doubt anyone would object to using a patterned fiducial marker for the targets.

The nice thing about blob tracking is that it is immune to noise and compression artifacts by definition, but I totally agree that the fiducial pattern recognition would be better; it just comes with the penalty of more complexity in the machine vision system.

On the other hand, the nice thing about Adrenalynn's idea is that using a B/W patterned marker would be far more robust under variable lighting conditions, as well as give much more certainty that the target IS the target... wow, that is a really good idea when spelled out.

For me it's not about machine vision or I would be concentrating on pick and place problems on a robotic arm (one of my other projects).

I'm not sure I want to solve that problem for this competition... but it is a great problem, and if anyone thinks they have a relatively simple way to do it I'd love to hear it. Heck, I was proud of myself just to get the behavior you see in the RC car chase video.

Mark

Adrenalynn
10-16-2009, 03:55 PM
I think being able to use a 1 bit color space is more efficient, but how about a counter proposal given what you wrote? [NOTE: This was developed in a discussion offline with Lnxfergy. He's not here to stick up for himself, so all blame and flame to me. ;) Some credit extends to him, but none of the blame]

[Green was chosen as the central marker because it will give us a light shade of gray in a 2 bit/4 bit space which can then be canceled to white]

I disagree with you that color blobs are immune to noise, however. I'll go set up a camera here in a few minutes and demonstrate.

Adrenalynn
10-16-2009, 03:58 PM
Actually, that's a stupid design - let me fix it

xdream
10-16-2009, 04:04 PM
On a slightly different autonomous topic... anyone care to discuss ideas for how the autonomous bots can do distance sensing to the target?

This is important to an autonomous bot since the camera will not be in line with the gun, and laser pointers don't do much good here. Before anyone jumps in on that statement: *I* can't think of a good use for a laser pointer in autonomous targeting without a human 'painting' the target, which is not allowed.

The problem with aiming an autonomous mech is that without range information you can't know the horizontal and vertical offset needed to hit the target at various distances. The offset drops to zero at infinity and is huge at close range.

For example, if you have a 2-gun mech and your target is 6" away, you will need to point at a 45-degree angle if the camera is offset 6" from the gun. This is just an example to illustrate the math.

If the target is 6 feet away then you point nearly straight. The trig is easy... but without knowing the distance to the target you will only be able to dial in for a small target range (a bigger range as you approach infinity), and most fighting seems to be done up close.
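The trig in question is just an arctangent of the offset over the range; a quick sketch, using the 6" offset from the example above:

import math

def aim_correction_deg(camera_offset_in, range_in):
    # angle to swing the gun over toward the camera's line of sight
    return math.degrees(math.atan2(camera_offset_in, range_in))

print(aim_correction_deg(6, 6))    # target 6" away   -> 45 degrees
print(aim_correction_deg(6, 72))   # target 6 ft away -> about 4.8 degrees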

So...anyone have any ideas/suggestions on the best way to get distance information?

One thought is that since we know the size of the targets (3"x3"), you can calculate the target distance, assuming you know the angle of the lens and camera sensor and the apparent size of the target.

While this is the simplest way to do it, the problem is that we may not get good data on the size of the target, since colors are difficult to track that accurately.
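For reference, the known-size calculation under a simple pinhole model looks roughly like this (the field of view and resolution are example numbers, not the actual camera's):

import math

IMAGE_WIDTH_PX = 320
HFOV_DEG = 50.0   # hypothetical horizontal field of view
FOCAL_PX = (IMAGE_WIDTH_PX / 2) / math.tan(math.radians(HFOV_DEG / 2))

def range_from_blob(target_width_in, blob_width_px):
    # distance to a target of known width, from its apparent width in pixels
    return target_width_in * FOCAL_PX / blob_width_px

print(range_from_blob(3.0, 30))   # a 3" plate spanning 30 px -> roughly 34" away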

Adrenalynn's previous idea would also solve this problem, since if you do pattern recognition you would know the size of the target with good certainty.

There are other interesting solutions, including using a Sharp IR sensor or ultrasound for the ranging... but then you need to know that the range is to the target and not to something else.

The good news is that, once again, most of the combat happens at close distance, and when you are tracking colored objects at close distance you get better and better object (target) data the closer the target is to you. So the closer the target, the more accurate you need to be due to the offsets... but you also get more accurate color blob data when the target is close, which is where it counts.

make sense?

What do you all think?

Mark

xdream
10-16-2009, 04:08 PM
When I said immune to noise I should have said "more" immune to noise... technically you only need one good pixel to track on the target and you should be able to hit it. Generally, RGB dark noise isn't close to saturated, so if you are only tracking highly saturated colors they tend not to show up in the noise floor. I have confirmed this with many experiments... the tracking problem becomes getting enough 'good' pixels to track. I meant that you need fewer good pixels to track with blob tracking than with pattern recognition...

Does that make more sense Adrenalynn?
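For anyone wanting to try it, a minimal OpenCV sketch of that kind of saturation-gated blob check (the red-ish hue window and thresholds are arbitrary examples, not anything agreed on):

import cv2
import numpy as np

def find_color_blob(frame_bgr, hue_lo=0, hue_hi=10, min_sat=180, min_val=80):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # only pixels that are both the right hue and strongly saturated survive
    mask = cv2.inRange(hsv, (hue_lo, min_sat, min_val), (hue_hi, 255, 255))
    good_pixels = cv2.countNonZero(mask)
    if good_pixels == 0:
        return None
    ys, xs = np.nonzero(mask)
    return good_pixels, (int(xs.mean()), int(ys.mean()))  # count + blob centroid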

Adrenalynn
10-16-2009, 04:08 PM
Revised : Now with 97% Less Stupid!

xdream
10-16-2009, 04:15 PM
So here is the way I look at it... imagine if the target above was all one color, like red... now heap on tons of RGB dark noise... I'll probably still get a few good pixels to track out of it. Now imagine resolving that pattern against the same noise... you need more good pixels.

My thought is that the pattern will be more difficult to resolve against heavy noise, but better if you can. I don't have time to test this theory but would love it if someone else would.
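One cheap way to run that test, if anyone wants to (sizes, noise level, and thresholds below are all arbitrary examples):

import cv2
import numpy as np

img = np.zeros((120, 160, 3), np.uint8)
img[40:70, 60:90] = (0, 0, 255)                  # 30x30 px pure-red target (BGR)

noise = np.random.normal(0, 40, img.shape)       # heavy sensor noise
noisy = np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)

hsv = cv2.cvtColor(noisy, cv2.COLOR_BGR2HSV)
red_lo = cv2.inRange(hsv, (0, 180, 80), (10, 255, 255))     # red wraps around
red_hi = cv2.inRange(hsv, (170, 180, 80), (180, 255, 255))  # the hue axis
mask = cv2.bitwise_or(red_lo, red_hi)
print("good pixels surviving the noise:", cv2.countNonZero(mask))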

I'm going to stick with my plan for now, but Adrenalynn is heavier in machine vision than I am.

Just my 2 cents!

Adrenalynn
10-16-2009, 04:16 PM
When I said immune to noise I should have said "more" immune to noise... technically you only need one good pixel to track on the target and you should be able to hit it. Generally, RGB dark noise isn't close to saturated, so if you are only tracking highly saturated colors they tend not to show up in the noise floor. I have confirmed this with many experiments... the tracking problem becomes getting enough 'good' pixels to track. I meant that you need fewer good pixels to track with blob tracking than with pattern recognition...

Does that make more sense Adrenalynn?


It does make more sense - however: I still think getting the saturation in random lighting is going to be a brutal challenge. The fiducial is going to be tolerant of a tremendous range of lighting conditions, including total darkness (illuminated under infrared - "0 lux"), and will require less bandwidth, processing power, and RAM (Y-only, single bit).

In the interest of Rigor and Credit:

Fergs suggested "organics on a bright color.. once we recognize the organic, we can track in color at a higher frame rate and we occasionally update our color detector based on the organic recognition". I countered that the frame rate in a 1- or 2-bit space is going to be at least as high, but I'm willing to let that slide for the sake of argument. He proposed putting the fiducial on a colored blob. I countered that we lose contrast. That led me to the proposal you see above, which Fergs hasn't peer-reviewed.
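If it helps the discussion, here's roughly what that hybrid could look like in code -- the capture source and the detect_fiducial() recognizer are placeholders you'd supply, and every threshold here is a guess:

import cv2
import numpy as np

def hybrid_track(capture, detect_fiducial, reseed_every=15, hue_window=10):
    target_hue = None
    frame_idx = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

        # occasionally run the (slow) fiducial recognizer and re-seed the color model
        if target_hue is None or frame_idx % reseed_every == 0:
            box = detect_fiducial(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
            if box is not None:
                x, y, w, h = box
                target_hue = int(hsv[y:y+h, x:x+w, 0].mean())

        # cheap per-frame color tracking in between re-seeds
        if target_hue is not None:
            lo = (max(target_hue - hue_window, 0), 100, 60)
            hi = (min(target_hue + hue_window, 179), 255, 255)
            mask = cv2.inRange(hsv, lo, hi)
            if cv2.countNonZero(mask):
                ys, xs = np.nonzero(mask)
                yield int(xs.mean()), int(ys.mean())   # blob centroid this frame
        frame_idx += 1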

Adrenalynn
10-16-2009, 04:22 PM
Not to be disrespectful and no sarcasm intended:

What happens when the kid wearing the red shirt starts dancing up and down trying to see over the girl in front of him wearing the blue dress and dad, in his hunter green shirt, reaches over to shush him?

Kids are sitting 2" from the bots at some points in time, adults are standing two feet away. If you check some of the video you'll see what I mean... There is absolutely NO WAY, in all seriousness, that you're going to be able to tighten the recognition down to the point that 30/201/25 doesn't have to mean the same thing as 0/255/0.

xdream
10-16-2009, 04:33 PM
Agree with your comments on tracking...

My safety solution is that my bot will not be able to shoot over or through the mesh... the technical challenges still remain.

If your idea is the only one that works then I may go that route... but for now I'm still working with what I've got...

Adrenalynn
10-16-2009, 04:39 PM
I wasn't really considering the safety aspect. That's for the parents to be concerned about. Hello? Robots firing projectiles here. Sheesh...

Anywho - yeah, it was more the projectile concern. I'm trying to decide which machine to throw this capture card in for demonstration purposes.

Sorry, I do too much business finance these days - for those researching, the correct spelling is: "fiducials" - mea culpa.

Adrenalynn
10-16-2009, 05:06 PM
How are you planning on detecting your target's existence? Wouldn't it be silly for them to come charging straight at your camera showing off their target plate? You can't use motion detection. If you stop and scan and find them, then you go running at them again and all you know is where they were - which is where they aren't now. ;) Sonar and IR rangers won't be of any help unless you have some advanced mapping and accurate odometry.

Adrenalynn
10-16-2009, 05:56 PM
Yeah, now that I spend some time thinking about it - that's the greatest hurdle I can see. The bots won't have beacons - this isn't an autonomous event. You can't sit still, the pillbox rule prevents it. You can't require that the other bot be painted a nice bright saturated color.

So we can't use light, we can't use beacons, we can't use range-finders (practically), and we can't sit still. I'm beyond curious how you're addressing that.

xdream
10-16-2009, 06:04 PM
I plan to hunt targets rather than evade... I doubt a human competitor is going to try and hide from an autonomous mech, so I figure they will be trying to sneak up behind me. There is a huge tactical component to what you are asking about... the autonomous bot needs to try to hit its target and try NOT to get hit... nearly impossible.

So...in close combat...where your chances of being hit are greatest you also have the greatest ability to hit your opponent...thus if the autonomous mech can find its target it may be able to hit its opponent..

Now the real question becomes...who is better at hitting who at a reasonably close range for a human?

I do not know the answer to that question which will be a lot of the fun for me and hopefully the spectators.

I'm not far enough along to discuss all of the autonomous tactical problems since I need to face them first...I will have to do a lot more testing and my parts for the basic mech are coming next week...hope to keep posting updates.

I could share all of the exploits of my mech bot (when I find them) but it would be like a football team telling the other team all of its plays before the game...this will be hard enough as it is.

One thing that makes finding targets easier is that they will likely be less than 24" off the floor... I can reject tracking impossible targets. Then there is the question of what happens if it likes that kid's red shirt... well, I will need a way to not 'lock' onto false targets... there are ways to boost target confidence with various techniques.

Another big problem is that the mech bot will never know for sure if it is hitting a valid target, but it WILL know when it is hit... thus if it is aiming at the kid's shirt and getting hit from the back, and it knows there is only one other robot, it must have the wrong target.

Now if it is targeting the red shirt and getting hit from the front -- meaning the real target is in front of the robot and the kid is behind the target -- that is not the most likely scenario... so here are some of the exploits I'm talking about: if the competitor robot parked in front of the kid with the red shirt, then the kid is essentially camouflage and the auto mech will never see the real target...

Yup... have to solve all the problems. I've been doing target accept and reject experiments.
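A toy version of those accept/reject heuristics, just to show the shape of the idea (the thresholds are guesses and estimate_height_in() stands in for whatever geometry you use):

def score_candidate(blob, hits_from_behind, estimate_height_in):
    confidence = 1.0
    if estimate_height_in(blob) > 24:          # target plates sit low; anything higher
        return 0.0                             # is an impossible target -- reject it
    if hits_from_behind > 0:                   # getting shot from behind while locked on
        confidence -= 0.5 * hits_from_behind   # suggests we're staring at a false target
    return max(confidence, 0.0)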

I don't agree that motion detection via frame differencing will be useless for targeting... I may use some form of motion tracking to aid in target recognition.

I also have one 'secret' weapon that I don't want to disclose yet. By weapon I don't mean a gun I mean an ability that I have not discussed yet.

This should be enough to get this thread going again!

Mark

xdream
10-16-2009, 06:10 PM
I just saw your last post before I made the one above...I do have a plan to solve what you ask and to me it is the most interesting problem.....

I do know all of the problems are solvable....just maybe not at the same time ;)

Seriously..I have one sensor type that I have not seen mentioned before that will be a help...I don't want to discuss it until I test it or I will get slammed on here!

Mark

xdream
10-16-2009, 06:13 PM
Oh... one idea (not sure if it is allowed) is for the autonomous bot to carry its own LED light source. I think it should be legal unless it accidentally blinds the other cameras...

That is a good topic of discussion...can the bot have its own target illumination system?

You know that will make the saturated colors pop off the background, since light levels fall off with the square of distance. Kind of like an overly saturated, poor flash photo against a black background....

that will help target certainty...

I need to use every cue I can to pull it off.

Adrenalynn
10-16-2009, 06:15 PM
How does that help you find a mech? It helps you find a color blob (maybe) if the target is facing you. If it's sideways you're still blind.

And my IR was already shot down anyway since it will blind cameras.

Oh - I see you posted above, sorry, I only read the last one. If you're thinking of heat - tried thermopiles last year. Tybs and I went down this road, it was a washout.

xdream
10-16-2009, 06:16 PM
What about non-IR, just regular white LEDs?

xdream
10-16-2009, 06:18 PM
It helps because the more light the easier many tracking tasks become.

Adrenalynn
10-16-2009, 06:20 PM
White LEDs are potentially even worse. If you soften 'em enough to not blind a camera you diffuse the light to the point where it just doesn't help. Try shining an LED flashlight into a camera and you'll see what I mean.

xdream
10-16-2009, 06:26 PM
One thing this discussion is highlighting is that the autonomous bots need to solve the entire system...not just one piece...that is a challenge of autonomy...and the real world is a complex system.

BTW...are your questions trying to encourage or discourage autonomous bots? BIG :)

Mark

xdream
10-16-2009, 06:28 PM
Wondering how localized the blooming will be on the opponent camera..also I'm thinking lasers may blind a camera too but do not want to bend any rules!

xdream
10-16-2009, 06:38 PM
I just took a look at the videos again and think I could pull off tracking in that venue.

Adrenalynn
10-16-2009, 06:38 PM
By its nature, coherent light (a laser) tends to be a point source from the camera's perspective. In order to be effective at illuminating an area, the illuminator (IR or visible, they're really exactly the same thing, right?) needs to be non-coherent -- i.e. a non-point-source. If you could continuously blind a moving camera with a point-source laser, you could also shoot an arbitrarily small target with an airsoft BB from any angle and at any range inside the gun's effective range. Which means that you've already won, since no one else could or can.

Adrenalynn
10-16-2009, 06:40 PM
I just took a look at the videos again and think I could pull off tracking in that venue.

With a 3 lux camera? Do remember that the cameras that shot that video weren't 3 lux. The cameras on the bots were 0.5lux, the street-level cameras were 0.01 lux, and the main camera was 18dB over 0.1 lux.

xdream
10-16-2009, 06:56 PM
I agree that the coherence of a laser means the rays need to be orthogonal to the sensor to blind it. I just tried it and it worked -- completely blinds my camera.

I tried it with an LED flashlight and it is much less blinding, but over a much greater range of angles... it also needs to be quite orthogonal to be fully blinding.

Like I said, I'm not going to bend the rules and just wanted to know... I didn't see any mention of no light sources being allowed to assist in targeting, so I thought I would ask... there appears to be an exception for lasers.

Mark

Adrenalynn
10-16-2009, 07:03 PM
I don't know that any exception necessarily exists for lasers. Tybs or Fergs will have to clarify. As you mention, and I alluded, the laser has to be dead on or it's going to bounce around in the barrel of the lens and not do much. Dead on, it's murder on a CCD. The non-point-source light is predictably less extreme but much broader. Remember you'll be moving and the countering mech will be moving, so there's a pretty high chance that you're going to spend a fair bit of time blinding the other guy with the non-point-source. But that's just my take on it. Tybs and Fergs control the rulebook, not I!

xdream
10-16-2009, 07:06 PM
I can't think of anything else to say other than: if it can't be done, why bother trying?

Adrenalynn
10-16-2009, 07:12 PM
Like I say - that's Tybs and Fergs' call, not mine, I'm just stating a hypothesis...

I've got some little 180-lumen, 5 W Cree tactical LEDs here. One of them will handily blind a camera from a hundred meters; I've done it a few times. :) I think it's a rathole to fall down if they start trying to specify what can and can't be used. So if it were me, I'd simply say "if the other guy complains, you're DQ'd, there's the rule", then leave it to the competitor to decide if they want to risk a DQ that way. But that's just what I'd do; again, their call, not mine.

xdream
10-16-2009, 07:17 PM
I guess the best solution is to get an approx EV range for the arena and then we can see what can be done.

Adrenalynn
10-16-2009, 07:18 PM
Sorry - I edited above for clarity and crossed your post.

I'm heading out to dinner, time to actually try to make myself presentable, I'll be back later. :)

xdream
10-16-2009, 07:19 PM
Thanks for all the detailed lighting info btw..it is helping me focus on the biggest problems..I was just hoping available light wouldn't be one of them!

Autonomous topics seem to keep leading back to ambient light. :)

Mark

xdream
10-19-2009, 03:43 PM
I was just wondering how many people are planning on or entertaining building a fully...OR...semi autonomous Mech for April?

Mark

lnxfergy
10-19-2009, 04:18 PM
I was just wondering how many people are planning on or entertaining building a fully...OR...semi autonomous Mech for April?

Mark

Assuming I get my act together more than 3 weeks before RG, both my quad and biped should be fly-by-wire (so semi-auto). I'm mainly going to focus on having the bot help with the driving part.

-Fergs

societyofrobots
10-19-2009, 04:25 PM
I was just wondering how many people are planning on or entertaining building a fully...OR...semi autonomous Mech for April?
Got some good news and some bad news . . . good news is that I'll have an autonomous mech finished by February.

Bad news is . . . the RoboGames date was changed . . . I wasn't planning to be back from Bangkok until June . . . oh well! Maybe 2011 . . .

I'll probably just do a biped competition in Singapore or somewhere close . . .

xdream
10-19-2009, 04:50 PM
Do you have any pic/vids of your bot you would like to share? I will post mine as soon as I have it built...parts are coming this week.

Mark

societyofrobots
10-19-2009, 04:59 PM
Do you have any pic/vids of your bot you would like to share? I will post mine as soon as I have it built...parts are coming this week.
yea . . . but not posting any until it's done . . .

I'm trying multiple 'crazy' ideas that no one has done before, and not 100% sure it'll work. Better to not open myself up to outside opinions until I've proven my ideas are better than yours =P

xdream
10-19-2009, 05:14 PM
ROTFL!!! I know exactly what you mean... I made a new thread just designed to talk about our crazy ideas. It's called the Autonomous Mech Thread:
Thread: http://forums.trossenrobotics.com/showthread.php?t=3640

Mark

zchris13
11-01-2009, 06:33 PM
You're adjusting for ambient light, amirite? Holy crap. Genius.
What are you using? Just averaging the pixels across the camera, or using a diffuser type thing with a single pixel "camera" light sensor thingy?
Or am I just wrong.

Adrenalynn
11-02-2009, 12:15 AM
You're adjusting for ambient light, amirite? Holy crap. Genius.


http://en.wikipedia.org/wiki/Gamma_correction
http://www.poynton.com/notes/colour_and_gamma/GammaFAQ.html
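For anyone following the links, the whole trick boils down to a lookup table (a minimal sketch; the gamma value is just an example):

import cv2
import numpy as np

def gamma_correct(img, gamma=2.2):
    # out = 255 * (in/255) ** (1/gamma), applied per pixel via a lookup table
    lut = (255.0 * (np.arange(256) / 255.0) ** (1.0 / gamma)).astype(np.uint8)
    return cv2.LUT(img, lut)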

xdream
11-23-2009, 11:25 AM
FYI-

There has been some discussion on runtime and current draw for the BRAT platform. Without accounting for the Xbee and scoring plates, I'm getting about 0.8 amps while standing and tracking with the pan/tilt head... while shooting, it goes up to about 1.6 amps to run one of the guns. With both guns running at the same time, current draw is about 2.4 amps, so each gun draws about 0.8 amps.

I'm using Thunder Power Extreme 20C 2S LiPos that I stole from my RC heli, modified from a 3-cell pack to a 2-cell pack. Considering the packs can source 50 amps, I should get near the full calculated run time of almost 1 hour of continuous shooting with both guns.

I have not tested current draw while walking but I expect it to be around 2.5A as well.

All of these current numbers look encouraging, and I expect the final robot to have at least 30 min of runtime and possibly closer to 60 min depending on how often it is shooting/walking.
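The runtime math is just capacity over draw; for example, with a hypothetical 2200 mAh pack (the actual pack capacity isn't stated here):

def runtime_minutes(capacity_mah, draw_amps):
    return (capacity_mah / 1000.0) / draw_amps * 60.0

print(runtime_minutes(2200, 2.4))   # both guns firing continuously -> ~55 min
print(runtime_minutes(2200, 0.8))   # standing and tracking only    -> ~165 min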

I'm very confident I will not have any problems for a 15min match with my current setup.

Mark