Camera / Vision Processing (within the HR-OS1 Framework)



r3n33
09-13-2015, 10:01 PM
This afternoon I decided I wanted to test and experiment with the USB camera provided with the HR-OS1. Looking at the current projects in the repository, there wasn't anything utilizing the camera, so I took a peek at what was available in the original Darwin-OP framework and found a perfect example called head_tracking in the Linux/project/tutorial directory. I had recently worked on mjpg_streamer and the httpd within the OS1 framework to enable and improve the walk_tuner's webpage interface, so working with the camera was just one more small step to take.

At first things were pretty rough going, but I spent a few hours tweaking and now I'm satisfied enough to move on to something else. About midway through the process I took a video..


http://www.youtube.com/watch?v=lo9bP_xogj4

At the time of the video the tracking was pretty good but a little slow. You can see the web interface to head_tracking and the streaming view of the video with the detection overlay. Not bad for a postage stamp processor ;) Though the Edison is under a fairly heavy load while this is running. Some optimization is likely possible, but I don't think it's getting more than 15 fps as-is, with one core tapped out.
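
For anyone curious, the core of head_tracking boils down to a pretty small loop. Here's a minimal sketch from memory, assuming the Darwin-OP-style LinuxCamera, ColorFinder, and BallTracker classes; initialization details and the mjpg_streamer hookup are omitted, and names may differ slightly in the HR-OS1 port:

#include "LinuxCamera.h"
#include "ColorFinder.h"
#include "BallTracker.h"

using namespace Robot;

int main()
{
    // Camera settings (resolution, exposure) normally come from a .ini file.
    LinuxCamera::GetInstance()->Initialize(0);

    ColorFinder* finder = new ColorFinder(); // HSV thresholds set via config
    BallTracker tracker;                     // steers the pan/tilt head servos

    while(true)
    {
        LinuxCamera::GetInstance()->CaptureFrame();

        // Locate the color blob's centroid in the HSV frame, then track it;
        // BallTracker calls Head::MoveTracking() internally.
        Point2D pos =
            finder->GetPosition(LinuxCamera::GetInstance()->fbuffer->m_HSVFrame);
        tracker.Process(pos);
    }
    return 0;
}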

[attachment 6164]

Given time I'll be pulling all my bug fixes and new features into the main Interbotix repository, but if you can't wait, I'm currently keeping the code in an experimentation branch: https://github.com/r3n33/HROS1-Framework/tree/experimentation

darkback2
09-14-2015, 07:34 AM
Sweet! I have been waiting for something like this. Seems like a waste having that camera and not using it for anything... Great work so far.

Would this code work on the Raspberry Pi?

DB

KurtEck
09-14-2015, 08:17 AM
Pretty cool, will have to try it out. My assumption is that on the Edison you needed to add a small USB hub, as the Edison only has one USB port? You probably also printed yourself a holder for it?

Great job!

Darkback - It probably should work; the best way to find out would be to give it a try.

Kurt

P.S - I think I will try putting the Edison in mine to see how that changes things talking to the Arbotix Pro. Or I wonder if I should just go ahead and try the Odroid C1...

LloydF
09-14-2015, 11:09 AM
I wonder why the Trossen guys have put out zero code for their camera? I have been looking for a different sensor altogether because of that very reason. Thanks for your hard work. :o I'm embarrassed for not having looked into the Darwin-OP framework code myself to see what is there. On the Raspberry Pi I have been having the same frame-rate issue. That is why I have been experimenting with the XU3 and XU4 for some more processing power. The XU3 was looking good before it went out of production, and the XU4 is bloody power hungry :genmad:, making the Edison and Raspberry Pi the all-around better choices.

tician
09-14-2015, 11:48 AM
I wonder why the Trossen guys have put out zero code for their camera? I have been looking for a different sensor altogether because of that very reason alone. Thanks for your hard work. :o
Because a single, cheap USB webcam is not very useful for most robotics applications. Soccer is just about the only thing because the ball is of a uniform size and color, so tracking the color is easy enough. Its distance from the camera can be estimated from the size of the blob in the image, and its location/bearing relative to the camera can be estimated from the blob's position in the image. Pretty much everything else requires some sort of depth sensor (structured light like original kinect; time-of-flight; stereo cameras; etc.).
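
To make that concrete, the pinhole-camera arithmetic for a ball of known size is only a few lines. An illustrative sketch, not code from the framework; the focal length and ball diameter here are made-up placeholders:

#include <cmath>

// Illustrative pinhole-camera estimates for a ball of known diameter.
// Constants are hypothetical placeholders, not HR-OS1 calibration data.
const double FOCAL_LENGTH_PX = 540.0; // focal length in pixels, from calibration
const double BALL_DIAMETER_M = 0.10;  // real ball diameter in meters

// Similar triangles: apparent size shrinks linearly with range.
double distanceFromBlob(double blobDiameterPx)
{
    return FOCAL_LENGTH_PX * BALL_DIAMETER_M / blobDiameterPx;
}

// Bearing (radians) from how far the blob center sits from the image center.
double bearingFromBlob(double blobCenterX, double imageWidthPx)
{
    double offsetPx = blobCenterX - imageWidthPx / 2.0;
    return std::atan2(offsetPx, FOCAL_LENGTH_PX);
}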

Since things like RoboCup prohibit active sensors and many eye-safe active sensors do not work very well in direct sunlight, there are not many options for reliably acquiring depth information except by producing disparity maps from multiple sensors viewing the same scene from different locations (e.g. stereo vision). Producing a decent disparity map requires accurately finding/matching a very large number of common elements in each image, which is not easy to do. Long story short, computer vision is still an area of intense research requiring lots of processing power.
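
The stereo case reduces to one formula once the rig is calibrated and a match is found: depth is focal length times baseline divided by disparity. A sketch with placeholder numbers:

// Stereo depth from disparity: Z = f * B / d, where f is focal length in
// pixels, B is the baseline between the cameras in meters, and d is the
// disparity in pixels of a matched feature. Values are placeholders.
double depthFromDisparity(double disparityPx,
                          double focalPx = 540.0, double baselineM = 0.06)
{
    if (disparityPx <= 0.0) return -1.0; // no match / effectively at infinity
    return focalPx * baselineM / disparityPx;
}

The hard part, as noted above, is producing reliable disparities in the first place.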


OT: The summer-fall transition this year was scary fast. Friday was 80s~90s and sunny. Saturday was 60s~70s and cloudy. Sunday was 60s~70s with the sun creepily low in the sky. I do not like.

LloydF
09-14-2015, 12:14 PM
If this little bot would play soccer, even badly, most of us would be very happy :cool:. Then I would need two or more of them. (Surely someone can see the dollar signs here.)

r3n33
09-14-2015, 02:10 PM
Thanks all

Most likely the camera code will work on the Raspberry Pi just the same as the Edison. I know Kurt has dabbled in testing the web interface I added to walk_tuner on his RPi and it worked as expected. This isn't too different.

I did have to add a USB hub so I could plug two things into the Edison (camera, arbotix). It was nice to find external power on the hub wasn't required :) At the moment all the cables are hanging off the robot.. I haven't even cut the camera cable yet :confused: So no fancy 3D printed holder. Actually, I'm looking into mounting the hub. I don't think I'll be satisfied with anything external, but what I had on hand was pretty small. If I can't fit the hub inside I'll have some considerations to make before going too much further.

I still have hopes the OS1 can play soccer well enough that I won't be embarrassed to show it off ;) Since starting on Chap I've improved my walk tune, added fall-over detection to ps3_demo, created pages for standing up from stomach or back, and now the ability to detect a color blob. When my replacement gears arrive today I'll be back up and running, and the next thought is to make him get up, find a ball, and then walk to the ball's location. So we shall see how that goes, or if I get distracted with something else along the way.

darkback2
09-15-2015, 06:44 AM
One thing that I noticed about the current neck design is that it doesn't let the camera look down very far. For soccer the ball may get lost as it gets close to the robot.

DB

r3n33
09-15-2015, 12:23 PM
[attachment 6169]

You are right, the neck design doesn't allow the robot to see its feet :(

I had to do a test (my camera isn't mounted to spec but close enough). This is as close as a small object can sit in view:

[attachments 6170, 6171]

Before getting too discouraged I realized a bigger ball may work just as well (at least temporarily)...

[attachment 6172]

tician
09-15-2015, 01:01 PM
The "Ay can't see mah feet" issue can be easily solved by changing the bracket that mounts the camera to the tilt servo to include a fixed angle offset depending on needs (angled down for soccer; unmodified for general use; angled up for interacting with humans). IIRC, the linkage-based tilt mechanism is supposed to be much more stable than a tilt mechanism made from AX servos and stock dynamixel brackets.

Whenever I get my mitts on one, I might try sticking the pan servo in the torso between the shoulders and attaching the tilt servo to the top plate with a thrust bearing like the Interbotix arms, since I've already got the pin roller thrust bearings and washers from spares I bought for Darsha's kinect turret. Then make a really long C-bracket to put the tilt axis at the back of the neck while keeping the camera mounts above and/or in front of the servo's case, without losing too much of the servo's range of motion. Might also try a timing belt and pulleys to get the full range of motion with a solidly mounted pivot on top of the servo, for better rigidity than a long C-bracket.

LloydF
09-16-2015, 08:35 PM
Hum, and have you done this, or are you just saying it should be easy?

tician
09-16-2015, 08:58 PM
If you have a 3D printer, it should be trivial to print up a new bracket to mount the shipped USB camera at some angle other than perpendicular to the end of the servo. It will introduce some rotation and translation offsets to the position of the camera relative to the rotational axes of the pan-tilt assembly, but they should be easily handled constants.
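
In code, compensating for an angled bracket really is just a constant folded into the head kinematics. A sketch; the 25-degree value is only an example:

// Sketch: fold a fixed camera-bracket angle into the tilt computation.
// CAMERA_MOUNT_OFFSET_DEG is whatever angle the printed bracket provides.
const double CAMERA_MOUNT_OFFSET_DEG = 25.0; // e.g. angled down for soccer

// The direction the camera actually points, given the tilt servo angle.
double cameraTiltDeg(double tiltServoDeg)
{
    return tiltServoDeg - CAMERA_MOUNT_OFFSET_DEG;
}

// And the inverse: the servo angle needed to aim the camera somewhere.
double servoTiltDeg(double desiredCameraDeg)
{
    return desiredCameraDeg + CAMERA_MOUNT_OFFSET_DEG;
}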

As for the alternate pan-tilt design with a timing belt and pivot, I'm working on a ~2:1 capstan pair for the tilt to do initial testing (300 degrees of servo horn movement into ~150 degrees of camera tilt). If it works anything like I hope, then buying 11T and 21T MXL pulleys and a correctly sized belt would be the next step.

LloydF
09-17-2015, 02:24 PM
tician, are you a beta tester? Oh yea, I see you were having issues with the Arbotix-Pros and Pis. Yes, it seems to be a latency issue with the 3.18 kernel and real-time data. Now I could be wrong, but I am pretty sure.

tician
09-17-2015, 03:11 PM
tician, are you a beta tester?
Not yet, but I have a few years' experience building, using, and fixing dynamixel-based robots in addition to the BS EE/ME (and all courses for an MS Engr, but dropped out before finishing my thesis). Lots of ideas pop into my head and reach varying levels of development, but not many ever fully materialize because of limited resources and a scary ability to become bored/disinterested/distracted very quickly.

LloydF
09-17-2015, 03:12 PM
I was playing with your rme files; they run on my pi-HROS1. I need to get mine adjusted to carpets. Your get-up routines work :rolleyes:

r3n33
09-17-2015, 03:26 PM
If you have a 3D printer, it should be trivial to print up a new bracket to mount the shipped USB camera at some angle other than perpendicular to the end of the servo.

This is true, and a note on the subject of the limited downward viewing angle: a little bird has mentioned there will be a (possible?) revision to the HR-OS1 design addressing the issue. I may get impatient and come up with another temporary solution (other than a bigger ball) while we wait.


I was playing with your rme files; they run on my pi-HROS1. I need to get mine adjusted to carpets. Your get-up routines work :rolleyes:

Woooo! That is super cool! What style hands did you have attached to your robot when you tested? BTW, those poses were made with WinRME ;) </shameless plug>

I'm sure they could be improved upon, but at the moment I'm still enjoying them. It's kind of fun to load the ps3 demo and (repeatedly) lay the robot on its back / belly and watch it get up all on its own ;)

LloydF
09-17-2015, 04:10 PM
I have a shapeways set, as they're close to indestructible and feather-light. [attachment 6175]
Now that I am on the REAL TIME kernel patch I think I'll name him RT-HROS1.

r3n33
09-19-2015, 01:22 PM
Cool picture! I like your sign on the wall. When the stand-up pages were created I was using the shapeways hands as well. I wonder, especially for standing from a front fall, if the different hands will have a big effect on its ability to right itself. Good to know it works with all the armor attached. I have most of mine off, as I'm re-working my bot to keep the weight down.

Last night I printed a new camera mount that has a 25-degree offset, which allows him to see his feet. I hacked up the ps3_demo to add some basic soccer handling, and this morning had a little fun...


http://www.youtube.com/watch?v=qMz-c2nUeMo

There is still much to do! And definitely lots of fun ahead.

KurtEck
09-19-2015, 01:49 PM
Great Film! It gave me a great laugh!

Kurt

LloydF
09-19-2015, 03:35 PM
ROFL. :veryhappy: So glad yours is as stable as mine, snicker. I think I need a tile soccer sheet, board, or poster to play on.

r3n33
09-19-2015, 06:26 PM
It isn't all falls in my lil world of soccer ;) In some more raw footage you can see I have a pretty stable walk gait. Since last night I threw a lot of new things at the robot and mostly need to improve my kicks and refine some parameters. Funny thing I've noticed.. 90% of the time when Chap walks to a ball and kicks, it is on the left side, which makes me think he is a lefty :p.

If you like the soccer rug you can find it on amazon as a kids area rug (http://www.amazon.com/Rugs-Time-FT-134-Soccerfield-Rug/dp/B000W59AI6).


http://www.youtube.com/watch?v=k4PxZdB35d8

LloydF
09-21-2015, 09:19 AM
Really nice work, my man. :cool: You're using the Edison setup, right? The more I play with your WinRME the more I like it; well done.

r3n33
09-29-2015, 11:54 AM
Really nice work, my man. :cool: You're using the Edison setup, right? The more I play with your WinRME the more I like it; well done.

I'm not a man :robotsurprised: but you are welcome. It's great you are getting use from WinRME. Development on that program has been on pause while I wait for feedback from the community (or until I *need* a new feature).

...

Getting back to vision processing. I've created another demo in my experimentation fork under Linux/project/tutorial/rgb_color_sensor (https://github.com/r3n33/HROS1-Framework/tree/experimentation/Linux/project/tutorial/rgb_color_sensor)

Granted, it works best with one of my Dynamixel-driven RGB hands (I suppose that might deserve a new thread), but it will still output RGB values to the terminal as well as give you a webpage to view the detection area and camera stream.

So this is just another way the camera can be used within the framework. Enjoy!


http://www.youtube.com/watch?v=a3Mw3Pt2h5g

Zenta
09-29-2015, 04:27 PM
Nice work on the rather slow walk, looks good! Also great work on the RGB hands; cool to see the hands matching the ball. Does the hand have a fixed or active gripper?

LloydF
09-29-2015, 06:04 PM
Sometimes I forget whose thread I am on LOL. When you are over 65 you will understand. :o
But I simply love your work and your sharing ways.

r3n33
09-29-2015, 07:20 PM
Nice work on the rather slow walk, looks good! Also great work on the RGB hands; cool to see the hands matching the ball. Does the hand have a fixed or active gripper?

I'll take a slow and steady walk over fast and troublesome. With the AX18s in the knees, with cooling fans and ventilation holes, he can walk around for about 20 minutes on a battery.

This little guy has fixed hands. Some true grippers would be super cool, but I think I'd have to sacrifice too much of the current functionality to adapt the robot to a heavy arm or two. The LEDs controlled by a trinket only added a few grams; the exact weight I don't have yet but plan to present with the "RGB hand project".


Sometimes I forget whose thread I am on LOL. When you are over 65 you will understand. :o
But I simply love your work and your sharing ways.

Hahah, you had a 99% chance of being correct on the robot forums :tongue: I'm not going to say how close I am to that age, but I certainly understand sometimes. Thanks again though. It's really nice when others get good use from your trials and tribulations!

tician
09-29-2015, 10:11 PM
Robotis XL-320 weigh only ~17g and pololu/sanyo micro gearmotors weigh only ~9g, so quite a bit smaller and lighter than an AX servo. There are also ~5g (and sub-5g) micro hobby servos that could work pretty well for controlling a gripper, given an appropriately designed gripper actuation mechanism.


A bit late, but that is a cute little rug. Of course, for a regulation RoboCup or HuroCup field, you would need a few rolls of cheap astroturf* to cover an area measuring ~7.4x5.4 meters. Flooring is kinda on my mind since I'm still cleaning the dark blue carpet in my workspace before adding new shelves to make it look less like a sprawling robot graveyard/junkyard/landfill. Besides being a PITA to clean and making it easy to lose small parts, the very thick pile carpet with thick, squishy carpet pad is not very conducive to current bipedal robots actually staying upright so probably going to cover some sections of the nice and clean carpet with foam exercise matting or foam puzzle flooring.

*Oddly, there is a (mostly storage) room in my basement already equipped with artificial turf that might actually be large enough for a regulation field, but it still has football markings because it was supposedly salvaged from one of the UGA fields (Woodruff?) 20+ years ago. IIRC, the nasty privet along the property line was also supposed to have been salvaged from some of the hedges at Sanford Stadium that were ripped up in 1996.

KurtEck
09-30-2015, 09:02 AM
I'll take a slow and steady walk over fast and troublesome. With the AX18s in the knees, with cooling fans and ventilation holes, he can walk around for about 20 minutes on a battery.
I meant to also mention I like how yours is walking a lot better than the default walk. Any suggestions on what changes one needs to make to get ours walking in a similar way?

So far I am still running with all default AX-12s. I don't think I have killed any of them yet, but I know that sometimes, just having it stand up from the kneel mine is normally in when not in use, one (maybe two) servos in the knees will start to blink...

Sure wish they would release something like an MX-12/18 that is not a W and fits in the same brackets!


:tongue: I'm not going to say how close I am to that age, but I certainly understand sometimes.
I am pretty sure I am safe to say that I am a lot closer to that age :o and I totally understand.

r3n33
09-30-2015, 12:15 PM
Robotis XL-320 weigh only ~17g and pololu/sanyo micro gearmotors weigh only ~9g, so quite a bit smaller and lighter than an AX servo. There are also ~5g (and sub-5g) micro hobby servos that could work pretty well for controlling a gripper, given an appropriately designed gripper actuation mechanism.


A bit late, but that is a cute little rug. Of course, for a regulation RoboCup or HuroCup field, you would need a few rolls of cheap astroturf* to cover an area measuring ~7.4x5.4 meters. Flooring is kinda on my mind since I'm still cleaning the dark blue carpet in my workspace before adding new shelves to make it look less like a sprawling robot graveyard/junkyard/landfill. Besides being a PITA to clean and making it easy to lose small parts, the very thick pile carpet with thick, squishy carpet pad is not very conducive to current bipedal robots actually staying upright so probably going to cover some sections of the nice and clean carpet with foam exercise matting or foam puzzle flooring.


I totally agree about using some small servos and thanks for reminding me about the sub micros. I actually have 3x DSP33s that were never put to use. Goodness, I am already tempted to use them. Look at what you have done! :tongue:

How cool would it be to see a group of OS1s playing a match? At my current maximum walk rate (8 to 10 mm/s) it looks like it might take them about 9-11 minutes to cover the 5.4 meter dimension. Awww. Maybe we can cut the size down by 4, relative to how much precision is lost going from MX servos to AX. Ha ha ha.


I meant to also mention I like how yours is walking a lot better than the default walk. Any suggestions on what changes one needs to make to get ours walking in a similar way?

So far I am still running with all default AX-12s. I don't think I have killed any of them yet, but I know that sometimes, just having it stand up from the kneel mine is normally in when not in use, one (maybe two) servos in the knees will start to blink...


Thanks Kurt :) I've actually put a good bit of time into making adjustments, which means a number of things come into play with my current walk gait:

Obviously, for one, I put AX18s in the knees to give a tiny bit more strength in that area, since I was going to put it through a lot of tuning.

The other thing that was quite important to performing the walk tune was addressing the IMU readings. I'm not sure yet what step Trossen would like to take on getting these fixed (i.e. update the Arbotix-Pro firmware so the output is correct, or patch the Framework) but it is essential. My fork of the repo is of course patched to accommodate in the meantime.

Another thing many beta testers have probably noticed: if you pick up the robot while the walk gait is active, the legs will likely shake like crazy. That is because the majority of the resistance has been removed and the compliance on the leg servos is pretty much set to maximum. While tuning (and walking) this is a huge problem, because the robot's foot needs to leave the ground, and when it does it introduces instability. So what I'm getting at is that another change contributing to the success of my gait was setting the compliance slope back to the default value. Now when the legs are off the ground they are considerably more stable.
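
For anyone who wants to try the same change: the AX-series compliance slopes live at control-table addresses 28 (CW) and 29 (CCW), and the factory default is 32. A rough sketch using the framework's CM730-style interface (the HR-OS1 fork may name things differently, and the servo ID list here is a placeholder):

#include "CM730.h"

// Sketch: restore default compliance slopes on the leg servos so they
// hold position firmly when a foot leaves the ground. Addresses 28/29
// are the AX-series CW/CCW compliance slopes; 32 is the factory default.
void restoreLegCompliance(Robot::CM730 &cm730)
{
    const int P_CW_COMPLIANCE_SLOPE  = 28;
    const int P_CCW_COMPLIANCE_SLOPE = 29;
    const int DEFAULT_SLOPE = 32;
    // Placeholder IDs - substitute your robot's actual leg servo IDs.
    const int legIds[] = {7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18};

    for (int id : legIds)
    {
        cm730.WriteByte(id, P_CW_COMPLIANCE_SLOPE,  DEFAULT_SLOPE, 0);
        cm730.WriteByte(id, P_CCW_COMPLIANCE_SLOPE, DEFAULT_SLOPE, 0);
    }
}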

Lastly, I'd like to point out a few things that became obvious to me after many hours of trials..

It takes patience ( and replacement gear sets ) to retune.
*All* of the tuning parameters are important and you will learn them.
The surface you tune on matters. Keep it consistent if you are going to retune.
Slower is easier. Tuning parameter changes are easier to see when the cycle period is higher. Also AX servos seem smoother at lower speeds.
Take videos along the way. If you have access to a camera that can shoot slow motion, that is super helpful for review and comparison.


http://www.youtube.com/watch?v=8M3IZ6fmiWc

LloydF
10-01-2015, 08:06 AM
Oh, wow, I did not think of that; you are 100% on. One of the first bioloid upgrades was the AX18s in the legs, making the GP line for combat and speed. Now how many do I have laying around here? Hum, not tearing into my GP. So must get more :tongue:.

tician
10-01-2015, 08:12 AM
If you do upgrade to AX-18A in the knees, you will need to add the cooling fans and printed mounts to slow overheating. The AX-18 uses a coreless brushed DC motor and runs at higher current, so it overheats much faster than the AX-12, especially when trying to hold a position instead of moving.

KurtEck
10-01-2015, 10:04 AM
:) I've actually put a good bit of time into making adjustments, which means a number of things come into play with my current walk gait:

Obviously, for one, I put AX18s in the knees to give a tiny bit more strength in that area, since I was going to put it through a lot of tuning.
It shows that you have put in a lot of time. I was going to ask to see what your current settings are, but I see you have already posted them up in your experimentation branch on github (Data/config.ini). So I may try out those settings as a starting point. I also should probably insert the battery, as it will probably change the balance.

Just wondering how many of the servos have you upgraded to 18s?


The other thing that was quite important to performing the walk tune was addressing the IMU readings. I'm not sure yet what step Trossen would like to take on getting these fixed ( ie update the Arbotix-Pro firmware so the output is correct or patch the Framework ) but it is essential. My fork of the repo is of course patched to accommodate in the meantime.

I totally agree. I now have the programmer and converter, such that I can probably update my firmware. However, there are not many instructions on how they actually do the builds, other than the readme.md, which appears to show they are using someone else's compiler toolset and building under Linux (14.10)... I will probably soon set up that toolset on the NUC and see how it builds... Note: the NUC is on 14.04... But will probably do some other diversions before then ;)

It would be good to know if the Arbotix-Pros that ship in beta 2 are the same as the beta 1s. If so, did they fix it? There have not been any new updates in the ArbotixPro project in about 5 months (except a pdf added and the readme file updated). So if they fixed it, I'm not sure how; and if so, did they change something like a revision number the software could detect, so earlier versions could use your fix...

Again great work!

vehemens
10-01-2015, 12:06 PM
I'll take a slow and steady walk over fast and troublesome. With the AX18's in the knees with cooling fans and ventilation holes he can walk around for about 20 minutes on a battery.

Could you post pictures showing the details of the cooling fans and ventilation holes?

r3n33
10-01-2015, 03:10 PM
Ok, more information about the AX18 knees... They were mainly for experimentation, and I found that to get the most out of them (in this location) they need to be cooled and ventilated. With only the fans they would last longer but still overheat when under load, though they would certainly cool down quicker. This allowed me to test intermittently while taking ~15 minute breaks for cooling. When I added what I jokingly like to call "speed holes" I was able to run for extended periods of time, and in some cases ran into the thermal limit on other joints before the knees.

The fans are 30mm 12V and attach with the brackets I made on the printer. You can get those here: http://www.thingiverse.com/thing:781366

[attachments 6179, 6180]

Holes were drilled around 3 sides of the motor. I'd have probably done the 4th, but my servos were already installed and I was in a rush/too lazy/insert excuse here. These drill locations were very specifically selected by inspecting the inside of a servo case to make sure the motor and case structure weren't going to be compromised.

[attachments 6181, 6182]

Power comes from the empty connector on the ankle pitch servo.

LloydF
10-01-2015, 03:16 PM
Thank you, this keeps me from scratching my head too hard. Yea!! :veryhappy:

Zenta
10-02-2015, 05:26 AM
Sometimes I wonder if Robotis ever considered making a servo based on the AX-18 but with a much higher gear ratio to achieve more torque.

vehemens
10-02-2015, 08:47 PM
My solution for the knee heating problem was to eliminate the knee joint offset relative to the hip and ankle joints (i.e. DARwIn-OP-like geometry) so that the knee torque while standing and walking was lower.

Considered adding cooling slots/holes as well, but didn't do it as I worried about getting it wrong and ROBOTIS doesn't sell replacement cases.

Nice to see someone try it and get good results.

DewStorer
10-05-2015, 04:59 AM
Thanks for sharing your project with us. It is a pretty cool idea. I am also looking to work on an image processing project. I want to know which MCU is best for these kinds of projects? Also, what is the list of necessary sensors, and how can I find the best ones to ensure reliability? What OS is required here to perform the image processing?

LloydF
10-26-2015, 04:32 PM
Ah, there, finally got streaming running on my RT (real time) kernel. (:confused: Is the frame rate really this low, like 10 to 20 FPS?)

r3n33
10-26-2015, 04:44 PM
Thanks for sharing your project with us. It is a pretty cool idea. I am also looking to work on an image processing project. I want to know which MCU is best for these kinds of projects? Also, what is the list of necessary sensors, and how can I find the best ones to ensure reliability? What OS is required here to perform the image processing?


Typically you wouldn't use a microcontroller for image processing; you'd want a computer, or in the case of my robot a single-board computer like an Edison, Raspberry Pi, ODROID, etc. Most likely you'll want as much processing power as you can afford. I would suggest doing some research on the tasks you'd like to accomplish and looking for clues as to what others have used before you.


Ah, there, finally got streaming running on my RT (real time) kernel. (:confused: Is the frame rate really this low, like 10 to 20 FPS?)

Nice work. I would say yes, the frame rate is about 15-17 fps on my Edison. Plenty to track a color blob or detect colors but, like tician said at some point, not much else.

KurtEck
11-15-2015, 06:13 PM
I have been playing around with r3n33's ideas for the ability to place HROS1-controlled color LEDs in different places on the robot, and have started playing with a simple "hat"-like circuit board for Adafruit Trinket Pros, where I can hook them up in the servo chain.

The boards arrived yesterday, so I built two of them.
[attachment 6283]
They are actually set up to use Adafruit's NeoPixels and should be able to use either the chip type (top) or the through-hole type (bottom). So far the chip one has some issues. Probably my soldering. There is also the option to solder in wiring to use different NeoPixels, but without some other +5v you are limited on how many. With the breadboard version I have hooked up their NeoPixel Jewel (7 NeoPixels).

Code is still a WIP.

Kurt

P.S. - Boards for the Teensy 3.1/3.2 version of this should arrive tomorrow. That one is set up with a voltage regulator, plus 2 AX servo connectors, so it can go anywhere in the chain...

r3n33
11-16-2015, 10:53 AM
I have been playing around with r3n33's ideas for the ability to place HROS1-controlled color LEDs in different places on the robot, and have started playing with a simple "hat"-like circuit board for Adafruit Trinket Pros, where I can hook them up in the servo chain.

The boards arrived yesterday, so I built two of them.
[attachment 6283]
They are actually set up to use Adafruit's NeoPixels and should be able to use either the chip type (top) or the through-hole type (bottom). So far the chip one has some issues. Probably my soldering. There is also the option to solder in wiring to use different NeoPixels, but without some other +5v you are limited on how many. With the breadboard version I have hooked up their NeoPixel Jewel (7 NeoPixels).

Code is still a WIP.

Kurt

P.S. - Boards for the Teensy 3.1/3.2 version of this should arrive tomorrow. That one is set up with a voltage regulator, plus 2 AX servo connectors, so it can go anywhere in the chain...

They look so small and certainly easier to assemble without all the wiring. I can't wait to see the Teensy version in action!!

LloydF
11-17-2015, 06:13 AM
Everything on your Framework branch compiled on my RPi2, Yea! :wink: Now to give my hero some new feet.

KurtEck
11-17-2015, 08:15 PM
They look so small and certainly easier to assemble without all the wiring. I can't wait to see the Teensy version in action!!

Yes, they are small!

Here is an assembled Teensy version, with chip...
[attachment 6285]

I have not done much testing with it yet. So far I have simply made sure I could get the strand test to run on the NeoPixel.
I also used the Dout and +5v/Gnd connections and chained it to an external 12-NeoPixel ring, which worked as well.

Next up: test it out... If I do a 2nd build, I do see a few things I would change...
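
For reference, the strand-test level of code is tiny with Adafruit's NeoPixel library. A minimal sketch; the pin number and pixel count are placeholders for whatever this hat board actually uses:

#include <Adafruit_NeoPixel.h>

// Minimal check in the spirit of Adafruit's strandtest example.
// PIN and NUM_PIXELS are placeholders for this particular board.
#define PIN        6
#define NUM_PIXELS 7   // e.g. a NeoPixel Jewel

Adafruit_NeoPixel strip(NUM_PIXELS, PIN, NEO_GRB + NEO_KHZ800);

void setup()
{
    strip.begin();
    strip.show();  // start with all pixels off
}

void loop()
{
    // Walk a dim red pixel down the strand.
    for (int i = 0; i < NUM_PIXELS; i++)
    {
        strip.clear();
        strip.setPixelColor(i, strip.Color(32, 0, 0));
        strip.show();
        delay(100);
    }
}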

KurtEck
11-19-2015, 09:13 AM
Looks like I need to do a 2nd build :( The NeoPixels are not working reliably. I posted up on PJRC and Paul (the owner) responded quickly:

Are you sending a 3.3V signal to the NeoPixel input?

3.3V signals almost always work if the NeoPixels run from 3.3 to 3.7V. But at 5V power, usually a 5V buffer is needed to increase the signal to 5V. That's why we sell the Octo28 shield and why Teensy-LC has a built-in buffer.
So I will design in the same chip he uses on the LC (SN74LV1T125), but use the larger SOT23-5 version. A suggestion also came up that others have needed a pull-down resistor on the data line, although I think that was for setups with long lines of pixels...

Back to the drawing board: I will hack up one of the current ones to see what happens if I connect it to +3.3v...

KurtEck
11-19-2015, 05:17 PM
In case anyone is curious, I have been playing around with a couple of different options here.

One adds the buffer chip and an optional pull-down resistor. Looks something like:
[attachment 6287]
Which, after I verify a few things, I will probably order up a batch of, for about $5 for 3...

But if I use several NeoPixels with this type of VR (I tried 13), the VR can get pretty warm. So I am wondering about using ones like those on the Arbotix-M or Arbotix-Pro.

Earlier I did the start of one using the vertical VR, but decided it was too high:
[attachment 6288]

So today I tried playing with the horizontal version. I decided for this case not to support the pad pattern for the chip, but to allow either one through-hole part or going external. Right now the hole pattern is sized for the through-hole part, but it would overhang. I might instead space the holes out at .1", which would spread the pins out but also make it easier to use a breakoff pin connector for external... If I do that I would probably then drop the Neo Out, as there would be no need....
[attachment 6289]

Still needs work, as it has silk screen over pads and the like, but might be fun to try as, again, it's only $5 (plus parts).

FYI update: with one of my current boards, I removed the NeoPixel and soldered in a new through-hole version, but connected it to 3.3v instead of 5v, and now this one is happy with the 3.3v signal going through. I still wish to update the board, as I don't think the T3.2's 3.3v VR should be used for more than one NeoPixel; the new 3.2's card shows a max output of 250mA.

LloydF
12-18-2015, 07:55 AM
Your head_tracking is in the main github branch now, and it works very well on the RPi2. :cool: Just upside down and backwards, as in it runs from the ball rather than following it, and the picture is inverted?

r3n33
12-19-2015, 02:59 PM
Your head_tracking is in the main github branch now, and it works very well on the RPi2. :cool: Just upside down and backwards, as in it runs from the ball rather than following it, and the picture is inverted?

I noticed this (https://github.com/Interbotix/HROS1-Framework/blob/master/Linux/build/LinuxCamera.cpp#L242) too and flipped my camera over.

LloydF
12-19-2015, 03:34 PM
Does it work correctly with a flipped camera? Or are you saying uncomment them there lines and re-compile?

r3n33
12-20-2015, 04:15 PM
Oh, I meant that by flipping my physical camera over it works without code updates.

Edit: If you look down at line 408 (https://github.com/Interbotix/HROS1-Framework/blob/master/Linux/build/LinuxCamera.cpp#L408) it's flipping the video. Comment out that line, compile, and it will work in its original orientation, I bet.
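
If you'd rather keep the camera mounted normally, the flip itself is cheap. A generic vertical flip of a packed RGB frame looks roughly like this (an illustrative sketch, not the framework's actual LinuxCamera.cpp code):

#include <algorithm>
#include <vector>

// Illustrative in-place vertical flip of a packed RGB888 frame buffer.
// Generic sketch only - not the framework's actual flip implementation.
void flipVertical(unsigned char* pixels, int width, int height)
{
    const int rowBytes = width * 3;  // 3 bytes per RGB pixel
    std::vector<unsigned char> tmp(rowBytes);

    // Swap the top row with the bottom row, working toward the middle.
    for (int top = 0, bot = height - 1; top < bot; ++top, --bot)
    {
        unsigned char* a = pixels + top * rowBytes;
        unsigned char* b = pixels + bot * rowBytes;
        std::copy(a, a + rowBytes, tmp.data());
        std::copy(b, b + rowBytes, a);
        std::copy(tmp.data(), tmp.data() + rowBytes, b);
    }
}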

LloydF
12-21-2015, 04:24 AM
Yea! You are spot on; that did it. Thank you. :wink: This is fun! Finally some camera stuff to work with. :p P.S. We follow the ball. Thank you so, so much; God bless you and yours, and Merry Christmas.
[attachments 6375, 6376]

r3n33
02-03-2016, 03:49 PM
Hey @LloydF! Sorry I missed the message. I'm so glad it worked out and you could start playing with blob tracking.

I hope you had a wonderful holiday and New Year's. Very nice robot team you have there! :D

LloydF
03-18-2016, 08:58 PM
I re-compiled everything on the RPi3 and everything still works. I did note an odd behavior with the head_tracking demo. The first time you run make all, it has an error, but if you then run sudo make clean and make all over again, it compiles correctly and runs fine. The built-in wifi works fine, YEA! I'll play with the Bluetooth next. Would it not be cool to be able to free up all them there USB ports. :wink:

jwatte
03-19-2016, 08:32 PM
it has an error

If you want anyone to be able to do anything about it (or give you ideas about what might be silently going wrong), you'll need to copy and paste that error into the post. We can't guess what's going on otherwise.

LloydF
03-20-2016, 08:00 AM
My bad.