Well now it certainly has been a while since I've updated Timmy's post here. We've been learning a few new tricks, experimenting with different ROS packages and writing a few new nodes just for fun.
I continued to play with ORK and trained new objects, which led to a node that subscribes to the detected objects, queries the object database, and speaks the names of items that fall within my sanity parameters.
Object detection with item announcements:
Sorry, no YouTube videos exist yet.
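The announcement logic itself is simple. Here's a minimal sketch of the filtering-and-speaking step with all ROS plumbing omitted; the function names, the confidence threshold, and the (name, confidence) pair shape are my own illustrative assumptions, not ORK's API (in practice the human-readable name comes from the object database query and the confidence from the recognition result):

```python
def filter_announcements(detections, min_confidence=0.85):
    """Keep only detections that pass the sanity check.

    detections: list of (name, confidence) pairs -- name is the label
    fetched from the object database, confidence comes from the
    recognition result.  The 0.85 threshold is a made-up example.
    """
    return [name for name, conf in detections if conf >= min_confidence]


def build_phrase(names):
    """Turn a list of object names into a sentence for text-to-speech."""
    if not names:
        return ""
    if len(names) == 1:
        return "I see a %s." % names[0]
    return "I see " + ", ".join(names[:-1]) + " and " + names[-1] + "."
```

The resulting phrase would then be handed to whatever text-to-speech node is in use, e.g. `build_phrase(filter_announcements([("coke can", 0.92), ("mug", 0.40)]))` yields "I see a coke can."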
Another package I had wanted to get back to since building ROSie is COB (Care-O-bot) people detection. It detects heads and faces and recognizes people. I started with a node that subscribes to the detection data so Timmy would announce the names of the faces he recognized. Then I took it a step further and used the detection location data to position Timmy's head before he offered his greeting. Somewhere between my Instagram post and the YouTube video I just posted for you all, I added the ability to look around for a face when no detections are present. This led to some funny, almost creepy moments: I'd see movement out of the corner of my eye, and when I looked over, the robot was staring right at me.
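Pointing the head at a detection boils down to two arctangents. A sketch of that math, assuming the detection pose arrives as (x, y, z) in a camera optical frame (x right, y down, z forward) and the head accepts pan/tilt in radians; the sign conventions and the scan parameters are my own assumptions, so flip them to match your servos:

```python
import math

def pan_tilt_toward(x, y, z):
    """Angles to aim the head at a point in the camera's optical frame.

    Assumed convention: x right, y down, z forward (meters).
    Positive pan turns left, positive tilt looks up -- purely my own
    sign choice for this sketch.
    """
    pan = math.atan2(-x, z)                   # yaw toward the face
    tilt = math.atan2(-y, math.hypot(x, z))   # pitch up/down
    return pan, tilt


def scan_pan(t, amplitude=0.8, period=6.0):
    """Slow side-to-side sweep used when no face is detected.

    t is seconds since the scan started; the pan angle oscillates
    between -amplitude and +amplitude radians.
    """
    return amplitude * math.sin(2.0 * math.pi * t / period)
```

A face dead ahead at (0, 0, 1) gives (0, 0); when the detection topic goes quiet, the node falls back to `scan_pan` until a face reappears.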
At some point or another I had been wanting to play with a package called find_object_2d, which, kind of humorously, also contains a find_object_3d launch file for use with depth sensors. Anyway. This one was extremely easy to use and quite fun with a PrimeSense. Beyond testing I haven't done anything with this package, but I'm sure at some point I'll find a use. I did notice this morning that TeamKAIST has a demo video showing the use of this package.
Feature match objects and estimate pose in 3D space:
After a while I started to realize Timmy never walks around. A big part of why is that the environment he lives in isn't really suited to his walking gait, size, etc. He lives in a hackerspace of a home, after all; it's quite cluttered around here. Despite this I do take Timmy out every now and again, and when he's walking around I'm teleoperating. So I wrote another node that subscribes to the depth data (depth-to-laserscan specifically, for simplicity) and decides whether it's feasible to travel forward: if the depth points within my bounds fall below my distance threshold, an obstacle is assumed. When an obstacle is detected he backs up and picks a new route. If he falls over he has the ability to get back up, and the program continues once his body is stabilized.
Simple Rover Node:
A small challenge presented itself, and I'm not sure how much I should be mentioning, but I can share the result. Basically, what I ended up with is my HR-OS1 running Ubuntu 14.04 on an RPi2, running a ROS package containing nodes I created for Arbotix-Pro communication and servo feedback. Servo angles were then transmitted over the network from the HR-OS1 to the HR-OS5 running the ROS stack. Chap became Timmy's puppet master.
HR-OS1 ROS networking:
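The heart of the mirroring is converting raw Dynamixel position ticks into joint angles before publishing them. The post doesn't say which servo series the feedback came from, so this sketch carries presets for both common families (AX: 1024 ticks over 300 degrees, MX: 4096 ticks over 360 degrees); the function names and the dict-based frame format are my own:

```python
import math

# Dynamixel position encodings: (center tick, ticks full scale, degrees full scale)
AX_SERIES = {"center": 512,  "ticks": 1024, "degrees": 300.0}  # AX-12/AX-18
MX_SERIES = {"center": 2048, "ticks": 4096, "degrees": 360.0}  # MX-28/64/106

def ticks_to_radians(tick, servo=AX_SERIES):
    """Convert a raw servo position reading to radians, 0 at center."""
    span = math.radians(servo["degrees"])
    return (tick - servo["center"]) * (span / servo["ticks"])


def mirror_frame(named_ticks, servo=AX_SERIES):
    """Build the joint-name -> angle map sent over the network each cycle.

    named_ticks: dict of joint name -> raw tick read from the OS1 side;
    the OS5 side applies these angles to its matching joints.
    """
    return {name: ticks_to_radians(t, servo) for name, t in named_ticks.items()}
```

With both robots on the same ROS master, the OS1 publishes one such frame per control cycle and the OS5 subscriber plays it back joint-for-joint.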
I suppose that is about all.
Timmy had an amazing time at RoboGames 2016! (and hello to those of you who I ran into).