
Visual Tracking on an Autonomous Self-contained Humanoid Robot

Mauro Rodrigues, Filipe Silva, Vítor Santos, University of Aveiro. CLAWAR 2008: Eleventh International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines.



Presentation Transcript


  1. Visual Tracking on an Autonomous Self-contained Humanoid Robot Mauro Rodrigues, Filipe Silva, Vítor Santos University of Aveiro CLAWAR 2008 Eleventh International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines 08 – 10 September 2008, Coimbra, Portugal

  2. Outline • Overview • Objectives • Self-Contained Platform • Vision System • Experimental Results • Conclusions

  3. Overview: Humanoid Platform • Humanoid robot developed at the University of Aveiro • The ambition is to participate in RoboCup • Platform comprises 22 DOFs • Head mounted on a pan-tilt unit (PTU) • Up to 70 cm tall with a mass of 6.5 kg

  4. Overview: Distributed Control Architecture • Master/multi-slave configuration on a CAN bus • Central Processing Unit: • Image processing and visual tracking • Interaction with an external computer for monitoring, debugging or tele-operation • Master: • CPU/slaves communication interface • Slaves: • Interface with actuators and sensors
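The master/multi-slave split described above can be sketched in a few lines: the master's job in each control cycle is to pack the CPU's joint setpoints into CAN frames addressed to the slave controllers. The node IDs and the centidegree fixed-point payload layout below are purely illustrative assumptions, not the robot's actual protocol.

```python
# Minimal sketch of the master's dispatch role on the CAN bus.
# Node IDs and payload encoding are hypothetical, chosen only for illustration.

SLAVE_IDS = {"head_pan": 0x10, "head_tilt": 0x11}  # assumed node IDs

def frame(node_id, setpoint_deg):
    """Pack a joint setpoint (degrees) into a (id, payload) CAN-style frame.

    Uses an assumed 16-bit centidegree fixed-point encoding, big-endian.
    """
    raw = int(setpoint_deg * 100) & 0xFFFF
    return (node_id, bytes([raw >> 8, raw & 0xFF]))

def dispatch(commands):
    """Master loop body: one frame per commanded slave per control cycle."""
    return [frame(SLAVE_IDS[joint], sp) for joint, sp in commands.items()]
```

A real implementation would hand these frames to a CAN driver; the point here is only the one-master, many-slaves fan-out per cycle.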

  5. Objectives • Central Processing Unit Integration • Computational autonomy • Development environment • Vision System Development • Visual Tracking Approach • Detection and tracking of a moving target (ball)

  6. Self-Contained Platform • CPU: standard PCI-104 • AMD Geode LX-800 @ 500 MHz • 512 MB RAM • 1 GB SSD • Video signal capture • PCMCIA FireWire board • Dual PCMCIA PC104 module • UniBrain Fire-i camera @ 30 fps (640×480) • Development environment • Linux-based • OpenCV

  7. Vision System

  8. Vision System • Processing pipeline: Acquisition → Pre-processing → Segmentation (H, S and V components) → Mask → Object Location
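The segmentation and object-location stages of the pipeline above can be sketched as follows. This is a minimal NumPy version, assuming an HSV image and hand-picked thresholds (`h_range`, `s_min`, `v_min` are illustrative parameter names, not the paper's); the real system runs on OpenCV.

```python
import numpy as np

def segment_ball(hsv, h_range, s_min, v_min):
    """Binary mask of pixels whose hue lies in h_range and whose
    saturation/value exceed the given minima (the H, S, V segmentation step)."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return (h >= h_range[0]) & (h <= h_range[1]) & (s >= s_min) & (v >= v_min)

def locate(mask):
    """Object location step: centroid (row, col) of the mask, or None if empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return (ys.mean(), xs.mean())
```

The centroid then feeds the tracking controller as the ball's position in the image.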

  9. Vision System • Dynamic Region of Interest (ROI) • Reduced noise impact • Faster computation • (Figure: segmentation without ROI vs. with ROI)

  10. Vision System: Dynamic Region of Interest (ROI)
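The dynamic ROI idea on this slide reduces to re-centring a search window on the last detection and clamping it to the image. A minimal sketch, with an assumed square window of half-width `half` (the actual window shape and sizing policy are not given in the transcript):

```python
def update_roi(center, half, shape):
    """Clamp a square ROI of half-width `half` (pixels) around the last
    detected ball centre `center` = (row, col) to an image of size `shape`.

    Returns (y0, y1, x0, x1) bounds; the next frame is segmented only
    inside this window, which cuts both computation and background noise.
    """
    cy, cx = center
    y0 = max(0, int(cy) - half)
    y1 = min(shape[0], int(cy) + half)
    x0 = max(0, int(cx) - half)
    x1 = min(shape[1], int(cx) + half)
    return y0, y1, x0, x1
```

If the ball is lost inside the window, a tracker would typically fall back to a full-image search.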

  11. Vision System: Visual Tracking Approach • Keep target close to image centre • Image-based algorithm • Fixed-gains proportional law, Δq = K·e • Δq, joint increment vector • K, constant gain matrix • e, error vector defined by the ball's offset • Variable-gains nonlinear law, in which the gains vary with the error
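The fixed-gains proportional law Δq = K·e above can be written directly for a 2-DOF pan-tilt head. The exact expression of the variable-gains nonlinear law is not preserved in the transcript, so the second function below is only an assumed illustrative form (gain scaled by the error magnitude through a hypothetical `alpha` parameter), not the authors' actual law.

```python
def fixed_gain_step(error, K):
    """Fixed-gains proportional law: delta_q = K * e, elementwise for
    a diagonal gain matrix over the (pan, tilt) error in pixels."""
    return [k * e for k, e in zip(K, error)]

def variable_gain_step(error, K, alpha=0.1):
    """Illustrative variable-gains law (assumed form): each gain is
    inflated with the error magnitude, alpha being a hypothetical
    shaping parameter. The paper's actual nonlinear law may differ."""
    return [k * (1.0 + alpha * abs(e)) * e for k, e in zip(K, error)]
```

In both cases the output is the joint increment commanded to the pan and tilt servos each vision cycle.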

  12. Experimental Results: Self-Contained Platform • Acquisition • libdc1394-based library • Acquisition @ 320×240 with down-sampling: ~24 ms • Processing • Without dynamic ROI: 15 ms • With dynamic ROI: 11 ms • Total ≈ 40 ms (25 Hz)

  13. Experimental Results: Visual Tracking • Ball alignment in ~1 s • Steady-state error (~7 pixels)

  14. Experimental Results: Visual Tracking • Pan tracking with fixed gains • Error increases in the frontal area of the robot

  15. Experimental Results: Visual Tracking • Pan tracking with variable gains • Frontal-area error reduced • (Figure: fixed gains vs. variable gains)

  16. Experimental Results: Visual Tracking • Tilt tracking with variable gains • Error similar to pan tracking • The trunk increases the error

  17. Experimental Results: Visual Tracking

  18. Conclusions • The implemented architecture separates high-level vision processing from low-level actuator control • The dynamic Region of Interest provides greater noise immunity and faster computation • Low location and alignment error with a stationary target, with fast convergence • The tracking error reveals the need for more sophisticated control • Autonomous self-contained humanoid platform • 25 Hz average processing rate, sufficient to deal with fast stimuli and other quickly changing visual inputs

  19. Future Work • Validate ball detection through shape detection • Recognition of other elements, such as those present in the RoboCup competition • Explore alternative Visual Servoing techniques • Study the influence of the robot's movement on the visual information and on the tracking system's performance

  20. Thank you for your attention
