
Occlusion-Aware Multi-View Reconstruction of Articulated Objects for Manipulation



Presentation Transcript


  1. Occlusion-Aware Multi-View Reconstruction of Articulated Objects for Manipulation. Xiaoxia Huang. Committee members: Dr. Stanley Birchfield (Advisor), Dr. Ian Walker, Dr. John Gowdy, Dr. Damon Woodard

  2. Motivation. Service robots: Care-O-bot 3 (Fraunhofer IPA), Robot Rose (TSR)

  3. Motivation Domestic robots in many applications require manipulation of articulated objects • Tools: scissors, shears, pliers, stapler • Furniture: cabinets, drawers, doors, windows, fridge • Devices: laptop, cell phone • Toys: truck, puppet, train, tricycle Important problem: Learning kinematic models

  4. Approach • Part 1: Reconstruct 3D articulated model using multiple perspective views • Part 2: Manipulate articulated objects – even occluded parts • Part 3: Apply RGBD sensor to improve performance

  5. Related Work: [Ross et al., IJCV 2010], [Katz et al., ISER 2010], [Sturm et al., IJCAI 2009, IROS 2010], [Sturm et al., ICRA 2010], [Yan et al., PAMI 2008]

  6. Our Approach Recovers kinematic structure from images Features: • Uses single camera • Produces dense 3D models • Recovers both prismatic and revolute joints • Handles multiple joints • Provides occlusion awareness

  7. Approach • Part 1: Reconstruct 3D articulated model using multiple perspective views • Part 2: Manipulate articulated objects – even occluded parts • Part 3: Apply RGBD sensor to improve performance

  8. Articulated Model Reconstruction

  9. Procrustes-Lo-RANSAC (PLR) pipeline: two image sets {I1} and {I2} (one per object configuration) → 3D reconstruction (point clouds P and Q) → alignment / segmentation (R, t) → 2D joint estimation (joint point w) → 3D joint estimation: axis direction estimation (u), axis point estimation (w), rotation angle θ

  10. Procrustes-Lo-RANSAC (PLR) [pipeline diagram repeated; the following slides detail the 3D reconstruction stage]

  11. Camera Calibration. Input: images {I}. Features: SIFT keypoints x'. Camera model: intrinsic parameters K and extrinsic parameters (R, t) map object points to image points. Output: K, R, t. Bundler: http://phototour.cs.washington.edu/bundler/

  12. SIFT Features • Input images • SIFT features [Lowe, IJCV 2004] • Matched SIFT features: 658 keypoints, 651 keypoints, 24 matches
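A minimal sketch of this feature detection and matching step, using OpenCV's SIFT and Lowe's ratio test (an assumption about tooling; the thesis pipeline runs SIFT inside Bundler, and the filenames below are hypothetical):

```python
import cv2

# Hypothetical input filenames; any two views of the object would do.
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT keypoints and descriptors [Lowe, IJCV 2004].
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep matches that pass Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
knn = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in knn if m.distance < 0.75 * n.distance]
print(len(kp1), "and", len(kp2), "keypoints,", len(good), "matches")
```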

  13. Camera Calibration. Input: images {I}. Features: SIFT keypoints x'. Intrinsic parameters K initialized from the EXIF tag or a default value. Structure from motion minimizes the reprojection error of the camera model (intrinsic parameters K, extrinsic parameters R, t) mapping object points to image points. Output: K, R, t. Bundler: http://phototour.cs.washington.edu/bundler/
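To make the minimized error concrete, here is a small sketch of the projection of a 3D object point through the camera model (K, R, t) and the squared pixel error that structure from motion sums over all cameras and points (a simplification: lens distortion is ignored):

```python
import numpy as np

def reproject(K, R, t, X):
    """Project a 3D object point X into the image using intrinsics K
    and extrinsics (R, t); returns pixel coordinates."""
    x_cam = R @ X + t              # object point in camera coordinates
    x_img = K @ x_cam              # apply intrinsic parameters
    return x_img[:2] / x_img[2]    # perspective division

def reprojection_error(K, R, t, X, x_observed):
    """Squared pixel error for one point in one view; bundle adjustment
    minimizes the sum of these terms over all points and cameras."""
    return float(np.sum((reproject(K, R, t, X) - x_observed) ** 2))
```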

  14. Camera Calibration. Recovered camera positions and 3D model from 147 images of a toy truck

  15. 3D Model Reconstruction. A patch on the object is projected into images I1 and I2; patches are expanded to neighboring empty image cells, but are not expanded across a depth discontinuity. PMVS: http://grail.cs.washington.edu/software/pmvs/

  16. 3D Model Reconstruction. Comparison: 3D model from Bundler vs. 3D model from PMVS

  17. Procrustes-Lo-RANSAC (PLR) [pipeline diagram repeated; the following slides detail the alignment / segmentation stage]

  18. Alignment / Segmentation. Extract ASIFT features F1 and F2 from image sets {I1} and {I2}; match features; find the closest 3D correspondences S1 and S2 in the point clouds P and Q; estimate the transformation (R, t, σ) with Procrustes + Lo-RANSAC; project into the image C and update; repeat until the fit is good, then segment. The Lo-RANSAC step is sketched below.
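A rough sketch of the Procrustes + Lo-RANSAC step in this loop, assuming P and Q are arrays of putative 3D correspondences; `procrustes` is the similarity fit sketched after the Procrustes Algorithm slides below, and the sample size and inlier tolerance are illustrative, not the thesis's settings:

```python
import numpy as np

def lo_ransac_align(P, Q, n_iters=1000, inlier_tol=0.01):
    """Fit a similarity transform (R, t, scale) from P to Q with Lo-RANSAC:
    fit to a minimal random sample, score by inlier count, and locally
    optimize by refitting on the inliers of the best hypothesis so far."""
    rng = np.random.default_rng(0)
    best, best_count = None, -1
    for _ in range(n_iters):
        idx = rng.choice(len(P), size=3, replace=False)
        R, t, s = procrustes(P[idx], Q[idx])
        err = np.linalg.norm((s * (R @ P.T)).T + t - Q, axis=1)
        inliers = err < inlier_tol
        if inliers.sum() > best_count:
            # Local optimization: refit on all current inliers.
            R, t, s = procrustes(P[inliers], Q[inliers])
            best, best_count = (R, t, s, inliers), inliers.sum()
    return best
```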

  19. Procrustes Analysis ({A}, {B}). Procrustes analysis is the process of performing a shape-preserving similarity transformation. Greek myth: http://www.mythweb.com/today/today07.html

  20. Procrustes Algorithm. Step 1: Translation. Center the point sets X and Y at their means μX and μY

  21. Procrustes Algorithm. Step 2: Scale. Normalize the sizes of X and Y

  22. Procrustes Algorithm. Step 3: Rotation. Solve for the rotation R that aligns X with Y
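The three steps above, sketched as a single similarity fit (translation, scale, rotation via SVD). This is the standard formulation, not necessarily the thesis's exact implementation:

```python
import numpy as np

def procrustes(X, Y):
    """Shape-preserving similarity transform (R, t, s) mapping point set X
    onto Y, where X and Y are N x 3 arrays of corresponding points."""
    # Step 1: translation - center both sets at their means.
    mu_x, mu_y = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mu_x, Y - mu_y
    # Step 2: scale - ratio of root-mean-square sizes.
    s = np.sqrt((Yc ** 2).sum() / (Xc ** 2).sum())
    # Step 3: rotation - orthogonal Procrustes via SVD (Kabsch).
    U, _, Vt = np.linalg.svd(Yc.T @ (s * Xc))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflection
    R = U @ D @ Vt
    t = mu_y - s * R @ mu_x
    return R, t, s
```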

  23. Procrustes-Lo-RANSAC (PLR) [pipeline diagram repeated; the following slides detail the 2D joint estimation stage]

  24. 2D Joint Estimation. 3D model in Configuration 1 (frame {A}): Link 0, Link 1, and joint point w

  25. Two Links: Change Configuration. 3D model in Configuration 2 (frame {B}): Link 0, Link 1, and joint point w

  26. Object Model in 2D. Transformation of Link 0 between the two configurations (Configuration 1 in frame {A}, Configuration 2 in frame {B})

  27. Align Link 0. With Link 0 aligned, both configurations are expressed in frame {A}; transformation of Link 1 between the two configurations

  28. Align Link 1. Configuration 1 and Configuration 2 in frame {A}: Link 0 and Link 1

  29. 2D Joint Estimation. After aligning Link 0, the motion of Link 1 between the two configurations (in frame {A}) is a rotation R1 about the joint point w: a point x maps to R1(x − w) + w, so the translation is t1 = (I − R1)w. 2D joint: w = (I − R1)⁻¹ t1
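A one-line sketch of that fixed-point computation (assuming the rotation angle is nonzero so I − R1 is invertible):

```python
import numpy as np

def joint_point_2d(R1, t1):
    """2D joint point w: the fixed point of the rigid motion (R1, t1) of
    Link 1 once Link 0 is aligned, i.e. w = R1 @ w + t1."""
    return np.linalg.solve(np.eye(2) - R1, t1)
```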

  30. Procrustes-Lo-RANSAC (PLR) [pipeline diagram repeated; the following slides detail the 3D joint estimation stage]

  31. 3D Joint. The joint type is classified as revolute or prismatic using R. Revolute joint: axis direction u and axis point w estimated from the rotation (following slides). Prismatic joint: axis direction u = t/|t|, axis point w = mean({pi})

  32. Revolute Joint Direction. Two methods to recover the axis direction u and angle θ from R: (1) direct computation from the axis-angle representation; (2) eigenvalues / eigenvectors of R. (Singularity: θ = 0° or θ = 180°)
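A sketch of the second method (u is the eigenvector of R with eigenvalue 1), plus the angle from the trace of R; this is the textbook relation, shown only to make the slide concrete:

```python
import numpy as np

def axis_direction(R):
    """Revolute axis direction u and angle theta from a 3x3 rotation R.
    Degenerate when theta is 0 or 180 degrees."""
    eigvals, eigvecs = np.linalg.eig(R)
    u = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    u /= np.linalg.norm(u)
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    return u, theta
```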

  33. Revolute Joint Point. Diagram: rotation axis u, rotation plane πu, rotation angle θ, relative motion (R, t) between the two configurations in frame {A}

  34. Revolute Axis Point. From the rotation and translation together with the 2D joint, compute the 3D axis point in the rotation plane πu
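One possible way to realize this step (an assumption, not necessarily the thesis's derivation): treat the axis point as a fixed point of the relative motion, solve (I − R)w = t in least squares (the system is rank-deficient along u), and keep the component of w perpendicular to the axis:

```python
import numpy as np

def axis_point(R, t, u):
    """A 3D point w on the revolute axis, given the relative motion (R, t)
    of the moving link and the axis direction u."""
    w, *_ = np.linalg.lstsq(np.eye(3) - R, t, rcond=None)
    return w - np.dot(w, u) * u   # any point along the axis is equivalent
```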

  35. Experimental Results (1). 1 out of 22 images and 1 out of 19 images shown; red line is the estimated axis

  36. Experimental Results (2). 1 out of 17 images and 1 out of 20 images shown; red line is the estimated axis

  37. Experimental Results (3). 1 out of 99 images and 1 out of 94 images shown; red line is the estimated axis

  38. Experimental Results (4). 1 out of 24 images and 1 out of 25 images shown; red line is the estimated axis

  39. Experimental Results (5). 1 out of 13 images and 1 out of 18 images shown; red line is the estimated axis

  40. Experimental Results. Average and standard deviation of the angle error (table): high performance

  41. Experimental Results (6). 7.6° angle difference between the two axes

  42. Experimental Results (6)

  43. Experimental Results (7). 2.5° angle difference between the two axes

  44. Approach • Part 1: Reconstruct 3D articulated model using multiple perspective views • Part 2: Manipulate articulated objects – even occluded parts • Part 3: Apply RGBD sensor to improve performance

  45. Part 2: Object Manipulation • Robotic arm + eye-in-hand (camera on the end effector) • 3D articulated model + scale estimation (σ) • Object registration + manipulation. (Figure: 3D articulated model; manipulating the object; robotic arm)

  46. Object Registration • Camera calibration • Hand-eye calibration • Robot calibration

  47. Hand-eye Calibration • Calibration object: chessboard • The robotic arm with a camera moves from pose P to pose Q • A is the motion of the camera, B is the corresponding motion of the robot hand. Let X be the unknown hand-to-camera transform. Then the motions satisfy AX = XB. Since A and B are measured from the camera and the robot, the only unknown is X
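A minimal sketch of solving AX = XB with OpenCV's built-in hand-eye solver (an assumption about tooling, not the thesis's solver). The inputs are lists of hand poses from the robot's forward kinematics and chessboard poses from the camera, gathered at the same arm configurations:

```python
import cv2

def hand_eye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam):
    """Solve AX = XB for the camera-to-hand transform X.
    R_gripper2base/t_gripper2base: robot hand poses in the base frame.
    R_target2cam/t_target2cam: chessboard poses in the camera frame."""
    R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
        R_gripper2base, t_gripper2base, R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    return R_cam2gripper, t_cam2gripper
```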

  48. Object Pose Estimation • Place the object in the camera's field of view • Take an image of the object at some viewpoint • Detect 2D-3D correspondences • Estimate the object pose with the POSIT algorithm • POSIT: does not require the correspondences to be planar; iteratively approximates the object pose using POS (Pose from Orthography and Scaling); POS approximates the perspective projection by a scaled orthographic projection
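POSIT lives in OpenCV's legacy C API; as a hedged stand-in, the same 2D-3D pose step can be sketched with cv2.solvePnP (an assumption for illustration, not the thesis's exact algorithm):

```python
import cv2
import numpy as np

def estimate_object_pose(object_points, image_points, K):
    """Estimate the object pose from Nx3 model points and Nx2 image
    detections, given the 3x3 intrinsic matrix K."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float32),
        np.asarray(image_points, dtype=np.float32),
        K, None, flags=cv2.SOLVEPNP_ITERATIVE)
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    return ok, R, tvec
```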

  49. Experimental Results (1) • Camera calibration: 20 different views of a chessboard (7×9 squares of 10×10 mm)

  50. Experimental Results (1) • Corner extraction. +: extracted corner (accurate to 0.1 pixel); corner finder window (5×5 mm)
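A minimal sketch of the corner extraction and chessboard calibration described on these two slides, using OpenCV; the image folder is hypothetical, and a 7×9-square board is assumed to have an 8×6 grid of inner corners:

```python
import glob
import cv2
import numpy as np

pattern = (8, 6)                      # inner corners of a 7x9-square board
square = 10.0                         # square size in mm
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):             # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        # Refine corners to sub-pixel accuracy inside a small search window.
        corners = cv2.cornerSubPix(
            gray, corners, (5, 5), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.1))
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("intrinsics K:\n", K, "\nreprojection RMS:", rms)
```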
