
Multiple View Geometry


Presentation Transcript


  1. Multiple View Geometry Marc Pollefeys University of North Carolina at Chapel Hill Modified by Philippos Mordohai

  2. Outline • Computation of camera matrix P • Introduction to epipolar geometry estimation • Chapters 7 and 9 of “Multiple View Geometry in Computer Vision” by Hartley and Zisserman. Note (2011): the slides refer to chapter numbers from the previous edition of the book. I try to update them but they may be off by one.

  3. Pinhole camera

  4. Camera anatomy • Camera center • Principal point • Principal ray

  5. Camera calibration

  6. Resectioning • Resectioning: estimating the camera from correspondences between 3D entities and their images

  7. Basic equations • xi = P Xi for each 3D–2D correspondence • Similar to last week’s H estimation

  8. Basic equations • Minimal solution: P has 11 dof, each point gives 2 independent equations → 5½ correspondences needed (say 6) • Over-determined solution: n ≥ 6 points, minimize ||Ap|| subject to the constraint ||p|| = 1
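For reference, a sketch of the standard DLT system behind this slide: each correspondence $\mathbf{X}_i \leftrightarrow \mathbf{x}_i$ with $\mathbf{x}_i = (x_i, y_i, w_i)^\top$ contributes two independent rows,

$$\begin{bmatrix} \mathbf{0}^\top & -w_i\,\mathbf{X}_i^\top & y_i\,\mathbf{X}_i^\top \\ w_i\,\mathbf{X}_i^\top & \mathbf{0}^\top & -x_i\,\mathbf{X}_i^\top \end{bmatrix} \begin{pmatrix} \mathbf{P}^1 \\ \mathbf{P}^2 \\ \mathbf{P}^3 \end{pmatrix} = \mathbf{0},$$

where $\mathbf{P}^{j\top}$ is the $j$-th row of $P$. Stacking all $n$ correspondences gives the $2n \times 12$ matrix $A$ in $A\mathbf{p} = \mathbf{0}$.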

  9. Degenerate configurations • Camera and points on a twisted cubic • Points lie on a plane or a single line passing through the projection center

  10. Data normalization • Translate so the centroid is at the origin • Scale isotropically, so that the average distance from the origin is √2 for image points (√3 for 3D points)
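A minimal sketch of that normalizing transform, assuming inhomogeneous input points and an isotropic scaling (the function name is illustrative):

```python
import numpy as np

def normalizing_transform(points):
    """Similarity T so that T*points has centroid 0 and average distance
    sqrt(d) from the origin (points: n x d, inhomogeneous coordinates)."""
    points = np.asarray(points, dtype=float)
    n, d = points.shape
    centroid = points.mean(axis=0)
    avg_dist = np.mean(np.linalg.norm(points - centroid, axis=1))
    scale = np.sqrt(d) / avg_dist
    T = np.eye(d + 1)
    T[:d, :d] *= scale
    T[:d, d] = -scale * centroid
    return T  # apply to homogeneous coordinates: x_norm = T @ x_hom
```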

  11. Geometric error • Minimize Σi d(xi, P Xi)² • Assumes the 3D points are known much more accurately than the image measurements

  12. Gold Standard algorithm • Objective: given n ≥ 6 3D-to-2D point correspondences {Xi ↔ xi}, determine the maximum likelihood estimate of P • Algorithm • (i) Linear solution: • (a) Normalization: x̃i = T xi, X̃i = U Xi • (b) DLT: form the 2n×12 matrix A and solve Ap = 0 subject to ||p|| = 1; p is the singular vector of A corresponding to the smallest singular value • (ii) Minimization of geometric error: using the linear estimate as a starting point, minimize Σi d(x̃i, P̃ X̃i)² • (iii) Denormalization: P = T⁻¹ P̃ U
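A compact sketch of just the linear (DLT) step, assuming NumPy and homogeneous input points; the normalization and the nonlinear refinement are omitted:

```python
import numpy as np

def dlt_camera(X, x):
    """Linear estimate of the 3x4 camera matrix P from n >= 6 correspondences.
    X: n x 4 homogeneous world points, x: n x 3 homogeneous image points."""
    rows = []
    for Xi, xi in zip(X, x):
        xi_, yi, wi = xi
        rows.append(np.hstack([np.zeros(4), -wi * Xi,  yi * Xi]))
        rows.append(np.hstack([ wi * Xi,  np.zeros(4), -xi_ * Xi]))
    A = np.vstack(rows)                      # 2n x 12
    _, _, Vt = np.linalg.svd(A)
    p = Vt[-1]                               # singular vector of smallest singular value
    return p.reshape(3, 4)
```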

  13. Calibration example • Canny edge detection • Straight line fitting to the detected edges • Intersecting the lines to obtain the image corners • typically precision < 1/10 of a pixel • (HZ rule of thumb: use 5n constraints for n unknowns)
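The line-intersection step has a one-line homogeneous form (the intersection of two lines is their cross product); a hypothetical helper:

```python
import numpy as np

def corner_from_lines(l1, l2):
    """Intersection of two image lines given in homogeneous form (a, b, c),
    i.e. the line ax + by + c = 0. Returns inhomogeneous corner coordinates."""
    p = np.cross(l1, l2)        # homogeneous intersection point
    return p[:2] / p[2]
```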

  14. Errors in the world • Errors in the image and in the world points

  15. Restricted camera estimation • Find the best fit that satisfies • skew s is zero • pixels are square • principal point is known • complete calibration matrix K is known • Minimize geometric error • impose constraints through parameterization • image errors only: 9 parameters ↔ 2n measurements; otherwise 3n+9 ↔ 5n • Minimize algebraic error • assume a map from parameters q to the camera, P = K[R | −RC], i.e. p = g(q) • minimize ||A g(q)|| (9 instead of 12 parameters; see the sketch below)
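A minimal sketch of the algebraic-error minimization over the reduced parameterization, assuming zero skew and square pixels; the parameter layout q = (f, px, py, rotation vector, camera center) and the helper names are illustrative, and A is the 2n×12 DLT matrix from before:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def g(q):
    """Map 9 parameters to P = K [R | -R C] with s = 0 and ax = ay = f."""
    f, px, py = q[:3]
    R = Rotation.from_rotvec(q[3:6]).as_matrix()
    C = q[6:9].reshape(3, 1)
    K = np.array([[f, 0.0, px], [0.0, f, py], [0.0, 0.0, 1.0]])
    return K @ np.hstack([R, -R @ C])

def algebraic_residuals(q, A):
    p = g(q).ravel()
    return A @ (p / np.linalg.norm(p))   # ||A g(q)|| with p scaled to unit norm

# A: 2n x 12 DLT matrix; q0: initial parameters from clamping the general DLT result
# result = least_squares(algebraic_residuals, q0, args=(A,))
```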

  16. Restricted camera estimation • Initialization • Use general DLT • Clamp values to desired values, e.g. s = 0, αx = αy • Note: can sometimes cause a big jump in error • Alternative initialization • Use general DLT • Impose soft constraints • gradually increase weights

  17. Exterior orientation (hand-eye calibration) • Calibrated camera, position and orientation unknown → pose estimation • 6 dof → 3 points minimal (4 solutions in general)

  18. Experimental evaluation Algebraic method minimizes 12 errors instead of 2n=396

  19. Covariance estimation • ML residual error (d: number of parameters) • Example: n = 197, residual error = 0.365, estimated σ = 0.37
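The relation behind these numbers is presumably the standard expected-residual formula for N = 2n measurements and d estimated parameters, assuming Gaussian noise with standard deviation σ:

$$\epsilon_{\mathrm{res}} = \sigma\left(1 - \frac{d}{N}\right)^{1/2}, \qquad N = 2n .$$

With n = 197 and d = 11 (a full projective camera), a measured residual of 0.365 corresponds to σ ≈ 0.37, consistent with the example above.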

  20. Covariance for estimated camera • Compute the Jacobian of the measured points with respect to the camera parameters at the ML solution, then propagate the measurement covariance (the variance of each parameter appears on the diagonal of the resulting covariance matrix) • Confidence ellipsoid for the camera center: defined by the inverse cumulative chi-square distribution (the χ² distribution is the distribution of a sum of squares)
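In symbols, a sketch of the standard backward-propagation result assumed here, with $\Sigma_x$ the covariance of the measured points and $J$ the Jacobian of the measurements with respect to the camera parameters at the ML solution:

$$\Sigma_P = \left(J^\top \Sigma_x^{-1} J\right)^{+},$$

and the confidence ellipsoid for the camera center $C$ with covariance $\Sigma_C$:

$$(C - \bar{C})^\top \Sigma_C^{-1} (C - \bar{C}) \le k^2, \qquad k^2 = F^{-1}_{\chi^2_3}(\alpha),$$

where $F^{-1}_{\chi^2_3}$ is the inverse cumulative χ² distribution with 3 degrees of freedom and α the chosen confidence level.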

  21. Radial distortion short and long focal length

  22. Correction of distortion • Choice of the distortion function and its center • Computing the parameters of the distortion function • Minimize with additional unknowns • Straighten lines
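The polynomial radial distortion model typically used here, with $(x_c, y_c)$ the center of distortion and $r$ the radius measured from it:

$$\begin{pmatrix}\hat{x}\\ \hat{y}\end{pmatrix} = \begin{pmatrix}x_c\\ y_c\end{pmatrix} + L(r)\begin{pmatrix}x - x_c\\ y - y_c\end{pmatrix}, \qquad L(r) = 1 + \kappa_1 r^2 + \kappa_2 r^4 + \cdots$$

The coefficients $\kappa_1, \kappa_2, \dots$ (and possibly the center) are the additional unknowns; alternatively they can be chosen so that images of straight scene lines become straight.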

  23. Placing virtual models in video • unmodelled vs. modelled radial distortion (Sony TRV900; miniDV) • Bundle adjustment needed to avoid drift of the virtual object throughout the sequence

  24. Some typical calibration algorithms • Tsai calibration • Zhang calibration: http://research.microsoft.com/~zhang/calib/ • Z. Zhang. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330-1334, 2000. • Z. Zhang. Flexible Camera Calibration by Viewing a Plane from Unknown Orientations. International Conference on Computer Vision (ICCV'99), Corfu, Greece, pages 666-673, September 1999.

  25. Related Topics • Calibration from vanishing points • Calibration from the absolute conic

  26. Outline • Computation of camera matrix P • Introduction to epipolar geometry estimation

  27. Three questions: • (i) Correspondence geometry: Given an image point x in the first image, how does this constrain the position of the corresponding point x’ in the second image? • (ii) Camera geometry (motion): Given a set of corresponding image points {xi ↔ x’i}, i = 1,…,n, what are the cameras P and P’ for the two views? • (iii) Scene geometry (structure): Given corresponding image points xi ↔ x’i and cameras P, P’, what is the position of (their pre-image) X in space?

  28. The epipolar geometry C,C’,x,x’ and X are coplanar

  29. The epipolar geometry What if only C,C’,x are known?

  30. The epipolar geometry All points on π project onto l and l’

  31. The epipolar geometry Family of planes π and lines l and l’; intersection in e and e’

  32. The epipolar geometry • epipoles e, e’ = intersection of baseline with image plane = projection of projection center in other image = vanishing point of camera motion direction • an epipolar plane = plane containing baseline (1-D family) • an epipolar line = intersection of epipolar plane with image (always come in corresponding pairs)

  33. Example: converging cameras

  34. Example: motion parallel with image plane (simple for stereo → rectification)

  35. Example: forward motion

  36. The fundamental matrix F • algebraic representation of epipolar geometry • we will see that the mapping x ↦ l’ is a (singular) correlation (i.e. a projective mapping from points to lines) represented by the fundamental matrix F

  37. The fundamental matrix F • The fundamental matrix satisfies the correspondence condition x’ᵀFx = 0 for any pair of corresponding points x ↔ x’ in the two images

  38. The fundamental matrix F • F is the unique 3×3 rank-2 matrix that satisfies x’ᵀFx = 0 for all x ↔ x’ • Transpose: if F is the fundamental matrix for (P, P’), then Fᵀ is the fundamental matrix for (P’, P) • Epipolar lines: l’ = Fx and l = Fᵀx’ • Epipoles: the epipole lies on all epipolar lines, thus e’ᵀFx = 0 for all x, so e’ᵀF = 0ᵀ; similarly Fe = 0 • F has 7 d.o.f.: 9 − 1 (homogeneous scale) − 1 (rank 2) • F is a correlation, a projective mapping from a point x to a line l’ = Fx (not a proper correlation, i.e. not invertible)
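A small sketch that exercises these relations numerically, given some fundamental matrix F as a NumPy array; the epipoles are recovered as the null spaces of F and Fᵀ:

```python
import numpy as np

def epipoles(F):
    """Right and left null vectors of F: F e = 0 and e'^T F = 0."""
    _, _, Vt = np.linalg.svd(F)
    e = Vt[-1]               # epipole in the first image
    _, _, Vt = np.linalg.svd(F.T)
    e_prime = Vt[-1]         # epipole in the second image
    return e / e[2], e_prime / e_prime[2]

def epipolar_line(F, x):
    """Epipolar line l' = F x in the second image for a homogeneous point x in the first."""
    return F @ x

# For a true correspondence x <-> x', the correspondence condition x'^T F x = 0 holds,
# i.e. x' lies on the line F x.
```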

  39. Projective transformation and invariance • Derivation based purely on projective concepts • F is invariant to projective transformations of 3-space • the map (P, P’) → F is unique, but the map F → (P, P’) is not unique (the cameras are determined only up to a projective transformation) • canonical form: P = [I | 0], P’ = [M | m]

  40. The essential matrix • ~ the fundamental matrix for calibrated cameras (remove K) • 5 d.o.f. (3 for R; 2 for t up to scale) • E is an essential matrix if and only if two of its singular values are equal (and the third is 0)
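In symbols, with normalized image coordinates $\hat{x} = K^{-1}x$ and $\hat{x}' = K'^{-1}x'$:

$$E = [\,t\,]_\times R = K'^{\top} F K, \qquad \hat{x}'^{\top} E\, \hat{x} = 0 .$$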

  41. Four possible reconstructions from E (only one solution in which the reconstructed points lie in front of both cameras)
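A sketch of the standard SVD-based extraction of the four candidate poses from E; the correct one is then chosen by the cheirality test, i.e. by checking which choice puts triangulated points in front of both cameras:

```python
import numpy as np

def candidate_poses(E):
    """Four candidate (R, t) pairs for the second camera P' = [R | t], given E."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:   # ensure the factors yield proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0, -1, 0],
                  [1,  0, 0],
                  [0,  0, 1]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]                # translation up to sign and scale
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```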
