
Artificial Vision-Based Tele-Operation for Lunar Exploration



Presentation Transcript


  1. Artificial Vision-Based Tele-Operation for Lunar Exploration. Students: Aaron Roney, Albert Soto, Brian Kuehner, David Taylor, Bonnie Stern, Nicholas Logan, Stephanie Herd. Project Mentor: Dr. Giovanni Giardini. Project Advisor: Prof. Tamás Kalmár-Nagy. NASA JSC Mentors: Dr. Bob Savely, Dr. Mike Goza

  2. Project Members • Nicholas Logan, Freshman, Electrical Engineering • Stephanie Herd, Freshman, Computer Engineering • Aaron Roney, Sophomore, Nuclear Engineering • Albert Soto, Sophomore, Mechanical Engineering • Bonnie Stern, Junior, Mechanical Engineering • Brian Kuehner, Senior, Aerospace Engineering • David Taylor, Senior, Aerospace Engineering

  3. Outline • Motivation and Objectives • Ego-Motion Theory • Code Flow • Calibration and Rectification • Hardware • Testing Results • Future Work

  4. Motivation • Lunar surface exploration from a human perspective • In safety and with low risk • 3D environment reconstruction • Self-location with an artificial vision system

  5. Objectives: Visual Feedback System for Tele-Operations • Vision System • Ego-motion estimation • Environment reconstruction • Tele-Operation System • Remote-controlled mobile unit • Hardware and mechanical implementation

  6. System overview [diagram: Vehicle Hardware — Visual System (onboard the vehicle) — Wireless 802.11 Network — Ground Station]

  7. Ego-Motion Theory [section divider; system overview diagram repeated]

  8. 3D Reconstruction Theory [figure: left and right image planes with projected point coordinates (u_left, v_left) and (u_right, v_right)] • It is impossible to compute the 3D coordinates of an object from a single image • Solution: stereo cameras • Disparity computation • 3D reconstruction
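The triangulation step behind stereo 3D reconstruction can be sketched in a few lines. For a rectified stereo pair, depth follows from the standard relation Z = f·B/d; the focal length and baseline values in the usage note below are hypothetical, not taken from the slides:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate depth for a rectified stereo pair: Z = f * B / d,
    with focal length f in pixels, baseline B in meters, and
    disparity d = u_left - u_right in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example with made-up numbers: f = 700 px, B = 0.15 m, d = 35 px.
z = depth_from_disparity(700, 0.15, 35)  # 3.0 m
```

This also shows why a single image is not enough: without the second view there is no disparity, so depth is unobservable.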

  9. Environment Reconstruction • Disparity map computation: given two images, it is a collection of per-pixel disparities • Point distances can be calculated from the disparities • The environment can be reconstructed from the disparity map [figure: left image, right image, disparity map]
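A minimal sketch of disparity computation, using brute-force sum-of-absolute-differences (SAD) matching along one rectified scanline. The slides do not specify which matcher the project used, so this is an illustrative stand-in:

```python
def scanline_disparity(left, right, window=1, max_disp=4):
    """Brute-force SAD block matching along one rectified scanline.
    For each left-image pixel where a full window fits, returns the
    disparity d (shift toward the right image) with the lowest cost."""
    disps = []
    for x in range(window, len(left) - window):
        best_d, best_cost = 0, float("inf")
        # Limit d so the right-image window stays inside the array.
        for d in range(0, min(max_disp, x - window) + 1):
            cost = sum(abs(left[x + k] - right[x - d + k])
                       for k in range(-window, window + 1))
            if cost < best_cost:
                best_cost, best_d = cost, d
        disps.append(best_d)
    return disps

# A bright feature at x=4 on the left appears at x=2 on the right:
ds = scanline_disparity([0, 0, 0, 0, 9, 0, 0, 0],
                        [0, 0, 9, 0, 0, 0, 0, 0])  # disparity 2 at that pixel
```

A full disparity map repeats this per pixel over the whole image; real implementations add subpixel refinement and consistency checks.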

  10. Ego-Motion Estimation • Main goal: estimate the motion (translation and rotation) of the vehicle from sequences of images [figure: optical flow example] • Optical flow is related to vehicle movement through the perspective projection equation • Solving it gives the change in position of the vehicle • Least-squares solution
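The least-squares step can be illustrated for a two-parameter motion vector. Each tracked feature contributes one row of an overdetermined system A·v = b derived from the perspective projection equation; a sketch of the normal-equations solve (illustrative only, not the project's code, and the example A and b are made up):

```python
def solve_least_squares_2(A, b):
    """Solve min ||A v - b|| for a 2-parameter motion vector v via the
    normal equations (A^T A) v = A^T b, using the 2x2 closed form.
    A is a list of (a1, a2) rows; b is the list of flow measurements."""
    s11 = sum(a[0] * a[0] for a in A)
    s12 = sum(a[0] * a[1] for a in A)
    s22 = sum(a[1] * a[1] for a in A)
    r1 = sum(a[0] * y for a, y in zip(A, b))
    r2 = sum(a[1] * y for a, y in zip(A, b))
    det = s11 * s22 - s12 * s12  # singular if all rows are parallel
    return ((s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det)

# Three measurements consistent with v = (2, 3) are recovered exactly:
v = solve_least_squares_2([(1, 0), (0, 1), (1, 1)], [2, 3, 5])
```

The full 6-DOF case (X, Y, Z plus three rotations) works the same way with six columns, usually via a library solver rather than a closed form.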

  11. Code Flow [section divider; system overview diagram repeated]

  12. Image Processing Code • Logitech QuickCam Deluxe → Acquire Images → Rectify Images (using calibration parameters) → Ego-Motion Estimation • Runs on a Sony VAIO (Pentium 4); results sent over the Wireless 802.11 Network to the Ground Station

  13. Mobile Unit Detailed Code • Acquire image (T = 0.15 s): snapshot → image matrix; image parameters: grayscale, 640x480, … • Rectify images (T = 0.5 s): apply distortion coefficients (calibration parameters) to the image matrix → rectified image matrix; save image • Rectified images sent over the Wireless 802.11 Network to the Ground Station for ego-motion estimation

  14. Ego-Motion Estimation Overview (T = 3 s) • Find features in the right image; track them in the left image • Find features in the new right image; track the right-image features into the new right image • Find features in the new left image; track the right-image features into the new left image • Discard all points not identical in all images → image feature matrix • Using the calibration parameters, compute the displacement vector (X, Y, Z, X-rot, Y-rot, Z-rot); results sent over the Wireless 802.11 Network
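The "discard all non-identical points" step amounts to keeping only the features successfully tracked in every one of the four images. A minimal sketch, where the feature IDs are hypothetical labels for tracked correspondences:

```python
def common_features(*tracked_sets):
    """Keep only feature IDs present in every image's tracked set;
    ego-motion is then estimated from these shared correspondences."""
    ids = set(tracked_sets[0])
    for s in tracked_sets[1:]:
        ids &= set(s)
    return sorted(ids)

# Features tracked in right, left, new-right, new-left images:
kept = common_features([1, 2, 3], [2, 3, 4], [2, 3], [3, 2, 5])  # [2, 3]
```

Dropping partially tracked features keeps the least-squares system consistent: every row of the motion system refers to the same physical point seen in all four views.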

  15. Calibration and Rectification [section divider; system overview diagram repeated]

  16. Calibration and Rectification • Calibration: uses MATLAB tools to determine the image distortion associated with the camera • Rectification: removes the distortion from the images
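Calibration typically estimates radial distortion coefficients (k1, k2, …); rectification then inverts that model. A sketch of the standard radial model in normalized image coordinates (the model fitted by MATLAB's calibration tools also includes tangential terms, omitted here for brevity):

```python
def distort(x, y, k1, k2=0.0):
    """Standard radial distortion model in normalized image coordinates:
    distorted = undistorted * (1 + k1*r^2 + k2*r^4), with r^2 = x^2 + y^2.
    Rectification applies the inverse of this mapping (usually iteratively)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return (x * scale, y * scale)

# A point half-way to the image edge, with a made-up k1 = 0.1:
xd, yd = distort(0.5, 0.0, 0.1)  # pushed outward to about 0.5125
```

Because the model is monotone near the center, the inverse is well defined there, which is why a few fixed-point iterations suffice to undistort a pixel.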

  17. Hardware [section divider; system overview diagram repeated]

  18. Hardware • Mobile Unit: web cameras, laptop, TROPOS router, Wireless 802.11 • Base Station: operator computer, command computer, Linksys router, Wireless 802.11

  19. Improvements Implemented in the System • Improved robustness of the software • Implemented a menu-driven system for the operator using MATLAB's network handling protocol: taking pictures, running ego-motion, sending all results to the operator, and graphically displaying the optical flow • Reduced crashing • Achieved greater mobile-unit control

  20. Mobile Unit (vehicle courtesy of Prof. Dezhen Song) [figure: horizontal view with fields of view FOV1 and FOV2, overlap distance D, angle α, baseline L] • Camera support system • 3-DOF mechanical neck: panoramic rotation, tilt rotation, telescopic capability • Controlled height and baseline length
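Using the slide's symbols (baseline L, field-of-view angle α), the distance D at which two forward-facing cameras' fields of view begin to overlap follows from simple geometry: each camera covers a half-angle α/2, so D = (L/2)/tan(α/2). This is an illustrative derivation for why baseline length matters, not a result from the slides:

```python
import math

def min_overlap_distance(baseline_m, fov_deg):
    """Distance at which the fields of view of two parallel cameras,
    separated by a baseline L, start to overlap: D = (L/2) / tan(a/2)."""
    half_fov = math.radians(fov_deg) / 2.0
    return (baseline_m / 2.0) / math.tan(half_fov)

# Made-up numbers: a 20 cm baseline with 90-degree lenses
# overlaps from 10 cm out.
d = min_overlap_distance(0.2, 90.0)  # 0.1 m
```

A longer baseline improves depth resolution at range but pushes the near edge of the stereo-visible region farther out, which is one reason a controllable baseline is useful.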

  21. Testing Results [section divider; system overview diagram repeated]

  22. Test Environment (simulated lunar environment) • Light to simulate solar exposure • Black background to eliminate background features • Walls to eliminate stray light and side shadows • Measured displacements

  23. Test Setup • 25 pictures taken from each location (0, 5, 10 and 15 cm) in the Z direction (perpendicular to the camera focal plane), unidirectional movement • Set 1: 25 images at Z = 0 • Set 2: 25 images at Z = 5 • Set 3: 25 images at Z = 10 • Set 4: 25 images at Z = 15 • Distances measured with a tape measure • Cameras mounted on a semi-rigid fixture

  24. Determining the Number of Features • Results for a 5 cm displacement, using all 100 images and comparing each set to the previous one • The standard deviation decreases as more features are used, but the accuracy of the results decreases • 100 features were selected as the compromise
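The per-set comparison behind this trade-off reduces to the mean and sample standard deviation of the 25 repeated displacement estimates at each feature count; a minimal sketch (the estimate values in the example are made up):

```python
from statistics import mean, stdev

def summarize(estimates_cm):
    """Mean and sample standard deviation of repeated ego-motion
    displacement estimates for one test set."""
    return mean(estimates_cm), stdev(estimates_cm)

# e.g. three hypothetical estimates of a 5 cm displacement:
m, s = summarize([4.0, 5.0, 6.0])  # mean 5.0 cm, std 1.0 cm
```

Accuracy corresponds to how close the mean is to the tape-measured displacement; precision corresponds to the standard deviation, which is what shrinks as the feature count grows.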

  25. Ego-Motion: Example [figures: optical flow overlaid on the left image and on the right image]

  26. Problems • Images were not rectified • Possible motion of the cameras between images • No image filtering • Camera mounting is misaligned • Images acquired from the right camera appear blurry

  27. Conclusions and Future Work • Demonstrated: ego-motion estimation, environment reconstruction, vehicle control and movement, system integration • Future developments: filtering and improving the results, increasing the robustness of the vision system, creating a visual 3D environment map

  28. Acknowledgements • Thanks to: • Prof. Tamás Kalmár-Nagy • Dr. Giovanni Giardini • Prof. Dezhen Song • Change Young Kim • Magda Lagoudas • Tarek Elgohary • Pedro Davalos
