
QuickTime VR An Image-based Approach to Virtual Environment Navigation



  1. QuickTime VR: An Image-based Approach to Virtual Environment Navigation Shenchang Eric Chen Apple Computer, Inc. Computer Graphics Lab. 정규만

  2. Introduction(1) • Motivation • All that view interpolation stuff was fine and good, but now I want to create an IBR system that lets people navigate a virtual environment…and I want it to run on your PC! • Forget Lance Williams…I’m gonna get into SIGGRAPH by myself this time!

  3. Introduction(2) • 3D Modeling and Rendering • creating the geometric entities is a laborious manual process • for real-time performance, scene complexity and rendering quality must be limited • need for a special-purpose rendering engine • Branching Movies • video games • limited navigability and interaction

  4. Objective and Main Approach • First, the system should playback at interactive speed on most personal computers available today without hardware acceleration. • Second, the system should accommodate both real and synthetic scenes. • Third, the system should be able to display high quality images independent of scene complexity. • The approach presented is similar to the movie-based approach and shares the same advantages.

  5. The Idea(1) • Use 360-degree cylindrical panoramic images to compose a virtual environment. • Allow panoramas to link through hot spots. • Panoramic images can be created via rendering, panoramic cameras, or by “stitching” together overlapping photos. • Allow the user to view an object from different directions.

  6. The Idea(2) • Why Bother? • Traditional virtual walkthroughs use specialized hardware rendering engines that are not widely available. • Branching movies require lots of space and have limited navigability. • The Objectives: • The system should interactively playback on most PCs without hardware acceleration. • The system must handle both real and synthetic images. • The display should be independent of scene complexity.

  7. Image-Based Rendering(1) • Simulating a virtual camera's motion in space requires 6 degrees of freedom. • Camera rotation • 3 rotational degrees of freedom • rotating the camera's view direction while keeping the viewpoint stationary. • can be accomplished with an image rotation or reprojection of an environment map. • Camera orbiting (object rotation) • can be achieved by orbiting the camera while keeping the view direction centered on the object.

  8. Image-Based Rendering(2) • Camera movement • free motion of the camera in space requires all 6 degrees of freedom. • Camera zooming • changing the camera’s field-of-view • can be done with multi-resolution images.

  9. Image-Based Rendering(3) • Camera Rotation • Must handle pitch, yaw, and roll. • Roll can be done by rotating the image. • Environment map: a canonical projection that can be regarded as an orientation-independent view of the scene. • Environment maps project the scene onto a simple shape. • Reprojecting the environment map can handle pitch and yaw. Figure. Reprojecting a cubic and a spherical environment map.
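The face/texel lookup behind cube-map reprojection can be sketched as follows. This is a minimal sketch: the face labels and axis conventions are illustrative assumptions, not the paper's notation.

```python
def cube_face_uv(dx, dy, dz):
    """Map a 3D view direction to a cube-map face and (u, v) in [0, 1].

    Dominant-axis selection: the face hit by the ray is the one whose
    axis has the largest absolute component. Face names and axis
    conventions here are illustrative, not the paper's.
    """
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    m = max(ax, ay, az)
    if m == ax:
        face = "+x" if dx > 0 else "-x"
        a, b = dy / m, dz / m
    elif m == ay:
        face = "+y" if dy > 0 else "-y"
        a, b = dx / m, dz / m
    else:
        face = "+z" if dz > 0 else "-z"
        a, b = dx / m, dy / m
    # project onto the face plane and remap [-1, 1] -> [0, 1]
    return face, (a + 1) / 2, (b + 1) / 2
```

Rendering a planar view then amounts to one such lookup per output pixel: pitch and yaw change the per-pixel ray directions, while roll stays a plain 2D image rotation.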

  10. Image-Based Rendering(4) • Object Rotation: • Orbiting the camera around an object is equivalent to rotating the object about its center. • Can store all allowable orientations as frames. • Can use View Interpolation in-between these frames.

  11. Image-Based Rendering(5) • Camera Movement • Environment maps can be created at intervals along a path to allow movement down a hallway. • Arrange environment maps in a 2D or 3D lattice for 2D or 3D motion: this requires a lot of space. • Could use View Interpolation to generate nearby environment maps.

  12. Image-Based Rendering(6) • Camera Zooming: • Can use multi-resolution images stored in a tree structure to provide more detail when zooming. • Optimization: segment images so the whole high-resolution image isn't loaded into memory when we zoom.
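Level and tile selection for such a multi-resolution pyramid might look like the sketch below; the tile size and the level-selection policy are assumptions for illustration.

```python
import math

def tiles_for_view(zoom, x, y, w, h, tile=256):
    """Pick a pyramid level for the current zoom factor and the tile
    indices covering the viewport.

    Level 0 is full resolution; each level above it halves both
    dimensions. The viewport (x, y, w, h) is given in full-resolution
    pixel coordinates; zoom < 1 means zoomed out.
    """
    level = 0 if zoom >= 1.0 else int(math.floor(-math.log2(zoom)))
    scale = 2 ** level
    tx0, ty0 = int(x / scale) // tile, int(y / scale) // tile
    tx1 = int((x + w - 1) / scale) // tile
    ty1 = int((y + h - 1) / scale) // tile
    return level, [(tx, ty) for ty in range(ty0, ty1 + 1)
                   for tx in range(tx0, tx1 + 1)]
```

Only the returned tiles need to be loaded and decompressed, which is the point of the segmentation optimization the slide mentions.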

  13. QuickTime VR(1) • Implements continuous camera panning and zooming. • Allows jumping to selected points and object rotation via frame indexing. • Uses cylindrical environment maps to accomplish camera rotation. • It's relatively easy to capture a panorama in this format. • Two movie formats: panoramic & object

  14. QuickTime VR(2) • Panoramic Movie • Each node in a movie is stored in 3 tracks: • The panoramic track holds the graph info and pointers to other tracks. • The video track holds the panoramic images. • The hot spot video track is an optional track that's used to identify regions of the image that activate links.
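As a toy stand-in for the three-track node layout (Python objects rather than actual QuickTime tracks, with invented field names), one could model a node like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PanoNode:
    """One node of a panoramic movie: graph info plus its image tracks.

    A toy model only: the real container stores these as QuickTime
    tracks, and the field names here are invented for illustration.
    """
    node_id: int
    links: dict                           # hot-spot id -> destination node_id
    pano_tiles: list                      # compressed, diced panorama tiles
    hotspot_tiles: Optional[list] = None  # optional hot-spot id images

def follow_hotspot(nodes, current, hotspot_id):
    """Jump to the node a hot spot links to, if the link exists."""
    dest = nodes[current].links.get(hotspot_id)
    return dest if dest is not None else current
```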

  15. QuickTime VR(3) • Panoramic Movie • Hot spot images go through the same warping process as panoramic images, so the hot spots stay attached to the appropriate objects regardless of pans and zooms. • Object Movie • 2D array of frames, one frame per viewing direction.

  16. Interactive Environment(1) • The Panoramic Player • Allows the user to perform continuous panning in vertical & horizontal directions (looking all the way up or down isn't allowed). • Zooming: simple image magnification. • Panoramic images are usually compressed and diced into tiles.

  17. Interactive Environment(2) • The Panoramic Player • The tiles overlapping the current view orientation are decompressed to an offscreen buffer. • The visible region is then warped to show the correct perspective (we can cache tiles to improve performance). • Image warp reprojects sections of the cylindrical image onto a planar view. • Moving in space is accomplished by jumping to points where panoramic images are attached.
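The per-pixel reprojection from cylinder to plane can be sketched as below. The parameter `cyl_half_h` (the half-height of the unit-radius imaged cylinder) is an assumption; a real player would derive it from the capture geometry.

```python
import math

def view_to_pano(px, py, vw, vh, hfov_deg, pan_w, pan_h, cyl_half_h):
    """Map pixel (px, py) of a vw x vh planar view with horizontal FOV
    hfov_deg to (u, v) coordinates in a pan_w x pan_h cylindrical
    panorama covering 360 degrees horizontally."""
    f = (vw / 2) / math.tan(math.radians(hfov_deg) / 2)  # focal length, pixels
    x, y = px - vw / 2, py - vh / 2
    theta = math.atan2(x, f)              # azimuth of this pixel's ray
    h = y / math.hypot(x, f)              # height where the ray meets r = 1
    u = (theta / (2 * math.pi)) % 1.0 * pan_w
    v = (h / cyl_half_h + 1) / 2 * pan_h
    return u, v
```

An actual warper would run this mapping (or its inverse) over only the visible region, with fixed-point arithmetic and filtered sampling; the sketch shows just the geometry.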

  18. Interactive Environment(3) • The Object Player • Each frame in the 2D array of object images is created with a constant color background to facilitate compositing. • Multiple frames for the same direction can be looped continuously (can simulate time varying behavior, like a flickering candle).
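Selecting a frame from that 2D array amounts to quantizing the viewing direction; the grid resolution and tilt range below are illustrative assumptions, not values from the paper.

```python
def frame_index(pan_deg, tilt_deg, pan_steps=36, tilt_steps=19,
                tilt_min=-90.0, tilt_max=90.0):
    """Quantize a (pan, tilt) viewing direction to a (row, col) cell of
    an object movie's 2D frame array. The grid resolution and tilt
    limits are assumed values for illustration."""
    col = int(round((pan_deg % 360.0) / (360.0 / pan_steps))) % pan_steps
    t = min(max(tilt_deg, tilt_min), tilt_max)     # clamp tilt to the rig's range
    row = int(round((t - tilt_min) / (tilt_max - tilt_min) * (tilt_steps - 1)))
    return row, col
```

Looping over several frames stored at the same (row, col) cell is what gives the time-varying behavior (the flickering candle) the slide mentions.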

  19. Authoring Environment(1)

  20. Authoring Environment(2) • Stitching • Create a seamless panoramic image from a set of overlapping pictures.

  21. Authoring Environment(3) • Stitching • The stitcher blends adjacent pictures by smoothing out intensity levels over an overlap region between the two images. • Object Movie-Making • Use a computer-controlled camera rig that surrounds the object and captures frames.
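The blending step can be illustrated with a simple linear cross-fade over the overlap region. This is a sketch of the general feathering idea; the slides do not specify the stitcher's exact blend function.

```python
def blend_overlap(left_strip, right_strip):
    """Cross-fade two overlapping rows of intensities of equal length:
    the weight ramps linearly from the left image to the right one, so
    the intensity transition across the seam is smooth."""
    n = len(left_strip)
    out = []
    for i, (a, b) in enumerate(zip(left_strip, right_strip)):
        w = i / (n - 1) if n > 1 else 0.5   # 0 at the left edge, 1 at the right
        out.append((1 - w) * a + w * b)
    return out
```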

  22. Results(1)

  23. Results(2)

  24. Results(3)

  25. Applications and Questions(1) • Applications • Virtual travel, real estate property inspection, architecture visualization, games, etc. • Discussion Questions • 1. What are the advantages of using an image-based rendering system, as opposed to a system that is geometry-based? • 2. Can you think of some limitations to the IBR systems presented in these papers? (i.e. What assumptions are made, for instance, in the View Interpolation paper or any of the other papers? What effects might be difficult to obtain in an IBR system?)

  26. Applications and Questions(2) • Discussion Questions • 3. When might one of the approaches presented here (plenoptic modeling, view interpolation, or QuickTime VR) be more useful than another? • 4. Can you think of some more applications for the systems in these papers, or for IBR systems in general?
