Translated view navigation for visualizations
Country/Type: United States (US) patent, granted
International Patent Classification (IPC, 7th edition): G06T-011/00; G06T-015/04; G06T-019/00
Application Number: US-0804543 (2013-03-14)
Registration Number: US-9305371 (2016-04-05)
Inventors: Arcas, Blaise Aguera y; Unger, Markus; Barnett, Donald A.; Sinha, Sudipta Narayan; Stollnitz, Eric Joel; Kopf, Johannes Peter; Pylvaenaeinen, Timo Pekka; Messer, Christopher Stephen
Applicant: Uber Technologies, Inc.
Agent: Fenwick & West LLP
Citation Information: cited by 3 patents; cites 24 patents
Abstract
Among other things, one or more techniques and/or systems are provided for defining transition zones for navigating a visualization. The visualization may be constructed from geometry of a scene and one or more texture images depicting the scene from various viewpoints. A transition zone may correspond to portions of the visualization that do not have a one-to-one correspondence with a single texture image, but are generated from textured geometry (e.g., a projection of texture imagery onto the geometry). Because a translated view may have visual error (e.g., a portion of the translated view is not correctly represented by the textured geometry), one or more transition zones, specifying translated view experiences (e.g., unrestricted view navigation, restricted view navigation, etc.), may be defined. For example, a snapback force may be applied when a current view corresponds to a transition zone having a relatively higher error.
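The zone-and-snapback idea in the abstract can be illustrated with a minimal sketch. All names, the scalar error model, and the linear snapback update below are illustrative assumptions, not the patent's implementation: views whose visual error falls below a threshold allow free navigation, while views above it are pulled back toward the nearest source texture image.

```python
# Hypothetical sketch of the abstract's transition-zone behavior.
# Assumptions (not from the patent text): error is a scalar, the
# transition space is parameterized by t in [0, 1] between two source
# images at t=0 and t=1, and snapback is a simple proportional pull.

def classify_view(error: float, threshold: float) -> str:
    """Zone 1: unrestricted navigation; zone 2: restricted (snapback)."""
    return "unrestricted" if error < threshold else "restricted"

def snapback_force(t: float, anchors=(0.0, 1.0), strength=0.5) -> float:
    """Pull a view position t toward the nearest anchor viewpoint
    (a source texture image)."""
    nearest = min(anchors, key=lambda a: abs(t - a))
    return strength * (nearest - t)

# A view paused in a high-error zone drifts back toward an anchor:
t = 0.8
for _ in range(10):
    t += snapback_force(t)
print(round(t, 3))  # → 1.0
```

Each update halves the remaining distance to the nearest anchor, so the paused view converges to the source-image viewpoint at t = 1.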
Representative Claims
1. A method for defining transition zones for navigating a visualization of a scene, comprising: utilizing a computing device to project a first texture image and a second texture image onto a geometry of a scene to create a textured geometry of the scene; determining, via the computing device, an error basin for a transition space between the first texture image and the second texture image based upon a visual error measurement, the transition space based upon the textured geometry; defining, by the computing device, a first transition zone within the transition space based upon one or more first translated views being below an error threshold within the error basin, the first transition zone specifying a first translated view experience for the one or more first translated views; defining, by the computing device, a second transition zone within the transition space based upon one or more second translated views being above the error threshold within the error basin, the second transition zone defining a second translated view experience for the one or more second translated views; receiving a navigation request from a user, the request including a navigation pause within the second transition zone; and responsive to the navigation pause, transitioning a current view position within the second transition zone to an alternative current view.

2. The method of claim 1, wherein the visual error measurement comprises at least one of: an inaccurate geometry measurement; a resolution fallout measurement; a pixel occlusion measurement; or a color difference measurement.

3. The method of claim 1, wherein the first translated view experience specifies unrestricted navigation movement for the one or more first translated views within the first transition zone.

4. The method of claim 1, wherein the second translated view experience specifies the restricted navigation movement, the restricted navigation movement corresponding to a snapback force from a current view position to at least one of the first texture image, the second texture image, or the first transition zone.

5. The method of claim 1, wherein at least one of the first texture image or the second texture image comprises at least one of: a panorama image; a photo image; a generated image; or an orthographic image from an aerial viewpoint.

6. The method of claim 1, further comprising: generating a graph representing one or more texture images available to texture the geometry, the graph comprising a first node representing the first texture image, a second node representing the second texture image, and a transitional edge between the first node and the second node, the transitional edge corresponding to the transition space.

7. The method of claim 6, wherein the determining an error basin comprises: obtaining a first image feature of a first rendered view of the scene at a first point along the transitional edge, the first rendered view based upon the textured geometry; obtaining a second image feature of a second rendered view of the scene at a second point along the transitional edge, the second rendered view based upon the textured geometry; and determining the visual error measurement based upon a comparison of the first image feature of the first rendered view to the second image feature of the second rendered view.

8. The method of claim 1, further comprising: generating a confidence mask comprising one or more pixel confidences, a first pixel confidence of a first geometry pixel specifying a confidence that an object, associated with the first geometry pixel, is represented in both the first texture image and the second texture image.

9. The method of claim 8, further comprising: responsive to the first pixel confidence being below a confidence threshold: determining that the first geometry pixel corresponds to a transient occluder; and modifying the first geometry pixel based upon at least one of a blending technique, an inpaint technique, a shading technique, or a fadeout technique.

10. The method of claim 1, further comprising: generating a first color model for the first texture image and a second color model for the second texture image; and establishing a color relationship between the first texture image and the second texture image based upon the first color model and the second color model.

11. The method of claim 10, further comprising: blending color from the first texture image and the second texture image based upon the color relationship to create a current translated view of the scene.

12. The method of claim 1, further comprising: providing an interactive navigation experience of the scene through one or more current views, the one or more current views comprising a current translated view provided based upon the first translated view experience or the second translated view experience.

13. The method of claim 1, further comprising: defining one or more additional transition zones based upon the error basin.

14. The method of claim 1, wherein the alternative current view is associated with the first transition zone.

15. A system for defining transition zones for navigating a visualization of a scene, comprising: a computer-readable storage device that stores a first set of instructions that when executed by a computing device provides error estimation by implementing the following steps: projecting a first texture image and a second texture image onto a geometry of a scene to create a textured geometry of the scene; and determining an error basin for a transition space between the first texture image and the second texture image based upon a visual error measurement, the transition space based upon the textured geometry; and the computer-readable storage device that stores a second set of instructions that when executed by the computing device provides zone definition by implementing the following steps: defining a first transition zone within the transition space based upon one or more first translated views being below an error threshold within the error basin, the first transition zone specifying a first translated view experience for the one or more first translated views; defining a second transition zone within the transition space based upon one or more second translated views being above the error threshold within the error basin, the second transition zone defining a second translated view experience for the one or more second translated views; receiving a navigation request from a user, the request including a navigation pause within the second transition zone; and responsive to the navigation pause, transitioning a current view position within the second transition zone to an alternative current view.

16. The system of claim 15, further comprising: a color model component configured to: generate a first color model for the first texture image and a second color model for the second texture image; and establish a color relationship between the first texture image and the second texture image based upon the first color model and the second color model.

17. The system of claim 16, wherein the color model component is configured to: blend color from the first texture image and the second texture image based upon the color relationship to create a current translated view of the scene.

18. The system of claim 15, further comprising: a confidence mask component configured to: generate a confidence mask comprising one or more pixel confidences, a first pixel confidence of a first geometry pixel specifying a confidence that an object, associated with the first geometry pixel, is represented in both the first texture image and the second texture image.

19. The system of claim 18, wherein the confidence mask component is configured to: responsive to the first pixel confidence being below a confidence threshold: determine that the first geometry pixel corresponds to a transient occluder; and modify the first geometry pixel based upon at least one of a blending technique, an inpaint technique, a shading technique, or a fadeout technique.

20. The system of claim 15, further comprising: a graph component configured to: generate a graph representing one or more texture images available to texture the geometry, the graph comprising a first node representing the first texture image, a second node representing the second texture image, and a transitional edge between the first node and the second node, the transitional edge corresponding to the transition space.
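Claims 6-7 describe sampling a visual-error curve (the error basin) along a transitional edge between two texture-image nodes, and claim 1 partitions that curve into zones by an error threshold. The sketch below is a loose illustration under stated assumptions: image features are stand-in scalars, the error measurement is the absolute difference between features at adjacent sample points, and the feature names and sampling scheme are invented for illustration, not taken from the patent.

```python
# Illustrative sketch of claims 6-7 and the zone definition of claim 1.
# Assumptions: the transitional edge is parameterized by t in [0, 1];
# feature_at(t) stands in for an image feature of the rendered view at t;
# visual error between adjacent samples is a scalar difference.

def error_basin(feature_at, samples=5):
    """Sample a visual-error curve along a transitional edge."""
    pts = [i / (samples - 1) for i in range(samples)]
    feats = [feature_at(t) for t in pts]
    # Error at each segment: how much the rendered view's feature
    # changed since the previous sample point (claim 7's comparison).
    return [abs(b - a) for a, b in zip(feats, feats[1:])]

def define_zones(basin, threshold):
    """Below-threshold segments -> zone 1 (unrestricted),
    above-threshold segments -> zone 2 (restricted)."""
    return ["zone1" if e < threshold else "zone2" for e in basin]

# Toy feature curve (smoothstep): rendered views change fastest
# mid-transition, so visual error peaks between the two source images.
feature = lambda t: 3 * t**2 - 2 * t**3
basin = error_basin(feature, samples=5)
print(define_zones(basin, threshold=0.2))  # → ['zone1', 'zone2', 'zone2', 'zone1']
```

The output matches the intuition behind the claims: views near either source texture image have low error and permit free navigation, while mid-transition views fall into the restricted zone where the snapback behavior of claim 4 would apply.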
Patents cited by this patent (24)
Hamilton, II, Rick A.; O'Connell, Brian M.; Pickover, Clifford A.; Walker, Keith R., Asynchronous immersive communications in a virtual universe.
Logan, Ronald K.; Szeliski, Richard S.; Uyttendaele, Matthew T., Automatic digital image grouping using criteria based on image metadata and spatial information.
Shum, Heung-Yeung; Han, Mei; Szeliski, Richard S., Interactive construction of 3D models from panoramic images employing hard and soft constraint characterization and decomposing techniques.
Christofferson, Carl L.; Christofferson, Frank C., Method for automatically smoothing object level of detail transitions for regular objects in a computer graphics display system.
Park, Kyoung-Ju; Cho, Sung-Dae; Kim, Soo-Kyun; Moon, Jae-Won; Cho, Nam-Ik; Lee, Sang-Hwa; Koo, Hyung-Il; Ha, Seong-Jong, Method for taking panorama mosaic photograph with a portable terminal.
Sinha, Sudipta N.; Roberts, Richard; Steedly, Drew; Szeliski, Richard, Performing structure from motion for unordered images of a scene with multiple object instances.