Blending 3D model textures by image projection
IPC Classification Information
Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): G09G-005/14; G06T-017/20; G06T-019/20; G06T-015/04; G06T-011/00
Application number: US-0479952 (2012-05-24)
Registration number: US-9224233 (2015-12-29)
Inventor: Hsu, Stephen Charles
Applicant: Google Inc.
Agent: Dority & Manning, P.A.
Citation information: cited by 0 patents; cites 9 patents
Abstract
An example method and system for blending textures of a composite image formed by a plurality of source images mapped onto a three dimensional model are presented. The composite image is projected to obtain an unblended projected image having textures. The textures are blended to obtain a blended projected image. Both the unblended and the blended projected images are backprojected onto the three dimensional model. A difference is determined between a pixel of the backprojected, blended image and a corresponding pixel of the backprojected, unblended image. The determined difference is then applied to a further corresponding pixel of the composite image to obtain a modified composite image representing a blending of the plurality of source images in the composite image.
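The abstract's five-step pipeline (project, blend, backproject both images, difference, apply) can be sketched as follows. This is a minimal illustration only, assuming the images are NumPy arrays and that `project`, `backproject`, and `blend` are caller-supplied placeholder functions; the patent itself does not prescribe these signatures.

```python
import numpy as np

def blend_composite(composite, project, backproject, blend):
    """Sketch of the patented pipeline (hypothetical helper functions assumed):
    project()     maps the composite texture on the 3D model to a 2D image,
    backproject() maps a 2D image back onto the composite's texels,
    blend()       is any 2D blending operator (e.g. multiband blending)."""
    unblended_2d = project(composite)           # project the composite image
    blended_2d = blend(unblended_2d)            # blend textures in 2D
    unblended_3d = backproject(unblended_2d)    # backproject both images
    blended_3d = backproject(blended_2d)
    diff = blended_3d - unblended_3d            # per-pixel difference
    return composite + diff                     # apply difference to composite
```

The key design point the abstract describes is that blending happens in the projected 2D domain, but only the resulting *change* is transferred back to the composite texture, so unprojected texels are left untouched.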
Representative Claims
1. A method, performed on one or more processors, for blending textures of a composite image formed by a plurality of source images mapped onto a three dimensional model, comprising: projecting the composite image to obtain a two dimensional, unblended, projected image having textures; blending the textures of the unblended projected image to obtain a blended projected image; backprojecting the unblended projected image and the blended projected image onto the three dimensional model; determining a difference between a pixel of the backprojected, blended image and a corresponding pixel of the backprojected, unblended image; and applying the determined difference to a further corresponding pixel of the composite image to obtain a modified composite image representing a blending of the plurality of source images in the composite image.

2. The method of claim 1, wherein the projecting comprises projecting an oblique composite image.

3. The method of claim 1, wherein the projecting comprises orthorectifying the composite image to obtain an unblended orthorectified image.

4. The method of claim 1, wherein the blending comprises blending the textures using a multiband blending technique.

5. The method of claim 1, further comprising tagging each pixel of the composite image with an ID corresponding to the source image associated with the pixel.

6. The method of claim 1, further comprising dividing the unblended projected image into a plurality of tiles.

7. The method of claim 6, further comprising overlapping each of the plurality of tiles in at least one of X and Y dimensions.

8. The method of claim 6, wherein the textures within each of the plurality of tiles are blended individually.

9. The method of claim 8, wherein the blending the textures comprises blending each of the plurality of tiles together to obtain the blended projected image.

10. The method of claim 1, wherein the determining a difference comprises determining an additive change between a pixel of the backprojected, blended image and a corresponding pixel of the backprojected, unblended image.

11. The method of claim 1, wherein the determining a difference comprises determining a multiplicative change between a pixel of the backprojected, blended image and a corresponding pixel of the backprojected, unblended image.

12. A system for blending textures of a composite image formed by a plurality of source images mapped onto a three dimensional model, comprising: an image projector configured to project the composite image to obtain a two dimensional, unblended projected image having textures; a blending engine configured to blend the textures of the unblended projected image to obtain a blended projected image; a backprojecting engine configured to backproject the unblended projected image and the blended projected image onto the three dimensional model; and a pixel processing unit configured to: determine a difference between a pixel of the backprojected, blended image and a corresponding pixel of the backprojected, unblended image, and apply the difference to a further corresponding pixel of the composite image to obtain a modified composite image representing a blending of the plurality of source images in the composite image.

13. The system of claim 12, wherein the image projector is configured to orthorectify the composite image to obtain an unblended orthorectified image.

14. The system of claim 12, wherein the plurality of source images illustrate content of the model at different perspectives.

15. The system of claim 12, wherein the plurality of source images illustrate content of the model at various brightness levels.

16. The system of claim 12, wherein the plurality of source images illustrate content of the model at various exposure settings.

17. The system of claim 12, wherein the blending engine is configured to blend the textures using a multiband blending technique.

18. The system of claim 12, wherein the composite image is derived from a digital surface model (DSM).

19. The system of claim 12, wherein the blending engine is further configured to separately blend tiled regions of the unblended projected image.

20. An apparatus comprising at least one non-transitory computer readable storage medium encoding instructions thereon that, in response to execution by a computing device, cause the computing device to perform operations comprising: projecting the composite image formed by a plurality of the source images mapped onto a three dimensional model to obtain a two dimensional, unblended projected image having textures; blending the textures of the unblended projected image to obtain a blended projected image; backprojecting the unblended projected image and the blended projected image onto the three dimensional model; determining a difference between a pixel of the backprojected, blended image and a corresponding pixel of the backprojected, unblended image; and applying the determined difference to a further corresponding pixel of the composite image to obtain a modified composite image representing a blending of the plurality of source images in the composite image.

21. The apparatus of claim 20, wherein the plurality of source images illustrate content of the model at different perspectives.

22. The apparatus of claim 20, wherein the plurality of source images illustrate content of the model at various brightness levels.

23. The apparatus of claim 20, wherein the plurality of source images illustrate content of the model at various exposure settings.

24. The apparatus of claim 20, wherein the composite image is derived from a digital surface model (DSM).
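Claims 10 and 11 distinguish two ways of expressing the blended-versus-unblended change: as an additive offset or as a multiplicative gain. A minimal numeric sketch of the distinction, with made-up pixel values (the patent does not specify any particular numbers):

```python
import numpy as np

# Backprojected pixel values at corresponding locations (illustrative only).
unblended = np.array([100.0, 200.0])   # backprojected, unblended image
blended   = np.array([110.0, 180.0])   # backprojected, blended image

additive = blended - unblended          # claim 10: additive change
multiplicative = blended / unblended    # claim 11: multiplicative change

# Applying the determined change to the further corresponding composite pixel.
composite_pixel = np.array([90.0, 210.0])
mod_add = composite_pixel + additive            # additive application
mod_mul = composite_pixel * multiplicative      # multiplicative application
```

A multiplicative change preserves relative brightness ratios (useful when source images differ in exposure, as in claims 15-16 and 22-23), while an additive change preserves absolute offsets.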
Patents cited by this patent (9)
Yalin Xiong, Blending arbitrary overlaying images into panoramas.