| Field | Value |
|---|---|
| Country / Type | United States (US), granted patent |
| IPC (7th edition) | |
| Application number | US-0327651 (2006-01-09) |
| Registration number | US-7430312 (2008-09-30) |
| Inventor / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation information | Cited by: 391; Patents cited: 12 |
According to a general aspect, processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
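The reconstruction described above ultimately rests on standard stereo triangulation: once the first-pair correspondence is established between the two IR-filtered views, the per-pixel disparity yields depth. The sketch below is a minimal illustration of that relationship for a rectified stereo pair; the function name and parameters are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def disparity_to_depth(disparity, focal_length_px, baseline_m):
    """Triangulate depth from stereo disparity for a rectified camera pair.

    Simplified pinhole model (an assumption, not the patent's method):
    Z = f * B / d, where f is the focal length in pixels, B the baseline
    between the two cameras, and d the per-pixel disparity in pixels.
    Pixels with no disparity (d <= 0) are reported as infinitely far.
    """
    disparity = np.asarray(disparity, dtype=float)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth

# Example: 600 px focal length, 10 cm baseline, 30 px disparity -> 2.0 m depth
depth = disparity_to_depth(np.array([[30.0]]), 600.0, 0.10)
```

The epipolar geometry of a rectified pair is also why the claims below can restrict the correspondence search to matching horizontal lines in the two images.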
What is claimed is:

1. A computer-implemented method comprising: projecting an infra-red pattern onto a three-dimensional object; producing a first image of the three-dimensional object while the pattern is projected on the three-dimensional object, the first image (i) including the three-dimensional object and the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing at a first camera light filtered through an infra-red filter; producing a second image of the three-dimensional object while the pattern is projected on the three-dimensional object, the second image (i) including the three-dimensional object and the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing at a second camera light filtered through an infra-red filter, and the first and second cameras arranged as a first stereo pair having a known physical relationship; identifying, based on a deformation of the pattern, a depth discontinuity occurring at a junction of surfaces of the three-dimensional object having different depths; establishing a first-pair correspondence between a portion of the pixels of the first image and a portion of the pixels of the second image; controlling, based on the identified depth discontinuity, a disparity propagation of the portion of the pixels of the first image and the portion of the pixels of the second image established in the first-pair correspondence; producing a third image of the three-dimensional object while the pattern is projected on the three-dimensional object, the third image (i) including the three-dimensional object but not the pattern and (ii) being a two-dimensional digital image including pixels; and constructing, based on the disparity propagation of the portion of the pixels of the first image and the portion of the pixels of the second image established in the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.

2. The method of claim 1 wherein projecting the infra-red pattern comprises projecting a non-random infra-red pattern.

3. The method of claim 1 wherein establishing the first-pair correspondence comprises: determining a correspondence between an initial pixel in the first image and a corresponding pixel in the second image; and determining a correspondence between additional pixels in the first image and corresponding pixels in the second image, based on the correspondence between the initial pixel in the first image and its corresponding pixel in the second image.

4. The method of claim 1 wherein establishing the first-pair correspondence comprises: determining a correspondence between a first initial pixel located on a first particular horizontal line in the first image and a first corresponding pixel that corresponds to the first initial pixel, wherein the first corresponding pixel is located on the first particular horizontal line in the second image; determining a correspondence between additional pixels located on the first particular horizontal line in the first image and corresponding pixels that correspond to the additional pixels, wherein the corresponding pixels are located on the first particular horizontal line in the second image; determining a correspondence between a second initial pixel located on a second particular horizontal line in the first image and a second corresponding pixel that corresponds to the second initial pixel, wherein the second corresponding pixel is located on the second particular horizontal line in the second image; and determining a correspondence between additional pixels located on the second particular horizontal line in the first image and corresponding pixels that correspond to the additional pixels, wherein the corresponding pixels are located on the second particular horizontal line in the second image.

5. The method of claim 1, wherein the first and second images include horizontal lines, and wherein establishing the first-pair correspondence comprises: determining a correspondence between an initial pixel in each horizontal line in the first image and a corresponding pixel in each horizontal line in the second image; and determining a correspondence between additional pixels in the first image and corresponding pixels in the second image, based on the correspondence between the initial pixel in each horizontal line in the first image and its corresponding pixel in each horizontal line in the second image.

6. The method of claim 1 wherein the pattern comprises vertical stripes.

7. The method of claim 5 wherein a first initial pixel is a centroid pattern pixel calculated from pixels associated with the pattern in a first of the horizontal lines.

8. The method of claim 4, wherein determining the correspondence for at least one of the additional pixels located on the second particular horizontal line in the first image is based on the correspondence determined for at least one other pixel located in the second particular horizontal line.

9. The method of claim 4, wherein determining the correspondence for at least one of the additional pixels located on the second particular horizontal line in the first image is based on the correspondence determined for at least one pixel located in the first particular horizontal line.

10. The method of claim 9 wherein the at least one pixel located in the first particular horizontal line is in a common stripe edge with the at least one of the additional pixels located in the second particular horizontal line.

11. The method of claim 1 wherein constructing the two-dimensional image that depicts the three-dimensional construction comprises: forming a first set of three-dimensional points based on the first-pair correspondence; and producing a first three-dimensional surface model based on the first set of three-dimensional points.
12. The method of claim 1 further comprising: producing a fourth image of the three-dimensional object while the pattern is projected on the three-dimensional object, the fourth image being a two-dimensional digital image including pixels and being produced by capturing at a fourth camera light filtered through an infra-red filter; producing a fifth image of the three-dimensional object while the pattern is projected on the three-dimensional object, the fifth image being a two-dimensional digital image including pixels and being produced by capturing at a fifth camera light filtered through an infra-red filter, the fourth and fifth cameras arranged as a second stereo pair having a known physical relationship; and establishing a second-pair correspondence between a portion of the pixels of the fourth image and a portion of the pixels of the fifth image, wherein constructing the two-dimensional image that depicts the three-dimensional construction of the three-dimensional object is further based on the second-pair correspondence.

13. The method of claim 12 wherein constructing the two-dimensional image that depicts the three-dimensional construction comprises: producing a first three-dimensional surface model based on the first-pair correspondence; producing a second three-dimensional surface model based on the second-pair correspondence; and registering the first and second three-dimensional surface models.

14. The method of claim 13 wherein registering comprises: determining a common surface in the first and the second three-dimensional surface models; producing an initial estimate for a registration matrix based on the common surface; and determining the closest points between the first and the second three-dimensional surface models based on the initial estimate for a registration matrix.

15. The method of claim 13 wherein: producing the first three-dimensional surface model comprises: forming a first set of three-dimensional points based on the first-pair correspondence; and producing the first three-dimensional surface model based on the first set of three-dimensional points, and producing the second three-dimensional surface model comprises: forming a second set of three-dimensional points based on the second-pair correspondence; and producing the second three-dimensional surface model based on the second set of three-dimensional points.

16. The method of claim 13 further comprising: integrating, after the registering, the first and second three-dimensional surface models to produce an integrated three-dimensional surface model; and providing texture to the integrated three-dimensional surface model.

17. The method of claim 1 wherein producing the third image comprises producing the third image by capturing non-filtered light at a third camera.

18. The method of claim 17 wherein the third camera is a texture camera.

19. A system comprising: a first stereo camera pair including a first camera coupled to a second camera; a second stereo camera pair including a third camera coupled to a fourth camera; a set of four infra-red filters, with a separate one of the four infra-red filters operatively coupled to each of four cameras; a projector; and a computer readable medium coupled to each of the four cameras and to the projector, and including instructions for performing at least the following: projecting an infra-red pattern from the projector onto a three-dimensional object; producing a first image of the three-dimensional object while the pattern is projected on the three-dimensional object, the first image (i) including the three-dimensional object and the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing at a first camera light filtered through an infra-red filter; producing a second image of the three-dimensional object while the pattern is projected on the three-dimensional object, the second image (i) including the three-dimensional object and the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing at a second camera light filtered through an infra-red filter, and the first and second cameras arranged as a first stereo pair having a known physical relationship; identifying, based on a deformation of the pattern, a depth discontinuity occurring at a junction of surfaces of the three-dimensional object having different depths; establishing a first-pair correspondence between a portion of the pixels of the first image and a portion of the pixels of the second image; controlling, based on the identified depth discontinuity, a disparity propagation of the portion of the pixels of the first image and the portion of the pixels of the second image established in the first-pair correspondence; producing a third image of the three-dimensional object while the pattern is projected on the three-dimensional object, the third image (i) including the three-dimensional object but not the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing light at a texture camera; and constructing, based on the disparity propagation of the portion of the pixels of the first image and the portion of the pixels of the second image established in the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.

20. The system of claim 19 wherein the projector includes a lighting source capable of producing light in the visible spectrum and in the infrared spectrum.

21. The system of claim 20 wherein the projector includes a fifth infra-red filter.

22. The system of claim 19 wherein the computer readable medium comprises one or more of a processing device and a storage device.

23. A computer readable medium including instructions for performing at least the following: accessing a first image captured of a three-dimensional object, the first image having been captured while a pattern is projected on the three-dimensional object, the first image (i) including the three-dimensional object and the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing at a first camera light filtered through an infra-red filter; accessing a second image captured of the three-dimensional object, the second image having been captured while the pattern is projected on the three-dimensional object, the second image (i) including the three-dimensional object and the pattern, (ii) being a two-dimensional digital image including pixels, and (iii) being produced by capturing at a second camera light filtered through an infra-red filter; identifying, based on a deformation of the pattern, a depth discontinuity occurring at a junction of surfaces of the three-dimensional object having different depths; establishing a first-pair correspondence between a portion of the pixels of the first image and a portion of the pixels of the second image, the first-pair correspondence being established based on the first and second cameras having been arranged as a first stereo pair having a known physical relationship while the first and second images were captured; controlling, based on the identified depth discontinuity, a disparity propagation of the portion of the pixels of the first image and the portion of the pixels of the second image established in the first-pair correspondence; accessing a third image captured of the three-dimensional object, the third image having been captured while the pattern is projected on the three-dimensional object, the third image (i) including the three-dimensional object but not the pattern, and (ii) being a two-dimensional digital image including pixels; and constructing, based on the disparity propagation of the portion of the pixels of the first image and the portion of the pixels of the second image established in the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.

24. The method of claim 1, wherein controlling the disparity propagation further comprises halting the disparity propagation at the identified depth discontinuity.

25. The method of claim 1, wherein the pattern comprises a stripe, and wherein the depth discontinuity is identified where the stripe breaks into several broken segments.

26. The method of claim 1, wherein the pattern comprises first and second stripes, and wherein the depth discontinuity is identified where the first and second stripes join each other.

27. A computer program product, tangibly embodied in a machine-readable medium, the computer program product comprising instructions that, when read by a machine, operate to cause data processing apparatus to: generate first and second images of an object illuminated with a pattern; establish a first-pair correspondence between initial matched pixels of the first and second image based on detecting the illuminated pattern in the first and second images; identify a depth discontinuity associated with the object based on a deformation of the illuminated pattern; control a disparity propagation of the initial matched pixels in two directions based on the identified depth discontinuity; and generate a three-dimensional reconstruction of the object based on the controlled disparity propagation.

28. A computer program product, tangibly embodied in a machine readable medium, the computer program product comprising instructions that, when read by a machine, operate to cause data processing apparatus to: receive first and second pairs of images in a sequence of paired images of an object illuminated with a pattern; and for each of the first and second pairs of images: establish a first-pair correspondence between initial matched pixels of the paired images based on detecting the illuminated pattern in the paired images, identify a depth discontinuity associated with the object based on a deformation of the illuminated pattern, control a disparity propagation of the initial matched pixels in two directions based on the identified depth discontinuity, and generate a three-dimensional reconstruction of the object based on the controlled disparity propagation.

29. A computer program product, tangibly embodied in a machine readable medium, the computer program product comprising instructions that, when read by a machine, operate to cause data processing apparatus to: receive first and second images of an object illuminated with a pattern of uncoded stripes; establish a first-pair correspondence between initial matched pixels of the paired images based on detecting the illuminated pattern in the paired images; identify a depth discontinuity associated with the object based on a deformation of the illuminated pattern; control a disparity propagation of the initial matched pixels in two directions based on the identified depth discontinuity; and generate a three-dimensional reconstruction of the object based on the controlled disparity propagation.
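The recurring element of the claims (claims 1, 24, and 27-29) is controlling, and in particular halting, disparity propagation at a detected depth discontinuity in two directions. The sketch below illustrates that idea on a single scanline under simplifying assumptions of my own: disparity is carried unchanged from the initial matched pixel (the real method would refine each additional match), and the discontinuity mask is assumed precomputed from stripe breaks or joins. It illustrates the concept only and is not the patented algorithm.

```python
import numpy as np

def propagate_disparity(seed_col, seed_disp, discontinuity_mask):
    """Propagate a seed pixel's disparity in both directions along one
    scanline, halting where a depth discontinuity was identified.

    discontinuity_mask[i] is True when a discontinuity lies between
    columns i-1 and i (e.g. from a broken or joined projected stripe).
    Unreached columns stay NaN, i.e. no correspondence is propagated
    across the discontinuity.
    """
    n = len(discontinuity_mask)
    disp = np.full(n, np.nan)
    disp[seed_col] = seed_disp
    # Propagate rightward until a discontinuity is crossed.
    for i in range(seed_col + 1, n):
        if discontinuity_mask[i]:
            break
        disp[i] = seed_disp
    # Propagate leftward symmetrically.
    for i in range(seed_col - 1, -1, -1):
        if discontinuity_mask[i + 1]:
            break
        disp[i] = seed_disp
    return disp
```

Halting at the mask is exactly the behavior of claim 24: pixels on opposite sides of a stripe break never share a propagated disparity, so surfaces at different depths are reconstructed independently.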