Multi-aperture depth map using blur kernels and edges
IPC Classification
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC):
H04N-007/18
G06T-007/60
H04N-013/00
H04N-005/222
H04N-009/07
G06T-005/00
G06T-007/571
H04N-005/33
H04N-013/02
Application Number: US-0162147 (2016-05-23)
Registration Number: US-9721357 (2017-08-01)
Inventor / Address: Wajs, Andrew
Applicant / Address: Dual Aperture International Co. Ltd.
Agent / Address: Fenwick & West LLP
Citation Information
Cited by: 0
Patents cited: 46
Abstract
The present disclosure overcomes the limitations of the prior art by using blurring of edges. For example, a first image may contain an edge and a second image may contain the same edge as the first image. The two images may be captured by imaging systems with blur characteristics that vary differently as a function of object depth. For example, a dual-aperture system may simultaneously capture a faster f-number visible image and a slower f-number infrared image. Depth information may be generated by comparing blurring of the same edge in the two images.
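The abstract's core idea can be sketched in one dimension: blur the same ideal step edge by two different amounts (standing in for the faster-f-number visible channel and the slower-f-number infrared channel), then search for the extra blur that maps the sharper edge onto the blurrier one. The Gaussian blur model and the specific blur widths below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def gaussian_blur(signal, sigma):
    """Blur a 1-D signal with a truncated, normalized Gaussian kernel."""
    radius = int(4 * sigma) + 1
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x ** 2 / (2 * sigma ** 2))
    kernel /= kernel.sum()
    padded = np.pad(signal, radius, mode="edge")  # replicate borders
    return np.convolve(padded, kernel, mode="valid")

# Ideal step edge in the scene.
edge = np.zeros(64)
edge[32:] = 1.0

# Assumed blur widths at some object depth: the visible channel (faster
# f-number) blurs more than the infrared channel (slower f-number).
edge_vis = gaussian_blur(edge, 3.0)
edge_ir = gaussian_blur(edge, 1.0)

# Relative blur: the extra Gaussian that maps the IR edge onto the visible
# edge. For Gaussians, sigmas add in quadrature, so the best candidate
# should be near sqrt(3^2 - 1^2) ~ 2.83.
candidates = np.linspace(0.1, 5.0, 50)
errors = [np.sum((gaussian_blur(edge_ir, s) - edge_vis) ** 2)
          for s in candidates]
relative_blur = candidates[int(np.argmin(errors))]
```

Because the two channels' blur diverges with object depth, a calibrated lookup from `relative_blur` to distance would complete the depth estimate; the patent performs this comparison per window rather than on the whole image.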
Representative Claims
1. A computer-implemented method for processing blurred image data, comprising: selecting a plurality of first windows from first image data associated with a first image of an object, the first image captured using a first imaging system; selecting a corresponding plurality of second windows from second image data associated with a second image of the object, wherein corresponding first and second windows contain a same edge in the object, the second image captured using a second imaging system, wherein a comparison of blurring by the first imaging system and blurring by the second imaging system varies as a function of object depth; for pairs of corresponding first and second windows, estimating the comparison of blurring by the first and second imaging systems based on blurring of the same edge in corresponding first and second windows, wherein estimating the comparison of blurring by the first and second imaging systems comprises: binarizing the same edge in corresponding first and second windows, and determining a blur kernel that approximates blurring of the same binarized edge in corresponding first and second windows, wherein different blur kernels correspond to different object depths; and generating depth information for the object based on said estimated comparisons comprises selecting the object depth that corresponds to the determined blur kernel.

2. The computer-implemented method of claim 1, wherein the comparison of blurring by the first imaging system and blurring by the second imaging system is a comparison of blur spot size of the first imaging system and blur spot size of the second imaging system.

3. The computer-implemented method of claim 1, wherein determining the blur kernel that approximates blurring of the same edge in corresponding first and second windows comprises blurring the edges using different blur kernels.

4. The computer-implemented method of claim 1, wherein determining the blur kernel that approximates blurring of the same edge in corresponding first and second windows comprises convolving the edges using different blur kernels.

5. The computer-implemented method of claim 1, wherein determining the blur kernel that approximates blurring of the same binarized edge in corresponding first and second windows comprises summing different blur kernels along the binarized edge.

6. The computer-implemented method of claim 5, wherein determining the blur kernel that approximates blurring of the same binarized edge in corresponding first and second windows comprises summing different blur kernels along the binarized edge, the blur kernel depending on an orientation of the edge.

7. The computer-implemented method of claim 1, wherein determining the blur kernel that approximates blurring of the same binarized edge in corresponding first and second windows comprises only summing mathematical operations.

8. The computer-implemented method of claim 1, wherein the windows include edges but do not include parallel edges that are spaced close enough to interfere with each other when blurred by the blur kernels.

9. The computer-implemented method of claim 1, wherein: estimating the comparison of blurring by the first and second imaging systems comprises, for each blur kernel from a bank of blur kernels, wherein each blur kernel corresponds to a different object depth and the bank of blur kernels spans a range of object depths: blurring the edge of the second window with the blur kernel; and comparing the blurred edge and the same edge of the corresponding first window; and generating depth information for the object comprises generating depth information for the object based on said comparisons.

10. The computer-implemented method of claim 9, wherein a size of the windows is different than a size of the blur kernels.

11. The computer-implemented method of claim 10, wherein a size of the windows is larger than a size of the blur kernels.

12. The computer-implemented method of claim 1, wherein estimating the comparison of blurring by the first and second imaging systems comprises phase matching the same edges in corresponding first and second windows.

13. The computer-implemented method of claim 12, wherein phase matching the same edges in corresponding first and second windows comprises: applying derivative operators to the same edges in corresponding first and second windows; and taking absolute values of the derivatives of the same edges in corresponding first and second windows.

14. The computer-implemented method of claim 1, wherein estimating the comparison of blurring by the first and second imaging systems comprises equating energy in the same edges in corresponding first and second windows.

15. The computer-implemented method of claim 14, wherein equating energy in the same edges in corresponding first and second windows comprises scaling amplitude of the same edges in corresponding first and second windows so that the amplitude-scaled edges have equal energy.

16. The computer-implemented method of claim 1, wherein the first imaging system is characterized by a first f-number and the second imaging system is characterized by a second f-number that is different than the first f-number.

17. The computer-implemented method of claim 1, wherein the first and second imaging systems are different parts of a dual-aperture imaging system, the first imaging system using a visible spectral band and characterized by a first f-number, and the second imaging system using an infrared spectral band and characterized by a second f-number that is slower than the first f-number.

18. A non-transitory computer-readable storage medium storing executable computer program instructions for processing blurred image data, the instructions executable by a processor and causing the processor to perform a method comprising: selecting a plurality of first windows from first image data associated with a first image of an object, the first image captured using a first imaging system; selecting a corresponding plurality of second windows from second image data associated with a second image of the object, wherein corresponding first and second windows contain a same edge in the object, the second image captured using a second imaging system, wherein a comparison of blurring by the first and second imaging systems varies as a function of object depth; for pairs of corresponding first and second windows, estimating the comparison of blurring by the first and second imaging systems based on blurring of the same edge in corresponding first and second windows, wherein estimating the comparison of blurring by the first and second imaging systems comprises: binarizing the same edge in corresponding first and second windows, and determining a blur kernel that approximates blurring of the same binarized edge in corresponding first and second windows, wherein different blur kernels correspond to different object depths; and generating depth information for the object based on said estimated comparisons comprises selecting the object depth that corresponds to the determined blur kernel.
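As a rough sketch of the kernel-bank search recited in claims 1 and 9: binarize the edge from the sharper window, blur it with each kernel from a bank in which every kernel corresponds to a candidate object depth, and select the depth whose kernel best reproduces the blur observed in the other window. The box-shaped kernels and the depth values below are invented for illustration (a real system would calibrate the bank per depth), and for simplicity only one window is binarized, whereas the claim binarizes both.

```python
import numpy as np

def binarize(window, threshold=0.5):
    """Claimed step: reduce an edge profile to a binary transition."""
    return (window >= threshold).astype(float)

def box_kernel(width):
    """Stand-in blur kernel; the width-per-depth mapping is hypothetical."""
    return np.ones(width) / width

# Bank of blur kernels, one per candidate object depth (depths in meters
# are illustrative assumptions).
bank = {0.5: box_kernel(9), 1.0: box_kernel(5), 2.0: box_kernel(3)}

def estimate_depth(first_window, second_window):
    """Pick the depth whose kernel maps the second edge onto the first."""
    edge = binarize(second_window)  # sharp, binarized edge
    best_depth, best_err = None, np.inf
    for depth, kernel in bank.items():
        blurred = np.convolve(edge, kernel, mode="same")
        err = np.sum((blurred - first_window) ** 2)
        if err < best_err:
            best_depth, best_err = depth, err
    return best_depth

# Demo: fabricate a first window as if the same edge were imaged at 1.0 m.
step = np.zeros(32)
step[16:] = 1.0
observed = np.convolve(step, box_kernel(5), mode="same")
depth = estimate_depth(observed, step)  # -> 1.0
```

In the claimed method this search runs per pair of corresponding windows, and the per-window depths are combined into depth information for the object; the window/kernel sizing constraints of claims 8, 10, and 11 exist so that neighboring edges do not contaminate each kernel comparison.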
Patents Cited by This Patent (46)
Olsen, Richard Ian; Sato, Darryl L.; Moller, Borden; Vitomirov, Olivera; Brady, Jeffrey A.; Gunawan, Ferry; Oten, Remzi; Sun, Feng Qing; Gates, James, "Apparatus for multiple camera devices and method of operating same."
Sabnis, Ram W.; Brewer, Terry L.; Nichols, Robert E.; Hays, Edith G.; Stroder, Michael D.; Yanagimoto, Akira (JP); Sone, Yasuhisa (JP); Watanabe, Yoshitane (JP); Ema, Kiyomi (JP), "High optical density ultra thin organic black matrix system."
Ono, Shuji, "Image capturing apparatus having a filter section disposed on periphery of a light passing section of a partial wavelength spectrum diaphragm section."
Subbarao, Muralidhara (Port Jefferson Station, NY), "Method and apparatus for determining the distances between surface-patches of a three-dimensional spatial scene and a ca."
Kinnard, Kenneth P. (Moorestown, NJ); Strong, Jr., Richard T. (Medford, NJ); Goldfarb, Samuel (Princeton, NJ); Tower, John R. (Medford, NJ), "Multichip imager with improved optical performance near the butt region."
Hamilton, Jr., John F. (Rochester, NY); Adams, Jr., James E. (Rochester, NY), "Particular pattern of pixels for a color filter array which is used to derive luminance and chrominance values."
Lyon, Richard F.; Merrill, Richard B., "Vertical color filter sensor group array that emulates a pattern of single-layer sensors with efficient use of each sensor group's sensors."