| Country / Type | United States (US) Patent, Granted |
|---|---|
| IPC (7th ed.) | |
| Application No. | US-0987768 (2018-05-23) |
| Registration No. | US-10225543 (2019-03-05) |
| Inventor / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation Info | Cited by: 0 / Patents cited: 316 |
Systems and methods for calibrating an array camera are disclosed. In accordance with embodiments of this invention, calibration includes capturing an image of a test pattern with the array camera such that each imaging component in the array captures an image of the test pattern. The image of the test pattern captured by a reference imaging component is then used to derive calibration information for the reference component. A corrected image of the test pattern for the reference component is then generated from the calibration information and the image of the test pattern captured by the reference imaging component. The corrected image is then used with the images captured by each of the associate imaging components associated with the reference component to generate calibration information for the associate imaging components.
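The calibration flow in the abstract can be sketched in a few lines of Python. This is a minimal illustration under strong simplifying assumptions, not the patented implementation: the test pattern is reduced to four ideal grid points, per-component distortion to a global translation, and the helper names (`estimate_geometric_correction`, `apply_correction`) and all numeric values are hypothetical.

```python
import numpy as np

def estimate_geometric_correction(observed_pts, ideal_pts):
    """Least-squares translation mapping observed grid points onto ideal ones.
    Real calibrations fit richer distortion models; a pure translation keeps
    the sketch self-contained."""
    return (ideal_pts - observed_pts).mean(axis=0)

def apply_correction(pts, correction):
    """Apply the estimated geometric correction to point locations."""
    return pts + correction

# Known test pattern: four ideal grid intersection points (hypothetical).
ideal = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

# Step 1: the reference component sees the pattern shifted by its distortion.
ref_observed = ideal + np.array([0.5, -0.3])
ref_corr = estimate_geometric_correction(ref_observed, ideal)
# Step 2: generate the "corrected test pattern image" for the reference.
ref_corrected = apply_correction(ref_observed, ref_corr)

# Step 3: an associate component sees the pattern shifted by its own
# distortion plus the expected parallax from the known baseline.
expected_parallax = np.array([2.0, 0.0])
assoc_observed = ideal + expected_parallax + np.array([-0.2, 0.4])
# Translate by the expected parallax, then compare against the corrected
# reference to isolate the associate component's own distortion.
assoc_corr = estimate_geometric_correction(
    assoc_observed - expected_parallax, ref_corrected)
```

With these toy inputs, the corrected reference grid lands back on the ideal pattern, and the associate correction recovers exactly the distortion injected into the associate view.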
1. A method for manufacturing an array camera device, the method comprising: assembling an array of cameras comprising a plurality of imaging components that capture images of a scene from different viewpoints, where the plurality of imaging components comprises: a set of one or more reference imaging components, each having a reference viewpoint; and a set of one or more associate imaging components; configuring the array of cameras to communicate with at least one processor; configuring the processor to communicate with at least one display; configuring the processor to communicate with at least one type of memory; and performing a calibration process for the array of cameras, where the calibration process comprises: capturing images of a test pattern using the array of cameras, where each of the plurality of imaging components captures an image from a particular viewpoint; generating scene independent geometric corrections for reference image data captured by a reference imaging component using test pattern image data captured by the reference imaging component and data describing the test pattern using the processor; generating a corrected test pattern image for the reference imaging component based on the scene independent geometric corrections for the reference image data and the image of the test pattern captured by the reference imaging component using the processor; and generating scene independent geometric corrections for associate image data captured by an associate imaging component using test pattern image data captured by the associate imaging component and data for the corrected test pattern image using the processor; and loading calibration information into the memory.

2. The method of claim 1, wherein the calibration information comprises: reference calibration information for the reference imaging component comprising the scene independent geometric corrections for the reference image data to account for distortions related to the mechanical construction of the reference imaging component and produce a corrected reference image; and associate calibration information for the associate imaging component comprising the scene independent geometric corrections for the associate image data that map locations of pixels in an image captured by the associate imaging component to corresponding pixel locations in the corrected reference image, where corresponding pixel locations represent the same point in a scene in the absence of disparity due to parallax.

3. The method of claim 2, wherein the calibration information further comprises colorimetric corrections or photometric corrections for image data captured by one or more imaging components of the plurality of imaging components.

4. The method of claim 2, further comprising loading a software application comprising machine readable instructions into the memory, where execution of the software application by the processor directs the processor to: capture images of a scene using the plurality of imaging components in the array of cameras, wherein the captured images comprise: an associate image captured by the associate imaging component; and a reference image captured by the reference imaging component; apply corrections to locations of pixels of the associate image using the associate calibration information; generate a depth map by measuring disparity due to parallax between pixels in the reference image and corrected pixels in the associate image; and synthesize an image using the generated depth map and at least some of the pixels from the captured image data.

5. The method of claim 4, wherein the execution of the software application by the processor further directs the processor to apply corrections to locations of pixels of the reference image using the reference calibration information; and wherein the depth map is generated by measuring disparity due to parallax between the corrected pixels in the reference image and the corrected pixels in the associate image.

6. The method of claim 1, wherein the test pattern is placed at a defined distance away from the array of cameras when the image of the test pattern is captured, and the distance is at least 70 percent of a hyperfocal distance of the array of cameras.

7. The method of claim 1, wherein the test pattern is placed at a defined distance away from the array of cameras when the image of the test pattern is captured, and the distance is at least 50 percent of a hyperfocal distance of the array of cameras.

8. The method of claim 1, wherein the test pattern includes a low-contrast slanted edge pattern.

9. The method of claim 8, wherein the test pattern includes a plurality of Macbeth Color Chart type patterns inset at different positions in the low-contrast slanted pattern.

10. The method of claim 1, further comprising performing at least one pass/fail test of the array of cameras based on captured images of the test pattern to verify proper image capture by the plurality of imaging components.

11. The method of claim 1, wherein generating scene independent geometric corrections for reference image data comprises: identifying reference intersection points in the image of the test pattern captured by the reference imaging component; determining uniformity characteristics of the reference imaging component from reference intersection points and the test pattern; and deriving parameters for the reference imaging component to compensate for low frequency aberrations in the image of the test pattern captured by the reference imaging component.

12. The method of claim 1, wherein generating scene independent geometric corrections for associate image data comprises: identifying associate intersection points in images of the test pattern captured by the associate imaging component; translating associate intersection points in accordance with an expected parallax shift for the associate imaging component relative to the reference imaging component; and deriving parameters for the associate imaging component to compensate for low frequency aberrations in the image of the test pattern captured by the associate imaging component by comparing translated associate intersection points to corresponding intersection points in the corrected test pattern image for the reference imaging component.

13. The method of claim 12, wherein the expected parallax shift for the associate imaging component is based upon at least one of the physical offset of the associate imaging component to the reference imaging component, the behavior of sensor optics in the associate imaging component, and a distance of the test pattern from the array of cameras.

14. The method of claim 1, further comprising generating colorimetric corrections or photometric corrections for image data captured by each imaging component in the array of cameras using test pattern image data captured by each imaging component using the processor; and storing the generated colorimetric corrections or photometric corrections in the memory.

15. The method of claim 1, wherein the array of cameras includes more than one reference imaging component.

16. The method of claim 1, wherein the processor includes a graphics processing unit.

17. The method of claim 1, wherein the scene independent geometric corrections for associate image data are represented by a grid that provides a geometric correction prescription for pixels of the associate imaging component.

18. The method of claim 1, wherein the plurality of imaging components are configured in a 5×5 array.

19. The method of claim 1, wherein at least one imaging component of the plurality of imaging components contains a lens stack array and at least one sensor element, where the sensor element is selected from the group of a traditional CIS (CMOS Image Sensor) pixel, a CCD (charge-coupled device) pixel, a high dynamic range sensor element, a multispectral sensor element, and a structure configured to generate an electrical signal indicative of light incident on the structure.
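Claims 4 and 5 measure disparity due to parallax between corrected pixel locations and turn it into a depth map. The sketch below uses the standard pinhole stereo relation depth = f·B/d, a textbook model rather than anything specified in the patent text; the pixel arrays, baseline, and focal length are all hypothetical values chosen for illustration.

```python
import numpy as np

def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Pinhole stereo relation: depth = focal_length * baseline / disparity.
    Assumes rectified (geometrically corrected) images so that disparity
    is a pure horizontal shift."""
    return focal_px * baseline_m / disparity_px

# Corrected x-coordinates of matching pixels in the reference and
# associate images (hypothetical values after applying calibration).
ref_x = np.array([100.0, 150.0, 200.0])
assoc_x = np.array([90.0, 145.0, 197.5])

# Disparity due to parallax between corrected pixels (claim 5).
disparity = ref_x - assoc_x

# Depth map entries for a 10 mm baseline and a 1000 px focal length.
depth = depth_from_disparity(disparity, baseline_m=0.01, focal_px=1000.0)
```

Larger disparities map to nearer scene points, which is why the closest match (10 px of shift) yields the smallest depth here.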
Copyright KISTI. All Rights Reserved.