[US Patent]
Vehicle 360° surround view system having corner placed cameras, and system and method for calibration thereof

Country/Type: United States (US) Patent, granted
IPC (7th edition): H04N-007/18; G06K-009/00; H04N-005/232; H04N-005/247; H04N-017/00; G06T-007/80; G06T-007/73
Application number: US-0683212 (2015-04-10)
Registration number: US-10089538 (2018-10-02)
Inventors: Molin, Hans M.; Gyori, Marton; Kuehnle, Andreas U.; Li, Zheng
Applicant: Bendix Commercial Vehicle Systems LLC
Agent: Tucker Ellis LLP
Citation information: cited by 0 patents; cites 13 patents
Abstract
Cameras having wide fields of view are placed at each of the front left, front right, rear left, and rear right corners of a nominally rectangular-shaped vehicle, thereby providing a continuous region of overlapping fields of view completely surrounding the vehicle, and enabling complete 360° stereoscopic vision detection around the vehicle. In an embodiment the regions of overlapping fields of view completely surround the vehicle. The cameras are first individually calibrated, then collectively calibrated considering errors in overlapping viewing areas to develop one or more calibration corrections according to an optimization method that adjusts imaging parameters including homography values for each of the cameras, to reduce the overall error in the 360° surround view image of the continuous region surrounding the vehicle.
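The collective calibration described above compares where each camera's homography places shared targets in an overlap area against their known ground positions, and reduces the disagreement. A minimal NumPy sketch of the overlap registration error is given below; the helper names (`project`, `registration_error`) and the use of explicit point correspondences are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def project(H, pts):
    """Map Nx2 ground-plane points through a 3x3 homography H."""
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def registration_error(H_a, H_b, shared_targets):
    """Mean distance between two cameras' projections of targets
    that both cameras see in an overlap area."""
    diff = project(H_a, shared_targets) - project(H_b, shared_targets)
    return float(np.mean(np.linalg.norm(diff, axis=1)))
```

With identical homographies the error is zero; a full system would measure this error in every overlap area around the vehicle and jointly adjust all homography values to shrink the total.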
Representative Claims
1. A method of calibrating an associated imaging system providing a bird's eye view of an area surrounding an associated vehicle, the method comprising: receiving first image data, the first image data being related by a first homography matrix to a first image of a first target area adjacent the associated vehicle; receiving second image data, the second image data being related by a second homography matrix to a second image of a second target area adjacent the associated vehicle, wherein a portion of the second target area overlaps a portion of the first target area at an overlap area; receiving first collective image data, the first collective image data being related by the first and second homography matrices to a composite image of a first plurality of target objects disposed in the overlap area; determining a composite registration error between the collective image data and the area surrounding the associated vehicle in accordance with a comparison between locations of images of the first plurality of target objects in the collective image data and known physical locations of the first plurality of target objects in the overlap area; and simultaneously modifying the first and second homography matrices as globally modified first and second homography matrices in accordance with the determined composite registration error to register the collective image data with the known physical locations of the plurality of target objects.

2. The method according to claim 1, wherein: the receiving the first image data comprises receiving the first image data from a first camera disposed on the associated vehicle, the first camera imaging at least one two-dimensional target object disposed in the overlap area; the receiving the second image data comprises receiving the second image data from a second camera disposed on the associated vehicle, the second camera imaging the at least one two-dimensional target object disposed in the overlap area; and the determining the composite registration error between the collective image data and the area surrounding the associated vehicle comprises determining a first paired registration error between the first image data and the second image data in accordance with a comparison between portions of the first and second image data representative of an image of the at least one two-dimensional target object disposed in the overlap area.

3. The method according to claim 2, wherein: the receiving the first image data from the first camera disposed on the associated vehicle by the first camera imaging the at least one two-dimensional target object disposed in the overlap area comprises the first camera imaging a non-periodic pattern disposed in the overlap area; the receiving the second image data from the second camera disposed on the associated vehicle by the second camera imaging the at least one two-dimensional target object disposed in the overlap area comprises the second camera imaging the non-periodic pattern disposed in the overlap area; and the determining the composite registration error between the collective image data and the area surrounding the associated vehicle comprises determining the first paired registration error between the first image data and the second image data in accordance with a comparison between portions of the first and second image data representative of an image of the non-periodic pattern disposed in the overlap area.

4. The method according to claim 2, wherein: the receiving the first image data from the first camera disposed on the associated vehicle by the first camera imaging the at least one two-dimensional target object disposed in the overlap area comprises the first camera imaging an object presenting a non-periodic two-dimensional arrayed color pattern disposed in the overlap area; the receiving the second image data from the second camera disposed on the associated vehicle comprises the second camera imaging the object presenting the non-periodic two-dimensional arrayed color pattern disposed in the overlap area; and the determining the composite registration error between the collective image data and the area surrounding the associated vehicle comprises determining the first paired registration error between the first image data and the second image data in accordance with a comparison between portions of the first and second image data representative of an image of the object presenting the non-periodic two-dimensional arrayed color pattern disposed in the overlap area.

5. The method according to claim 2, further comprising: calibrating a first distortion characteristic parameter of the first camera to determine a first distortion characteristic scaling parameter for optimizing a linearity of a lens of the first camera; and calibrating a second distortion characteristic parameter of the second camera to determine a second distortion characteristic scaling parameter for optimizing a linearity of a lens of the second camera.

6. The method according to claim 2, further comprising: determining the first homography matrix by: imaging an associated rectangular first calibration object in the first target area by the first camera to obtain a first calibration object image; and modifying selected parameters of the first homography matrix to minimize an amount of skew in the first calibration object image representative of the associated rectangular first calibration object disposed in the first target area; and determining the second homography matrix by: imaging an associated rectangular second calibration object in the second target area by the second camera to obtain a second calibration object image; and modifying selected parameters of the second homography matrix to minimize an amount of skew in the second calibration object image representative of the associated rectangular second calibration object disposed in the second target area.

7. An apparatus for calibrating an associated imaging system operatively coupled with an associated vehicle and providing a bird's eye view of an area surrounding the associated vehicle, the apparatus comprising: a communication interface operatively coupled with cameras of the associated imaging system and configured to communicate with the cameras; and a processor coupled with the communication interface, and configured to: receive first image data, the first image data being related by a first homography matrix to a first image of a first target area adjacent the associated vehicle; receive second image data, the second image data being related by a second homography matrix to a second image of a second target area adjacent the associated vehicle, wherein a portion of the second target area overlaps a portion of the first target area at an overlap area; receive first collective image data, the first collective image data being related by the first and second homography matrices to a composite image of a first plurality of target objects disposed in the overlap area; determine a composite registration error between the collective image data and the area surrounding the associated vehicle in accordance with a comparison between locations of images of the first plurality of target objects in the collective image data and known physical locations of the first plurality of target objects in the overlap area; and simultaneously modify the first and second homography matrices as globally modified first and second homography matrices in accordance with the determined composite registration error to register the collective image data with the known physical locations of the plurality of target objects.

8. The apparatus according to claim 7, wherein the processor is further configured to: receive the first image data from a first associated camera disposed on the associated vehicle, the first camera imaging at least one two-dimensional target object disposed in the overlap area; receive the second image data from a second associated camera disposed on the associated vehicle, the second camera imaging the at least one two-dimensional target object disposed in the overlap area; and determine a first paired registration error between the first image data and the second image data in accordance with a comparison between portions of the first and second image data representative of an image of the at least one two-dimensional target object disposed in the overlap area.

9. The apparatus according to claim 8, wherein the processor is further configured to: receive, from the associated first camera, an image of a non-periodic pattern disposed in the overlap area; receive, from the associated second camera, an image of the non-periodic pattern disposed in the overlap area; and determine the first paired registration error between the first image data and the second image data in accordance with a comparison between portions of the first and second image data representative of an image of the non-periodic pattern disposed in the overlap area.

10. The apparatus according to claim 8, wherein the processor is further configured to: receive, from the associated first camera, an image of an object presenting a non-periodic two-dimensional arrayed color pattern disposed in the overlap area; receive, from the associated second camera, an image of an object presenting the non-periodic two-dimensional arrayed color pattern disposed in the overlap area; and determine the first paired registration error between the first image data and the second image data in accordance with a comparison between portions of the first and second image data representative of an image of the object presenting the non-periodic two-dimensional arrayed color pattern disposed in the overlap area.

11. The apparatus according to claim 8, wherein the processor is further configured to: calibrate a first distortion characteristic parameter of the first camera to determine a first distortion characteristic scaling parameter for optimizing a linearity of a lens of the first camera; and calibrate a second distortion characteristic parameter of the second camera to determine a second distortion characteristic scaling parameter for optimizing a linearity of a lens of the second camera.

12. The apparatus according to claim 8, wherein: the processor is further configured to determine the first homography matrix by: imaging an associated rectangular first calibration object in the first target area by the first camera to obtain a first calibration object image; and modifying selected parameters of the first homography matrix to minimize an amount of skew in the first calibration object image representative of the associated rectangular first calibration object disposed in the first target area; and the processor is further configured to determine the second homography matrix by: imaging an associated rectangular second calibration object in the second target area by the second camera to obtain a second calibration object image; and modifying selected parameters of the second homography matrix to minimize an amount of skew in the second calibration object image representative of the associated rectangular second calibration object disposed in the second target area.

13. A method of calibrating an imaging system providing a bird's eye view of an area surrounding an associated vehicle, the method comprising: receiving first image data, the first image data being related by a first homography matrix to a first image of a first target area adjacent the associated vehicle; receiving second image data, the second image data being related by a second homography matrix to a second image of a second target area adjacent the associated vehicle, wherein a portion of the second target area overlaps a portion of the first target area at a first overlap area; receiving third image data, the third image data being related by a third homography matrix to a third image of a third target area adjacent the associated vehicle, wherein a portion of the third target area overlaps a portion of the second target area at a second overlap area; receiving fourth image data, the fourth image data being related by a fourth homography matrix to a fourth image of a fourth target area adjacent the associated vehicle, wherein a portion of the fourth target area overlaps a portion of the third target area at a third overlap area, and wherein a portion of the fourth target area overlaps a portion of the first target area at a fourth overlap area; receiving first collective image data, the first collective image data being related by the first, second, third, and fourth homography matrices to a composite image of a first plurality of target objects disposed in the first, second, third, and fourth overlap areas; determining a composite registration error between the collective image data and the area surrounding the associated vehicle in accordance with a comparison between locations of images of the first plurality of target objects in the collective image data and known physical locations of the first plurality of target objects in the first, second, third, and fourth overlap areas; and simultaneously modifying the first, second, third, and fourth homography matrices as globally modified first, second, third, and fourth homography matrices in accordance with the determined composite registration error to register the collective image data with the known physical locations of the plurality of target objects.

14. A surround view system generating a bird's eye view of an area adjacent an associated apparatus disposed in an environment, the surround view system comprising: a computer system comprising a processor; a non-transient memory operably coupled with the processor; and a plurality of cameras operatively coupled with the processor and memory, the plurality of cameras being disposed at selected positions on the associated apparatus, each of the plurality of cameras respectively having a field of view projected onto the area adjacent the associated apparatus, wherein each field of view overlaps at least one other field of view defining a continuous field of view overlap area completely surrounding the associated apparatus, wherein the processor is configured to calibrate the plurality of cameras by: determining a set of registration errors in respective images of each pair of the plurality of cameras having overlapping fields of view; determining an average value of the set of registration errors; and simultaneously adjusting homography matrix values of each camera of the set of cameras stored in the memory by the determined average value.

15. The surround view system according to claim 14, wherein: the plurality of cameras comprises a set of cameras disposed at a corresponding set of corners of an associated vehicle.

16. The surround view system according to claim 14, wherein: the plurality of cameras comprises at least one camera disposed at each corner of the associated vehicle.

17. A method of calibrating a plurality of cameras disposed in an environment wherein each of the plurality of cameras respectively has a field of view projected onto an area adjacent the camera, wherein each field of view overlaps at least one other field of view defining a continuous field of view overlap area completely surrounding a selected target area, the method comprising: determining a set of image registration errors in respective images of each pair of a plurality of cameras having overlapping fields of view; determining an average value of the set of registration errors; and simultaneously adjusting homography matrix values of each camera of the set of cameras by the determined average value.

18. The method of calibrating a plurality of cameras according to claim 17, wherein the determining the adjustment value based on the set of registration errors comprises determining the adjustment value as an average of the image registration errors in the respective images of each pair of a plurality of cameras.
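Claims 14 and 17 recite computing pairwise registration errors across overlapping fields of view, averaging them, and simultaneously adjusting every camera's homography by the determined average value. The following NumPy sketch shows one hedged reading of that step; the data layout, the `average_error_adjust` helper, and the translation-style nudge scaled by the average are illustrative assumptions rather than the claimed algorithm:

```python
import numpy as np

def project(H, pts):
    """Map Nx2 ground-plane points through a 3x3 homography H."""
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def average_error_adjust(homographies, overlaps, step=0.5):
    """homographies: list of 3x3 matrices, one per camera.
    overlaps: dict mapping a camera-index pair (i, j) to the Nx2
    known target points shared by both cameras' fields of view.
    Returns per-pair errors, their average, and all homographies
    nudged simultaneously by a correction scaled from the average."""
    errors = {}
    for (i, j), pts in overlaps.items():
        diff = project(homographies[i], pts) - project(homographies[j], pts)
        errors[(i, j)] = float(np.mean(np.linalg.norm(diff, axis=1)))
    avg = float(np.mean(list(errors.values())))
    adjusted = []
    for H in homographies:
        T = np.eye(3)
        T[:2, 2] -= step * avg  # same average-derived correction for every camera
        adjusted.append(T @ H)
    return errors, avg, adjusted
```

When the cameras already agree in every overlap area, the average error is zero and the homographies are left unchanged; in practice such a correction would be iterated until the average registration error falls below a tolerance.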
Patents cited by this patent (13)
Li, Youfu; Zhang, Beiwei, Auto-calibration method for a projector-camera system.