| Country / Type | United States (US) Patent, Granted |
|---|---|
| IPC (7th edition) | |
| Application No. | US-0679275 (2015-04-06) |
| Registration No. | US-9292969 (2016-03-22) |
| Inventors / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citations | Cited by: 0 / Cites: 288 patents |
Systems and methods of determining the volume and dimensions of a three-dimensional object using a dimensioning system are provided. The dimensioning system can include an image sensor; non-transitory, machine-readable storage; and a processor. The dimensioning system can select and fit a three-dimensional packaging wireframe model about each three-dimensional object located within a first point of view of the image sensor. Calibration is performed between the image sensors of the dimensioning system and those of the imaging system. Calibration may occur before run time, in a calibration mode or period; it may occur during a routine; and it may be automatically triggered on detection of a coupling between the dimensioning and imaging systems.
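The three calibration triggers described above (a dedicated calibration mode, periodic re-calibration during a routine, and automatic triggering when a coupling is detected) can be sketched as simple event-driven logic. This is a minimal illustration under assumed names, not the patented implementation; `DimensioningSystem`, `on_coupling_detected`, and `tick` are all hypothetical.

```python
# Hypothetical sketch of the calibration triggers in the abstract: calibrate
# in a dedicated mode, on a timer during a routine, or automatically when a
# coupling to the imaging system is detected.
import time


class DimensioningSystem:
    def __init__(self, recalibration_interval_s=600.0):
        self.extrinsics = None                 # unknown until calibrated
        self.coupled = False
        self.interval = recalibration_interval_s
        self._last_calibration = None

    def calibrate(self):
        # Placeholder: in the claimed system this captures a depth map and an
        # intensity image, receives imaging system image data, and solves for
        # extrinsic parameters from >= 8 matched feature points.
        self.extrinsics = ("R", "T")           # stand-in for the solved R, T
        self._last_calibration = time.monotonic()

    def on_coupling_detected(self):
        # Automatic trigger: calibrate as soon as the imaging system couples.
        self.coupled = True
        self.calibrate()

    def tick(self):
        # Timer trigger (cf. claim 5): re-calibrate at one or more intervals.
        if self.coupled and self._last_calibration is not None:
            if time.monotonic() - self._last_calibration >= self.interval:
                self.calibrate()


device = DimensioningSystem()
device.on_coupling_detected()  # coupling detected -> device calibrates itself
```

The interval value and the calibration payload are placeholders; only the triggering structure reflects the abstract.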
1. A dimensioning system device selectively couplable to an imaging system device that has at least one imaging system image sensor and at least one imaging system display coupled to present display images acquired by the at least one imaging system image sensor, the dimensioning system device comprising: at least one dimensioning system sensor, wherein, in a calibration mode, the dimensioning system device: captures a depth map; captures an intensity image; receives imaging system image data from the imaging system, the imaging system image data representative of an image captured via the at least one imaging system image sensor; and determines a number of extrinsic parameters that provide a correspondence between a three-dimensional point in each of the depth map and the intensity image as viewed by the dimensioning system sensor with a same three-dimensional point viewed by the at least one imaging system image sensor of the imaging system device, including: determines a number of points positioned on at least one feature in the depth map and the intensity image, including: determines a minimum of at least eight points positioned on at least one feature in the depth map and the intensity image, the at least one feature including an item other than a calibration target; and determines an equal number of respective points similarly positioned on the at least one feature in the imaging system image data.

2. The dimensioning system device of claim 1, wherein in a run-time mode the dimensioning system device: generates a packaging wireframe image to be presented by the at least one imaging system display of the imaging system, positioned to encompass an object in a display of the image represented by the imaging system image data concurrently presented with the packaging wireframe image by the at least one imaging system display as correlated spatially therewith based at least in part on the determined extrinsic parameters.

3. The dimensioning system device of claim 1, wherein the dimensioning system device selectively couples to the imaging system device wirelessly.

4. The dimensioning system device of claim 1, wherein responsive to the receipt of at least one input when in run-time mode, the dimensioning system device: captures a second depth map; captures a second intensity image; receives a second display image from the imaging system; and determines a number of extrinsic parameters that provide a correspondence between a three-dimensional point in each of the second depth map and the second intensity image as viewed by the dimensioning system sensor with a same three-dimensional point viewed by the at least one imaging system image sensor of the imaging system device.

5. The dimensioning system device of claim 1, comprising a timer configured to provide an output at one or more intervals, and wherein responsive to the at least one output by the timer, the dimensioning system device: captures a second depth map; captures a second intensity image; receives a second display image from the imaging system; and determines a number of extrinsic parameters that provide a correspondence between a three-dimensional point in each of the second depth map and the second intensity image as viewed by the dimensioning system sensor with a same three-dimensional point viewed by the at least one imaging system image sensor of the imaging system device.

6. The dimensioning system device of claim 1, comprising a depth lighting system to selectively illuminate an area within a field of view of the dimensioning system sensor, and wherein the dimensioning system device selectively illuminates the area contemporaneous with the capture of the depth map and the capture of the intensity image by the dimensioning system sensor.

7. The dimensioning system device of claim 6, wherein the depth lighting system provides a structured light pattern and/or a modulated light pattern.

8. The dimensioning system device of claim 6, comprising: an automatic exposure control system, wherein the automatic exposure control system adjusts an exposure of at least one of the depth map and the intensity image contemporaneous with the selective illumination of the area within the field of view of the dimensioning system sensor; and/or an automatic gain control system, wherein the automatic gain control system adjusts a gain of at least one of the depth map and the intensity image contemporaneous with the selective illumination of the area within the field of view of the dimensioning system sensor.

9. The dimensioning system device of claim 1, wherein in response to an input when in calibration mode, the dimensioning system device establishes a correspondence between a three-dimensional point in each of the depth map and the intensity image as viewed by the dimensioning system sensor with a same three-dimensional point viewed by the at least one imaging system image sensor of the imaging system device.

10. A method of operation of a dimensioning system device, the method comprising: detecting a selective coupling of an imaging system device to the dimensioning system device, the imaging system including at least one imaging system image sensor and at least one imaging system display; capturing by the dimensioning system device a depth map via the dimensioning system sensor; capturing by the dimensioning system device an intensity image via the dimensioning system sensor; receiving by the dimensioning system device imaging system image data, the imaging system image data representing an image captured by the at least one imaging system image sensor; and determining by the dimensioning system device a number of extrinsic parameters that provide a correspondence between a point in the depth map and intensity image as viewed by the dimensioning system sensor with a same point as viewed by the at least one imaging system image sensor of the imaging system device, including: determining by the dimensioning system device a minimum of at least eight points positioned on at least one feature in the depth map and the intensity image, the at least one feature including an item other than a calibration target; and determining by the dimensioning system device an equal number of respective points similarly positioned on the at least one feature in the imaging system image data.

11. The method of claim 10, wherein the dimensioning system device selectively couples to the imaging system device wirelessly.

12. The method of claim 10, comprising generating by the dimensioning system processor a packaging wireframe model for presentation by the at least one imaging system display of the imaging system, the packaging wireframe model positioned to encompass an object in the imaging system image data concurrently presented by the at least one imaging system display and spatially correlated therewith based at least in part on the determined extrinsic parameters.

13. The method of claim 10, wherein: the imaging system image sensor includes a known back focus value (fc); and determining by the dimensioning system processor a number of extrinsic parameters that provide a correspondence between a point in the depth map and intensity image as viewed by the dimensioning system sensor with a same point as viewed by the at least one imaging system image sensor of the imaging system device includes, for each of the minimum of at least eight points: determining by the dimensioning system device a point coordinate location (xi, yi) within the intensity image; determining by the dimensioning system device a point depth (zi) within the depth map for the determined point coordinate location (xi, yi); and determining by the dimensioning system device a respective point coordinate location (xc, yc) within the imaging system image data.

14. The method of claim 13, wherein determining by the dimensioning system processor a number of extrinsic parameters that provide a correspondence between a point in the depth map and intensity image as viewed by the dimensioning system sensor with a same point as viewed by the at least one imaging system image sensor of the imaging system device further includes: determining via the dimensioning system device all of the components of a three-by-three, nine-component rotation matrix (R11-33) and a three-element translation vector (T0-2) that relate the depth map and intensity image to the imaging system image data using the following equations:

xc = fc · (R11·xi + R12·yi + R13·zi + T0) / (R31·xi + R32·yi + R33·zi + T2)
yc = fc · (R21·xi + R22·yi + R23·zi + T1) / (R31·xi + R32·yi + R33·zi + T2)

15. The method of claim 10, wherein capturing by the dimensioning system device a depth map via the dimensioning system sensor includes selectively illuminating an area within a first field of view of the dimensioning system sensor with at least one of a structured light pattern or a modulated light pattern.

16. The method of claim 10, wherein determining by the dimensioning system processor a number of extrinsic parameters that provide a correspondence between a point in the depth map and intensity image as viewed by the dimensioning system sensor with a same point as viewed by the at least one imaging system image sensor of the imaging system device includes: receiving by the dimensioning system device an input identifying a point located on at least one feature in the depth map and the intensity image; and receiving by the dimensioning system device a second input identifying a point similarly located on the at least one feature in the imaging system image data.

17. A method of calibrating a dimensioning system device, the method comprising: determining that the dimensioning system device is communicably coupled to an imaging system including an imaging system display and an imaging system image sensor; receiving by the dimensioning system device a depth map and an intensity image of a scene provided by a dimensioning system sensor in the dimensioning system device, the depth map and the intensity image including a common plurality of three-dimensional points; receiving by the dimensioning system device imaging system image data from the imaging system image sensor, the imaging system image data comprising a plurality of visible points, at least a portion of the plurality of visible points corresponding to at least a portion of the plurality of three-dimensional points; determining via the dimensioning system device a number of points positioned on at least one feature appearing in the depth map and intensity image with a same number of points similarly positioned on the at least one feature appearing in the imaging system image data, the at least one feature not including a calibration target; and determining by the dimensioning system device a number of extrinsic parameters spatially relating each of the number of points in the depth map and the intensity image with each of the respective, same number of points in the imaging system image data.

18. The method of claim 17, comprising: identifying by the dimensioning system device at least one three-dimensional object in the depth map and the intensity image; generating by the dimensioning system device, based at least in part on the determined number of extrinsic parameters, a packaging wireframe for presentation on the imaging system display, the packaging wireframe substantially encompassing the at least one three-dimensional object; generating by the dimensioning system device a composite image signal including the three-dimensional wireframe substantially encompassing an image of the three-dimensional object appearing in the imaging system image data; and displaying the composite image signal on the imaging system display.

19. The method of claim 17, wherein: the imaging system image sensor includes a back focus value (fc); and determining a number of extrinsic parameters spatially relating each of the number of points in the depth map and the intensity image with each of the respective, same number of points in the imaging system image data includes: determining by the dimensioning system device at least eight points located on at least one feature in the depth map and the intensity image; determining by the dimensioning system device an equal number of respective points similarly located on the at least one feature in the imaging system image data; determining by the dimensioning system device, for each of the at least eight points, a point coordinate location (xi, yi) within the intensity image; determining by the dimensioning system device, for each of the at least eight points, a point depth (zi) within the depth map for the determined point coordinate location (xi, yi); and determining by the dimensioning system device a respective point coordinate location (xc, yc) within the imaging system image data for each of the at least eight points.

20. The method of claim 19, wherein determining a number of extrinsic parameters spatially relating each of the number of points in the depth map and the intensity image with each of the respective, same number of points in the imaging system image data includes: determining via the dimensioning system device all of the components of a three-by-three, nine-component rotation matrix (R11-33) and a three-element translation vector (T0-2) that relate the depth map and intensity image to the imaging system image data using the following equations:

xc = fc · (R11·xi + R12·yi + R13·zi + T0) / (R31·xi + R32·yi + R33·zi + T2)
yc = fc · (R21·xi + R22·yi + R23·zi + T1) / (R31·xi + R32·yi + R33·zi + T2)
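The projection equations in claims 14 and 20 are the standard pinhole relation between a 3D point (xi, yi, zi) in the dimensioning sensor's frame and its 2D location (xc, yc) in the imaging system's image. With the back focus fc known and at least eight correspondences, the rotation matrix and translation vector can be recovered up to scale by cross-multiplying each equation into a homogeneous linear system and taking its null space via SVD, a DLT-style solve. The sketch below illustrates that math only; it is not the patented implementation, the function names are hypothetical, and the recovered R is not forced to be orthonormal.

```python
import numpy as np


def solve_extrinsics(pts3d, pts2d, fc):
    """DLT-style solve, up to scale, for R (3x3) and T (3,) from >= 8
    correspondences, per the claimed equations:
      xc = fc*(R11*xi + R12*yi + R13*zi + T0) / (R31*xi + R32*yi + R33*zi + T2)
      yc = fc*(R21*xi + R22*yi + R23*zi + T1) / (R31*xi + R32*yi + R33*zi + T2)
    """
    rows = []
    for (xi, yi, zi), (xc, yc) in zip(pts3d, pts2d):
        # Cross-multiplied form: each correspondence gives two rows linear in
        # the 12 unknowns [R11..R13, T0, R21..R23, T1, R31..R33, T2].
        rows.append([fc*xi, fc*yi, fc*zi, fc, 0, 0, 0, 0,
                     -xc*xi, -xc*yi, -xc*zi, -xc])
        rows.append([0, 0, 0, 0, fc*xi, fc*yi, fc*zi, fc,
                     -yc*xi, -yc*yi, -yc*zi, -yc])
    A = np.asarray(rows, dtype=float)
    # The null-space vector (smallest singular value) is the solution
    # up to an arbitrary scale factor.
    _, _, vt = np.linalg.svd(A)
    p = vt[-1].reshape(3, 4)
    return p[:, :3], p[:, 3]  # R, T


def project(pt3d, R, T, fc):
    """Apply the claimed equations: map a dimensioning-system 3D point into
    imaging-system image coordinates (xc, yc)."""
    cam = R @ np.asarray(pt3d, dtype=float) + T
    return fc * cam[0] / cam[2], fc * cam[1] / cam[2]
```

Because the equations are homogeneous, any nonzero rescaling (or sign flip) of (R, T) reproduces the same image points, so the natural validation is reprojection error on the matched points rather than comparing R and T element-wise.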
Copyright KISTI. All Rights Reserved.