| Country / Type | United States (US) Patent, Granted |
|---|---|
| IPC (7th edition) | |
| Application No. | US-0800757 (2015-07-16) |
| Registration No. | US-10060721 (2018-08-28) |
| Inventor / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation info | Times cited: 0; Patents cited: 377 |
Methods for dimensioning a 3D item are described. A FOV is mapped over three spatial dimensions, each of the three spatial dimensions oriented orthogonally in relation to each of the others and graduated according to a linear scale. The 3D item is scanned relative to the mapped FOV. Each of the 2D surfaces of the scanned 3D item is identified. A dimension is measured for each of the identified 2D surfaces of the scanned 3D item. A perspective-correct representation of the measured dimension is rendered, in real time or near real time, with respect to the measuring the dimension step, onto each of the identified 2D surfaces of the scanned 3D item.
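The abstract's measuring step can be illustrated with a minimal sketch. The patent does not disclose concrete math, so the following is an assumption-laden illustration: it takes the four corner points of one detected planar surface, expressed in the FOV's three orthogonal, linearly scaled axes, and returns the surface's side lengths and its unit normal (which the rendering step would use to orient the projected label). The function names and the rectangular-face assumption are hypothetical, not from the patent.

```python
import numpy as np

def surface_dimensions(corners):
    """Side lengths of a planar rectangular surface.

    `corners` is a 4x3 sequence of corner points, ordered around the
    perimeter, in the mapped FOV's orthogonal axes (e.g. millimetres).
    """
    corners = np.asarray(corners, dtype=float)
    width = np.linalg.norm(corners[1] - corners[0])
    height = np.linalg.norm(corners[2] - corners[1])
    return width, height

def surface_normal(corners):
    """Unit normal of the surface, usable to orient a rendered label."""
    corners = np.asarray(corners, dtype=float)
    n = np.cross(corners[1] - corners[0], corners[3] - corners[0])
    return n / np.linalg.norm(n)

# A 100 mm x 50 mm face lying in the x-y plane of the mapped FOV.
face = [(0, 0, 0), (100, 0, 0), (100, 50, 0), (0, 50, 0)]
w, h = surface_dimensions(face)   # 100.0, 50.0
n = surface_normal(face)          # points along +z
```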
1. A computer implemented method for dimensioning a three dimensional (3D) item, the method comprising the steps of: mapping a field of view (FOV) over three spatial dimensions, each of the three spatial dimensions oriented orthogonally in relation to each of the others and graduated according to a linear scale; scanning the 3D item relative to the mapped FOV; identifying each of a plurality of two dimensional (2D) surfaces of the scanned 3D item; measuring a dimension of each of the identified 2D surfaces of the scanned 3D item; and rendering a perspective-corrected representation of the measured dimension, in real time, with respect to the measuring the dimension step, onto each of the identified 2D surfaces of the scanned 3D item; wherein the identifying each of the 2D surfaces of the scanned 3D item comprises: detecting each of the 2D surfaces of the scanned 3D item; and orienting each of the detected 2D surface of the scanned 3D item in relation to the three spatial dimensions of the mapped FOV; and wherein the rendering the perspective-corrected representation of the measured dimension comprises: computing an incident angle and a normal angle, based on the orienting the detected 2D surface of the scanned 3D item in relation to the at least two spatial dimensions of the mapped FOV, for a projection of the rendering of the measured dimension onto the identified 2D surface of the scanned 3D item; computing a translation matrix for translating the projection of the rendering in an alignment with the computed normal angle; texture mapping a model of the scanned 3D item, the texture mapped model disposed within a virtual 3D space corresponding, and scaled in relation, to the mapped FOV, wherein the detected 2D surface of the scanned item is modeled within the 3D space based on the orienting step; and projecting the rendered representation of the measured dimension onto the identified 2D surface of the scanned 3D item based on the texture mapped model.

2. The method as described in claim 1, wherein the projecting the rendering of the measured dimension onto the identified 2D surface of the scanned 3D item based on the texture mapped model comprises: computing a projection of the texture mapped model from a perspective corresponding to the scanning and the rendering; and performing the projecting of the rendered representation based on the computed projection, wherein an illusion is created of a plurality of individual projections, each of which is rendered onto a corresponding 2D surface of the 3D item with a perspective that appears projected in alignment with the normal angle computed in relation thereto.

3. The method as described in claim 1, further comprising the steps of: delineating a location for a positioning of the 3D item for a performance of the scanning step; positioning the 3D item in the delineated position; and initiating the scanning step upon the positioning of the 3D item in the delineated position.

4. The method as described in claim 3, further comprising the step of detecting the positioning the 3D item in the delineated position, wherein the initiating of the scanning step is automatically performed upon the detecting of the positioning.

5. The method as described in claim 3 wherein the delineated position corresponds with a scale, which is operable for detecting a weight of the 3D item, the method further comprising detecting the weight of the 3D item wherein the rendering the perspective-corrected representation comprises rendering the detected weight, in real time with respect to the rendering of the measured dimension, onto at least one of the identified 2D surfaces of the 3D item.

6. The method as described in claim 5, further comprising the step of computing a volume of the 3D object based on the measured dimension of each of the identified 2D surfaces thereof, wherein the rendering the perspective-corrected representation comprises rendering the computed volume, in real time with respect to the rendering of the measured dimension, onto at least one of the identified 2D surfaces of the 3D item.

7. The method as described in claim 6, further comprising computing a fee relating to shipping the 3D item or storing the 3D item based on one or more of the measured dimension, the computed volume, or the detected weight of the 3D item.

8. The method as described in claim 1, further comprising the step of capturing an image of the scanned 3D item in real time with respect to the rendering the representation step, the captured image comprising the representation of the measured dimension rendered with the corrected perspective on each of the identified 2D surfaces of the scanned 3D item.

9. A system operable for dimensioning a three dimensional (3D) item, the system comprising: a dimensioner component operable for: mapping a field of view (FOV) over three spatial dimensions, each of the three spatial dimensions oriented orthogonally in relation to each of the others and graduated according to a linear scale; scanning the 3D item relative to the mapped FOV; identifying each of a plurality of two dimensional (2D) surfaces of the scanned 3D item; and measuring a dimension of each of the identified 2D surfaces of the scanned 3D item; and a projector component communicatively coupled to the dimensioner component and operable therewith for rendering a perspective-corrected representation of the measured dimension, in real time, with respect to the measuring the dimension step, onto each of the identified 2D surfaces of the scanned 3D item; wherein the dimensioner component is operable for the identifying each of the 2D surfaces of the scanned 3D item with a process comprising the steps of: detecting each of the 2D surfaces of the scanned 3D item; and orienting each of the detected 2D surface of the scanned 3D item in relation to the three spatial dimensions of the mapped FOV; and wherein the dimensioner component and the projector component are operable together for the rendering of the perspective-corrected representation of the measured dimension with a process comprising the steps of: computing an incident angle and a normal angle, based on the orienting the detected 2D surface of the scanned 3D item in relation to the at least two spatial dimensions of the mapped FOV, for a projection of the rendering of the measured dimension onto the identified 2D surface of the scanned 3D item; computing a translation matrix for translating the projection of the rendering in an alignment with the computed normal angle; texture mapping a model of the scanned 3D item, the texture mapped model disposed within a virtual 3D space corresponding, and scaled in relation, to the mapped FOV, wherein the detected 2D surface of the scanned item is modeled within the 3D space based on the orienting step; and projecting the rendered representation of the measured dimension onto the identified 2D surface of the scanned 3D item based on the texture mapped model.

10. The system as described in claim 9, wherein the projecting the rendering of the measured dimension onto the identified 2D surface of the scanned 3D item based on the texture mapped model step comprises: computing a projection of the texture mapped model from a perspective corresponding to the scanning and the rendering; and performing the projecting of the rendered representation based on the computed projection, wherein an illusion is created of a plurality of individual projections, each of which is rendered onto a corresponding 2D surface of the 3D item with a perspective that appears projected in alignment with the normal angle computed in relation thereto.

11. The system as described in claim 9, wherein the projector component is further operable for: delineating a location for a positioning of the 3D item for a performance of the scanning step; positioning the 3D item in the delineated position; and initiating the scanning step upon the positioning of the 3D item in the delineated position.

12. The system as described in claim 11, further comprising: a detector component operable for detecting the positioning the 3D item in the delineated position, wherein the initiating of the scanning is performed upon the detecting of the positioning; and a scale component operable for detecting a weight of the 3D item wherein a working surface of the scale is positioned at the delineated position, and wherein the rendering the perspective-corrected representation comprises rendering the detected weight, in real time with respect to the rendering of the measured dimension, onto at least one of the identified 2D surfaces of the 3D item.

13. The system as described in claim 12, wherein the dimensioner is further operable for computing a volume of the 3D object based on the measured dimension of each of the identified 2D surfaces thereof, wherein the projector component is further operable for rendering a representation of the computed volume, in real time with respect to the rendering of the measured dimension, onto at least one of the identified 2D surfaces of the 3D item.

14. The system as described in claim 13, further comprising a camera component operable for capturing an image of the scanned 3D item in real time with respect to the rendering the representation step, the captured image comprising the representation of the measured dimension rendered with the corrected perspective on each of the identified 2D surfaces of the scanned 3D item.

15. The system as described in claim 14, further comprising a processor component operable for computing a fee relating to shipping the 3D item or storing the 3D item based on one or more of the measured dimension, the computed volume, the detected weight, or the captured image of the 3D item.

16. A non-transitory computer readable storage medium comprising instructions operable for causing one or more processors to perform, execute, or control a process for imaging a three dimensional (3D) item, the process comprising the steps of: mapping a field of view (FOV) over three spatial dimensions, each of the three spatial dimensions oriented orthogonally in relation to each of the others and graduated according to a linear scale; scanning the 3D item relative to the mapped FOV; identifying each of a plurality of two dimensional (2D) surfaces of the scanned 3D item; measuring a dimension of each of the identified 2D surfaces of the scanned 3D item; and rendering a perspective-corrected representation of the measured dimension, in real time, with respect to the measuring the dimension step, onto each of the identified 2D surfaces of the scanned 3D item; wherein the identifying each of the 2D surfaces of the scanned 3D item comprises: detecting each of the 2D surfaces of the scanned 3D item; and orienting each of the detected 2D surface of the scanned 3D item in relation to the three spatial dimensions of the mapped FOV; and wherein the rendering the perspective-corrected representation of the measured dimension comprises: computing an incident angle and a normal angle, based on the orienting the detected 2D surface of the scanned 3D item in relation to the at least two spatial dimensions of the mapped FOV, for a projection of the rendering of the measured dimension onto the identified 2D surface of the scanned 3D item; computing a translation matrix for translating the projection of the rendering in an alignment with the computed normal angle; texture mapping a model of the scanned 3D item, the texture mapped model disposed within a virtual 3D space corresponding, and scaled in relation, to the mapped FOV, wherein the detected 2D surface of the scanned item is modeled within the 3D space based on the orienting step; and projecting the rendered representation of the measured dimension onto the identified 2D surface of the scanned 3D item based on the texture mapped model.
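The claims' "computing an incident angle and a normal angle ... computing a translation matrix ... in an alignment with the computed normal angle" can be sketched numerically. The patent text gives no formulas, so this is a minimal sketch under assumptions: the "translation matrix" is interpreted here as the rotation (Rodrigues' formula) that turns the projector's optical axis onto the surface normal, and the incident angle as the angle between that axis and the normal. All function names are hypothetical.

```python
import numpy as np

def incident_angle(projector_dir, surface_normal):
    """Angle (radians) between the projector's optical axis and the
    surface normal; 0 means the projection lands head-on."""
    d = np.asarray(projector_dir, dtype=float)
    n = np.asarray(surface_normal, dtype=float)
    d /= np.linalg.norm(d)
    n /= np.linalg.norm(n)
    return np.arccos(np.clip(np.dot(-d, n), -1.0, 1.0))

def align_rotation(projector_dir, surface_normal):
    """Rotation matrix turning the projector axis onto the surface
    normal (Rodrigues' formula), so the rendered dimension appears
    aligned with the computed normal angle."""
    d = -np.asarray(projector_dir, dtype=float)
    n = np.asarray(surface_normal, dtype=float)
    d /= np.linalg.norm(d)
    n /= np.linalg.norm(n)
    v = np.cross(d, n)          # rotation axis (unnormalized)
    c = np.dot(d, n)            # cosine of rotation angle
    if np.isclose(c, 1.0):      # already aligned: identity
        return np.eye(3)
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

# Projector looking straight down -z at a face tilted 45 degrees.
proj = (0.0, 0.0, -1.0)
normal = (0.0, np.sin(np.pi / 4), np.cos(np.pi / 4))
theta = incident_angle(proj, normal)   # ~ pi/4
R = align_rotation(proj, normal)       # rotates +z onto `normal`
```

A real dimensioner would compose this rotation with the projector's intrinsic calibration before texture-mapping the label, a step the sketch omits.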
Copyright KISTI. All Rights Reserved.