IPC Classification Information

Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): (not listed)
Application Number: US-0709803 (2010-02-22)
Registration Number: US-8508508 (2013-08-13)
Priority Information: NZ-524211 (2003-02-14)
Inventor / Address: (not listed)
Applicant / Address: (not listed)
Agent / Address: Kilpatrick Townsend & Stockton LLP
Citation Information: cited by 6 patents; cites 323 patents
Abstract

A coordinate detection system can comprise a display screen, a touch surface corresponding to the top of the display screen or a material positioned above the screen and defining a touch area, at least one camera outside the touch area and configured to capture an image of space above the touch surface, and a processor executing program code to identify whether an object interferes with the light from the light source projected through the touch surface based on the image captured by the at least one camera. The processor can be configured to carry out a calibration routine utilizing a single touch point in order to determine a plane corresponding to the touch surface by using mirror images of the features adjacent the touch surface, images of the features, and/or based on the touch point and a normal to the reflective plane defined by an image of the object and its mirror image.
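The single-touch calibration geometry described in the abstract can be sketched numerically. The coordinates below are illustrative placeholders (the record gives none), and the sketch assumes the object point and its mirror image have already been reconstructed in 3D: the segment joining a point to its reflection is normal to the reflective plane, so together with the touch point it fixes the plane of the touch surface.

```python
import numpy as np

# Hypothetical reconstructed 3D points (not from the patent record):
# t = the detected single touch point (lies on the touch surface)
# p = a point on the object above the surface
# m = the mirror image of p as reflected by the touch surface
t = np.array([0.5, 0.2, 0.0])
p = np.array([0.5, 0.2, 0.3])
m = np.array([0.5, 0.2, -0.3])

# The segment from p to its reflection m is perpendicular to the
# reflective (touch) plane, so it supplies the plane normal.
n = p - m
n = n / np.linalg.norm(n)

# Plane in point-normal form through the touch point: n . x + d = 0
d = -np.dot(n, t)

# Sanity check: the midpoint of p and m must also lie on the plane.
midpoint = (p + m) / 2.0
assert abs(np.dot(n, midpoint) + d) < 1e-9
print(n, d)
```

With these example points the recovered normal is the z-axis and the plane passes through the origin, matching the intuition that the touch surface here is the z = 0 plane.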
Representative Claims
1. A method for calibrating a coordinate detection system, the method comprising: capturing an image above a touch surface positioned on a body, the image captured by at least one camera positioned outside an area defined by the touch surface; and identifying, by a processor, a plane corresponding to the touch surface based on at least three points defining the plane, wherein identifying the plane comprises identifying at least one point of the plane by determining a touch point based on tracking an image of an object and a mirror image of the object as reflected by the touch surface.

2. The method set forth in claim 1, wherein the plane is identified using the identified touch point and data in the image corresponding to a point of the object and a mirror image of the point of the object, the point of the object and mirror image of the point of the object defining the plane normal.

3. The method set forth in claim 1, wherein at least one of the at least three points is the midpoint of a line between a point of the object and a mirror image of the point of the object, wherein the line is normal to the plane.

4. The method set forth in claim 1, wherein the plane is identified using the identified touch point and data in the image corresponding to at least two features of the body.

5. The method set forth in claim 4, wherein the at least two features each comprise a tab or protrusion in a bezel surrounding the touch surface.

6. The method set forth in claim 4, wherein the plane is identified using the identified touch point and data in the image corresponding to mirror images of each of the at least two features of the body.

7. The method set forth in claim 1, wherein the at least one camera is positioned on the body.

8. The method set forth in claim 4, wherein the at least one camera is positioned on a second body connected to the body by a plurality of hinges, the plurality of hinges corresponding to the at least two features.

9. The method set forth in claim 1, wherein the processor is configured to identify the plane periodically during operation of a position detection algorithm or during a calibration sequence prior to carrying out a position detection algorithm.

10. A coordinate detection system, comprising: a touch surface mounted to or corresponding to a body; at least one camera outside an area defined by the touch surface and configured to capture an image above the touch surface; an illumination system comprising a light source, the illumination system configured to project light from the light source through the touch surface; and a processor configured to execute program code and further configured to identify a plane corresponding to the touch surface based on at least three points defining the plane, wherein identifying the plane comprises identifying at least one point of the plane as a touch point by tracking an image of an object and a mirror image of the object as reflected by the touch surface.

11. The system set forth in claim 10, wherein the plane is identified using the identified touch point and data in the image corresponding to a point of the object and a mirror image of the point of the object, the point of the object and mirror image of the point of the object defining the plane normal.

12. The system set forth in claim 10, wherein at least one of the at least three points is the midpoint of a line between a point of the object and a mirror image of the point of the object, wherein the line is normal to the plane.

13. The system set forth in claim 10, wherein the plane is identified using the identified touch point and data in the image corresponding to at least two features of the body.

14. The system set forth in claim 13, wherein the at least one camera is positioned on a second body connected to the body by a plurality of hinges, the plurality of hinges corresponding to the at least two features.

15. The system set forth in claim 13, wherein the at least two features each comprise a tab or protrusion in a bezel surrounding the touch surface.

16. The system set forth in claim 10, wherein the touch surface corresponds to a display screen or a material positioned above the display screen.

17. The system set forth in claim 10, wherein the at least one camera is positioned on the body.

18. A computer program product comprising a tangible, non-transitory computer-readable medium embodying program code for calibrating a coordinate detection system, the program code comprising: code that configures a processing device to capture an image above a touch surface positioned on a body; and code that configures the processing device to identify a plane corresponding to the touch surface based on determining a touch point by detecting an image of an object and a mirror image of the object, the plane identified from the touch point and at least two additional points.

19. The computer program product set forth in claim 18, wherein the plane is identified using the touch point and data in the image corresponding to a point of the object and a mirror image of the point of the object, the point of the object and mirror image of the point of the object defining the plane normal.

20. The computer program product set forth in claim 18, wherein at least one of the at least three points is the midpoint of a line between a point of the object and a mirror image of the point of the object, wherein the line is normal to the plane.

21. The computer program product set forth in claim 18, wherein the plane is identified using the identified touch point, data indicating an expected location of at least two features of the body, and data in the image corresponding to images of the at least two features.

22. The computer program product set forth in claim 18, wherein the program code configures the processing device to identify the plane periodically during operation of a position detection algorithm or during a calibration sequence prior to carrying out a position detection algorithm.

23. The computer program product set forth in claim 18, wherein the program code configures the processing device to identify the plane by solving for a plane that includes the touch point and which also corresponds to a plane of reflection between the features of the body and mirror images thereof as detected.

24. The method of claim 1, wherein the plane comprises a surface including each of the three points that is parallel to the touch surface.

25. The method of claim 1, further comprising determining an orientation of the plane with respect to the at least one camera.

26. The method of claim 25, wherein the at least one camera is positioned in an additional plane different from the plane corresponding to the touch surface.
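As a rough illustration of the three-point construction running through claims 1, 3, and 23, the sketch below recovers the plane from the calibration touch point plus the midpoints between two hypothetical body features and their mirror images (the midpoint of a point and its reflection always lies on the reflective plane). All names and coordinates are invented for the example; a real system would reconstruct these points from the camera images.

```python
import numpy as np

# Hypothetical example points, all assumed already reconstructed in 3D.
touch = np.array([0.1, 0.1, 0.0])          # calibration touch point
features = [np.array([0.9, 0.1, 0.2]),     # two body features above the surface
            np.array([0.1, 0.9, 0.4])]
mirrors = [np.array([0.9, 0.1, -0.2]),     # their mirror images as reflected
           np.array([0.1, 0.9, -0.4])]     # by the touch surface

# Midpoint of each feature/mirror pair lies on the reflective plane.
mids = [(f + m) / 2.0 for f, m in zip(features, mirrors)]

# Plane through three points: the normal is the cross product of two
# in-plane vectors anchored at the touch point.
v1 = mids[0] - touch
v2 = mids[1] - touch
normal = np.cross(v1, v2)
normal = normal / np.linalg.norm(normal)
offset = -np.dot(normal, touch)            # plane: normal . x + offset = 0
print(normal, offset)
```

With more than two feature/mirror pairs, the same idea extends to a least-squares fit over all midpoints and the touch point, which is closer to "solving for" the plane of reflection as claim 23 phrases it.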