IPC Classification Information
Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC 7th ed.) |
Application No. | US-0838536 (2004-05-05)
Registration No. | US-7492357 (2009-02-17)
Inventors / Address | - Morrison, Gerald D.
- Holmgren, David E.
Applicant / Address |
Agent / Address | Katten Muchin Rosenman LLP
Citation Info | Times cited: 32 / Patents cited: 82
Abstract
An apparatus for detecting a pointer relative to a touch surface includes at least two spaced imaging assemblies having overlapping fields of view encompassing the touch surface. The imaging assemblies see the touch surface in three-dimensions as a perspective view. The imaging assemblies acquire overlapping images from different locations. A processor receives and processes image data generated by at least one of the imaging assemblies to determine the location of the pointer relative to the touch surface.
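The abstract describes determining the pointer's location by processing image data from two spaced imaging assemblies. The underlying geometry is triangulation: each calibrated camera reports a bearing to the pointer, and the two rays are intersected. The sketch below is a minimal illustration of that geometry, not the patent's implementation; the camera positions, bearing angles, and surface dimensions are assumed purely for the example.

```python
import math

def triangulate(cam1, angle1, cam2, angle2):
    """Intersect two bearing rays cast from known camera positions.

    cam1, cam2: (x, y) camera positions in touch-surface coordinates.
    angle1, angle2: bearing of the pointer from each camera, in radians.
    Returns the (x, y) intersection point, or None if the rays are parallel.
    """
    d1 = (math.cos(angle1), math.sin(angle1))  # unit direction of ray 1
    d2 = (math.cos(angle2), math.sin(angle2))  # unit direction of ray 2
    denom = d1[0] * d2[1] - d1[1] * d2[0]      # 2-D cross product
    if abs(denom) < 1e-9:
        return None                            # rays (nearly) parallel
    # Solve cam1 + t*d1 == cam2 + s*d2 for t, then evaluate the point.
    dx, dy = cam2[0] - cam1[0], cam2[1] - cam1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (cam1[0] + t * d1[0], cam1[1] + t * d1[1])

# Example: cameras at two top corners of a 100 x 80 touch surface,
# both sighting a pointer at (50, 40).
p = triangulate((0.0, 0.0), math.atan2(40, 50),
                (100.0, 0.0), math.atan2(40, -50))
print(p)  # approximately (50.0, 40.0)
```

In the patent the bearings come from calibrated images rather than being given directly, and claims 22 to 24 add a certainty value so that unreliable co-ordinate data can be ignored before this intersection step.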
Representative Claims
What is claimed is: 1. An apparatus for detecting a pointer relative to a touch surface comprising: at least two spaced imaging devices having overlapping fields of view, each of said imaging devices having a field of view that looks back at the touch surface such that said imaging devices see said touch surface in three-dimensions as a perspective view including the boundaries of said touch surface, said imaging devices acquiring overlapping images from different viewpoints, each imaging device comparing acquired images with a model describing the boundaries of said touch surface thereby to determine a subset of pixels in each acquired image and processing each pixel subset to generate image data; and processing structure processing image data generated by at least one of said imaging devices to determine the location of the pointer. 2. An apparatus according to claim 1 wherein each imaging device is calibrated to establish the relationship between points (X,Y,Z) in its perspective view and points (x,y) in acquired images, each imaging device generating pointer co-ordinate data when a pointer exists in an acquired image. 3. An apparatus according to claim 1 wherein said processing structure triangulates the image data to determine the location of the pointer relative to said touch surface. 4. An apparatus according to claim 3 including a pair of imaging devices. 5. An apparatus according to claim 4 wherein each imaging device is positioned adjacent a different corner of said touch surface. 6. An apparatus according to claim 5 wherein each imaging device is laterally spaced from and spaced in front of said touch surface. 7. An apparatus according to claim 6 wherein each imaging device is positioned relative to said touch surface so that at a minimum the entire periphery of the touch surface is within its field of view. 8. An apparatus according to claim 7 wherein each imaging device is spaced in front of said touch surface by a distance equal to at least 2.5 cm. 9. 
An apparatus according to claim 8 wherein said touch surface is bordered by a bezel. 10. An apparatus according to claim 7 wherein each imaging device is aimed so that the optical axis thereof bisects a corner of said touch surface. 11. An apparatus according to claim 2 wherein during calibration external orientation parameters (X0,Y0,Z0) and (ω,φ,κ) of each imaging device are determined where: (X0,Y0,Z0) is the spatial location of the optical center of the imaging device; and (ω,φ,κ) are the orientation angles of the optical axis of the imaging device with respect to the three-dimensional co-ordinate system of the touch surface. 12. An apparatus according to claim 11 wherein during calibration internal orientation parameters f,x0,y0 and Δx,Δy of each imaging device are also determined where: (Δx,Δy) represent distortion terms introduced due to the imperfect nature of a lens of the imaging device; f is the focal length of the imaging device; and (x0,y0) are the co-ordinates of the principal point of the imaging device. 13. An apparatus according to claim 12 wherein said relationship is expressed using collinearity equations as: where: (x,y) are the co-ordinates of a point in a captured image corresponding to a point (X,Y,Z) in the three-dimensional perspective view; and (R1,R2,R3) are terms depending on point (X,Y,Z), the spatial location (X0,Y0,Z0) and the orientation angles (ω,φ,κ). 14. An apparatus according to claim 13 wherein during calibration, calibration points (X,Y,Z) on said touch surface and image points (x,y) corresponding to said calibration points are measured, said collinearity equations being solved using said measured calibration and image points thereby to determine said external and internal orientation parameters. 15. An apparatus according to claim 14 wherein said collinearity equations are solved using a least-squares method. 16. 
An apparatus according to claim 14 wherein said calibration points are at spaced locations along the periphery of said touch surface. 17. An apparatus according to claim 16 wherein said calibration points are located at the corners and edge mid-points of said touch surface. 18. An apparatus according to claim 11 wherein said external orientation parameters are determined using a vanishing point method. 19. An apparatus according to claim 18 wherein the determined external orientation parameters are refined using a least-squares method. 20. An apparatus according to claim 18 wherein said external and internal orientation parameters are determined using planar homography. 21. An apparatus according to claim 18 wherein said external orientation parameters are determined using a three point method. 22. An apparatus according to claim 2 wherein each imaging device also generates a certainty value representing the degree of certainty that the imaging device has positively identified the pointer in the acquired image. 23. An apparatus according to claim 22 wherein said certainty value is used by said processing structure to determine pointer co-ordinate data to be used to determine the position of said pointer. 24. An apparatus according to claim 23 wherein said processing structure ignores pointer co-ordinate data generated by said imaging device when the certainty value associated therewith is below a threshold level. 25. An apparatus according to claim 24 including a pair of imaging devices. 26. An apparatus according to claim 25 wherein each imaging device is positioned adjacent a different corner of said touch surface. 27. An apparatus according to claim 26 wherein each imaging device is laterally spaced from and spaced in front of said touch surface. 28. An apparatus according to claim 27 wherein each imaging device is positioned relative to said touch surface so that at a minimum the entire periphery of the touch surface is within its field of view. 29. 
An apparatus according to claim 28 wherein each imaging device is aimed so that the optical axis thereof bisects the corner of said touch surface. 30. An apparatus according to claim 2 wherein said imaging devices communicate to assist in determining a pointer in acquired images. 31. An apparatus according to claim 30 wherein the imaging device that detects a pointer in its acquired image first communicates data to the other imaging device to assist that imaging device to detect the pointer in its acquired image. 32. An apparatus according to claim 31 wherein each imaging device also generates a certainty value representing the degree of certainty that the imaging device has positively identified the pointer in the acquired image. 33. An apparatus according to claim 32 wherein said certainty value is used by said processing structure to determine pointer co-ordinate data to be used to determine the position of said pointer. 34. An apparatus according to claim 33 wherein said processing structure ignores pointer co-ordinate data generated by said imaging device when the certainty value associated therewith is below a threshold level. 35. An apparatus according to claim 34 including a pair of imaging devices. 36. An apparatus according to claim 35 wherein each imaging device is positioned adjacent a different corner of said touch surface. 37. An apparatus according to claim 36 wherein each imaging device is laterally spaced from and spaced in front of said touch surface. 38. An apparatus according to claim 37 wherein each imaging device is positioned relative to said touch surface so that at a minimum the entire periphery of the touch surface is within its field of view. 39. An apparatus according to claim 38 wherein each imaging device is aimed so that the optical axis thereof bisects the corner of said touch surface. 40. 
An apparatus according to claim 14 wherein each imaging device also generates a certainty value representing the degree of certainty that the imaging device has positively identified the pointer in the acquired image. 41. An apparatus according to claim 40 wherein said certainty value is used by said processor to determine pointer co-ordinate data to be used to determine the position of said pointer. 42. An apparatus according to claim 41 wherein said processing structure ignores pointer co-ordinate data generated by said imaging device when the certainty value associated therewith is below a threshold level. 43. An apparatus according to claim 42 including a pair of imaging devices. 44. An apparatus according to claim 43 wherein each imaging device is positioned adjacent a different corner of said touch surface. 45. An apparatus according to claim 44 wherein each imaging device is laterally spaced from and spaced in front of said touch surface. 46. An apparatus according to claim 45 wherein each imaging device is positioned relative to said touch surface so that at a minimum the entire periphery of the touch surface is within its field of view. 47. An apparatus according to claim 46 wherein each imaging device is aimed so that the optical axis thereof bisects the corner of said touch surface. 48. An apparatus according to claim 2 wherein said imaging devices are portable. 49. An apparatus according to claim 48 wherein each imaging device includes a digital camera and a digital signal processor mounted within a housing, said digital signal processor processing image frames acquired by said digital camera to generate said pointer co-ordinate data. 50. An apparatus according to claim 48 wherein each imaging device includes a digital camera and a digital signal processor, both imaging devices being mounted within a single housing, said digital signal processor processing image data acquired by said digital camera to generate said pointer co-ordinate data. 51. 
An apparatus according to claim 3 including three or more imaging devices at spaced locations along said touch surface, each imaging device having a field of view encompassing a different portion of said touch surface. 52. An apparatus according to claim 51 wherein said imaging devices are arranged in pairs, with each pair of imaging devices viewing a different portion of said touch surface. 53. A camera-based touch system comprising: a generally rectangular passive touch surface on which contacts are made using a pointer; camera devices removably mounted adjacent at least two corners of said touch surface, each of said camera devices being disposed in front of the plane of the touch surface and having a field of view looking across and back towards said touch surface, the fields of view of said camera devices overlapping over said touch surface such that said camera devices see said touch surface and the boundaries thereof in perspective views, said camera devices acquiring images of said touch surface from different viewpoints, each camera device comparing each acquired image with a mathematical model describing the boundaries of said touch surface as seen by said camera device to determine a subset of relevant pixels of the acquired image corresponding generally to said touch surface and processing the subset of relevant pixels of each acquired image to generate image data; and a processor receiving and processing said image data to determine the location of said pointer relative to said touch surface via triangulation. 54. A touch system according to claim 53 wherein each camera device is calibrated to establish the relationship between points (X,Y,Z) in its perspective view and points (x,y) in acquired images, each camera device generating pointer co-ordinate data when a pointer is captured in an acquired image. 55. 
A touch system according to claim 54 wherein each camera device is spaced in front of said touch surface a sufficient distance so that at a minimum each camera device sees the four corners and sides of said touch surface with its field of view. 56. A touch system according to claim 55 wherein each camera device also generates a certainty value representing the degree of certainty that the camera device has positively identified the pointer in the acquired image. 57. A touch system according to claim 56 wherein said certainty value is used by said processor to determine pointer co-ordinate data to be used to determine the position of said pointer relative to said touch surface. 58. A touch system according to claim 57 wherein said processor ignores pointer co-ordinate data generated by said camera device when the certainty value associated therewith is below a threshold level. 59. An apparatus for detecting a pointer relative to a generally rectangular touch surface comprising: at least two spaced imaging devices having overlapping fields of view encompassing said touch surface, said imaging devices being spaced in front of said touch surface and looking back to see said touch surface in three-dimensions as a perspective view with the perspective view including at least the four corners and sides of said touch surface, said imaging devices acquiring overlapping images from different viewpoints, each imaging device comparing each captured image with a mathematical model describing the boundaries of the touch surface to determine a subset of relevant pixels within the captured image and processing the subset of relevant pixels in each captured image to generate image data, said relevant pixel subset encompassing said touch surface; and a processor receiving and processing image data generated by at least one of said imaging devices to determine the location of the pointer relative to said touch surface using triangulation. 60. 
An apparatus according to claim 59 wherein each imaging device is calibrated to establish the relationship between points (X,Y,Z) in its perspective view and points (x,y) in acquired images, each imaging device outputting pointer co-ordinate data when a pointer is captured in an acquired image. 61. An apparatus according to claim 60 wherein each imaging device is spaced in front of said touch surface a sufficient distance to inhibit its view of the entire touch surface from being obstructed. 62. An apparatus according to claim 61 wherein each imaging device also generates a certainty value representing the degree of certainty that the imaging device has positively identified the pointer in the acquired image. 63. An apparatus according to claim 62 wherein said certainty value is used by said processor to determine pointer co-ordinate data to be used to determine the position of said pointer relative to said touch surface. 64. An apparatus according to claim 63 wherein said processor ignores pointer co-ordinate data generated by said imaging device when the certainty value associated therewith is below a threshold level. 65. 
An apparatus for detecting a pointer relative to a touch surface comprising: at least two spaced imaging devices having overlapping fields of view, each of said imaging devices being in front of the touch surface and looking back at the touch surface such that said imaging devices see said touch surface in three-dimensions as a perspective view including the boundaries of said touch surface, said imaging devices acquiring overlapping images from different viewpoints; and processing structure processing image data generated by at least one of said imaging devices to determine the location of the pointer, wherein each imaging device is calibrated to establish the relationship between points (X,Y,Z) in its perspective view and points (x,y) in acquired images, each imaging device generating pointer co-ordinate data when a pointer exists in an acquired image and wherein during calibration external orientation parameters (X0,Y0,Z0) and (ω,φ,κ) of each imaging device are determined where: (X0,Y0,Z0) is the spatial location of the optical center of the imaging device; and (ω,φ,κ) are the orientation angles of the optical axis of the imaging device with respect to the three-dimensional co-ordinate system of the touch surface. 66. An apparatus according to claim 65 wherein during calibration internal orientation parameters f,x0,y0 and Δx,Δy of each imaging device are also determined where: (Δx,Δy) represent distortion terms introduced due to the imperfect nature of a lens of the imaging device; f is the focal length of the imaging device; and (x0,y0) are the co-ordinates of the principal point of the imaging device. 67. 
An apparatus according to claim 66 wherein said relationship is expressed using collinearity equations as: where: (x,y) are the co-ordinates of a point in a captured image corresponding to a point (X,Y,Z) in the three-dimensional perspective view; and (R1,R2,R3) are terms depending on point (X,Y,Z), the spatial location (X0,Y0,Z0) and the orientation angles (ω,φ,κ). 68. An apparatus according to claim 67 wherein during calibration, calibration points (X,Y,Z) on said touch surface and image points (x,y) corresponding to said calibration points are measured, said collinearity equations being solved using said measured calibration and image points thereby to determine said external and internal orientation parameters. 69. An apparatus according to claim 68 wherein said collinearity equations are solved using a least-squares method. 70. An apparatus according to claim 68 wherein said calibration points are at spaced locations along the periphery of said touch surface. 71. An apparatus according to claim 70 wherein said calibration points are located at the corners and edge mid-points of said touch surface. 72. An apparatus according to claim 65 wherein said external orientation parameters are determined using a vanishing point method. 73. An apparatus according to claim 72 wherein the determined external orientation parameters are refined using a least-squares method. 74. An apparatus according to claim 72 wherein said external and internal orientation parameters are determined using planar homography. 75. An apparatus according to claim 72 wherein said external orientation parameters are determined using a three point method.
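Claims 13 and 67 recite collinearity equations whose rendered form did not survive into this record ("expressed using collinearity equations as:" is followed immediately by the "where:" clause). For the reader's reference, the standard photogrammetric collinearity equations relating a surface point (X, Y, Z) to its image co-ordinates (x, y), in terms of the internal and external orientation parameters named in claims 11, 12, 65 and 66, take the following form. This is reproduced from standard photogrammetry, not recovered from the patent's equation images, so the patent's exact notation may differ.

```latex
x = x_0 - f\,\frac{R_1}{R_3} + \Delta x,
\qquad
y = y_0 - f\,\frac{R_2}{R_3} + \Delta y
```

Here R1, R2, and R3 are the three components of the rotation matrix M(ω, φ, κ) applied to the translated point (X − X0, Y − Y0, Z − Z0), which is consistent with the claims' statement that (R1, R2, R3) depend on the point (X, Y, Z), the spatial location (X0, Y0, Z0), and the orientation angles (ω, φ, κ). With the calibration points measured per claims 14 and 68, these equations are solved for the orientation parameters, e.g. by the least-squares method of claims 15 and 69.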