IPC Classification Information

Country/Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application No. | UP-0312983 (2001-07-05)
Registration No. | US-7692625 (2010-05-20)
International Application No. | PCT/CA2001/000980 (2001-07-05)
§371/§102 date | 2004-09-07
International Publication No. | WO02/003316 (2002-01-10)
Inventors / Address |
- Morrison, Gerald
- Holmgren, David
Applicant / Address |
Agent / Address | Katten Muchin Rosenman LLP
Citation Information | Cited by: 116 | Cited patents: 89
Abstract
A camera-based touch system (50) includes a passive touch surface (60) and at least two cameras (63) associated with the touch surface. The at least two cameras (63) have overlapping fields of view encompassing the touch surface. The at least two cameras (63) acquire images of the touch surface from different locations and generate image data. A processor (54) receives and processes the image data generated by the at least two cameras to determine the location of a pointer relative to the touch surface when the pointer is captured in images acquired by the at least two cameras. Actual pointer contact with the touch surface and pointer hover above the touch surface can be determined.
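The abstract's central step, locating a pointer by triangulating from cameras that view the touch surface from different positions, can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the function name and the convention that each camera reports an absolute in-plane viewing angle are assumptions.

```python
import math

def triangulate(cam1, angle1, cam2, angle2):
    """Return the (x, y) intersection of two rays, one per camera.

    cam1/cam2 are known camera positions on the touch-surface plane;
    angle1/angle2 are the absolute angles (in radians, measured in the
    same plane) at which each camera sees the pointer.
    """
    (x1, y1), (x2, y2) = cam1, cam2
    d1 = (math.cos(angle1), math.sin(angle1))  # unit direction of ray 1
    d2 = (math.cos(angle2), math.sin(angle2))  # unit direction of ray 2
    # Solve cam1 + t*d1 == cam2 + s*d2 for t via a 2x2 determinant.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # rays (nearly) parallel: no unique intersection
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])

# Cameras at two corners of a 100-unit-wide surface; a pointer at (50, 50)
# is seen at 45 degrees by one camera and at 135 degrees by the other.
pos = triangulate((0.0, 0.0), math.pi / 4, (100.0, 0.0), 3 * math.pi / 4)
```

With the example angles above, the two rays intersect at (50, 50) to within floating-point tolerance; in the patented system the per-camera angles would come from the median pixel locations described in the claims.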
Representative Claims
What is claimed is:
1. A camera-based touch system comprising: at least two image sensors associated with a touch surface and having overlapping fields of view encompassing said touch surface, said at least two image sensors acquiring images of said touch surface from different vantages; and processing structure processing pixel subsets of the images acquired by said at least two image sensors, pixel data of each pixel subset being processed to determine the existence of a pointer by calculating a median location of the pointer, said processing structure using the calculated median locations to triangulate the location of the pointer relative to said touch surface.
2. A touch system according to claim 1 wherein said at least two image sensors have fields of view looking generally along the plane of said touch surface.
3. A touch system according to claim 2 wherein pixel intensities of said pixel subsets are processed to determine the existence of the pointer.
4. A touch system according to claim 1 wherein each of said image sensors communicates with an associated digital signal processor, said digital signal processor receiving image output from said image sensor and executing a find pointer routine to determine the median location of said pointer.
5. A touch system according to claim 4 wherein during said find pointer routine, said digital signal processor builds a vertical intensity histogram including columns of pixel intensities representing differences between an acquired image and a background image, the column of said vertical intensity histogram having the largest pixel intensity above a threshold level being used to define the center of said pixel subset, the width of said pixel subset being defined by columns of said vertical intensity histogram on opposite sides of the column defining the center of said pixel subset that have pixel intensities greater than said threshold level.
6. A touch system according to claim 5 wherein the digital signal processor associated with each image sensor analyses the pixels in said pixel subset to locate the pixel row where the pointer tip is located.
7. A touch system according to claim 6 wherein the digital signal processor associated with each image sensor creates a binary mask in said pixel subset with pixels of one value representing said pointer and pixels of a different value representing background to enable said median location and pointer tip location to be calculated.
8. A touch system according to claim 4 wherein the digital signal processor associated with each image sensor further executes an update background image routine to update the background image after each image is acquired.
9. A touch system according to claim 8 wherein during said update background image routine, the digital signal processor associated with each image sensor uses the equation: B_{n+1}(i,j) = (1 − a)B_n(i,j) + aI(i,j), where: B_{n+1} is the new background image; B_n is the current background image; I is the current acquired image; i,j are the row and column coordinates of the background image pixels being updated; and a is a number between 0 and 1 that indicates the degree of learning that should be taken from the current acquired image I.
10. A touch system according to claim 8 wherein the digital signal processor associated with each image sensor further determines the differences between each acquired image and the background image to detect changing light conditions.
11. A touch system according to claim 10 wherein said processing structure adjusts the exposure of each image sensor.
12. A touch system according to claim 1 wherein said processing structure further executes a touch surface determination routine to calculate the orientation of the touch surface as seen by each image sensor.
13. A touch system according to claim 1 wherein said processing structure further calculates the velocity of a pointer as said pointer moves toward said touch surface within the fields of view of said image sensors.
14. A touch system according to claim 12 wherein said processing structure tracks said pointer within the fields of view of said image sensors.
15. A touch system according to claim 13 wherein said processing structure tracks said pointer using at least one Kalman filter.
16. A camera-based touch system comprising: a generally rectangular passive touch surface on which contacts are made using a pointer; at least two digital image sensors, each image sensor being mounted adjacent a different corner of said touch surface, said digital image sensors having overlapping fields of view encompassing said touch surface, said digital image sensors acquiring images looking generally across said touch surface; and processing structure receiving image data output by said digital image sensors, said processing structure processing the image data to determine the existence of a pointer in acquired images by calculating median locations x of the pointer and using the calculated median locations to triangulate the location of said pointer relative to said touch surface and to determine whether said pointer is in contact with said touch surface.
17. A touch system according to claim 16 wherein each of said image sensors communicates with an associated digital signal processor, said digital signal processor receiving image output from said image sensor and executing a find pointer routine to determine the median location x of said pointer.
18. A touch system according to claim 17 wherein during said find pointer routine, said digital signal processor of each digital camera builds a vertical intensity histogram including columns of pixel intensities representing differences between an acquired image and a background image, the column of said vertical intensity histogram having the largest pixel intensity above a threshold level being used to define the center of said pixel subset, the width of said pixel subset being defined by columns of said vertical intensity histogram on opposite sides of the column defining the center of said pixel subset that have pixel intensities greater than a threshold level.
19. A touch system according to claim 18 wherein the digital signal processor associated with each image sensor analyses the pixels in said pixel subset to locate the pixel row where the pointer tip is located and determine said pointer tip location z.
20. A touch system according to claim 19 wherein the digital signal processor associated with each image sensor creates a binary mask in said pixel subset with pixels of one value representing said pointer and pixels of a different value representing background to enable said median location x and pointer tip location z to be calculated.
21. A touch system according to claim 17 wherein the digital signal processor associated with each image sensor further executes an update background image routine to update the background image after each image is acquired.
22. A touch system according to claim 21 wherein during said update background image routine, the digital signal processor associated with each image sensor uses the equation: B_{n+1}(i,j) = (1 − a)B_n(i,j) + aI(i,j), where: B_{n+1} is the new background image; B_n is the current background image; I is the current acquired image; i,j are the row and column coordinates of the background image pixels being updated; and a is a number between 0 and 1 that indicates the degree of learning that should be taken from the current acquired image I.
23. A touch system according to claim 21 wherein the digital signal processor associated with each image sensor further determines the differences between each acquired image and the background image to detect changing light conditions.
24. A touch system according to claim 23 wherein said processing structure adjusts the exposure of each image sensor.
25. A touch system according to claim 16 wherein said processing structure further executes a touch surface determination routine to calculate the orientation of the touch surface as seen by each image sensor.
26. A touch system according to claim 16 wherein said processing structure further calculates the velocity of a pointer as said pointer moves toward said touch surface within the fields of view of said image sensors.
27. A touch system according to claim 26 wherein said processing structure tracks said pointer within the fields of view of said digital cameras.
28. A touch system according to claim 27 wherein said processing structure tracks said pointer using at least one Kalman filter.
29. A touch system according to claim 16 wherein said touch surface is bordered by a frame and wherein each of said image sensors is mounted to said frame via a frame assembly, each image sensor being oriented so that the field of view thereof looks downward and generally along the plane of said touch surface.
30. A touch system according to claim 16 further including a computer coupled to said processing structure, said computer receiving pointer location information from said processing structure and using said pointer location information to update an applications program executed thereby.
31. A touch system according to claim 30 wherein computer display information is presented on said touch surface.
32. A method of detecting the position of a pointer relative to a touch surface comprising the steps of: acquiring multiple overlapping images of a pointer proximate to said touch surface using image sensors having fields of view at least looking at said touch surface; processing at processing structure pixel subsets of said acquired images to determine the existence of a pointer by calculating median locations of the pointer; and using the calculated median locations to triangulate the location of said pointer relative to said touch surface.
33. The method of claim 32 wherein during said processing, the pixel data is processed to determine when said pointer is in contact with said touch surface and when said pointer is hovering above said touch surface.
34. The method of claim 33 wherein said processing further includes the step of tracking the pointer as the pointer approaches the touch surface.
35. A method of detecting the position of a pointer relative to a touch surface comprising the steps of: acquiring multiple overlapping images of a pointer proximate to said touch surface using cameras having fields of view at least looking at said touch surface and outputting from each camera a pixel subset of each image, each said pixel subset comprising pixel data from selected pixel rows of the acquired image; and receiving at processing structure the outputted pixel subsets of said acquired images and processing the pixel data of said pixel subsets to determine the location of said pointer relative to said touch surface using triangulation, wherein during said processing the existence of said pointer is determined by calculating median lines of the pointer and wherein the location of said pointer is determined by calculating the intersection point of the median lines and using triangulation to determine the coordinates of said intersection point.
36. The method of claim 35 wherein during said processing, the pixel data is processed to determine when said pointer is in contact with said touch surface and when said pointer is hovering above said touch surface.
37. The method of claim 36 wherein said processing further includes the step of tracking the pointer as the pointer approaches the touch surface.
38. A camera-based touch system comprising: at least two digital cameras associated with a touch surface and having overlapping fields of view encompassing said touch surface, said at least two digital cameras acquiring images of said touch surface from different vantages and for each acquired image, outputting a pixel subset of the acquired image, said pixel subset comprising pixel data from selected pixel rows of the acquired image; and processing structure receiving said pixel subsets of the acquired images generated by said at least two digital cameras, the pixel data of the pixel subsets being processed to generate pointer data when a pointer exists in said acquired images, said processing structure triangulating the pointer data to determine the location of a pointer relative to said touch surface, wherein each of said digital cameras includes an image sensor and a digital signal processor, said digital signal processor receiving image output from said image sensor and executing a find pointer routine to determine if a pointer is in each image acquired by said digital camera, and if so the median line x of said pointer, and wherein the digital signal processor of each digital camera further executes an update background image routine to update the background image after each image is acquired.
39. A touch system according to claim 38 wherein during said update background image routine, the digital signal processor of each digital camera uses the equation: B_{n+1}(i,j) = (1 − a)B_n(i,j) + aI(i,j), where: B_{n+1} is the new background image; B_n is the current background image; I is the current acquired image; i,j are the row and column coordinates of the background image pixels being updated; and a is a number between 0 and 1 that indicates the degree of learning that should be taken from the current acquired image I.
40. A touch system according to claim 38 wherein the digital signal processor of each digital camera further determines the differences between each acquired image and the background image to detect changing light conditions.
41. A touch system according to claim 40 wherein each digital camera transmits light condition information to said processing structure, said processing structure using said light condition information to adjust the exposure of each said digital camera.
42. A touch system according to claim 41 wherein the digital signal processor of each digital camera further executes a create packet routine to package said image data and light condition information into said packets.
43. A camera-based touch system comprising: at least two image sensors associated with a touch surface and having overlapping fields of view encompassing said touch surface, said at least two image sensors acquiring images of said touch surface from different vantages; and processing structure processing pixel subsets of the acquired images generated by said at least two image sensors, pixel data of the pixel subsets being processed to determine the existence of a pointer in said acquired images, when a pointer exists in said acquired images said processing structure determining the location of the pointer relative to said touch surface using triangulation, wherein during determination of the existence of the pointer, said processing structure builds a vertical intensity histogram including columns of pixel intensities representing differences between an acquired image and a background image, the column of said vertical intensity histogram having the largest pixel intensity above a threshold level being used to define the center of said pixel subset, the width of said pixel subset being defined by columns of said vertical intensity histogram on opposite sides of the column defining the center of said pixel subset that have pixel intensities greater than said threshold level.
44. A touch system according to claim 43 wherein said at least two image sensors have fields of view looking generally along the plane of said touch surface.
45. A touch system according to claim 43 wherein each image sensor has an associated digital signal processor, said digital signal processor updating the background image after each image is acquired.
46. A touch system according to claim 45 wherein during said update background image routine, the digital signal processor associated with each image sensor uses the equation: B_{n+1}(i,j) = (1 − a)B_n(i,j) + aI(i,j), where: B_{n+1} is the new background image; B_n is the current background image; I is the current acquired image; i,j are the row and column coordinates of the background image pixels being updated; and a is a number between 0 and 1 that indicates the degree of learning that should be taken from the current acquired image I.
47. A touch system according to claim 46 wherein the digital signal processor associated with each image sensor further determines the differences between each acquired image and the background image to detect changing light conditions.
48. A touch system according to claim 47 wherein said processing structure adjusts the exposure of each image sensor.
49. A touch system according to claim 43 wherein said processing structure further calculates the orientation of the touch surface as seen by each image sensor.
50. A touch system according to claim 49 wherein said processing structure further calculates the velocity of a pointer as said pointer moves toward said touch surface within the fields of view of said image sensors.
51. A touch system according to claim 50 wherein said processing structure tracks said pointer within the fields of view of said image sensors.
52. A touch system according to claim 51 wherein said processing structure tracks said pointer using at least one Kalman filter.
53. A touch system according to claim 43 wherein said touch surface is generally rectangular and wherein said at least two image sensors are positioned adjacent different corners of said touch surface.
54. A touch system according to claim 53 wherein said touch surface is bordered by a frame and wherein each image sensor is mounted to said frame, each image sensor being oriented so that the field of view thereof looks downward and generally along the plane of said touch surface.
55. A touch system according to claim 53 further including a computer coupled to said processing structure, said computer receiving pointer location information from said processing structure and using said pointer location information to update an applications program executed thereby.
56. A touch system according to claim 55 wherein computer display information is presented on said touch surface.
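Two of the routines recited in the claims are concrete enough to sketch: the vertical intensity histogram used by the find pointer routine (claims 5 and 18) and the running-average background update B_{n+1} = (1 − a)B_n + aI (claims 9, 22, 39 and 46). The following is a minimal illustration under assumed conventions, not the patent's actual DSP code: images are grayscale lists of pixel rows, and both function names are hypothetical.

```python
def update_background(background, image, a=0.1):
    """Running-average background update: B_{n+1} = (1 - a)*B_n + a*I."""
    return [[(1 - a) * b + a * i for b, i in zip(b_row, i_row)]
            for b_row, i_row in zip(background, image)]

def find_pointer(image, background, threshold):
    """Locate a pointer via a vertical intensity histogram.

    Sums |image - background| down each pixel column; the column with the
    largest sum above `threshold` defines the centre of the pixel subset,
    the subset width is bounded by the neighbouring columns still above
    `threshold`, and the median location is the intensity-weighted centre
    of that subset. Returns (median_column, (left, right)) or None.
    """
    n_cols = len(image[0])
    hist = [sum(abs(image[r][c] - background[r][c])
                for r in range(len(image)))
            for c in range(n_cols)]
    peak = max(range(n_cols), key=lambda c: hist[c])
    if hist[peak] <= threshold:
        return None  # no column differs enough from the background
    left, right = peak, peak
    while left > 0 and hist[left - 1] > threshold:
        left -= 1
    while right < n_cols - 1 and hist[right + 1] > threshold:
        right += 1
    total = sum(hist[left:right + 1])
    median = sum(c * hist[c] for c in range(left, right + 1)) / total
    return median, (left, right)
```

In this sketch each camera would run find_pointer per frame to produce a median column (the claims' median location x), feed that to the triangulation stage, and then call update_background so that slow lighting changes are absorbed into B at a rate set by a.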