IPC Classification Information
Country/Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application Number | US-0291584 (2002-11-12)
Priority Information | AU-0000559 (1999-05-25); AU-0001313 (1999-06-30); AU-0003632 (1999-10-25); AU-0004392 (1999-12-01)
Inventors / Address |
- Lapstun, Paul
- Silverbrook, Kia
Applicant / Address |
- Silverbrook Research Pty LTD
Citation Information | Times cited: 10; Cited patents: 8
Abstract
A method and system for enabling user interaction with computer software running in a computer system via an interface surface and a sensing device. The interface surface contains information relating to the computer software and coded data indicative of a drawing field. When placed in an operative position relative to the interface surface, the sensing device senses indicating data indicative of the drawing field and generates movement data indicative of the sensing device's movement relative to the interface surface. The indicating and movement data are received in the computer system from the sensing device. The drawing field is identified and then the computer software is operated at least partly in reliance on the movement data, and in accordance with instructions associated with the drawing field.
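The interaction flow in the abstract can be sketched in code: the computer system receives indicating data (which identifies a drawing field) and movement data from the sensing device, then operates the software according to the instructions associated with that field. This is a minimal illustrative sketch; all names here (IndicatingData, handle_pen_input, FIELD_INSTRUCTIONS, etc.) are assumptions for illustration, not identifiers from the patent.

```python
from dataclasses import dataclass


@dataclass
class IndicatingData:
    surface_id: int  # identity of the interface surface
    field_id: int    # drawing field the sensing device is over


@dataclass
class MovementData:
    points: list     # (x, y) samples of the device's path over the surface


# Hypothetical registry of instructions associated with each drawing field.
FIELD_INSTRUCTIONS = {
    7: lambda stroke: f"sketch captured with {len(stroke)} samples",
}


def handle_pen_input(ind: IndicatingData, mov: MovementData) -> str:
    """Identify the drawing field from the indicating data, then operate
    the software in reliance on the movement data and the field's
    associated instructions."""
    instruction = FIELD_INSTRUCTIONS[ind.field_id]
    return instruction(mov.points)


result = handle_pen_input(
    IndicatingData(surface_id=42, field_id=7),
    MovementData(points=[(0, 0), (1, 2), (2, 3)]),
)
```

The key design point the claims emphasize is that the surface itself carries the coded data, so the same dispatch logic works for any printed interface.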
Representative Claims
1. A method of enabling user interaction with computer software running in a computer system via: an interface surface containing information relating to the computer software and including coded data indicative of the position of a plurality of reference points on the interface surface; and a sensing device comprising: (a) an image sensor adapted to capture images of at least some of the coded data when the sensing device is placed in an operative position relative to the surface; and (b) a processor adapted to: (i) identify at least some of the coded data from at least one of the captured images; (ii) determine an orientation and position, within the captured images, of at least some of the coded data; (iii) decode at least some of the coded data; and (iv) generate, using at least some of the decoded coded data: indicating data indicative of a position of the sensing device relative to the interface surface; and at least one of movement data and position data, the movement data being indicative of the sensing device's movement relative to the interface surface, the position data being indicative of the sensing device's position relative to the interface surface; the method including the steps of, in the computer system: (a) receiving the indicating data from the sensing device; (b) receiving at least one of the movement data and the position data from the sensing device; (c) identifying a drawing field from the indicating data; (d) operating the computer software at least partly in reliance on at least one of the movement data and the position data, and in accordance with instructions associated with the drawing field; and (e) determining an identity of the interface surface from the indicating data.

2. The method according to claim 1, the method further including the steps of, in the computer system, associating at least one of the movement data and the position data with the drawing field.

3. The method according to claim 1, including the step of sending, in the computer system, data to the computer software indicative of at least the drawing field.

4. The method of claim 1, wherein the coded data is indicative of an identity of the interface surface.

5. The method according to claim 2, wherein the drawing field is associated with a visible drawing zone defined on the interface surface.

6. A method of enabling user interaction with computer software running in a computer system via: an interface surface containing information relating to the computer software and including coded data indicative of an identity of the interface surface; and a sensing device comprising: (a) an image sensor adapted to capture images of at least some of the coded data when the sensing device is placed in an operative position relative to the surface; and (b) a processor adapted to: (i) identify at least some of the coded data from one or more of the captured images; (ii) determine an orientation, within the captured images, of at least some of the coded data; (iii) decode at least some of the coded data; and (iv) generate, using at least some of the decoded coded data: indicating data indicative of the identity of the interface surface; and at least one of movement data and position data, the movement data being indicative of the sensing device's movement relative to the interface surface, the position data being indicative of the sensing device's position relative to the interface surface; the method including the steps of, in the computer system: (a) receiving the indicating data from the sensing device; (b) receiving at least one of the movement data and the position data from the sensing device; (c) performing written gesture recognition in relation to at least one of: at least some of the movement data; and at least some of the position data; (d) in the event that a written gesture is recognised, operating the computer system in accordance with instructions associated with the written gesture and the interface surface; and (e) determining an identity of the interface surface from the indicating data.

7. The method according to claim 6, wherein the selection gesture includes at least one of circumscribing and underlining at least some of the information.

8. The method according to claim 6, wherein position elements are disposed on the interface surface, the sensing device being configured to periodically sense position elements as it is used to draw the hand-drawn user input onto the surface, the method including the step of generating the movement data in the form of a locus of the sensing device in relation to the surface by ascertaining relative displacement of the sensing device over time with respect to at least one of the position elements.

9. The method of claim 1 or claim 6, wherein the processor is housed within the sensing device.

10. The method of claim 1 or claim 6, wherein the processor is adapted to operate at processing speeds ranging from 1 instruction per second to 1×10¹² instructions per second.

11. The method of claim 10, wherein the processor is adapted to operate at processing speeds ranging from 10 instructions per second to 1×10⁶ instructions per second.

12. The method of claim 1 or claim 6, wherein the image sensor is housed within the sensing device.

13. The method of claim 1 or claim 6, wherein the image sensor is adapted to capture images at rates ranging from 1 image per second to 1,000 images per second.

14. The method of claim 13, wherein the image sensor is adapted to capture images at rates ranging from 10 images per second to 200 images per second.

15. The method of claim 1 or claim 6, wherein the coded data comprises non-barcode coded data.
16. A system for enabling user interaction with computer software running in a computer system via: an interface surface containing information relating to the computer software and including coded data indicative of the position of a plurality of reference points on the interface surface; and a sensing device comprising: (a) an image sensor adapted to capture images of at least some of the coded data when the sensing device is placed in an operative position relative to the surface; and (b) a processor adapted to: (i) identify at least some of the coded data from at least one of the captured images; (ii) determine an orientation and position, within the captured images, of at least some of the coded data; (iii) decode at least some of the coded data; and (iv) generate, using at least some of the decoded coded data: indicating data indicative of the identity of the interface surface; and at least one of movement data and position data, the movement data being indicative of the sensing device's movement relative to the interface surface, the position data being indicative of the sensing device's position relative to the interface surface; the computer system being configured to: (a) receive the indicating data from the sensing device; (b) receive at least one of the movement data and the position data from the sensing device; (c) identify a drawing field from the indicating data; and (d) operate the computer software at least partly in reliance on at least one of the movement data and the position data, and in accordance with instructions associated with the drawing field.

17. The system according to claim 16, the computer system further being configured to associate at least one of the movement data and the position data with the drawing field.

18. The system according to claim 16, the computer system being configured to send data to the computer software indicative of at least the drawing field.

19. The system according to claim 17, wherein the drawing field is associated with a visible drawing zone defined on the interface surface.

20. A system for enabling user interaction with computer software running in a computer system via: an interface surface containing information relating to the computer software and including coded data indicative of an identity of the interface surface; and a sensing device comprising: (a) an image sensor adapted to capture images of at least some of the coded data when the sensing device is placed in an operative position relative to the surface; and (b) a processor adapted to: (i) identify at least some of the coded data from one or more of the captured images; (ii) determine an orientation, within the captured images, of at least some of the coded data; (iii) decode at least some of the coded data; and (iv) generate, using at least some of the decoded coded data: indicating data indicative of the identity of the interface surface; and at least one of movement data and position data, the movement data being indicative of the sensing device's movement relative to the interface surface, the position data being indicative of the sensing device's position relative to the interface surface; the computer system being configured to: (a) receive the indicating data from the sensing device; (b) receive at least one of the movement data and the position data from the sensing device; (c) perform written gesture recognition in relation to at least one of: at least some of the movement data; and at least some of the position data; and (d) in the event that a written gesture is recognized, operate the computer software in accordance with instructions associated with the written gesture and the interface surface.

21. The system according to claim 20, wherein the selection gesture includes circumscribing or underlining at least some of the information.

22. The system according to claim 20, wherein position elements are disposed on the interface surface, the sensing device being configured to periodically sense position elements as it is used to draw the hand-drawn user input onto the surface, the computer system being configured to generate the movement data in the form of a locus of the sensing device in relation to the surface by ascertaining relative displacement of the sensing device over time with respect to at least one of the position elements.

23. The system of claim 16 or claim 20, wherein the processor is housed within the sensing device.

24. The system of claim 16 or claim 20, wherein the processor is adapted to operate at processing speeds ranging from 1 instruction per second to 1×10¹² instructions per second.

25. The system of claim 24, wherein the processor is adapted to operate at processing speeds ranging from 10 instructions per second to 1×10⁶ instructions per second.

26. The system of claim 16 or claim 20, wherein the image sensor is housed within the sensing device.

27. The system of claim 16 or claim 20, wherein the image sensor is adapted to capture images at rates ranging from 1 image per second to 1,000 images per second.

28. The system of claim 27, wherein the image sensor is adapted to capture images at rates ranging from 10 images per second to 200 images per second.

29. The system of claim 16 or claim 20, wherein the coded data comprises non-barcode coded data.
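Claims 8 and 22 describe generating the movement data as a locus: the device periodically senses position elements and the system ascertains its relative displacement over time. A minimal sketch of that computation, assuming timestamped (t, x, y) readings taken against the position elements (the function name and data layout are illustrative, not from the patent):

```python
def locus_from_samples(samples):
    """samples: list of (t, x, y) readings sensed against position elements.

    Returns the per-step displacement vectors between successive readings,
    i.e. the locus of the sensing device in relation to the surface."""
    displacements = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        # relative displacement of the device between consecutive samples
        displacements.append((x1 - x0, y1 - y0))
    return displacements


# three periodic readings of the device's position over the surface
path = locus_from_samples([(0, 0, 0), (1, 3, 4), (2, 3, 9)])
```

Representing the stroke as relative displacements rather than absolute positions matches the claim's distinction between movement data and position data.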