IPC Classification Information

Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC 7th ed.): (none listed)
Application number: US-0807589 (2004-03-23)
Registration number: US-7301529 (2007-11-27)
Inventors / Address:
- Marvit, David L.
- Reinhardt, Albert H. M.
- Adler, B. Thomas
- Wilcox, Bruce A.
- Matsumoto, Hitoshi
Applicant / Address: (none listed)
Agent / Address: (none listed)
Citation information: Cited by 85 patents; cites 22 patents
Abstract
A motion controlled handheld device includes a display having a viewable surface and operable to generate an image and a gesture database maintaining a plurality of gestures. Each gesture is defined by a motion of the device with respect to a first position of the device. The device includes a plurality of applications each having a plurality of predefined commands and a gesture mapping database comprising a plurality of command maps. Each of the command maps corresponds to a particular one of the applications and maps each of the predefined commands to one of the gestures. The device includes a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device also includes a control module operable to load one of the applications, to select one of the command maps corresponding to the loaded application, to track movement of the handheld device using the motion detection module, to compare the tracked movement against the gestures to determine a matching gesture, to identify, using the selected command map, the predefined command mapped to the matching gesture, and to perform the identified command using the loaded application.
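The architecture described in the abstract can be illustrated with a short sketch: a gesture database shared across applications, one command map per application, and a control module that swaps in the map of the loaded application before dispatching a matched gesture. All names below (`GestureDevice`, the gesture labels, the application names) are illustrative stand-ins, not terms from the patent, and the equality comparison stands in for whatever motion-matching the device actually performs.

```python
# Minimal sketch of the gesture-to-command dispatch described in the abstract.
# All identifiers here are hypothetical; the patent does not specify an API.

class GestureDevice:
    def __init__(self, gesture_db, command_maps):
        self.gesture_db = gesture_db        # gesture name -> motion definition
        self.command_maps = command_maps    # application name -> {gesture name -> command}
        self.active_map = None

    def load_application(self, app_name):
        # Loading an application also selects (and replaces) the command map,
        # so the same gesture can mean different commands in different apps.
        self.active_map = self.command_maps[app_name]

    def handle_motion(self, tracked_motion):
        # Compare tracked movement against stored gestures, then translate the
        # matching gesture into the command mapped for the loaded application.
        for name, definition in self.gesture_db.items():
            if definition == tracked_motion:  # stand-in for real motion matching
                return self.active_map.get(name)
        return None

# Example: the same "flick_left" gesture maps to different commands per app.
db = {"flick_left": "motion:left", "flick_right": "motion:right"}
maps = {
    "photo_viewer": {"flick_left": "previous_photo", "flick_right": "next_photo"},
    "music_player": {"flick_left": "previous_track", "flick_right": "next_track"},
}
device = GestureDevice(db, maps)
device.load_application("photo_viewer")
print(device.handle_motion("motion:left"))   # previous_photo
device.load_application("music_player")
print(device.handle_motion("motion:left"))   # previous_track
```

Replacing the whole map on application load mirrors dependent claims 2-3, where a second loaded application's command map replaces the first and the same matching gesture resolves to a different predefined command.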
Representative Claims
What is claimed is:

1. A motion controlled handheld device comprising:
a display having a viewable surface and operable to generate an image;
a gesture database maintaining a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device;
a plurality of applications each having a plurality of predefined commands;
a gesture mapping database comprising a plurality of command maps, each of the command maps corresponding to a particular one of the applications and mapping each of the predefined commands to one of the gestures;
a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface;
a control module operable to load one of the applications, to select one of the command maps corresponding to the loaded application, to track movement of the handheld device using the motion detection module, to compare the tracked movement against the gestures to determine a matching gesture, to identify, using the selected command map, the predefined command mapped to the matching gesture, and to perform the identified command using the loaded application;
wherein a selected one of the applications has a first application state and a second application state;
wherein the command map associated with the selected application comprises a first mapping for the first application state and a second mapping for the second application state, the first mapping and the second mapping each mapping selected ones of the gestures to selected ones of the predefined commands of the selected application; and
wherein the first application state comprises viewing of a portion of an image of application data, and the second application state comprises viewing a hierarchical menu for performing operations with respect to the application data.

2.
The motion controlled handheld device of claim 1, wherein the control module is further operable to load a second one of the applications, to select a second one of the command maps corresponding to the second loaded application, and to replace the first selected command map with the second selected command map.

3. The motion controlled handheld device of claim 2, wherein the matching gesture maps to a first predefined command using the first selected command map and to a second predefined command using the second selected command map.

4. The motion controlled handheld device of claim 1, wherein the first application state is associated with a first image type and the second application state is associated with a second image type.

5. The motion controlled handheld device of claim 1, further comprising:
a first accelerometer operable to detect acceleration along a first axis;
a second accelerometer operable to detect acceleration along a second axis, the second axis perpendicular to the first axis; and
a third accelerometer operable to detect acceleration along a third axis, the third axis perpendicular to the first axis and perpendicular to the second axis; and
wherein:
the gesture database further defines each of the gestures using a sequence of accelerations;
the motion detection module is further operable to detect motion of the device using accelerations measured by the first accelerometer, the second accelerometer, and the third accelerometer; and
the control module is further operable to match the accelerations measured by the motion detection module against gesture definitions in the gesture database to identify particular ones of the gestures.

6.
A method for controlling a handheld device comprising:
generating an image on a viewable surface of the handheld device;
maintaining a gesture database comprising a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device;
maintaining a plurality of applications each having a plurality of predefined commands;
maintaining a gesture mapping database comprising a plurality of command maps, each of the command maps corresponding to a particular one of the applications and mapping each of the predefined commands to one of the gestures;
loading one of the applications;
selecting one of the command maps corresponding to the loaded application;
tracking movement of the handheld device using the motion detection module in relation to the viewable surface;
comparing the tracked movement against the gestures to determine a matching gesture;
identifying, using the selected command map, the predefined command mapped to the matching gesture;
performing the identified command using the loaded application;
wherein a selected one of the applications has a first application state and a second application state;
wherein the command map associated with the selected application comprises a first mapping for the first application state and a second mapping for the second application state, the first mapping and the second mapping each mapping selected ones of the gestures to selected ones of the predefined commands of the selected application; and
wherein the first application state comprises viewing of a portion of an image of application data, and the second application state comprises viewing a hierarchical menu for performing operations with respect to the application data.

7.
The method of claim 6, further comprising, in response to a received command:
loading a second one of the applications;
selecting a second one of the command maps corresponding to the second loaded application; and
replacing the first selected command map with the second selected command map.

8. The method of claim 7, wherein the matching gesture maps to a first predefined command using the first selected command map and to a second predefined command using the second selected command map.

9. The method of claim 6, wherein the first application state is associated with a first image type and the second application state is associated with a second image type.

10. The method of claim 6, wherein the gesture database further defines each of the gestures using a sequence of accelerations; the method further comprising:
detecting acceleration along a first axis;
detecting acceleration along a second axis, the second axis perpendicular to the first axis; and
detecting acceleration along a third axis, the third axis perpendicular to the first axis and perpendicular to the second axis;
detecting motion of the device using accelerations measured by the first accelerometer, the second accelerometer, and the third accelerometer; and
matching the accelerations against gesture definitions in the gesture database to identify potential indicated ones of the gestures.

11.
Logic for controlling a handheld device, the logic embodied as a computer program stored on a computer readable medium and operable when executed to perform the steps of:
generating an image on a viewable surface of the handheld device;
maintaining a gesture database comprising a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device;
maintaining a plurality of applications each having a plurality of predefined commands;
maintaining a gesture mapping database comprising a plurality of command maps, each of the command maps corresponding to a particular one of the applications and mapping each of the predefined commands to one of the gestures;
loading one of the applications;
selecting one of the command maps corresponding to the loaded application;
tracking movement of the handheld device using the motion detection module in relation to the viewable surface;
comparing the tracked movement against the gestures to determine a matching gesture;
identifying, using the selected command map, the predefined command mapped to the matching gesture;
performing the identified command using the loaded application;
wherein a selected one of the applications has a first application state and a second application state;
wherein the command map associated with the selected application comprises a first mapping for the first application state and a second mapping for the second application state, the first mapping and the second mapping each mapping selected ones of the gestures to selected ones of the predefined commands of the selected application; and
wherein the first application state comprises viewing of a portion of an image of application data, and the second application state comprises viewing a hierarchical menu for performing operations with respect to the application data.

12.
The logic of claim 11, further operable, in response to a received command, to:
load a second one of the applications;
select a second one of the command maps corresponding to the second loaded application; and
replace the first selected command map with the second selected command map.

13. The logic of claim 12, wherein the matching gesture maps to a first predefined command using the first selected command map and to a second predefined command using the second selected command map.

14. The logic of claim 11, wherein the first application state is associated with a first image type and the second application state is associated with a second image type.

15. The logic of claim 11, wherein the gesture database further defines each of the gestures using a sequence of accelerations; the logic further operable when executed to perform the steps of:
detecting acceleration along a first axis;
detecting acceleration along a second axis, the second axis perpendicular to the first axis; and
detecting acceleration along a third axis, the third axis perpendicular to the first axis and perpendicular to the second axis;
detecting motion of the device using accelerations measured by the first accelerometer, the second accelerometer, and the third accelerometer; and
matching the accelerations against gesture definitions in the gesture database to identify potential indicated ones of the gestures.

16.
A motion controlled handheld device comprising:
means for generating an image on a viewable surface of the handheld device;
means for maintaining a gesture database comprising a plurality of gestures, each gesture defined by a motion of the device with respect to a first position of the device;
means for maintaining a plurality of applications each having a plurality of predefined commands;
means for maintaining a gesture mapping database comprising a plurality of command maps, each of the command maps corresponding to a particular one of the applications and mapping each of the predefined commands to one of the gestures;
means for loading one of the applications;
means for selecting one of the command maps corresponding to the loaded application;
means for tracking movement of the handheld device using the motion detection module in relation to the viewable surface;
means for comparing the tracked movement against the gestures to determine a matching gesture;
means for identifying, using the selected command map, the predefined command mapped to the matching gesture;
means for performing the identified command using the loaded application;
wherein a selected one of the applications has a first application state and a second application state;
wherein the command map associated with the selected application comprises a first mapping for the first application state and a second mapping for the second application state, the first mapping and the second mapping each mapping selected ones of the gestures to selected ones of the predefined commands of the selected application; and
wherein the first application state comprises viewing of a portion of an image of application data, and the second application state comprises viewing a hierarchical menu for performing operations with respect to the application data.
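Claims 5, 10, and 15 describe gestures defined as sequences of accelerations from three mutually perpendicular accelerometers, matched against stored gesture definitions. The sketch below illustrates one way such matching could work; the distance metric, the tolerance threshold, and all names are assumptions for illustration, since the claims do not specify a matching algorithm.

```python
# Illustrative sketch of accelerometer-sequence gesture matching per claim 5:
# each gesture definition is a sequence of (x, y, z) acceleration samples,
# and a measured sequence is matched to the closest definition within a
# tolerance. The nearest-neighbor comparison is an assumption, not the
# patent's method.
import math

def sequence_distance(measured, definition):
    """Sum of Euclidean distances between paired 3-axis samples."""
    if len(measured) != len(definition):
        return math.inf  # sequences of different length cannot match here
    return sum(math.dist(m, d) for m, d in zip(measured, definition))

def match_gesture(measured, gesture_defs, tolerance=0.5):
    """Return the name of the closest gesture within tolerance, else None."""
    best_name, best_score = None, math.inf
    for name, definition in gesture_defs.items():
        score = sequence_distance(measured, definition)
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score <= tolerance else None

# Hypothetical gesture definitions along the x- and z-axes.
gesture_defs = {
    "shake_x": [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    "lift_z":  [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (0.0, 0.0, 0.5)],
}
# Noisy samples from the three perpendicular accelerometers.
samples = [(0.9, 0.1, 0.0), (-1.0, 0.0, 0.1), (1.1, 0.0, 0.0)]
print(match_gesture(samples, gesture_defs))   # shake_x
```

The tolerance check reflects that measured accelerations only approximate a stored definition; a sequence too far from every definition yields no matching gesture, so no command is dispatched.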