Asymmetric mapping for tactile and non-tactile user interfaces
IPC Classification
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition): G06F-003/033; G06F-003/01; G06F-003/0481
Application number: US-0778172 (2013-02-27)
Registration number: US-9229534 (2016-01-05)
Inventor / Address: Galor, Micha
Applicant / Address: APPLE INC.
Agent / Address: D. Kligler I.P Services Ltd
Citation information: times cited: 1; patents cited: 111
Abstract
A method, including receiving, by a computer, a sequence of signals indicating a motion of a hand of a user within a predefined area, and segmenting the area into multiple regions. Responsively to the signals, a region is identified in which the hand is located, and a mapping ratio is assigned to the motion of the hand based on a direction of the motion and the region in which the hand is located. Using the assigned mapping ratio, a cursor on a display is presented responsively to the indicated motion of the hand.
Representative Claims
1. A method, comprising: receiving, by a computer, a sequence of signals indicating a motion of a hand of a user within a predefined area; segmenting the area into multiple regions; identifying, responsively to the signals, a region in which the hand is located; assigning a mapping ratio to the motion of the hand based on a direction of the motion and the region in which the hand is located; and presenting, using the assigned mapping ratio, a cursor on a display responsively to the indicated motion of the hand.

2. The method according to claim 1, wherein the signals indicate the hand manipulating a mouse, and wherein the predefined area comprises a two dimensional area including the mouse, and wherein the region comprises a two dimensional region within the two dimensional area.

3. The method according to claim 2, wherein receiving the signals comprises receiving the signals from one or more sensing devices mounted on the mouse, each of the one or more sensing devices selected from a list comprising an optical sensor and an ultrasonic sensor.

4. The method according to claim 2, wherein receiving the signals comprises receiving the signals from a sensing device configured to collect images of the area, the sensing device selected from a list comprising a two dimensional optical sensor and a three dimensional optical sensor.

5. The method according to claim 1, wherein the region comprises a two dimensional region within a touchpad, and wherein receiving the signals comprises collecting the signals from tactile sensors positioned within the touchpad.

6. The method according to claim 1, wherein the motion of the hand comprises a three dimensional gesture performed by the hand, and wherein the predefined area comprises a three dimensional area including the hand, and wherein the region comprises a three dimensional region within the three dimensional area.

7. The method according to claim 6, wherein receiving the signals comprises collecting, from a three dimensional sensing device, three dimensional information of the area.

8. The method according to claim 1, wherein the mapping ratio is inversely related to a difficulty of moving the hand in the direction of the motion within the region.

9. The method according to claim 8, wherein the difficulty comprises an impediment positioned in proximity to the hand and in the direction of the motion, the impediment selected from a list comprising an edge of the area and an obstacle positioned within the area.

10. An apparatus, comprising: a sensing device; and a computer executing a mixed modality user interface, and configured to receive, from the sensing device, a sequence of signals indicating a motion of a hand of a user within a predefined area, to segment the area into multiple regions, to identify, responsively to the signals, a region in which the hand is located, to assign a mapping ratio to the motion of the hand based on a direction of the motion and the region in which the hand is located, and to present, using the assigned mapping ratio, a cursor on a display responsively to the indicated motion of the hand.

11. The apparatus according to claim 10, and comprising a mouse, and wherein the signals indicate the hand manipulating a mouse, and wherein the predefined area comprises a two dimensional area including the mouse, and wherein the region comprises a two dimensional region within the two dimensional area.

12. The apparatus according to claim 11, and comprising one or more sensing devices mounted on the mouse, and wherein the computer is configured to receive the signals from the one or more sensing devices, each of the one or more sensing devices selected from a list comprising an optical sensor and an ultrasonic sensor.

13. The apparatus according to claim 11, and comprising a sensing device configured to collect images of the area, and wherein the computer is configured to receive the signals from the sensing device, the sensing device selected from a list comprising a two dimensional optical sensor and a three dimensional optical sensor.

14. The apparatus according to claim 10, and comprising a touchpad having tactile sensors positioned within the touchpad, wherein the area is comprised in the touchpad, and wherein the region comprises a two dimensional region on the touchpad, and wherein the computer is configured to receive the signals from the tactile sensors.

15. The apparatus according to claim 10, wherein the motion of the hand comprises a three dimensional gesture performed by the hand, and wherein the predefined area comprises a three dimensional area including the hand, and wherein the region comprises a three dimensional region within the three dimensional area.

16. The apparatus according to claim 15, and comprising a three dimensional sensing device configured to collect three dimensional information of the area, and wherein the computer is configured to receive the signals from the three dimensional sensing device.

17. The apparatus according to claim 10, wherein the mapping ratio is inversely related to a difficulty of moving the hand in the direction of the motion within the region.

18. The apparatus according to claim 17, wherein the difficulty comprises an impediment positioned in proximity to the hand and in the direction of the motion, the impediment selected from a list comprising an edge of the area and an obstacle positioned within the area.

19. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, cause the computer to receive a sequence of signals indicating a motion of a hand of a user within a predefined area, to segment the area into multiple regions, to identify, responsively to the signals, a region in which the hand is located, to assign a mapping ratio to the motion of the hand based on a direction of the motion and the region in which the hand is located, and to present, using the assigned mapping ratio, a cursor on a display responsively to the indicated motion of the hand.
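The claims specify region segmentation and a direction-dependent mapping ratio but leave the gain function itself open. The following Python sketch shows one plausible reading for a 2D area, not the patent's actual implementation: all constants (`EDGE_BAND`, `BASE_GAIN`, `EDGE_GAIN`), function names, and the assumption that the ratio is boosted when the hand moves toward a nearby edge (per claims 8 and 9, an edge is an impediment) are illustrative choices.

```python
# Hedged sketch of asymmetric hand-to-cursor mapping (assumptions, not the
# patent's implementation): segment a predefined 2D area into edge and
# interior regions, then pick a per-axis mapping ratio from the region the
# hand occupies and the direction it is moving.

EDGE_BAND = 0.15   # assumed: fraction of each dimension treated as an edge region
BASE_GAIN = 1.0    # assumed: mapping ratio in the interior region
EDGE_GAIN = 2.0    # assumed: boosted ratio when moving toward a nearby edge

def classify_region(pos, size):
    """Segment the area: per-axis flag 'low', 'high', or 'mid' for the hand position."""
    flags = []
    for p, s in zip(pos, size):
        if p < EDGE_BAND * s:
            flags.append("low")        # near the low edge of this axis
        elif p > (1 - EDGE_BAND) * s:
            flags.append("high")       # near the high edge of this axis
        else:
            flags.append("mid")        # interior
    return tuple(flags)

def mapping_ratio(pos, motion, size):
    """Assign a per-axis mapping ratio from the region and the motion direction."""
    region = classify_region(pos, size)
    ratios = []
    for flag, d in zip(region, motion):
        # Boost the ratio when the motion heads into a nearby edge, so a small
        # hand movement still produces a useful cursor movement there.
        toward_edge = (flag == "low" and d < 0) or (flag == "high" and d > 0)
        ratios.append(EDGE_GAIN if toward_edge else BASE_GAIN)
    return ratios

def move_cursor(cursor, pos, motion, size):
    """Map a hand-motion delta to a new cursor position using the assigned ratios."""
    ratios = mapping_ratio(pos, motion, size)
    return [c + r * d for c, r, d in zip(cursor, ratios, motion)]
```

For example, with the hand at (5, 50) in a 100x100 area and a motion of (-2, +3), the x-axis is in the "low" edge region moving toward the edge (ratio 2.0) while the y-axis is interior (ratio 1.0), so a cursor at (100, 100) moves to (96, 103).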
Cited patents
Kazama, Hisashi (JPX); Onoguchi, Kazunori (JPX); Yuasa, Mayumi (JPX); Fukui, Kazuhiro (JPX), Apparatus and method for controlling an electronic device with user action.
Wee, Susie J.; Baker, Henry Harlyn; Bhatti, Nina T.; Covell, Michele; Harville, Michael, Communication and collaboration system using rich media environments.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Controlling resource access based on user gesturing in a 3D captured image stream of the user.
Cohen, Charles J.; Beach, Glenn; Cavell, Brook; Foulk, Gene; Jacobus, Charles J.; Obermark, Jay; Paul, George, Gesture-controlled interfaces for self-service machines and other applications.
Honda, Tadashi, Handwriting information processing apparatus, handwriting information processing method, and storage medium having program stored therein for handwriting information processing.
Rushmeier, Holly E.; Bernardini, Fausto, Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object.
Lanier, Jaron Z. (Palo Alto, CA); Grimaud, Jean-Jacques G. (Portola Valley, CA); Harvill, Young L. (San Mateo, CA); Lasko-Harvill, Ann (San Mateo, CA); Blanchard, Chuck L. (Palo Alto, CA); Oberman, Mark L. (Mountain, Method and system for generating objects for a multi-person virtual world using data flow networks.
Latypov, Nurakhmed Nurislamovich (SUX); Latypov, Nurulla Nurislamovich (SUX), Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods.
Rafii, Abbas; Bamji, Cyrus; Sze, Cheng-Feng; Torunoglu, Iihami, Methods for enhancing performance and data acquired from three-dimensional image systems.
Bang, Won chul; Kim, Dong yoon; Chang, Wook; Kang, Kyoung ho; Choi, Eun seok, Spatial motion recognition system and method using a virtual handwriting plane.
Murray, Paul; Troy, James J.; Erignac, Charles A.; Wojcik, Richard H.; Finton, David J.; Margineantu, Dragos D., System and method for controlling swarm of remote unmanned vehicles through human gestures.
Segawa, Hiroyuki; Hiraki, Norikazu; Shioya, Hiroyuki; Abe, Yuichi, Three-dimensional model processing device, three-dimensional model processing method, program providing medium.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Tracking a range of body movement based on 3D captured image streams of a user.
Backlund, Erik Johan Vendel; Bengtsson, Henrik; Heringslack, Henrik; Sassi, Jari; Thörn, Ola Karl; Åberg, Peter, User interface with three dimensional user input.
Ellenby, John; Ellenby, Thomas; Ellenby, Peter, Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time.