Three dimensional user interface cursor control
IPC Classification Information
Country / Type
United States(US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G09G-005/00
G06F-003/01
Application Number
US-0314207
(2011-12-08)
Registration Number
US-8872762
(2014-10-28)
Inventors / Address
Galor, Micha
Gelbourt, Idan
Or, Ofir
Pokrass, Jonathan
Hoffnung, Amir
Applicant / Address
Primesense Ltd.
Agent / Address
D. Kligler I.P. Services Ltd.
Citation Information
Cited by: 7
Cited patents: 102
Abstract
A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a first set of multiple 3D coordinates representing a gesture performed by a user positioned within a field of view of a sensing device coupled to the computer, the first set of 3D coordinates comprising multiple points in a fixed 3D coordinate system local to the sensing device. The first set of multiple 3D coordinates are transformed to a second set of corresponding multiple 3D coordinates in a subjective 3D coordinate system local to the user.
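The abstract describes mapping points from a coordinate system fixed to the sensing device into a coordinate system local to the user. Below is a minimal sketch of such a frame transformation, assuming (hypothetically) that the user-local frame is described by an origin and a rotation matrix in sensor coordinates; the patent itself does not specify this representation, and the function and variable names are illustrative only.

```python
import numpy as np

def sensor_to_user(points, user_origin, user_rotation):
    """Map points from the sensor-fixed frame into a user-local frame.

    points:        (N, 3) array of XYZ coordinates in the sensor frame.
    user_origin:   (3,) origin of the user's frame, in sensor coordinates.
    user_rotation: (3, 3) rotation matrix from sensor axes to user axes.
    """
    pts = np.asarray(points, dtype=float)
    # Translate into the user's origin, then rotate into the user's axes.
    return (pts - user_origin) @ user_rotation.T

# Illustrative example: a user-local frame rotated 90 degrees about the
# vertical (Y) axis, so the sensor's X axis becomes the user's depth (Z) axis.
rot_y_90 = np.array([[0.0, 0.0, -1.0],
                     [0.0, 1.0,  0.0],
                     [1.0, 0.0,  0.0]])
local = sensor_to_user([[2.0, 0.0, 1.0]], np.array([1.0, 0.0, 1.0]), rot_y_90)
# The offset (1, 0, 0) along the sensor's X axis becomes (0, 0, 1)
# along the user's depth axis.
```

This is only one plausible realization of "transforming to a subjective 3D coordinate system local to the user"; the claims condition the transform on joint motion exceeding a threshold, which is omitted here for brevity.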
Representative Claims
1. A method, comprising: receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of 3D coordinates representing a gesture performed by a limb of a user, wherein the limb is positioned within a field of view of a sensing device coupled to the computer and includes a joint, the set of 3D coordinates comprising first multiple points measured in a fixed 3D coordinate system local to the sensing device, the fixed 3D coordinate system having a first depth axis; upon the set of the 3D coordinates indicating that a motion of the joint has exceeded a specified threshold, transforming the first multiple points that indicate the limb to be moving along the first depth axis to corresponding second multiple points along a different, second depth axis local to the user; and applying the second multiple points in executing the non-tactile 3D user interface.

2. The method according to claim 1, wherein the limb comprises a hand and the joint comprises an elbow associated with the hand.

3. An apparatus, comprising: a display; and a computer executing a non-tactile three dimensional (3D) user interface, and configured to receive a set of 3D coordinates representing a gesture performed by a limb of a user, wherein the limb is positioned within a field of view of a sensing device coupled to the computer and includes a joint, the set of 3D coordinates comprising first multiple points measured in a fixed 3D coordinate system local to the sensing device, the fixed 3D coordinate system having a first depth axis, and upon the set of the 3D coordinates indicating that a motion of the joint has exceeded a specified threshold, to transform the first multiple points that indicate the limb to be moving along the first depth axis to corresponding second multiple points along a different, second depth axis local to the user, and to apply the second multiple points in executing the non-tactile 3D user interface.

4. The apparatus according to claim 3, wherein the limb comprises a hand and the joint comprises an elbow associated with the hand.

5. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer executing a non-tactile three dimensional user interface, cause the computer to receive a set of 3D coordinates representing a gesture performed by a limb of a user, wherein the limb is positioned within a field of view of a sensing device coupled to the computer and includes a joint, the set of 3D coordinates comprising first multiple points measured in a fixed 3D coordinate system local to the sensing device, the fixed 3D coordinate system having a first depth axis, and upon the set of the 3D coordinates indicating that a motion of the joint has exceeded a specified threshold, to transform the first multiple points that indicate the limb to be moving along the first depth axis to corresponding second multiple points along a different, second depth axis local to the user, and to apply the second multiple points in executing the non-tactile 3D user interface.

6. A method, comprising: presenting, by a computer executing a non-tactile three dimensional (3D) user interface, a cursor, having a given cursor size, in proximity to one or more items on a display; receiving in the computer, from a sensing device, a set of 3D coordinates representing a gesture performed by a body part of a user positioned within a field of view of the sensing device; calculating a ratio between the cursor size and a body part size of the body part that performed the gesture; and moving the cursor responsively to the received set of the coordinates in proportion to the calculated ratio.

7. The method according to claim 6, wherein the body part comprises a hand.

8. An apparatus, comprising: a display; and a computer executing a non-tactile three dimensional (3D) user interface, and configured to present a cursor, having a given cursor size, in proximity to one or more items on the display, to receive in the computer, from a sensing device, a set of 3D coordinates representing a gesture performed by a body part of a user positioned within a field of view of the sensing device, to calculate a ratio between the cursor size and a body part size of the body part that performed the gesture, and to move the cursor responsively to the received set of the coordinates in proportion to the calculated ratio.

9. The apparatus according to claim 8, wherein the body part comprises a hand.

10. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer executing a non-tactile three dimensional user interface, cause the computer to present a cursor, having a given cursor size, in proximity to one or more items on the display, to receive in the computer, from a sensing device, a set of 3D coordinates representing a gesture performed by a body part of a user positioned within a field of view of the sensing device, to calculate a ratio between the cursor size and a body part size of the body part that performed the gesture, and to move the cursor responsively to the received set of the coordinates in proportion to the calculated ratio.

11. A method, comprising: presenting, by a computer executing a non-tactile three dimensional (3D) user interface, an interactive cursor shaped as a hand in proximity to one or more items on a display; receiving in the computer, from a sensing device, a set of 3D coordinates representing a gesture performed by a user positioned within a field of view of the sensing device; positioning the interactive cursor on the display responsively to the received set of the coordinates; and conveying feedback to the user, responsively to the received set of the coordinates, indicating a proximity of the cursor to the one or more items, wherein a disposition of one or more fingers of the hand changes responsively to a distance of the cursor from the one of the items.

12. The method according to claim 11, wherein the feedback further comprises a sound.

13. An apparatus, comprising: a display; and a computer executing a non-tactile three dimensional (3D) user interface, and configured to present an interactive cursor shaped as a hand in proximity to one or more items on the display, to receive in the computer, from a sensing device, a set of 3D coordinates representing a gesture performed by a user positioned within a field of view of the sensing device, to position the interactive cursor on the display responsively to the received set of the coordinates, and to convey feedback to the user, responsively to the received set of the coordinates, indicating a proximity of the cursor to one of the items, wherein a disposition of one or more fingers of the hand changes responsively to a distance of the cursor from the one of the items.

14. The apparatus according to claim 13, wherein the feedback further comprises a sound.

15. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer executing a non-tactile three dimensional user interface, cause the computer to present an interactive cursor shaped as a hand in proximity to one or more items on a display, to receive in the computer, from a sensing device, a set of 3D coordinates representing a gesture performed by a user positioned within a field of view of the sensing device, to position the interactive cursor on the display responsively to the received set of the coordinates, and to convey feedback to the user, responsively to the received set of the coordinates, indicating a proximity of the cursor to one of the items, wherein a disposition of one or more fingers of the hand changes responsively to a distance of the cursor from the one of the items.
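Claim 6 describes moving the cursor in proportion to the ratio between the on-screen cursor size and the size of the body part performing the gesture. Below is a minimal sketch of that scaling, assuming sizes are given in comparable units and motion is applied as a 2D screen-space delta; the function name and parameters are illustrative, not taken from the patent.

```python
def scaled_cursor_move(cursor_pos, hand_delta, cursor_size, hand_size):
    """Move an on-screen cursor in proportion to cursor_size / hand_size.

    cursor_pos:  current (x, y) cursor position in screen units.
    hand_delta:  (dx, dy) displacement of the tracked body part.
    A small cursor relative to the hand yields finer cursor motion;
    a large cursor yields coarser motion (one hypothetical reading
    of the claimed ratio).
    """
    ratio = cursor_size / hand_size
    return tuple(c + d * ratio for c, d in zip(cursor_pos, hand_delta))

new_pos = scaled_cursor_move((100.0, 100.0), (40.0, -20.0),
                             cursor_size=25.0, hand_size=100.0)
# ratio 0.25, so the cursor moves by (10.0, -5.0) to (110.0, 95.0)
```

The claims leave the mapping direction open (the ratio could equally be inverted to make a larger hand move the cursor further); this sketch fixes one choice for concreteness.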
Cited Patents
Kazama Hisashi, JPX; Onoguchi Kazunori, JPX; Yuasa Mayumi, JPX; Fukui Kazuhiro, JPX, Apparatus and method for controlling an electronic device with user action.
Wee, Susie J.; Baker, Henry Harlyn; Bhatti, Nina T.; Covell, Michele; Harville, Michael, Communication and collaboration system using rich media environments.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Controlling resource access based on user gesturing in a 3D captured image stream of the user.
Cohen, Charles J.; Beach, Glenn; Cavell, Brook; Foulk, Gene; Jacobus, Charles J.; Obermark, Jay; Paul, George, Gesture-controlled interfaces for self-service machines and other applications.
Honda, Tadashi, Handwriting information processing apparatus, handwriting information processing method, and storage medium having program stored therein for handwriting information processing.
Rushmeier, Holly E.; Bernardini, Fausto, Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object.
Lanier Jaron Z. (Palo Alto CA) Grimaud Jean-Jacques G. (Portola Valley CA) Harvill Young L. (San Mateo CA) Lasko-Harvill Ann (San Mateo CA) Blanchard Chuck L. (Palo Alto CA) Oberman Mark L. (Mountain, Method and system for generating objects for a multi-person virtual world using data flow networks.
Latypov Nurakhmed Nurislamovich, SUX; Latypov Nurulla Nurislamovich, SUX, Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods.
Rafii, Abbas; Bamji, Cyrus; Sze, Cheng-Feng; Torunoglu, Iihami, Methods for enhancing performance and data acquired from three-dimensional image systems.
Bang, Won chul; Kim, Dong yoon; Chang, Wook; Kang, Kyoung ho; Choi, Eun seok, Spatial motion recognition system and method using a virtual handwriting plane.
Murray, Paul; Troy, James J.; Erignac, Charles A.; Wojcik, Richard H.; Finton, David J.; Margineantu, Dragos D., System and method for controlling swarm of remote unmanned vehicles through human gestures.
Segawa, Hiroyuki; Hiraki, Norikazu; Shioya, Hiroyuki; Abe, Yuichi, Three-dimensional model processing device, three-dimensional model processing method, program providing medium.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Tracking a range of body movement based on 3D captured image streams of a user.
Backlund, Erik Johan Vendel; Bengtsson, Henrik; Heringslack, Henrik; Sassi, Jari; Thörn, Ola Karl; Åberg, Peter, User interface with three dimensional user input.
Ellenby, John; Ellenby, Thomas; Ellenby, Peter, Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time.
Im, Soungmin; Yu, Sunjin; Kim, Sangki; Lim, Kyungyoung; Cho, Yongwon; Kim, Taehyeong, Electronic device for displaying three-dimensional image and method of using the same.