Abstract

A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
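In implementation terms, the abstract describes a loop that consumes a stream of 3D maps, tests whether the user's hand is gesturing toward the known location of a device, and actuates that device when it is. The sketch below is a minimal illustration of that flow, not the patented implementation: it assumes each frame has already been reduced to a hand position and pointing direction by some upstream skeleton tracker (the patent itself operates on the raw 3D maps), and `points_at`, `run`, `actuate`, and the 10-degree tolerance are all hypothetical.

```python
import numpy as np

def points_at(hand_pos, hand_dir, device_pos, tolerance_deg=10.0):
    """True if the pointing ray from the hand passes near the device."""
    to_device = device_pos - hand_pos
    to_device /= np.linalg.norm(to_device)
    hand_dir = hand_dir / np.linalg.norm(hand_dir)
    angle = np.degrees(np.arccos(np.clip(np.dot(hand_dir, to_device), -1.0, 1.0)))
    return angle <= tolerance_deg

def run(frames, device_pos, actuate):
    """Consume (hand_position, pointing_direction) frames; actuate the
    device the first time the user points at it."""
    for hand_pos, hand_dir in frames:
        if points_at(hand_pos, hand_dir, device_pos):
            actuate()  # e.g., toggle a lamp coupled to the computer
            break

# Example: a single frame in which the hand points straight at the device.
frames = [(np.array([0.0, 1.2, 0.0]), np.array([0.0, 0.0, 1.0]))]
run(frames, device_pos=np.array([0.0, 1.2, 3.0]), actuate=lambda: print("toggled"))
```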
Representative Claims
1. A method, comprising: identifying, by a computer coupled to a three-dimensional (3D) sensing device and a display, different, respective locations of multiple controllable devices other than the display; receiving, by the computer from the 3D sensing device, a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer; detecting in the maps a gaze direction of the user that is directed toward a given device among the multiple controllable devices; defining a region in space with an apex at the head of the user and a base encompassing the given device; defining an interaction zone within the region; defining an angle threshold; defining a minimum time period; analyzing the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the multiple controllable devices, wherein the gesture comprises extending an elbow associated with the hand at an angle greater than or equal to the angle threshold, extending the hand toward the device and pausing the hand for the minimum time period; and actuating the given device responsively to the gesture.

2. The method according to claim 1, wherein the gesture is selected from a list comprising a pointing gesture, a grab gesture and a release gesture.

3. The method according to claim 1, and comprising receiving a vocal command from the user, and actuating the device in response to the gesture and the vocal command.

4. The method according to claim 1, and comprising communicating between the computer and a further device upon detecting, in the maps, a subsequent gesture directed toward the further device.

5. The method according to claim 1, wherein identifying the respective locations comprises performing, by the computer, an initialization step comprising: identifying, by the computer, the controllable devices that are in proximity to the 3D sensing device; and directing the user to point to each of the identified controllable devices.

6. An apparatus, comprising: a three-dimensional sensing device; a display; and a computer configured to identify different, respective locations of multiple controllable devices other than the display, to receive from the three-dimensional sensing device a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer, to detect in the maps a gaze direction of the user that is directed toward a given device among the multiple controllable devices, to define a region in space with an apex at the head of the user and a base encompassing the given device, to define an interaction zone within the region, to analyze the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the multiple controllable devices, and to actuate the given device responsively to the gesture, wherein the computer is configured to define an angle threshold and a minimum time period, and wherein the gesture detected by the computer comprises extending an elbow associated with the hand at an angle greater than or equal to the angle threshold, extending the hand toward the given device, and pausing the hand for the minimum time period.

7. The apparatus according to claim 6, wherein the computer is configured to select the gesture from a list comprising a pointing gesture, a grab gesture and a release gesture.

8. The apparatus according to claim 6, wherein the computer is configured to receive a vocal command from the user, and to actuate the given device in response to the gesture and the vocal command.

9. The apparatus according to claim 6, wherein the computer is configured to communicate with a further device among the multiple controllable devices upon detecting, in the maps, a subsequent gesture directed toward the further device.

10. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer executing a non-tactile user interface and coupled to a display, cause the computer to: identify different, respective locations of multiple controllable devices other than the display, receive from a three-dimensional sensing device that is coupled to the computer a sequence of three-dimensional maps containing at least a head and a hand of a user of the computer, detect in the maps a gaze direction of the user that is directed toward a given device among the multiple controllable devices, define a region in space with an apex at the head of the user and a base encompassing the given device, define an interaction zone within the region, analyze the maps to detect a gesture performed by the hand within the defined interaction zone that is directed toward an identified location of the given device among the multiple controllable devices, and actuate the given device responsively to the gesture, wherein the instructions cause the computer to define an angle threshold and a minimum time period, and wherein the gesture detected by the computer comprises extending an elbow associated with the hand at an angle greater than or equal to the angle threshold, extending the hand toward the given device, and pausing the hand for the minimum time period.
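Claims 1, 6, and 10 all recite the same three-part gesture test: the elbow extended at or past an angle threshold, the hand extended toward the device inside a cone-shaped region whose apex is at the user's head and whose base encompasses the device, and the hand paused for a minimum time period. The sketch below illustrates those conditions under stated assumptions: the per-frame joint positions are presumed to come from some skeleton tracker, the gaze-direction step and the interaction-zone subdivision of the cone are omitted, and every numeric default is an assumed value rather than one taken from the claims.

```python
import numpy as np

ANGLE_THRESHOLD_DEG = 150.0  # claim 1's "angle threshold" (assumed value)
MIN_PAUSE_SECONDS = 0.5      # claim 1's "minimum time period" (assumed value)
PAUSE_EPSILON_M = 0.02       # max per-frame hand drift still counted as a pause (assumed)

def elbow_angle_deg(shoulder, elbow, wrist):
    """Interior angle at the elbow; close to 180 degrees means an extended arm."""
    u = shoulder - elbow
    v = wrist - elbow
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def in_cone(point, apex, base_center, base_radius):
    """Test whether a point lies in the claimed region: a cone with its apex
    at the user's head and a base encompassing the device."""
    axis = base_center - apex
    height = np.linalg.norm(axis)
    axis = axis / height
    t = np.dot(point - apex, axis)  # distance along the cone axis
    if not 0.0 <= t <= height:
        return False
    radial = np.linalg.norm((point - apex) - t * axis)
    return radial <= base_radius * (t / height)

def gesture_detected(frames, head, device_pos, base_radius=0.5):
    """frames: iterable of (timestamp_seconds, shoulder, elbow, wrist) tuples,
    one per 3D map, with joints as numpy 3-vectors in meters. Returns True
    once the arm is extended past the angle threshold, the hand is inside
    the cone, and the hand has paused for the minimum time period."""
    pause_start = None
    last_wrist = None
    for t, shoulder, elbow, wrist in frames:
        extended = elbow_angle_deg(shoulder, elbow, wrist) >= ANGLE_THRESHOLD_DEG
        inside = in_cone(wrist, head, device_pos, base_radius)
        still = last_wrist is not None and np.linalg.norm(wrist - last_wrist) < PAUSE_EPSILON_M
        last_wrist = wrist
        if extended and inside and still:
            if pause_start is None:
                pause_start = t
            elif t - pause_start >= MIN_PAUSE_SECONDS:
                return True
        else:
            pause_start = None
    return False
```

Treating the hand as "extended toward the device" via cone membership of the wrist is one plausible reading of the claim language; a real system might instead intersect the forearm ray with the device's identified location.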
Cited References

Kazama, Hisashi (JP); Onoguchi, Kazunori (JP); Yuasa, Mayumi (JP); Fukui, Kazuhiro (JP), Apparatus and method for controlling an electronic device with user action.
Wee, Susie J.; Baker, Henry Harlyn; Bhatti, Nina T.; Covell, Michele; Harville, Michael, Communication and collaboration system using rich media environments.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Controlling resource access based on user gesturing in a 3D captured image stream of the user.
Cohen, Charles J.; Beach, Glenn; Cavell, Brook; Foulk, Gene; Jacobus, Charles J.; Obermark, Jay; Paul, George, Gesture-controlled interfaces for self-service machines and other applications.
Honda, Tadashi, Handwriting information processing apparatus, handwriting information processing method, and storage medium having program stored therein for handwriting information processing.
Rushmeier, Holly E.; Bernardini, Fausto, Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object.
Lanier, Jaron Z. (Palo Alto, CA); Grimaud, Jean-Jacques G. (Portola Valley, CA); Harvill, Young L. (San Mateo, CA); Lasko-Harvill, Ann (San Mateo, CA); Blanchard, Chuck L. (Palo Alto, CA); Oberman, Mark L. (Mountain View, CA), Method and system for generating objects for a multi-person virtual world using data flow networks.
Latypov, Nurakhmed Nurislamovich (SU); Latypov, Nurulla Nurislamovich (SU), Method for tracking and displaying user's spatial position and orientation, a method for representing virtual reality for a user, and systems of embodiment of such methods.
Rafii, Abbas; Bamji, Cyrus; Sze, Cheng-Feng; Torunoglu, Ilhami, Methods for enhancing performance and data acquired from three-dimensional image systems.
Bang, Won chul; Kim, Dong yoon; Chang, Wook; Kang, Kyoung ho; Choi, Eun seok, Spatial motion recognition system and method using a virtual handwriting plane.
Murray, Paul; Troy, James J.; Erignac, Charles A.; Wojcik, Richard H.; Finton, David J.; Margineantu, Dragos D., System and method for controlling swarm of remote unmanned vehicles through human gestures.
Segawa, Hiroyuki; Hiraki, Norikazu; Shioya, Hiroyuki; Abe, Yuichi, Three-dimensional model processing device, three-dimensional model processing method, program providing medium.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Tracking a range of body movement based on 3D captured image streams of a user.
Backlund, Erik Johan Vendel; Bengtsson, Henrik; Heringslack, Henrik; Sassi, Jari; Thörn, Ola Karl; Åberg, Peter, User interface with three dimensional user input.
Ellenby, John; Ellenby, Thomas; Ellenby, Peter, Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time.