IPC Classification Information

Country/Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition) |
Application No. | US-0984031 (2012-02-09)
Registration No. | US-9285874 (2016-03-15)
International Application No. | PCT/IB2012/050577 (2012-02-09)
§371/§102 date | 2013-10-07
International Publication No. | WO2012/107892 (2012-08-16)
Inventors / Address |
- Bychkov, Eyal
- Brezner, Oren
- Galor, Micha
- Or, Ofir
- Pokrass, Jonathan
- Hoffnung, Amir
- Berliner, Tamir
Applicant / Address |
Agent / Address | D. Kligler IP Services Ltd.
Citation Information | Times cited: 2 | Cited patents: 90
Abstract

A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
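The abstract (and claims 4-5 below) describe combining the two sensor streams: the 2D image supplies the head's horizontal and vertical position, the segmented 3D map supplies its depth, and the eye image refines the gaze direction. A minimal Python sketch of that combination follows; `head_box` (a 2D bounding box from any face detector) and `eye_offset` (a pupil displacement measured in the eye image) are hypothetical inputs, since the patent does not fix particular algorithms for these steps.

```python
import numpy as np

def extract_head_coordinates(depth_map, head_box):
    """Per claims 4-5: horizontal/vertical position from the 2D head
    region, depth from the segmented 3D map. `head_box` is an assumed
    (x0, y0, x1, y1) box from an external face detector."""
    x0, y0, x1, y1 = head_box
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    region = depth_map[int(y0):int(y1), int(x0):int(x1)]
    z = np.median(region[region > 0])   # ignore invalid (zero) depth samples
    return np.array([cx, cy, z])

def gaze_direction(eye_offset):
    """Toy gaze estimate from a 2D pupil offset. The patent instead
    speaks of analyzing light reflected off the pupil, iris, or cornea
    (claims 2-3) without giving a formula; this is an assumption."""
    dx, dy = eye_offset
    direction = np.array([dx, dy, -1.0])   # toward a display at z = 0
    return direction / np.linalg.norm(direction)
```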
Representative Claims

1. A method, comprising: receiving a three-dimensional (3D) map of at least a part of a body of a user of a computerized system; receiving a two-dimensional (2D) image of the user, the image including an eye of the user; extracting, from the 3D map and the 2D image, 3D coordinates of a head of the user; identifying, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user; and controlling a function of the computerized system responsively to the direction of the gaze by performing an action associated with an interactive item presented in the direction of the gaze on a display coupled to the computerized system upon receiving a first sequence of three-dimensional (3D) maps indicating a motion of a limb toward the display, receiving a second sequence of 3D maps indicating a deceleration of the motion of the limb toward the display, and receiving a third sequence of 3D maps indicating a motion of the limb away from the display.

2. The method according to claim 1, wherein identifying the direction of the gaze comprises analyzing light reflected off an element of the eye.

3. The method according to claim 2, wherein the element is selected from a list comprising a pupil, an iris and a cornea.

4. The method according to claim 1, wherein extracting the 3D coordinates of the head comprises segmenting the 3D map in order to extract a position of the head along a horizontal axis, a vertical axis, and a depth axis.

5. The method according to claim 1, wherein extracting the 3D coordinates of the head comprises identifying, from the 2D image, a first position of the head along a horizontal axis and a vertical axis, and segmenting the 3D map in order to identify, from the 3D map, a second position of the head along a depth axis.

6. The method according to claim 1, wherein controlling the function of the computerized system comprises identifying an interactive item presented on a display coupled to the computerized system and in the direction of the gaze, and changing a state of the interactive item.

7. The method according to claim 6, wherein the state of the interactive item is changed responsively to the gaze.

8. The method according to claim 6, wherein the state of the interactive item is changed responsively to a vocal command received from the user.

9. The method according to claim 6, wherein changing the state comprises directing input received from the user to the interactive item.

10. The method according to claim 6, and comprising receiving a sequence of three-dimensional maps indicating that the user is moving a limb in a specific direction, and repositioning the interactive item responsively in the specific direction.

11. The method according to claim 6, and comprising identifying a target point on the display in the direction of the gaze, and calculating a calibration coefficient based on the proximity of the target point to the interactive item.

12. The method according to claim 1, wherein controlling the function of the computerized system comprises activating, by the computerized system, power saving techniques upon the user directing the gaze away from the display.

13. The method according to claim 12, wherein activating the power saving techniques comprises decreasing a brightness of a display coupled to the computerized system.
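Claim 1 above gates the action on three successive sequences of 3D maps: motion of a limb toward the display, deceleration of that motion, then motion away. A toy detector of that three-phase pattern, assuming a hypothetical per-frame series `limb_depths` (distance in meters from the limb to the display plane, extracted from the 3D maps) and illustrative velocity thresholds not taken from the patent:

```python
import numpy as np

def detect_select_gesture(limb_depths, push_v=0.15, decel_v=0.05):
    """Three-phase gesture of claim 1: approach, decelerate, retract."""
    v = np.diff(limb_depths)           # negative = moving toward the display
    phase = 0
    for step in v:
        if phase == 0 and step < -push_v:
            phase = 1                   # first sequence: fast approach
        elif phase == 1 and -decel_v < step <= 0:
            phase = 2                   # second sequence: deceleration
        elif phase == 2 and step > decel_v:
            return True                 # third sequence: motion away
    return False

# Approach, slow down, pull back -> True
print(detect_select_gesture([0.60, 0.40, 0.22, 0.18, 0.17, 0.30, 0.45]))
```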
14. The method according to claim 1, wherein controlling the function of the computerized system comprises deactivating, by the computerized system, power saving techniques upon the user directing the gaze toward the display.

15. The method according to claim 14, wherein deactivating the power saving techniques comprises increasing a brightness of a display coupled to the computerized system.

16. The method according to claim 1, wherein controlling the function of the computerized system comprises identifying a device coupled to the computerized system and in the direction of the gaze, and changing a state of the device.

17. The method according to claim 1, wherein controlling the function of the computerized system comprises presenting content on a display coupled to the computerized system, identifying a region on a display coupled to the computerized system and in the direction of the gaze, and blurring the content outside the region.

18. The method according to claim 1, wherein controlling the function of the computerized system comprises presenting, on a display coupled to the computerized system, content comprising multiple interactive items, each of the multiple interactive items having an associated depth value, identifying one of the interactive items in the direction of the gaze, identifying the depth value associated with the one of the interactive items, and blurring any of the interactive items whose associated depth value is not in accordance with the identified depth value.

19. The method according to claim 1, and comprising receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computer system, calculating a line segment that intersects the first point and the second point, identifying a target point where the line segment intersects the display, and engaging a second interactive item presented on the display in proximity to the target point, upon detecting a specific event, the second interactive item differing from a first interactive item presented on a display coupled to the computerized system and in the direction of the gaze.

20. The method according to claim 19, wherein the specific event comprises the 3D map and the 2D image not indicating the direction of the gaze.

21. The method according to claim 19, wherein the specific event comprises the motion of the second point having at least a first specified distance and the target point being at least a second specified distance from the first interactive item.

22. The method according to claim 1, and comprising controlling, in response to the gaze direction, a function of a device coupled to the computerized system and positioned in the direction of the gaze.
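Claim 19 above reduces pointing to geometry: extend the line through two body points until it meets the display. A minimal sketch, assuming a camera-centered frame in which the display lies in the plane z = display_z; the choice of p1 as an eye point and p2 as a fingertip is illustrative, since the claim says only "a first point and a second point of the user":

```python
import numpy as np

def target_on_display(p1, p2, display_z=0.0):
    """Intersect the line through p1 and p2 with the display plane
    (claim 19). Returns None if the line is parallel to the plane."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1
    if np.isclose(d[2], 0.0):
        return None
    t = (display_z - p1[2]) / d[2]
    return p1 + t * d                   # target point [x, y, display_z]

# Eye at 60 cm, fingertip at 40 cm, both slightly off-axis:
print(target_on_display([0.02, 0.10, 0.60], [0.05, 0.05, 0.40]))
# -> [ 0.11 -0.05  0.  ]
```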
23. An apparatus, comprising: a sensing device configured to receive a three-dimensional (3D) map of at least a part of a body of a user and an image of an eye of the user, and to receive a two-dimensional (2D) image of the user, the 2D image including an eye of the user; and a computer coupled to the sensing device and configured to extract, from the 3D map and the 2D image, 3D coordinates of a head of the user and to identify, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user, wherein the computer is configured to control a function of the apparatus responsively to the direction of the gaze by performing an action associated with an interactive item presented in the direction of the gaze on a display coupled to the apparatus, upon receiving a first sequence of three-dimensional (3D) maps indicating a motion of a limb toward the display, receiving a second sequence of 3D maps indicating a deceleration of the motion of the limb toward the display, and receiving a third sequence of 3D maps indicating a motion of the limb away from the display.

24. The apparatus according to claim 23, wherein the computer is configured to identify the direction of the gaze by analyzing light reflected off an element of the eye.

25. The apparatus according to claim 24, wherein the computer is configured to select the element from a list comprising a pupil, an iris and a cornea.

26. The apparatus according to claim 23, wherein the computer is configured to extract the 3D coordinates of the head by segmenting the 3D map in order to extract a position of the head along a horizontal axis, a vertical axis, and a depth axis.

27. The apparatus according to claim 23, wherein the computer is configured to extract the 3D coordinates of the head by identifying, from the 2D image, a first position of the head along a horizontal axis and a vertical axis, and segmenting the 3D map in order to identify, from the 3D map, a second position of the head along a depth axis.

28. The apparatus according to claim 23, wherein the computer is configured to control the function of the computerized system by identifying an interactive item presented on a display coupled to the computerized system and in the direction of the gaze, and to change a state of the interactive item.

29. The apparatus according to claim 28, wherein the computer is configured to change the state of the interactive item responsively to the gaze.

30. The apparatus according to claim 28, wherein the computer is configured to change the state of the interactive item responsively to a vocal command received from the user.

31. The apparatus according to claim 28, wherein the computer is configured to change the state by directing input received from the user to the interactive item.

32. The apparatus according to claim 28, wherein the computer is configured to receive a sequence of three-dimensional maps indicating that the user is moving a limb in a specific direction, and to reposition the interactive item responsively in the specific direction.

33. The apparatus according to claim 28, wherein the computer is configured to identify a target point on the display in the direction of the gaze, and to calculate a calibration coefficient based on the proximity of the target point to the interactive item.
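Claims 11 and 33 derive a "calibration coefficient" from how close the estimated target point lands to a known interactive item. The claims do not fix the coefficient's form; one plausible realization, shown here purely as an assumption, is an additive 2D offset averaged over calibration samples:

```python
import numpy as np

def calibration_offset(gaze_points, item_centers):
    """Average offset between estimated gaze target points and the
    centers of the items the user was actually gazing at (claims 11
    and 33). Modeling the coefficient as an additive offset is an
    assumption. Apply as: corrected = estimate + offset."""
    g = np.asarray(gaze_points, float)
    c = np.asarray(item_centers, float)
    return (c - g).mean(axis=0)

# Two samples in display pixels: estimates land up-left of the items.
offset = calibration_offset([[100, 210], [340, 395]], [[110, 200], [350, 380]])
print(offset)   # -> [ 10.  -12.5]
```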
34. The apparatus according to claim 23, wherein the computer is configured to control the function of the computerized system by activating, by the computerized system, power saving techniques upon the user directing the gaze away from the display.

35. The apparatus according to claim 34, wherein the computer is configured to activate power saving techniques by decreasing a brightness of a display coupled to the computerized system.

36. The apparatus according to claim 23, wherein the computer is configured to control the function of the computerized system by deactivating, by the computerized system, power saving techniques upon the user directing the gaze toward the display.

37. The apparatus according to claim 36, wherein the computer is configured to deactivate the power saving techniques by increasing a brightness of a display coupled to the computerized system.

38. The apparatus according to claim 23, wherein the computer is configured to control the function of the computerized system by identifying a device coupled to the computerized system and in the direction of the gaze, and changing a state of the device.

39. The apparatus according to claim 23, wherein the computer is configured to control the function of the computerized system by presenting content on a display coupled to the computerized system, identifying a region on a display coupled to the computerized system and in the direction of the gaze, and blurring the content outside the region.

40. The apparatus according to claim 23, wherein the computer is configured to control the function of the computerized system by presenting, on a display coupled to the computerized system, content comprising multiple interactive items, each of the multiple interactive items having an associated depth value, identifying one of the interactive items in the direction of the gaze, identifying the depth value associated with the one of the interactive items, and blurring any of the interactive items whose associated depth value is not in accordance with the identified depth value.

41. The apparatus according to claim 23, wherein the computer is configured to receive and segment a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computer system, to calculate a line segment that intersects the first point and the second point, and to identify a target point where the line segment intersects the display and engage a second interactive item presented on the display in proximity to the target point, upon detecting a specific event, the second interactive item differing from a first interactive item presented on a display coupled to the computerized system and in the direction of the gaze.

42. The apparatus according to claim 41, wherein the specific event comprises the 3D map and the 2D image not indicating the direction of the gaze.

43. The apparatus according to claim 41, wherein the specific event comprises the motion of the second point having at least a first specified distance and the target point being at least a second specified distance from the first interactive item.

44. The apparatus according to claim 23, wherein the computer is configured to control, in response to the gaze direction, a function of a device coupled to the computerized system and positioned in the direction of the gaze.
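Claims 18 and 40 above describe a depth-of-field effect: each interactive item carries a depth value, and items whose depth does not match that of the gazed-at item are blurred. A short sketch under stated assumptions; the dict-based item representation and the `tolerance` for what counts as "in accordance with" the identified depth are hypothetical:

```python
def apply_depth_focus(items, gazed_item, tolerance=0.0):
    """Mark for blurring every item whose depth value differs from the
    depth of the item under the gaze (claims 18 and 40). `items` is a
    hypothetical list of dicts with "name" and "depth" keys."""
    focus_depth = gazed_item["depth"]
    for item in items:
        item["blurred"] = abs(item["depth"] - focus_depth) > tolerance
    return items

icons = [{"name": "front", "depth": 1.0},
         {"name": "middle", "depth": 2.0},
         {"name": "back", "depth": 3.0}]
print(apply_depth_focus(icons, gazed_item=icons[1]))  # only "middle" stays sharp
```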
45. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, cause the computer to receive a three-dimensional (3D) map of at least a part of a body of a user of the computer, to receive a two-dimensional (2D) image of the user, the image including an eye of the user, to extract, from the 3D map and the 2D image, 3D coordinates of a head of the user, and to identify, based on the 3D coordinates of the head and the image of the eye, a direction of a gaze performed by the user, wherein the instructions cause the computer to control a function of the computer responsively to the direction of the gaze by performing an action associated with an interactive item presented in the direction of the gaze on a display coupled to the computerized system, upon receiving a first sequence of three-dimensional (3D) maps indicating a motion of a limb toward the display, receiving a second sequence of 3D maps indicating a deceleration of the motion of the limb toward the display, and receiving a third sequence of 3D maps indicating a motion of the limb away from the display.