IPC Classification Information
Country/Type: United States (US) Patent, Granted
IPC (7th edition): (none listed)
Application No.: US-0960823 (filed 2013-08-07)
Registration No.: US-9342146 (granted 2016-05-17)
Inventors:
- Bychkov, Eyal
- Brezner, Oren
- Galor, Micha
- Or, Ofir
- Pokrass, Jonathan
- Hoffnung, Amir
- Berliner, Tamir
Applicant: (none listed)
Agent: D. Kliger IP Services Ltd.
Citations: cited by 3 patents; cites 90 patents
Abstract
A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged.
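The geometry described in the abstract, extending a line through two tracked body points until it meets the display, is a standard ray-plane intersection. The following is a minimal sketch, not the patent's implementation; the function name, coordinate frame, and plane parameters are illustrative assumptions:

```python
def find_target_point(knuckle, fingertip, plane_point, plane_normal):
    """Extend the line through the knuckle and fingertip until it
    intersects the display plane, and return the 3D target point.

    Points are (x, y, z) tuples in meters. Returns None when the
    pointing direction is parallel to the display or the display
    lies behind the hand.
    """
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    direction = sub(fingertip, knuckle)          # pointing direction
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-9:                        # parallel to the display
        return None
    t = dot(plane_normal, sub(plane_point, knuckle)) / denom
    if t < 0:                                    # display is behind the hand
        return None
    return tuple(k + t * d for k, d in zip(knuckle, direction))

# Display modeled as the plane z = 0 with normal (0, 0, 1);
# the hand is roughly half a meter away, pointing at the screen.
target = find_target_point(
    knuckle=(0.10, 0.20, 0.60),
    fingertip=(0.10, 0.18, 0.50),
    plane_point=(0.0, 0.0, 0.0),
    plane_normal=(0.0, 0.0, 1.0),
)
# target lies on the display plane (z = 0)
```

In practice the depth sensor's coordinate frame must be calibrated against the display plane; claim 10 hints at this with its calibration coefficient derived from the offset between the target point and the engaged item.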
Representative Claims
1. A method, comprising: receiving a first sequence of three-dimensional (3D) maps over time of at least a part of a hand of a user of a computerized system; segmenting the 3D maps in order to extract 3D coordinates of a first point located on a knuckle of a finger on the hand and a second point at a tip of the finger; calculating a line segment that extends through the first point and the second point to a display that is coupled to the computerized system; identifying a target point where the line segment intersects the display; and in response to a motion of the hand indicated by the 3D maps, engaging an interactive item presented on the display in proximity to the target point.

2. The method according to claim 1, wherein engaging the interactive item comprises performing an operation associated with the interactive item upon receiving a vocal command from the user.

3. The method according to claim 1, wherein the motion of the hand comprises a motion toward the display.

4. The method according to claim 3, wherein engaging the interactive item comprises performing an action associated with the interactive item upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating a motion of the second point away from the display.

5. The method according to claim 3, wherein engaging the interactive item comprises repositioning the interactive item on the display responsively upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating a motion of the second point along a plane comprising vertical and horizontal axes.

6. The method according to claim 3, wherein engaging the interactive item comprises performing a hold operation on the interactive item responsively upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating that the user is holding the second point relatively steady for a specific time period.

7. The method according to claim 6, wherein the hold operation comprises presenting context information for the interactive item.

8. The method according to claim 6, wherein the hold operation comprises deleting the interactive item from the display.

9. The method according to claim 3, wherein the finger comprises an index finger, and wherein engaging the interactive item comprises selecting the interactive item upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating the user folding a thumb toward the index finger while pointing the index finger at the interactive item.

10. The method according to claim 1, and comprising calculating a calibration coefficient based on the proximity of the interactive item to the target point.

11. The method according to claim 1, and comprising identifying the line segment intersecting a device coupled to the computer, and controlling a function of the device responsively to a motion of the second point.

12. An apparatus, comprising: a display; and a computer coupled to the display and configured to receive and segment a first sequence of three-dimensional (3D) maps over time of at least a part of a hand of a user of a computerized system in order to extract 3D coordinates of a first point located on a knuckle of a finger on the hand and a second point at a tip of the finger, to calculate a line segment that extends through the first point and the second point to the display, to identify a target point where the line segment intersects the display, and in response to a motion of the hand indicated by the 3D maps, to engage an interactive item presented on the display in proximity to the target point.

13. The apparatus according to claim 12, wherein the computer is configured to engage the interactive item by performing an operation associated with the interactive item upon receiving a vocal command from the user.

14. The apparatus according to claim 12, wherein the motion of the hand comprises a motion toward the display.

15. The apparatus according to claim 14, wherein the computer is configured to engage the interactive item by performing an action associated with the interactive item upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating a motion of the second point away from the display.

16. The apparatus according to claim 14, wherein the computer is configured to engage the interactive item by repositioning the interactive item on the display responsively upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating a motion of the second point along a plane comprising vertical and horizontal axes.

17. The apparatus according to claim 14, wherein the computer is configured to engage the interactive item by performing a hold operation on the interactive item responsively upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating that the user is holding the second point relatively steady for a specific time period.

18. The apparatus according to claim 17, wherein the computer is configured to perform the hold operation by presenting context information for the interactive item.

19. The apparatus according to claim 17, wherein the computer is configured to perform the hold operation by deleting the interactive item from the display.

20. The apparatus according to claim 14, wherein the finger comprises an index finger, and wherein the computer is configured to engage the interactive item by selecting the interactive item upon receiving a second sequence of 3D maps indicating a deceleration of the motion of the second point toward the display, and receiving a third sequence of 3D maps indicating the user folding a thumb toward the index finger while pointing the index finger at the interactive item.

21. The apparatus according to claim 12, wherein the computer is configured to calculate a calibration coefficient based on the proximity of the interactive item to the target point.

22. The apparatus according to claim 12, wherein the computer is configured to identify the line segment intersecting a device coupled to the computer, and to control a function of the device responsively to a motion of the second point.

23. A computer software product comprising a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, which is coupled to a display, cause the computer to receive and segment a first sequence of three-dimensional (3D) maps over time of at least a part of a hand of a user of a computerized system in order to extract 3D coordinates of a first point located on a knuckle of a finger on the hand and a second point at a tip of the finger, to calculate a line segment that extends through the first point and the second point to the display, to identify a target point where the line segment intersects the display, and in response to a motion of the hand indicated by the 3D maps, to engage an interactive item presented on the display in proximity to the target point.
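Claims 4 and 15 engage an item when the 3D maps show deceleration toward the display followed by motion away from it, i.e. a tap-like gesture. The heuristic below is a rough sketch of such a classifier under an assumed per-frame fingertip-to-display distance signal; the function name, threshold, and decision rule are hypothetical, not the patented method:

```python
def is_tap_gesture(distances, eps=0.005):
    """Classify a 'tap' from successive fingertip-to-display
    distances (meters, one sample per frame).

    A tap here means: monotone approach, deceleration near the
    display, then retreat. `eps` tolerates small sensor jitter.
    """
    if len(distances) < 4:
        return False
    turn = distances.index(min(distances))       # frame of closest approach
    if turn == 0 or turn == len(distances) - 1:
        return False                             # no approach or no retreat
    approach = distances[:turn + 1]
    retreat = distances[turn:]
    # Approach: distance non-increasing (within jitter tolerance).
    approaching = all(a - b > -eps for a, b in zip(approach, approach[1:]))
    # Retreat: distance non-decreasing (within jitter tolerance).
    retreating = all(b - a > -eps for a, b in zip(retreat, retreat[1:]))
    # Deceleration: the final approach step is smaller than the first.
    decelerating = (approach[0] - approach[1]) > (approach[-2] - approach[-1])
    return approaching and retreating and decelerating

# Fingertip closes in, slows down, then pulls back: a tap.
tapped = is_tap_gesture([0.30, 0.20, 0.12, 0.08, 0.06, 0.05, 0.08, 0.15])
```

A production system would work on the segmented 3D maps directly and would likely add timing constraints (claims 6 and 17 gate the hold operation on the point staying steady for a specific period), but the approach/decelerate/retreat decomposition mirrors the second- and third-sequence structure of the claims.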