This disclosure relates to proximity sensing for wink detection. An illustrative method includes receiving data from a receiver portion of a proximity sensor. The receiver portion is disposed at a side section of a head-mountable device (HMD). When a wearer wears the HMD, the receiver portion is arranged to receive light reflected from an eye area of the wearer, the proximity sensor detects a movement of the eye area, and the data represents the movement. The method includes determining that the data corresponds to a wink gesture. The method also includes selecting a computing action to perform, based on the wink gesture. The method further includes performing the computing action.
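The four steps of the method in the abstract (receive sensor data, determine that it corresponds to a wink gesture, select a computing action, perform it) can be sketched as a minimal pipeline. Everything below — the function names, the intensity threshold, and the gesture-to-action map — is an illustrative assumption for exposition, not part of the disclosure.

```python
# Minimal sketch of the wink-gesture pipeline described in the abstract.
# Thresholds and action names are made-up assumptions, not claimed values.

def detect_wink(samples, wrinkle_threshold=0.6):
    """Classify a window of proximity-sensor intensity samples as a wink.

    A wink wrinkles the skin beside the eye; this sketch assumes the
    wrinkles raise reflected-light intensity above wrinkle_threshold.
    """
    return max(samples) >= wrinkle_threshold

def select_action(gesture):
    # Map a recognized gesture to a computing action (illustrative map).
    actions = {"wink": "capture_image"}
    return actions.get(gesture)

def process(samples):
    """Receive data, detect the wink, select and return the action."""
    if detect_wink(samples):
        return select_action("wink")
    return None
```

Used this way, `process([0.2, 0.7, 0.3])` returns `"capture_image"`, while a low-intensity window returns `None` (no gesture recognized).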
Representative Claims
1. A method comprising: receiving data from a receiver portion of a proximity sensor, wherein the receiver portion is disposed at a side section of a head-mountable device (HMD), and wherein, when a wearer wears the HMD, (i) the receiver portion is arranged to receive light reflected from an eye area of the wearer, wherein the eye area includes a skin area to a side of an eye that wrinkles during a wink, (ii) the proximity sensor detects a movement of the eye area, and (iii) the data represents the movement; determining that the data corresponds to a wink gesture, wherein the determining step comprises: determining a light intensity level at the receiver portion; and determining that the light intensity level corresponds, at least in part, to wrinkles in the skin area that are characteristic of a wink; selecting a computing action to perform, based on the wink gesture; and performing the computing action.

2. The method of claim 1, further comprising determining a gaze direction, based on the data.

3. The method of claim 2, wherein the determining the gaze direction comprises: determining a first light intensity level and a second light intensity level, based on the data, wherein the first light intensity level corresponds to an eye-open state, and wherein the second light intensity level corresponds to an eye-closed state; determining whether a difference between the first light intensity level and the second light intensity level exceeds a threshold; and based on a determination that the difference exceeds the threshold, determining that the gaze direction is an intermediate vertical direction.

4. The method of claim 2, wherein the determining the gaze direction comprises: determining a first light intensity level and a second light intensity level, based on the data, wherein the first light intensity level corresponds to an eye-open state, and wherein the second light intensity level corresponds to an eye-closed state; determining whether a difference between the first light intensity level and the second light intensity level does not exceed a threshold; and based on a determination that the difference does not exceed the threshold, determining that the gaze direction is either an upward direction or a downward direction.

5. The method of claim 2, further comprising: storing a plurality of light intensity levels for an eye-open state, wherein each light intensity level in the plurality is associated with a corresponding gaze direction; and performing a calibration of the plurality of light intensity levels for a context in which the method is used.

6. The method of claim 2, wherein the selecting the computing action to perform is further based on the gaze direction.

7. The method of claim 2, further comprising determining whether the gaze direction corresponds to at least one of a predetermined set of directions, wherein the selecting the computing action to perform is further based on a determination that the gaze direction corresponds to the at least one of the predetermined set of directions.

8. The method of claim 1, further comprising causing infrared light having a predetermined characteristic to be provided to the eye area, wherein the data indicates the predetermined characteristic.

9. The method of claim 1, wherein the computing action comprises causing an image capture device to capture image data.
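The gaze-direction logic of claims 3 and 4 above reduces to a single threshold comparison on the open/closed intensity contrast. A minimal sketch, assuming illustrative intensity values — the claims specify only the comparison, not any concrete threshold:

```python
# Sketch of the gaze-direction test in claims 3 and 4.
# The function name and default threshold are assumptions for illustration.

def gaze_direction(open_level, closed_level, threshold=0.3):
    """Return a coarse vertical gaze class from two intensity levels.

    open_level / closed_level are reflected-light intensities measured
    in the eye-open and eye-closed states.
    """
    if abs(open_level - closed_level) > threshold:
        # Large open/closed contrast: claim 3's intermediate vertical case.
        return "intermediate vertical"
    # Small contrast: claim 4's upward-or-downward case.
    return "upward or downward"
```

Per claim 5, the stored eye-open levels per gaze direction would be calibrated for the usage context, which in a sketch like this would amount to tuning `threshold` and the reference levels per wearer.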
10. A head-mountable device (HMD) comprising: a support structure comprising a front section and a side section, wherein the side section is adapted to receive a receiver portion of a proximity sensor such that, when a wearer wears the HMD, (i) the receiver portion is arranged to receive light reflected from an eye area of the wearer, wherein the eye area includes a skin area to a side of an eye that wrinkles during a wink, and (ii) the proximity sensor is configured to detect a movement of the eye area; a computer-readable medium; and program instructions stored to the computer-readable medium and executable by at least one processor to perform functions comprising: receiving data from the receiver portion, wherein, when the HMD is worn, the data represents the movement of the eye area; determining that the data corresponds to a wink gesture, wherein the determining step comprises: determining a light intensity level at the receiver portion; and determining that the light intensity level corresponds, at least in part, to wrinkles in the skin area that are characteristic of a wink; selecting a computing action to perform, based on the wink gesture; and performing the computing action.

11. The HMD of claim 10, further comprising the proximity sensor.

12. The HMD of claim 10, wherein the side section is further adapted to receive the receiver portion of the proximity sensor such that, when the wearer wears the HMD, the receiver portion is positioned at an oblique angle with respect to the eye.

13. The HMD of claim 10, further comprising a light source, wherein when the wearer wears the HMD, the light source is adapted to provide light to the eye area.

14. The HMD of claim 13, wherein when the wearer wears the HMD, the light source is adapted to provide light to both of an upper-eyelid area of the eye area and a lower-eyelid area of the eye area, and wherein the data represents at least one characteristic of one or both of light reflected from the upper-eyelid area and light reflected from the lower-eyelid area.

15. The HMD of claim 13, wherein the light is infrared light.

16. The HMD of claim 10, wherein the functions further comprise determining a gaze direction, based on the data.

17. The HMD of claim 10, wherein the functions further comprise causing infrared light having a predetermined characteristic to be provided to the eye area, wherein the data indicates the predetermined characteristic.

18. The HMD of claim 10, wherein the skin area to the side of the eye wrinkles less or does not wrinkle during a blink, and wherein the HMD further comprises program instructions stored on the computer-readable medium and executable by at least one processor to perform functions comprising: determining that the light intensity level instead corresponds, at least in part, to light reflected from the skin area to the side of the eye during a blink.
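Claim 18 above separates a wink from a blink because the skin beside the eye wrinkles during a wink but little or not at all during a blink. A hedged sketch, assuming the wrinkled skin reflects more light toward the receiver and using made-up calibration thresholds:

```python
# Sketch of the wink-vs-blink distinction in claim 18.
# wink_level and blink_level are assumed calibration values, not claimed.

def classify_eye_closure(intensity, wink_level=0.7, blink_level=0.4):
    """Classify one reflected-light intensity reading.

    Assumption: wrinkled skin (wink) reflects more light than smooth
    skin (blink), which in turn reflects more than the open-eye baseline.
    """
    if intensity >= wink_level:
        return "wink"
    if intensity >= blink_level:
        return "blink"
    return "open"
```

A real device would calibrate these levels per wearer (compare claim 5's calibration step) rather than rely on fixed constants.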
19. A non-transitory computer-readable medium having stored thereon program instructions that, upon execution by at least one processor, cause the at least one processor to perform functions comprising: receiving data from a receiver portion of a proximity sensor, wherein the receiver portion is disposed at a side section of a head-mountable device (HMD), and wherein, when a wearer wears the HMD, (i) the receiver portion is arranged to receive light reflected from an eye area of the wearer, wherein the eye area includes a skin area to a side of an eye that wrinkles during a wink, (ii) the proximity sensor detects a movement of the eye area, and (iii) the data represents the movement; determining that the data corresponds to a wink gesture, wherein the determining step comprises: determining a light intensity level at the receiver portion; and determining that the light intensity level corresponds, at least in part, to wrinkles in the skin area that are characteristic of a wink; determining a gaze direction, based on the data; selecting a computing action to perform, based on one or both of the wink gesture and the gaze direction; and performing the computing action.

20. The non-transitory computer-readable medium of claim 19, wherein the determining the gaze direction comprises: determining a first light intensity level and a second light intensity level, based on the data, wherein the first light intensity level corresponds to an eye-open state, and wherein the second light intensity level corresponds to an eye-closed state; determining whether a difference between the first light intensity level and the second light intensity level exceeds a threshold; and based on a determination that the difference between the first light intensity level and the second light intensity level exceeds the threshold, determining that the gaze direction is an intermediate vertical direction.

21. The non-transitory computer-readable medium of claim 19, wherein the determining the gaze direction comprises: determining a first light intensity level and a second light intensity level, based on the data, wherein the first light intensity level corresponds to an eye-open state, and wherein the second light intensity level corresponds to an eye-closed state; determining whether a difference between the first light intensity level and the second light intensity level does not exceed a threshold; and based on a determination that the difference between the first light intensity level and the second light intensity level does not exceed the threshold, determining that the gaze direction is either an upward direction or a downward direction.

22. The non-transitory computer-readable medium of claim 19, wherein the functions further comprise determining whether the gaze direction corresponds to at least one of a predetermined set of directions, wherein the selecting the computing action to perform is further based on a determination that the gaze direction corresponds to the at least one of the predetermined set of directions.

23. The non-transitory computer-readable medium of claim 19, wherein the functions further comprise causing infrared light having a predetermined characteristic to be provided to the eye area, wherein the data indicates the predetermined characteristic.
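Claims 8, 17, and 23 have the emitter provide infrared light with a predetermined characteristic that the received data must indicate — a common way to reject ambient light in proximity sensing. One plausible reading, sketched here under the assumption that the characteristic is a modulation frequency, correlates the received samples against a reference tone; the correlation form and the `min_score` cutoff are assumptions for illustration, not claimed details.

```python
import math

# Sketch: verify that received samples carry an assumed predetermined
# modulation frequency, rejecting unmodulated ambient light.

def carries_modulation(samples, sample_rate, mod_freq, min_score=0.2):
    """Return True if samples show a tone near mod_freq (in Hz).

    Correlates the mean-removed samples against quadrature reference
    sinusoids and recovers the tone amplitude; min_score is an assumed
    detection cutoff, not a value from the disclosure.
    """
    n = len(samples)
    mean = sum(samples) / n
    i_sum = sum((s - mean) * math.cos(2 * math.pi * mod_freq * k / sample_rate)
                for k, s in enumerate(samples))
    q_sum = sum((s - mean) * math.sin(2 * math.pi * mod_freq * k / sample_rate)
                for k, s in enumerate(samples))
    amplitude = 2 * math.hypot(i_sum, q_sum) / n  # recovered tone amplitude
    return amplitude >= min_score
```

With a 10 Hz tone of amplitude 0.4 sampled at 1 kHz, the recovered amplitude is about 0.4 and the check passes; a constant (ambient-only) signal recovers zero and fails.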
Patents Cited in This Patent (36)
Kaufman, Arie A. (Plainview, NY); Bandopadhay, Amit (Smithtown, NY); Piligian, George J. (Englewood Cliffs, NJ), "Apparatus and method for eye tracking interface."
Terunuma, Hiroshi (Yachiyoshi, JPX); Wakabayashi, Hiroshi (Yokohama, JPX); Tsukahara, Daiki (Hiratsuka, JPX), "Camera which operates a shutter according to a photographer's wink and the vibration level."