Biosensors, communicators, and controllers monitoring eye movement and methods for using them
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition): A61B-013/00; A61B-005/103; A61B-005/117; A61B-003/00; A61B-001/00
Application number: US-0687125 (2010-01-13)
Registration number: US-10039445 (2018-08-07)
Inventor / Address: Torch, William C.
Applicant / Address: Google LLC
Citation information: times cited: 0; cited patents: 161

Abstract
Biosensor, communicator, and/or controller apparatus, systems, and methods are provided for monitoring movement of a person's eye. The apparatus includes a device configured to be worn on a user's head, a light source for directing light towards one or both eyes of the user, one or more image guides on the device for viewing one or both eyes of the user, and one or more cameras carried on the device and coupled to the image guides for acquiring images of the eyes and/or the user's surroundings. The apparatus may include a cable and/or a transmitter for transmitting image data from the camera to a remote location, e.g., to processor and/or display for analyzing and/or displaying the image data. A system including the apparatus may be used to monitor one or more oculometric parameters, e.g., pupillary response, and/or to control a computer using the user's eyes instead of a mouse.
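The abstract describes controlling a computer with the user's eyes instead of a mouse: pupil positions measured against a projected reference frame are mapped onto display coordinates. A minimal sketch of such a mapping, assuming a simple linear model calibrated from the pupil positions seen at opposite display corners; all names and the linear model are illustrative assumptions, not the patent's implementation.

```python
def map_gaze_to_pointer(pupil_xy, eye_bounds, display_size):
    """Map a pupil position to a display pointer position.

    pupil_xy     -- (x, y) pupil centre in the camera's reference frame
    eye_bounds   -- ((x_min, y_min), (x_max, y_max)) pupil positions seen
                    when the user looked at opposite corners of the display
    display_size -- (width, height) of the display in pixels
    """
    (x_min, y_min), (x_max, y_max) = eye_bounds
    w, h = display_size
    # Normalise the pupil position within the calibrated range.
    nx = (pupil_xy[0] - x_min) / (x_max - x_min)
    ny = (pupil_xy[1] - y_min) / (y_max - y_min)
    # Clamp so the pointer never leaves the display.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return (round(nx * (w - 1)), round(ny * (h - 1)))
```

Under this model the pointer moves in the same direction as the eye, as the claims below require, because the mapping is monotonic in both axes.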
Representative Claims
1. A system for controlling a computer device, comprising:
a device configured to be worn on a user's head;
a plurality of light sources mounted on the device at locations and configured to direct light towards a first eye of the user when the device is worn, the light sources projecting a reference frame onto the first eye of the user when the device is worn, the projected reference frame comprising a graphic having at least one horizontal component that intersects at least one vertical component in an x-y coordinate system;
a camera comprising an objective lens mounted to the device at a location to avoid interference with the user's vision, the camera mounted on the device such that an imaging axis of the camera is oriented towards the first eye of the user wearing the device for monitoring movement of the first eye with respect to the projected reference frame, the camera configured for generating signals representing video images of the first eye acquired by the camera;
a display mounted to the device such that the display is in a field of view of the first eye of the user wearing the device but not in front of the first eye of the user wearing the device; and
a processor coupled to the display and to the camera and configured to: assign the x-y coordinate system to the projected reference frame; analyze the signals generated by the camera to identify edges of a pupil of the first eye by superimposing the graphic of the reference frame on the video images of the first eye; process the signals to determine the pupil's location relative to the projected reference frame; and process the signals to monitor movement of the first eye relative to the projected reference frame to determine a gaze location of the user's pupil relative to a display reference frame associated with the display to create a pointer presented on the display tracking a gaze of the user relative to the display, the processor causing the pointer to move on the display to follow movement of the user's pupil based upon the movement of the user's pupil relative to the projected reference frame, wherein the processor is configured to analyze the signals from the camera to determine a direction in which the first eye is moving relative to the projected reference frame and to direct the pointer on the display relative to the display reference frame in the same direction in which the first eye is moving.

2. The system of claim 1, wherein the processor is configured to determine a base location of the first eye on an active area of the camera when the first eye looks at the pointer, and wherein the movement of the first eye is correlated by determining a change in location of the first eye on the active area relative to the base location to cause the pointer to move in response to the change in location.

3. The system of claim 1, wherein the processor is configured to correlate movement of the first eye relative to the pointer on the display by moving the pointer on the display until the user stops moving the first eye, whereupon the processor stops the pointer once the pointer arrives at the location where the user is currently looking on the display.

4. The system of claim 1, wherein the processor is configured to relate movement of the pointer to determine that the user wearing the device is looking at a specific location on the display, the processor further configured to monitor the signals to detect video images indicating that the user wearing the device has blinked in a predetermined sequence while looking at the specific location to request a command be executed, the processor coupled to a computing device and configured for executing the command on the computing device based upon detecting the video images.

5. The system of claim 1, wherein the processor is configured for displaying an image at a base location on the display, the processor further configured for monitoring the signals to detect video images indicating that the user wearing the device is looking at the base location to provide the display reference frame for subsequent movement of the first eye relative to the base location.

6. The system of claim 1, wherein the processor is further configured to monitor the first eye for a predetermined sequence of eye movement, providing a signal to activate a command identified by the pointer on the display.

7. The system of claim 1, wherein the processor is further configured to establish the display reference frame by having the user look sequentially at two or more identified locations on the display, thereby providing a scale for relative movement of the first eye.

8. The system of claim 7, wherein having the user look sequentially at two or more identified locations on the display comprises having the user look at opposite corners of the display to identify limits of appropriate eye movement relative to the display.

9. The system of claim 1, wherein the device configured to be worn on the user's head comprises a frame including a nose bridge and ear supports.

10. The system of claim 9, wherein the display is a heads-up display secured to the frame such that the display is disposed at a predetermined distance relative to the first eye.

11. The system of claim 1, wherein the camera is mounted on the device at a location such that the imaging axis of the camera is offset from an eye-gaze axis of the first eye, the eye-gaze axis corresponding to a direction in which the user wearing the device looks when looking straight ahead.

12. The system of claim 11, wherein the plurality of light sources are mounted on the device at locations offset from the eye-gaze axis.

13. A system for controlling a computer device, comprising:
a device configured to be worn on a user's head;
one or more light sources mounted on the device at locations and configured to direct light towards a first eye of the user when the device is worn, the light sources configured to project a reference frame onto the first eye of the user when the device is worn, the projected reference frame comprising a graphic having at least one horizontal component that intersects at least one vertical component in an x-y coordinate system;
a camera mounted on the device at a location to avoid interference with the user's vision, the camera mounted such that an imaging axis of the camera is oriented directly towards the first eye of the user for monitoring movement of the first eye with respect to the projected reference frame, the camera configured for generating signals representing video images of the first eye acquired by the camera;
a display mounted to the device such that a pointer on the display is viewable by the user wearing the device; and
a processor coupled to the display and to the camera, the processor configured to: assign the x-y coordinate system to the projected reference frame; analyze the signals generated by the camera to identify a pupil of the first eye by superimposing the graphic of the reference frame on the video images of the first eye; process the signals to monitor movement of the pupil of the first eye relative to the projected reference frame to determine a gaze location of the first eye relative to a display reference frame associated with the display; and cause the pointer to move on the display to the gaze location on the display to cause the pointer to move on the display to follow movement of the first eye.

14. The system of claim 13, wherein the processor is configured to analyze the signals from the camera to determine a direction in which the first eye is moving and to direct the pointer in the same direction in which the first eye is moving on the display.

15. The system of claim 13, wherein the camera is mounted on the device at a location such that the imaging axis of the camera is offset from an eye-gaze axis of the first eye, the eye-gaze axis corresponding to a direction in which the user wearing the device looks when looking straight ahead.

16. The system of claim 15, wherein the one or more light sources comprise a plurality of light sources mounted on the device at locations offset from the eye-gaze axis.

17. The system of claim 13, wherein the camera comprises an image guide including a first end mounted on the device such that the first end is oriented directly towards the first eye of the user wearing the device and defines the imaging axis, and a second end coupled to an imaging device.

18. The system of claim 17, wherein the imaging device comprises a CMOS or CCD detector.

19. The system of claim 17, further comprising an objective lens on the first end to focus the image guide towards the first eye.

20. The system of claim 17, wherein the second end is also coupled to a light source of the one or more light sources.

21. The system of claim 13, wherein the processor is further configured to monitor the first eye for a predetermined sequence of eye movement, providing a signal to activate a command identified by the pointer on the display.

22. The system of claim 13, further comprising: a filter mounted over the camera, the filter removing visible light or ultraviolet light from reaching a detector of the camera to facilitate capturing images within a desired bandwidth of light.

23. The system of claim 13, wherein: one of the one or more light sources is mounted on the device such that the light source contacts skin of the user and transcutaneously transmits light into the eye of the user and out of the pupil; the camera is further configured for capturing light exiting out of the pupil; and the processor is further configured to identify the pupil by the exiting light captured by the camera in the video images of the first eye.

24. A system for controlling a computer device, comprising:
a frame comprising ear supports carrying ear pieces configured to be worn on a user's head and a nose bridge extending between the ear supports;
one or more light sources mounted on one or both of the ear supports and the nose bridge and oriented towards a first eye of the user wearing the frame and configured to direct light towards the first eye and project a reference frame onto the first eye of the user wearing the frame, the projected reference frame comprising a graphic having at least one horizontal component that intersects at least one vertical component in an x-y coordinate system;
a camera comprising an objective lens mounted on one of the ear supports and the nose bridge, the camera and objective lens mounted by a lockable swivel mount allowing the objective lens to rotate about a pivot axis such that an imaging axis of the camera is oriented directly towards the first eye of the user wearing the frame for monitoring movement of the first eye with respect to the projected reference frame, the camera configured for generating signals representing video images of the first eye acquired by the camera;
a display mounted to at least one of the ear supports and the nose bridge such that the display is viewable by the first eye of the user wearing the frame; and
a processor coupled to the display and to the camera for processing the signals to monitor movement of the first eye relative to the projected reference frame to determine a gaze location of the first eye of the user relative to a display reference frame to create a pointer tracking a gaze of the user relative to the display, the processor causing the pointer to move and follow movement of the first eye based upon the movement of the first eye of the user and the gaze location relative to the display reference frame, and wherein the processor is configured to: assign the x-y coordinate system to the projected reference frame; analyze the signals generated by the camera to identify a pupil of the first eye by superimposing the graphic of the reference frame on the video images of the first eye; and process the signals to determine the gaze location of the pupil of the first eye relative to the projected reference frame and the gaze location relative to the display reference frame based on the superimposed graphic.

25. The system of claim 24, wherein the processor is configured to analyze the signals from the camera to determine a direction in which the first eye is moving and the direction the gaze location is moving relative to the display and to direct the pointer on the display in the same direction in which the first eye is moving.

26. The system of claim 24, wherein the processor is configured to determine a base location of the first eye on an active area of the camera when the first eye looks at the pointer, and wherein the movement of the first eye is correlated by determining a change in location of the first eye on the active area relative to the base location to cause the pointer to move in response to the change in location.

27. The system of claim 24, wherein the processor is configured to correlate movement of the first eye relative to the pointer on the display by moving the pointer on the display until the user stops moving the first eye, whereupon the processor stops the pointer once the pointer arrives at the location on the display where the user is currently looking.

28. The system of claim 24, wherein the processor is configured to relate movement of the pointer to determine that the user wearing the frame is looking at a specific location on the display, the processor further configured to monitor the signals to detect video images indicating that the user wearing the frame has blinked in a predetermined sequence while looking at the specific location to request a command be executed, the processor coupled to a computing device and configured for executing the command on the computing device based upon detecting the video images.

29. The system of claim 24, wherein the processor is configured for displaying an image at a base location on the display, the processor further configured for monitoring the signals to detect video images indicating that the user wearing the frame is looking at the base location to provide the display reference frame for subsequent movement of the first eye relative to the base location.

30. The system of claim 24, wherein the processor is further configured to monitor the first eye for a predetermined sequence of eye movement for providing a signal to activate a command identified by the pointer on the display.

31. The system of claim 24, wherein the processor is further configured to establish the display reference frame by having the user look sequentially at two or more identified locations on the display, thereby providing a scale for relative movement of the first eye.

32. The system of claim 31, wherein having the user look sequentially at two or more identified locations on the display comprises having the user look at opposite corners of the display to identify limits of appropriate eye movement relative to the display.

33. The system of claim 24, wherein the display is a heads-up display secured to the frame such that the display is disposed at a predetermined distance in front of the first eye.

34. The system of claim 24, wherein the camera is mounted on one of the ear supports and the nose bridge at a location such that the imaging axis of the camera is offset from an eye-gaze axis of the first eye, the eye-gaze axis corresponding to a direction in which the user wearing the frame looks when looking straight ahead.

35. The system of claim 34, wherein the one or more light sources comprise a plurality of light sources mounted on one or both of the ear supports and the nose bridge at locations offset from the eye-gaze axis.
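Claims 4 and 28 describe executing a command when the user blinks in a predetermined sequence while gazing at one display location. A hedged sketch of that trigger logic, assuming blink events arrive as timestamps and gaze samples as (x, y) display coordinates; the function name, thresholds, and event representation are invented for illustration, not taken from the patent.

```python
def blink_command_detected(blink_times, gaze_points, target_xy,
                           n_blinks=3, max_interval=0.5, gaze_radius=40.0):
    """Return True if the last `n_blinks` blinks were rapid (each within
    `max_interval` seconds of the previous one) and every gaze sample
    stayed within `gaze_radius` pixels of `target_xy`."""
    if len(blink_times) < n_blinks:
        return False
    recent = blink_times[-n_blinks:]
    # The blink sequence must be tightly spaced in time.
    if any(b - a > max_interval for a, b in zip(recent, recent[1:])):
        return False
    # The gaze must dwell on the target throughout the sequence.
    tx, ty = target_xy
    return all((x - tx) ** 2 + (y - ty) ** 2 <= gaze_radius ** 2
               for x, y in gaze_points)
```

A host application would poll this predicate per video frame and, when it fires, execute the command identified by the pointer's current location, mirroring the claim's "blink in a predetermined sequence while looking at the specific location" condition.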
Patents cited by this patent (161)
Thorn, Ola Karl; Lessing, Simon, Adjusting display brightness and/or refresh rates based on eye tracking.
Kaufman Arie A. (Plainview NY) Bandopadhay Amit (Smithtown NY) Piligian George J. (Englewood Cliffs NJ), Apparatus and method for eye tracking interface.
Bjorklund, Christoffer; Eskilsson, Henrik; Jacobson, Magnus; Skogo, Marten, Arrangement, method and computer program for controlling a computer apparatus based on eye-tracking.
Mahaffey, Robert B.; Weiss, Lawrence F.; Schwerdtfeger, Richard S.; Kjeldsen, Frederik C., Computer system providing hands free user input via optical means for navigation or zooming.
Nobuo Fukushima JP; Tomotaka Muramoto JP; Masayoshi Sekine JP, Display apparatus which detects an observer body part motion in correspondence to a displayed element used to input operation instructions to start a process.
Spitzer Mark B. (Sharon MA) Jacobsen Jeffrey (Hollister CA), Eye tracking system having an array of photodetectors aligned respectively with an array of pixels.
Beard Terry D. (1407 N. View Dr. Westlake Village CA 91362), Low differential 3-D viewer glasses and method with spectral transmission properties to control relative intensities.
Salganicoff Marcus ; Hanna Keith James, Method and apparatus for illuminating and imaging eyes through eyeglasses using multiple sources of illumination.
Camus Theodore A. ; Salganicoff Marcus ; Chmielewski ; Jr. Thomas A. ; Hanna Keith James, Method and apparatus for removal of bright or dark spots by the fusion of multiple images.
Dryer,D. Christopher; Flickner,Myron Dale; Mao,Jianchang, Method and system for real-time determination of a subject's interest level to media content.
Flickner, Myron Dale; Koons, David Bruce; Lu, Qi; Maglio, Paul Philip; Morimoto, Carlos Hitoshi; Selker, Edwin Joseph, Method and system for relevance feedback through gaze tracking and ticker interfaces.
Moore-Ede, Martin C.; Trutschel, Udo E.; Guttkuhn, Rainer; Heitmann, Anneke M., Method of and apparatus for evaluation and mitigation of microsleep events.
Gevins Alan S. (532 Waller St. San Francisco CA 94117), Neurocognitive adaptive computer interface method and system based on on-line measurement of the user's mental effort.
Velez Jose (Newton MA) Borah Joshua D. (Mansfield MA), Visor and camera providing a parallax-free field-of-view image for a head-mounted eye movement measurement system.