Arrangement, method and computer program for controlling a computer apparatus based on eye-tracking
Country / Type: United States (US) Patent — Granted
IPC (7th edition): G06F-003/033; A61B-003/14; G09G-005/08
Application No.: US-0570840 (2005-05-24)
Registration No.: US-8185845 (2012-05-22)
Priority: EP-04445071 (2004-06-18)
International Application No.: PCT/SE2005/000775 (2005-05-24)
§371/§102 date: 2006-12-18
International Publication No.: WO2005/124521 (2005-12-29)
Inventors: Bjorklund, Christoffer; Eskilsson, Henrik; Jacobson, Magnus; Skogo, Marten
Applicant: Tobii Technology AB
Agent: Birch, Stewart, Kolasch & Birch, LLP
Citations: cited by 51 patents; cites 8 patents
Abstract
A computer based eye-tracking solution is disclosed. A computer apparatus is associated with one or more graphical displays (GUI components) that may be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine is adapted to produce a set of non-cursor controlling event output signals, which influence the GUI-components. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the proposed event engine receives a control signal request from each of the GUI-components. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the GUI-components in accordance with each respective control signal request.
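The request/deliver mechanism in the abstract resembles a publish–subscribe pattern: each GUI component registers a control signal request naming the sub-set of non-cursor event signals it needs, and the event engine produces and delivers only those. A minimal sketch of that architecture follows; the event names, class names, and the point-of-regard payload are illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Dict, List, Set, Tuple


class OcularEvent(Enum):
    """Illustrative non-cursor controlling event signals (names assumed)."""
    FIXATION = auto()
    GAZE_PATTERN = auto()
    ATTENTION_LEVEL = auto()


@dataclass
class GuiComponent:
    """A GUI component that requests only the signal sub-set it needs."""
    name: str
    requested: Set[OcularEvent]  # the component's control signal request
    received: List[Tuple[OcularEvent, tuple]] = field(default_factory=list)

    def control_signal_request(self) -> Set[OcularEvent]:
        return self.requested

    def on_event(self, event: OcularEvent, point_of_regard: tuple) -> None:
        self.received.append((event, point_of_regard))


class EventEngine:
    """Produces and delivers only the event signals actually requested."""

    def __init__(self) -> None:
        self.subscriptions: Dict[OcularEvent, List[GuiComponent]] = {}

    def register(self, component: GuiComponent) -> None:
        # Initially, the engine receives a control signal request from
        # each GUI component and records its requested sub-set.
        for event in component.control_signal_request():
            self.subscriptions.setdefault(event, []).append(component)

    def process(self, point_of_regard: tuple) -> None:
        # Produce only the requested signals from the eye-tracking data,
        # then deliver each signal per the control signal requests.
        for event, components in self.subscriptions.items():
            for component in components:
                component.on_event(event, point_of_regard)
```

In use, a dwell-activated button would request only `FIXATION` while a reading-sensitive ticker requests only `GAZE_PATTERN`; after `engine.process(sample)`, each component receives exactly its requested sub-set and nothing else.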
Representative Claims
1. An arrangement for controlling a computer apparatus associated with a graphical display, the display presenting a representation of a plurality of GUI-components that are configured to be manipulated based on user-generated commands, the arrangement comprising: a display; an event engine configured to receive an eye-tracking data signal describing a user's point of regard on the display, and at least based on the eye-tracking data signal, configured to produce a set of non-cursor controlling event output signals influencing the GUI-components, each of the non-cursor controlling event output signals describing a different aspect of the user's ocular activity in respect of the display, wherein the event engine is configured to: receive a respective control signal request from each of the GUI-components, each respective control signal request defining a sub-set of the set of non-cursor controlling event output signals which is required by a corresponding one of the GUI-components to operate as intended, produce the set of non-cursor controlling event output signals which is actually requested by the respective control signal request from each of the GUI-components, and deliver the set of non-cursor controlling event output signals to the GUI-components in accordance with each respective control signal request.

2. The arrangement according to claim 1, wherein the computer apparatus is configured to receive a cursor control signal, and control a graphical pointer on the display in response to the cursor control signal.

3. The arrangement according to claim 1, wherein at least one of the GUI-components is configured to generate at least one respective output control signal upon a user manipulation of the component.

4. The arrangement according to claim 1, wherein the event engine is configured to produce at least a first signal of the non-cursor controlling event output signals based on a dynamic development of the eye-tracking data signal.

5. The arrangement according to claim 4, wherein the first signal represents a particular gaze pattern over the display.

6. The arrangement according to claim 5, wherein at least one of the GUI-components is configured to interpret the first signal as an estimated intention of the user, and trigger a user manipulation of the component in response to the estimated intention.

7. The arrangement according to claim 5, wherein at least one of the GUI-components is configured to interpret the first signal as an estimated attention level of the user, and trigger a user manipulation of the component in response to the estimated attention level.

8. The arrangement according to claim 5, wherein at least one of the GUI-components is configured to interpret the first signal as a state-of-mind parameter of the user, and trigger a user manipulation of the component in response to the state-of-mind parameter.

9. The arrangement according to claim 1, wherein the event engine is configured to receive at least one auxiliary input signal and produce the set of non-cursor controlling event output signals on the further basis of the at least one auxiliary input signal.

10. The arrangement according to claim 1, wherein the at least one auxiliary input signal is originated from at least one of a button, a switch, a speech signal, a movement pattern of an input member, a gesture pattern, a facial expression and an EEG-signal.

11. A method of controlling a computer apparatus associated with a graphical display, the display representing a plurality of GUI-components that are configured to be manipulated based on user-generated commands, the method comprising: receiving at an event engine an eye-tracking data signal which describes a user's point of regard on the display, receiving at the event engine a respective control signal request from each of the GUI-components, the respective control signal request defining a sub-set of a set of non-cursor controlling event output signals, the sub-set of the set of non-cursor controlling event output signals being required by a corresponding one of the GUI-components to operate as intended, producing the set of non-cursor controlling event output signals at least based on the eye-tracking data signal, the set of non-cursor controlling event output signals influencing the GUI-components, each of the non-cursor controlling event output signals describing a different aspect of the user's ocular activity in respect of the display, the set of non-cursor controlling event output signals being actually requested by the respective control signal request from each of the GUI-components, and delivering from the event engine the set of non-cursor controlling event output signals to the GUI-components in accordance with each respective control signal request.

12. The method according to claim 11, further comprising receiving a cursor control signal, and controlling a graphical pointer on the display in response to the cursor control signal.

13. The method according to claim 11, further comprising generating at least one output control signal by at least one of the GUI-components upon a user manipulation of the at least one of the GUI-components.

14. The method according to claim 11, further comprising producing at least a first signal of the non-cursor controlling event output signals based on a dynamic development of the eye-tracking data signal.

15. The method according to claim 14, wherein the first signal represents a particular gaze pattern over the display.

16. The method according to claim 15, further comprising interpreting the first signal by at least one of the GUI-components as an estimated intention of the user, and triggering a user manipulation of the at least one of the GUI-components in response to the estimated intention.

17. The method according to claim 15, further comprising interpreting the first signal by at least one of the GUI-components as an estimated attention level of the user, and triggering a user manipulation of the at least one of the GUI-components in response to the estimated attention level.

18. The method according to claim 15, further comprising interpreting the first signal by at least one of the GUI-components as a state-of-mind parameter of the user, and triggering a user manipulation of the at least one of the GUI-components in response to the state-of-mind parameter.

19. The method according to claim 11, further comprising: receiving at least one auxiliary input signal, and producing the set of non-cursor controlling event output signals on the further basis of the at least one auxiliary input signal.

20. The method according to claim 11, wherein the at least one auxiliary input signal is originated from at least one of a button, a switch, a speech signal, a movement pattern of an input member, a gesture pattern, a facial expression and an EEG-signal.

21. A computer program embodied into the internal memory of a computer, comprising software for executing the method of claim 11 when said program is run on the computer.

22. A non-transitory computer readable medium, having a program recorded thereon for controlling a computer apparatus associated with a graphical display, the display representing a plurality of GUI-components that are to be manipulated based on user-generated commands, where the program operates as an event engine where the program includes computer readable program code portions stored therein, the computer readable program code portions comprising: a first program code portion configured to receive an eye-tracking data signal which describes a user's point of regard on the display; a second program code portion configured to receive a respective control signal request from each of the GUI-components, the respective control signal request defining a sub-set of a set of non-cursor controlling event output signals, the sub-set of the set of non-cursor controlling event output signals being required by a corresponding one of the GUI-components to operate as intended; a third program code portion configured to produce the set of non-cursor controlling event output signals at least based on the eye-tracking data signal, the set of non-cursor controlling event output signals influencing the GUI-components, each of the non-cursor controlling event output signals describing a different aspect of the user's ocular activity in respect of the display, the set of non-cursor controlling event output signals being actually requested by the respective control signal request from each of the GUI-components; and a fourth program code portion configured to deliver the set of non-cursor controlling event output signals to the GUI-components in accordance with each respective control signal request.
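Claims 4–8 describe deriving a "first signal" from the dynamic development of the gaze data (e.g., a particular gaze pattern) and letting a component interpret it as an estimated user intention that triggers a manipulation. The sketch below illustrates one such interpretation; the dwell-time encoding, the class name, and the threshold value are assumptions for illustration only, not details from the patent.

```python
class DwellButton:
    """Hypothetical GUI component: interprets a gaze-pattern signal as an
    estimated user intention and triggers its own manipulation (claims 5-6)."""

    DWELL_THRESHOLD_MS = 800  # assumed activation threshold, not from the patent

    def __init__(self) -> None:
        self.activated = False

    def interpret(self, dwell_ms: float) -> bool:
        # Treat a sufficiently long dwell over the component as an
        # estimated intention to activate, and trigger the manipulation.
        if dwell_ms >= self.DWELL_THRESHOLD_MS:
            self.activated = True
        return self.activated
```

A brief glance leaves the component untouched, while a sustained dwell crosses the threshold and activates it, which is the "trigger a user manipulation in response to the estimated intention" step of claim 6.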
Patents cited by this patent (8)
Kaufman Arie A. (Plainview NY) Bandopadhay Amit (Smithtown NY) Piligian George J. (Englewood Cliffs NJ), Apparatus and method for eye tracking interface.
Flickner, Myron Dale; Koons, David Bruce; Lu, Qi; Maglio, Paul Philip; Morimoto, Carlos Hitoshi; Selker, Edwin Joseph, Method and system for relevance feedback through gaze tracking and ticker interfaces.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Cella, Charles; Nortrup, Robert J.; Nortrup, Edward H., AR glasses with event and sensor triggered AR eyepiece interface to external devices.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Cella, Charles; Nortrup, Robert J.; Nortrup, Edward H., AR glasses with event and sensor triggered control of AR eyepiece applications.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Cella, Charles; Nortrup, Robert J.; Nortrup, Edward H., AR glasses with event and user action control of external applications.
Boss, Gregory J.; Hamilton, II, Rick A.; Cruz Huertas, Luis Carlos; Zamora Duran, Edgar Adolfo, Adaptive, automatically-reconfigurable, vehicle instrument display.
Boss, Gregory J.; Hamilton, II, Rick A.; Huertas, Luis Carlos Cruz; Duran, Edgar Adolfo Zamora, Adaptive, automatically-reconfigurable, vehicle instrument display.
Björklund, Christoffer; Eskilsson, Henrik; Jacobson, Magnus; Skogö, Mårten, Arrangement, method and computer program for controlling a computer apparatus based on eye-tracking.
Lerner, David M.; Smith, Brandon J.; Ducrou, Jon Robert; Miller, Erik J.; Barry, Marcus A.; Sanders, II, Kenneth O., Auditory enhancement using word analysis.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Border, John N.; Miller, Gregory D.; Stovall, Ross W., Eyepiece with uniformly illuminated reflective display.
Miller, Gregory D.; Border, John N.; Osterhout, Ralph F., Grating in a light transmissive illumination system for see-through near-eye display glasses.
Strombom, Johan; Skogo, Marten; Nystedt, Per; Gustafsson, Simon; Elvesjo, John Mikael Holtz; Blixt, Peter, Method for displaying gaze point data based on an eye-tracking unit.
Miller, Gregory D.; Border, John N.; Osterhout, Ralph F., Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses.
Border, John N.; Bietry, Joseph; Osterhout, Ralph F., See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film.
Border, John N.; Haddick, John D.; Osterhout, Ralph F., See-through near-eye display glasses including a partially reflective, partially transmitting optical element.
Border, John N.; Osterhout, Ralph F., See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment.
Border, John N.; Bietry, Joseph; Osterhout, Ralph F., See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film.
Border, John N.; Osterhout, Ralph F., See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear.
Border, John N.; Haddick, John D.; Osterhout, Ralph F., See-through near-eye display glasses with a light transmissive wedge shaped illumination system.
Border, John N.; Haddick, John D.; Lohse, Robert Michael; Osterhout, Ralph F., See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light.
Publicover, Nelson George; Marggraff, Lewis James; Drake, Eliot Francis; Connaughton, Spencer James, Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects.
Publicover, Nelson G.; Torch, William C.; Amayeh, Gholamreza; Leblanc, David, Systems and methods for identifying gaze tracking scene reference locations.