Gaze-based touchdown point selection system and method
IPC Classification Information
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
B64F-001/18
G05D-001/12
G06F-003/033
G09G-005/08
Application Number
US-0468605 (2009-05-19)
Registration Number
US-8350726 (2013-01-08)
Inventors / Address
Mathan, Santosh
Ververs, Patricia May
Applicant / Address
Honeywell International Inc.
Attorney / Address
Ingrassia Fisher & Lorenz, P.C.
Citation Information
Cited-by count: 2
Cited patents: 8
Abstract
Methods and apparatus are provided for selecting a touchdown point for a vertical takeoff and landing aircraft. The eye movements of a user are tracked relative to an image being rendered on a display screen. An updated touchdown point location is determined from the tracked eye movements, and an updated touchdown point is rendered at the updated touchdown point location on the display screen.
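The method in the abstract can be illustrated with a minimal sketch. This is not the patent's implementation; the gaze-sample format, the averaging step, and the `fine_tune` helper are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class TouchdownPoint:
    """A candidate touchdown point in display-screen coordinates (hypothetical representation)."""
    x: float  # screen column, pixels
    y: float  # screen row, pixels


def update_touchdown_point(gaze_samples):
    """Derive an updated touchdown point location from tracked eye movements.

    Here the eye-tracker output is assumed to be a list of (x, y) fixation
    samples already projected onto the display screen; a simple average
    stands in for whatever estimator the patented system uses.
    """
    if not gaze_samples:
        raise ValueError("no gaze data available")
    xs = [s[0] for s in gaze_samples]
    ys = [s[1] for s in gaze_samples]
    return TouchdownPoint(sum(xs) / len(xs), sum(ys) / len(ys))


def fine_tune(point, dx, dy):
    """Apply fine-tuning offsets from a user interface (e.g. a control on the
    cyclic or collective, per the claims) to produce a further updated point."""
    return TouchdownPoint(point.x + dx, point.y + dy)
```

In this sketch, the gaze-derived point would be rendered on the display, and the fine-tuned point rendered alongside it until the pilot confirms the selection.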
Representative Claims
1. A method of selecting a touchdown point for a vertical takeoff and landing aircraft, comprising the steps of: measuring rotational movement of one or both eyes of a user with respect to the user's head and relative to an image being rendered on a display screen; determining an updated touchdown point location from the measured rotational movement; rendering an updated touchdown point at the updated touchdown point location on the display screen; receiving input from a user interface; supplying fine-tuning touchdown point location data in response to the input from the user interface; determining a further updated touchdown point location in response to the input from the user interface; and rendering, simultaneously with the updated touchdown point, a further updated touchdown point at the further updated touchdown point location on the display device.

2. The method of claim 1, further comprising the steps of: rendering an initial touchdown point on the display screen in accordance with a first paradigm; and rendering the updated touchdown point on the display screen in accordance with a second paradigm that is different than the first paradigm.

3. The method of claim 2, further comprising the steps of: receiving input from a user interface; and in response to the input from the user interface: (i) no longer rendering the initial touchdown point on the display screen, and (ii) rendering the updated touchdown point on the display screen in accordance with the first paradigm.

4. The method of claim 1, further comprising the step of: automatically navigating the VTOL aircraft to the updated touchdown point.

5. The method of claim 1, further comprising the steps of: tracking head movements of the user; and determining the updated touchdown point location from the tracked eye movements and the tracked head movements.

6. The method of claim 1, further comprising the steps of: receiving terrain data from a terrain data source; and rendering an initial touchdown point on the display device based on the terrain data received from the terrain data source.

7. The method of claim 6, further comprising the steps of: rendering the initial touchdown point on the display device in accordance with a first paradigm; and rendering the updated touchdown point on the display device in accordance with a second paradigm that is different than the first paradigm.

8. A vertical takeoff and landing (VTOL) aircraft touchdown point selection and display system, comprising: a display device coupled to receive image rendering display commands and configured, upon receipt of the image rendering display commands, to render an image; an eye tracker configured to (i) measure rotational movement of one or both eyes of a user with respect to the user's head and relative to the image being rendered by the display device and (ii) supply eye movement data; a user interface (116) in operable communication with the processor, the user interface adapted to receive user input and configured, upon receipt of the user input, to supply fine-tuning touchdown point location data; and a processor in operable communication with the display device and the eye tracker, the processor configured, upon receipt of the eye movement data, to (i) determine an updated touchdown point location and (ii) supply image rendering display commands to the display device that cause the display device to render an updated touchdown point at the updated touchdown point location, and further configured, upon receipt of the fine-tuning touchdown point location data, to (i) determine a further updated touchdown point location and (ii) supply image rendering display commands to the display device that cause the display device to render, simultaneously with the updated touchdown point, a further updated touchdown point at the further updated touchdown point location on the display device.

9. The system of claim 8, wherein the processor is further configured to: supply image rendering display commands to the display device that cause the display device to render an initial touchdown point in accordance with a first paradigm; and supply image rendering display commands to the display device that cause the display device to render the updated touchdown point in accordance with a second paradigm that is different than the first paradigm.

10. The system of claim 9, wherein: the user interface is further configured to selectively supply user interface data, and the processor is further coupled to receive the user interface data and is further configured, in response to the user interface data, to supply image rendering display commands to the display device that cause the display device to (i) no longer render the initial touchdown point and (ii) render the updated touchdown point in accordance with the first paradigm.

11. The system of claim 10, further comprising: a VTOL aircraft control device, wherein the user interface is coupled to the VTOL aircraft control device.

12. The system of claim 11, wherein the VTOL aircraft control device is selected from the group consisting of a cyclic and a collective.

13. The system of claim 8, further comprising: an aircraft autopilot in operable communication with the processor, the aircraft autopilot configured to automatically navigate the VTOL aircraft to a touchdown point.

14. The system of claim 13, wherein: the processor is further configured to supply updated touchdown point data to the aircraft autopilot; and the aircraft autopilot is configured, upon receipt of the updated touchdown point data, to automatically navigate the VTOL aircraft to the updated touchdown point.

15. The system of claim 8, further comprising: a head tracker configured to (i) track head movements of the user and (ii) supply head movement data, wherein the processor is further in operable communication with the head tracker and is further configured, upon receipt of the head movement data, to determine the updated touchdown point location from the eye movement data and the head movement data.

16. The system of claim 8, further comprising: a terrain data source in operable communication with the processor, the terrain data source operable to supply terrain data, wherein the processor is further configured to (i) selectively receive terrain data from the terrain data source and (ii) supply image rendering display commands to the display device that cause the display device to render an initial touchdown point on the display device based on the terrain data received from the terrain data source.

17. The system of claim 16, wherein the processor is further configured to supply image rendering display commands to the display device that cause the display device to render: (i) the initial touchdown point in accordance with a first paradigm; and (ii) the updated touchdown point in accordance with a second paradigm that is different than the first paradigm.

18. A vertical takeoff and landing (VTOL) aircraft touchdown point selection and display system, comprising: a display device coupled to receive image rendering display commands and configured, upon receipt of the image rendering display commands, to render an image; an eye tracker configured to (i) measure rotational movement of one or both eyes of a user with respect to the user's head and relative to the image being rendered by the display device and (ii) supply eye movement data; a head tracker configured to (i) track head movements of the user and (ii) supply head movement data; a user interface (116) in operable communication with the processor, the user interface adapted to receive user input and configured, upon receipt of the user input, to supply fine-tuning touchdown point location data; and a processor in operable communication with the display device, the eye tracker, and the head tracker, the processor configured, upon receipt of the eye movement data and the head movement data, to (i) determine an updated touchdown point location and (ii) supply image rendering display commands to the display device that cause the display device to render an updated touchdown point at the updated touchdown point location, and further configured, upon receipt of the fine-tuning touchdown point location data, to (i) determine a further updated touchdown point location and (ii) supply image rendering display commands to the display device that cause the display device to render a further updated touchdown point at the further updated touchdown point location on the display device.
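Claims 2, 3, 9, and 10 describe two rendering paradigms: an initial touchdown point shown one way, a gaze-updated candidate shown another, and a user-interface commit that promotes the candidate to the first paradigm. The state transitions can be sketched as follows; the class name, method names, and paradigm labels are hypothetical, not taken from the patent.

```python
class TouchdownDisplay:
    """Minimal state machine for the two-paradigm rendering described in
    claims 2-3: the initial point is rendered in a first paradigm, the
    gaze-updated point in a second, and a commit from the user interface
    drops the initial point and re-renders the updated point in the
    first paradigm."""

    def __init__(self, initial_point):
        self.initial = initial_point   # rendered in the first paradigm
        self.updated = None            # rendered in the second paradigm

    def on_gaze_update(self, point):
        """Record a new gaze-derived candidate touchdown point."""
        self.updated = point

    def on_commit(self):
        """User-interface input: promote the updated point (claim 3)."""
        if self.updated is not None:
            self.initial = self.updated
            self.updated = None

    def rendered(self):
        """Return what would currently be drawn, tagged by paradigm."""
        out = [("first_paradigm", self.initial)]
        if self.updated is not None:
            out.append(("second_paradigm", self.updated))
        return out
```

Before the commit, both points are rendered simultaneously (as claim 1 requires for the updated and further updated points); after the commit only the promoted point remains.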
Patents cited by this patent (8)
Bang, Eric S.; Groce, John L. (deceased); Chaney, Robert E., Cursor controlled navigation system for aircraft.