Systems and methods using enhanced vision to provide out-the-window displays for a device
IPC Classification Information
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition): G06T-015/20; G06T-015/10
Application Number: US-0615634 (2003-07-08)
Registration Number: US-7486291 (2009-02-03)
Inventors / Address: Berson, Barry L.; Bialecki, Larry J.; Buck, Peter A.
Applicants / Address: Berson, Barry L.; Bialecki, Larry J.; Buck, Peter A.
Agent / Address: Koestner Bartani LLP
Citation Information: cited by 49 patents; cites 10 patents
Abstract
A display system for operating a device receives images from a first sensor and a second sensor that represent scenery outside the device. The display system is configured to detect moving objects in the images, as well as fuse the images to a single viewpoint. The fused image is transformed to a first viewpoint image from a first operator station in the device, and a second viewpoint image from a second operator station in the device. The first and second viewpoint images conform to the scenery outside the device from each operator station.
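As a rough illustration of the pipeline the abstract describes, the sketch below fuses two sensor images into a single viewpoint and then re-projects the result for two operator stations. The homography matrices, frame sizes, and blend weight are hypothetical placeholders; a real system would derive them from sensor calibration and crew-station geometry, and the patent does not specify this particular approach.

```python
# Minimal sketch, assuming planar homographies relate the sensors and
# operator stations. All matrices and frames below are placeholders.
import cv2
import numpy as np

def fuse_to_single_viewpoint(img_a, img_b, h_b_to_a, alpha=0.5):
    """Warp sensor B's image into sensor A's frame and blend the overlap."""
    warped_b = cv2.warpPerspective(img_b, h_b_to_a, (img_a.shape[1], img_a.shape[0]))
    return cv2.addWeighted(img_a, alpha, warped_b, 1.0 - alpha, 0.0)

def transform_to_operator_viewpoint(fused, h_station, out_size):
    """Re-project the fused image so it conforms to the out-the-window
    scene as seen from one operator station."""
    return cv2.warpPerspective(fused, h_station, out_size)

# Hypothetical calibration data (identity matrices stand in for measured homographies).
H_B_TO_A = np.eye(3, dtype=np.float64)
H_PILOT = np.eye(3, dtype=np.float64)
H_COPILOT = np.eye(3, dtype=np.float64)

sensor_a = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder frames
sensor_b = np.zeros((480, 640, 3), dtype=np.uint8)

fused = fuse_to_single_viewpoint(sensor_a, sensor_b, H_B_TO_A)
pilot_view = transform_to_operator_viewpoint(fused, H_PILOT, (640, 480))
copilot_view = transform_to_operator_viewpoint(fused, H_COPILOT, (640, 480))
```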
Representative Claims
What is claimed is:
1. A display system for a device, comprising: computer executable instructions operable to: receive images from a first sensor and a second sensor representing scenery outside the device; detect moving objects in the images; fuse the images to a single viewpoint; transform the fused image to a first viewpoint image from a first operator station in the device and a second viewpoint image from a second operator station in the device, wherein the first and second viewpoint images conform to the scenery outside the device from each operator station; and generate a third display area associated with at least two mutually exclusive windows of information on a display device for the first operator station, the display device is mounted in an instrument panel of the device and the third display area and the two mutually exclusive windows of information are presented on the display device; generate a third display area associated with at least two mutually exclusive windows of information on another display device for the second operator station, the other display device is mounted in the instrument panel of the device and the third display area and the two mutually exclusive windows of information are presented on the display device; wherein the third display areas can be customized independently by the operators to display detailed information related to the information displayed in the associated windows and the first and second viewpoint images are displayed on the respective display devices.
2. The display system of claim 1, further comprising: computer executable instructions operable to: combine the first and second viewpoint images with symbols, wherein the symbols represent information regarding the operational state of the device and the moving objects detected in the images.
3. The display system of claim 1, wherein the instructions for detecting moving objects in the first sensor image are configured to execute in a first processor, and the instructions for detecting moving objects in the second sensor image are configured to execute in a second processor simultaneously with the instructions in the first processor.
4. The display system of claim 3, wherein the instructions for transforming the fused image to the first viewpoint image are configured to execute in the first processor, and the instructions for transforming the fused image to the second viewpoint image are configured to execute in the second processor.
5. The display system of claim 2, wherein the symbols represent the moving objects in the vicinity of the device.
6. The display system of claim 2, wherein at least one of the first and second viewpoint images include environmental information for the area where the device is operating.
7. The display system of claim 2, wherein the symbols represent weather hazards in the vicinity of the device.
8. The display system of claim 2, wherein the computer executable instructions are further operable to receive an enhanced image from a third sensor configured to provide an image of the out-the-window scenery in low-visibility conditions.
9. The display system of claim 8, wherein the computer executable instructions are further operable to fuse the single viewpoint image with the enhanced image.
10. The display system of claim 9, wherein the computer executable instructions are further operable to utilize data from at least one position sensor to determine the location of the objects with respect to the device.
11. The display system of claim 2, wherein the computer executable instructions are further operable to utilize data from off-board data sources regarding the objects.
12. The display system of claim 1, wherein the first sensor and the second sensor are video cameras.
13. The display system of claim 8, wherein the third sensor is a RADAR sensor.
14. The display system of claim 8, wherein the third sensor is a FLIR sensor.
15. A method for providing an out-the-window visual scene on a display device, comprising: receiving an image of a portion of the out-the-window visual scene from the viewpoint of a first type of sensor; receiving another image of a portion of the out-the-window visual scene from the viewpoint of another of the first type of sensor; fusing the images from the first type of sensors into a combined image to generate a first fused image; transforming the fused image to a first operator viewpoint and to a second operator viewpoint; and outputting the first operator viewpoint image to a first display device and the second operator viewpoint image to a second display device, wherein the display devices are positioned in an instrument panel to provide the portion of a desired out-the-window visual scene in combination with a window that provides another portion of the desired out-the-window visual scene, and the viewpoint images are aligned with and scaled to conform to the out-the-window visual scene; and generating a common display area associated with at least two mutually exclusive windows of information on each of the display devices, wherein the common display area can be customized by the operator to display detailed information related to the information displayed in the associated windows.
16. The method of claim 15, further comprising detecting objects in the first fused image from the first type of sensor.
17. The method of claim 16, further comprising combining the first fused image with symbols representing the objects.
18. The method of claim 15, further comprising transforming the first operator viewpoint image and the second operator viewpoint image to conform to the out-the-window visual scene.
19. The method of claim 15, further comprising fusing the first fused image with an enhanced image of a portion of the out-the-window scenery from at least one of the group of a RADAR sensor and a FLIR sensor, to generate a second fused image.
20. The method of claim 15, further comprising fusing the second fused image with an enhanced image of a portion of the out-the-window scenery from at least one of the group of a RADAR sensor and a FLIR sensor, to generate a second fused image.
21. The method of claim 20, further comprising: transforming the second fused image to the first operator viewpoint and to the second operator viewpoint.
22. The method of claim 19, further comprising: providing portions of the transformed image with data from a terrain map database.
23. A device, comprising: a display device; and a display processor operable to: receive a first sensor image representing a portion of scenery outside the device; transform the first sensor image to a viewpoint image from an operator station in the device, wherein the viewpoint image is sized and oriented to conform to the scenery outside the device from the operator station; and output the first operator viewpoint image to the display device, wherein the display device is positioned in an instrument panel to provide the portion of a desired out-the-window visual scene in combination with a window that provides another portion of the desired out-the-window visual scene, and the viewpoint image is aligned with and scaled to conform to the out-the-window visual scene; wherein the display processor is further operable to generate a common display area associated with at least two mutually exclusive windows of information on the display device, wherein the common display area can be customized by the operator to display detailed information related to the information displayed in the associated windows.
24. The device of claim 23, wherein the display processor is further operable to combine the viewpoint image with symbols, wherein the symbols represent information regarding the operational state of the device and the moving objects detected in the images.
25. The device of claim 23, wherein the display processor is further operable to detect moving objects in the first sensor image.
26. The device of claim 23, wherein the display processor is further operable to generate symbols representing moving objects in the sensor image and the operational state of the device.
27. The device of claim 23, wherein the display processor is further operable to generate symbols representing weather hazards in the vicinity of the device.
28. The device of claim 23, wherein the display processor is further operable to receive an enhanced image of the out-the-window scenery in low-visibility conditions from a second sensor.
29. The device of claim 28, wherein the display processor is further operable to fuse the viewpoint image with the enhanced image.
30. The device of claim 25, wherein the display processor is further operable to utilize data from at least one position sensor to determine the location of the objects with respect to the device.
31. The device of claim 25, wherein the display processor is further operable to utilize data from off-board data sources regarding the objects.
32. The device of claim 23, wherein the sensor is a video camera.
33. The device of claim 28, wherein the second sensor is a RADAR sensor.
34. The device of claim 28, wherein the second sensor is a FLIR sensor.
35. An aircraft, comprising: a crewstation with cockpit windows; a first display device for one crewmember; a second display device for another crewmember; and a display processor operable to: receive an image of an out-the-window visual scene from the viewpoint of a first type of sensor; receive another image of a portion of the out-the-window visual scene from the viewpoint of another of the first type of sensor; fuse the images from the first type of sensors into a combined image to generate a first fused image; transform the fused image to a first operator viewpoint and to a second operator viewpoint; transform the first operator viewpoint image and the second operator viewpoint image to conform to the size and orientation of the out-the-window visual scene; and output the first operator viewpoint image to the first display device and the second operator viewpoint image to the second display device, wherein the display devices are positioned in an instrument panel to provide the portion of a desired out-the-window visual scene in combination with a cockpit window that provides another portion of the desired out-the-window visual scene, and the viewpoint images are aligned with and scaled to conform to the out-the-window visual scene; wherein the display processor is further operable to generate a common display area associated with at least two mutually exclusive windows of information on each display device, wherein the common display area can be customized by the operator to display detailed information related to the information displayed in the associated windows.
36. The aircraft of claim 35, wherein the display processor is further operable to detect objects in the first fused image from the first type of sensor.
37. The aircraft of claim 36, wherein the display processor is further operable to combine the first fused image with symbols representing the objects and primary flight information for the aircraft.
38. The aircraft of claim 35, wherein the display processor is further operable to fuse the first fused image with an enhanced image of a portion of the out-the-window scenery from at least one of the group of a RADAR sensor and a FLIR sensor, to generate a second fused image.
39. The aircraft of claim 38, wherein the display processor is further operable to fuse the second fused image with an enhanced image of a portion of the out-the-window scenery from at least one of the group of a RADAR sensor and a FLIR sensor, to generate a second fused image.
40. The aircraft of claim 39, wherein the display processor is further operable to transform the second fused image to the first operator viewpoint and to the second operator viewpoint.
41. The aircraft of claim 40, wherein the display processor is further operable to provide portions of the transformed images using data from a terrain map database.
42. The aircraft according to claim 35 further comprising: a terrain database coupled to provide the display processor with at least a portion of the required out-the-window field of view on the display device.
43. The aircraft of claim 35, wherein the display processor is further operable to display one of the operator viewpoint displays to the operator acting as pilot in command of the aircraft during a predefined aircraft operational state, and to allow the pilot in command to choose an option on the display device to view detailed information about the aircraft and aircraft subsystems during other aircraft operational states.
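Claims 3 and 4 place per-sensor moving-object detection (and the subsequent viewpoint transforms) on two processors running simultaneously, and claim 2 overlays symbols for the detected movers. The sketch below is a minimal, hypothetical illustration of that split using OpenCV background subtraction and a two-worker pool; the frame sources, subtraction parameters, and box-drawing "symbols" are stand-ins, not the patent's implementation.

```python
# Minimal sketch, assuming each sensor stream keeps its own background model
# and the two detections can run concurrently (standing in for the first and
# second processors of claims 3 and 4).
import cv2
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def detect_moving_objects(frame, subtractor, min_area=50.0):
    """Return bounding boxes of moving regions found by background subtraction."""
    mask = subtractor.apply(frame)
    contours, _hierarchy = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

# One stateful background model per sensor stream.
subtractor_a = cv2.createBackgroundSubtractorMOG2()
subtractor_b = cv2.createBackgroundSubtractorMOG2()

frame_a = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder sensor frames
frame_b = np.zeros((480, 640, 3), dtype=np.uint8)

with ThreadPoolExecutor(max_workers=2) as pool:
    future_a = pool.submit(detect_moving_objects, frame_a, subtractor_a)
    future_b = pool.submit(detect_moving_objects, frame_b, subtractor_b)
    detections = future_a.result() + future_b.result()

# Overlay simple box symbols for the detected movers; claim 2's symbol set
# would also include operational-state information not modeled here.
annotated_view = frame_a.copy()
for (x, y, w, h) in detections:
    cv2.rectangle(annotated_view, (x, y), (x + w, y + h), (0, 255, 0), 2)
```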
Patents cited by this patent (10)
Mithra M. K. V. Sankrithi ; David W. McKenna ; Mannon L. Wallace, Jr. ; Ronaldo O. Cabrera ; Gary D. Reysa ; Gerhard E. Seidel ; John Yeeles ; John Cashman, Airplane ground maneuvering camera system.
Alimpich Claudia ; Jeffcoat Benjamin Nelson ; Neuhard Deborah Elizabeth ; Vigil Luana Linda ; Wittig James Philip John, Data processor controlled interface with multiple tree of elements views expandable into individual detail views.
Matthew T. Smith ; Gary L. Owen, Method and apparatus for interactively and automatically selecting, controlling and displaying parameters for an avionics electronic flight display system.
Ebersole, John Franklin; Bastian, Mark Stanley; Walker, John Franklin; Madison, Richard Wade; Ebersole, Jr., John Franklin, Method to view unseen atmospheric phenomenon using augmented reality.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Cella, Charles; Nortrup, Robert J.; Nortrup, Edward H., AR glasses with event and sensor triggered AR eyepiece interface to external devices.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Cella, Charles; Nortrup, Robert J.; Nortrup, Edward H., AR glasses with event and sensor triggered control of AR eyepiece applications.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Cella, Charles; Nortrup, Robert J.; Nortrup, Edward H., AR glasses with event and user action control of external applications.
Feyereisen, Thea L.; He, Gang; Suddreth, John G.; Ververs, Patricia May, Aircraft flight deck displays and systems and methods for enhanced display of obstacles in a combined vision display.
McCusker, Patrick D.; Jinkins, Richard D.; Rademaker, Richard M., Enhanced flight vision system and method with radar sensing and pilot monitoring display.
Osterhout, Ralph F.; Haddick, John D.; Lohse, Robert Michael; Border, John N.; Miller, Gregory D.; Stovall, Ross W., Eyepiece with uniformly illuminated reflective display.
Miller, Gregory D.; Border, John N.; Osterhout, Ralph F., Grating in a light transmissive illumination system for see-through near-eye display glasses.
Miller, Gregory D.; Border, John N.; Osterhout, Ralph F., Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses.
Border, John N.; Bietry, Joseph; Osterhout, Ralph F., See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film.
Border, John N.; Haddick, John D.; Osterhout, Ralph F., See-through near-eye display glasses including a partially reflective, partially transmitting optical element.
Border, John N.; Osterhout, Ralph F., See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment.
Border, John N.; Bietry, Joseph; Osterhout, Ralph F., See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film.
Border, John N.; Osterhout, Ralph F., See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear.
Border, John N.; Haddick, John D.; Osterhout, Ralph F., See-through near-eye display glasses with a light transmissive wedge shaped illumination system.
Border, John N.; Haddick, John D.; Lohse, Robert Michael; Osterhout, Ralph F., See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light.
Woodell, Daniel L.; Jinkins, Richard D.; Meyer, Nathanael A.; Rademaker, Richard M.; Dickerson, Charles J., System and method for a terrain database and/or position validation.
Vos, David W.; Gavrilets, Vladislav; Jourdan, Damien B.; Ludington, Ben T., System and method for developing dynamic positional database for air vehicles and terrain features.
Wood, Robert B.; Tiana, Carlo L.; Kowash, Nathaniel S.; Jinkins, Richard D.; Rademaker, Richard M., System for and method of radar data processing for low visibility landing applications.
Woodell, Daniel L.; Jinkins, Richard D.; Meyer, Nathanael A.; Rademaker, Richard M.; Dickerson, Charles J., Terrain avoidance system and method using weather radar for terrain database generation.