IPC Classification Information
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition):
Application Number: US-0192195 (2002-07-10)
Inventors / Address:
- Ebersole, John Franklin
- Bastian, Mark Stanley
- Walker, John Franklin
- Madison, Richard Wade
- Ebersole, Jr., John Franklin
Applicant / Address:
- Information Decision Technologies, LLC
Agent / Address: Mirick, O'Connell, DeMallie &
Citation Information: cited by 14 patents; cites 8 patents
Abstract
The invention is a method for displaying otherwise unseen atmospheric phenomena using augmented reality (the mixing of real media with computer-generated media). The method uses computer-generated images to represent existing weather conditions and presents this data to the user by combining the computer-generated images with the user's real environment. Computer-generated images are used to represent such weather phenomena as wake vortices, wind shear, and microbursts. These images are represented in such a way as to intuitively display relevant properties of the phenomena to the system user, which increases the user's situational awareness and safety. The primary intended applications are for air traffic controllers and pilots to view these disturbances.
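As a rough illustration of the video-based augmented-reality compositing the abstract describes, the Python sketch below projects a hazard's reported 3-D position into a camera frame and alpha-blends a colored marker over it. Everything here (the pinhole camera model, the `project_point` and `blend_marker` helpers, and the sample hazard data) is an assumption made for illustration; it is not the patent's implementation.

```python
# Illustrative sketch only: a minimal video-based AR overlay in the spirit of the
# abstract. The camera model, helper names, and hazard data are assumptions.
import numpy as np

def project_point(p_world, cam_pos, cam_rot, f=800.0, cx=320.0, cy=240.0):
    """Project a world-space point into pixel coordinates with a pinhole camera.
    cam_rot is a 3x3 world-to-camera rotation matrix (e.g. from a 6-DOF tracker)."""
    p_cam = cam_rot @ (p_world - cam_pos)
    if p_cam[2] <= 0:                       # behind the viewer: not visible
        return None
    u = f * p_cam[0] / p_cam[2] + cx
    v = f * p_cam[1] / p_cam[2] + cy
    return int(round(u)), int(round(v)), p_cam[2]

def blend_marker(frame, uv_depth, color, base_radius=40.0, alpha=0.5):
    """Alpha-blend a filled circle over the camera frame; radius shrinks with depth
    so nearer hazards look larger, a simple stand-in for intensity/extent cues."""
    if uv_depth is None:
        return
    u, v, depth = uv_depth
    radius = max(4, int(base_radius * 100.0 / depth))
    h, w, _ = frame.shape
    ys, xs = np.ogrid[:h, :w]
    mask = (xs - u) ** 2 + (ys - v) ** 2 <= radius ** 2
    frame[mask] = (1 - alpha) * frame[mask] + alpha * np.array(color)

# Fake inputs: a gray "camera frame", an identity head pose, and two hazards.
frame = np.full((480, 640, 3), 128, dtype=np.float32)
cam_pos = np.zeros(3)
cam_rot = np.eye(3)
hazards = [
    {"pos": np.array([-5.0, 0.0, 300.0]), "color": (255, 0, 0)},     # e.g. microburst
    {"pos": np.array([20.0, 5.0, 600.0]), "color": (255, 165, 0)},   # e.g. wake vortex
]
for hz in hazards:
    blend_marker(frame, project_point(hz["pos"], cam_pos, cam_rot), hz["color"])
print(frame.shape, frame.dtype)  # augmented frame, ready to send to a display
```

In a see-through configuration (as in claims 3 and 14 below), only the rendered markers would be drawn, since the real world is visible directly through the optics rather than through a captured video frame.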
Representative Claims
1. A method for presenting invisible atmospheric phenomena to an occupant of an aircraft using data about atmospheric information comprising: using a computer to render an image representing the atmospheric phenomena information from the point of view of the aircraft occupant, the phenomena including wingtip vortices from aircraft in the area, microbursts, wind shear, and clear air turbulence; wherein said rendered image clearly shows the type, intensity, and spatial extent of the atmospheric phenomena by drawing graphical representations and icons of multiple colors, and various sizes and shapes, and by using techniques comprising fuzziness, fading, transparency, and blending; wherein said rendered image clearly shows the locations of all aircraft in the local area that may cause or come in contact with atmospheric phenomena; providing an image or view of the real world; augmenting the image or view of the real world with the rendered image; and presenting the augmented view to the aircraft occupant with a display comprising a heads-up display, or head mounted display that is worn by the aircraft occupant; wherein the position and orientation of this display or of the occupant's head is determined in relation to the earth in six degrees of freedom, thereby allowing the image to be rendered from the point of view of the occupant, to disseminate atmospheric phenomenon information to the occupant.
2. The method of claim 1 in which providing an image comprises using a camera to capture the real world image, and wherein the presenting step accomplishes a display of the video-based augmented-reality image onto the display.
3. The method of claim 1 in which the presenting step accomplishes a see-through-based augmented reality display of the rendered image on a see-through head mounted display, which allows the view of the real world to be directly visible to the occupant through the use of partial mirrors, to which the rendered image is added.
4. The method of claim 1 in which the data are derived from sensors which acquire atmospheric data.
5. The method of claim 1 in which the data are derived from direct observation by a human.
6. The method of claim 5 in which the human observations are provided by one or more pilots.
7. The method of claim 5 in which the human observations are provided by one or more air traffic controllers.
8. The method of claim 1 in which the data are derived from atmospheric computer simulation.
9. The method of claim 1 in which information about the atmospheric phenomena is displayed to the user via text, where the text is drawn onto a view of a real background, appearing near the atmospheric phenomenon the text is describing, and the text is visually anchored to that physical phenomenon.
10. The method of claim 9 in which the textual display is displayed to the user at the same time that non-textual graphics are displayed to the user.
11. A method for presenting invisible atmospheric phenomena to a user using data about atmospheric information comprising: using a computer to render an image representing the atmospheric phenomena information from the point of view of the user, the phenomena including wingtip vortices from aircraft in the area, microbursts, wind shear, and clear air turbulence; wherein said rendered image clearly shows the type, intensity, and spatial extent of the atmospheric phenomena by drawing graphical representations and icons of multiple colors, various sizes and shapes, and by using techniques comprising fuzziness, fading, transparency, and blending; wherein said rendered image clearly shows the locations of all aircraft in the local area that may cause or come in contact with atmospheric phenomena; providing an image or view of the real world; augmenting the image or view of the real world with the rendered image; and presenting the augmented view to the user, to disseminate atmospheric phenomenon information.
12. The method of claim 11, in which a tracking system is used to track the user's viewpoint of the real world, and display the augmented view on a head mounted display worn by the user.
13. The method of claim 12, in which providing an image comprises using a camera to capture the real world image, and wherein the presenting step accomplishes a display of the video-based augmented-reality image onto the head mounted display.
14. The method of claim 12, in which the presenting step accomplishes a see-through-based augmented reality display of the rendered image on a see-through head mounted display, which allows the view of the real world to be directly visible to the user through the use of partial mirrors, to which the rendered image is added.
15. The method of claim 11 in which the data are derived from sensors which acquire atmospheric data.
16. The method of claim 11 in which the data are derived from direct observation by a human.
17. The method of claim 16 in which the human observations are provided by one or more pilots.
18. The method of claim 16 in which the human observations are provided by one or more air traffic controllers.
19. The method of claim 11 in which the data are derived from atmospheric computer simulation.
20. The method of claim 11 in which the augmented view is presented on a television or computer monitor.
21. The method of claim 11 in which the augmented view is presented in a heads-up-display.
22. The method of claim 11 in which the augmented view is presented in a heads-down-display.
23. The method of claim 11 in which the augmented view is presented in a display moveable by the user, and further comprising tracking the position of the display, to present an augmented view corresponding to the position of the display.
24. The method of claim 23 in which the augmented view is presented in a handheld binocular type of display.
25. The method of claim 23 in which the augmented view is presented in a handheld monocular type of display.
26. The method of claim 23 in which the augmented view is presented in a handheld movable display.
27. The method of claim 11 in which providing an image or view of the real world comprises taking a real image with an imaging device that is not worn on the user's head.
28. The method of claim 27 in which the viewpoint of the imaging device is a birds-eye-view.
29. The method of claim 27 in which the image of the real world is a static image.
30. The method of claim 27 in which the image of the real world is output from a radar.
31. The method of claim 27 in which the image of the real world is from a ground-based stationary imaging sensor from a known viewpoint.
32. The method of claim 27 in which the presenting step comprises displaying the augmented view on a fixed monitor.
33. The method of claim 27 in which providing an image or view of the real world comprises capturing an image with a camera that is mounted to a head-mounted or other portable display device.
34. The method of claim 11 in which information about the atmospheric phenomena is displayed to the user via text, where the text is drawn onto a view of a real background, appearing near the atmospheric phenomenon the text is describing, and the text is visually anchored to that physical phenomenon.
35. The method of claim 34 in which the textual display is displayed to the user at the same time that non-textual graphics are displayed to the user.
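Claim 1 ties the rendering to a six-degree-of-freedom estimate of the display or head pose, and claims 9 and 34 require text labels that stay visually anchored to the phenomenon they describe. The following Python sketch shows one way those two pieces could fit together: building a world-to-camera rotation from a tracker's yaw/pitch/roll and computing the pixel position at which an annotation would be drawn. The rotation convention, function names, and sample values are hypothetical assumptions, not the patent's method.

```python
# Illustrative sketch only: 6-DOF pose (position + yaw/pitch/roll) to a view rotation,
# plus the screen anchor for a text annotation near a tracked phenomenon.
import numpy as np

def rotation_from_ypr(yaw, pitch, roll):
    """World-to-camera rotation from yaw/pitch/roll in radians (Z-Y-X order assumed)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return (rz @ ry @ rx).T            # transpose gives camera-from-world

def label_anchor(p_world, head_pos, head_rot, f=800.0, cx=320.0, cy=240.0, offset_px=(12, -12)):
    """Pixel position at which a text annotation stays visually anchored to the phenomenon."""
    p_cam = head_rot @ (p_world - head_pos)
    if p_cam[2] <= 0:
        return None                    # phenomenon is behind the occupant
    u = f * p_cam[0] / p_cam[2] + cx + offset_px[0]
    v = f * p_cam[1] / p_cam[2] + cy + offset_px[1]
    return u, v

head_pos = np.array([0.0, 0.0, 0.0])
head_rot = rotation_from_ypr(np.radians(5), np.radians(-2), 0.0)   # from the tracker
vortex_pos = np.array([30.0, -4.0, 500.0])                         # reported hazard location
print("draw 'WAKE VORTEX - MODERATE' at", label_anchor(vortex_pos, head_pos, head_rot))
```

A real system would query the tracker at display rate and recompute these anchors every frame, so the graphics and text remain registered to the phenomena as the occupant's head or the handheld display moves.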