IPC Classification Information
Country / Type: United States (US) Patent (Granted)
International Patent Classification (IPC, 7th edition): (not listed)
Application Number: UP-0040498 (2005-01-21)
Registration Number: US-7804981 (2010-10-21)
Inventors / Address:
- Viggiano, Marc J
- Donovan, Todd A
- Gerry, Michael J
- Parks, Lara Marisa
Applicant / Address: (not listed)
Agent / Address: (not listed)
Citation Information: cited by 11 patents; cites 10 patents
Abstract
A method and system for tracking the position of objects, such as aircraft and ground vehicles around an airport. A transmission from the object is received at a non-imaging surveillance device of known location, the transmission being used to determine the coordinate position of the object within a given time period. Image data for the object is captured by an imaging surveillance device of known location to provide an image of the object within the same given time period. The coordinate position is correlated with the image to provide composite data about the object within the same given time period. The composite data is displayed to a viewer as a visual depiction of the object.
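The abstract's central step, correlating a coordinate position derived from a transmission with an image captured within the same given time period, can be sketched as a simple time-window matcher. This is a toy illustration only: the flat 2-D coordinate model, the class names, and the time/distance thresholds below are assumptions for the sketch, not details taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class PositionReport:
    """Coordinate position from a non-imaging surveillance device."""
    t: float  # timestamp, seconds
    x: float  # position in a local 2-D ground frame, meters
    y: float

@dataclass
class ImageDetection:
    """Object position estimated from an imaging surveillance device."""
    t: float
    x: float
    y: float

def correlate(reports, detections, window=1.0, max_dist=50.0):
    """Pair each position report with the nearest image detection captured
    within the same time window; thresholds are illustrative guesses.
    Returns (report, detection-or-None) pairs as the composite data."""
    pairs = []
    for rep in reports:
        best, best_d = None, max_dist
        for det in detections:
            if abs(det.t - rep.t) <= window:
                d = math.hypot(det.x - rep.x, det.y - rep.y)
                if d < best_d:
                    best, best_d = det, d
        pairs.append((rep, best))
    return pairs
```

A report with no detection inside the window is paired with `None`, so the display layer can fall back to a simulated depiction, in the spirit of claims 14 to 16.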
Representative Claims
We claim:

1. A method for tracking the position of an object, comprising the steps of: receiving, at a non-imaging surveillance device of known location, a transmission from the object, said transmission being used to determine the coordinate position of the object within a given time period; capturing image data for the object at an imaging surveillance device of known location to provide an image of the object within said given time period; correlating said coordinate position with said image to provide composite data about the object within said given time period; and displaying the composite data to a viewer as a visual depiction of the object.

2. The method of claim 1, wherein said transmission contains position data for the object.

3. The method of claim 1, wherein said transmission is generated in response to a directional interrogation from a single non-imaging surveillance device, and said method further comprises the step of processing said transmission to calculate the range and azimuth of the object relative to said device.

4. The method of claim 1, wherein said transmission is received at a plurality of non-imaging surveillance devices of known location, and said method further comprises the step of processing said transmission to derive position data for the object.

5. The method of claim 4, wherein said processing step uses a multilateration algorithm to derive position data for the object.

6. The method of claim 4, wherein said transmission is generated in response to an interrogation from a non-imaging surveillance device, and said method further comprises the step of processing said transmission to determine range from the interrogating device and azimuth from the plurality of devices using the difference in time of arrival of the transmission at each said device.

7. The method of claim 1, wherein said method further comprises the step of generating position data for the object within said given time period using a non-interrogating, non-imaging surveillance device, and said correlating step comprises correlating said coordinate position with said generated position data.

8. The method of claim 7, wherein said non-interrogating, non-imaging surveillance device includes primary radar.

9. The method of claim 1, wherein said imaging surveillance device includes a camera operating in at least one of the visual spectrum and infrared spectrum.

10. The method of claim 1, wherein the composite data is displayed to the viewer as one of a 3-dimensional or 2-dimensional video image on a planar video display.

11. The method of claim 10, wherein said planar video display is a head-mounted display.

12. The method of claim 1, wherein the composite data is displayed via projection or a collection of planar video displays surrounding the viewer.

13. The method of claim 1, wherein the composite data is displayed on a holographic display.

14. The method of claim 1, wherein said visual depiction is an actual image of the object, and contains additional information provided by or derived from said non-imaging surveillance device.

15. The method of claim 1, wherein said visual depiction is a simulated image of the object, and contains additional information provided by or derived from said non-imaging surveillance device.

16. The method of claim 1, wherein said visual depiction is a combination of actual and simulated images of the object, and contains additional information provided by or derived from said non-imaging surveillance device.

17. The method of claim 1, further comprising the steps of: detecting the presence of the object within the field of view of said imaging surveillance device; determining the coordinate position of the object within the field of view of said imaging surveillance device; and correlating the determined coordinate position of the object within the field of view of the imaging surveillance device with the coordinate position of the object supplied by said non-imaging surveillance device.

18. The method of claim 17, wherein said detecting step comprises monitoring sequential images captured by said imaging surveillance device to determine a change in position of an object within the field of view of said imaging surveillance device, said change being indicative of the object entering the field of view of said imaging surveillance device.

19. The method of claim 17, wherein said detecting step comprises comparing archived image data of the field of view of said imaging surveillance device to actual image data of the field of view of said imaging surveillance device to detect when the object enters the field of view of said imaging surveillance device.

20. The method of claim 17, wherein the object is predetermined to be an object of interest, and said detecting step comprises comparing features within the actual image data supplied by said imaging surveillance device to archived features of objects of interest to detect the presence of the object within the field of view of said imaging surveillance device.

21. The method of claim 17, wherein said determining step comprises comparing the detected position of the object to reference points of known coordinates within the field of view of said imaging surveillance device to determine the coordinate position of the object within the field of view of said imaging surveillance device.

22. The method of claim 1, wherein the object comprises at least one of an aircraft and ground vehicle in and around an airport.

23. The method of claim 1, wherein image data is captured for the object at a plurality of imaging surveillance devices of known location, and said method further comprises the step of selecting the perspective view of one of said imaging surveillance devices to display that perspective view to the viewer.

24. The method of claim 23, wherein the image data from at least two of said imaging surveillance devices is interleaved to provide a perspective view to the viewer from a position between said at least two imaging surveillance devices.

25. A system for tracking the position of an object, comprising: means for receiving, at a non-imaging surveillance device of known location, a transmission from the object; means for determining, based on said received transmission, the coordinate position of the object within a given time period; means for capturing image data for the object at an imaging surveillance device of known location to provide an image of the object within said given time period; means for correlating said coordinate position with said image to provide composite data about the object within said given time period; and means for displaying the composite data to a viewer as a visual depiction of the object.

26. The system of claim 25, wherein said transmission contains position data for the object.

27. The system of claim 25, wherein said transmission is generated in response to a directional interrogation from a single non-imaging surveillance device, and said system further comprises means for processing said transmission to calculate the range and azimuth of the object relative to said device.

28. The system of claim 25, wherein said transmission is received at a plurality of non-imaging surveillance devices of known location, and said system further comprises means for processing said transmission to derive position data for the object.

29. The system of claim 28, wherein said means for processing said transmission includes a multilateration algorithm to derive position data for the object.

30. The system of claim 28, wherein said transmission is generated in response to an interrogation from a non-imaging surveillance device, and said system further comprises means for processing said transmission to determine range from the interrogating device and azimuth from the plurality of devices using the difference in time of arrival of the transmission at each said device.

31. The system of claim 25, wherein said system further comprises means for generating position data for the object within said given time period using a non-interrogating, non-imaging surveillance device, and said means for correlating correlates said coordinate position with said generated position data.

32. The system of claim 31, wherein said non-interrogating, non-imaging surveillance device includes primary radar.

33. The system of claim 25, wherein said imaging surveillance device includes a camera operating in at least one of the visual spectrum and infrared spectrum.

34. The system of claim 25, wherein the composite data is displayed to the viewer as one of a 3-dimensional or 2-dimensional video image, and said means for displaying comprises a planar video display.

35. The system of claim 34, wherein said planar video display is a head-mounted display.

36. The system of claim 25, wherein said means for displaying comprises a projection system or a collection of planar video displays surrounding the viewer.

37. The system of claim 25, wherein said means for displaying comprises a holographic display.

38. The system of claim 25, wherein said visual depiction is an actual image of the object, and contains additional information provided by or derived from said non-imaging surveillance device.

39. The system of claim 25, wherein said visual depiction is a simulated image of the object, and contains additional information provided by or derived from said non-imaging surveillance device.

40. The system of claim 25, wherein said visual depiction is a combination of actual and simulated images of the object, and contains additional information provided by or derived from said non-imaging surveillance device.

41. The system of claim 25, further comprising: means for detecting the presence of the object within the field of view of said imaging surveillance device; means for determining the coordinate position of the object within the field of view of said imaging surveillance device; and means for correlating the determined coordinate position of the object within the field of view of the imaging surveillance device with the coordinate position of the object supplied by said non-imaging surveillance device.

42. The system of claim 41, wherein said means for detecting comprises means for monitoring sequential images captured by said imaging surveillance device to determine a change in position of an object within the field of view of said imaging surveillance device, said change being indicative of the object entering the field of view of said imaging surveillance device.

43. The system of claim 41, wherein said means for detecting comprises means for comparing archived image data of the field of view of said imaging surveillance device to actual image data of the field of view of said imaging surveillance device to detect when the object enters the field of view of said imaging surveillance device.

44. The system of claim 41, wherein the object is predetermined to be an object of interest, and said means for detecting comprises means for comparing features within the actual image data supplied by said imaging surveillance device to archived features of objects of interest to detect the presence of the object within the field of view of said imaging surveillance device.

45. The system of claim 41, wherein said means for determining comprises means for comparing the detected position of the object to reference points of known coordinates within the field of view of said imaging surveillance device to determine the coordinate position of the object within the field of view of said imaging surveillance device.

46. The system of claim 25, wherein the object comprises at least one of an aircraft and ground vehicle in and around an airport.

47. The system of claim 25, wherein image data is captured for the object at a plurality of imaging surveillance devices of known location, and said system further comprises means for selecting the perspective view of one of said imaging surveillance devices to display that perspective view to the viewer.

48. The system of claim 47, wherein the image data from at least two of said imaging surveillance devices is interleaved to provide a perspective view to the viewer from a position between said at least two imaging surveillance devices.

49. A method for tracking the position of an object, comprising the steps of: receiving, at a plurality of non-imaging surveillance devices of known location, a transmission from the object, said transmission being used to determine the coordinate position of the object within a given time period; correlating the transmission received at each of said non-imaging surveillance devices with the transmission received at each of the other of said non-imaging surveillance devices within said given time period; capturing image data for the object at a plurality of imaging surveillance devices of known location to provide images of the object within said given time period; correlating the image received at each of said imaging surveillance devices with the image received at each of the other of said imaging surveillance devices within said given time period; correlating said coordinate position with said images to provide composite data about the object within said given time period; and displaying the composite data to a viewer as a visual depiction of the object.

50. A system for tracking the position of an object, comprising: means for receiving, at a plurality of non-imaging surveillance devices of known location, a transmission from the object; means for correlating the transmission received at each of said non-imaging surveillance devices with the transmission received at each of the other of said non-imaging surveillance devices within a given time period; means for determining, based on the correlated transmissions, the coordinate position of the object within said given time period; means for capturing image data for the object at a plurality of imaging surveillance devices of known location to provide images of the object within said given time period; means for correlating the image received at each of said imaging surveillance devices with the image received at each of the other of said imaging surveillance devices within said given time period; means for correlating said coordinate position with said images to provide composite data about the object within said given time period; and means for displaying the composite data to a viewer as a visual depiction of the object.
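Claims 5, 6, 29, and 30 recite deriving the object's position by multilateration from the difference in time of arrival (TDOA) of the transmission at a plurality of receivers. The patent does not disclose a specific algorithm; the following is a generic 2-D Gauss-Newton TDOA solver of the kind described in textbooks, with all receiver positions, the initial guess, and iteration limits chosen purely for illustration.

```python
import math

C = 299_792_458.0  # propagation speed (speed of light), m/s

def multilaterate(receivers, tdoas, guess, iters=50):
    """Estimate a 2-D emitter position from TDOA measurements.

    receivers: list of (x, y) positions; receivers[0] is the reference.
    tdoas: arrival-time differences t_i - t_0 for receivers[1:], seconds.
    guess: initial (x, y) estimate; must not coincide with a receiver.
    Solves, by Gauss-Newton, the residuals (|p-r_i| - |p-r_0|) - C*tdoa_i.
    """
    x, y = guess
    r0x, r0y = receivers[0]
    for _ in range(iters):
        d0 = math.hypot(x - r0x, y - r0y)
        rows, res = [], []
        for (rx, ry), dt in zip(receivers[1:], tdoas):
            di = math.hypot(x - rx, y - ry)
            res.append((di - d0) - C * dt)
            # Jacobian row: gradient of the range difference
            rows.append(((x - rx) / di - (x - r0x) / d0,
                         (y - ry) / di - (y - r0y) / d0))
        # Normal equations (J^T J) delta = -J^T res, solved as a 2x2 system
        a = sum(jx * jx for jx, _ in rows)
        b = sum(jx * jy for jx, jy in rows)
        c = sum(jy * jy for _, jy in rows)
        gx = sum(jx * r for (jx, _), r in zip(rows, res))
        gy = sum(jy * r for (_, jy), r in zip(rows, res))
        det = a * c - b * b
        if abs(det) < 1e-12:  # degenerate receiver geometry
            break
        dx = (-c * gx + b * gy) / det
        dy = (b * gx - a * gy) / det
        x, y = x + dx, y + dy
        if math.hypot(dx, dy) < 1e-6:  # converged
            break
    return x, y
```

With four receivers at the corners of a square and a reasonable initial guess, the solver recovers the emitter position to sub-meter accuracy; real deployments would add a third dimension, measurement-noise weighting, and outlier rejection.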