An image-augmented inertial navigation system includes an inertial navigation system configured to estimate a navigation state vector and an imager configured to output pixel signals associated with terrain features passing through a field of view of the imager. The system further includes a processing unit configured to determine a distance from the imager to each of the terrain features represented by the pixel signals for a given image frame and to determine a distance between the imager and a centroid of one or more of the terrain features passing through the field of view of the imager for the given image frame. The processing unit is also configured to track each terrain feature as the terrain features pass through the field of view of the imager. The processing unit is further configured to update the navigation state vector of the inertial navigation system based on calculated NED-coordinate position information of the tracked terrain features.
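The aiding scheme the abstract describes can be sketched as a per-frame loop: new features are tagged with NED coordinates the first time they appear, and previously tagged features yield position residuals that correct the drifting inertial estimate. The sketch below is a minimal illustration, not the patented implementation; the function name `aided_update`, the averaging of residuals (a real system would feed them to a Kalman filter), and the assumption that pixel signals and ranges have already been resolved into NED offsets are all hypothetical.

```python
def aided_update(ins_position, tracked, observations):
    """One image frame of aiding.

    ins_position : current INS NED position estimate, [north, east, down].
    tracked      : dict of feature id -> first-tagged NED coordinates;
                   updated in place when new features enter the field of view.
    observations : dict of feature id -> imager-measured NED offset from the
                   vehicle to the feature (assumed already derived from the
                   pixel signals and range measurements).
    Returns the corrected INS position.
    """
    residuals = []
    for fid, offset_ned in observations.items():
        if fid in tracked:
            # Where the current INS estimate plus the measured offset places
            # the feature, versus where it was first tagged.
            predicted = [ins_position[i] + offset_ned[i] for i in range(3)]
            residuals.append([tracked[fid][i] - predicted[i] for i in range(3)])
        else:
            # New feature: tag it the first time it enters the field of view.
            tracked[fid] = [ins_position[i] + offset_ned[i] for i in range(3)]
    if not residuals:
        return ins_position
    # Mean discrepancy stands in for a proper filter measurement update.
    corr = [sum(r[i] for r in residuals) / len(residuals) for i in range(3)]
    return [ins_position[i] + corr[i] for i in range(3)]
```

For example, if a feature is tagged while the INS is accurate and the INS later drifts while the vehicle is stationary, the unchanged imager offset to that feature pulls the estimate back toward the tagged coordinates.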
Representative Claims
1. A system for aiding an inertial navigation system of a vehicle, the system for aiding the inertial navigation system comprising: an imager configured to output signals associated with features passing through a field of view of the imager over a plurality of image frames; and a processing unit operatively coupled to the imager and configured to be operatively coupled to the inertial navigation system, the processing unit being configured to: determine a distance from the imager to one or more of the features passing through the field of view of the imager for a given image frame based on the signals; track at least one of the features from a first image frame containing the feature to any subsequent image frame while the feature remains within the field of view of the imager; calculate position information associated with the inertial navigation system relative to first tagged coordinates of each tracked feature; and update navigation information associated with the vehicle based on the calculated position information; wherein the signals associated with features passing through the field of view of the imager comprise pixel signals associated with terrain features.

2. The system of claim 1, wherein the processing unit is configured to determine a distance from the imager to a centroid of at least one of the one or more features.

3. The system of claim 1, wherein the navigation information comprises NED-coordinate position information of the inertial navigation system relative to the first tagged coordinates of each tracked feature.

4. The system of claim 1, wherein the navigation information comprises a navigation state vector associated with the inertial navigation system.

5. The system of claim 1, wherein the processing unit is configured to track new features as they pass within the field of view of the imager by determining the distance between the imager and the new features and by predicting a distance between the new features and the imager based on a current known location of the inertial navigation system.

6. The system of claim 1, wherein the processing unit is configured to determine and tag position information for new features a first time the new features enter into the field of view of the imager, based on at least one of the current inertial navigation system position, range to the new feature, and orientation and offset of the imager with respect to the inertial navigation system.

7. A vehicle comprising: an inertial navigation system; and a system for aiding the inertial navigation system, the system for aiding the inertial navigation system comprising: an imager configured to output signals associated with features passing through a field of view of the imager over a plurality of image frames; and a processing unit operatively coupled to the imager and the inertial navigation system, the processing unit being configured to: determine a distance from the imager to one or more of the features passing through the field of view of the imager for a given image frame based on the signals; track at least one of the features from a first image frame containing the feature to any subsequent image frame while the feature remains within the field of view of the imager; calculate position information associated with the inertial navigation system relative to first tagged coordinates of each tracked feature; and update navigation information associated with the inertial navigation system based on the calculated position information; wherein the signals associated with features passing through the field of view of the imager comprise pixel signals associated with terrain features.

8. The vehicle of claim 7, wherein the processing unit is configured to determine a distance from the imager to a centroid of at least one of the one or more features.

9. The vehicle of claim 7, wherein the navigation information comprises NED-coordinate position information of the inertial navigation system relative to the first tagged coordinates of each tracked feature.

10. The vehicle of claim 7, wherein the navigation information comprises a navigation state vector associated with the inertial navigation system.

11. The vehicle of claim 7, wherein the processing unit is configured to track new features as they pass within the field of view of the imager by determining the distance between the imager and the new features and by predicting a distance between the new features and the imager based on a current known location of the inertial navigation system.

12. The vehicle of claim 7, wherein the processing unit is configured to determine and tag position information for new features a first time the new features enter into the field of view of the imager, based on at least one of the current inertial navigation system position, range to the new feature, and orientation and offset of the imager with respect to the inertial navigation system.

13. A processor-implemented method of determining the position of a body in motion, the method comprising: receiving, in the processor, navigation information from an inertial navigation system associated with the body; determining a distance from an imager associated with the body to one or more features passing through a field of view of the imager for a given image frame; tracking at least one of the features from a first image frame to a subsequent image frame as the feature passes through the field of view of the imager; calculating position information for at least one of the tracked features; and updating, by the processor, the navigation information associated with the body based on the calculated position information; wherein tracking at least one of the features comprises tracking pixel signals associated with the features.

14. The method of claim 13, wherein determining the distance from the imager associated with the body to one or more features comprises determining a distance from the imager to a centroid of one or more terrain features.

15. The method of claim 13, wherein calculating the position information comprises calculating NED-coordinate position information for at least one of the tracked features.

16. The method of claim 13, wherein updating the navigation information comprises updating a navigation state vector associated with the body.

17. The method of claim 13, further comprising tracking new features as they pass within the field of view of the imager by determining the distance between the imager and the new features and by predicting the distance between the new features and the imager based on a current known location of the inertial navigation system.

18. The method of claim 13, further comprising determining and tagging NED-coordinate position information for new features a first time the new features enter into the field of view of the imager based on at least one of a current inertial navigation system position, range to the new feature, and orientation and offset of the imager with respect to the inertial navigation system.
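Claims 6, 12, and 18 enumerate the inputs for tagging a newly observed feature: the current inertial navigation system position, the range to the feature, and the orientation and offset of the imager with respect to the inertial navigation system. A minimal geometric sketch of that tagging step follows; the function names, the Z-Y-X Euler-angle rotation convention, and the representation of the line of sight as a body-frame vector are illustrative assumptions, not details taken from the patent.

```python
import math

def rotate_body_to_ned(v, roll, pitch, yaw):
    """Rotate a body-frame vector into the NED frame using Z-Y-X Euler
    angles in radians. The convention is an assumption; a real system
    would use the INS attitude solution directly."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Direction-cosine matrix, body -> NED.
    m = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def tag_feature_ned(ins_ned, attitude_rpy, imager_offset_body, los_body, range_m):
    """Tag a newly seen feature with NED coordinates from the elements the
    claims enumerate: INS position, range to the feature, and the imager's
    orientation/offset relative to the INS."""
    # Unit line of sight from the imager to the feature, body frame.
    norm = math.sqrt(sum(c * c for c in los_body))
    los_unit = [c / norm for c in los_body]
    # INS-to-feature vector in the body frame: lever arm plus range along
    # the line of sight; then rotate into NED and add the INS position.
    rel_body = [imager_offset_body[i] + range_m * los_unit[i] for i in range(3)]
    rel_ned = rotate_body_to_ned(rel_body, *attitude_rpy)
    return [ins_ned[i] + rel_ned[i] for i in range(3)]
```

For instance, with level attitude, a downward-looking line of sight, and a 100 m range from 100 m altitude, the tagged feature lands on the ground directly below the imager's lever arm.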
Patents Cited by This Patent (31)
Scherzinger, Bruno; Reid, Blake; Lithopoulos, Erik, AINS land surveyor system with reprocessing, AINS-LSSRP.
Lareau, Andre G.; Beran, Stephen R.; James, Brian; Quinn, James P.; Lund, John, Autonomous electro-optical framing camera system with constant ground resolution, unmanned airborne vehicle therefor, and methods of use.
Arnoul, Patrick; Guerin, Jean-Pierre; Letellier, Laurent; Viala, Marc, Method for calibrating the initial position and the orientation of one or several mobile cameras.
van der Merwe, Rudolph; Wan, Eric A.; Julier, Simon J., Navigation system applications of sigma-point Kalman filters for nonlinear estimation and sensor fusion.