Systems and methods for navigating a vehicle within an environment are provided. In one aspect, a method comprises: (a) selecting, with aid of a processor, a subset of a plurality of sensors to be used for navigating the vehicle within the environment based on one or more predetermined criteria, wherein the plurality of sensors are arranged on the vehicle such that each sensor of the plurality of sensors is configured to obtain sensor data from a different field of view; (b) processing, with aid of the processor, the sensor data from the selected sensor(s) so as to generate navigation information for navigating the vehicle within the environment; and (c) outputting, with aid of the processor, signals for controlling the vehicle based on the navigation information.
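The selection step (a) above can be sketched as a short loop. This is a minimal illustration, not the patented method: the sensor names, the scalar quality score, and the 0.5 cutoff are all hypothetical stand-ins for the "predetermined criteria" the claim leaves open.

```python
# Hypothetical sketch of step (a): each sensor observes its own field of
# view; the processor keeps only those sensors whose data meet a
# predetermined criterion (here, a minimum scalar quality score).

def select_sensors(readings, min_quality):
    """readings: dict mapping sensor id -> (quality_score, data)."""
    return {sid for sid, (q, _) in readings.items() if q >= min_quality}

readings = {
    "front_cam": (0.9, "..."),  # illustrative scores, not real data
    "rear_cam":  (0.3, "..."),
    "left_cam":  (0.7, "..."),
}
print(sorted(select_sensors(readings, min_quality=0.5)))
# prints ['front_cam', 'left_cam']
```

Steps (b) and (c) would then consume only the returned subset, so a sensor with a degraded view (here `rear_cam`) never contributes to the navigation output.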
Representative Claims
1. A navigation system, comprising: one or more propulsion units configured to effect movement of a vehicle; a plurality of sensors each configured to capture a plurality of sensor data, wherein the plurality of sensors comprise a plurality of imaging devices each configured to capture a plurality of images; and one or more processors operably coupled to the plurality of sensors and individually or collectively configured to: assess data quality of the plurality of sensor data from each of the plurality of sensors, wherein the data quality is assessed based on a feature point number of each image captured by said plurality of imaging devices, wherein said feature point number of each image is calculated using a corner detection algorithm that is a Features from Accelerated Segment Test (FAST) algorithm; select at least one of the plurality of sensors based on the assessed data quality; and determine state information for the vehicle using the plurality of sensor data from the selected sensor(s), wherein the state information of the vehicle comprises information about an environment surrounding the vehicle.

2. The system of claim 1, wherein the one or more processors are further configured to output control signals to the one or more propulsion units for effecting the movement of the vehicle, based on the state information of the vehicle.

3. The system of claim 1, wherein the plurality of sensors further comprise one or more of an ultrasonic sensor, a lidar sensor, or a radar sensor.

4. The system of claim 1, wherein the plurality of imaging devices are each oriented in a different direction relative to the vehicle.

5. The system of claim 4, wherein the different directions are orthogonal directions.

6. The system of claim 4, wherein the different directions comprise at least four different directions.

7. The system of claim 1, wherein the data quality is based on a saliency of each image captured by said plurality of imaging devices.

8. The system of claim 1, wherein the data quality is based on at least one of an exposure level or contrast level in each image captured by said plurality of imaging devices.

9. The system of claim 1, wherein the data quality is based on a suitability of said plurality of sensor data for use in determining the state information for the vehicle.

10. The system of claim 1, wherein the one or more processors are configured to assess whether the data quality of said plurality of sensor data exceeds a predetermined threshold.

11. The system of claim 1, wherein the one or more processors are configured to identify which of the plurality of sensor data has the highest data quality.

12. The system of claim 1, wherein the state information further comprises at least one of a position, an attitude, a velocity, or an acceleration of the vehicle.

13. The system of claim 1, wherein the one or more processors are configured to repeat the steps of assessing the data quality, selecting at least one of the plurality of sensors, and determining state information during operation of the vehicle.

14. The system of claim 1, wherein the one or more processors are further configured to select at least one of the plurality of sensors based on a power consumption of the plurality of sensors.

15. A navigation system, comprising: a plurality of imaging devices on a vehicle, wherein each imaging device is configured to capture a plurality of images, wherein the plurality of imaging devices comprise a primary imaging device and one or more secondary imaging devices; and one or more processors operably coupled to the plurality of imaging devices and individually or collectively configured to: (1) assess image quality of the plurality of images from the primary imaging device to determine whether said image quality meets a predetermined threshold, and (2) assess image quality of the plurality of images from the one or more secondary imaging devices if the image quality of the plurality of images from the primary imaging device does not meet the predetermined threshold, wherein the image quality of the plurality of images from the primary and/or secondary imaging devices is assessed based on a feature point number of each image of said plurality of images, wherein said feature point number of each image is calculated using a corner detection algorithm that is a Features from Accelerated Segment Test (FAST) algorithm; select at least one of the one or more secondary imaging devices based on the assessment of the image quality of the plurality of images from the one or more secondary imaging devices; and determine state information for the vehicle using the plurality of images from the selected secondary imaging device(s), wherein the state information of the vehicle comprises information about an environment surrounding the vehicle.

16. The system of claim 15, wherein the vehicle is an unmanned aerial vehicle.

17. The system of claim 15, wherein the plurality of imaging devices are arranged on the vehicle such that each imaging device of the plurality of imaging devices is configured to capture the plurality of images from a different field of view.

18. The system of claim 15, wherein the plurality of images comprises a plurality of successive image frames captured over a predetermined time interval.

19. The system of claim 18, wherein the predetermined time interval is within a range from about 0.02 seconds to about 0.1 seconds.

20. The system of claim 15, wherein the image quality of the plurality of images from the primary and/or secondary imaging devices is based on a saliency of each image of said plurality of images.

21. The system of claim 15, wherein the image quality of the plurality of images from the primary and/or secondary imaging devices is based on at least one of an exposure level or contrast level in each image of said plurality of images.

22. The system of claim 15, wherein the state information further comprises at least one of a position, an attitude, a velocity, or an acceleration of the vehicle.

23. The system of claim 22, wherein the attitude comprises at least one of a roll orientation, a pitch orientation, or a yaw orientation of the vehicle.

24. A method for controlling a moving vehicle operably coupled to a plurality of imaging devices, comprising: capturing a plurality of images with each imaging device of the plurality of imaging devices; and, with aid of one or more processors, individually or collectively: assessing image quality of the plurality of images from each imaging device, wherein the image quality is assessed based on a feature point number of each image captured by said plurality of imaging devices, wherein said feature point number of each image is calculated using a corner detection algorithm that is a Features from Accelerated Segment Test (FAST) algorithm; selecting at least one of the plurality of imaging devices based on the assessment of the image quality of the plurality of images; and determining state information for the vehicle using the plurality of images from the selected imaging device(s), wherein the state information of the vehicle comprises information about an environment surrounding the vehicle.

25. A method for assessing state information of a moving vehicle having a plurality of imaging devices attached thereto, comprising: capturing a plurality of images with each imaging device of the plurality of imaging devices, wherein the plurality of imaging devices comprises a primary imaging device and one or more secondary imaging devices; and, with aid of one or more processors, individually or collectively: (1) assessing image quality of the plurality of images from the primary imaging device to determine whether said image quality meets a predetermined threshold, and (2) assessing image quality of the plurality of images from the one or more secondary imaging devices if the image quality of the plurality of images from the primary imaging device does not meet the predetermined threshold, wherein the image quality of the plurality of images from the primary and/or secondary imaging devices is assessed based on a feature point number of each image of said plurality of images, wherein the feature point number of each image is calculated using a corner detection algorithm that is a Features from Accelerated Segment Test (FAST) algorithm; selecting at least one of the one or more secondary imaging devices based on the assessment of the image quality of the plurality of images from the one or more secondary imaging devices; and determining state information for the vehicle using the plurality of images from the selected secondary imaging device(s), wherein the state information of the vehicle comprises information about an environment surrounding the vehicle.
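The primary/secondary fallback of claims 15 and 25 can be sketched as follows. This is an illustrative sketch only: `count_features` is a toy stand-in for the FAST corner detector named in the claims (a real system would run FAST itself, e.g. via OpenCV's `cv2.FastFeatureDetector_create()`), the "images" are 1-D brightness rows, and the threshold is arbitrary.

```python
# Sketch of the claimed flow: assess the primary imaging device first,
# and only if it fails the quality threshold, assess and select among
# the secondary imaging devices by feature-point number.

def count_features(image):
    # Toy stand-in for FAST: count adjacent-pixel brightness jumps.
    return sum(abs(a - b) > 40 for a, b in zip(image, image[1:]))

def mean_quality(images):
    # Quality of an image stream = average feature-point number per frame.
    return sum(count_features(im) for im in images) / len(images)

def choose_device(primary, secondaries, threshold):
    """primary: list of images; secondaries: dict name -> list of images."""
    if mean_quality(primary) >= threshold:
        return "primary"
    # Primary failed the threshold: assess the secondary streams and pick
    # the one whose images carry the most feature points.
    return max(secondaries, key=lambda n: mean_quality(secondaries[n]))

flat   = [[10, 12, 11, 13]] * 3        # featureless scene: primary fails
busy   = [[0, 100, 0, 100, 0]] * 3     # many brightness jumps
medium = [[0, 100, 10, 12]] * 3
print(choose_device(flat, {"left": medium, "right": busy}, threshold=1))
# prints right
```

State estimation (claims 1, 24, 25) would then use only the frames from the returned device, so a camera facing a textureless surface never feeds the pose estimator.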
Patents Cited in This Patent (15)

Wand, Martin A. (Plano, TX); Ho, Anh V. (Dallas, TX); Lin, Chuan-Fu (Westboro, MA); Huang, Phen-Lan (Richardson, TX); Williston, John P. (Plano, TX); Rice, Haradon J. (Plano, TX); Doty, Thomas J. (Dallas, TX), "Closed-loop navigation system for mobile robots."