Various embodiments provide methods for controlling landings of a UAV in a landing zone including a plurality of landing bays. Various embodiments include a method implemented on a computing device for receiving continuous real-time sensor data from a transceiver and from sensors onboard the UAV, and detecting a target landing bay within the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data. An orientation and position coordinates for landing in the target landing bay may be calculated based on the continuous real-time sensor data. Information regarding positions and flight vectors of a plurality of autonomous UAVs may be obtained, and a flight plan for landing in the target landing bay may be generated based on the orientation and the position coordinates, the positions and flight vectors of the plurality of autonomous UAVs, and a current orientation and position of the UAV.
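The flight-plan generation summarized above can be sketched in Python. This is a minimal illustration only, not the patented method: the `State` type, the straight-line waypoint interpolation, the safety radius, and the per-leg timing are all assumptions introduced here, since the abstract does not specify a planning algorithm.

```python
import math
from dataclasses import dataclass


@dataclass
class State:
    # Position (m) and flight vector (m/s) of a UAV.
    x: float
    y: float
    z: float
    vx: float
    vy: float
    vz: float


def generate_landing_plan(uav, bay_xy, n_waypoints=5):
    """Interpolate straight-line descent waypoints from the UAV's current
    position down to the target landing bay at ground level (z = 0)."""
    bx, by = bay_xy
    return [
        (uav.x + (bx - uav.x) * t, uav.y + (by - uav.y) * t, uav.z * (1 - t))
        for t in (i / n_waypoints for i in range(1, n_waypoints + 1))
    ]


def plan_is_clear(plan, others, safety_radius=10.0, dt_per_leg=2.0):
    """Check each waypoint against the other UAVs' reported positions,
    extrapolated along their flight vectors to the waypoint's arrival time."""
    for i, waypoint in enumerate(plan, start=1):
        t = i * dt_per_leg
        for o in others:
            predicted = (o.x + o.vx * t, o.y + o.vy * t, o.z + o.vz * t)
            if math.dist(waypoint, predicted) < safety_radius:
                return False
    return True
```

A plan that terminates at the bay's center point would be accepted when no other UAV is predicted near any waypoint, and rejected (triggering replanning) otherwise.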
Representative Claims
1. A method performed by a processor of an unmanned aerial vehicle (UAV) for controlling landing in a landing zone including a plurality of landing bays while flying among a plurality of other UAVs, comprising: receiving continuous real-time sensor data; detecting a target landing bay within the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data; calculating an orientation and position coordinates for landing in the target landing bay based on the continuous real-time sensor data; obtaining information regarding positions and flight vectors of the plurality of other UAVs, wherein obtaining the information comprises: receiving position and flight vector reports from the plurality of other UAVs via a transceiver on board the UAV, wherein the UAV and the plurality of other UAVs are each independently executing a separate flight mission; generating a flight plan for landing in the target landing bay based on the orientation and the position coordinates, the positions and flight vector reports of the plurality of other UAVs, and a current orientation and position of the UAV; performing the flight plan for landing in the target landing bay; determining whether an exception condition is identified based on the continuous real-time sensor data and the flight plan for landing in the target landing bay; and in response to determining that an exception condition is identified: halting performance of the flight plan for landing in the target landing bay; and performing exception-handling operations based on the identified exception condition, wherein performing the exception-handling operations includes detecting an alternative target landing bay from the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data by: performing operations to adjust a perspective of the UAV independent of any landing routine, the operations comprising one or more of: causing the UAV to ascend to a higher altitude above the landing zone, changing a pitch setting of the UAV, changing a roll setting of the UAV, or changing a yaw setting of the UAV; and obtaining the continuous real-time sensor data in the adjusted perspective.

2. The method of claim 1, wherein the continuous real-time sensor data is received from sensors on board the UAV.

3. The method of claim 2, wherein obtaining information regarding the positions and flight vectors of the plurality of other UAVs further comprises: obtaining camera imagery via a camera, wherein the camera is one of the sensors on board the UAV; tracking the plurality of other UAVs using the camera imagery; and calculating the positions and flight vectors of the plurality of other UAVs based on the camera imagery and the UAV's own position and flight vectors.

4. The method of claim 2, wherein detecting the target landing bay from the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data comprises: obtaining camera imagery via a camera, wherein the camera is one of the sensors on board the UAV; determining whether the camera imagery includes imagery of an assigned landing bay; and detecting, within the camera imagery, imagery of an open landing bay that is available for landing in response to determining that the camera imagery does not include the imagery of the assigned landing bay.

5. The method of claim 4, wherein the imagery of the assigned landing bay or the imagery of the open landing bay is imagery of a landing pattern that comprises at least a circle circumscribing an asymmetric symbol, wherein each hemisphere of the asymmetric symbol is different from the opposite hemisphere of the asymmetric symbol such that the landing pattern indicates a global orientation.

6. The method of claim 5, wherein calculating the orientation comprises: calculating the orientation based on a comparison of a current heading of the UAV to the global orientation of the asymmetric symbol in the imagery of the landing pattern.

7. The method of claim 1, wherein the continuous real-time sensor data is received via at least the transceiver on board the UAV.

8. The method of claim 1, further comprising continuously transmitting reports of the UAV's own position and flight vector via the transceiver.

9. The method of claim 1, wherein performing the exception-handling operations comprises: calculating an alternative orientation and alternative position coordinates for landing in the alternative target landing bay based on the continuous real-time sensor data; adjusting the flight plan for landing in the alternative target landing bay based on the alternative orientation, the alternative position coordinates, and the current orientation and position of the UAV; and performing the flight plan for landing in the alternative target landing bay.

10. The method of claim 1, wherein performing the exception-handling operations comprises: adjusting a parameter of a sensor on board the UAV that is configured to obtain the continuous real-time sensor data, wherein the parameter includes one or more of a zoom setting of a camera and a focus setting of the camera.

11. The method of claim 1, wherein determining whether an exception condition is identified based on the continuous real-time sensor data comprises: determining whether the target landing bay is obstructed based on the continuous real-time sensor data; and wherein halting performance of the flight plan for landing in the target landing bay in response to identifying the exception condition comprises: halting the flight plan for landing in the target landing bay in response to determining that the target landing bay is obstructed.

12. The method of claim 1, wherein determining whether an exception condition is identified based on the continuous real-time sensor data comprises: determining whether the UAV has lost track of the target landing bay based on the continuous real-time sensor data; and wherein halting performance of the flight plan for landing in the target landing bay in response to identifying the exception condition comprises: halting the flight plan for landing in the target landing bay in response to determining that the UAV has lost track of the target landing bay based on the continuous real-time sensor data.

13. The method of claim 1, wherein determining whether an exception condition is identified based on the continuous real-time sensor data and the flight plan for landing in the target landing bay comprises: continuously monitoring positions and flight vectors of the plurality of other UAVs while performing the flight plan for landing in the target landing bay; and calculating a probability of a mid-air collision occurring with one or more of the plurality of other UAVs while following the flight plan for landing in the target landing bay based on the positions and flight vectors of the plurality of other UAVs; and wherein performing the exception-handling operations based on the identified exception condition comprises: adjusting the flight plan for landing in the target landing bay in response to determining that the calculated probability of the mid-air collision occurring exceeds a safety threshold.

14. The method of claim 1, further comprising: determining whether the UAV is beginning an approach into the landing zone based on the continuous real-time sensor data, wherein detecting the target landing bay from the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data comprises: detecting the target landing bay within the landing zone based on the continuous real-time sensor data in response to determining that the UAV is beginning the approach.

15. The method of claim 14, wherein determining that the UAV is beginning the approach into the landing zone based on the continuous real-time sensor data comprises: comparing coordinates of the UAV from the continuous real-time sensor data to coordinates for the landing zone.

16. The method of claim 14, wherein determining that the UAV is beginning the approach into the landing zone based on the continuous real-time sensor data comprises: detecting imagery of the landing zone within the continuous real-time sensor data.

17. The method of claim 1, wherein calculating the position coordinates for landing in the target landing bay based on the continuous real-time sensor data comprises: calculating a center point of the surface of the target landing bay.

18. The method of claim 1, wherein the continuous real-time sensor data includes one or more of location data received from a global positioning system receiver, audio data from a microphone, movement data from an accelerometer, and orientation data from a gyroscope.

19. A computing device, comprising: a memory; a transceiver; and a processor coupled to the memory and configured with processor-executable instructions to: receive continuous real-time sensor data; detect a target landing bay within a plurality of landing bays within a landing zone that is available for landing based on the continuous real-time sensor data; calculate an orientation and position coordinates for landing in the target landing bay based on the continuous real-time sensor data; obtain information regarding positions and flight vectors of a plurality of other UAVs operating within the landing zone by: receiving position and flight vector reports from the plurality of other UAVs via the transceiver, wherein the UAV and the plurality of other UAVs are each independently executing a separate flight mission; generate a flight plan for landing in the target landing bay based on the orientation and the position coordinates, the positions and flight vector reports of the plurality of other UAVs, and a current orientation and position of the computing device; perform the flight plan for landing in the target landing bay; determine whether an exception condition is identified based on the continuous real-time sensor data and the flight plan for landing in the target landing bay; and in response to determining that an exception condition is identified: halt performance of the flight plan for landing in the target landing bay; and perform exception-handling operations based on the identified exception condition, wherein the exception-handling operations include detecting an alternative target landing bay from the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data by: performing operations to adjust a perspective of the UAV independent of any landing routine, the operations comprising one or more of: causing the UAV to ascend to a higher altitude above the landing zone, changing a pitch setting of the UAV, changing a roll setting of the UAV, or changing a yaw setting of the UAV; and obtaining the continuous real-time sensor data in the adjusted perspective.

20. The computing device of claim 19, wherein the computing device is within an unmanned aerial vehicle (UAV).

21. The computing device of claim 19, wherein the computing device further comprises sensors, and wherein the continuous real-time sensor data is received via the sensors, the transceiver, or both.

22. The computing device of claim 21, wherein the processor is further configured with processor-executable instructions to: continuously transmit reports of the computing device's own position and flight vector via the transceiver.

23. The computing device of claim 21, wherein the processor is further configured with processor-executable instructions to obtain information regarding the positions and flight vectors of the plurality of other UAVs by: obtaining camera imagery via a camera, wherein the camera is one of the sensors; tracking the plurality of other UAVs using the camera imagery; and calculating the positions and flight vectors of the plurality of other UAVs based on the camera imagery and the computing device's own position and flight vectors.

24. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a computing device of an unmanned aerial vehicle (UAV) to perform operations comprising: receiving continuous real-time sensor data; detecting a target landing bay within a plurality of landing bays within a landing zone that is available for landing based on the continuous real-time sensor data; calculating an orientation and position coordinates for landing in the target landing bay based on the continuous real-time sensor data; obtaining information regarding positions and flight vectors of a plurality of other UAVs operating within the landing zone, wherein obtaining the information comprises: receiving position and flight vector reports from the plurality of other UAVs via a transceiver on board the UAV, wherein the UAV and the plurality of other UAVs are each independently executing a separate flight mission; generating a flight plan for landing in the target landing bay based on the orientation and the position coordinates, the positions and flight vector reports of the plurality of other UAVs, and a current orientation and position of the UAV; performing the flight plan for landing in the target landing bay; determining whether an exception condition is identified based on the continuous real-time sensor data and the flight plan for landing in the target landing bay; and in response to determining that an exception condition is identified: halting performance of the flight plan for landing in the target landing bay; and performing exception-handling operations based on the identified exception condition, wherein the exception-handling operations include detecting an alternative target landing bay from the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data by: performing operations to adjust a perspective of the UAV independent of any landing routine, the operations comprising one or more of: causing the UAV to ascend to a higher altitude above the landing zone, changing a pitch setting of the UAV, changing a roll setting of the UAV, or changing a yaw setting of the UAV; and obtaining the continuous real-time sensor data in the adjusted perspective.

25. A computing device, comprising: means for receiving continuous real-time sensor data; means for detecting a target landing bay within a plurality of landing bays within a landing zone that is available for landing based on the continuous real-time sensor data; means for calculating an orientation and position coordinates for landing in the target landing bay based on the continuous real-time sensor data; means for obtaining information regarding positions and flight vectors of a plurality of other UAVs operating within the landing zone, wherein the means for obtaining the information comprises: means for receiving position and flight vector reports from the plurality of other UAVs, wherein the UAV and the plurality of other UAVs are each independently executing a separate flight mission; means for generating a flight plan for landing in the target landing bay based on the orientation and the position coordinates, the positions and flight vector reports of the plurality of other UAVs, and a current orientation and position of the computing device; means for performing the flight plan for landing in the target landing bay; means for determining whether an exception condition is identified based on the continuous real-time sensor data and the flight plan for landing in the target landing bay; means for halting performance of the flight plan for landing in the target landing bay in response to determining that an exception condition is identified; and means for performing exception-handling operations based on the identified exception condition in response to determining that an exception condition is identified, comprising means for detecting an alternative target landing bay from the plurality of landing bays within the landing zone that is available for landing based on the continuous real-time sensor data, wherein the means for detecting the alternative target landing bay further comprises: means for performing operations to adjust a perspective of the UAV independent of any landing routine, comprising one or more of: means for causing the UAV to ascend to a higher altitude above the landing zone, means for changing a pitch setting of the UAV, means for changing a roll setting of the UAV, or means for changing a yaw setting of the UAV; and means for obtaining the continuous real-time sensor data in the adjusted perspective.
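Two computations named in the claims lend themselves to short sketches: the heading comparison of claim 6 and the collision check of claim 13. The functions below are hypothetical illustrations only; the claims disclose no formulas, so `yaw_correction` assumes a simple normalized heading difference against the global orientation indicated by the asymmetric landing-pattern symbol, and `closest_approach` uses the standard constant-velocity closest-point-of-approach result as a stand-in for the unspecified "probability of a mid-air collision."

```python
import math


def yaw_correction(current_heading_deg, pattern_orientation_deg):
    """Claim 6: compare the UAV's current heading to the global orientation
    indicated by the asymmetric symbol. Returns the required yaw change,
    normalized into (-180, 180] degrees."""
    diff = (pattern_orientation_deg - current_heading_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff


def closest_approach(p1, v1, p2, v2):
    """Claim 13 proxy: time and distance of closest approach for two
    constant-velocity points. A threshold on the returned distance can
    serve as the safety check that triggers flight-plan adjustment."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    vv = sum(c * c for c in dv)
    # If relative velocity is zero, the separation never changes.
    t = 0.0 if vv == 0 else max(0.0, -sum(a * b for a, b in zip(dp, dv)) / vv)
    d = math.sqrt(sum((a + b * t) ** 2 for a, b in zip(dp, dv)))
    return t, d
```

For example, two UAVs flying head-on at the same altitude reach zero separation at the midpoint crossing time, which would exceed any sensible safety threshold and halt the landing flight plan.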
Patents cited by this patent (4)
Burdoin, Robert B. (Salt Lake City, UT); Moolenijzer, Nicolaas J. (Sandia Park, NM); Strohacker, Fred M. (Albuquerque, NM), Airborne drone formation control system.