Unmanned aerial vehicle and methods for controlling same
IPC Classification
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G06T-011/60
B64C-039/02
G05D-001/10
G05D-001/00
G08G-005/00
G05D-001/04
G06T-001/00
G06T-011/20
H04N-007/18
Application number
US-0726106
(2015-05-29)
Registration number
US-9346543
(2016-05-24)
Inventor / Address
Kugelmass, Bret
Applicant / Address
Airphrame Inc.
Agent / Address
Run8 Patent Group, LLC
Citation information
Cited by: 8
Patents cited: 4
Abstract
One variation of a method for imaging an area of interest includes: within a user interface, receiving a selection for a set of interest points on a digital map of a physical area and receiving a selection for a resolution of a geospatial map; identifying a ground area corresponding to the set of interest points for imaging during a mission; generating a flight path over the ground area for execution by an unmanned aerial vehicle during the mission; setting an altitude for the unmanned aerial vehicle along the flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the unmanned aerial vehicle; setting a geospatial accuracy requirement for the mission based on the selection for the mission type; and assembling a set of images captured by the unmanned aerial vehicle during the mission into the geospatial map.
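The abstract ties the vehicle's altitude to the requested map resolution and the on-board optical system. The patent text does not disclose a formula, but the standard ground sample distance (GSD) relation for a nadir-pointing camera illustrates the dependency; the function name and all parameter values below are hypothetical, not taken from the patent.

```python
def altitude_for_resolution(gsd_m_per_px, focal_length_mm, sensor_width_mm, image_width_px):
    """Invert the ground-sample-distance relation to get flight altitude (m).

    GSD = (sensor_width * altitude) / (focal_length * image_width), so
    altitude = GSD * focal_length * image_width / sensor_width.
    Millimetres cancel between focal length and sensor width.
    """
    return gsd_m_per_px * focal_length_mm * image_width_px / sensor_width_mm

# Illustrative numbers: a 2 cm/px map with a 16 mm lens, 13.2 mm-wide
# sensor, and 5472 px-wide images requires roughly 133 m of altitude.
alt = altitude_for_resolution(0.02, 16.0, 13.2, 5472)
```

A finer requested resolution lowers the computed altitude, which is consistent with claim 10's per-feature set of target altitudes over varying terrain.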
Representative Claims
1. A method, comprising: within a user interface, receiving a selection for a set of interest points on a digital map of a physical area; based on the set of interest points, selecting a ground area for imaging during a mission; generating a flight path over the ground area for execution by an unmanned aerial vehicle during the mission, the flight path defining an imaging flight path for capturing images of the ground area, a first landing path specifying a landing site and associated with a first environmental condition, and a second landing path specifying the landing site and associated with a second environmental condition distinct from the first environmental condition; retrieving a current environmental condition proximal the landing site during execution of the flight path by the unmanned aerial vehicle during the mission; selecting the first landing path for the unmanned aerial vehicle based on a similarity between the first environmental condition of the first landing path and the current environmental condition proximal the landing site; and transmitting a command to the unmanned aerial vehicle to execute the first landing path.

2. The method of claim 1, wherein generating the flight path comprises: selecting the landing site proximal the ground area from a previous geospatial map generated from images captured during a previous mission; from the previous geospatial map, identifying an obstruction proximal the landing site; and defining the first landing path and the second landing path to avoid the obstruction.

3. The method of claim 1, wherein retrieving the current environmental condition comprises receiving telemetry data of the unmanned aerial vehicle during execution of the flight path by the unmanned aerial vehicle and extrapolating the current environmental condition from the telemetry data.

4. The method of claim 3, wherein extrapolating the current environmental condition from the telemetry data comprises transforming the telemetry data into an actual flight path of the unmanned aerial vehicle and estimating a local wind direction proximal the landing site based on a difference between the imaging flight path and the actual flight path of the unmanned aerial vehicle.

5. The method of claim 4, wherein generating the flight path comprises assigning a first range of wind directions to the first landing path and assigning a second range of wind directions to the second landing path, the second range of wind directions distinct from the first range of wind directions; wherein selecting the first landing path for the unmanned aerial vehicle comprises selecting the first landing path based on an overlap between the first range of wind directions and the local wind direction.

6. The method of claim 1, wherein generating the flight path comprises defining the imaging flight path, the first landing path, the second landing path, and a third landing path specifying a second landing site within the ground area distinct from the landing site; wherein selecting the first landing path for the unmanned aerial vehicle comprises selecting the landing site over the second landing site based on a detected wind condition proximal the second landing site; and wherein transmitting the command to the unmanned aerial vehicle comprises transmitting the command to the unmanned aerial vehicle to execute the first landing path upon conclusion of the imaging flight path.

7.
The method of claim 1, wherein retrieving the current environmental condition proximal the landing site comprises identifying precipitation over the ground area; and wherein transmitting the command to the unmanned aerial vehicle comprises transmitting a command to terminate the imaging flight path and to initiate the first landing path in response to identification of precipitation over the ground area.

8. A method, comprising: within a user interface, receiving a selection for a land area survey and a selection for a resolution requirement for a geospatial map; in response to the selection for the land area survey, prompting selection of points on a digital map of a physical area within the user interface; in response to selection of a set of points on the digital map, defining a polygonal land area for imaging during a mission, the polygonal land area bounded by the set of points; generating a flight path over the polygonal land area for execution by an unmanned aerial vehicle during the mission; setting a target altitude for the unmanned aerial vehicle along the flight path based on the selection for the resolution requirement for the geospatial map and an imaging system arranged within the unmanned aerial vehicle; uploading the flight path and the target altitude to the unmanned aerial vehicle; selecting a ground control point within the polygonal land area; and assembling a set of images captured by the unmanned aerial vehicle during the mission into the geospatial map based on the ground control point.

9. The method of claim 8, further comprising setting a geospatial accuracy requirement for the mission based on the selection for the land area survey; and setting an overlap target between consecutive images in a series of images captured by the unmanned aerial vehicle along the flight path based on the geospatial accuracy requirement; wherein selecting the ground control point comprises selecting a set of ground control points based on the geospatial accuracy requirement; and wherein assembling the set of images into the geospatial map comprises assembling the set of images into the geospatial map based on the geospatial accuracy requirement and the set of ground control points.

10. The method of claim 8, wherein setting the target altitude for the unmanned aerial vehicle along the flight path comprises retrieving terrain altitude data, identifying features within the ground area from the terrain altitude data, and defining a set of target altitudes over the features and along the flight path based on the resolution requirement and the imaging system arranged within the unmanned aerial vehicle.

11. The method of claim 8, wherein assembling the set of images into the geospatial map comprises passing the set of images to a third-party mapping service for assembly into an orthorectified geospatially-accurate visual map.

12. The method of claim 11, further comprising distributing portions of the orthorectified geospatially-accurate visual map to the user interface over a computer network.

13. The method of claim 8, wherein selecting the polygonal land area comprises assigning a geospatial coordinate to each point in the set of points on the digital map, defining each point in the set of points as a vertex in a set of vertices, and connecting the set of vertices by straight lines defining the polygonal land area.

14.
A method, comprising: during execution of a mission by an unmanned aerial vehicle: receiving a first image file of a first image from the unmanned aerial vehicle over a first wireless network, the first image captured by the unmanned aerial vehicle during flight over a preselected ground area; receiving a first image file of a second image from the unmanned aerial vehicle over the first wireless network, the second image captured by the unmanned aerial vehicle during flight over the preselected ground area; and serving the first image file of the first image and the first image file of the second image to a user interface; and in response to completion of the mission: receiving a second image file of the first image from the unmanned aerial vehicle over a second wireless network, the second image file of the first image of a greater file size than the first image file of the first image; receiving a second image file of the second image from the unmanned aerial vehicle over the second wireless network, the second image file of the second image of a greater file size than the first image file of the second image; and assembling the second image file of the first image and the second image file of the second image into a geospatial map of the preselected ground area.

15. The method of claim 14, wherein serving the first image file of the first image and the first image file of the second image to the user interface further comprises: rendering the first image file of the first image in the user interface substantially in real-time in response to receipt of the first image file of the first image from the unmanned aerial vehicle during the mission; and replacing the first image file of the first image with the first image file of the second image in the user interface substantially in real-time in response to receipt of the first image file of the second image from the unmanned aerial vehicle during the mission.

16.
The method of claim 14, wherein receiving the first image file of the first image comprises receiving a first thumbnail image file of the first image from the unmanned aerial vehicle over a cellular communication network.

17. The method of claim 14, wherein receiving the second image file of the first image comprises receiving a high-resolution color image file of the first image from the unmanned aerial vehicle over a local area wireless computer network.

18. The method of claim 14, wherein receiving the second image file of the first image comprises receiving a high-resolution infrared image file of the first image from the unmanned aerial vehicle.

19. The method of claim 14, wherein uploading the second image file of the first image comprises uploading, to the computer network, the second image file of the first image with a first time of capture of the first image and a geospatial coordinate of the unmanned aerial vehicle at the first time; and wherein assembling the second image file of the first image and the second image file of the second image into the geospatial map comprises stitching the second image file of the first image and the second image file of the second image into the geospatial map based on the geospatial coordinate of the unmanned aerial vehicle at the first time.

20. The method of claim 19, wherein generating the geospatial map comprises tagging a pixel in the geospatial map with the first time, the pixel defined in a region of the geospatial map corresponding to the first image file of the first image.

21. The method of claim 19, wherein assembling the second image file of the first image and the second image file of the second image into the geospatial map comprises passing the second image file of the first image and the second image file of the second image to a third-party mapping service for assembly into an orthorectified geospatially-accurate visual map.
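Claim 19 stitches images into the map using the vehicle's recorded geospatial coordinate at each capture time. As a hedged sketch only (the patent itself defers assembly to a third-party mapping service), a flat equirectangular approximation can place each capture point on the output raster at a chosen resolution; the function name, Earth-radius constant, and coordinates below are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; adequate for small areas

def map_pixel(lat, lng, origin_lat, origin_lng, gsd_m_per_px):
    """Return the (x, y) pixel of (lat, lng) relative to the map origin.

    Equirectangular approximation: rows grow southward from the origin,
    columns grow eastward, scaled by the map's ground sample distance.
    """
    dy = math.radians(origin_lat - lat) * EARTH_RADIUS_M
    dx = math.radians(lng - origin_lng) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    return round(dx / gsd_m_per_px), round(dy / gsd_m_per_px)

# A capture point south-east of the map origin lands at positive pixel offsets
# on a 5 cm/px map.
x, y = map_pixel(37.0000, -122.0000, 37.0010, -122.0010, 0.05)
```

Real pipelines orthorectify against terrain and ground control points (claims 8-9) rather than using this planar shortcut, but the placement step reduces to the same coordinate-to-pixel mapping.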
Patents cited by this patent (4)
Offer, Brad W.; Parks, Paul Clark; Bruce, Alan Eugene; Spinelli, Charles B.; Reid, Travis S., Aircraft emergency landing route system.