Unmanned aerial vehicle and methods for controlling same
IPC Classification
Country / Type: United States (US) Patent (Granted)
International Patent Classification (IPC, 7th ed.): G05D-001/10; G06T-011/60; H04N-007/18
Application number: US-0204634 (2014-03-11)
Registration number: US-9075415 (2015-07-07)
Inventor / Address: Kugelmass, Bret
Applicant / Address: Airphrame, Inc.
Agent / Address: Run8 Patent Group
Citation information: cited by 27 patents; cites 18 patents
Abstract
One variation of a method for imaging an area of interest includes: within a user interface, receiving a selection for a set of interest points on a digital map of a physical area and receiving a selection for a resolution of a geospatial map; identifying a ground area corresponding to the set of interest points for imaging during a mission; generating a flight path over the ground area for execution by an unmanned aerial vehicle during the mission; setting an altitude for the unmanned aerial vehicle along the flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the unmanned aerial vehicle; setting a geospatial accuracy requirement for the mission based on the selection for the mission type; and assembling a set of images captured by the unmanned aerial vehicle during the mission into the geospatial map.
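The abstract's altitude-setting step ties the requested map resolution (ground distance per pixel) to the vehicle's optics. For a nadir-pointing camera with a fixed angle of view, as in claim 3 below, that relationship is simple geometry. A minimal sketch, with illustrative function and parameter names not taken from the patent:

```python
import math

def altitude_for_resolution(gsd_m_per_px, image_width_px, hfov_deg):
    """Altitude (m) at which a nadir-pointing camera with a fixed
    horizontal angle of view achieves the requested ground sample
    distance (meters of ground per image pixel).

    Ground footprint width = 2 * h * tan(hfov / 2), so
    h = gsd * image_width_px / (2 * tan(hfov / 2)).
    """
    half_fov = math.radians(hfov_deg) / 2.0
    return gsd_m_per_px * image_width_px / (2.0 * math.tan(half_fov))

# e.g. 2 cm/px with a 4000 px-wide sensor and a 60-degree horizontal FOV
altitude_m = altitude_for_resolution(0.02, 4000, 60.0)
```

A finer requested resolution (smaller ground distance per pixel) yields a proportionally lower target altitude, which is why the claims set altitude from the resolution selection and the optical system together.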
Representative Claims
1. A method, comprising: within a user interface, receiving a selection for a mission type, receiving a selection for a set of interest points on a digital map of a physical area, receiving a selection for a resolution of a geospatial map, and receiving a time accuracy for the geospatial map; identifying a ground area corresponding to the set of interest points for imaging during a mission by selecting the ground area within an area bounded by the set of interest points and unassociated with images stored from a previous mission completed within a threshold period of time defined by the time accuracy; generating a flight path over the ground area for execution by an unmanned aerial vehicle during the mission; setting an altitude for the unmanned aerial vehicle along the flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the unmanned aerial vehicle; setting a geospatial accuracy requirement for the mission based on the selection for the mission type; and assembling a set of images captured by the unmanned aerial vehicle during the mission and an image stored from a previous mission into the geospatial map based on the geospatial accuracy requirement, the previous mission completed within the threshold period of time defined by the time accuracy.

2. The method of claim 1, wherein setting the altitude for the unmanned aerial vehicle along the flight path comprises accessing terrain altitude data of features within the ground area and setting the altitude for the unmanned aerial vehicle to maintain an altitude over features within the ground area along the flight path.

3. The method of claim 1, wherein receiving the selection for the resolution of the geospatial map comprises receiving a selection for a value for a linear distance per pixel in the geospatial map, and wherein setting the altitude for the unmanned aerial vehicle along the flight path comprises specifying a target altitude for the unmanned aerial vehicle during the flight path to achieve the value for the linear distance per pixel in the geospatial map with images captured by the unmanned aerial vehicle through a lens of substantially fixed angle of view.

4. The method of claim 1, wherein generating the flight path over the ground area comprises generating the flight path over a first portion of the ground area for execution by the unmanned aerial vehicle during the mission and generating a second flight path over a second portion of the ground area for execution by a second unmanned aerial vehicle during the mission, and wherein assembling the set of images into the geospatial map comprises aggregating the set of images captured by the unmanned aerial vehicle and a second set of images captured by the second unmanned aerial vehicle during the mission into the geospatial map.

5. The method of claim 1, wherein receiving the selection for the mission type comprises receiving a selection for a structural survey, wherein receiving the selection for the set of interest points comprises receiving an address of a structure, wherein identifying the ground area comprises selecting a coordinate in a geospatial coordinate system corresponding to the address of the structure, and wherein setting the geospatial accuracy requirement for the mission comprises disabling a ground control point requirement for the mission based on the selection for the structural survey.

6. The method of claim 1, further comprising, through the user interface, receiving a selection for a particular geographic coordinate system for the geospatial map, wherein assembling the set of images into the geospatial map comprises tagging each pixel in the geospatial map with a geographic coordinate in a default geographic coordinate system and converting a geographic coordinate of each pixel in the geospatial map into the particular geographic coordinate system.

7. The method of claim 1, wherein assembling the set of images into the geospatial map comprises downloading the set of images from the unmanned aerial vehicle over a computer network upon completion of the mission.

8. A method, comprising: within a user interface, receiving a selection for a land area survey, receiving selection of three points on a digital map of a physical area, and receiving a selection for a resolution of a geospatial map; identifying a ground area for imaging during a mission by selecting a polygonal land area bounded by the three points; generating a flight path over the ground area for execution by an unmanned aerial vehicle during the mission; setting an altitude for the unmanned aerial vehicle along the flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the unmanned aerial vehicle; selecting a ground control point within the ground area; setting a geospatial accuracy requirement for the mission based on the selection for the land area and the ground control point; and assembling a set of images captured by the unmanned aerial vehicle during the mission into the geospatial map based on the geospatial accuracy requirement.

9. The method of claim 8, further comprising, through the user interface, receiving a selection for an accuracy of the geospatial map, wherein selecting the ground control point comprises selecting a number of ground control points within the ground area based on the selection for the accuracy of the geospatial map, and wherein assembling the set of images into the geospatial map based on the geospatial accuracy requirement comprises pairing a portion of an image in the set of images to the ground control point to geographically locate the geospatial map.

10. A method, comprising: within a user interface, receiving a selection for a mission type, receiving a selection for a set of interest points on a digital map of a physical area, receiving a selection for a resolution of a geospatial map; identifying a ground area corresponding to the set of interest points for imaging during a mission; generating a flight path over the ground area for execution by an unmanned aerial vehicle during the mission, the flight path defining a first landing path and a second landing path at a landing site proximal the ground area for execution by the unmanned aerial vehicle, the first landing path associated with a first wind condition and the second landing path associated with a second wind condition different from the first wind condition; setting an altitude for the unmanned aerial vehicle along the flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the unmanned aerial vehicle; setting a geospatial accuracy requirement for the mission based on the selection for the mission type; and assembling a set of images captured by the unmanned aerial vehicle during the mission into the geospatial map based on the geospatial accuracy requirement.

11. The method of claim 10, wherein defining the first landing path and the second landing path comprise selecting the landing site proximal the ground area from a previous geospatial map generated from images captured in a previous mission, identifying an obstruction proximal the landing site from the previous geospatial map, and defining the first landing path and the second landing path that avoid the obstruction.

12. A method, comprising: within a user interface, receiving a selection for a set of interest points on a digital map of a physical area and receiving a selection for a resolution of a geospatial map; identifying a ground area corresponding to the set of interest points for imaging during a mission; generating a first flight path over a first portion of the ground area for execution by a first unmanned aerial vehicle during the mission; generating a second flight path over a second portion of the ground area for execution by a second unmanned aerial vehicle during the mission; setting a first altitude for the first unmanned aerial vehicle along the first flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the first unmanned aerial vehicle; setting a second altitude for the second unmanned aerial vehicle along the second flight path based on the selection for the resolution of the geospatial map and an optical system arranged within the second unmanned aerial vehicle; and stitching a first set of images captured by the first unmanned aerial vehicle and a second set of images captured by the second unmanned aerial vehicle during the mission into the geospatial map.

13. The method of claim 12, wherein generating the first flight path and wherein generating the second flight path comprise selecting a common takeoff site and selecting a common landing site for the first unmanned aerial vehicle and the second unmanned aerial vehicle during the mission.

14. The method of claim 13, wherein selecting the common landing area comprises defining a first landing path and a second landing path at the landing site, the first landing path associated with a first wind condition and the second landing path associated with a second wind condition different from the first wind condition.

15. The method of claim 12, wherein generating the second flight path comprises generating the second flight path over the second portion of the ground area that intersects the first portion of the ground area across an overlap region, and wherein stitching the first set of images and the second set of images into the geospatial map comprises aligning a subset of images in the first set of images corresponding to the overlap region with a subset of images in the second set of images corresponding to the overlap region.

16. The method of claim 12, further comprising, within the user interface, receiving a selection for an accuracy of the geospatial map and setting an image capture rate for the first unmanned aerial vehicle over the ground area during the mission.

17. A method, comprising: at an unmanned aerial vehicle, capturing a first image in a series of images of a preset ground area during a mission; in response to capturing the first image, uploading a first image file of the first image to a computer network over a first wireless communication protocol; at the unmanned aerial vehicle, capturing a second image in the series of images during the mission; in response to capturing the second image, uploading a first image file of the second image to the computer network over the first wireless communication protocol; in response to completion of the mission, detecting access to a second wireless communication protocol; uploading a second image file of the first image and a second image file of the second image to the computer network over the second wireless communication protocol, the second image file of the first image of a greater resolution than the first image file of the first image, and the second image file of the second image of a greater resolution than the first image file of the second image; and stitching the second image file of the first image and the second image file of the second image into a geospatial map of the preset ground area.

18. The method of claim 17, further comprising enabling visual access to the first image file of the first image, substantially in real-time with capture of the first image, through a computing device in communication with the computer network.

19. The method of claim 17, wherein uploading the first image file of the first image comprises uploading a thumbnail version of the first image file to the computer network over a cellular communication protocol.

20. The method of claim 19, wherein uploading the second image file of the first image comprises uploading a raw full-size version of the first image to the computer network over a Wi-Fi communication protocol.

21. The method of claim 20, wherein uploading the second image file of the first image comprises uploading, to the computer network, the second image file of the first image with a time of capture of the first image and a three-dimensional location coordinate of the unmanned aerial vehicle at the time of capture of the first image, and wherein stitching the second image file of the first image and the second image file of the second image into the geospatial map comprises generating the geospatial map based on the three-dimensional location coordinate of the unmanned aerial vehicle at the time of capture of the first image, the geospatial map comprising a pixel tagged with the time of capture of the first image.
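Claims 17 through 21 describe a two-tier upload scheme: a low-resolution file (e.g. a thumbnail) is pushed over cellular as each image is captured, and the full-resolution originals are deferred until a second link (e.g. Wi-Fi) becomes available after the mission. A minimal sketch of that ordering logic, with hypothetical class and method names; the patent claims the behavior, not any particular implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Image:
    name: str

@dataclass
class UploadScheduler:
    """Logs which file version of each captured image goes over which
    link: thumbnails over cellular in-flight, full-resolution files
    over Wi-Fi once the mission is complete."""
    uploads: list = field(default_factory=list)   # (link, filename) log
    captured: list = field(default_factory=list)

    def on_capture(self, image):
        # In-flight: push a small derivative immediately over cellular,
        # enabling near-real-time viewing (claim 18).
        self.captured.append(image)
        self.uploads.append(("cellular", image.name + ".thumb.jpg"))

    def on_mission_complete_with_wifi(self):
        # Post-mission: send the raw full-size originals over Wi-Fi;
        # these higher-resolution files feed the map stitching step.
        for image in self.captured:
            self.uploads.append(("wifi", image.name + ".raw"))

sched = UploadScheduler()
sched.on_capture(Image("img_0001"))
sched.on_capture(Image("img_0002"))
sched.on_mission_complete_with_wifi()
```

The design choice the claims encode is bandwidth-aware: the constrained in-flight link carries only enough data for live monitoring, while the stitch-quality payload waits for the cheap, fast link.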
Patents cited by this patent (18)
Schultz, Richard R.; Martel, Florent; Lendway, Matthew; Berseth, Brian L., Adaptive surveillance and guidance system for vehicle collision avoidance and interception.
Bodin, William Kress; Redman, Jesse J. W.; Thorson, Derral C., Navigating a UAV under remote control and manual control with three dimensional flight depiction.
Rosenwald, Ross D.; Saunders, Jeffery B.; Bossert, David E.; De Sa, Erwin M., Automated sensor platform routing and tracking for observing a region of interest while avoiding obstacles.
Chambers, Andrew; Wyrobek, Keenan; Rinaudo, Keller; Oksenhorn, Ryan; Hetzler, William, System and method for human operator intervention in autonomous vehicle operations.
Loveland, Jim; Larson, Leif; Christiansen, Dan; Christiansen, Tad; Christiansen, Cam, Systems and methods for autonomous perpendicular imaging of test squares.
Larson, Leif; Loveland, Jim; Christiansen, Dan; Christiansen, Tad; Christiansen, Cam, Systems and methods for autonomous perpendicular imaging with a target field of view.
Loveland, Jim; Larson, Leif; Christiansen, Dan; Christiansen, Tad; Christiansen, Cam, Systems and methods for surface and subsurface damage assessments, patch scans, and visualization.
Bauer, Mark Patrick; Richman, Brian; Poole, Alan Jay; Michini, Bernard J.; Lovegren, Jonathan Anders; Bethke, Brett Michael; Li, Hui, Unmanned aerial vehicle rooftop inspection system.