Flight device, flight control system and method
IPC Classification Information
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G05D-001/00
G05D-001/10
B64D-047/08
G06T-007/00
B64C-039/02
Application Number
US-0625225
(2017-06-16)
Grant Number
US-10234873
(2019-03-19)
Priority Information
CN-2015 1 0778779 (2015-11-13)
Inventor / Address
Li, Zuoguang
Applicant / Address
AUTEL ROBOTICS CO., LTD.
Agent / Address
Ladas & Parry LLP
Citation Information
Cited by: 0
Cited patents: 3
Abstract
A flight device includes a processor and a memory storing instructions which, when executed by the processor, cause the processor to: acquire an image; determine a scene; determine a height of the flight device; calculate an image first direction offset and an image second direction offset of a second image frame relative to a first image frame of two adjacent image frames; acquire an acceleration and an angular velocity of the flight device in three dimensions; compensate for the image first and second direction offsets, according to the acceleration and the angular velocity, to obtain image correction offsets; calculate a first direction offset and a second direction offset in world coordinates corresponding to the image correction offsets; and derive a velocity of the flight device according to a time interval between time points at which the two adjacent image frames are captured and according to the first direction offset and the second direction offset.
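The final steps of the abstract's pipeline reduce to the pinhole-camera relation stated in the claims (x1/X1 = f/H). The following is a minimal, illustrative Python sketch of that image-to-world mapping and the velocity derivation, assuming the image offsets have already been compensated with the inertial data; the function names and units here are assumptions for illustration, not the patent's implementation.

```python
def world_offsets(x1, y1, focal_length, height):
    """Map corrected image-plane offsets (x1, y1) to world-coordinate
    offsets (X1, Y1) via the claimed relation x1/X1 = f/H, i.e.
    X1 = x1 * H / f. x1, y1 and focal_length must share one length unit."""
    scale = height / focal_length
    return x1 * scale, y1 * scale


def velocity(x1, y1, focal_length, height, dt):
    """Velocity components over the inter-frame interval dt (seconds),
    per the claims: world offset divided by the capture time interval."""
    X1, Y1 = world_offsets(x1, y1, focal_length, height)
    return X1 / dt, Y1 / dt
```

For example, with a 4 mm lens (0.004 m) at a height of 2 m, a corrected image offset of 0.1 mm (0.0001 m) between frames captured 1/30 s apart corresponds to a world offset of 0.05 m and a speed of 1.5 m/s along that axis.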
Representative Claims
1. A flight device, comprising: a processor; and a memory communicably connected with the processor, the memory storing instructions, wherein execution of the instructions by the processor causes the processor to:

acquire an image captured by a binocular camera module of the flight device;

determine a scene in which the flight device is currently located;

determine a height of the flight device according to depth-of-field information of the image captured by the binocular camera module;

calculate an image first direction offset and an image second direction offset of a second image frame of two adjacent image frames relative to a first image frame of the two adjacent image frames, according to the two adjacent image frames captured by the binocular camera module and the scene in which the flight device is currently located;

acquire an acceleration and an angular velocity of the flight device in three dimensions that are detected by an acceleration sensor of the flight device;

compensate for the image first direction offset and the image second direction offset, according to the acceleration and the angular velocity of the flight device, so as to obtain image correction offsets comprising a corrected image first direction offset and a corrected image second direction offset;

calculate a first direction offset and a second direction offset in world coordinates corresponding to the image correction offsets, according to a lens focal length of the binocular camera module, the height of the flight device and the image correction offsets; and

derive a velocity of the flight device according to a time interval between time points at which the two adjacent image frames are captured and according to the first direction offset and the second direction offset in world coordinates.

2.
The flight device according to claim 1, wherein the determining a height of the flight device according to depth-of-field information of the image captured by the binocular camera module comprises: calibrating two cameras of the binocular camera module to acquire internal parameters of the camera module comprising focal length, image center and distortion coefficients, and external parameters of the camera module comprising a rotation matrix and a translation matrix of the camera module; correcting the two cameras; performing stereo matching to obtain a parallax map; and performing three-dimensional reconstruction to obtain depth information, and normalizing the depth information to a range of [0, 255] to obtain a depth map.

3. The flight device according to claim 1, wherein the flight device further comprises a distance sensor for detecting a distance from the flight device to the ground, and execution of the instructions by the processor further causes the processor to: determine, according to the scene in which the flight device is currently located, whether to derive the height of the flight device according to the depth-of-field information of the image captured by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device.

4. The flight device according to claim 3, wherein execution of the instructions by the processor further causes the processor to: determine whether the scene in which the flight device is located is a richly-textured scene or a poorly-textured scene according to at least one parameter in the image captured by the camera module, and the at least one parameter comprises texture.

5.
The flight device according to claim 4, wherein execution of the instructions by the processor further causes the processor to: choose whether to determine the height of the flight device according to the depth-of-field information of the image acquired by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device, depending on whether the scene is the richly-textured scene and on a maximum region area (AM) of the same textures.

6. The flight device according to claim 5, wherein in the richly-textured scene, if it is determined that the maximum region area (AM) having a similar depth value in a depth image is larger than a minimum value (SMIN) and smaller than a maximum value (SMAX), execution of the instructions by the processor further causes the processor to: choose to determine the height of the flight device according to the depth-of-field information of the image captured by the cameras of the camera module, wherein the minimum value (SMIN) is a quarter of the maximum area of the image and the maximum value (SMAX) is three quarters of the maximum area of the image.

7. The flight device according to claim 5, wherein in the richly-textured scene, if it is determined that the maximum region area (AM) having a similar depth value in a depth image is larger than a maximum value (SMAX) and a difference between the height corresponding to the region that is derived from the depth-of-field information of the image captured by the camera module and the height measured by the distance sensor exceeds a threshold, execution of the instructions by the processor further causes the processor to: choose to determine the height of the flight device according to the depth-of-field information calculated from the image captured by the camera module, wherein the maximum value (SMAX) is three quarters of the maximum area of the image.

8.
The flight device according to claim 5, wherein in the richly-textured scene, if it is determined that the maximum region area (AM) having a similar depth value in a depth image is smaller than a minimum value (SMIN) or larger than a maximum value (SMAX), execution of the instructions by the processor further causes the processor to: choose to adopt the distance measured by the distance sensor as the height of the flight device, wherein the minimum value (SMIN) is a quarter of the maximum area of the image and the maximum value (SMAX) is three quarters of the maximum area of the image.

9. The flight device according to claim 5, wherein execution of the instructions by the processor further causes the processor to: in a poorly-textured scene, adopt the distance measured by the distance sensor as the height of the flight device.

10. The flight device according to claim 1, wherein execution of the instructions by the processor further causes the processor to: calculate the first direction offset in world coordinates according to Formula 1: x1/X1 = f/H, and calculate the second direction offset in world coordinates according to Formula 2: y1/Y1 = f/H, wherein x1 is the corrected image first direction offset, y1 is the corrected image second direction offset, f is the lens focal length, H is the height of the flight device, X1 is the first direction offset in world coordinates, and Y1 is the second direction offset in world coordinates; and calculate a rate of the flight device in the first direction and a rate of the flight device in the second direction, according to the time interval between time points at which the two adjacent image frames are captured by the camera module, and according to the first direction offset and the second direction offset in world coordinates.

11.
A flight control method for controlling a flight device, comprising:

acquiring an image captured by a binocular camera module of the flight device;

determining a scene in which the flight device is currently located;

determining a height of the flight device according to depth-of-field information of the image captured by the binocular camera module;

calculating, according to two adjacent image frames captured by the binocular camera module and the scene in which the flight device is currently located, an image first direction offset and an image second direction offset of a second image frame of the two adjacent image frames relative to a first image frame of the two adjacent image frames;

acquiring an acceleration and an angular velocity of the flight device in three dimensions that are detected by an acceleration sensor of the flight device;

compensating for the image first direction offset and the image second direction offset, according to the acceleration and the angular velocity of the flight device, to obtain image correction offsets comprising a corrected image first direction offset and a corrected image second direction offset;

calculating a first direction offset and a second direction offset in world coordinates corresponding to the image correction offsets according to a lens focal length of the camera module, the height of the flight device and the image correction offsets; and

deriving a velocity of the flight device according to a time interval between time points at which the two adjacent image frames are captured and according to the first direction offset and the second direction offset in world coordinates.

12.
The flight control method according to claim 11, wherein the determining a height of the flight device according to depth-of-field information of the image captured by the binocular camera module comprises: calibrating two cameras of the binocular camera module to acquire internal parameters comprising focal length, image center and distortion coefficients, and external parameters comprising a rotation matrix and a translation matrix of the camera module; correcting the two cameras; performing stereo matching to obtain a parallax map; and performing three-dimensional reconstruction to obtain depth information, and normalizing the depth information to a range of [0, 255] to obtain a depth map.

13. The flight control method according to claim 11, wherein the method further comprises: determining, according to the scene in which the flight device is currently located, whether to derive the height of the flight device according to the depth-of-field information of the image captured by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device.

14. The flight control method according to claim 13, wherein the determining a scene in which the flight device is currently located comprises: determining whether the scene in which the flight device is located is a richly-textured scene or a poorly-textured scene according to at least one parameter in the image captured by the camera module, wherein the at least one parameter comprises texture.

15.
The flight control method according to claim 14, wherein the determining, according to the scene in which the flight device is currently located, whether to derive the height of the flight device according to the depth-of-field information of the image captured by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device comprises: in a richly-textured scene, if a maximum region area (AM) having a similar depth value in the depth image is larger than a minimum value (SMIN) and smaller than a maximum value (SMAX), choosing to determine the height of the flight device according to the depth-of-field information of the image captured by the cameras of the camera module, wherein the minimum value (SMIN) is a quarter of the maximum area of the image and the maximum value (SMAX) is three quarters of the maximum area of the image.

16. The flight control method according to claim 14, wherein the determining, according to the scene in which the flight device is currently located, whether to derive the height of the flight device according to the depth-of-field information of the image captured by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device comprises: in a richly-textured scene, if a maximum region area (AM) having a similar depth value in the depth image is larger than a maximum value (SMAX), and a difference between the height corresponding to the region that is derived from the depth-of-field information of the image captured by the camera module and the height measured by the distance sensor exceeds a threshold, choosing to determine the height of the flight device according to the depth-of-field information calculated from the image captured by the camera module, wherein the maximum value (SMAX) is three quarters of the maximum area of the image.

17.
The flight control method according to claim 14, wherein the determining, according to the scene in which the flight device is currently located, whether to derive the height of the flight device according to the depth-of-field information of the image captured by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device comprises: in a richly-textured scene, if a maximum region area (AM) having a similar depth value in the depth image is smaller than a minimum value (SMIN) or larger than a maximum value (SMAX), choosing to adopt the distance measured by the distance sensor as the height of the flight device, wherein the minimum value (SMIN) is a quarter of the maximum area of the image and the maximum value (SMAX) is three quarters of the maximum area of the image.

18. The flight control method according to claim 14, wherein the determining, according to the scene in which the flight device is currently located, whether to derive the height of the flight device according to the depth-of-field information of the image captured by the camera module or to acquire the distance detected by the distance sensor as the height of the flight device comprises: in a poorly-textured scene, adopting the distance measured by the distance sensor as the height of the flight device.

19.
The flight control method according to claim 11, wherein the calculating a first direction offset and a second direction offset in world coordinates corresponding to the image correction offsets according to a lens focal length of the binocular camera module, a height of the flight device and the image correction offsets comprises: calculating the first direction offset in world coordinates according to the formula x1/X1 = f/H; and calculating the second direction offset in world coordinates according to the formula y1/Y1 = f/H, wherein x1 is the corrected image first direction offset, y1 is the corrected image second direction offset, f is the lens focal length, H is the height of the flight device, X1 is the first direction offset in world coordinates, and Y1 is the second direction offset in world coordinates.

20. A non-volatile computer storage medium storing computer executable instructions, wherein the computer executable instructions, when executed by a processor, cause the processor to perform the flight control method of claim 11.
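The height-source selection running through claims 5 to 9 (device) and 15 to 18 (method), together with the depth normalization of claims 2 and 12, can be sketched as follows. This is an illustrative reading, not the patent's implementation: the segmentation step that yields the maximum region area is not specified in the claims, and the difference threshold of claims 7 and 16 is left open, so `max_region_area` and `diff_threshold` are assumed inputs here.

```python
def normalize_depth(depth):
    """Normalize a flat list of raw depth values to integers in [0, 255],
    per claims 2 and 12."""
    lo, hi = min(depth), max(depth)
    if hi == lo:
        return [0] * len(depth)
    return [round((d - lo) / (hi - lo) * 255) for d in depth]


def choose_height(richly_textured, max_region_area, image_area,
                  depth_height, sensor_height, diff_threshold=0.5):
    """Pick the height source per claims 5-9 / 15-18.

    SMIN and SMAX are fixed by the claims at 1/4 and 3/4 of the maximum
    image area; diff_threshold is an assumed value (unspecified in the
    claims)."""
    if not richly_textured:
        return sensor_height                      # claims 9 / 18
    s_min, s_max = image_area / 4, 3 * image_area / 4
    if s_min < max_region_area < s_max:
        return depth_height                       # claims 6 / 15
    if (max_region_area > s_max
            and abs(depth_height - sensor_height) > diff_threshold):
        return depth_height                       # claims 7 / 16
    return sensor_height                          # claims 8 / 17
```

Note the asymmetry the claims describe: a mid-sized uniform-depth region trusts the stereo depth directly, an oversized region trusts it only when it disagrees strongly with the distance sensor, and an undersized region or a poorly-textured scene falls back to the distance sensor.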
Patents cited by this patent (3)
Wood, Derek, Collision avoidance system and method utilizing variable surveillance envelope.
Boyd, Scott Patrick; Cui, Chengwu; Graber, Sarah; O'Brien, Barry James; Watson, Joshua John; Wilcox, Scott Michael, Obstacle awareness based guidance to clear landing space.