Systems and methods for autonomous drone navigation
IPC Classification Information
Country / Type
United States(US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G08G-005/00
B64C-039/02
B64D-047/08
G01C-021/20
G05D-001/00
G05D-001/10
Application Number
US-0464659
(2017-03-21)
Registration Number
US-10134293
(2018-11-20)
Inventors / Address
Parker, Jon Leland
Johnson, Christopher M.
Williams, Earl Otto
Malik, Jyoti Prakash
Foucha, Nigel M.
Applicant / Address
Walmart Apollo, LLC
Agent / Address
McCarter & English, LLP
Citation Information
Times cited: 0
Cited patents: 2
Abstract
Described are systems, methods, and computer readable medium for a drone navigation system. Exemplary embodiments provide a drone with an imaging device and a computing device in communication with the drone. The computing device receives a selection of a CAD blueprint that includes measurements of an interior portion of a building, and receives a start point and an end point on the blueprint. The computing device analyzes the blueprint and generates a route from the start point to the end point, and determines a first set of instructions in terms of distance and degrees to navigate the generated route. The computing device processes the first set of instructions to generate a second set of instructions in terms of yaw, pitch and roll. The second set of instructions are exported to the drone to cause the drone to navigate the generated route in the building.
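The patent discloses no source code, so the following is only a minimal sketch of the conversion step the abstract describes: turning the first set of instructions (turn by some degrees, then fly some distance) into a second set expressed as yaw, pitch, and roll. All names (`LegInstruction`, `DroneCommand`, `convert`), the fixed cruise speed, and the choice to map distance to a hold duration are assumptions for illustration, not details from the patent.

```python
from dataclasses import dataclass

@dataclass
class LegInstruction:
    """First-set instruction: turn by `degrees`, then fly `distance_m` forward."""
    degrees: float
    distance_m: float

@dataclass
class DroneCommand:
    """Second-set instruction in yaw/pitch/roll terms (hypothetical schema)."""
    yaw_deg: float      # heading change to apply
    pitch_deg: float    # forward tilt that produces horizontal motion
    roll_deg: float     # lateral tilt (unused in this straight-line sketch)
    duration_s: float   # how long to hold the attitude

def convert(legs, speed_mps=1.0, cruise_pitch_deg=-5.0):
    """Convert distance/degrees legs into yaw/pitch/roll commands.

    Assumes a fixed cruise speed, so a leg's distance maps to a hold
    duration; turns are pure yaw in place, straight flight is pure pitch.
    """
    commands = []
    for leg in legs:
        if leg.degrees:  # rotate in place before flying the leg
            commands.append(DroneCommand(yaw_deg=leg.degrees, pitch_deg=0.0,
                                         roll_deg=0.0, duration_s=0.0))
        if leg.distance_m:  # fly the leg at cruise pitch for the needed time
            commands.append(DroneCommand(yaw_deg=0.0, pitch_deg=cruise_pitch_deg,
                                         roll_deg=0.0,
                                         duration_s=leg.distance_m / speed_mps))
    return commands
```

A real conversion module would account for acceleration, battery state, and the specific flight controller's command protocol; this sketch only shows the shape of the distance/degrees-to-attitude mapping.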
Representative Claims
1. A drone system comprising: a drone equipped with an imaging device; and a computing device including a communication interface and a processor implementing an extraction module and a conversion module, the communication interface enabling communication with the drone; wherein the extraction module when executed: receives a selection of a Computer Aided Design (CAD) blueprint, the CAD blueprint including measurements of an interior portion of a building, receives input indicating a start point and an end point on the CAD blueprint, analyzes the CAD blueprint to generate a route from the start point to the end point, and determines a first set of instructions in terms of distance and degrees to navigate the generated route; and wherein the conversion module includes an application program interface (API) and when executed: processes the first set of instructions using the API to generate a second set of instructions for drone operation, the second set of instructions representing the distance and degrees of the first set of instructions converted into drone navigation commands; and exports the second set of instructions as commands to the drone via the communication interface to cause the drone to navigate the route in the building.

2. The drone system of claim 1, further comprising: a server configured to perform image processing, and wherein the second set of instructions also includes instructions for the drone to stop periodically while navigating the route and actuate the imaging device to acquire imaging data, the imaging data transmitted to the server for image processing.

3. The drone system of claim 2, wherein the imaging data indicates information related to items placed on a plurality of shelves in the building.

4. The drone system of claim 3, wherein the position of the drone at a point in time is recorded with respect to the imaging data.

5. The drone system of claim 1, wherein the CAD blueprint indicates positions of fixtures and shelves.

6. The drone system of claim 1, wherein the CAD blueprint includes a plurality of rows or aisles of shelves, and the extraction module is configured to generate the route from the start point to the end point through each of the plurality of rows or aisles of shelves.

7. The system of claim 1, wherein the drone is an unmanned aerial vehicle or unmanned ground vehicle.

8. The system of claim 1, wherein the drone navigation commands in the second set of instructions include yaw, pitch and roll instructions.

9. A computer-implemented method for navigating a drone, the method comprising: receiving, at an extraction module implemented at a computing device, a selection of a Computer Aided Design (CAD) blueprint, the CAD blueprint including measurements of an inside of a building, wherein the extraction module further: receives input indicating a start point and an end point on the CAD blueprint, analyzes the CAD blueprint to generate a route from the start point to the end point, and determines a first set of instructions in terms of distance and degrees to navigate the generated route; processing, at a conversion module implemented at the computing device and using an application program interface (API), the first set of instructions to generate a second set of instructions for drone operation, the second set of instructions representing the distance and degrees of the first set of instructions converted into drone navigation commands; and exporting the second set of instructions as commands to the drone via a communication interface of the computing device to cause the drone to navigate the route in the building.

10. The method of claim 9, wherein the second set of instructions also includes instructions for the drone to stop periodically while navigating the route and actuate the imaging device to acquire imaging data, the imaging data transmitted to the server for image processing.

11. The method of claim 10, wherein the imaging data indicates information related to items placed on a plurality of shelves in the building.

12. The method of claim 11, wherein the position of the drone at a point in time is recorded with respect to the imaging data.

13. The method of claim 9, wherein the CAD blueprint indicates positions of fixtures and shelves.

14. The method of claim 9, wherein the CAD blueprint includes a plurality of rows or aisles of shelves, and the route is generated from the start point to the end point through each of the plurality of rows or aisles of shelves.

15. A non-transitory computer readable medium storing instructions that when executed by a processor cause the processor to implement a method for navigating a drone, the method comprising: receiving, at an extraction module implemented at the processor, a selection of a Computer Aided Design (CAD) blueprint, the CAD blueprint including measurements of an inside of a building, wherein the extraction module further: receives input indicating a start point and an end point on the CAD blueprint, analyzes the CAD blueprint to generate a route from the start point to the end point, and determines a first set of instructions in terms of distance and degrees to navigate the generated route; processing, at a conversion module implemented at the processor and using an application program interface (API), the first set of instructions to generate a second set of instructions for drone operation, the second set of instructions representing the distance and degrees of the first set of instructions converted into drone navigation commands; and exporting the second set of instructions as commands to the drone via a communication interface coupled to the processor to cause the drone to navigate the route in the building.

16. The non-transitory computer readable medium of claim 15, wherein the second set of instructions also includes instructions for the drone to stop periodically while navigating the route and actuate the imaging device to acquire imaging data, the imaging data transmitted to the server for image processing.

17. The non-transitory computer readable medium of claim 16, wherein the imaging data indicates information related to items placed on a plurality of shelves in the building.

18. The non-transitory computer readable medium of claim 17, wherein the position of the drone at a point in time is recorded with respect to the imaging data.

19. The non-transitory computer readable medium of claim 15, wherein the CAD blueprint indicates positions of fixtures and shelves.

20. The non-transitory computer readable medium of claim 15, wherein the CAD blueprint includes a plurality of rows or aisles of shelves, and the route is generated from the start point to the end point through each of the plurality of rows or aisles of shelves.
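The extraction module's output, "a first set of instructions in terms of distance and degrees", can be read as a sequence of (turn, distance) legs derived from the route's waypoints. The sketch below is one hypothetical reading of that step; the waypoint representation, the starting heading, and the function name `route_to_legs` are assumptions, not disclosed details.

```python
import math

def route_to_legs(waypoints):
    """Turn a polyline route (list of (x, y) points, in meters) into
    (turn_degrees, distance_m) legs — a hypothetical reading of the
    claimed "first set of instructions in terms of distance and degrees".

    Assumes the drone starts at the first waypoint facing the +x axis.
    """
    heading = 0.0  # current heading in degrees; 0 = +x axis
    legs = []
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        target = math.degrees(math.atan2(y1 - y0, x1 - x0))
        turn = (target - heading + 180) % 360 - 180  # shortest signed turn
        dist = math.hypot(x1 - x0, y1 - y0)
        legs.append((turn, dist))
        heading = target
    return legs
```

For a route that goes 3 m east then 4 m north, this yields a 0° turn with a 3 m leg followed by a 90° left turn with a 4 m leg, which the conversion module would then translate into drone navigation commands.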
Patents cited by this patent (2)
Dennison, John C.; Campion, David C., Optical-flow techniques for improved terminal homing and control.