Human interaction with unmanned aerial vehicles
IPC Classification Information
Country / Type
United States(US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G01C-023/00
G05D-001/00
B64D-031/04
B64D-031/06
B64C-039/02
B64D-001/10
B64D-001/12
G06Q-010/08
Application Number
US-0500645
(2014-09-29)
Registration Number
US-9459620
(2016-10-04)
Inventor / Address
Schaffalitzky, Frederik
Applicant / Address
Amazon Technologies, Inc.
Agent / Address
Kilpatrick Townsend & Stockton LLP
Citation Information
Cited-by count: 10
Cited patents: 11
Abstract
In some examples, an unmanned aerial vehicle is provided. The unmanned aerial vehicle may include a propulsion device, a sensor device, and a management system. In some examples, the management system may be configured to receive human gestures via the sensor device and, in response, instruct the propulsion device to affect an adjustment to the behavior of the unmanned aerial vehicle. Human gestures may include visible gestures, audible gestures, and other gestures capable of recognition by the unmanned vehicle.
Representative Claims
1. An unmanned aerial vehicle, comprising: a frame; a propulsion system connected to the frame; a retaining system connected to the frame and configured to retain a package; a communication system connected to the frame and configured to receive gesture input; and a management module associated with at least the communication system and comprising: memory that stores computer-executable instructions; and at least one processor configured to access the memory and execute the computer-executable instructions to at least: access a delivery plan that identifies: the package; and a flight plan that includes a geographic location for delivery of the package; receive, via the communication system, a first gesture input comprising a first gesture by a first human user when the unmanned aerial vehicle is executing the flight plan; in response to receiving the first gesture input, access a portion of gesture information from a gesture database, gesture information from the gesture database describing a plurality of gestures capable of being recognized by the communication system; determine, based on the first gesture and the portion of gesture information, a trajectory adjustment for the unmanned aerial vehicle; instruct the propulsion system to implement the trajectory adjustment; receive a second gesture input comprising a second gesture by a second human user when the unmanned aerial vehicle is executing the flight plan; and instruct the retaining system to deliver the package at the physical location based in part on the second gesture input.

2. The unmanned aerial vehicle of claim 1, wherein at least one of the first gesture by the first human user and the second gesture by the second human user comprises an audible gesture or a visual gesture.

3. The unmanned aerial vehicle of claim 1, wherein the communication system comprises a light sensor including at least one of a visible light camera, an infrared camera, an RGB camera, a depth sensor, or an ultraviolet sensitive camera, and at least one of the first gesture input and the second gesture input is received via the light sensor.

4. The unmanned aerial vehicle of claim 1, wherein the gesture database is local to the management module, and accessing the portion of the gesture information from the gesture database includes accessing locally the portion of the gesture information from the gesture database.

5. An apparatus, comprising: a frame; a propulsion device connected to the frame; a retaining system connected to the frame; a sensor device connected to the frame and configured to receive human gestures; and a management system in communication at least with the sensor device, the management system comprising: memory that stores computer-executable instructions; and at least one processor configured to access the memory and execute the computer-executable instructions to at least: receive, from a management module associated with an electronic marketplace, a delivery plan, the delivery plan identifying: a package to be delivered by the apparatus to a human user at a distinct location; and a flight plan that includes the distinct location for delivery of the package, instruct the propulsion device to move the apparatus to the distinct location according to the flight plan; receive, via the sensor device, a first human gesture when the apparatus, executing the flight plan, is traveling to the distinct location; determine, based on the first human gesture, a behavior adjustment that causes the apparatus to deviate from the flight plan; instruct the apparatus to implement the behavior adjustment; and instruct, at least in response to receiving a second human gesture, the retaining system to release the package at the distinct location.

6. The apparatus of claim 5, wherein the at least one processor is further configured to access the memory and execute the computer-executable instructions to at least access gesture information from a gesture repository prior to determining the behavior adjustment, the gesture repository storing a plurality of gestures capable of being recognized by the management system.

7. The apparatus of claim 6, wherein determining the behavior adjustment includes: determining a meaning of the human gesture with respect to a comparison of the first human gesture with at least one of a plurality of gestures identified from the gesture information; and determining the behavior adjustment based in part on the meaning of the human gesture.

8. The apparatus of claim 7, wherein the behavior adjustment comprises an adjustment resulting in the apparatus moving closer to a first human user or an adjustment resulting in the apparatus moving further from the first human user.

9. The apparatus of claim 5, wherein the at least one processor is further configured to access the memory and execute the computer-executable instructions to at least verify an identity of the human user prior to instructing the retaining system to release the package at the distinct location.

10. The apparatus of claim 9, wherein verifying the identity of the human user comprises at least one of: verification by comparison of customer information identifying the human user with information provided by the human user during delivery, verification by connection of the apparatus with a user device of the human user, verification by connection of the apparatus with a specialized device of the human user, verification by the apparatus scanning a code provided by the human user, verification by a remote user receiving verification information from the human user via the apparatus, verification by the human gesture, or verification by a second human gesture distinct from or in combination with the human gesture.

11. The apparatus of claim 5, wherein the propulsion device comprises a plurality of propulsion devices, individual propulsion devices configured to contribute to at least one of vertical movement, lateral movement, or hovering of the apparatus.

12. The apparatus of claim 5, wherein the sensor device comprises at least one of a visual sensing device configured to at least receive visible human gestures or an audible sensing device configured to at least receive audible human gestures.

13. The apparatus of claim 12, wherein the first human gesture and the second human gestures comprise at least one of: a voice gesture, an arm gesture, a multiple-arm gesture, a hand gesture, a multiple-hand gesture, a body gesture, a natural reaction gesture, a flag gesture, a sign gesture, or a static gesture.

14. The apparatus of claim 5, wherein the first human gesture comprises a plurality of human gestures, and receiving the first human gesture comprises receiving the plurality of human gestures from a plurality of human users, and wherein the at least one processor is further configured to access the memory and execute the computer-executable instructions to at least determine, from the plurality of human gestures, a single human gesture, the behavior adjustment determined based on the single human gesture.

15. A computer-implemented method comprising: accessing a delivery plan identifying instructions for delivery of an item by an unmanned device, the delivery plan comprising a coarse positioning portion, a fine positioning portion, and a delivery decision portion; instructing the unmanned device to execute the delivery plan; receiving a first human gesture while the unmanned device is executing any one of the coarse positioning portion of the delivery plan, the fine positioning portion of the delivery plan, or the delivery decision portion of the delivery plan; accessing a human gesture repository, the human gesture repository storing a plurality of human gestures capable of being recognized by the unmanned device; and determining, based at least in part on a comparison of the first human gesture with one or more human gestures of the plurality of human gestures in the repository, whether to deliver the item.

16. The computer-implemented method of claim 15, wherein the delivery plan is adjusted when the comparison of the first human gesture with the one or more human gestures of the plurality of human gestures results in a match.

17. The computer-implemented method of claim 15, further comprising, after determining whether to adjust the delivery plan, adjusting the delivery plan by: instructing the unmanned device to navigate vertically away from a location where the first human gesture was received; and instructing the unmanned device to wait for a period of time before resuming the delivery plan.

18. The computer-implemented method of claim 15, further comprising, after determining whether to deliver the item, instructing, based at least in part on the comparison of the first human gesture with the one or more human gestures, the unmanned device to change from the coarse positioning portion of the delivery plan to one of the fine positioning portion of the delivery plan or the delivery decision portion of the delivery plan.
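Claims 5-7 and 15 describe a recurring pattern: a received gesture is compared against a stored gesture repository, the match yields a meaning, and the meaning maps to a behavior adjustment or a delivery decision. The patent does not disclose an implementation; the following is a minimal illustrative sketch of that comparison flow, with all names (GestureEntry, GESTURE_REPOSITORY, and the sample gestures) invented for illustration.

```python
# Illustrative sketch of the gesture-repository comparison described in
# claims 6-7 and 15. The patent specifies no data model or matching logic;
# everything below is a hypothetical rendering of the claimed flow.
from dataclasses import dataclass
from typing import List, Optional


@dataclass(frozen=True)
class GestureEntry:
    name: str        # recognized gesture, e.g. "wave_off"
    meaning: str     # interpreted intent of the gesture (claim 7)
    adjustment: str  # behavior adjustment the meaning maps to


# Gesture repository: the gestures the vehicle can recognize (claim 6).
GESTURE_REPOSITORY: List[GestureEntry] = [
    GestureEntry("wave_off", "do_not_deliver", "abort_delivery"),
    GestureEntry("beckon", "approach", "move_closer"),
    GestureEntry("palm_out", "stop", "hover_and_wait"),
]


def determine_behavior_adjustment(observed: str) -> Optional[str]:
    """Compare the observed gesture against repository entries (claim 7)
    and return the resulting behavior adjustment, or None if unrecognized."""
    for entry in GESTURE_REPOSITORY:
        if entry.name == observed:
            return entry.adjustment
    return None


def should_deliver(observed: str) -> bool:
    """Claim 15: decide whether to deliver the item based on the comparison.
    An unrecognized gesture leaves the delivery plan unchanged here."""
    return determine_behavior_adjustment(observed) != "abort_delivery"


print(determine_behavior_adjustment("beckon"))  # move_closer
print(should_deliver("wave_off"))               # False
```

In a real system the string equality check would be replaced by a recognition model scoring visual or audible input against the repository, but the claim structure — repository lookup, meaning, then adjustment — is the same.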
Padan, Nir, System and method for enhancing the payload capacity, carriage efficiency, and adaptive flexibility of external stores mounted on an aerial vehicle.
Delazari Binotto, Alecio Pedro; Guimaraes, Rodrigo Laiola; Caetano dos Santos, Davi Francisco, System, method and computer program product for controlling a mission-oriented robot based on a user's emotional state.