IPC Classification Information
Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application Number | US-0350693 (2009-01-08)
Registration Number | US-8392036 (2013-03-05)
Inventors / Address |
- Jacobsen, Stephen C.
- Olivier, Marc X.
Applicant / Address |
Citation Information | Cited by: 19 / Cites: 214
Abstract
A remote operator console provides point and go navigation of a robotic vehicle. The remote operator console provides a display for visual representation of the environment in which the robotic vehicle is operating based on sensor information received from the robotic vehicle. An operator may designate a target point on the display. The robotic vehicle is automatically navigated toward a location in the environment corresponding to the designated target point.
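As an illustration (not part of the patent record), the point-and-go workflow described in the abstract can be sketched in a few lines: the operator designates a target point on the console display, the point is mapped to a location in the environment, and the vehicle is automatically advanced toward it. All function names, the simple top-down display-to-world mapping, and the parameter values below are assumptions for illustration only.

```python
import math

def display_to_world(px, py, origin, meters_per_pixel):
    """Map a display pixel to a world coordinate (assumes a top-down map view)."""
    ox, oy = origin
    return (ox + px * meters_per_pixel, oy + py * meters_per_pixel)

def advance_toward(pose, target, step=0.5, tolerance=0.25):
    """Advance the vehicle one step along a straight-line route to the target."""
    x, y = pose
    tx, ty = target
    dx, dy = tx - x, ty - y
    dist = math.hypot(dx, dy)
    if dist <= tolerance:
        return (tx, ty), True          # target location reached
    scale = min(step, dist) / dist
    return (x + dx * scale, y + dy * scale), False

# Operator designates a target point at pixel (120, 80) on the display.
target = display_to_world(120, 80, origin=(0.0, 0.0), meters_per_pixel=0.05)
pose, done = (0.0, 0.0), False
while not done:
    pose, done = advance_toward(pose, target)
print(pose)  # vehicle stops at the designated target location
```

In the patent, route determination is automatic and may be more elaborate than the straight-line advance shown here; the sketch only conveys the designate-then-navigate division of labor between console and vehicle.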
Representative Claims
1. A method of providing point and go navigation for remote control of an unmanned robotic vehicle, comprising: obtaining sensor information describing an environment in which the unmanned robotic vehicle is operating; communicating the sensor information to a remotely located operator console; displaying a visual representation of the environment on the operator console based on the sensor information; designating a target point within the visual representation based on operator input; designating a transition point within the visual representation based on operator input; advancing the unmanned robotic vehicle toward a target location in the environment based on an automatic navigation response to the designated target point, which includes automatically determining a planned route from a current unmanned robotic vehicle position to the target location; and advancing the unmanned robotic vehicle to the transition point, wherein the robotic vehicle is caused to reconfigure from a first pose corresponding to a first mode of operation to a second pose corresponding to a second mode of operation.
2. The method of claim 1, wherein the target location is defined as an operator-defined displacement from a point in the environment corresponding to the target point.
3. The method of claim 1, further comprising: updating the visual representation of the environment on the operator console as the unmanned robotic vehicle moves through the environment; and re-designating the target point within the updated visual representation based on additional operator input.
4. The method of claim 1, further comprising displaying a visual representation of the planned route on the operator console.
5. The method of claim 1, further comprising modifying the planned route based on operator input.
6. The method of claim 1, further comprising: identifying a target path within the visual representation based on the operator input; and automatically navigating the unmanned robotic vehicle to follow a travel path in the environment correlated with the target path.
7. The method of claim 1, wherein designating the target point comprises touching the operator console at the target point on the visual representation of the environment.
8. The method of claim 1, wherein designating the target point comprises placing an electronic cursor at the target point on the visual representation of the environment.
9. The method of claim 1, wherein the target point is defined by entering coordinates into the operator console.
10. The method of claim 1, further comprising augmenting the visual representation of the environment using additional environmental information obtained independently of the unmanned robotic vehicle.
11. The method of claim 10, wherein the additional environmental information is obtained from an electronic map.
12. The method of claim 10, wherein the additional environmental information is obtained from a second environmental sensor.
13. The method of claim 12, wherein the second environmental sensor is mounted on a second unmanned robotic vehicle.
14. The method of claim 10, wherein the additional environmental information is obtained from a GPS receiver.
15. The method of claim 1, further comprising: identifying a characteristic of the environment from the sensor information; and modifying unmanned robotic vehicle behavior based on the characteristic.
16. A system for providing point and go navigation of an unmanned robotic vehicle, the system comprising: a) an unmanned robotic vehicle having an environmental sensor and a transmitter and receiver unit, and being capable of controlled movement within an environment; and b) a remote operator console in bi-directional communication with the unmanned robotic vehicle, the remote operator console comprising: i) a receiver for receiving sensor information from the unmanned robotic vehicle; ii) a display for displaying a visual representation of the environment based on the sensor information received from the environmental sensor; iii) an operator input function for defining at least one of a target point and a transition point on the visual representation; and iv) a transmitter for transmitting navigational commands to the unmanned robotic vehicle based on the target point, wherein the vehicle automatically determines a planned route from a current unmanned robotic vehicle position to a target location in the environment based at least in part on the target point, and wherein the vehicle, upon reaching the transition point, reconfigures from a first pose corresponding to a first mode of operation to a second pose corresponding to a second mode of operation.
17. The system of claim 16, wherein the navigational commands comprise target point coordinates.
18. The system of claim 17, wherein the target point coordinates are defined relative to the visual representation.
19. The system of claim 16, wherein the navigational commands comprise a series of movement commands.
20. The system of claim 16, wherein the environmental sensor is chosen from the group consisting of a camera, a stereo camera, a sound sensor, an electromagnetic sensor, a chemical sensor, a radar, a lidar, a range finder, a scanning range finder, a sonar, a contact sensor, a sniff sensor, a GPS receiver, an inertial measurement unit, an orientation sensor, and combinations thereof.
21. The system of claim 16, wherein the display and operator input are provided by a touch screen.
22. The system of claim 16, further comprising a communication link between the unmanned robotic vehicle and the remote operator console.
23. The system of claim 22, wherein the communication link is selected from the group consisting of a wireless radio frequency link, a free space optical link, a free space ultrasonic link, a wired link, a fiber optic link, and combinations thereof.
24. The system of claim 23, wherein the communication link further comprises at least one relay node.
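Claims 1 and 16 both recite a transition-point behavior: upon reaching an operator-designated transition point, the vehicle reconfigures from a first pose/mode to a second. The sketch below (again, an illustration only; the class layout and mode names are hypothetical, not from the patent) shows that behavior as a minimal state change during route traversal.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    mode: str = "driving"  # first mode of operation (hypothetical name)

    def reconfigure(self, new_mode):
        # e.g. a reconfigurable robot changing pose to climb an obstacle
        self.mode = new_mode

def run_route(vehicle, route, transition_point, second_mode="climbing"):
    """Visit each waypoint; switch modes upon reaching the transition point."""
    visited = []
    for waypoint in route:
        visited.append(waypoint)
        if waypoint == transition_point:
            vehicle.reconfigure(second_mode)
    return visited

v = Vehicle()
path = run_route(v, route=[(0, 0), (2, 0), (2, 3)], transition_point=(2, 0))
print(v.mode)  # mode after passing the transition point
```

The claims leave the specific poses and modes open; the point of the sketch is only the ordering: advance to the transition point, reconfigure, then continue under the second mode.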