A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
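The abstract describes a tag identification system that finds tags within a predetermined range of the robot's current position, each tag carrying coordinates and tag information. A minimal sketch of that range query is shown below; the names (`Tag`, `tags_in_range`) and the dictionary layout of the tag information are illustrative assumptions, not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Tag:
    x: float     # tag coordinates on the plan view map (illustrative units)
    y: float
    info: dict   # tag information, e.g. annotation or robot action modifier

def tags_in_range(tags, position, max_range):
    """Return the tags whose coordinates lie within max_range of position."""
    px, py = position
    return [t for t in tags if math.hypot(t.x - px, t.y - py) <= max_range]

tags = [Tag(1.0, 1.0, {"annotation": "slow zone"}),
        Tag(9.0, 9.0, {"annotation": "docking station"})]
near = tags_in_range(tags, (0.0, 0.0), 2.0)
# only the tag at (1.0, 1.0) lies within range 2.0 of the origin
```

The control system could then inspect `info` for a telepresence robot action modifier and execute the corresponding action.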
Representative Claims
1. A telepresence robot system local terminal comprising: an electronic display; a processor in communication with the electronic display; a memory in communication with the processor, the memory comprising instructions executable by the processor configured to cause the processor to: retrieve at least a portion of a plan view map representative of a facility; receive a video feed from an imaging system of a remote telepresence robot at a first perspective; receive positioning data associated with a current position of the remote telepresence robot relative to the plan view map; display the video feed from the imaging system of the remote telepresence robot and the plan view map; and transmit a command to the remote telepresence robot specifying a movement for the remote telepresence robot; and a user input device in communication with the processor, the user input device configured to enable a user to select a movement for the remote telepresence robot via any of at least three options on a single user interface, the at least three options comprising: selecting a destination of the remote telepresence robot with respect to the video feed; selecting a destination of the remote telepresence robot with respect to the plan view map; and selecting a destination of the remote telepresence robot by incrementally advancing the remote telepresence robot in a direction relative to the current position of the remote telepresence robot.

2. The telepresence robot system local terminal of claim 1, wherein the selection of the movement comprises selecting an alternative perspective of the video feed by selecting a point within the video feed.

3. The telepresence robot system local terminal of claim 1, wherein the selection of the movement comprises selecting an alternative perspective of the video feed by selecting a point on the plan view map.

4. The telepresence robot system local terminal of claim 1, wherein the selection of the movement comprises selecting an alternative perspective of the video feed by incrementally panning or tilting the imaging system while the remote telepresence robot remains stationary.

5. The telepresence robot system local terminal of claim 1, wherein the selection of the movement may relate to rotating one of a lower portion of the remote telepresence robot and an upper portion of the remote telepresence robot.

6. The telepresence robot system local terminal of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to: receive the selection of the destination of the remote telepresence robot from the user input device; determine a sequence of coordinates relative to the plan view map to create a navigation path between the current position of the remote telepresence robot and the selected destination of the remote telepresence robot; and transmit a command to the remote telepresence robot comprising the sequence of coordinates forming the navigation path.

7. The telepresence robot system local terminal of claim 6, wherein the instructions executable by the processor are further configured to cause the processor to display the sequence of coordinates forming the navigation path overlaid on the plan view map.

8. The telepresence robot system local terminal of claim 6, wherein the instructions executable by the processor are further configured to cause the processor to: determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot; apply the distortion to the sequence of coordinates forming the navigation path to determine corresponding video coordinates and perspective data describing a location and perspective of the sequence of coordinates relative to the video feed; and display a three-dimensional rendition of the sequence of coordinates forming the navigation path overlaid on the video feed.

9. The telepresence robot system local terminal of claim 8, wherein the three-dimensional rendition of the sequence of coordinates forming the navigation path is overlaid on the video feed with respect to a floor detected in the video feed.

10. The telepresence robot system local terminal of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to: receive a selection of a destination of the remote telepresence robot from the user input device; transmit destination coordinates relative to the plan view map to the remote telepresence robot, the destination coordinates corresponding to the selected destination; receive a sequence of coordinates relative to the plan view map from a navigation system of the remote telepresence robot, the sequence of coordinates forming a navigation path between the current position of the remote telepresence robot and the desired destination of the remote telepresence robot; and display the sequence of coordinates forming the navigation path overlaid on the plan view map.

11. The telepresence robot system local terminal of claim 10, wherein the instructions executable by the processor are further configured to cause the processor to: determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot; apply the distortion to the sequence of coordinates forming the navigation path to determine corresponding video coordinates and perspective data describing a location and perspective of the sequence of coordinates relative to the video feed; and display a three-dimensional rendition of the sequence of coordinates forming the navigation path overlaid on the video feed.

12. The telepresence robot system local terminal of claim 11, wherein the three-dimensional rendition of the sequence of coordinates forming the navigation path is overlaid on the video feed with respect to a floor detected in the video feed.

13. The telepresence robot system local terminal of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to receive coordinates on the plan view map of an obstacle detected by a sensor system of the remote telepresence robot.

14. The telepresence robot system local terminal of claim 1, wherein the plan view map is stored remotely.

15. The telepresence robot system local terminal of claim 14, wherein the plan view map is stored within the remote telepresence robot.

16. The telepresence robot system local terminal of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to: determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot; and generate a hybrid map view comprising a blended view of the plan view map and the video feed from the imaging system of the remote telepresence robot.

17. The telepresence robot system local terminal of claim 16, wherein the hybrid map view comprises a three-dimensional representation of the plan view map overlaid on the video feed.

18. The telepresence robot system local terminal of claim 1, wherein the instructions executable by the processor are further configured to cause the processor to: receive a request via the user input device for a rendered look ahead for a virtual location of the remote telepresence robot on the plan view map; determine a distortion between the plan view map and the video feed received from the imaging system of the remote telepresence robot; generate a virtual three-dimensional video feed based on a virtual location of the remote telepresence robot; and display the virtual three-dimensional video feed based on the virtual location of the remote telepresence robot.

19. The telepresence robot system local terminal of claim 1, wherein the instructions executable by the processor are configured to cause the processor to display the plan view map with an indication of the current position of the remote telepresence robot on the plan view map.

20. The telepresence robot system local terminal of claim 1, wherein the user input device is further configured to allow a user to select a movement for the remote telepresence robot by selecting a destination of the remote telepresence robot from a list of pre-defined destinations.

21. The telepresence robot system local terminal of claim 1, wherein the plan view map is representative of robot-navigable areas of a robot operating surface.

22. A method for controlling a remote telepresence robot, comprising: retrieving at least a portion of a plan view map representative of a facility; receiving a video feed from an imaging system of the remote telepresence robot at a first perspective; receiving positioning data associated with a current position of the remote telepresence robot relative to the plan view map; displaying the video feed from the imaging system of the remote telepresence robot; and transmitting a command to the remote telepresence robot; and receiving a plurality of movement selections from a user input device via a single user interface, including: at least one movement selection made with respect to the video feed; at least one movement selection made with respect to the plan view map; and at least one movement selection made with respect to the current position of the remote telepresence robot by incrementally advancing the remote telepresence robot in a direction relative to the current position of the remote telepresence robot.
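Several of the claims above (8, 11, 16, 18) recite determining a "distortion" between the plan view map and the video feed and applying it to the path coordinates so the navigation path can be overlaid on the video. For points on a detected floor plane, one common way to model such a map-to-image distortion is a 3x3 planar homography; the sketch below assumes that model, and the matrix values and function name are illustrative, not taken from the patent.

```python
def apply_homography(H, points):
    """Map (x, y) plan-view coordinates to video-feed pixel coordinates
    using a 3x3 planar homography H (list of three rows of three floats)."""
    out = []
    for x, y in points:
        u = H[0][0] * x + H[0][1] * y + H[0][2]
        v = H[1][0] * x + H[1][1] * y + H[1][2]
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        out.append((u / w, v / w))  # perspective divide
    return out

# Toy homography: a pure scale of 100 pixels per map unit (no perspective).
H = [[100.0, 0.0, 0.0],
     [0.0, 100.0, 0.0],
     [0.0, 0.0, 1.0]]

# Navigation path as a sequence of coordinates on the plan view map.
path = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
pixels = apply_homography(H, path)
```

In practice the homography would be estimated from the robot's pose and camera calibration rather than hard-coded, and the resulting pixel coordinates would drive the three-dimensional rendition drawn over the floor in the video feed.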
Patents cited by this patent (1)
Allard, James R., Method and system for remote control of mobile robot.