A method of operating a robot includes electronically receiving images and augmenting the images by overlaying a representation of the robot on the images. The robot representation includes user-selectable portions. The method includes electronically displaying the augmented images and receiving an indication of a selection of at least one user-selectable portion of the robot representation. The method also includes electronically displaying an intent to command the selected at least one user-selectable portion of the robot representation, receiving an input representative of a user interaction with at least one user-selectable portion, and issuing a command to the robot based on the user interaction.
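The abstract's select-then-command flow can be sketched as follows. This is a minimal illustration, not the patent's implementation: the part names, the `Swipe` gesture type, and the dictionary command format are all assumptions introduced here for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class Swipe:
    dx: float  # horizontal finger travel in screen pixels
    dy: float  # vertical finger travel in screen pixels

@dataclass
class RobotOverlay:
    # User-selectable portions overlaid on the received images
    # (part names are illustrative, not from the patent).
    parts: tuple = ("vehicle_body", "gripper", "link", "actuator")
    selected: set = field(default_factory=set)

    def select(self, part: str) -> None:
        """Mark a user-selectable portion; a real UI would highlight it
        to display the intent to command that portion."""
        if part not in self.parts:
            raise ValueError(f"unknown part: {part}")
        self.selected.add(part)

    def command_from_gesture(self, gesture: Swipe) -> list:
        """Translate a user gesture on the selected portions into
        per-part commands to issue to the robot."""
        return [{"part": p, "move": (gesture.dx, gesture.dy)}
                for p in sorted(self.selected)]

overlay = RobotOverlay()
overlay.select("gripper")
commands = overlay.command_from_gesture(Swipe(dx=12.0, dy=-3.0))
print(commands)  # one command per selected portion
```

In the claims below, the gesture-to-command step additionally runs through inverse kinematics before a command is issued; this sketch only shows the selection and user-interaction plumbing.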
Representative Claims
1. A method of operating a robot, the method comprising: electronically receiving images; augmenting the images by overlaying a representation of the robot on the images, the robot representation comprising user-selectable portions corresponding to movable portions of the robot; electronically displaying the augmented images; receiving an indication of a selection of at least one user-selectable portion of the robot representation; electronically displaying an intent to command the selected at least one user-selectable portion of the robot representation; receiving an input representative of a user interaction with at least one user-selectable portion; determining at least one movement parameter of at least one movable portion of the robot using inverse kinematics based on the received input; and issuing a command to the robot based on the at least one movement parameter.

2. The method of claim 1, wherein the received images are captured by an imaging device disposed on the robot.

3. The method of claim 1, wherein the received images are captured by an imaging device located in a vicinity of the robot.

4. The method of claim 1, wherein the representation of the robot comprises at least one of a representation of a vehicle body, a representation of a gripper, a representation of a link, or a representation of an actuator.

5. The method of claim 4, wherein electronically displaying the intent to command the selected at least one user-selectable robot representation portion comprises modifying an appearance of the robot representation.

6. The method of claim 1, wherein the input representative of the user interaction is received from a touch display and comprises at least one of a linear finger swipe, a curved finger swipe, a multi-finger swipe, a multi-finger gesture, a tap, or a prolonged press.

7. The method of claim 1, wherein receiving the input representative of the user interaction with at least one user-selectable portion comprises: receiving a first input representative of a selection of a displayed object; and receiving a second input representative of a selection of a robot behavior, the robot behavior associated with the object.

8. The method of claim 7, wherein the robot behavior comprises navigating the robot towards the object.

9. The method of claim 8, further comprising receiving an indication of a selection of an alternate approach direction, the robot behavior determining a drive path using odometry and/or inertial measurement signals from an inertial measurement unit of the robot to navigate the robot from a current location and a current approach direction to approach the object from the alternate approach direction.

10. The method of claim 8, wherein the robot behavior comprises grasping the object with a manipulator of the robot.

11. The method of claim 7, further comprising identifying in the images a plane of a ground surface supporting the robot and a location of the object with respect to the ground surface plane.

12. The method of claim 1, further comprising: receiving an indication of a selection of a reverse-out behavior; and executing the reverse-out behavior, the reverse-out behavior: determining at least one reverse-movement parameter of the at least one movable portion of the robot using inverse kinematics to move the at least one movable portion of the robot in an opposite direction along a path moved according to the issued command; and commanding the at least one movable portion of the robot based on the at least one reverse-movement parameter.

13. The method of claim 1, further comprising: receiving an indication of a selection of an anti-tip behavior; and executing the anti-tip behavior, the anti-tip behavior: receiving inertial measurement signals from an inertial measurement unit of the robot; determining a location of a center of gravity of the robot; and when the location of the center of gravity of the robot moves outside a stable volume of space, commanding deployment of an appendage of the robot to alter the location of the center of gravity of the robot or brace the robot against a supporting surface to prevent tipping of the robot.

14. The method of claim 1, further comprising providing haptic feedback in response to the received input.

15. The method of claim 1, further comprising, when the determined at least one movement parameter of the at least one movable portion of the robot based on the received input violates a movement policy or is unexecutable, issuing a negative feedback response.

16. The method of claim 15, wherein the negative feedback response comprises at least one of a haptic feedback response, an audio feedback response, or a visual feedback response, the visual feedback response comprising displaying an indicator at or near any portions of the robot representation corresponding to any unmovable portions of the robot based on the received input.

17. An operator controller unit for controlling a robot, the operator controller unit comprising: a screen; and a processor in communication with the screen, the processor configured to: electronically receive images; augment the images by overlaying a representation of the robot on the images, the robot representation comprising user-selectable portions corresponding to movable portions of the robot; electronically display the augmented images on the screen; receive an indication from the screen of a selection of at least one user-selectable portion of the robot representation; electronically display on the screen an intent to command the selected at least one user-selectable portion of the robot representation; receive an input from the screen representative of a user interaction with at least one user-selectable portion; determine at least one movement parameter of at least one movable portion of the robot using inverse kinematics based on the received input; and issue a command to the robot based on the at least one movement parameter.

18. The operator controller unit of claim 17, wherein the received images are captured by an imaging device disposed on the robot.

19. The operator controller unit of claim 17, wherein the received images are captured by an imaging device located in a vicinity of the robot.

20. The operator controller unit of claim 17, wherein the representation of the robot comprises at least one of a representation of a vehicle body, a representation of a gripper, a representation of a link, or a representation of an actuator.

21. The operator controller unit of claim 20, wherein electronically displaying the intent to command the selected at least one user-selectable robot representation portion comprises modifying an appearance of the representation of the robot.

22. The operator controller unit of claim 17, wherein the input representative of the user interaction is received from a touch display and comprises at least one of a linear finger swipe, a curved finger swipe, a multi-finger swipe, a multi-finger gesture, a tap, or a prolonged press.

23. The operator controller unit of claim 17, wherein receiving the input representative of the user interaction with at least one user-selectable portion comprises: receiving a first input representative of a selection of a displayed object; and receiving a second input representative of a selection of a robot behavior, the robot behavior associated with the object.

24. The operator controller unit of claim 23, wherein the robot behavior comprises navigating the robot towards the object.

25. The operator controller unit of claim 24, wherein receiving the input representative of the user interaction further comprises receiving an indication of a selection of an alternate approach direction, the robot behavior determining a drive path using odometry and/or inertial measurement signals from an inertial measurement unit of the robot to navigate the robot from a current location and a current approach direction to approach the object from the alternate approach direction.

26. The operator controller unit of claim 25, wherein the robot behavior comprises grasping the object with a manipulator of the robot.

27. The operator controller unit of claim 23, wherein the processor executes a surface recognition routine that identifies in the images a plane of a ground surface supporting the robot and a location of the object with respect to the ground surface plane.

28. The operator controller unit of claim 17, wherein the processor is configured to: receive an indication of a selection of a reverse-out behavior; and execute the reverse-out behavior, the reverse-out behavior: determining at least one reverse-movement parameter of the at least one movable portion of the robot using inverse kinematics to move the at least one movable portion of the robot in an opposite direction along a path moved according to the issued command; and commanding the at least one movable portion of the robot based on the at least one reverse-movement parameter.

29. The operator controller unit of claim 17, wherein the processor is configured to: receive an indication of a selection of an anti-tip behavior; and execute the anti-tip behavior, the anti-tip behavior: receiving inertial measurement signals from an inertial measurement unit of the robot; determining a location of a center of gravity of the robot; and when the location of the center of gravity of the robot moves outside a stable volume of space, commanding deployment of an appendage of the robot to alter the location of the center of gravity of the robot or brace the robot against a supporting surface to prevent tipping of the robot.

30. The operator controller unit of claim 17, further comprising a vibration device, the processor configured to command the vibration device to provide haptic feedback in response to the received input.

31. The operator controller unit of claim 17, wherein the processor is configured to, when the determined at least one movement parameter of the at least one movable portion of the robot based on the received input violates a movement policy or is unexecutable, issue a negative feedback response.

32. The operator controller unit of claim 31, wherein the negative feedback response comprises at least one of a haptic feedback response, an audio feedback response, or a visual feedback response, the visual feedback response comprising displaying on the screen an indicator at or near any portions of the robot representation corresponding to any unmovable portions of the robot based on the received input.
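Claims 1 and 17 recite determining movement parameters "using inverse kinematics," and claims 15 and 31 recite negative feedback when the result is unexecutable. A minimal sketch of that step, assuming a two-link planar arm (the patent does not specify the robot's kinematic chain; the link-length parameters and the elbow-down solution choice are illustrative):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Return (shoulder, elbow) joint angles in radians that place the
    end effector at (x, y), or None when the target lies outside the
    reachable workspace (the 'unexecutable' case that would trigger a
    negative feedback response)."""
    r2 = x * x + y * y
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        return None  # unreachable target
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """Forward kinematics, used here only to verify the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

# The user drags the gripper representation to (1, 1); the solver
# produces the joint angles ("movement parameters") to command.
angles = two_link_ik(1.0, 1.0, l1=1.0, l2=1.0)
```

The `None` return is where a movement-policy check and the claimed haptic, audio, or visual negative feedback would hook in; the reverse-out behavior of claims 12 and 28 could reuse the same solver over the recorded path traversed in reverse order.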
Patents Cited in This Patent (31)
Phillips, Emilie; Powers, Aaron; Shein, Andrew; Jamieson, Josef P.; Sawyer, Tyson, Autonomous behaviors for a remote vehicle.
Wang, Yulun; Jordan, Charles S.; Laby, Keith P.; Southard, Jonathan; Pinter, Marco; Miller, Brian, Mobile robot with a head-based movement mapping scheme.
Geier, George J.; Heshmati, Ardalan; Johnson, Kelly G.; McLain, Patricia W., Position and velocity estimation system for adaptive weighting of GPS and dead-reckoning information.
Kajita, Shigeo; Awano, Katsusuke; Tozawa, Shoji; Nishikawa, Hiroyasu; Miki, Masatoshi, Remote radio operating system, and remote operating apparatus, mobile relay station and radio mobile working machine.
Hoffman, Orin P. F.; Keefe, Peter; Smith, Eric; Wang, John; Labrecque, Andrew; Ponsler, Brett; Macchia, Susan; Madge, Brian J., Remotely operating a mobile robot.