Interactive and shared augmented reality system and method having local and remote access
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): H04N-005/232; H04N-005/225
Application Number: UP-0743315 (2003-12-23)
Registration Number: US-7714895 (2010-06-03)
Priority Information: SE-0203908 (2002-12-30)
Inventors: Pretlove, John; Pettersen, Thomas
Applicant: ABB Research Ltd.
Agent: Venable LLP
Citation Information: cited by 69 patents; cites 19 patents
Abstract
An augmented reality system including a camera movably located at a local site captures an image. A registering unit generates graphics and registers the generated graphics to the image from the camera to provide a composite augmented reality image. A display device located at a remote site, physically separated from the local site, displays a view including the composite augmented reality image. A communication link communicates information between the local and the remote site. A specifying unit specifies a position and an orientation in the remote site. The registering unit is adapted to register the generated graphical representation to the image in dependence of the specified position and orientation. The camera is arranged such that its position and orientation is dependent on the specified position and orientation.
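The core dependency described in the abstract, where the local camera's pose follows the pose specified at the remote site and graphics are registered to the image using that pose, can be illustrated with a minimal Python sketch. This is not the patented implementation; all names are hypothetical, and the full 6-DOF pose is reduced to a position plus a yaw angle for brevity:

```python
# Illustrative sketch only -- not the patented implementation.
# Models the dependency from the abstract: the local camera's pose
# mirrors the remotely specified pose, and generated graphics are
# registered to the camera image using that pose.

from dataclasses import dataclass

@dataclass
class Pose:
    """Position (x, y, z) plus orientation as a single yaw angle."""
    x: float
    y: float
    z: float
    yaw: float  # radians, about the vertical axis

def remote_to_local(remote_pose: Pose, frame_offset: Pose) -> Pose:
    """Map a pose from the remote coordinate system to the local one.

    `frame_offset` is a hypothetical fixed transform between the two
    coordinate systems; a real system would use full 6-DOF transforms.
    """
    return Pose(
        remote_pose.x + frame_offset.x,
        remote_pose.y + frame_offset.y,
        remote_pose.z + frame_offset.z,
        remote_pose.yaw + frame_offset.yaw,
    )

def register_graphics(image: list, graphics: list, camera_pose: Pose) -> dict:
    """Overlay graphics on the image, annotated with the pose used.

    Placeholder for the registering unit: a real implementation would
    project the graphics through a camera model at `camera_pose`.
    """
    return {"image": image, "overlay": graphics, "pose": camera_pose}

# The operator moves at the remote site; the camera pose follows.
remote = Pose(1.0, 2.0, 0.5, 0.1)
camera = remote_to_local(remote, Pose(0.0, 0.0, 0.0, 0.0))
composite = register_graphics(["pixels"], ["waypoint marker"], camera)
```

With a zero frame offset, the camera pose equals the remotely specified pose, which is the claimed arrangement in its simplest form.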
Representative Claims
The invention claimed is:

1. A system for remote programming of an industrial robot, the system comprising: a camera for capturing an image, the camera being movably located on the robot at a local site, a first registering unit configured to generate graphics and register the graphics generated by the first registering unit on the image from the camera, to provide a composite augmented reality image comprising graphical information overlaid on an image captured at the local site, a remote display device carried by or arranged on an operator located at a remote site, physically separated from the local site, for displaying a view comprising the composite augmented reality image, a tracking unit associated with the remote display device and configured to determine a remote position and an orientation of the remote display device in the remote site in relation to a fixed remote coordinate system, wherein the first registering unit is adapted to register the generated graphics and image captured at the local site to the augmented reality image in dependence on the position and orientation specified by the tracking unit, and wherein the remote display device is adapted to display the generated graphics to the augmented reality image and image captured at the local site in dependence on the position and orientation specified by the first specifying unit, a specifying unit configured to specify a position and an orientation of the robot at the local site in relation to a local coordinate system, wherein a position and orientation of the robot is dependent on the remote position and orientation specified by the tracking unit in the remote coordinate system, a second registering unit configured to generate graphics and register the generated graphics on an environment at the local site or an image of the environment of the local site, in dependence on the position and the orientation specified by the specifying unit, to provide a composite augmented reality image comprising the generated graphics overlaid on the image of the environment at the local site, and a communication link configured to communicate information between the local site and the remote site, and to communicate to the robot positions and orientations specified by the tracking unit.

2. The system according to claim 1, wherein said tracking unit is adapted to determine a position and orientation of a movable device located at the remote site, the first registering unit adapted to register the generated graphics on the image in dependence of the position and orientation of the movable device, and the camera is arranged such that its position and orientation are dependent on the position and orientation of the movable device.

3. The system according to claim 2, wherein said movable device is the remote display device.

4. The system according to claim 1, further comprising a graphical generator configured to generate a graphical representation, wherein the first registering unit and the second registering unit are adapted to generate graphics based on the graphical representation.

5. The system according to claim 1, further comprising an operator input device located at the remote site and configured to feed data related to the graphics to be displayed to the system, wherein the system is adapted to generate the graphics based on said data.

6. The system according to claim 5, wherein said operator input device comprises a pointing device and a tracking unit configured to determine a position of the pointing device and wherein the system is adapted to generate a graphical representation of a point pointed out by a pointing member based on the position of the pointing device.

7. The system according to claim 1, further comprising a second movable device located at the local site, wherein the specifying unit comprises a second tracking unit configured to determine the position and the orientation of the second movable device.

8. The system according to claim 7, wherein said second movable device is a local display device.

9. The system according to claim 7, further comprising a second camera for capturing an image, the camera being arranged in a fixed relation to the second movable device, wherein the second registering unit is adapted to register the graphics generated by the second registering unit to the image from the second camera, to provide a composite augmented reality image, and wherein a local display device is adapted to display a view comprising the composite augmented reality image.

10. The system according to claim 9, further comprising a handheld display device comprising a display member and the camera.

11. The system according to claim 10, wherein the handheld display device is arranged so that the user seems to look directly through the display.

12. The system according to claim 1, wherein the remote display device is adapted to display a view seen from a first visual angle that depends on the position and orientation received from the first specifying tracking unit and wherein a local display device is adapted to display the same view as the remote display device seen from a second visual angle that depends on the position and orientation received from the second specifying unit.

13. The system according to claim 1, wherein the communication link is configured to transfer voices between the remote and the local site.

14. The system according to claim 1, wherein the communication link comprises a network.

15. The system according to claim 1, wherein the system is configured for remote programming of an industrial robot by controlling movements of the robot at the local site and teaching the robot one or more waypoints to carry out a task.

16. The system according to claim 1, wherein the robot comprises elements for a paint application.

17. The system according to claim 1, further comprising: a local display device configured to display the composite augmented reality image comprising the graphics generated by the second registering unit overlaid on the image of the environment at the local site.

18. A method for remote programming of an industrial robot arranged at a local site, the method comprising: obtaining an image at the local site from a camera mounted on the robot, generating first graphics, registering the first graphics with the image at the local site, generating a composite augmented reality image with a registering unit based on the image, the generated first graphics, and the specified position and orientation, displaying the augmented reality image on a remote display device carried by or arranged on an operator at a remote site that is physically separated from the local site, specifying a remote position and an orientation at the remote site with a tracking unit carried by or arranged on an operator at the remote site, wherein the position and orientation are in relation to a remote coordinate system, specifying a local position and an orientation at the local site in relation to a local coordinate system based on the remote position and orientation specified at the remote site in the remote coordinate system, positioning and orienting the robot in the specified local position and orientation such that a camera arranged on the robot assumes the specified local position and orientation, displaying a view comprising an environment of the local site and the generated first graphics projected on the environment in dependence of the specified local position and orientation, and remotely controlling movements of the robot at the local site and remotely teaching the robot one or more waypoints at the local site to carry out a task.

19. The method according to claim 18, wherein specifying a remote position and an orientation comprises determining a remote position and an orientation of a movable device located at the remote site and wherein the camera is positioned and oriented according to the remote position and orientation of the movable device.

20. The method according to claim 19, wherein said movable device comprises a remote display device and wherein said view comprising the composite augmented reality image is displayed on the remote display device.

21. The method according to claim 18, wherein the camera is mounted on the robot, the method further comprising controlling movements of the robot according to the remote position and orientation of a movable device.

22. The method according to claim 18, further comprising obtaining data related to the generated first graphics to be displayed, and generating the first graphics based on said data.

23. The method according to claim 18, further comprising receiving information about the position of a pointing device and generating first graphics representing a point pointed out by a pointing member, based on the position of the pointing device.

24. The method according to claim 18, wherein specifying a local position and an orientation in the local site comprises determining a local position and an orientation of a second movable device located at the local site.

25. The method according to claim 24, wherein the second movable device comprises a local display device and wherein said view, comprising the environment of the local site and the graphics, is displayed on the local display device.

26. The method according to claim 25, further comprising capturing an image from a second camera being arranged in a fixed relation to the second movable device, registering the generated graphics on the image from the second camera, to provide a composite augmented reality image, and displaying a view comprising the composite augmented reality image on the local display device.

27. The method according to claim 18, further comprising generating second graphics, and displaying the view comprising the environment of the local site and the second graphics projected on the environment in dependence of the specified local position and orientation.

28. The method according to claim 27, further comprising generating a local graphical representation, generating a remote graphical representation, transferring the local and remote graphical representations between the local and the remote site, generating the remote first graphics based on the local and the remote graphical representation, and generating the second graphics based on the local and the remote graphical representation.

29. The method according to claim 18, wherein the view displayed at the remote site comprises the environment of the local site and the overlaid graphics seen from a visual angle that depends on the remote position and orientation specified in the remote site and the view displayed in the local site comprises the environment of the local site and the overlaid graphics seen from a visual angle that depends on the local position and orientation specified in the local site.

30. A computer program product, comprising: a computer readable medium; and computer program instructions recorded on the computer readable medium and executable by a processor for performing a method for remote programming of an industrial robot by remotely displaying an augmented reality view comprising graphical information overlaid on an image captured at a local site, the method comprising obtaining an image at the local site from a camera mounted on the robot, generating first graphics, registering the first graphics with the image at the local site, generating a composite augmented reality image with a registering unit based on the image, the generated first graphics, and the specified position and orientation, displaying the augmented reality image on a remote display device carried by or arranged on an operator at a remote site that is physically separated from the local site, specifying a remote position and an orientation at the remote site with a tracking unit carried by or arranged on an operator at the remote site, wherein the position and orientation are in relation to a remote coordinate system, specifying a local position and an orientation at the local site in relation to a local coordinate system based on the remote position and orientation specified at the remote site in the remote coordinate system, positioning and orienting the robot in the specified local position and orientation such that a camera arranged on the robot assumes the specified local position and orientation, displaying a view comprising an environment of the local site and the generated first graphics projected on the environment in dependence of the specified local position and orientation, and remotely controlling movements of the robot at the local site and remotely teaching the robot one or more waypoints at the local site to carry out a task.
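The method steps recited in claim 18 (track the remote pose, move the robot accordingly, capture an image, register graphics, teach a waypoint) can be paraphrased as one iteration of a control loop. This is a hedged sketch, not the claimed implementation; every class and function here is hypothetical:

```python
# Illustrative sketch only -- paraphrases the method steps of claim 18
# as a single control-loop iteration. All names are hypothetical stubs;
# the claim itself defines the method, not this code.

class FakeTracker:
    """Stands in for the tracking unit carried by the remote operator."""
    def read(self):
        return (0.5, 1.0, 0.2)  # remote position (orientation omitted)

class FakeRobot:
    """Stands in for the industrial robot with a mounted camera."""
    def __init__(self):
        self.pose = None
    def move_to(self, pose):
        self.pose = pose            # the mounted camera assumes this pose too
    def capture(self):
        return f"image@{self.pose}"

def to_local_frame(remote_pose, offset=(0.0, 0.0, 0.0)):
    """Map the remote pose into the local coordinate system."""
    return tuple(r + o for r, o in zip(remote_pose, offset))

def programming_step(tracker, robot, waypoints):
    """One iteration: track, move, capture, register, teach a waypoint."""
    remote_pose = tracker.read()                 # specify the remote pose
    local_pose = to_local_frame(remote_pose)     # specify the local pose
    robot.move_to(local_pose)                    # robot and camera follow
    image = robot.capture()                      # obtain the local image
    composite = {"image": image,                 # register graphics on it
                 "overlay": [f"waypoint {w}" for w in waypoints]}
    waypoints.append(local_pose)                 # teach a waypoint
    return composite                             # shown on the remote display

waypoints = []
view = programming_step(FakeTracker(), FakeRobot(), waypoints)
```

Repeating the step accumulates waypoints, which corresponds to teaching the robot a task path while the operator watches the augmented view remotely.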
Patents Cited by This Patent (19)
Dalziel, Marie R. (London, GB); Wiseman, Neil E. (Cambridge, GB); Oliver, Martin A. (Beckington, GB); Forrest, Andrew K. (London, GB); Clocksin, William F. (Girton, GB); King, Tony R. (Cambridge, GB); Wipfel, Rob, Adaptive vision-based controller.
Corby, Jr., Nelson Raymond; Meenan, Peter Michael; Solanas, III, Claude Homer; Vickerman, David Clark; Nafis, Christopher Allen, Augmented reality maintenance system employing manipulator arm with archive and comparison device.
Corby, Jr., Nelson Raymond; Meenan, Peter Michael; Solanas, III, Claude Homer; Vickerman, David Clark; Nafis, Christopher Allen, Augmented reality maintenance system for multiple ROVs.
Friedrich, Wolfgang; Wohlgemuth, Wolfgang, Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus.
Sato, Hiroaki; Aso, Takashi; Kawai, Tomoaki; Iizuka, Yoshio; Morita, Kenji; Shimoyama, Tomohiko; Matsui, Taichi, Image processing apparatus and method with setting of prohibited region and generation of computer graphics data based on prohibited region and first or second position/orientation.
Ellenby, John (San Francisco, CA); Ellenby, Thomas (San Francisco, CA); Ellenby, Peter (San Francisco, CA), Vision imaging devices and methods exploiting position and attitude.
Ellenby, John; Ellenby, Thomas; Ellenby, Peter, Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time.
Mandella, Michael J.; Gonzalez-Banos, Hector H.; Alboszta, Marek, Computer interface for manipulated objects with an absolute pose detection component.
Hill, David M.; Evertt, Jeffrey J.; Jones, Alan M.; Roesler, Richard C.; Jean, Andrew William; Charbonneau, Emiko V., Enhanced configuration and control of robots.
Ballard, Brian Adams; Jenkins, Jeff; English, Edward Robert; Reily, Todd Richard; Athey, James Leighton, Method and system for representing and interacting with augmented reality content.
Hill, David M.; Jean, Andrew William; Evertt, Jeffrey J.; Jones, Alan M.; Roesler, Richard C.; Carlson, Charles W.; Charbonneau, Emiko V.; Dack, James, Mixed environment display of attached control elements.
Hill, David M.; Jones, Alan M.; Evertt, Jeffrey J.; Roesler, Richard C.; Charbonneau, Emiko V.; Jean, Andrew William, Mixed environment display of robotic actions.
Geisner, Kevin A; Perez, Kathryn Stone; Latta, Stephen G.; Sugden, Ben J; Vaught, Benjamin I; Cole, Jeffrey B; Kipman, Alex Aben-Athar; McIntyre, Ian D; McCulloch, Daniel, Service provision using personal audio/visual system.