Country / Type | United States (US) Patent, Granted
IPC (7th edition) | -
Application No. | UP-0313787 (2005-12-22)
Registration No. | US-7787992 (2010-09-20)
Inventors / Address | Pretlove, John; Skourup, Charlotte; Öberg, Pierre; Pettersen, Thomas; Apneseth, Christoffer
Applicant / Address | -
Agent / Address | -
Citation Info | Cited by: 49 / Cited patents: 12
Abstract
A method to generate technical information about a device or process by recognizing the identity of a device or process, retrieving a stored image dependent on the device, combining the stored virtual image together with an image of a real device object, displaying the combination on a display and providing a virtual control HMI means to the user for monitoring and/or controlling the device or process. A graphical user interface, computer program and a system are also described.
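For orientation only, the following is a minimal Python sketch of the pipeline the abstract describes: recognize a device identity, retrieve a stored virtual image, combine it with a camera image of the real object, display the combination, and expose a manipulatable control element. Every name in it (generate_hmi, AugmentedView, VirtualControl, the string stand-ins for images) is a hypothetical illustration, not the patented implementation.

```python
# Hypothetical sketch of the abstract's pipeline, assuming string
# stand-ins for image data and callables for control actions.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class VirtualControl:
    """A manipulatable HMI element superimposed on the real-object image."""
    label: str
    on_manipulate: Callable[[], None]   # control action sent to the device

@dataclass
class AugmentedView:
    real_image: str                     # stands in for camera pixels
    virtual_image: str                  # stands in for the stored overlay
    controls: list[VirtualControl] = field(default_factory=list)

    def display(self) -> str:
        # "Combining the stored virtual image together with an image of
        # a real device object" -- here reduced to string composition.
        return f"{self.real_image} + {self.virtual_image}"

def generate_hmi(device_id: str,
                 camera_image: str,
                 stored_images: dict[str, str],
                 control_system: dict[str, Callable[[], None]]) -> AugmentedView:
    """Retrieve the stored image for the recognized device and build the
    combined view with a virtual control element for that device."""
    overlay = stored_images[device_id]
    view = AugmentedView(camera_image, overlay)
    view.controls.append(
        VirtualControl(f"start/stop {device_id}", control_system[device_id]))
    return view

if __name__ == "__main__":
    # Invented plant data for the example.
    stored = {"pump-7": "<pump-7 status overlay>"}
    actions = {"pump-7": lambda: print("control action sent to pump-7")}
    view = generate_hmi("pump-7", "<camera frame>", stored, actions)
    print(view.display())               # combined image on the display
    view.controls[0].on_manipulate()    # user manipulates the element
```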
Representative Claims
The invention claimed is:

1. A method to generate a human-machine interface for a user to monitor or control an industrial or commercial device or process, the method comprising: receiving with a portable computing device carried by the user information of an identity of said industrial device or process or equipment in the vicinity of the user, matching with said portable computing device said industrial device to control system data or process or equipment, retrieving with the portable computing device at least one stored virtual image dependent on the identity of said industrial device or process or equipment, capturing an image of a real object with an imager carried by the user, combining with the portable computing device the at least one stored virtual image together with said image of a real object, displaying the combination on a display, displaying on a portable display associated with the portable computing device a virtual control human-machine interface for monitoring and/or controlling said industrial device or process or equipment, the human-machine interface comprising at least one of information or manipulatable element superimposed on the image of a real object, and controlling said industrial device or process in response to manipulation by the user of the at least one of information or manipulatable element superimposed on the image of a real object.

2. The method according to claim 1, further comprising: selecting said industrial device or process or equipment by the user moving or directing a pointer, and tracking with a tracking device carried by the user a position in space of the pointer.

3. The method according to claim 2, further comprising: selecting said industrial device or process or equipment by the user moving or directing the pointer in a field of view of the camera, and photographing with the imager the position and/or movement of the pointer.

4. The method according to claim 3, further comprising: selecting a part of said industrial device or process or equipment or at least one stored virtual image by the user moving or directing the pointer, and tracking with a tracking device carried by the user the position of the pointer so as to move or manipulate a computer-generated image of the pointer shown on the display.

5. The method according to claim 1, further comprising: retrieving with the portable computing device status information or condition information for a part of said industrial device or process or equipment dependent on manipulation by the user of the virtual control human-machine interface.

6. The method according to claim 1, further comprising: generating a control action for a part of said industrial device or process or equipment dependent on manipulation by the user of the virtual control human-machine interface.

7. The method according to claim 1, further comprising: detecting with a tracking device a position of a head of the user or other operator, and calculating a position and orientation and/or a real world coordinate position.

8. The method according to claim 2, further comprising: calculating a position and orientation and/or a real world coordinate position of said industrial device or process or equipment by image output from the imager.

9. The method according to claim 8, further comprising: calculating a position and orientation and/or a real world coordinate position with a focal length output from the imager.

10. The method according to claim 2, further comprising: selecting said industrial device or process or equipment by the user moving or directing a pointer, and tracking the position in space of the pointer using a technology based on any of ultrasound, visible light, infra-red light, a GPS system, or a mechanical linkage.

11. The method according to claim 1, further comprising: matching the identity of said industrial device or process or equipment to a physical location of said device or process or to a location embodied as a software entity in a control system for said device or process.

12. The method according to claim 8, further comprising: matching the identity of said industrial device or process or equipment to part of a known image of said industrial device or process or equipment.

13. The method according to claim 8, further comprising: detecting a sign, mark or visual code, and matching the recognized sign, mark or visual code to a stored data location and/or a coordinate comprising said industrial device or process or equipment.

14. The method according to claim 8, further comprising: detecting an optical, radio or sound signal radiating from a selected part of said industrial device or process or equipment, and matching the recognized signal to stored location and/or coordinate data comprising said industrial device or process.

15. The method according to claim 8, further comprising: determining a pose or orientation and/or direction of interest of the user, and calculating the location of said industrial device or process or equipment from a physical location in a plant or from a location in a virtual plant.

16. The method according to claim 8, further comprising: identifying said industrial device or process or equipment, determining a position of the user, calculating the location of said industrial device or process or equipment relative to the position of the user, and providing a graphic orientation to a physical location or a location in a virtual plant.

17. The method according to claim 1, further comprising: matching with stored data a marker that is visible, and identifying said industrial device or process or equipment by recognition of one or more numbers and/or letters or coded numbers and/or letters.

18. The method according to claim 17, further comprising: matching with stored data a marker that is visible, and identifying said industrial device or process or equipment by recognition of one or more numbers and/or letters encoded in a visible image.

19. The method according to claim 1, further comprising: matching an image of said industrial device or process or equipment to stored data, and identifying said industrial device or process or equipment with an image recognition module.

20. The method according to claim 15, further comprising: calculating a position of the pointer, matching the position to a location in a representation of the plant, and showing the graphic representation of the pointer overlaid on the representation of the plant.

21. A computer program product, comprising: a computer readable medium; and computer program instructions recorded on the computer readable medium and executable by a processor for carrying out a method comprising receiving with a portable computing device carried by the user information of an identity of said industrial device or process or equipment in the vicinity of the user, matching with said portable computing device said industrial device to control system data or process or equipment, retrieving with the portable computing device at least one stored virtual image dependent on the identity of said industrial device or process or equipment, capturing an image of a real object with an imager carried by the user, combining with the portable computing device the at least one stored virtual image together with said image of a real object, displaying the combination on a display, displaying on a portable display associated with the portable computing device a virtual control human-machine interface for monitoring and/or controlling said industrial device or process or equipment, the human-machine interface comprising at least one of information or manipulatable element superimposed on the image of a real object, and controlling said industrial device or process in response to manipulation by the user of the at least one of information or manipulatable element superimposed on the image of a real object.

22. A system for generating a human-machine interface for a user to use to monitor or control an industrial or commercial device or process, comprising one or more augmented reality systems, each comprising: a detecting module configured to detect an identifying marker, a matching module configured to match the identified marker to said industrial or commercial device or process, a handheld interacting and pointing device comprising a tracking system for determining a position and orientation of the interacting and pointing device in relation to a world coordinate system, a graphic manipulation system for generating at least one stored virtual image arranged with computer program means for control of monitoring and/or control functions, the graphic manipulation system comprising manipulatable elements, a camera for capturing an image of a real object, and a wearable display device for visualizing an augmented reality display comprising the graphic manipulation system superimposed on said image captured by the camera of a real object of a part of said industrial or commercial device or process.

23. The system according to claim 22, wherein the handheld interacting and pointing device is configured to select said industrial or commercial device or process and to determine a position of the pointer, and wherein the handheld interacting and pointing device comprises a sensor integrated or built in to the pointer.

24. The system according to claim 22, wherein the handheld interacting and pointing device is configured to select said industrial or commercial device or process and determine a position of the pointer, and wherein the handheld interacting and pointing device comprises a camera operated by the user.

25. The system according to claim 22, wherein the handheld interacting and pointing device is configured to select said industrial or commercial device or process and determine a position of the pointer, and wherein the handheld interacting and pointing device comprises a camera operated by the user and the handheld interacting and pointing device.

26. The system according to claim 22, wherein the handheld interacting and pointing device is configured to select said industrial or commercial device or process and determine a position of the pointer, and wherein the handheld interacting and pointing device comprises a technology based on any from the list of: ultrasound, visible light, infra-red light, a GPS system, a mechanical linkage.

27. The system according to claim 22, further comprising: one or more computer program modules for determining a position of a pointer in the real world and means for calculating a position of the pointer on a display for an image of the real world pointer.

28. The system according to claim 22, wherein the wearable display device is arranged in a known position relative to the camera or integrated with the camera.

29. The system according to claim 25, further comprising: an image capture system unit or camera system for a video see-through system arrangement.

30. The system according to claim 22, further comprising: an identifying marker arranged relative to said industrial or commercial device or process.

31. The system according to claim 30, further comprising: an identifying marker comprising any from the list of: numerical sign, alphanumeric sign, coded alphanumeric sign, machine readable visible sign, bar code, visual sign embodying encoded alphanumeric information.

32. The system according to claim 30, further comprising: an identifying marker comprising a signal comprising any from the list of: IR detector or emitter, ultrasound emitter, sensor for a laser beam, wireless radio transmitter and receiver.

33. The system according to claim 22, further comprising: a tracking module configured to detect a position of a head of the user, or other operator, and a calculator configured to calculate a position and orientation and/or a real world coordinate position of the user.

34. The system according to claim 22, further comprising: an application unit comprising software modules configured to integrate real-time data and/or graphics with an image of an identified said industrial device or process.

35. The system according to claim 22, further comprising: a registration unit comprising software and/or hardware modules configured to overlay real world images with the computer-generated graphics for an optical see-through arrangement.

36. The system according to claim 22, further comprising: a communication module configured to support voice communication and/or sharing of parts or all of an augmented reality interface and/or additional virtual information for a plurality of users.

37. The system according to claim 36, further comprising: a computer program module for one or more other logged-on users to manipulate the augmented reality display.

38. The system according to claim 36, further comprising: a computer program module for one or more other logged-on users to attach text or graphic annotations to an image comprised in the augmented reality display.

39. The system according to claim 22, further comprising: a computer program module and database module for handling information from documentation or other sources.

40. The system according to claim 22, wherein a hand or finger of the user is configured in the system as an interacting and pointing device and/or arranged with a ring, collar, bracelet or other identifying object.
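Claims 13, 17 and 22 describe detecting a visible marker (an alphanumeric or coded sign) and matching it to stored data for the device. The sketch below illustrates only that matching step, under assumptions the patent does not make: the "DEV-NNNN" marker format, the registry contents, and the function names are all invented for the example.

```python
# Hypothetical marker-to-device matching step (cf. claims 13, 17, 22).
# Assumes the marker has already been decoded from the camera image
# into text; the marker format and registry are invented for this example.
import re

MARKER_PATTERN = re.compile(r"DEV-(\d{4})")   # invented marker format

DEVICE_REGISTRY = {
    "DEV-0042": {"name": "valve V-101", "location": (12.5, 3.0, 1.8)},
}

def match_marker(decoded_text: str) -> dict | None:
    """Match a recognized alphanumeric marker to a stored data location
    and coordinates, returning None when no device is registered."""
    m = MARKER_PATTERN.search(decoded_text)
    if m is None:
        return None
    return DEVICE_REGISTRY.get(m.group(0))

print(match_marker("camera saw: DEV-0042"))   # -> the valve V-101 record
print(match_marker("no marker in frame"))     # -> None
```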