Process and device for determining the position and the orientation of an image reception means
IPC Classification Information
Country / Type
United States (US) Patent
Registered
International Patent Classification (IPC 7th edition)
G05B-015/00
G05B-019/00
Application Number
UP-0952313
(2004-09-28)
Registration Number
US-7818091
(2010-11-08)
Priority Information
DE-103 45 743(2003-10-01)
Inventors / Address
Kazi, Arif
Bischoff, Rainer
Applicant / Address
Kuka Roboter GmbH
Agent / Address
Wood, Herron & Evans, LLP
Citation Information
Cited by: 22
Cited patents: 27
Abstract
A process and a device are provided for determining the pose as the entirety of the position and the orientation of an image reception device. The process is characterized in that the pose of the image reception device is determined with the use of at least one measuring device that is part of a robot. The device is characterized by a robot with an integrated measuring device that is part of the robot for determining the pose of the image reception device.
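The "pose" described in the abstract combines a position and an orientation into a single quantity. As an illustrative sketch only (the patent does not prescribe a representation), a pose is commonly encoded as a 4x4 homogeneous transformation matrix; the function and values below are hypothetical:

```python
def pose_matrix(position, rotation):
    """Build a 4x4 homogeneous transform from a 3-vector position
    and a 3x3 rotation matrix (pose = position + orientation)."""
    return [
        [rotation[0][0], rotation[0][1], rotation[0][2], position[0]],
        [rotation[1][0], rotation[1][1], rotation[1][2], position[1]],
        [rotation[2][0], rotation[2][1], rotation[2][2], position[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Identity orientation at position (1, 2, 3)
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
T = pose_matrix([1.0, 2.0, 3.0], identity)
```

Chaining such transforms lets the pose of a camera mounted on the robot be expressed in the robot's base frame, which is the role the patent assigns to the robot's integrated measuring device.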
Representative Claims
The invention claimed is:

1. A process for determining the pose as the entirety of the position and the orientation of an image reception means, the process comprising: providing an image reception means; recording an image with said image reception means; providing an automatic handling device, said automatic handling device comprising a measuring means; determining the pose of the image reception means with said measuring means of said automatic handling device; providing an augmented reality system, said augmented reality system comprising an image-generating component; delivering said determined pose of said image reception means and said recorded image into said image-generating component of said augmented reality system; superimposing a virtual image on said recorded image in said augmented reality system to form a superimposed image; displaying said superimposed image via said augmented reality system; the user selecting an object in said superimposed image; changing traveling commands of said automatic handling device with said augmented reality system such that an orientation of said image reception means corresponds to a position of said selected object, wherein said selected object is arranged in a center position of said recorded image of said image reception means when said orientation of said image reception means corresponds to said position of said selected object; manually moving said automatic handling device to view the selected object from another perspective when the viewing point of the selected object is not suitable; maintaining the image reception means with the automatic handling device in a manually selected orientation when the viewing point is suitable; and storing the traveling commands.

2. A process in accordance with claim 1, wherein the pose of the image reception means is determined via angle sensors located in axle drives of said automatic handling device, said automatic handling device being a robot.

3. A process in accordance with claim 1, wherein said measuring means obtains pose data, said measuring means determining the pose of said image reception means, said pose data being assigned to the images recorded in a particular pose by said image reception means.

4. A process in accordance with claim 1, wherein said augmented reality system inserts virtual information of said automatic handling device into images recorded by the image reception means.

5. A process in accordance with claim 1, wherein an optical display is used as the image reception means.

6. A process in accordance with claim 1, wherein a camera is used as the image reception means.

7. A process in accordance with claim 6, wherein the camera is used at said automatic handling device, which is programmed especially to perform tasks.

8. A process in accordance with claim 6, wherein a camera is used on another automatic handling device located adjacent to said automatic handling device programmed to perform tasks.

9. A process in accordance with claim 1, wherein said measuring means is connected to said automatic handling device.

10. A process in accordance with claim 1, wherein the pose of said automatic handling device in the space is determined by means of at least one mark arranged in the space as well as at least one said measuring means arranged at a robot, such as a camera or a sensor.

11. A process in accordance with claim 1, wherein the pose of said automatic handling device in the space is determined by means of at least one mark arranged at said automatic handling device as well as at least one said measuring means arranged independently from said automatic handling device, such as a camera or at least one sensor.

12. A process in accordance with claim 1, wherein the pose of said automatic handling device relative to another automatic handling device is determined by means of a mark arranged on said automatic handling device and said measuring means, such as a camera or a sensor, which is arranged at said another automatic handling device.

13. A process in accordance with claim 1, wherein combining said recorded image with said virtual image to form said combined image includes overlaying said virtual image on said recorded image.

14. A device for determining a pose as the entirety of a position and an orientation of an image reception means, the device comprising: an automatic handling device with a measuring means integrated with said automatic handling device for determining the pose of the image reception means, said image reception means recording an image; an augmented reality system comprising an image-generating component, said image-generating component receiving said determined pose of said image reception means and said recorded image, wherein a virtual image of said augmented reality system is superimposed on said recorded image to form a superimposed image, said augmented reality system generating an image reception means orientation signal when a position of a selected object in said superimposed image is changed in said augmented reality system, said image reception means receiving said image reception means orientation signal such that said image reception means is moved to a selected object position, said selected object position corresponding to said position of said selected object in said superimposed image, said selected object being arranged opposite a center position of said image reception means when said image reception means is in said selected object position; and a control for manually moving said automatic handling device to view the selected object from another perspective when the viewing point of the selected object is not suitable, for maintaining the image reception means with the automatic handling device in a manually selected orientation when the viewing point is suitable, and for storing traveling commands of said automatic handling device.

15. A device in accordance with claim 14, wherein an assigning means is provided for assigning pose data, which determine the pose of an image reception means and are obtained by said measuring means of said automatic handling device, with the particular images recorded in a particular pose by the image reception means.

16. A device in accordance with claim 14, wherein a means for inserting specific virtual information is provided for inserting virtual information into images recorded by the image reception means.

17. A device in accordance with claim 14, wherein said measuring means of said automatic handling device has angle sensors integrated in robot axles of said automatic handling device.

18. A device in accordance with claim 14, wherein said image reception means is an optical display.

19. A device in accordance with claim 18, wherein the camera is arranged on said automatic handling device, which is programmed especially to perform a task.

20. A device in accordance with claim 14, wherein said image reception means is a camera.

21. A device in accordance with claim 14, wherein a measuring means for determining the site and the position of a said image reception means arranged separately from said automatic handling device is arranged on said automatic handling device.

22. A device in accordance with claim 14, further comprising fixed marks in space as well as on said measuring means arranged on said automatic handling device, for determining the pose of said automatic handling device relative to said marks.

23. A device in accordance with claim 14, wherein at least one said mark is arranged on said automatic handling device as well as at least one said measuring means, which is arranged independently from said automatic handling device, such as a camera or at least one sensor, for receiving the pose of the said automatic handling device in the space.

24. A device in accordance with claim 14, wherein a mark is arranged on said automatic handling device as well as said measuring means, which is arranged on another said automatic handling device, such as a camera or a sensor, for determining the relative positions of the said two automatic handling devices.

25. A process for determining the orientation and position of an image reception means, the process comprising: providing an image reception means; recording an image with said image reception means; providing a robot, said robot comprising a measuring means; determining a pose of the image reception means with said measuring means of said robot; providing an augmented reality system, said augmented reality system comprising an image-generating component; delivering said determined pose of said image reception means and said recorded image into said image-generating component of said augmented reality system; superimposing a virtual image on said recorded image of said augmented reality system to form a superimposed image; displaying said superimposed image via said augmented reality system; the user selecting at least one object in said superimposed image; changing an orientation of said image reception means such that said orientation of said image reception means corresponds to a position of said selected object in said superimposed image, said selected object being arranged in a center position of said recorded image; manually moving said automatic handling device to view the selected object from another perspective when the viewing point of the selected object is not suitable; maintaining the image reception means with the automatic handling device in a manually selected orientation when the viewing point is suitable; and storing traveling commands of said automatic handling device.
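Claims 2 and 17 state that the pose of the image reception means can be recovered from angle sensors located in the robot's axle drives. A minimal forward-kinematics sketch for a planar serial arm illustrates how joint angles alone determine an end-effector pose; the two-link geometry and link lengths here are hypothetical simplifications, not taken from the patent:

```python
import math

def planar_fk(joint_angles, link_lengths):
    """Planar forward kinematics: accumulate each joint angle into
    the running heading, then step along the link to get the
    end-effector position (x, y) and heading theta in radians."""
    x = y = theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y, theta

# Two unit-length links, joints at +90 deg and -90 deg:
# the arm reaches (1, 1) with a final heading of 0.
x, y, theta = planar_fk([math.pi / 2, -math.pi / 2], [1.0, 1.0])
```

A real six-axis robot performs the same accumulation in three dimensions, typically as a chain of homogeneous transforms, which is why the integrated angle sensors suffice as the claimed "measuring means".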
Patents cited by this patent (27)
Kelley Robert B. (Kingston RI) Birk John R. (Peace Dale RI) Duncan Dana L. (Wakefield RI) Tella Richard P. (Ashaway RI) Wilson Laurie J. (Ashaway RI), Apparatus and method to enable a robot with vision to acquire, orient and transport workpieces.
Steiner Walter R. (Ormond Beach FL) Kelly William A. (West Port Orange FL) Caesar ; Jr. Robert J. (Daytona Beach FL) Buchner Gregory C. (South Daytona FL) Morgan Michael L. (Ormond Beach FL), Computer image generation method for determination of total pixel illumination due to plural light sources.
Wang Xuguang (Provo UT) Red Walter E. (Provo UT) Manley Peter H. (Alpine UT), Robot end-effector terminal control frame (TCF) calibration method and device.
Bradski, Gary; Konolige, Kurt; Rublee, Ethan; Straszheim, Troy; Strasdat, Hauke; Hinterstoisser, Stefan, Continuous updating of plan for robotic object manipulation based on received sensor data.
Konolige, Kurt; Rublee, Ethan; Hinterstoisser, Stefan; Straszheim, Troy; Bradski, Gary; Strasdat, Hauke, Detection and reconstruction of an environment to facilitate robotic interaction with the environment.
Konolige, Kurt; Rublee, Ethan; Hinterstoisser, Stefan; Straszheim, Troy; Bradski, Gary; Strasdat, Hauke Malte, Detection and reconstruction of an environment to facilitate robotic interaction with the environment.
Lin, Yhu-Tin; Daro, Timothy; Abell, Jeffrey A.; Turner, III, Raymond D.; Casoli, Daniel J., System and method for controlling a vision guided robot assembly.