IPC Classification Information

Country / Type | United States (US) Patent, Granted
IPC (7th edition) |
Application No. | US-0829322 (2004-04-22)
Registration No. | US-7343036 (2008-03-11)
Priority | DE-103 18 205 (2003-04-22)
Inventor / Address |
Applicant / Address | Siemens Aktiengesellschaft
Agent / Address | Harness, Dickey & Pierce, PLC
Citation Info | Times cited: 32 / Patents cited: 7
Abstract
A computer-based 3D imaging method is for a wireless endoscope unit, equipped with a camera, of the size of a capsule that can be swallowed by the patient. A medical apparatus is for the pseudo three-dimensional representation of the surroundings of the endoscope unit. The images recorded by the camera are subjected to a pattern recognition algorithm for identifying common features of chronologically successive individual images. Individual images that show spatially coherent structures are concatenated by superimposing common image features in the course of an image conditioning procedure in order to produce a pseudo three-dimensional representation.
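The abstract's core idea — find common features in chronologically successive frames and superimpose them to concatenate the frames — can be illustrated with a minimal sketch. This is not the patented implementation; the function names and the greedy nearest-neighbour matcher below are illustrative assumptions, and a simple 2-D translation stands in for the full image conditioning procedure.

```python
import numpy as np

def match_features(desc_a, desc_b):
    """Pair each feature descriptor of frame A with its nearest
    neighbour (by squared Euclidean distance) among frame B's
    descriptors. Returns a list of (index_in_A, index_in_B) pairs."""
    pairs = []
    for i, da in enumerate(desc_a):
        d2 = np.sum((desc_b - da) ** 2, axis=1)  # distance squares to all of B
        pairs.append((i, int(np.argmin(d2))))
    return pairs

def estimate_shift(pts_a, pts_b, pairs):
    """Average displacement of matched feature points: the translation
    that superimposes the common features of two successive frames."""
    shifts = [pts_b[j] - pts_a[i] for i, j in pairs]
    return np.mean(np.asarray(shifts), axis=0)
```

In a full pipeline the per-frame shifts would be accumulated, optionally weighted by the capsule's measured path difference Δxm,n := xn − xm between recording positions, to paste spatially coherent frames into one pseudo three-dimensional mosaic.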
Representative Claims
What is claimed is: 1. A computer-assisted 3D imaging method for a wireless, capsule-type endoscope unit equipped with a video camera, comprising: recording images of surroundings of the endoscope unit; transmitting image data of the recorded images, in a wireless fashion, from the endoscope unit to at least one of a reception device and evaluation device; executing a pattern recognition algorithm for identifying substantially corresponding features of successive individual images of a recorded image sequence; carrying out an image processing procedure for the concatenation of individual images by superimposing the identified, substantially corresponding image features in order thereby to produce a pseudo three-dimensional representation of the surroundings of the endoscope unit; calculating distance squares (dij²:=d²(xMi, xRj)) between the image parameters, stored in the form of N-dimensional feature vectors (xMi), of recorded individual images with the image parameters, stored in the form of N-dimensional reference vectors (xRj), of images of diseased tissue structures from a reference image database by calculating the square of the Euclidean length (∥Δxij∥²) of their difference vectors (Δxij:=xMi−xRj); and determining the reference vectors (xRj) of the reference images whose distance squares (dij²) are a minimum in relation to the respective feature vectors (xMi) of the individual images to be examined. 2. The computer-assisted 3D imaging method as claimed in claim 2, wherein, with each i-th recording, the position of the endoscope unit is detected and transmitted together with the image data to the reception and evaluation device and is digitally stored therein, i being a whole number greater than or equal to one. 3. 
The computer-assisted 3D imaging method as claimed in claim 2, wherein at least one of the position and orientation of the capsule-type endoscope unit is detected and inserted into the pseudo three-dimensional representation visualized via a display device. 4. The computer-assisted 3D imaging method as claimed in claim 2, wherein, for concatenation of two individual images (m, n), use is made of the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n). 5. The computer-assisted 3D imaging method as claimed in claim 4, wherein the instantaneous recording positions (xm, xn) of the capsule-type endoscope unit are determined by evaluating X-ray pictures in which the endoscope unit is identifiable. 6. The computer-assisted 3D imaging method as claimed in claim 4, wherein the instantaneous recording positions (xm, xn) of the capsule-type endoscope unit are determined by evaluating the signal transit times (Tm, Tn) of the wireless image data transmission from the endoscope unit to the reception device. 7. The computer-assisted 3D imaging method as claimed in claim 1, wherein at least one of the position and orientation of the capsule-type endoscope unit is detected and inserted into the pseudo three-dimensional representation visualized via a display device. 8. The computer-assisted 3D imaging method as claimed in claim 7, wherein the pseudo three-dimensional representation of the surroundings of the endoscope unit visualized via the display device, is inspectable in the course of a virtual endoscopy by varying the viewing perspective with the aid of control signals of an input unit. 9. 
The computer-assisted 3D imaging method as claimed in claim 1, wherein different camera perspectives of the surroundings of the endoscope unit are displayed by navigating a cursor in a control window of an operator interface, represented on a display device, of a computer program. 10. The computer-assisted 3D imaging method as claimed in claim 9, wherein the navigation is performed by way of input parameters. 11. The computer-assisted 3D imaging method as claimed in claim 9, wherein the navigation is performed by way of input parameters including magnitude of an advancing movement in a direction of movement of the capsule-type endoscope unit, and magnitude of rotary movement about an axis pointing in the direction of movement. 12. The computer-assisted 3D imaging method as claimed in claim 9, wherein the pseudo three-dimensional representation of the surroundings of the endoscope unit visualized via the display device, is inspectable in the course of a virtual endoscopy by varying the viewing perspective with the aid of control signals of an input unit. 13. The computer-assisted 3D imaging method as claimed in claim 1, wherein the pseudo three-dimensional representation of the surroundings of the endoscope unit visualized via a display device, is inspectable in the course of a virtual endoscopy by varying the viewing perspective with the aid of control signals of an input unit. 14. The computer-assisted 3D imaging method as claimed in claim 1, wherein, for concatenation of two individual images (m, n), use is made of the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n). 15. 
The computer-assisted 3D imaging method as claimed in claim 14, wherein instantaneous recording positions (xm, xn) of the capsule-type endoscope unit are determined by evaluating X-ray pictures in which the endoscope unit is identifiable. 16. The computer-assisted 3D imaging method as claimed in claim 14, wherein instantaneous recording positions (xm, xn) of the capsule-type endoscope unit are determined by evaluating the signal transit times (Tm, Tn) of the wireless image data transmission from the endoscope unit to the reception device. 17. The computer-assisted 3D imaging method as claimed in claim 1, wherein, for concatenation of two individual images (m, n), use is made of the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n). 18. A wireless endoscope unit in the form of a swallowable capsule, comprising an integrated camera for recording a sequence of individual images; a transmitter for wireless transmission of image data of the recorded images to a reception device and evaluation device; and a permanent magnet, provided in the capsule, via which the endoscope unit is actively movable in a wireless fashion upon application of a temporally varying external magnetic field, wherein with each i-th recording, the position of the endoscope unit is detected and transmitted together with the image data to the reception device and evaluation device and is digitally stored therein, i being a whole number greater than or equal to one, and the evaluation device is configured to carry out an image processing procedure for concatenation of individual images, wherein the concatenation of two individual images (m, n), includes using the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of 
the unit for recording the two individual images (m, n). 19. A medical apparatus for recording and evaluating signals from a capsule-type endoscope unit, comprising: a reception unit for wireless reception of image information transmitted by the capsule-type endoscope unit; a computation unit for decoding the image data transmitted by the capsule-type endoscope unit and for carrying out an image conditioning process for producing a pseudo three-dimensional representation of received image information; and a display device for visualizing the conditioned image data, wherein the computation unit is configured to calculate distance squares (dij²:=d²(xMi, xRj)) between the image parameters, stored in the form of N-dimensional feature vectors (xMi), of recorded individual images with the image parameters, stored in the form of N-dimensional reference vectors (xRj), of images of diseased tissue structures from a reference image database by calculating the square of the Euclidean length (∥Δxij∥²) of their difference vectors (Δxij:=xMi−xRj) and determine the reference vectors (xRj) of the reference images whose distance squares (dij²) are a minimum in relation to the respective feature vectors (xMi) of the individual images to be examined. 20. The medical apparatus as claimed in claim 19, further comprising a magnet tube, including field coils for generating a stationary homogeneous magnetic field ({right arrow over (B)}0), and one gradient coil, each with an associated gradient amplifier for three Cartesian space coordinates x, y and z for locally changing the magnetic field in the ±x-, ±y- and/or ±z-directions. 21. The medical apparatus as claimed in claim 20, further comprising: a distributed arrangement of metal sensors for locating metal parts of the capsule-type endoscope unit; and a measuring sensor, connected to the sensor arrangement, including a transponder as an interface between the sensor arrangement and the computation unit. 22. 
The medical apparatus as claimed in claim 19, further comprising: a distributed arrangement of metal sensors for locating metal parts of the capsule-type endoscope unit; and a measuring sensor, connected to the sensor arrangement, including a transponder as an interface between the sensor arrangement and the computation unit. 23. A 3D imaging method for a wireless, capsule-type endoscope unit equipped with a video camera, comprising: identifying substantially corresponding features of successive individual images of a recorded sequence of images of surroundings of the endoscope unit; concatenating individual images by superimposing identified, substantially corresponding image features in order thereby to produce a pseudo three-dimensional representation of the surroundings of the endoscope unit; calculating distance squares (dij²:=d²(xMi, xRj)) between the image parameters, stored in the form of N-dimensional feature vectors (xMi), of recorded individual images with the image parameters, stored in the form of N-dimensional reference vectors (xRj), of images of diseased tissue structures from a reference image database by calculating the square of the Euclidean length (∥Δxij∥²) of their difference vectors (Δxij:=xMi−xRj); and determining the reference vectors (xRj) of the reference images whose distance squares (dij²) are a minimum in relation to the respective feature vectors (xMi) of the individual images to be examined. 24. The method of claim 23, wherein image data of the recorded images are transmitted, in a wireless fashion, from the endoscope unit to at least one of a reception device and evaluation device. 25. The method of claim 23, wherein a pattern recognition algorithm is executed to identify the substantially corresponding features of successive individual images. 26. 
A wireless endoscope unit in the form of a swallowable capsule, comprising a video camera for recording a sequence of individual images; a transmitter for wireless transmission of image data of the recorded images; and a permanent magnet, provided in the capsule, via which the endoscope unit is actively movable in a wireless fashion upon application of a temporally varying external magnetic field, wherein with each i-th recording, the position of the endoscope unit is detected and transmitted together with the image data to a reception and evaluation device and is digitally stored therein, i being a whole number greater than or equal to one, and the evaluation device is configured to carry out an image processing procedure for concatenation of individual images, wherein the concatenation of two individual images (m, n), includes using the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n). 27. 
A 3D imaging system for a wireless, capsule-type endoscope unit equipped with a video camera, comprising: means for identifying substantially corresponding features of successive individual images of a recorded sequence of images of surroundings of the endoscope unit; means for concatenating individual images by superimposing identified, substantially corresponding image features in order thereby to produce a pseudo three-dimensional representation of the surroundings of the endoscope unit; means for calculating distance squares (dij²:=d²(xMi, xRj)) between the image parameters, stored in the form of N-dimensional feature vectors (xMi), of recorded individual images with the image parameters, stored in the form of N-dimensional reference vectors (xRj), of images of diseased tissue structures from a reference image database by calculating the square of the Euclidean length (∥Δxij∥²) of their difference vectors (Δxij:=xMi−xRj); and means for determining the reference vectors (xRj) of the reference images whose distance squares (dij²) are a minimum in relation to the respective feature vectors (xMi) of the individual images to be examined. 28. 
A medical apparatus for recording and evaluating signals from a capsule-type endoscope unit, comprising: means for wireless reception of image information transmitted by the capsule-type endoscope unit; means for carrying out an image conditioning process for producing a pseudo three-dimensional representation of the received image data; means for calculating distance squares (dij²:=d²(xMi, xRj)) between the image parameters, stored in the form of N-dimensional feature vectors (xMi), of recorded individual images with the image parameters, stored in the form of N-dimensional reference vectors (xRj), of images of diseased tissue structures from a reference image database by calculating the square of the Euclidean length (∥Δxij∥²) of their difference vectors (Δxij:=xMi−xRj); and means for determining the reference vectors (xRj) of the reference images whose distance squares (dij²) are a minimum in relation to the respective feature vectors (xMi) of the individual images to be examined; and means for displaying the conditioned image data. 29. 
A wireless endoscope unit in the form of a swallowable capsule, comprising means for recording a sequence of individual images; means for wireless transmission of image data of the recorded images; and means, provided in the capsule, for actively moving the endoscope unit in a wireless fashion upon application of a temporally varying external magnetic field, wherein with each i-th recording, the position of the endoscope unit is detected and transmitted together with the image data to a means for reception and evaluation and is digitally stored therein, i being a whole number greater than or equal to one, and the means for reception and evaluation is for carrying out an image processing procedure for concatenation of individual images, wherein the concatenation of two individual images (m, n), includes using the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n). 30. 
A computer-assisted 3D imaging method for a wireless, capsule-type endoscope unit equipped with a video camera, comprising: recording images of surroundings of the endoscope unit; transmitting image data of the recorded images, in a wireless fashion, from the endoscope unit to at least one of a reception device and evaluation device; executing a pattern recognition algorithm for identifying substantially corresponding features of successive individual images of a recorded image sequence; and carrying out an image processing procedure for the concatenation of individual images by superimposing the identified, substantially corresponding image features in order thereby to produce a pseudo three-dimensional representation of the surroundings of the endoscope unit, wherein with each i-th recording, the position of the endoscope unit is detected and transmitted together with the image data to the reception and evaluation device and is digitally stored therein, i being a whole number greater than or equal to one, and for concatenation of two individual images (m, n), use is made of the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n). 31. 
A computer-assisted 3D imaging method for a wireless, capsule-type endoscope unit equipped with a video camera, comprising: recording images of surroundings of the endoscope unit; transmitting image data of the recorded images, in a wireless fashion, from the endoscope unit to at least one of a reception device and evaluation device; executing a pattern recognition algorithm for identifying substantially corresponding features of successive individual images of a recorded image sequence; and carrying out an image processing procedure for the concatenation of individual images by superimposing the identified, substantially corresponding image features in order thereby to produce a pseudo three-dimensional representation of the surroundings of the endoscope unit, wherein for concatenation of two individual images (m, n), use is made of the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n), and instantaneous recording positions (xm, xn) of the capsule-type endoscope unit are determined by evaluating X-ray pictures in which the endoscope unit is identifiable. 32. 
A computer-assisted 3D imaging method for a wireless, capsule-type endoscope unit equipped with a video camera, comprising: recording images of surroundings of the endoscope unit; transmitting image data of the recorded images, in a wireless fashion, from the endoscope unit to at least one of a reception device and evaluation device; executing a pattern recognition algorithm for identifying substantially corresponding features of successive individual images of a recorded image sequence; and carrying out an image processing procedure for the concatenation of individual images by superimposing the identified, substantially corresponding image features in order thereby to produce a pseudo three-dimensional representation of the surroundings of the endoscope unit, wherein for concatenation of two individual images (m, n), use is made of the path difference (Δxm,n:=xn-xm), covered by the capsule-type endoscope unit and loaded with a weighting factor, between the instantaneous recording positions (xm, xn) of the unit for recording the two individual images (m, n), and instantaneous recording positions (xm, xn) of the capsule-type endoscope unit are determined by evaluating the signal transit times (Tm, Tn) of the wireless image data transmission from the endoscope unit to the reception device.
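The distance-squares step that recurs in claims 1, 19, 23, 27 and 28 — compute dij² := ∥xMi − xRj∥² between each recorded image's feature vector and every reference vector of diseased tissue images, then pick the minimising reference — is a nearest-neighbour search in feature space. A minimal sketch under that reading (the function name and the NumPy broadcasting layout are illustrative, not from the patent):

```python
import numpy as np

def nearest_reference(x_m, x_r):
    """For every N-dimensional feature vector x_Mi (rows of x_m), compute
    the distance squares d_ij^2 = ||x_Mi - x_Rj||^2 to all reference
    vectors x_Rj (rows of x_r) and return, per recorded image, the index
    of the reference whose distance square is a minimum."""
    diff = x_m[:, None, :] - x_r[None, :, :]  # difference vectors Δx_ij, shape (M, R, N)
    d2 = np.sum(diff ** 2, axis=-1)           # squared Euclidean lengths, shape (M, R)
    return np.argmin(d2, axis=1), d2
```

For example, with feature vectors [[0, 0], [3, 4]] and reference vectors [[0, 1], [3, 3]], the first image's minimising reference is index 0 and the second's is index 1; flagging recorded frames whose minimum distance square falls below a threshold would mark candidate matches against the diseased-tissue database.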