| Country / Type | United States (US) Patent, granted |
|---|---|
| International Patent Classification (IPC, 7th ed.) | |
| Application number | UP-0446785 (2006-06-05) |
| Registration number | US-7831082 (2010-11-25) |
| Inventor / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation information | Times cited: 39 / Patents cited: 531 |
Apparatus and methods are disclosed for the calibration of a tracked imaging probe for use in image-guided surgical systems. The invention uses actual image data collected from an easily constructed calibration jig to provide data for the calibration algorithm. The calibration algorithm analytically develops a geometric relationship between the probe and the image so objects appearing in the collected image can be accurately described with reference to the probe. The invention can be used with either two or three dimensional image data-sets. The invention also has the ability to automatically determine the image scale factor when two dimensional data-sets are used.
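The geometric relationship the abstract describes — rotating, translating, and scaling image-space calibration points onto their tracked probe-space positions, including the automatic recovery of the image scale factor for two-dimensional data-sets — is, in essence, a least-squares similarity fit between matched point sets. The sketch below shows a minimal 2D version of such a fit in closed form; the function name and interfaces are illustrative, not taken from the patent.

```python
import math

def align_similarity_2d(image_pts, probe_pts):
    """Estimate the rotation, scale, and translation mapping image-space
    points onto matched probe-space points by least squares.
    Both arguments are equal-length lists of (x, y) pairs."""
    n = len(image_pts)
    # Centroids of both point sets.
    cix = sum(p[0] for p in image_pts) / n
    ciy = sum(p[1] for p in image_pts) / n
    cpx = sum(p[0] for p in probe_pts) / n
    cpy = sum(p[1] for p in probe_pts) / n
    # Accumulate cross terms of the centered coordinates.
    sxx = syy = sxy = syx = var_i = 0.0
    for (ix, iy), (px, py) in zip(image_pts, probe_pts):
        dix, diy = ix - cix, iy - ciy
        dpx, dpy = px - cpx, py - cpy
        sxx += dix * dpx
        syy += diy * dpy
        sxy += dix * dpy
        syx += diy * dpx
        var_i += dix * dix + diy * diy
    # Closed-form 2D rotation minimizing the point-to-point error.
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    # Scale factor (the automatic scale recovery the abstract mentions).
    scale = ((sxx + syy) * c + (sxy - syx) * s) / var_i
    # Translation carrying the rotated, scaled image centroid
    # onto the probe-space centroid.
    tx = cpx - scale * (c * cix - s * ciy)
    ty = cpy - scale * (s * cix + c * ciy)
    return theta, scale, (tx, ty)
```

Given four image points and the same points rotated by 90°, scaled by 2, and shifted by (1, −1) in probe space, the function recovers exactly those parameters.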
What is claimed:

1. A system for registering probe space to image space, comprising: an image collection platform operatively connected to the probe, the probe having a tracking marker affixed thereon; a calibration jig containing at least one calibration pattern and having a tracking marker affixed thereon; a three-dimensional position tracker operatively coupled to the image collection platform, comprising: a processor; a three-dimensional position sensor operatively coupled to the processor; a memory coupled to the processor, storing: an image of at least one calibration pattern; first instructions that when executed by the processor generate and store an image space reference position in the probe space, second instructions to locate an intersection point in each of a plurality of slices of an image from the image collection platform associated with the calibration pattern for each of the plurality of slices and extract a calibration point therefrom, identify an orientation of the intersection points in a first slice of the plurality of slices, compute a position component of the calibration point in calibration jig space by comparing distances from a center of one of the intersection points near the edges of each of the plurality of slices to the centers of the other two intersection points for each of the plurality of slices, and transform the positions of the calibration points described in calibration jig space to positions described in probe space.

2. The system of claim 1, wherein the image collection platform generates a two-dimensional image of the calibration pattern.

3. The system of claim 2, further comprising instructions that when executed by the processor: locate an intersection point in image space associated with each calibration pattern for each of a plurality of images and extract a calibration point therefrom; determine the position for at least one calibration point in probe space for each of the plurality of images; and relate the positions of the calibration points in image space and the positions of said calibration points in probe space.

4. The system of claim 3, further comprising instructions that when executed by the processor: identify an orientation of the intersection points in a first image of the plurality of images collected; compute a position component of the calibration point in calibration jig space by comparing the distances from one of the intersection points near each image edge to the other two intersection points for each image; and transform the positions of calibration points described in calibration jig space to positions described in probe space.

5. The system of claim 4, wherein the instruction that identifies an orientation requires no user intervention.

6. The system of claim 4, wherein the instruction that identifies an orientation requires user intervention.

7. The system of claim 4, further comprising instructions that when executed by the processor: receive positions of identical calibration points described in image space and described in probe space; rotate the calibration points described in image space to align with the calibration points described in probe space; compute centroids of the calibration points described in the rotated image space and the calibration points described in the probe space; translate the calibration points described in the rotated image space to the calibration points described in probe space; and adjust the scale of the calibration points described in the rotated and translated image space to minimize the point-to-point error with the calibration points described in probe space.

8. The system of claim 1, wherein: the probe comprises an ultrasonic transducer that generates and receives ultrasonic signals; and the image collection platform comprises a processing system which forms two-dimensional images from the ultrasonic signals received by the ultrasonic transducer.

9. The system of claim 1, wherein the image collection platform generates three-dimensional volumetric images of the calibration pattern.

10. The system of claim 9, further comprising instructions that when executed by the processor: extract the plurality of slices as two-dimensional slices from the volumetric image; determine the position for at least one calibration point in probe space for each of the plurality of slices; and relate the positions of the calibration points in slice space and the positions of said calibration points in probe space.

11. The system of claim 10, wherein the instruction that identifies an orientation requires no user intervention.

12. The system of claim 10, wherein the instruction that identifies an orientation requires user intervention.

13. The system of claim 10, further comprising instructions that when executed by the processor: receive positions of identical calibration points described in slice space and described in probe space; rotate the calibration points described in slice space to align with the calibration points described in probe space; compute centroids of the calibration points described in the rotated slice space and the calibration points described in the probe space; and translate the calibration points described in the rotated slice space to the calibration points described in probe space.

14. The system of claim 1, further comprising: a display coupled to the processor.

15. The system of claim 1, wherein said tracking marker includes at least one of an optical tracking marker, an infrared tracking marker, an electromagnetic tracking marker, an acoustic tracking marker, or combinations thereof.

16. The system of claim 1, wherein the tracking marker is a reflector, an emitter, or combinations thereof.

17. The system of claim 1, wherein the tracking sensor is an acoustic tracking sensor, an electromagnetic tracking sensor, an optical tracking sensor, or combinations thereof.

18. The system of claim 1, wherein the at least one calibration pattern is defined by an elongated member positioned relative to the calibration jig.

19. The system of claim 18, wherein the elongated member is a wire.

20. A system for registering probe space to image space, comprising: a tracking system operable to track a position of a calibration jig and a position of a probe; the probe operable to collect a plurality of images of at least one calibration pattern of the calibration jig; and a processor operable to execute instructions to: locate an intersection point in image space associated with the at least one calibration pattern for each of the plurality of images and extract a calibration point therefrom; determine a position for at least one calibration point in probe space for each of the plurality of images; relate the positions of the calibration points in image space and the positions of said calibration points in probe space; and determine an image space reference position in probe space.

21. The system of claim 20, further comprising: a memory system to store the image space reference position.

22. The system of claim 20, wherein the processor is further operable to: identify an orientation of the intersection points in a first image of the plurality of images collected; determine a position component of the calibration point in calibration jig space by comparing the distances from a center of one of the intersection points near each image edge to the centers of two other intersection points for each image; and transform the positions of calibration points described in calibration jig space to positions in probe space.

23. The system of claim 22, wherein the identification of orientation is performed automatically, specified manually, or combinations thereof.

24. The apparatus of claim 22, wherein the processor is further operable to: receive positions of identical calibration points described in image space and described in probe space; rotate the calibration points described in image space to align with the calibration points described in probe space; determine centroids of the calibration points described in the rotated image space and the calibration points described in the probe space; translate the calibration points described in the rotated image space to the calibration points described in probe space; and adjust the scale of the calibration points described in the rotated and translated image space to minimize the point-to-point error with the calibration points described in probe space.

25. The system of claim 20, wherein the probe includes an ultrasonic transducer operable to generate and receive ultrasonic signals; and further includes a probe processor to form two-dimensional images from the ultrasonic signals received by the ultrasonic transducer.

26. The system of claim 20, wherein the collection of a plurality of images of at least one calibration pattern with the probe includes: an ultrasound probe positioned relative to the calibration jig to image the calibration jig with the ultrasound probe; wherein the ultrasound probe is operable to direct into a jig space of the calibration jig.

27. The system of claim 20, further comprising: a calibration jig operable to include the at least one calibration pattern.

28. The system of claim 27, wherein the calibration pattern is defined by an elongated member.

29. The system of claim 28, wherein the elongated member is a wire.

30. The system of claim 20, wherein the tracking system includes at least one of a tracking marker, a tracking sensor, or combinations thereof.

31. The system of claim 30, wherein the tracking marker is a reflector, an emitter, or combinations thereof.

32. The system of claim 20, wherein the tracking system includes at least one of an optical tracking system, an infrared tracking system, an electromagnetic tracking system, an acoustic tracking system, a radiological tracking system, or combinations thereof.

33. A system for registering a probe space to image space, comprising: a tracking system to track a position of a calibration jig and a position of a probe; an imaging system to collect three-dimensional image data of at least one calibration pattern of the calibration jig; an image processor operable to extract a two-dimensional slice from the three-dimensional image data and locate an intersection point in two-dimensional slice space associated with at least one calibration pattern for the two-dimensional slice to extract a calibration point therefrom; wherein the probe is operable to generate the three-dimensional image data of the at least one calibration pole of the calibration jig; a probe processor operable to determine the position for at least one calibration point in probe space for each of the plurality of slices and relate the positions of the calibration points in two-dimensional slice space and the positions of said calibration points in probe space; and a reference processor operable to determine a reference position of the three-dimensional image data in probe space.

34. The system of claim 33, further comprising: a memory system to store the reference position of the three-dimensional image data.

35. The system of claim 33, wherein at least two of the imaging processor, the probe processor, the reference processor, or combinations thereof are a single processor.

36. The system of claim 33, wherein at least one of the imaging processor, the probe processor, the reference processor, or combinations thereof is operable to: extract a first two-dimensional slice from the three-dimensional image data; identify an orientation of the intersection points in the first two-dimensional slice; determine a position component of the calibration point in calibration jig space by comparing the distances from a center of one of the intersection points near the first two-dimensional slice edge to centers of two other intersection points for the first two-dimensional slice; and transform the positions of calibration points described in calibration jig space to positions described in probe space.

37. The system of claim 36, wherein a plurality of two-dimensional slices are extracted from the three-dimensional image data; an orientation of the intersection point of the plurality of two-dimensional slices is identified; and a position component of the calibration point in the calibration jig space is determined, in part, at least by comparing the distance of a center of one of the intersection points near an edge of each of the plurality of two-dimensional slices and a center of two other intersection points of each of the plurality of two-dimensional slices.

38. The system of claim 33, wherein at least one of the imaging processor, the probe processor, the reference processor, or combinations thereof is operable to: receive positions of identical calibration points described in two-dimensional slice space and described in probe space; rotate the calibration points described in two-dimensional slice space to align with the calibration points described in probe space; determine centroids of the calibration points described in the rotated two-dimensional slice space and the calibration points described in the probe space; and translate the calibration points described in the rotated two-dimensional slice space to the calibration points described in probe space.

39. The system of claim 33, wherein the imaging system includes an ultrasonic transducer operable to generate the three-dimensional image data.

40. The system of claim 33, wherein the tracking system includes a tracking marker, a tracking sensor, or combinations thereof.

41. The system of claim 40, further comprising: a calibration jig; a probe; wherein each of the calibration jig and the probe is operably interconnected with a tracking marker.

42. The system of claim 33, wherein the tracking system includes at least one of an optical tracking system, an infrared tracking system, an electromagnetic tracking system, an acoustic tracking system, a radiological tracking system, or combinations thereof.

43. A system to register probe space to image space, comprising: an imaging probe operable to define probe space that is imaged by the imaging probe; a calibration system operable to be imaged by the imaging probe; a tracking system operable to track the imaging probe and the calibration system; and a processor operable to determine a reference point in the probe space based in part upon image data produced by the imaging probe of the calibration system.

44. The system of claim 43, wherein the imaging probe is an ultrasound transducer, an x-ray system, a magnetic resonance imaging system, a microwave imaging system, an optical imaging system, or combinations thereof.

45. The system of claim 43, further comprising: a probe tracking marker operably interconnected with the imaging probe; a calibration system tracking marker operably interconnected with the calibration system; wherein the tracking system is operable to track the probe tracking marker, the calibration tracking marker, or combinations thereof.

46. The system of claim 45, wherein the probe tracking marker, the calibration tracking marker, or combinations thereof include at least one of an acoustic tracking marker, an electromagnetic tracking marker, an optical tracking marker, a radiological tracking marker, an infrared tracking marker, or combinations thereof.

47. The system of claim 43, wherein the tracking system includes at least one of an acoustic tracking system, an electromagnetic tracking system, an optical tracking system, a radiological tracking system, an infrared tracking system, or combinations thereof.

48. The system of claim 47, wherein the tracking system includes a camera system operable to image at least one of the probe tracking marker, the calibration tracking marker, or combinations thereof.

49. The system of claim 43, wherein said calibration system includes an imageable member operable to be imaged by the imaging probe.

50. The system of claim 49, wherein the imageable member includes an elongated member.

51. The system of claim 50, wherein the elongated member includes a wire.

52. The system of claim 49, wherein the imaging probe is operable to image the imageable member defined by the calibration system along the axis of the calibration system; wherein the tracking system is operable to track at least one of the imaging probe, the calibration system, or combinations thereof during the imaging process.

53. The system of claim 52, wherein the processor is operable to determine the reference point in probe space based upon the reference point in the image space of the image data produced by the imaging probe.

54. The system of claim 53, wherein the processor is further operable to determine a transform between the probe space and the calibration space of the calibration system based upon the determination of the reference point.

55. The system of claim 52, wherein the calibration system includes a plurality of the imageable members.

56. The system of claim 55, wherein the plurality of imageable members define a “Z” shape.

57. The system of claim 43, wherein the imaging system is a 2D imaging system, a 3D imaging system, or combinations thereof.

58. The system of claim 43, further comprising: a memory system operable to store image data that are produced by the imaging probe of the calibration system.

59. The system of claim 43, further comprising: a display system operable to display image data produced by the imaging probe.

60. The system of claim 43, wherein the calibration system includes an imageable member formed within a medium.

61. The system of claim 43, further comprising: a display system; wherein the display system is operable to display a calibrated image of a patient formed from the imaging probe.

62. The system of claim 61, wherein the display is operable to display image data defined in the probe space to determine a position of a portion of a patient relative to the probe.

63. The system of claim 43, wherein the processor is operable to execute computer instructions to: operate the imaging probe; operate the tracking system to track at least one of the imaging probe, the calibration system, or combinations thereof; calibrate the imaging probe with use of the calibration system; and combinations thereof.

64. The system of claim 63, wherein the processor executes the instructions to calibrate the probe space to the calibration space in part by tracking the position of the imaging probe and the calibration system; and determining a position within image space relative to probe space based on the calibration.
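Claims 22 and 36 recite locating a calibration point in jig space "by comparing the distances from a center of one of the intersection points ... to the centers of two other intersection points", and claim 56 recites wires forming a "Z" shape. This is consistent with the well-known Z-bar (Z-fiducial) construction, in which the image-plane distance ratio along the three wire cross-sections gives the position of the middle intersection along the diagonal wire. A minimal sketch of that ratio computation follows; the function name and argument layout are assumptions for illustration, not taken from the patent.

```python
import math

def z_fiducial_jig_position(p_outer1, p_mid, p_outer2, wire_start, wire_end):
    """Locate the middle wire intersection of a 'Z'-shaped calibration
    pattern in jig space.  p_outer1, p_mid, p_outer2 are the 2D image
    centers of the three wire cross-sections (outer points near the
    image edges, middle point on the diagonal wire); wire_start and
    wire_end are the known 3D jig-space endpoints of the diagonal wire."""
    d1 = math.dist(p_mid, p_outer1)
    d2 = math.dist(p_mid, p_outer2)
    # The straight diagonal wire preserves the distance ratio, so the
    # image-plane fraction equals the jig-space fraction along the wire.
    f = d1 / (d1 + d2)
    # Interpolate between the known endpoints in jig space.
    return tuple(a + f * (b - a) for a, b in zip(wire_start, wire_end))
```

With the middle cross-section one quarter of the way between the outer two in the image, the returned jig-space point lies one quarter of the way along the diagonal wire; per the claims, that jig-space point would then be transformed into probe space using the tracked poses.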
Copyright KISTI. All Rights Reserved.