IPC Classification Information
Country / Type | United States (US) Patent, Registered
International Patent Classification (IPC, 7th edition) |
Application Number | US-0407420 (2012-02-28)
Registration Number | US-8716973 (2014-05-06)
Inventor / Address |
Applicant / Address |
Agent / Address |
Citation Information | Times cited: 2 / Cited patents: 60
Abstract
A haptic arm comprising: a user connection element; a reference; a first linkage connecting the user connection element to the reference, where the first linkage provides at least six independent degrees of freedom and contains an intermediate link, three force sensors, and three angle sensors; a second linkage connecting the intermediate link to the reference; a third linkage connecting the intermediate link to the reference; a fourth linkage connecting the intermediate link to the reference; the second, third, and fourth linkages each containing an independent actuator and position sensor.
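The abstract describes a specific sensing and actuation layout: a passive six-degree-of-freedom first linkage carrying three force sensors and three angle sensors, plus three powered linkages that each pair an actuator with a position sensor. A minimal data-model sketch of that layout follows; the class and field names are hypothetical, since the patent defines a mechanism, not a software API.

    # Minimal data-model sketch of the arm described in the abstract.
    # All class and field names are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PoweredLinkage:
        """One of the second/third/fourth linkages: an actuator plus a position sensor."""
        actuator_command: float = 0.0   # commanded actuator output
        joint_position: float = 0.0     # position-sensor reading for the powered DOF

    @dataclass
    class FirstLinkage:
        """Connects the user connection element to the intermediate link."""
        forces: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])  # three force sensors
        angles: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])  # three angle sensors

    @dataclass
    class HapticArm:
        first_linkage: FirstLinkage = field(default_factory=FirstLinkage)
        powered_linkages: List[PoweredLinkage] = field(
            default_factory=lambda: [PoweredLinkage() for _ in range(3)]
        )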
Representative Claims
1. A haptic arm comprising: a user connection element; a reference; a first linkage connecting said user connection element to said reference, said first linkage configured and arranged to provide at least six independent degrees of freedom between said user connection element and said reference; said first linkage comprising an intermediate link, three force sensors configured and arranged to sense forces applied by a user to said user connection element in three dimensions, and three angle sensors configured and arranged to sense a three dimensional orientation of said user connection element relative to said intermediate link; a second linkage connecting said intermediate link to said reference, said second linkage configured and arranged to power a first powered degree of freedom through a first actuator; said second linkage comprising a position sensor configured and arranged to measure said first powered degree of freedom; a third linkage connecting said intermediate link to said reference, said third linkage configured and arranged to power a second powered degree of freedom through a second actuator; said third linkage comprising a position sensor configured and arranged to measure said second powered degree of freedom; and a fourth linkage connecting said intermediate link to said reference, said fourth linkage configured and arranged to power a third powered degree of freedom through a third actuator; said fourth linkage comprising a position sensor configured and arranged to measure said third powered degree of freedom.
2. The haptic arm set forth in claim 1, wherein each of said actuators is mounted on said reference.
3. The haptic arm set forth in claim 2, wherein each of said actuators comprises an electric motor.
4. The haptic arm set forth in claim 1, wherein each of said force sensors comprises a strain gauge and each of said angle sensors comprises a potentiometer, resolver or encoder.
5. The haptic arm set forth in claim 4, wherein each of said force sensors is mounted on a force sensing link, and said force sensing link comprises openings adjacent each of said strain gauges configured and arranged to amplify strain sensed by said respective strain gauges.
6. The haptic arm set forth in claim 5, wherein said force sensing link and said intermediate link are the same.
7. The haptic arm set forth in claim 1, wherein each of said linkages is operatively configured and arranged such that gravity does not cause a torque on any of said actuators when said user connection element is in a starting position.
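Claims 1 and 4 pair three strain-gauge force sensors with three angle sensors (potentiometer, resolver, or encoder) on the first linkage. The sketch below shows one plausible read-out of those six signals into a user-applied force vector and an orientation of the user connection element relative to the intermediate link; the Z-Y-X Euler convention and the function names are assumptions, not taken from the claims.

    # Hedged sketch: turning the first-linkage sensor readings of claim 1 into
    # a 3-D force vector and a 3-D orientation. Euler convention is assumed.
    import math

    def orientation_matrix(yaw: float, pitch: float, roll: float):
        """3x3 rotation matrix built from three angle-sensor readings (radians)."""
        cy, sy = math.cos(yaw), math.sin(yaw)
        cp, sp = math.cos(pitch), math.sin(pitch)
        cr, sr = math.cos(roll), math.sin(roll)
        return [
            [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp,     cp * sr,                cp * cr],
        ]

    def read_user_connection_element(force_sensors, angle_sensors):
        """Return (force vector, orientation matrix) of the user connection element
        relative to the intermediate link."""
        fx, fy, fz = force_sensors        # three strain-gauge based force readings
        yaw, pitch, roll = angle_sensors  # three potentiometer/resolver/encoder readings
        return (fx, fy, fz), orientation_matrix(yaw, pitch, roll)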
8. A user interface comprising: a reference; a first tool interface having a first tool handle with at least three independent translational degrees of freedom and three independent orientation degrees of freedom between said first tool handle and said reference; said first tool interface comprising three force sensors configured and arranged to measure forces applied by a user to said first tool interface along said three independent translational degrees of freedom, three orientation sensors configured and arranged to measure said three independent orientation degrees of freedom, and three actuators configured and arranged to control position of each of said three independent translational degrees of freedom; a second tool interface having a second tool handle with at least six independent degrees of freedom between said second tool handle and said reference; said second tool handle interface comprising three position sensors and three orientation sensors configured and arranged to measure said six independent degrees of freedom between said second tool handle and said reference; a video display configured and arranged to provide a stereoscopic view co-located with said first tool handle and said second tool handle; and a computer system comprising a three dimensional virtual environment; said three dimensional virtual environment comprising a first virtual tool having a first virtual position and a first virtual orientation, a second virtual tool having a second virtual position and a second virtual orientation, and at least one virtual object having a third virtual position and a third virtual orientation; said computer system programmed to adjust said first virtual position as a function of said first tool interface force sensors, and to adjust said second virtual position as a function of said second tool interface position sensors and orientation sensors; said computer system programmed to control said actuators such that said position of said first tool handle is adjusted to correlate to changes in said virtual position of said first virtual tool, and said computer system programmed to provide said video display with images of said virtual environment.
9. The user interface set forth in claim 8, wherein said computer system is programmed to detect virtual collisions between said first virtual tool and said virtual object and wherein said adjustment of said first virtual position of said virtual tool is a function of said collisions at a first frequency.
10. The user interface set forth in claim 9, wherein said virtual object comprises a voxel model.
11. The user interface set forth in claim 10, wherein said voxel model comprises multiple voxel elements and each of said elements comprises a hardness parameter and a stiffness parameter.
12. The user interface set forth in claim 8, wherein said virtual object comprises a mesh model.
13. The user interface set forth in claim 8, wherein said virtual object comprises a virtual tooth model having finite elements and each of said elements comprises a tooth material type parameter.
14. The user interface set forth in claim 13, wherein said tooth material type parameter is selected from a group consisting of a cavity, dentin, pulp and enamel.
15. The user interface set forth in claim 8, wherein said first virtual tool comprises a modification region.
16. The user interface set forth in claim 8, wherein said first virtual tool comprises an interaction region.
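Claims 10-14 describe the virtual object as a voxel model whose elements carry hardness and stiffness parameters, with a tooth material type chosen from cavity, dentin, pulp, or enamel. A minimal sketch of such an element and a sparse voxel grid follows; the names and the dictionary layout are illustrative assumptions, not the patent's implementation.

    # Hedged sketch of the voxel data described in claims 10-14.
    from dataclasses import dataclass
    from enum import Enum

    class ToothMaterial(Enum):
        CAVITY = "cavity"
        DENTIN = "dentin"
        PULP = "pulp"
        ENAMEL = "enamel"

    @dataclass
    class VoxelElement:
        hardness: float           # resistance to removal by the modification region
        stiffness: float          # used when computing the interaction force
        material: ToothMaterial   # tooth material type parameter (claim 14)

    # Sparse voxel grid keyed by integer (i, j, k) coordinates.
    voxel_model = {
        (0, 0, 0): VoxelElement(hardness=0.2, stiffness=300.0, material=ToothMaterial.CAVITY),
        (0, 0, 1): VoxelElement(hardness=0.8, stiffness=1500.0, material=ToothMaterial.ENAMEL),
    }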
17. The user interface as set forth in claim 16, wherein said computer system is programmed to calculate an interaction force between said first virtual tool and said virtual object as a function of said interaction region, said first virtual tool position, and said virtual object position.
18. The user interface set forth in claim 17, wherein said interaction force is a function of a virtual object voxel element stiffness parameter.
19. The user interface as set forth in claim 17, wherein said first virtual tool comprises a virtual mass and said computer system is programmed to accelerate said first virtual tool with an acceleration generally equal to said force sensors outputs summed with said interaction force, and divided by said virtual mass.
20. The user interface set forth in claim 17, wherein said virtual object is modified as a function of a position of said modification region relative to said virtual object position.
21. The user interface set forth in claim 20, wherein said virtual object is modified as a function of said interaction force.
22. The user interface set forth in claim 8, and further comprising a foot pedal.
23. The user interface set forth in claim 22, wherein said foot pedal comprises a position sensor and a bias spring.
24. The user interface set forth in claim 23, wherein said virtual object is modified as a function of said foot pedal position sensor.
25. The user interface set forth in claim 8, wherein said second virtual tool comprises a virtual mirror surface.
26. The user interface set forth in claim 8, and further comprising a servo control module configured and arranged to control said actuators as a function of said position of said first virtual tool.
27. The user interface set forth in claim 26, wherein said computer system is programmed to detect virtual collisions between said first virtual tool and said virtual object, and wherein said adjustment of said first virtual position of said virtual tool is a function of said collisions at a first frequency, and wherein said servo control module operates at a frequency greater than said first frequency.
28. The user interface set forth in claim 26, wherein said first tool interface further comprises three position sensors and three tachometer sensors and said servo control module is configured and arranged to adjust an output as a function of said first tool position sensors and said tachometer sensors.
29. The user interface set forth in claim 26, wherein said servo control module comprises a kinematic model of said linkages.
30. The user interface set forth in claim 8, wherein said video display is configured and arranged to alternatively display a left eye image and a right eye image.
31. The user interface set forth in claim 8, wherein said video display is configured and arranged to simultaneously display a left eye image at a first polarization and a right eye image at a second polarization.
32. The user interface set forth in claim 8, wherein said video display is configured and arranged to display images which are true to scale.
33. The user interface set forth in claim 8, and further comprising a speaker.
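Claims 17-19 spell out the virtual-tool dynamics: an interaction force computed from the interaction region and, per claim 18, a voxel stiffness parameter, with the tool accelerated by the sensed user force plus the interaction force, divided by the virtual mass. The sketch below is one plausible one-dimensional reading of that update; the penalty-style force and explicit Euler integration are assumptions, not taken from the claims.

    # Hedged sketch of the dynamics in claims 17-19 (one dimension for brevity).
    def interaction_force(penetration_depth: float, stiffness: float) -> float:
        """Penalty force pushing the tool out of the virtual object (claim 18)."""
        return -stiffness * max(penetration_depth, 0.0)

    def step_virtual_tool(position, velocity, sensed_force, penetration_depth,
                          stiffness, virtual_mass, dt):
        """Advance the first virtual tool by one haptics update (e.g. dt = 1/1000 s)."""
        f_total = sensed_force + interaction_force(penetration_depth, stiffness)
        acceleration = f_total / virtual_mass   # claim 19: (sensed + interaction) / mass
        velocity += acceleration * dt
        position += velocity * dt
        return position, velocity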
34. A method of simulating a virtual dental surgery comprising the steps of: providing a first tool interface having a first tool handle with at least three independent translational degrees of freedom and three independent orientation degrees of freedom between said first tool handle and a reference; said first tool interface comprising three force sensors for measuring forces applied to said first tool interface along said three independent translational degrees of freedom, three orientation sensors for measuring said three independent orientation degrees of freedom, and three actuators for controlling each of said three independent translational degrees of freedom; providing a second tool interface having a second tool handle with at least six independent degrees of freedom between said second tool handle and said reference, said second tool handle interface comprising three position sensors and three orientation sensors for measuring said six independent degrees of freedom between said second tool handle and said reference; providing a video display arranged to produce a stereoscopic view co-located with said first tool handle and said second tool handle; providing a three dimensional virtual environment comprising a first virtual tool having a first virtual position and first virtual orientation, a second virtual tool having a second virtual position and second virtual orientation, and at least one virtual object having a third virtual position and third virtual orientation; adjusting said virtual position of said first virtual tool as a function of said first tool interface force sensors; adjusting said virtual position of said second virtual tool as a function of said second tool interface position sensors and said orientation sensors; controlling said actuators such that said first tool handle is adjusted to correlate to changes in said virtual position of said first virtual tool; and providing said video display with images of said virtual environment.
35. The method set forth in claim 34, wherein said function comprises accelerating said first virtual tool within said virtual environment in proportion to an output of said three force sensors.
36. The method set forth in claim 34, wherein said step of controlling said actuators comprises using a PID.
37. The method set forth in claim 34, wherein said second virtual tool is a virtual mirror.
38. The method set forth in claim 34, wherein said virtual object comprises a triangular mesh.
39. The method set forth in claim 34, wherein said first virtual object comprises an interaction region defined by an analytic function.
40. The method set forth in claim 39, and further comprising the step of detecting a collision between said first virtual tool interaction region and said virtual object.
41. The method set forth in claim 40, and further comprising the step of calculating an interaction force as a function of said collision and accelerating said first virtual tool as a function of said interaction force.
42. The method set forth in claim 41, and further comprising the step of modifying said virtual object as a function of said collision.
43. The method set forth in claim 42, and further comprising the step of providing a foot pedal interface having at least one degree of freedom.
44. The method set forth in claim 43, wherein said step of modifying said virtual object is a function of said foot pedal interface.
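Claim 36 states only that the actuators are controlled "using a PID". The sketch below is a textbook discrete PID loop servoing one axis toward the position correlated with the first virtual tool; the gains, the servo rate, and the class structure are assumptions, not taken from the patent.

    # Hedged sketch of the PID actuator control mentioned in claim 36.
    class PID:
        def __init__(self, kp: float, ki: float, kd: float, dt: float):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint: float, measurement: float) -> float:
            """Return the actuator command for one servo cycle."""
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Example: one axis servoed at an assumed 5 kHz toward the virtual-tool position.
    axis_pid = PID(kp=40.0, ki=5.0, kd=0.5, dt=1.0 / 5000.0)
    command = axis_pid.update(setpoint=0.010, measurement=0.008)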