A method is provided for producing a projected useful form from an existing body delimited by a three-dimensional envelope, using a tool mounted on a machine that cooperates with a bifrequency, differential, kinematic, real-time global satellite positioning system such as GPS, the machine having at least one global positioning receiver so that it can be moved according to a theoretical model of the form.
Representative Claim
1. … variable signals indicating the field of view; and a processor coupling the master controller to the slave, the processor generating the slave actuator signals by mapping the input device in the controller workspace with the end effector in the working space according to a transformation, the processor deriving the transformation in response to the state variable signals of the imaging system, the linkages comprising joints and the state variable signals comprising joint configuration signals, the linkages coupled so that the processor derives the transformation in response to the joint configuration signals such that movement of an image of the end effector in a display appears substantially connected to the input device in the controller workspace; wherein if the image capture device is moved the state variables of the imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system.

2. The robotic system of claim 1, wherein the image capture device moves independently of the slave.

3. The robotic system of claim 2, wherein the processor derives the transformation so that an image of the end effector in the display appears substantially connected to the input device in the controller workspace.

4. The robotic system of claim 1, wherein the slave generates state variable signals responsive to a position of the end effector in the working space, the processor deriving the transformation using the state variable signals of the slave.

5. The robotic system of claim 4, wherein the processor determines a position and orientation of the input device in the master controller workspace from state variable signals of the master controller, wherein the processor determines a position and orientation of the end effector in the working space from the state variable signals of the slave, and wherein the processor generates the slave actuator signals by comparing the position and orientation of the input device and the end effector in a mapped space.

6. The robotic system of claim 4, wherein the slave comprises an elongate shaft supporting the end effector, wherein the image capture device comprises an endoscope having a distal end, and wherein the processor derives the transformation when the shaft and endoscope are inserted into the working space through arbitrary access sites so that the end effector and the distal end of the endoscope move by pivoting the shaft and the endoscope about the access sites.

7. The robotic system of claim 1, wherein the slave linkage is mounted to a slave base, and wherein the imaging system linkage is mechanically coupled to the slave base.

8. The robotic system of claim 1, wherein the imaging system linkage is mounted to a base of the slave, and wherein the slave base has wheels for transporting the end effector linkage and the image capture device.

9. The robotic system of claim 1, wherein the slave comprises a tool holder and a tool detachably mounted in the tool holder, the tool including the end effector and at least one joint, the robotic system further comprising a plurality of alternative kinematically dissimilar tools having alternative end effectors, the alternative tools mountable to the at least one tool holder in place of the tool, the processor capable of changing the transformation in response to a tool change signal.

10. The robotic system of claim 1, wherein the processor derives the transformation in real time.

11. The robotic system of claim 1, wherein the processor generates the slave actuation signals so that a change in angular orientation of the end effector remains within 5 degrees of a change in orientation of the input device.

12. A robotic system as claimed in claim 11, wherein an input device movement defines an input movement distance, and wherein the image of the end effector moves an output movement distance in response to the input movement distance, the output movement distance being significantly different than the input movement distance.

13. A robotic system comprising: a master controller having an input device movable in a controller workspace; a plurality of slaves having an end effector and at least one actuator operatively coupled to the end effector, the actuator moving the end effector in a workspace in response to slave actuator signals; an imaging system including an image capture device with a field of view movable in the workspace, the imaging system generating state variable signals indicating the field of view; and a processor coupling the master controller to the slave arm, the processor generating the slave actuator signals by mapping the input device in the controller workspace with the end effector in the workspace according to a transformation, the processor deriving the transformation in response to the state variable signals of the imaging system; wherein if the image capture device is moved the state variables of the imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system, the master controller selectively associatable with the slaves in response to a slave selection signal, wherein the processor changes the transformation in response to the slave selection signal so that movement of an image of the end effector of the selected slave as shown in a display substantially corresponds to movement of the input device in the workspace.

14. A robotic system comprising: a master controller having an input device movable in a controller workspace; a slave having an end effector and at least one actuator operatively coupled to the end effector, the actuator moving the end effector in a workspace in response to slave actuator signals; an imaging system including an image capture device with a field of view movable in the workspace, the imaging system transmitting an image to a display; and a processor coupling the master controller to an arm of the slave, the processor generating the slave actuator signals by mapping the input device in the controller workspace with the end effector in the workspace according to a transformation, the processor deriving the transformation so that an image of the end effector in the display appears substantially connected to the input device in the workspace; wherein if the image capture device is moved the state variables of the imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system; and wherein at least one of the slave, the master controller, and the imaging system comprise a linkage having a detachable connection disposed between joints so as to define a first linkage portion attached to a base and a second linkage portion attached to the first linkage portion, the linkage portions defining first and second coordinate reference systems, wherein the processor derives the transformation indirectly from a first relationship of the first coordinate system to the base of the first linkage system and from a second relationship of the second coordinate system to the first coordinate system.

15. The robotic system of claim 14, wherein the master controller comprises a linkage supporting the input device and the slave comprises a linkage supporting the end effector, the master linkage and the slave linkage being kinematically dissimilar.

16. The robotic system of claim 15, wherein joints of the master linkage and joints of the slave linkage have different degrees of freedom.

17. The robotic system of claim 14, wherein joints of the master linkage and joints of the slave linkage define different locations in the mapped space.

18. The robotic system of claim 14, wherein the processor calculates the transformation in response to a signal indicating at least one member of the group consisting of a movement of a camera, a decoupling and repositioning of one of the master and the slave relative to the other, a tool change mounting a different end effector on the slave, a change in scale of the mapping, manual movement of a passive joint of the master or slave, and association of the master with an alternative slave.

19. The robotic system of claim 14, wherein the slave senses non-visual sensory information at the workspace, the master controller presenting the non-visual information to an operator manipulating the input device.

20. The robotic system of claim 19, wherein the master controller presents the non-visual information to an operator by showing a graphical representation of the non-visual information with the display.

21. The robotic system of claim 20, wherein the non-visual information comprises force applied at the end effector.

22. The robotic system of claim 21, wherein the display represents the force as a member selected from the group consisting of a bar graph, a force vector, and an end effector color.

23. The robotic system of claim 19, wherein the master controller presents the non-visual information with an orientation correlating to the image.

24. The robotic system of claim 23, wherein the master controller comprises a plurality of actuators, the actuators providing tactile feedback to the operator manipulating the input device in response to master actuator signals generated by the processor.

25. The robotic system of claim 24, wherein the actuators apply loads against the input device in response to the master actuator signals, and wherein the processor generates the master actuator signals in response to a comparison between a position and orientation of the end effector and a position and orientation of the input device in the mapped space.

26. The robotic system of claim 23, wherein the non-visual information indicates forces and torques applied to the slave.

27. The robotic system of claim 26, wherein the non-visual information comprises forces and torques applied via the input device to a hand of a system operator so that the input device forces and torques substantially correspond to the slave forces and torques according to the image of the slave shown in the display.

28. A robotic system comprising: a master controller having an input device movable in a controller workspace; a slave having an end effector and at least one actuator operatively coupled to the end effector, the actuator moving the end effector in a workspace in response to slave actuator signals; an imaging system including an image capture device with a field of view movable in the workspace, the imaging system transmitting an image to a display; and a processor coupling the master controller to an arm of the slave, the processor generating the slave actuator signals by mapping the input device in the controller workspace with the end effector in the workspace according to a transformation, the processor deriving the transformation so that an image of the end effector in the display appears substantially connected to the input device in the workspace; wherein if the image capture device is moved the state variables of the imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system; and wherein the slave senses non-visual sensory information at the workspace, the master controller presenting the non-visual information to an operator manipulating the input device, wherein the non-visual information comprises force applied at the end effector, and wherein the master controller comprises a sound generator, the sound generator producing a sound varying in response to the force.

29. A robotic system comprising: a master controller having an input device movable in a controller workspace; a slave having an end effector and at least one actuator operatively coupled to the end effector, the actuator moving the end effector in a workspace in response to slave actuator signals; an imaging system including an image capture device with a field of view movable in the workspace, the imaging system transmitting an image to a display; and a processor coupling the master controller to an arm of the slave, the processor generating the slave actuator signals by mapping the input device in the controller workspace with the end effector in the workspace according to a transformation, the processor deriving the transformation so that an image of the end effector in the display appears substantially connected to the input device in the workspace; wherein if the image capture device is moved the state variables of the imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system; and wherein the master controller presents non-visual sensory information to a surgeon manipulating the input device, wherein the master controller presents the non-visual information with an orientation correlating to the image, wherein the non-visual information indicates forces and torques applied to the slave, and wherein a correlation between the non-visual information and the image is revised by the controller when the transformation is revised.

30. The robotic system of claim 29, wherein a pivotal joint between first and second grip members of the input device is substantially connected to a pivotal joint between first and second end effector elements.

31. A robotic method comprising: moving a master input device in a controller workspace by articulating a plurality of master joints; generating master joint signals responsive to the master joint configuration; moving an end effector in an end effector workspace by articulating a plurality of slave joints in response to slave motor signals; generating slave joint signals responsive to the slave joint configuration; calculating a position of the input device in the master controller space from the master joint signals; calculating the end effector position in the end effector workspace from the slave joint signals; mapping the end effector workspace with the controller workspace according to a transformation; generating the slave motor signals in response to a difference between the input device position and the end effector position in the mapped space; moving a field of view of an image capture device in the end effector workspace by articulating a plurality of image device joints; displaying an image of the field of view adjacent the master controller and generating image device signals responsive to the image device joint configuration; and revising the mapping in response to the image device signals so that movement of an image of the end effector shown in the display remains aligned with the movement of the input device in the master controller space.

32. A robotic system comprising: a master controller having an input device movable in a master controller space, the input device including first and second grip members for actuating with first and second digits of a hand of an operator; a slave having an end effector, the end effector moving in a workspace in response to slave actuator signals and including first and second end effector elements; and a processor coupling the master to the slave, the processor generating the slave actuator signals so that movement of the first and second grip members substantially map movement of the first and second end effector elements; wherein if a re-map signal is generated state variables of an imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system; and wherein a midpoint disposed between tips of the first and second grip members is substantially connected to a midpoint between tips of the end effector elements.

33. A robotic system comprising: a master controller having a handle supported by a plurality of joints so that the handle is movable in a master controller space, the joints defining a gimbal point of rotation about a plurality of axes, the handle disposed adjacent the gimbal point; and a slave having an end effector, the end effector moving in a workspace in response to movement of the handle; wherein if a re-map signal is generated state variables of an imaging system change and the input device is re-mapped with the end effector according to another transformation, the other transformation derived in response to the changed state variables of the imaging system.

34. The robotic system of claim 33, wherein the end effector is supported by a plurality of joints, a last joint being disposed adjacent the end effector, and further comprising a processor coupling the master to the slave, the processor generating slave actuator signals so that the gimbal point of the master is substantially connected to the last joint of the slave.

35. A robotic system comprising: a master controller having a handle, the handle moving in a master controller workspace; a slave supporting an end effector, the slave moving the end effector within a workspace in response to slave actuation signals; and a processor coupling the master to the slave, the processor generating the slave actuation signals so that movement of a mapping point along the handle of the master controller substantially maps movement of a mapping point along the end effector, the processor capable of changing at least one of the handle mapping point and the end effector mapping point.

36. The robotic system of claim 35, further comprising a display disposed adjacent the master, the display showing an image of the end effector with a magnification, wherein the at least one mapping point moves with changes in the magnification of the display.

37. A robotic method comprising: moving a master input device in a controller workspace by articulating a plurality of master joints; generating master joint signals responsive to the master joint configuration; moving an end effector in an end effector workspace by articulating a plurality of slave joints in response to slave motor signals; generating slave joint signals responsive to the slave joint configuration; calculating a position of the input device in the master controller space from the master joint signals; calculating the end effector position in the end effector workspace from the slave joint signals; mapping the end effector workspace with the controller workspace according to a transformation; and generating the slave motor signals in response to a difference between the input device position and the end effector position in the mapped space.

38. A robotic system comprising: a master controller having an input device movable in a controller workspace; a slave having an end effector and at least one actuator operatively coupled to the end effector, the actuator moving the end effector in a workspace in response to slave actuator signals; an imaging system including an image capture device with a field of view movable in the workspace, the imaging system generating state variable signals indicating the field of view; and a processor coupling the master controller to the slave arm, the processor generating the slave actuator signals by mapping the input device in the controller workspace with the end effector in the workspace according to a transformation, the processor deriving the transformation in response to the state variable signals of the imaging system; wherein the slave comprises a tool holder and a tool detachably mounted in the tool holder, the tool i…
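Claims 1, 13, and 37 all turn on the same mechanism: the processor maps the master input device to the end effector through a transformation derived from the imaging system's state variables, generates slave motor signals from the master/slave difference in the mapped space, and re-derives the transformation whenever the image capture device moves. The patent discloses no code, so the sketch below is only an illustration of that idea under stated assumptions: poses are 4x4 homogeneous matrices, the transformation is taken as the inverse camera pose (so master motion is interpreted in the display frame), and the function names and proportional gain are invented for this example.

```python
import numpy as np


def transform_from_camera(camera_pose_world: np.ndarray) -> np.ndarray:
    """Re-derive the claim's 'transformation' from the imaging system's
    state variables -- here simply the inverse of the camera's world pose,
    so positions are expressed in the camera (display) frame."""
    return np.linalg.inv(camera_pose_world)


def slave_motor_signal(master_pos_cam: np.ndarray,
                       end_effector_pos_world: np.ndarray,
                       camera_pose_world: np.ndarray,
                       gain: float = 1.0) -> np.ndarray:
    """Generate a proportional slave command from the difference between
    the input-device position and the end-effector position in the mapped
    (camera-frame) space, per the method of claim 37."""
    T = transform_from_camera(camera_pose_world)
    # Map the end effector into the camera frame (homogeneous coordinates).
    ee_cam = (T @ np.append(end_effector_pos_world, 1.0))[:3]
    # Difference in the mapped space drives the slave actuators.
    return gain * (master_pos_cam - ee_cam)
```

Because the transformation is recomputed from the camera pose on every call, moving the image capture device automatically re-maps the master to the end effector in the new camera frame, which is the behavior the "wherein if the image capture device is moved ..." clauses recite.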
Patents Cited by This Patent (7)
Bailey Scott E. ; Stratton Kenneth L., Apparatus and method for determining the position of a point on a work implement attached to and movable relative to a.
Gudat Adam J. (Edelstein IL) Henderson Daniel E. (Washington IL) Harrod Gregory R. (Peoria IL) Kleimenhagen Karl W. (Peoria IL), Method and apparatus for operating geography-altering machinery relative to a work site.
Gudat Adam J. (Edelstein IL) Henderson Daniel E. (Washington IL), Method and apparatus for real-time monitoring and coordination of multiple geography altering machines on a work site.
Gudat Adam J. ; Bradbury Walter J. ; Christensen Dana A. ; Kemner Carl A. ; Koehrsen Craig L. ; Kyrtsos Christos T. ; Lay Norman K. ; Peterson Joel L. ; Schmidt Larry E. ; Stafford Darrell E. ; Weinb, System and a method for enabling a vehicle to track a preset path.
Henderson Daniel E. ; Kleimenhagen Karl W. ; Koehrsen Craig L. ; Shetty Satish M., System and method for representing parameters in a work site database.