IPC Classification Information
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): (not listed)
Application Number: US-0463776 (2006-08-10)
Registration Number: US-7301648 (2007-11-27)
Inventor / Address: (not listed)
Applicant / Address: (not listed)
Agent / Address: (not listed)
Citation Information: Cited by 196 patents; cites 11 patents
Abstract
A new tracking technique is essentially "sourceless" in that it can be used anywhere with no set-up, yet it enables a much wider range of virtual environment-style navigation and interaction techniques than does a simple head-orientation tracker. A sourceless head orientation tracker is combined with a head-worn tracking device that tracks a hand-mounted 3D beacon relative to the head. The system encourages use of intuitive interaction techniques which exploit proprioception.
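The abstract's core idea, combining a sourceless head-orientation estimate with a head-relative measurement of a hand-mounted beacon, amounts to a frame transform: rotate the head-frame hand position by the head's orientation to obtain a world-aligned (though still head-centered) hand position. The sketch below is an illustrative reconstruction, not the patented implementation; the function names and the Z-Y-X Euler-angle convention are assumptions.

```python
import numpy as np

def euler_to_matrix(yaw, pitch, roll):
    """Body-to-world rotation matrix from Z-Y-X (yaw, pitch, roll)
    Euler angles in radians, as a sourceless orientation tracker
    (e.g., an IMU with compass) might report."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def hand_in_world(head_orientation, hand_in_head):
    """Rotate the head-relative beacon position (from the head-worn
    position tracker) into a world-aligned, head-centered frame."""
    R = euler_to_matrix(*head_orientation)
    return R @ np.asarray(hand_in_head, dtype=float)
```

With the head level and facing the reference heading, the transform is the identity; turning the head 90 degrees rotates the reported hand vector accordingly, which is what lets head-referenced hand tracking drive world-stabilized interaction.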
Representative Claims
What is claimed is:

1. A method comprising: mounting a sourceless orientation tracker on a user's head; using a position tracker comprising a radiated energy detector to track a position of a first localized feature associated with a body part of the user other than the head relative to the user's head; and generating data representative of the tracked position.
2. The method of claim 1, further comprising mounting a virtual reality display on the user's head that contains one or more objects.
3. The method of claim 2, further comprising using said tracked position to display in the virtual reality display an interaction of said body part with an object of said one or more objects.
4. The method of claim 3, wherein said interaction comprises virtual direct manipulation of said object by the user.
5. The method of claim 3, wherein said interaction comprises a scaled-world grab.
6. The method of claim 3, wherein said interaction comprises interacting with tools hidden on the user's body.
7. The method of claim 3, wherein said interaction comprises interaction with menus on the user's body.
8. The method of claim 3, wherein said object includes a second body part, and wherein displaying said interaction comprises displaying a relative position between said body part and said second body part.
9. The method of claim 3, further comprising, in response to the user virtually grabbing an object displayed in the virtual reality display, moving the user toward the object in the virtual reality display.
10. The method of claim 3, wherein the virtual reality display has a frame of reference, and further comprising determining a change in position of the user's head and, in response to said change in position, changing the viewpoint of the virtual reality display relative to the frame of reference.
11. The method of claim 10, wherein determining a change in position comprises determining a change in the position of the user's head relative to said body part.
12. The method of claim 2, wherein said virtual reality display comprises a fly-through virtual environment.
13. The method of claim 12, further comprising controlling flight speed based on said position.
14. The method of claim 12, further comprising controlling flight direction based on said position.
15. The method of claim 12, wherein tracking said position comprises tracking a position of both hands of the user, and further comprising using the tracked positions to control one of flight direction or flight speed.
16. The method of claim 1, further comprising using signals obtained from said sourceless orientation tracker to compute a distance traveled by said user in a virtual reality environment, and generating data representative of such distance.
17. The method of claim 1, further comprising: (a) providing a virtual reality display having a frame of reference; (b) displaying in said virtual reality display an object associated with said body part; (c) providing an input mechanism for receiving an input from said user; (d) operating said virtual reality display in a first mode comprising, in response to a change in said tracked position, displaying a change in the apparent position of said object relative to said frame of reference; and (e) in response to an input from said input device, operating said virtual reality display in a second mode comprising, in response to a change in said tracked position, displaying a constant apparent position of said object relative to said frame of reference.
18. The method of claim 17, wherein, in said second mode, in response to a change in said tracked position, the viewpoint of said virtual reality display changes relative to said frame of reference.
19. The method of claim 1, further comprising placing a portable anchor beacon in a fixed location and using signals received from said beacon to track a position and an orientation of both said head and said body part and to generate data representative of such positions and orientations.
20. The method of claim 1, further comprising providing a head mounted display including a body-stabilized information cockpit and displaying data to a user using such display.
21. The method of claim 20, wherein said information cockpit comprises a clear windshield.
22. The method of claim 21, further comprising, in response to user selection of an object of the one or more objects, displaying an information display window in the head mounted display.
23. The method of claim 22, wherein said information cockpit comprises a clear windshield, and further comprising fixing said information display window to said clear windshield.
24. The method of claim 20, wherein said information cockpit comprises one or more objects.
25. The method of claim 24, further comprising using said tracked position to determine that the user has selected an object of the one or more objects.
26. The method of claim 24, further comprising modifying the appearance of an object of the one or more objects in response to a change in said tracked position.
27. The method of claim 26, wherein modifying the appearance of the object comprises changing the apparent distance of the object from the user in the display.
28. The method of claim 27, wherein the body part is the user's hand, and wherein the change in said tracked position results from the user virtually manipulating the object.
29. The method of claim 28, wherein said information cockpit includes a clear windshield, and further comprising attaching said object to said windshield by virtually manipulating said object.
30. The method of claim 26, wherein said object is a cursor.
31. The method of claim 30, wherein said change in said tracked position comprises a component in a plane, and wherein the appearance of the cursor is modified in response to said change by moving it a distance based on magnitude and direction of said planar component.
32. The method of claim 20, wherein said information cockpit comprises one or more virtual instruments.
33. The method of claim 20, wherein said information cockpit comprises a virtual rearview mirror.
34. The method of claim 33, wherein said virtual rearview mirror may be repositioned by the user.
35. The method of claim 20, further comprising providing in said display indicia of a route toward a destination.
36. The method of claim 35, further comprising using a GPS receiver associated with said user to receive position data and using said position data to determine the position of said indicia in said display.
37. The method of claim 20, further comprising detecting a predefined hand gesture of the user and, in response to said hand gesture, resetting the heading direction of said cockpit.
38. The method of claim 1, further comprising sequentially positioning said localized feature at a first and then a second location, using said position tracker to determine positions of said first and second locations, computing a distance between said positions, and generating data representative of such distance.
39. The method of claim 1, wherein said radiated energy detector comprises an acoustic detector.
40. A method comprising: mounting a first sourceless orientation tracker on a user's head; mounting a second sourceless orientation tracker on a body part of the user other than the user's head; and utilizing angular rate and linear acceleration signals from said first and second trackers to derive a differential inertial signal representative of a motion of the body part relative to the head.
41. The method of claim 40, further comprising using signals from said first tracker to obtain a sourceless measurement of the orientation of the user's head.
42. The method of claim 41, further comprising using signals from said first and second trackers to track both the position and orientation of the body part.
43. The method of claim 42, further comprising using relative range measurements between said head and said body part to correct drift in said tracking of the position and orientation of the body part.
44. The method of claim 43, further comprising providing signals to a haptic feedback device based on said tracked position or said tracked orientation.
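Claims 40 through 43 describe deriving a differential inertial signal from two sourceless trackers, head and body part, with relative range measurements correcting the drift that accumulates when the signal is integrated. A minimal sketch of the differencing and integration steps follows; the function names are hypothetical, and the rotational lever-arm and Coriolis terms a full implementation would need are deliberately omitted.

```python
import numpy as np

def differential_accel(R_head, a_head, R_hand, a_hand):
    """Express both accelerometer readings (specific force) in a shared
    world frame, using each tracker's body-to-world rotation matrix,
    and subtract. Gravity and any motion common to both sensors cancel
    in the difference, leaving relative acceleration of the body part."""
    return R_hand @ np.asarray(a_hand, dtype=float) - \
           R_head @ np.asarray(a_head, dtype=float)

def integrate_relative(pos, vel, rel_accel, dt):
    """Double-integrate the differential signal (simple Euler step) to
    propagate the head-relative position of the body part. Drift from
    this integration is what the claimed range measurements correct."""
    vel = vel + rel_accel * dt
    pos = pos + vel * dt
    return pos, vel
```

Because both accelerometers sense the same gravity vector, the subtraction removes it without needing an absolute reference, which is what makes the pair "sourceless"; the price is unbounded drift in the double integral, hence the range-based correction of claim 43.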