System, method and computer program product for intuitive interactive navigation control in virtual environments
IPC Classification
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G06F-003/00
G06F-009/00
G06F-017/00
Application Number
US-0046179
(2002-01-16)
Inventor / Address
Hughes, David W.
Applicant / Address
Silicon Graphics, Inc.
Attorney / Address
Sterne, Kessler, Goldstein &
Citation Information
Cited by: 50
Patents cited: 5
Abstract
A system, method and computer program product is provided for interactive user navigation in a real-time 3D simulation. An assembly builder permits a user to build customized physics-based assemblies for user navigation in a variety of virtual environments. These assemblies are stored in a library and are then accessed by a navigation run-time module that runs in conjunction with, or as a part of, a visual run-time application. The navigation run-time module receives high-level user goal requests via a simple and intuitive user interface, converts them into a series of tasks, and then selects the appropriate assembly or assemblies to perform each task. As a result, complex navigation may be achieved. Once selected, an assembly provides a physics-based eye-point model for user navigation. Collisions between the assembly and objects in the simulation are resolved using a real-time physics engine, thus ensuring smooth, cinematic-style eye-point modeling in addition to real-time control.
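The run-time flow the abstract describes (goal request → series of tasks → assembly selection from a library) can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the class names, the goal-to-task mapping, and the selection rule are all assumptions.

```python
# Hypothetical sketch of the navigation run-time module: a high-level goal
# request is translated into tasks, and an assembly is chosen per task.
from dataclasses import dataclass, field


@dataclass
class Assembly:
    """A physics-based behavioral assembly stored in the library."""
    name: str
    suited_tasks: set  # task types this assembly can perform


@dataclass
class NavigationRuntime:
    library: dict = field(default_factory=dict)  # name -> Assembly

    def add(self, assembly):
        self.library[assembly.name] = assembly

    def translate(self, goal):
        # A goal request decomposes into a series of tasks; this
        # mapping is purely illustrative.
        return {"fly_to": ["ascend", "cruise", "descend"],
                "inspect": ["approach", "orbit"]}.get(goal, [goal])

    def select(self, task):
        # Pick the first assembly whose capabilities cover the task.
        for asm in self.library.values():
            if task in asm.suited_tasks:
                return asm
        return None

    def navigate(self, goal):
        # Pair each task with the assembly selected to perform it.
        return [(t, self.select(t)) for t in self.translate(goal)]


rt = NavigationRuntime()
rt.add(Assembly("helicopter", {"ascend", "cruise", "descend"}))
rt.add(Assembly("orbiter", {"approach", "orbit"}))
plan = rt.navigate("fly_to")
```

In this sketch the selected assembly would then drive the eye-point each frame; the patent leaves the selection policy open, so first-match is only one plausible choice.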
Representative Claims
What is claimed is: 1. A method for providing interactive user navigation in a real-time three dimensional simulation, comprising the steps of: combining physical elements from a predefined set of physical elements to construct a plurality of behavioral assemblies; storing said plurality of behavioral assemblies in a library; executing the real-time three dimensional simulation; selecting one of said plurality of behavioral assemblies from said library during execution of the simulation, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation in the simulation. 2. The method of claim 1, wherein said combining step comprises: presenting said predefined set of physical elements in a first window of a graphical user interface; and selectively combining physical elements from said first window in a second window of said graphical user interface to construct said plurality of behavioral assemblies. 3. The method of claim 2, wherein said combining step further comprises: adjusting a parameter of at least one of said selectively combined physical elements in a third window of said graphical user interface. 4. The method of claim 3, wherein said combining step further comprises: simulating the performance of said plurality of behavioral assemblies in a fourth window of said graphical user interface. 5. The method of claim 1, wherein said predefined set of physical elements includes at least one of a passive element, a constraint, an active element, or a resistive element. 6. 
The method of claim 1, wherein said selecting step comprises: identifying a goal request; translating said goal request into a plurality of tasks; and selecting one of said plurality of behavioral assemblies from said library to perform one of said plurality of tasks, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation during performance of said one of said plurality of tasks. 7. The method of claim 1, further comprising the steps of: detecting a collision between said selected one of said plurality of behavioral assemblies and an object in the real-time three dimensional simulation; and invoking a real-time physics engine to model the interaction between said selected one of said plurality of behavioral assemblies and said object. 8. A method for interactive user navigation in a real-time three dimensional simulation, comprising the steps of: combining physical elements from a predefined set of physical elements to construct a plurality of behavioral assemblies; storing said plurality of behavioral assemblies in a library; executing the real-time three dimensional simulation; and during execution of the real-time three dimensional simulation: generating a goal request, translating said goal request into a plurality of tasks, and selecting one of said plurality of behavioral assemblies from said library to perform one of said plurality of tasks, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation during performance of said one of said plurality of tasks. 9. 
A system for providing interactive user navigation in a real-time three dimensional simulation, comprising: an assembly builder, wherein said assembly builder includes a first interface that permits a user to combine physical elements from a predefined set of physical elements to construct a plurality of behavioral assemblies; a library that stores said plurality of behavioral assemblies; and a visual run-time application that executes the real-time three dimensional simulation, wherein said visual run-time application includes a second interface that receives a goal request from said user, and a navigation run-time module, said navigation run-time module configured to receive said goal request from said second interface and to select during said execution one of said plurality of behavioral assemblies from said library based on said goal request, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for interactive navigation in the simulation. 10. The system of claim 9, wherein said first interface comprises a first window that presents said predefined set of physical elements to said user and a second window that permits said user to selectively combine physical elements from said first window to construct said plurality of behavioral assemblies. 11. The system of claim 10, wherein said first interface further comprises a third window that permits said user to adjust a parameter of at least one of said selectively combined physical elements. 12. The system of claim 11, wherein said first interface further comprises a fourth window that permits said user to simulate the performance of said plurality of behavioral assemblies. 13. The system of claim 9, wherein said predefined set of physical elements includes at least one of a passive element, a constraint, an active element, or a resistive element. 14. 
The system of claim 9, wherein said navigation run-time module comprises: a goal interface, wherein said goal interface is configured to receive said goal request from said second interface and to translate said goal request into a plurality of tasks; and a task interactor, wherein said task interactor is configured to receive said plurality of tasks from said goal interface and to select one of said plurality of behavioral assemblies from said library to perform one of said plurality of tasks, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation during execution of said one of said plurality of tasks. 15. The system of claim 9, wherein said navigation run-time module comprises: a real-time physics engine; wherein said navigation run-time module is configured to detect a collision between said selected one of said plurality of behavioral assemblies and an object in the real-time three dimensional simulation and to invoke said real-time physics engine to model the interaction between said selected one of said plurality of behavioral assemblies and said object. 16. 
A system for providing interactive user navigation in a real-time three dimensional simulation, comprising: an assembly builder, wherein said assembly builder includes a first interface that permits a user to combine physical elements from a predefined set of physical elements to construct a plurality of behavioral assemblies; a library that stores said plurality of behavioral assemblies; and a visual run-time application that executes the real-time three dimensional simulation, wherein said visual run-time application includes a second interface that receives a goal request from said user, and a navigation run-time module, said navigation run-time module comprising: a goal interface, wherein said goal interface is configured to receive said goal request from said second interface and to translate said goal request into a plurality of tasks, and a task interactor, wherein said task interactor is configured to receive said plurality of tasks from said goal interface and to select one of said plurality of behavioral assemblies from said library to perform one of said plurality of tasks, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation during execution of said one of said plurality of tasks. 17. 
A computer program product comprising a computer useable medium having computer program logic recorded thereon for enabling a processor in a computer system to provide interactive user navigation in a real-time three dimensional simulation, said computer program logic comprising: first means for enabling the processor to combine physical elements from a predefined set of physical elements to construct a plurality of behavioral assemblies; second means for enabling the processor to store said plurality of behavioral assemblies in a library; third means for enabling the processor to execute the real-time three dimensional simulation; and fourth means for enabling the processor to select one of said plurality of behavioral assemblies from said library during execution of the simulation, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation in the simulation. 18. The computer program product of claim 17, wherein said first means includes: means for enabling the processor to present said predefined set of physical elements in a first interface window; and means for enabling the processor to present a second interface window, said second interface window permitting a user to selectively combine physical elements from said first interface window to construct said plurality of behavioral assemblies. 19. The computer program product of claim 18, wherein said first means further includes: means for enabling the processor to present a third interface window, said third interface window permitting said user to adjust a parameter of at least one of said selectively combined physical elements. 20. The computer program product of claim 19, wherein said first means further includes: means for enabling the processor to present a fourth interface window, said fourth interface window permitting said user to simulate the performance of said plurality of behavioral assemblies. 21. 
The computer program product of claim 17, wherein said predefined set of physical elements comprises at least one of a passive element, a constraint, an active element, or a resistive element. 22. The computer program product of claim 17, wherein said fourth means comprises: means for enabling the processor to receive a goal request; means for enabling the processor to translate said goal request into a plurality of tasks; and means for enabling the processor to select one of said plurality of behavioral assemblies from said library to perform one of said plurality of tasks, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation during execution of said one of said plurality of tasks. 23. The computer program product of claim 17, further comprising: means for enabling the processor to detect a collision between said selected one of said plurality of behavioral assemblies and an object in the real-time three dimensional simulation; and means for enabling the processor to invoke a real-time physics engine to model the interaction between said selected one of said plurality of behavioral assemblies and said object. 24. 
A computer program product comprising a computer useable medium having computer program logic recorded thereon for enabling a processor in a computer system to provide interactive user navigation in a real-time three dimensional simulation, said computer program logic comprising: first means for enabling the processor to combine physical elements from a predefined set of physical elements to construct a plurality of behavioral assemblies; second means for enabling the processor to store said plurality of behavioral assemblies in a library; third means for enabling the processor to execute the real-time three dimensional simulation; and fourth means for enabling the processor to, during execution of the real-time three dimensional simulation, receive a goal request, translate said goal request into a plurality of tasks, and select one of said plurality of behavioral assemblies from said library to perform one of said plurality of tasks, wherein said selected one of said plurality of behavioral assemblies provides a physics-based eye-point model for user navigation during performance of said one of said plurality of tasks.
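Claims 7, 15, and 23 recite detecting a collision between the selected assembly and a scene object and invoking a real-time physics engine to model the interaction, which is what keeps the eye-point motion smooth rather than snapping. A minimal per-frame sketch, assuming spherical bounding volumes and a toy impulse response; both are illustrative stand-ins for a real physics engine, not the patent's implementation:

```python
# Hypothetical collision step for the physics-based eye-point model.
import math


def detect_collision(pos_a, rad_a, pos_b, rad_b):
    """Sphere-vs-sphere overlap test for the assembly's bounding volume."""
    return math.dist(pos_a, pos_b) < rad_a + rad_b


def resolve(pos, vel, obstacle, rad, obs_rad, restitution=0.5):
    """Push the assembly out of the obstacle and reflect its velocity.

    A toy stand-in for the real-time physics engine the claims invoke.
    """
    offset = [p - o for p, o in zip(pos, obstacle)]
    n = math.hypot(*offset) or 1.0
    normal = [c / n for c in offset]
    # Move the assembly back to the contact surface.
    pos = [o + c * (rad + obs_rad) for o, c in zip(obstacle, normal)]
    # Remove the inward velocity component, scaled by restitution.
    v_dot_n = sum(v * c for v, c in zip(vel, normal))
    vel = [v - (1 + restitution) * v_dot_n * c for v, c in zip(vel, normal)]
    return pos, vel


# Eye-point assembly heading straight at an obstacle at the origin.
pos, vel = [0.0, 0.0, 2.0], [0.0, 0.0, -1.0]
if detect_collision(pos, 1.0, [0.0, 0.0, 0.0], 1.5):
    pos, vel = resolve(pos, vel, [0.0, 0.0, 0.0], 1.0, 1.5)
```

Because the response is continuous (position corrected, velocity damped rather than zeroed), the camera glances off obstacles, which is one way to obtain the "cinematic-style" eye-point behavior the abstract claims.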
Patents cited by this patent (5)
Ellery Y. Chan ; Timothy B. Faulkner, Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model.
Gantt Brian D., Method and system for interactively determining and displaying geometric relationship between three dimensional objects based on predetermined geometric constraints and position of an input device.
Berry Richard Edmond ; Martin Shirley Lynn ; Mullaly John Martin, Navigation in three-dimensional workspace interactive displays having virtual force fields associated with selected objects.
Harada Hiroaki (Kawasaki JPX) Aoyama Tatsuro (Kawasaki JPX), Viewpoint setting apparatus for a computer graphics system for displaying three-dimensional models.
Hilliges, Otmar; Kim, David; Izadi, Shahram; Molyneaux, David; Hodges, Stephen Edward; Butler, David Alexander, Augmented reality with direct user interaction.
Colvin, Richard Thurston; Ball, William Russell; Gauvin, Jennifer Lynn; Smith, Benjamin Joseph; Smith, Nathaniel Thomas; Smith, II, Thomas Roy, Computer based system for training workers.
Moyers, Josh; Leacock, Matthew; Brody, Paul J.; Van Wie, David; Butler, Robert J., Persistent network resource and virtual area associations for realtime collaboration.
Howard, Richard D.; Lopiccolo, Jarrod; Anderson, Grant; Lindauer, Roy; Helman, Thomas; Larsen, Matthew, User defined scenarios in a three dimensional geo-spatial system.