IPC Classification Information
Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application No. | US-0308730 (2002-12-03)
Inventors / Address |
- Dietrich, Arne
- Schmidt, Hauke
- Hathout, Jean-Pierre
Applicant / Address |
Agent / Address |
Citation Info | Cited by: 134 / References cited: 21
Abstract
A mobile device for enhanced navigation and orientation including a visualization interface, a first sensor for providing signals indicative of a movement of the mobile device, a second sensor for providing further signals indicative of a movement of the mobile device, and a processor receiving signals from the first and second sensors, calculating a position and an orientation of the mobile device from the received signals, and generating a real time simulation of an environment via the visualization interface based on the position and orientation of the mobile device. According to an embodiment, the first and second sensors are implemented as an inertial sensor and a GPS receiver, respectively.
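The architecture the abstract describes (inertial dead-reckoning combined with a GPS receiver to compute position, then driving a real-time view) can be sketched in a few lines. This is a minimal illustrative sketch, not the patent's implementation: the function names and the 1-D simplification are assumptions.

```python
def dead_reckon(pos, vel, accel, dt):
    """One inertial-sensor step: integrate a (assumed constant) acceleration
    reading over dt to advance position and velocity."""
    vel_new = vel + accel * dt
    pos_new = pos + vel * dt + 0.5 * accel * dt * dt
    return pos_new, vel_new

def fuse(inertial_pos, gps_pos, gain=0.1):
    """Complementary correction: pull the drifting inertial estimate
    toward the (noisier but unbiased) GPS fix."""
    return inertial_pos + gain * (gps_pos - inertial_pos)

# Example: one integration step, then a correction toward a GPS fix.
pos, vel = dead_reckon(0.0, 0.0, 2.0, 1.0)   # -> (1.0, 2.0)
corrected = fuse(10.0, 12.0, gain=0.5)       # -> 11.0
```

In a full 3-D device the same idea runs per axis, with rotational-velocity integration supplying orientation for the rendered view.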
Representative Claims
1. A mobile device for enhanced navigation and orientation, comprising: a visualization interface; a first sensor for providing signals indicative of a movement of the mobile device including an inertial sensor; a second sensor for providing further signals indicative of a movement of the mobile device; and a processor receiving the signals from the first sensor and the second sensor, the processor calculating a three-dimensional position and a three-dimensional orientation of the mobile device from the received signals, and generating a real time simulation of an environment via the visualization interface based on the calculated position and orientation; wherein the processor uses signals from the inertial sensor to activate or deactivate user control operations.

2. The mobile device of claim 1, wherein the first sensor includes an inertial sensor.

3. The mobile device of claim 2, further comprising: means for receiving navigational information pertinent to a specific environment.

4. The mobile device of claim 3, wherein the navigational information pertains to at least one of fairs, exhibits, museums, conferences, public buildings and campuses.

5. The mobile device of claim 3, wherein the navigational information pertains to marine coastal areas and potential navigational hazards.

6. The mobile device of claim 3, wherein the navigational information pertains to aeronautical navigation.

7. The mobile device of claim 6, wherein the visualization interface is configured to display simulated aeronautical instrumentation as a backup to installed on-board aeronautical instrumentation.

8. The mobile device of claim 2, wherein the inertial sensor includes linear acceleration sensors that measure accelerations of the mobile device along x, y, and z directional axes, respectively.

9. The mobile device of claim 8, wherein the inertial sensor further includes rotational velocity sensors that measure rotational velocities of the mobile device about x, y, and z axes respectively.

10. The mobile device of claim 1, wherein by manual movement of the mobile device, sensed by the first and second sensors, a user controls a point of view, a location and an attitude in at least one of a two-dimensional and a three-dimensional simulation of the environment.

11. The mobile device of claim 1, wherein navigational information derived from the GPS receiver is used to determine a bias of the inertial sensor.

12. The mobile device of claim 1, wherein the first and second sensors include GPS receivers.

13. A method for navigating using a mobile device having a visual display, comprising: detecting a first set of signals indicative of movement of the mobile device; detecting a second set of signals indicative of movement of the mobile device; determining a position of the mobile device based on at least one of the first set of signals and the second set of signals; determining an orientation of the mobile device based on at least one of the first set of signals and the second set of signals; generating at least one of a two-dimensional and a three-dimension view of an environment based on the position and orientation of the mobile device; and controlling user functions of the mobile device based on the first set of signals.

14. The method of claim 13, further comprising: receiving local navigation image information; and generating the at least one of a two-dimensional and a three-dimension view of the environment based on the position and orientation of the mobile device using the received local navigation image information.

15. The method of claim 14, further comprising: navigating a virtual space using received local navigation image information that provides graphical details of the virtual space.

16. The method of claim 14, further comprising: receiving local maritime image information at a coastal area; and generating a map view of the coastal area and a three-dimensional view of the coast including hazards using the received local maritime image information.

17. The method of claim 13, wherein the first set of signals is generated by an inertial sensor.

18. The method of claim 17, wherein the first set of signals are representative of a linear acceleration of the mobile device in x, y, and z directional axes, and a rotational velocity of the mobile device about the x, y, and z axes.

19. The method of claim 13, further comprising: calibrating a position determined from the first set of signals provided by the inertial sensor using position information derived from the second set of signals provided by the GPS receiver.

20. The method of claim 13, further comprising: in an aeronautical transport vehicle, simulating on-board navigational instruments in a user interface using data derived from the first set of signals and the second set of signals.

21. A mobile device for enhanced navigation and orientation, comprising: a user interface having a visual display; an inertial sensor, the inertial sensor detecting a movement of the mobile device, the inertial sensor providing feedback signals for user control and for generating an image in the visual display and generating signals from which a location and an attitude of the mobile device are derived; a processor; and a means for receiving local navigation image information; wherein the processor computes a location and an attitude of the mobile device from the signals generated by the inertial sensor, and from the location, attitude, and information from the receiving means, the processor generates an image of a local environment, the image representing a depiction of the local environment viewed from the computed location and attitude of the mobile device and in response to particular signals received from the inertial sensor, the processor activates or deactivates controls of the user interface.

22. The mobile device of claim 21, wherein the inertial sensor includes linear acceleration sensors for sensing acceleration in x, y, and z directional axes, and rotational velocity sensors for sensing rotational velocity about the x, y, and z axes.

23. A visual navigation system comprising: a source of navigation information, the information including map information and image information, the source further including means for transmitting the navigation information; a mobile device including: means for receiving navigation information from the source, an inertial sensor, a GPS receiver, and a processor coupled to the inertial sensor, the GPS receiver, and the means for receiving, the processor including: a navigation module, the navigation module calculating a position of the mobile device using data from the inertial sensor and the GPS receiver, and generating map data using the received map information and the calculated position; and a user interface generator module, the user interface generator module calculating an orientation of the mobile device using data from the inertial sensor, generating a three-dimensional simulation using the position calculated by the navigation module, the calculated orientation and received image information, and controlling user functions in accordance with signals received via the inertial sensor.
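Claims 11 and 19 cover using GPS-derived position to determine a bias of the inertial sensor. One plausible way to do that (a hedged sketch; the estimator, its name, and the 1-D stationary-device setup are assumptions, not the patent's method) is to note that a constant acceleration bias b makes the dead-reckoned track drift by 0.5·b·t² relative to GPS, and fit that model by least squares:

```python
def estimate_accel_bias(inertial_pos, gps_pos, dt):
    """Fit the position error (inertial minus GPS) to the drift model
    err(t) = 0.5 * b * t^2 and return the bias estimate b.

    Least squares with the single basis function 0.5*t^2:
    b = sum(err * 0.5*t^2) / sum((0.5*t^2)^2)."""
    num = 0.0
    den = 0.0
    for k, (p_ins, p_gps) in enumerate(zip(inertial_pos, gps_pos)):
        t = k * dt
        basis = 0.5 * t * t
        num += (p_ins - p_gps) * basis
        den += basis * basis
    return num / den if den else 0.0

# Example: device is stationary per GPS, but a 0.04 m/s^2 accelerometer
# bias makes the inertial track drift quadratically.
dt = 0.1
gps = [0.0] * 50
ins = [0.5 * 0.04 * (k * dt) ** 2 for k in range(50)]
bias = estimate_accel_bias(ins, gps, dt)   # recovers ~0.04
```

Once estimated, the bias would be subtracted from subsequent accelerometer readings before integration, which is the calibration step claim 19 recites.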