Bibliographic Information

Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): —
Application Number: UP-0206024 (filed 2008-09-08)
Registration Number: US-7648236 (granted 2010-02-22)
Inventor / Address: —
Applicant / Address: Motion Research Technologies, Inc.
Agent / Address: Maiorana, PC, Christopher P.
Citations: cited by 99 patents; cites 5 patents
Abstract
A multi-use eyeglass apparatus is disclosed. The apparatus generally comprises a frame, a display device, at least one sensor and a transceiver. The frame may have a plurality of lens openings defining an optical path. The display device may be mounted to the frame and configured to display an image to a user wearing the frame, wherein the image is located outside of the optical path. The at least one sensor is generally mounted to the frame and configured to sense a response from the user. The transceiver may be mounted to the frame and configured to (i) receive the image in a first receive message from external of the frame and (ii) transmit the response in a first transmit message to external of the frame.
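The abstract describes a two-way exchange: the frame-mounted transceiver (i) receives an image to display and (ii) transmits the user's sensed response. A minimal sketch of that flow; the class and message names are illustrative assumptions, since the patent does not define a concrete API or wire format:

```python
from dataclasses import dataclass

# Hypothetical message containers; the patent names only a "first
# receive message" (carrying the image) and a "first transmit message"
# (carrying the response), not their encoding.
@dataclass
class ReceiveMessage:
    image: bytes  # image shown to the user, outside the optical path

@dataclass
class TransmitMessage:
    response: str  # response sensed from the user by a frame sensor

class EyeglassTransceiver:
    """Sketch of the transceiver's two duties from the abstract."""

    def __init__(self, display, sensor):
        self.display = display  # callable that shows an image
        self.sensor = sensor    # callable that reads the user's response

    def on_receive(self, msg: ReceiveMessage) -> None:
        # (i) receive the image from external of the frame and display it
        self.display(msg.image)

    def poll_user(self) -> TransmitMessage:
        # (ii) package the sensed response for transmission
        return TransmitMessage(response=self.sensor())

# Usage: an external device pushes an image; the frame answers with a nod.
shown = []
rx = EyeglassTransceiver(display=shown.append, sensor=lambda: "nod")
rx.on_receive(ReceiveMessage(image=b"caller-id: 555-0100"))
reply = rx.poll_user()
```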
Representative Claims
The invention claimed is:

1. An apparatus comprising: a frame having a plurality of lens openings defining an optical path; a display device mounted to said frame and configured to display an image to a user wearing said frame, wherein said image is located outside of said optical path; a transceiver mounted to said frame and configured to (i) receive said image in a first receive message from a device external to said frame, (ii) receive a plurality of hands-free commands, and (iii) transmit said hands-free commands in a first transmit message to said device external to said frame; a first accelerometer mounted to said frame, wherein said first accelerometer is configured to sense movement of a head of said user in relation to a vertical axis, wherein said first accelerometer is configured to generate a first of said hands-free commands; a second accelerometer mounted to said frame, wherein said second accelerometer is configured to sense movement of the head of said user in relation to a horizontal axis, wherein said second accelerometer is configured to generate a second of said hands-free commands, wherein said first of said hands-free commands and said second of said hands-free commands are (i) used to control said apparatus and (ii) presented to said device external to said apparatus; and a haptic device mounted to said frame configured to provide said user with haptic notifications from said device external to said apparatus regarding conditions external to said apparatus.

2. The apparatus according to claim 1, further comprising (i) a clock mounted to said frame, wherein said image comprises a time, or (ii) a temperature sensor mounted to said frame, wherein said image comprises a temperature.

3. The apparatus according to claim 1, further comprising a speaker mounted to said frame and configured to generate a sound audible to said user.

4. The apparatus according to claim 1, further comprising a microphone mounted to said frame and configured to generate a third of said hands-free commands.

5. The apparatus according to claim 4, further comprising a processor mounted to said frame and configured to translate (i) said first receive message into said image and (ii) said hands-free commands into said first transmit message.

6. The apparatus according to claim 5, further comprising: a speaker configured to generate a sound in response to an audio signal, said sound being detectable by said user, wherein said microphone is configured to generate a speech signal from a speech of said user, said processor is further configured to (i) generate an answer signal in response to said speech signal, (ii) translate said answer signal into a second transmit message, (iii) translate said speech signal into a third transmit message and (iv) translate a second receive message into said audio signal, and said transceiver is further configured to (i) transmit said second transmit message, (ii) transmit said third transmit message and (iii) receive said second receive message.

7. The apparatus according to claim 1, wherein said display device comprises an adjustable back lighting source.

8. The apparatus according to claim 1, wherein said display device comprises passive ambient back lighting optics.

9. The apparatus according to claim 1, further comprising at least one battery disposed within said frame.

10. The apparatus according to claim 1, wherein said image is formed above a lens opening of said frame.

11. A method for interfacing a user with multi-use eyeglasses having a plurality of lens openings defining an optical path, comprising the steps of: (A) receiving (i) an image in a first receive message from a device external to said multi-use eyeglasses and (ii) a plurality of hands-free commands; (B) displaying said image to said user, wherein said image is located outside of said optical path; (C) transmitting said hands-free commands in a first transmit message to said device external to said multi-use eyeglasses; (D) using a first accelerometer mounted to said multi-use eyeglasses, wherein said first accelerometer is configured (i) to sense movement of a head of said user in relation to a vertical axis, and (ii) to generate a first of said hands-free commands; (E) using a second accelerometer mounted to said multi-use eyeglasses, wherein said second accelerometer is configured (i) to sense movement of the head of said user in relation to a horizontal axis, and (ii) to generate a second of said hands-free commands, wherein said first of said hands-free commands and said second of said hands-free commands are (i) used to control said multi-use eyeglasses and (ii) presented to said device external to said multi-use eyeglasses; and (F) using a haptic device mounted to said multi-use eyeglasses configured to provide said user with haptic notifications from said device external to said multi-use eyeglasses regarding conditions external to said multi-use eyeglasses.

12. The method according to claim 11, further comprising the steps of: receiving a caller identification for an incoming telephone call in a second receive message from a telephone; and displaying said caller identification to said user.

13. The method according to claim 11, further comprising the steps of: receiving a ring command for an incoming telephone call in a second receive message from a telephone; and generating a ringing sound audible to said user in a speaker in response to said ring command.

14. The method according to claim 11, further comprising the steps of: adjusting a particular signal in said multi-use eyeglasses based on said hands-free commands.

15. The method according to claim 11, further comprising the steps of: generating an answer signal from said hands-free commands, wherein said answer signal comprises an accept call indication in response to an affirmative type of said hands-free commands; and transmitting said answer signal in a second transmit message to a telephone.

16. The method according to claim 15, wherein said answer signal further comprises a reject call indication in response to a negative type of said hands-free commands.

17. The method according to claim 15, wherein said answer signal further comprises a reject call indication in response to a lack of said hands-free commands over a predetermined time.

18. The method according to claim 11, further comprising the steps of: generating a command signal in response to recognition of said hands-free commands; and transmitting said command signal in a second transmit message to said device external to said eyeglasses.

19. The method according to claim 11, wherein said image is for at least one of (i) a caller identification, (ii) an instant message, (iii) a global positioning system heading, (iv) a temperature, (v) a time, (vi) a gas detection notification, (vii) a compass heading, (viii) a PALM personal computer and (ix) a laptop computer.

20. An apparatus comprising: means for mounting a plurality of lens openings defining an optical path; means for receiving (i) an image in a first receive message from a device external to said apparatus, and (ii) a plurality of hands-free commands; means for displaying said image to a user, wherein said image is located outside of said optical path; means for transmitting said hands-free commands in a first transmit message to said device external to said apparatus; means for sensing movement of a head of said user in relation to a vertical axis, and generating a first of said hands-free commands; means for sensing movement of the head of said user in relation to a horizontal axis, and generating a second of said hands-free commands, wherein said first of said hands-free commands and said second of said hands-free commands are (i) used to control said apparatus and (ii) presented to said device external to said apparatus; and means for providing said user with haptic notifications from said device external to said multi-use eyeglasses regarding conditions external to said multi-use eyeglasses.
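Claim 1 (echoed by method claim 11) pairs two frame-mounted accelerometers: the first senses head movement in relation to a vertical axis, the second in relation to a horizontal axis, and each generates its own hands-free command. A sketch of that mapping, assuming a simple g-force threshold; the 0.5 g value and the command names are illustrative, not from the patent:

```python
# Threshold above which an axis reading counts as a deliberate head
# movement. The 0.5 g value is an assumption for illustration only.
MOVEMENT_THRESHOLD_G = 0.5

def hands_free_commands(vertical_axis_g: float, horizontal_axis_g: float):
    """Return the hands-free commands generated by the two accelerometers
    of claim 1: the first for vertical-axis movement, the second for
    horizontal-axis movement."""
    commands = []
    if abs(vertical_axis_g) > MOVEMENT_THRESHOLD_G:
        commands.append("FIRST_COMMAND")   # first accelerometer
    if abs(horizontal_axis_g) > MOVEMENT_THRESHOLD_G:
        commands.append("SECOND_COMMAND")  # second accelerometer
    return commands
```

Both commands can fire on the same reading, matching the claim's requirement that each accelerometer independently generates its own command.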
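Claims 15-17 define the call-answer behavior: an affirmative hands-free command accepts the call, a negative one rejects it, and the lack of any command over a predetermined time also rejects it. A sketch of that decision loop; the `next_command` callback, the command labels, and the 5-second default are assumptions for illustration:

```python
import time

ACCEPT_CALL = "accept"
REJECT_CALL = "reject"

def answer_signal(next_command, timeout_s: float = 5.0, now=time.monotonic):
    """Generate the answer signal of claims 15-17. `next_command()` is a
    hypothetical hook returning "affirmative", "negative", or None when
    nothing has been sensed yet; `now` is injectable for testing."""
    deadline = now() + timeout_s
    while now() < deadline:
        cmd = next_command()
        if cmd == "affirmative":
            return ACCEPT_CALL  # claim 15: accept call indication
        if cmd == "negative":
            return REJECT_CALL  # claim 16: reject call indication
    return REJECT_CALL  # claim 17: reject on lack of commands over time

# Usage with a fake clock so the timeout path is deterministic.
clock = [0.0]
def fake_now():
    return clock[0]

def silent():
    clock[0] += 1.0  # a second passes with no command sensed
    return None

accepted = answer_signal(lambda: "affirmative", now=fake_now)
timed_out = answer_signal(silent, timeout_s=3.0, now=fake_now)
```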