Smart necklace with stereo vision and onboard processing
IPC Classification
Country / Type
United States (US) Patent
Status: Granted
International Patent Classification (IPC, 7th ed.)
A61H-003/06
A44C-015/00
G01C-021/20
G01C-021/36
G08B-003/00
G08B-006/00
G08B-013/196
G08B-021/12
Application number
US-0562557 (2014-12-05)
Registration number
US-9629774 (2017-04-25)
Inventors / Address
Dayal, Rajiv
Moore, Douglas A.
Ota, Yasuhiro
Djugash, Joseph M. A.
Chen, Tiffany L.
Yamamoto, Kenichi
Applicant / Address
Toyota Motor Engineering & Manufacturing North America, Inc.
Agent / Address
Snell & Wilmer LLP
Citation information
Times cited: 3
Patents cited: 158
Abstract
A smart necklace includes a body defining at least one cavity and having a neck portion and first and a second side portions. The necklace includes a pair of stereo cameras that is configured to detect image data including depth information corresponding to a surrounding environment of the smart necklace. The necklace further includes a positioning sensor configured to detect positioning data corresponding to a positioning of the smart necklace. The necklace includes a non-transitory memory positioned in the at least one cavity and configured to store map data and object data. The smart necklace also includes a processor positioned in the at least one cavity, coupled to the pair of stereo cameras, the positioning sensor and the non-transitory memory. The processor is configured to determine output data based on the image data, the positioning data, the map data and the object data.
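The "depth information" a stereo pair yields comes from the standard pinhole-stereo relation Z = f·B/d (focal length in pixels, camera baseline, horizontal disparity). A minimal sketch of that relation — the function and parameter names are illustrative, not from the patent:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth in metres of a point matched across the two stereo cameras.

    Pinhole-stereo relation: Z = f * B / d, where f is the focal length
    in pixels, B the distance between the two cameras in metres, and d
    the horizontal disparity of the point between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return focal_px * baseline_m / disparity_px
```

For example, with a 700-pixel focal length and a 10 cm baseline, a 35-pixel disparity corresponds to a point about 2 m away; larger disparities mean closer objects.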
Representative Claims
1. A wearable electronic device to be worn around a neck of a user and for providing navigation, environmental awareness and social interaction functions to the user, the wearable electronic device comprising: a body defining at least one cavity and having a neck portion having a first end and a second end, a first side portion coupled to the first end, and a second side portion coupled to the second end; a camera positioned on a front surface of the first side portion or the second side portion and configured to detect image data corresponding to a surrounding environment of the wearable electronic device; a positioning sensor coupled to the body and configured to detect positioning data corresponding to a positioning of the wearable electronic device; a first button and a second button coupled to the body; a non-transitory memory positioned in the at least one cavity and configured to store map data and object data; and a processor positioned in the at least one cavity, coupled to the camera, the first button, the second button, the positioning sensor, and the non-transitory memory and configured to: determine navigation instructions based on the image data, the positioning data, and the map data in response to a depression of the first button, and determine an identity of an object based on the object data and the image data in response to a depression of the second button.

2. The wearable electronic device of claim 1 wherein: the first side portion includes: a first lower portion being rigid and substantially straight and having an upper end and a lower end, and a first middle portion having an upper end coupled to the first end of the neck portion and a lower end coupled to the upper end of the first lower portion, the first middle portion being flexible such that the first lower portion can become flush with a user body of the user when the wearable electronic device is worn; and the second side portion includes: a second lower portion being rigid and substantially straight and having an upper end and a lower end, and a second middle portion having an upper end coupled to the second end of the neck portion and a lower end coupled to the upper end of the second lower portion, the second middle portion being flexible such that the second lower portion can become flush with the user body when the wearable electronic device is worn.

3. The wearable electronic device of claim 2 further comprising an insert having an upper end configured to attach to the first end of the neck portion, a lower end configured to attach to the upper end of the first middle portion, and an electrical connection configured to electrically couple the first end of the neck portion to the upper end of the first middle portion.

4. The wearable electronic device of claim 2 wherein a width of the first lower portion near the lower end of the first lower portion is greater than a width of the first middle portion, and a width of the second lower portion near the lower end of the second lower portion is greater than a width of the second middle portion.

5. The wearable electronic device of claim 2 further comprising a third button, a fourth button, and a button portion having a notch, a top that is proximal to the first lower portion and a bottom that is distal to the first lower portion such that the top of the button portion is connected to the lower end of the first lower portion, the first button, the second button, the third button, and the fourth button being positioned on a front surface of the button portion such that the first button and the second button are positioned near the top on the front surface, the third button and the fourth button are positioned near the bottom on the front surface of the button portion, and the notch is positioned between the first button, the second button, the third button, and the fourth button.

6. The wearable electronic device of claim 1 further comprising a third button coupled to the processor and wherein the processor is further configured to: determine a first location of the wearable electronic device based on the image data, the positioning data, and the map data in response to a first depression of the third button; determine a second location of the wearable electronic device based on the image data, the positioning data, and the map data in response to a second depression of the third button; and determine navigation instructions from the second location to the first location based on the image data, the positioning data, and the map data in response to the second depression of the third button.

7. The wearable electronic device of claim 1 further comprising: a first speaker positioned on the first side portion and a second speaker positioned on the second side portion, the first speaker and the second speaker being coupled to the processor and configured to output stereo audio data based on the navigation instructions or the identity of the object; and a first vibration unit positioned on the first side portion and a second vibration unit positioned on the second side portion, the first vibration unit and the second vibration unit being coupled to the processor and configured to generate stereo haptic information based on the navigation instructions or the identity of the object.

8. The wearable electronic device of claim 7 wherein the first side portion defines a first output cavity such that the first speaker and the first vibration unit are positioned in the first output cavity and the second side portion defines a second output cavity such that the second speaker and the second vibration unit are positioned in the second output cavity.

9. The wearable electronic device of claim 1 wherein the positioning sensor includes: an inertial measurement unit (IMU) sensor coupled to the body and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable electronic device; and a global positioning system (GPS) sensor coupled to the body and configured to detect location data corresponding to a location of the wearable electronic device.

10. The wearable electronic device of claim 1 further comprising a battery configured to store energy and a power button coupled to the processor, wherein the processor is configured to switch the wearable electronic device between a power-on state and a power-off state based on a first type of depression of the power button and to determine a status of the battery in response to a second type of depression of the power button.

11. The wearable electronic device of claim 1 further comprising a light sensor coupled to the camera and configured to detect an ambient light, wherein a sensitivity of the camera is configured to be adjusted based on the ambient light.

12. The wearable electronic device of claim 1 further comprising a necklace battery configured to store energy and a first charging contact configured to receive power via inductive charging, the first charging contact being positioned on a back surface of the first side portion or the second side portion and coupled to the necklace battery such that the power received by the first charging contact may transfer to the necklace battery.

13. The wearable electronic device of claim 12 further comprising a portable charging unit having a charging battery configured to store energy, a strap, and a second charging contact coupled to the charging battery via the strap, the portable charging unit being configured to provide the power from the charging battery to the necklace battery via the first charging contact and the second charging contact.

14. The wearable electronic device of claim 1 further comprising a first magnet positioned near a back surface of the first side portion and a second magnet positioned near a back surface of the second side portion, the first magnet and the second magnet being configured to attach to magnetic connectors coupled to a strap or device.

15. The wearable electronic device of claim 1 wherein a width of each of the neck portion, the first side portion, and the second side portion is greater than a thickness of each of the neck portion, the first side portion, and the second side portion to increase comfort of the body when worn by the user.

16. A smart necklace to be worn by a user and to provide environmental awareness, social interaction and navigation instructions to the user, the smart necklace comprising: an upper portion having a first end, a second end and a bottom surface configured to rest on a neck of the user, the upper portion defining an upper cavity and being rigid, substantially straight in a middle of the upper portion and curved towards the first end and the second end; a first lower portion being rigid and substantially straight, the first lower portion defining a first cavity and having an upper end and a lower end; a first middle portion having an upper end coupled to the first end of the upper portion and a lower end coupled to the upper end of the first lower portion, the first middle portion being flexible such that the first lower portion can become flush with a user body of the user when the smart necklace is worn; a second lower portion being rigid and substantially straight, the second lower portion defining a second cavity and having an upper end and a lower end; a second middle portion having an upper end coupled to the second end of the upper portion and a lower end coupled to the upper end of the second lower portion, the second middle portion being flexible such that the second lower portion can become flush with the user body when the smart necklace is worn; a pair of stereo cameras positioned on a front surface of the first lower portion and configured to detect image data including depth information corresponding to a surrounding environment of the smart necklace; a positioning sensor configured to detect positioning data corresponding to a positioning of the smart necklace; a plurality of buttons positioned on a front surface of the second lower portion, the plurality of buttons including a first button, a second button and a third button; a non-transitory memory positioned in the upper cavity, the first cavity or the second cavity and configured to store map data and object data; a processor positioned in the upper cavity, the first cavity or the second cavity and coupled to the non-transitory memory, the plurality of buttons, the pair of stereo cameras, and the positioning sensor and configured to determine: navigation instructions based on the map data and at least one of the image data or the positioning data in response to a depression of the first button, an identity of an object based on the object data and the image data in response to a depression of the second button, and an association of a word or words with the image data and store the word or words and the image data in memory in response to a depression of the third button; and an output device coupled to the processor and configured to output output data corresponding to at least one of the navigation instructions, the identity of the object, or the association of the word or words with the image data.

17. The smart necklace of claim 16 wherein the output device includes: a first speaker positioned on the first middle portion and a second speaker positioned on the second middle portion, the first speaker and the second speaker being coupled to the processor and configured to output stereo audio data based on the output data; and a first vibration unit positioned on the first middle portion and a second vibration unit positioned on the second middle portion, the first vibration unit and the second vibration unit being coupled to the processor and configured to generate stereo haptic information based on the navigation instructions.

18. The smart necklace of claim 16 further comprising a microphone positioned on the first lower portion or the second lower portion and configured to detect speech data and wherein the positioning sensor includes: an inertial measurement unit (IMU) sensor positioned in the upper cavity, the first cavity or the second cavity and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the smart necklace; and a global positioning system (GPS) unit positioned in the upper cavity, the first cavity or the second cavity and configured to detect location data corresponding to a location of the smart necklace.

19. The smart necklace of claim 16 further comprising a light sensor coupled to at least one of the pair of stereo cameras and configured to detect an ambient light, wherein a sensitivity of the at least one of the pair of stereo cameras is configured to be adjusted based on the ambient light.

20. A smart necklace for providing environmental awareness and social interaction to a user, comprising: an upper portion having a first end, a second end and a bottom surface configured to rest on a neck of the user, the upper portion defining an upper cavity and being rigid, substantially straight in a middle of the upper portion and curved towards the first end and the second end; a button portion being rigid and substantially straight, the button portion having an upper end and a lower end; a first lower portion being rigid and substantially straight, the first lower portion defining a first cavity and having an upper end and a lower end coupled to the upper end of the button portion; a first middle portion having an upper end coupled to the first end of the upper portion and a lower end coupled to the upper end of the first lower portion, the first middle portion being flexible such that the button portion and the first lower portion can become flush with a user body of the user when the smart necklace is worn; a second lower portion being rigid and substantially straight, the second lower portion defining a second cavity and having an upper end and a lower end; a second middle portion having an upper end coupled to the second end of the upper portion and a lower end coupled to the upper end of the second lower portion, the second middle portion being flexible such that the second lower portion can become flush with the user body when the smart necklace is worn; a pair of stereo cameras positioned on a front surface of the first lower portion and configured to detect image data including depth information corresponding to a surrounding environment of the smart necklace; an inertial measurement unit (IMU) sensor positioned in the upper cavity, the first cavity or the second cavity and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the smart necklace; a plurality of buttons positioned on a front surface of the button portion; a non-transitory memory positioned in the upper cavity, the first cavity or the second cavity and configured to store map data and object data; a processor positioned in the upper cavity, the first cavity or the second cavity and coupled to the non-transitory memory, the plurality of buttons, the pair of stereo cameras, and the IMU sensor and configured to determine output data based on the inertial measurement data and the image data; and an output device coupled to the processor and configured to output the output data.
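Claim 1's processor is essentially a button-to-function dispatcher over shared sensor inputs: the first button triggers navigation from image, positioning, and map data, while the second triggers object identification from object and image data. A minimal sketch of that wiring, assuming illustrative names (`SensorSnapshot`, `make_dispatcher`) that do not appear in the patent:

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class SensorSnapshot:
    image_data: Any        # frames from the camera(s)
    positioning_data: Any  # IMU/GPS reading
    map_data: Any          # stored map data
    object_data: Any       # stored object descriptors

def make_dispatcher(navigate: Callable, identify: Callable) -> Dict[str, Callable]:
    """Map each button to the determination claim 1 assigns to it.

    First button: navigation instructions from image, positioning and
    map data. Second button: identity of an object from object and
    image data.
    """
    return {
        "first_button": lambda s: navigate(s.image_data, s.positioning_data, s.map_data),
        "second_button": lambda s: identify(s.object_data, s.image_data),
    }
```

A depression of a button then reduces to looking up its handler and applying it to the current snapshot; claims 6 and 16 extend the same pattern with a third button and extra handlers.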
Patents cited by this patent (158)
Chao, Hui; Das, Saumitra Mohan; Gupta, Rajarshi; Khorashadi, Behrooz; Sridhara, Vinay; Pakzad, Payam, Adaptive updating of indoor navigation assistance data for use by a mobile device.
Lynt Ingrid H. (7502 Toll Ct. Alexandria VA 22306) Lynt Christopher H. (7502 Toll Ct. Alexandria VA 22306), Apparatus for converting visual images into tactile representations for use by a person who is visually impaired.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Assisting a vision-impaired user with navigation based on a 3D captured image stream.
Janardhanan, Jayawardan; Dutta, Goutam; Tripuraneni, Varun, Attitude estimation for pedestrian navigation using low cost mems accelerometer in mobile applications, and processing methods, apparatus and systems.
Sawan, Mohamad; Harvey, Jean François; Roy, Martin; Coulombe, Jonathan; Savaria, Yvon; Donfack, Colince, Body electronic implant and artificial vision system thereof.
Kramer James P. (Stanford CA) Lindener Peter (E. Palo Alto CA) George William R. (Palo Alto CA), Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove.
Kurzweil, Raymond C.; Albrecht, Paul; Gashel, James; Gibson, Lucy; Lvovsky, Lev, Gesture processing with low resolution images with high resolution processing for optical character recognition for a reading machine.
Strub, Henry B.; Burgess, David A.; Johnson, Kimberly H.; Cohen, Jonathan R.; Reed, David P., Hybrid recording unit including portable video recorder and auxillary device.
Hirsch Hermann (Hirschstrasse 5 A-9021 Klagenfurt (Kärnten) AT) Pichler Heinrich (Sailerackergasse 38/2 A-1190 Wien (Österreich) AT), Information system.
Wellner, Pierre D. (GB); Flynn, Michael J. (GB); Carter, Kathleen A. (GB); Newman, William M. (GB), Interactive desktop display system for automatically adjusting pan and zoom functions in response to user adjustment of a feedback image.
Stanford Thomas H. (Escondido CA) Sahne Farhad Noroozi (San Diego CA) Riches Thomas P. (Temecula CA) O'Neill Robert (San Diego CA), Neck engageable transducer support assembly and method of using same.
Jung,Kyung Kwon; Chae,Yeon Sik; Rhee,Jin Koo, Object identification system combined with millimeter-wave passive image system and global positioning system (GPS) for the blind.
Holakovszky László (Beregszász u. 40/I. Budapest 1112 HU) Endrei Károly (Fehérvári út 86. Budapest 1119 HU) Kézi László (Zugligeti út 69. Budapest 1121 HU) Endrei Károlyné (Tárogató út 55. Budapest 1021 HU), Stereoscopic video image display appliance wearable on head like spectacles.
Dieberger, Andreas, System and method for non-visually presenting multi-part information pages using a combination of sonifications and tactile feedback.
Naick, Indran; Spinac, Clifford J.; Sze, Calvin L., Using a display associated with an imaging device to provide instructions to the subjects being recorded.
Lipton Lenny (San Rafael CA) Halnon Jeffrey J. (Richmond CA) Mitchell Larry H. (Cupertino CA) Hursey Robert (Carmel Valley CA), Wireless active eyewear for stereoscopic applications.