Smart necklace with stereo vision and onboard processing
IPC Classification
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
G01C-021/36
H04N-013/02
G06F-003/01
A61H-003/06
G01C-021/20
G06F-003/16
G10L-015/00
G10L-017/00
G10L-015/22
Application Number
US-0480575
(2014-09-08)
Registration Number
US-10024679
(2018-07-17)
Inventors / Address
Moore, Douglas A.
Djugash, Joseph M. A.
Ota, Yasuhiro
Applicant / Address
TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
Agent / Address
Snell & Wilmer LLP
Citation Information
Cited by: 0
Patents cited: 172
Abstract
A wearable neck device includes an IMU coupled to the wearable neck device and adapted to detect inertial measurement data and a GPS coupled to the device and adapted to detect location data. The wearable neck device further includes a camera adapted to detect image data and a memory adapted to store data. The wearable neck device further includes a processor adapted to recognize an object in the surrounding environment by analyzing the data. The processor can determine a desirable action based on the data and a current time or day. The processor can determine a destination based on the determined desirable action. The processor can determine a navigation path based on the determined destination and the data. The processor is further adapted to determine output based on the navigation path. The wearable neck device further includes a speaker adapted to provide audio information to the user.
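The recognition step summarized above — matching a detected object's image features against stored reference features for an object category — can be sketched as follows. This is an illustrative example only: the function name, the Euclidean-distance metric, and the acceptance threshold are assumptions, not details specified in the patent.

```python
import math

def categorize(features, reference_categories, threshold=0.5):
    """Match an object's feature vector to the closest stored category.

    reference_categories maps a category name to its reference feature
    vector. Returns the best-matching category name, or None if no
    category is within the acceptance threshold (hypothetical metric).
    """
    best_name, best_dist = None, float("inf")
    for name, ref in reference_categories.items():
        dist = math.dist(features, ref)  # Euclidean distance between vectors
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

In the claimed device, a match like this would then drive the downstream steps: choosing a desirable event or action for the category, picking a destination, and computing a navigation path.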
Representative Claims
1. A wearable neck device for providing environmental awareness to a user, comprising: a left portion; a right portion; a flexible tube defining a cavity and connecting the left portion to the right portion, wherein at least one of the left portion, the right portion or the cavity houses one or more components including: an inertial measurement unit (IMU) sensor configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable neck device; a global positioning system (GPS) unit configured to detect location data corresponding to a location of the wearable neck device; at least one camera configured to detect image data corresponding to a surrounding environment of the wearable neck device that includes a first object or person and a second object; a memory configured to store object data regarding previously determined objects, a preferred distance between the wearable neck device and the first object or person, and previously determined user data associated with the user; a processor connected to the IMU sensor, the GPS unit and the at least one camera, and configured to: recognize the second object in the surrounding environment by analyzing the image data based on the stored object data and the location data including the location of the wearable neck device, compare a plurality of image features of the second object to a plurality of reference image features for an object category, the object category identifying a plurality of objects having the same plurality of reference image features, determine that the second object identifies with the object category based on the comparison, determine a desirable event or action based on the object category and the previously determined user data, determine a destination based on the determined desirable event or action, determine a navigation path for navigating the wearable neck device to the determined destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, determine a distance between the wearable neck device and the first object or person, determine that a current walking speed of the user should increase or decrease based on the preferred distance and the determined distance, and determine output data based on the determined navigation path; and a speaker or a vibration unit configured to provide to the user the output data that includes audio information or haptic information that indicates that the current walking speed of the user should increase or decrease.

2. The wearable neck device of claim 1 wherein the at least one camera includes a pair of stereo cameras configured to detect depth information.

3. The wearable neck device of claim 1 wherein the at least one camera includes a wide angle camera configured to detect the image data within a 120 degree field of view.

4. The wearable neck device of claim 1 wherein the memory and the processor are positioned within the cavity of the flexible tube.

5. The wearable neck device of claim 1 wherein the at least one camera includes a pair of stereo cameras positioned on either the left portion or the right portion and a wide angle camera positioned on the other of the left portion or the right portion.

6. The wearable neck device of claim 5 wherein the processor is further configured to: determine that one of the pair of stereo cameras or the wide angle camera is obstructed, and alert the user to the obstructed stereo camera or the obstructed wide angle camera.

7. The wearable neck device of claim 6 wherein the processor is further configured to disregard data received from the obstructed stereo camera or the obstructed wide angle camera.

8. The wearable neck device of claim 1 wherein the vibration unit includes a first vibratory motor coupled to the left portion and a second vibratory motor coupled to the right portion, the first vibratory motor and the second vibratory motor configured to provide stereo vibration data.

9. The wearable neck device of claim 1 further comprising a wireless communication antenna for establishing an audio or video communication with another portable electronic device or computer used by another person, wherein the processor is further configured to establish the audio or video communication based on the determined desirable event or action.

10. The wearable neck device of claim 1 further comprising a strap adapted to electrically and mechanically attach to the left portion and the right portion to synchronize components positioned on the left portion with components positioned on the right portion.

11. The wearable neck device of claim 1 further comprising a microphone configured to detect a speech of the user or another person, wherein the processor is further configured to: parse a conversation of the user or the another person into speech elements; analyze the speech elements based on the previously determined user data; and determine the desirable event or action further based on the analyzed speech elements.

12. A method for providing continuous social and environmental awareness to a user by a wearable neck device comprising: detecting, via a camera, a Global Positioning System (GPS) unit or an inertial measurement unit (IMU) sensor, inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable neck device, location data corresponding to a location of the wearable neck device or image data corresponding to a surrounding environment of the wearable neck device and a moving object or person in the surrounding environment; storing, in a memory, object data regarding one or more previously determined objects, a preferred distance between the wearable neck device and the moving object or person, and previously determined user data regarding the user; recognizing, by a processor, an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the location, positioning, velocity, or acceleration of the wearable neck device; comparing, by the processor, a plurality of image features of the object to a plurality of reference image features for an object category, the object category identifying a plurality of objects having the same plurality of reference image features; determining, by the processor, that the object identifies with the object category based on the comparison; determining, by the processor, a desirable event or action based on the object category and the previously determined user data; determining, by the processor, a destination based on the determined desirable event or action; determining, by the processor, a navigation path for navigating the wearable neck device to the determined destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data; determining, by the processor, a distance between the wearable neck device and the moving object or person; determining, by the processor, that a current walking speed of the user should increase or decrease based on the preferred distance and the determined distance; determining, by the processor, output data based on the determined navigation path; and providing, via a speaker or a vibration unit, the output data that includes audio information or haptic information that indicates that the current walking speed of the user should increase or decrease.

13. The method of claim 12 wherein providing the output data to the user further includes providing audio or haptic information based on divergence data between the object data and the image data.

14. The method of claim 13 wherein providing the audio or haptic information to the user includes providing stereo haptic information.

15. The method of claim 12 wherein storing the object data includes storing the object data in a remote database that can be accessed by other intelligent devices.

16. The method of claim 12 further comprising storing, in the memory, map data, wherein determining the navigation path includes determining the navigation path based on the image data, the map data, and at least one of the location data or the inertial measurement data.

17. The method of claim 12 further including transmitting, via an antenna, the image data and the at least one of the inertial measurement data or the location data to a remote device, wherein the processor is located on the remote device.

18. A wearable neck device for providing environmental awareness to a user, comprising: an inertial measurement unit (IMU) sensor coupled to the wearable neck device and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable neck device; a global positioning system (GPS) unit coupled to the wearable neck device and configured to detect location data corresponding to a location of the wearable neck device; a wide angle camera coupled to the wearable neck device; a pair of stereo cameras coupled to the wearable neck device, the pair of stereo cameras configured to detect depth information regarding a surrounding environment, the wide angle camera and the pair of stereo cameras being configured to detect image data corresponding to the surrounding environment of the wearable neck device and a moving object or person in the surrounding environment; a memory configured to store object data regarding previously determined objects, a preferred distance between the wearable neck device and the moving object or person, and previously determined user data associated with the user; a processor connected to the IMU sensor, the GPS unit, the wide angle camera and the pair of stereo cameras and configured to: recognize an object in the surrounding environment by analyzing the image data based on the stored object data and the location data including the location of the wearable neck device, compare a plurality of image features of the object to a plurality of reference image features for an object category, the object category identifying a plurality of objects having the same plurality of reference image features, determine that the object identifies with the object category based on the comparison, determine a desirable event or action based on the object category and the previously determined user data, determine a destination based on the determined desirable event or action, determine a navigation path for navigating the wearable neck device to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, determine a current distance between the wearable neck device and the moving object or person, determine that a current speed of the wearable neck device should increase when the current distance between the wearable neck device and the moving object or person is greater than the preferred distance, and determine that the current speed of the wearable neck device should decrease when the current distance between the wearable neck device and the moving object or person is less than the preferred distance; a speaker configured to provide audio information to the user based on at least one of the recognized object, the determined desirable event or action, the navigation path, or the determination that the current speed of the wearable neck device should increase or decrease; and at least one vibratory motor configured to provide haptic information to the user based on at least one of the recognized object, the determined desirable event or action, the determined navigation path, or the determination that the current speed of the wearable neck device should increase or decrease.

19. The wearable neck device of claim 18 further comprising a wireless communication antenna for establishing data communication with another portable electronic device or computer.
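Claim 18 states the pacing rule explicitly: signal the user to speed up when the distance to the followed object or person exceeds the stored preferred distance, and to slow down when it falls below it. A minimal sketch of that comparison follows; the function name and return values are illustrative assumptions, not part of the claimed device.

```python
def speed_guidance(current_distance: float, preferred_distance: float) -> str:
    """Return 'increase', 'decrease', or 'maintain' for the user's walking speed.

    Implements the comparison described in claim 18: the measured distance
    to the moving object or person is weighed against the preferred distance
    stored in the device's memory.
    """
    if current_distance > preferred_distance:
        return "increase"  # user is falling behind the followed object/person
    if current_distance < preferred_distance:
        return "decrease"  # user is closing in; slow down
    return "maintain"
```

In the claimed device, the result would be rendered as audio via the speaker or as haptic feedback via the vibratory motors, rather than returned as a string.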
Patents cited by this patent (172)
Chao, Hui; Das, Saumitra Mohan; Gupta, Rajarshi; Khorashadi, Behrooz; Sridhara, Vinay; Pakzad, Payam, Adaptive updating of indoor navigation assistance data for use by a mobile device.
Lynt Ingrid H. (7502 Toll Ct. Alexandria VA 22306) Lynt Christopher H. (7502 Toll Ct. Alexandria VA 22306), Apparatus for converting visual images into tactile representations for use by a person who is visually impaired.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Assisting a vision-impaired user with navigation based on a 3D captured image stream.
Janardhanan, Jayawardan; Dutta, Goutam; Tripuraneni, Varun, Attitude estimation for pedestrian navigation using low cost mems accelerometer in mobile applications, and processing methods, apparatus and systems.
Sawan, Mohamad; Harvey, Jean François; Roy, Martin; Coulombe, Jonathan; Savaria, Yvon; Donfack, Colince, Body electronic implant and artificial vision system thereof.
Kramer James P. (Stanford CA) Lindener Peter (E. Palo Alto CA) George William R. (Palo Alto CA), Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove.
Kurzweil, Raymond C.; Albrecht, Paul; Gashel, James; Gibson, Lucy; Lvovsky, Lev, Gesture processing with low resolution images with high resolution processing for optical character recognition for a reading machine.
Hanson Charles M. (Richardson TX) Koester Vaughn J. (Dallas TX) Fallstrom Robert D. (Richardson TX), Head mounted video display and remote camera system.
Strub, Henry B.; Burgess, David A.; Johnson, Kimberly H.; Cohen, Jonathan R.; Reed, David P., Hybrid recording unit including portable video recorder and auxiliary device.
Hirsch Hermann (Hirschstrasse 5, A-9021 Klagenfurt (Kärnten) ATX) Pichler Heinrich (Sailerackergasse 38/2, A-1190 Wien (Österreich) ATX), Information system.
Wellner Pierre D.,GBX ; Flynn Michael J.,GBX ; Carter Kathleen A.,GBX ; Newman William M.,GBX, Interactive desktop display system for automatically adjusting pan and zoom functions in response to user adjustment of a feedback image.
Kretsch Mary J. (Vallejo CA) Gunn Moira A. (San Francisco CA) Fong Alice K. (San Francisco CA), Method and system for measurement of intake of foods, nutrients and other food components in the diet.
Stanford Thomas H. (Escondido CA) Sahne Farhad Noroozi (San Diego CA) Riches Thomas P. (Temecula CA) O'Neill Robert (San Diego CA), Neck engageable transducer support assembly and method of using same.
Jung,Kyung Kwon; Chae,Yeon Sik; Rhee,Jin Koo, Object identification system combined with millimeter-wave passive image system and global positioning system (GPS) for the blind.
Holakovszky László (Beregszász u. 40/I. Budapest 1112 HUX) Endrei Károly (Fehérvári út 86. Budapest 1119 HUX) Kézi László (Zugligeti út 69. Budapest 1121 HUX) Endrei Károlyné (Tárogató út 55. Budapest 1021 HUX), Stereoscopic video image display appliance wearable on head like spectacles.
Dieberger, Andreas, System and method for non-visually presenting multi-part information pages using a combination of sonifications and tactile feedback.
Naick, Indran; Spinac, Clifford J.; Sze, Calvin L., Using a display associated with an imaging device to provide instructions to the subjects being recorded.
Lipton Lenny (San Rafael CA) Halnon Jeffrey J. (Richmond CA) Mitchell Larry H. (Cupertino CA) Hursey Robert (Carmel Valley CA), Wireless active eyewear for stereoscopic applications.