System for monitoring individuals as they age in place
IPC Classification Information
Country / Type
United States (US) Patent
Granted
International Patent Classification (IPC, 7th edition)
A61B-005/11
A61B-005/00
A61B-003/11
A61B-005/0402
A61B-005/0476
A61B-005/103
A61B-005/1171
A61B-005/16
A61B-007/04
G09B-005/00
A61B-005/1455
G06K-009/00
G06K-009/62
G08B-021/04
A63B-024/00
G09B-005/06
G09B-019/00
G06F-019/00
A61F-002/76
A61B-005/0205
A61B-005/024
A61B-005/053
A61B-005/08
G02C-011/00
Application Number
US-0562454
(2014-12-05)
Registration Number
US-9795324
(2017-10-24)
Inventors / Address
Sales, Jay William
Klosinski, Jr., Richard Chester
Workman, Matthew Allen
Murphy, Meghan Kathleen
Steen, Matthew David
Applicant / Address
Vision Service Plan
Agent / Address
Brient Globerman, LLC
Citation Information
Times cited: 1
Patents cited: 109
Abstract
A computer-implemented method, and related system, for monitoring the wellbeing of an individual by providing eyewear that includes at least one sensor for monitoring the motion of the user. In various embodiments, the system receives data generated by the at least one sensor, determines the user's movements from the received data, and compares the user's movements to previously established movement patterns of the user. If the system detects one or more inconsistencies between the user's current movements and the previously established movement patterns, the system may notify the user or a third party of the detected inconsistencies. The system may similarly monitor a user's compliance with a medical regime and notify the user or a third party of the user's compliance with the regime.
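The anomaly-detection loop the abstract describes (establish a baseline movement pattern from an initial data set, compare later sensor readings against it, and notify a recipient on inconsistencies) could be sketched roughly as follows. This is an illustrative sketch only: the function names, the mean/standard-deviation baseline, and the 3-sigma threshold are assumptions, not details taken from the patent.

```python
# Illustrative sketch of baseline-vs-current movement comparison.
# The statistical model (mean +/- stdev, 3-sigma cutoff) is an assumption.
from statistics import mean, stdev

def build_baseline(samples):
    """Summarize an established movement pattern as (mean, stdev)."""
    return mean(samples), stdev(samples)

def detect_inconsistencies(baseline, current, threshold=3.0):
    """Return readings more than `threshold` standard deviations from baseline."""
    mu, sigma = baseline
    return [x for x in current if abs(x - mu) > threshold * sigma]

def notify(recipient, inconsistencies):
    """Alert the user or a third party when inconsistencies are detected."""
    if inconsistencies:
        print(f"Alert to {recipient}: {len(inconsistencies)} anomalous readings")

# First set of data establishes the pattern; second set is checked against it.
baseline = build_baseline([10.0, 10.5, 9.8, 10.2, 9.9])
anomalies = detect_inconsistencies(baseline, [10.1, 25.0])
notify("caregiver", anomalies)
```

In the claims the baseline is user-defined (recorded on a pre-defined command from the user) rather than statistically derived, but the compare-and-notify structure is the same.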
Representative Claims
1. A computer-implemented method of monitoring the wellbeing of an individual, the method comprising the steps of: a. providing a user with computerized eyewear comprising at least one sensor for monitoring the motion of the user; b. receiving, by one or more processors, a pre-defined command from the user, for the at least one sensor to generate a first set of data identifying one or more user-defined movement patterns for the user; c. in response to receiving the pre-defined command, collecting, by one or more processors, via the at least one sensor, the first set of data; d. using, by one or more processors, the first set of data to determine one or more typical movement patterns of the user; e. generating, by one or more processors, an established one or more user-defined movement patterns for the user based on the first set of data generated by the at least one sensor after the user with the computerized eyewear provides the pre-defined command for the at least one sensor to generate the first set of data identifying the one or more user-defined movement patterns for the user; f. receiving, by one or more processors, a second set of data generated by the at least one sensor after the established one or more user-defined movement patterns have been generated; g. at least partially in response to receiving the second set of data generated by the at least one sensor, determining, by one or more processors, the user's movements using the received second set of data; h. at least partially in response to determining the user's movements, comparing, by one or more processors, the user's movements based on the second set of data to the previously established one or more user-defined movement patterns for the user; i. detecting, by one or more processors, one or more inconsistencies between the current user's determined movements as compared to the previously established one or more user-defined movement patterns; j. at least partially in response to detecting the one or more inconsistencies, notifying, by one or more processors, at least one recipient of the one or more inconsistencies, where the at least one recipient is a recipient selected from a group consisting of: the user and a third party.

2. The computer-implemented method of claim 1, wherein the at least one sensor comprises at least one sensor selected from a group consisting of: a. a motion sensor; b. an accelerometer; c. a gyroscope; d. a geomagnetic sensor; e. a global positioning system sensor; f. an impact sensor; g. a microphone; h. a forward facing camera; i. a heart rate monitor; j. a pulse oximeter; k. a blood alcohol monitor; l. a respiratory rate sensor; and m. a transdermal sensor.

3. The computer-implemented method of claim 2, wherein the at least one sensor comprises at least one sensor selected from a group consisting of: a motion sensor, an accelerometer, a global positioning sensor, a gyroscope, and a forward facing camera.

4. The computer-implemented method of claim 2, wherein the method further comprises the steps of: a. calculating, by a processor, a number of steps taken by the user in a particular day; b. at least partially in response to calculating the number of steps, comparing, by a processor, the calculated number of steps taken by the user in the particular day to a predetermined average number of steps taken by the user in a day; and c. at least partially in response to comparing the calculated number of steps to the predetermined average number of steps, notifying the user or a third party if the calculated number of steps in the particular day is less than a predetermined percentage of the predetermined average number of steps taken by the user in a day.

5. The computer-implemented method of claim 2, further comprising the steps of: a. detecting, by a processor, whether the user moves during a predefined time period; and b. at least partially in response to detecting whether the user moves during the predefined time period, notifying, by a processor, the at least one recipient selected from a group consisting of: the user or a third party if the user does not move during the predefined time period.

6. The computer-implemented method of claim 2, further comprising the steps of: a. detecting, by a processor, from the received data generated by the at least one sensor if the user experiences a sudden acceleration or sudden impact; and b. at least partially in response to detecting that the user has experienced a sudden acceleration or sudden impact, notifying, by a processor, the user or a third party that the user experienced the sudden acceleration or sudden impact.

7. The computer-implemented method of claim 2, further comprising the steps of: a. detecting, by a processor, from the received data generated by the at least one sensor: (1) whether the user is breathing; and (2) whether the user's heart is beating; and b. at least partially in response to determining that the user is not breathing or that the user's heart is not beating, sending a notification to a third party.

8. The computer-implemented method of claim 2, further comprising the steps of: a. receiving, by a processor, from the user or third party, a medicine regime associated with the user; b. storing, by a processor, the medicine regime in memory; c. receiving, by a processor, data generated by a forward facing camera associated with the computerized eyewear; d. analyzing, by a processor, the received data to determine data selected from a group consisting of one or more: i. types of medicine taken by the user; ii. times the medicine is taken by the user; and iii. doses of the medicine taken by the user; e. at least partially in response to analyzing the received data, comparing, by a processor, the one or more of the types of medicine taken, the one or more times the medicine is taken, or the one or more doses of medicine taken to the stored medicine regime for the user; f. at least partially in response to comparing the one or more of the type of medicine taken, the time the medicine is taken and the dose of medicine taken, identifying, by a processor, one or more inconsistencies between the stored medicine regime, and the one or more types of medicine taken, the one or more times the medicine is taken, or the one or more doses of medicine taken; g. at least partially in response to identifying the one or more inconsistencies between the medicine regime and the one or more of the types of medicine taken, the one or more times the medicine is taken, or the one or more doses of medicine taken, sending an alert to the user or a third party of the one or more inconsistencies.

9. The computer-implemented method of claim 8, wherein: a. the data generated comprises one or more images captured by the forward facing camera; b. the step of analyzing the received data further comprises: i. detecting, by a processor, one or more pills in the one or more images; ii. comparing, by a processor, the one or more detected pills found in the one or more images to one or more known images of pills stored in a database; iii. identifying, by a processor, the one or more pills by matching the one or more pills from the one or more images to the one or more known images of pills stored in the database; and iv. detecting, by a processor, a time that the one or more images were taken.

10. A computer-implemented method of monitoring the wellbeing of an individual, the method comprising the steps of: a. providing a user with computerized eyewear comprising at least one sensor for monitoring actions taken by a user, the at least one sensor including a forward-facing camera that is a component of the computerized eyewear and is configured to automatically capture image data; b. receiving, by a processor, a medicine regime associated with the user; c. receiving, by a processor, data generated by the at least one sensor including image data automatically captured by the forward-facing camera of the computerized eyewear; d. analyzing, by a processor, the received data generated by the at least one sensor to determine data that identifies: i. types of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear; ii. times the medicine is taken by the user; and iii. doses of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear; e. at least partially in response to analyzing the received data generated by the at least one sensor, comparing, by a processor, the medicine regime for the user to the determined types of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear, times the medicine is taken by the user, and doses of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear; f. detecting, by a processor, one or more inconsistencies between the medicine regime associated with the user and the determined types of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear, times the medicine is taken by the user, and doses of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear; g. at least partially in response to detecting one or more inconsistencies between the medicine regime associated with the user and the determined types of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear, times the medicine is taken by the user, and doses of medicine taken by the user based on image data automatically captured by the forward-facing camera that is a component of the computerized eyewear, notifying, by a processor, at least one recipient of the detected inconsistencies, where the at least one recipient is selected from a group consisting of: the user and a third party.

11. The computer-implemented method of claim 10, wherein the at least one sensor further comprises one or more sensors selected from a group consisting of: a scanner, a motion sensor, and a microphone.

12. The computer-implemented method of claim 10, wherein: a. the data generated is one or more images automatically captured by a forward facing camera; b. the step of analyzing the received data further comprises: i. detecting, by a processor, one or more pills in the one or more images; ii. comparing, by a processor, the one or more detected pills found in the one or more images to one or more known images of pills stored in a database; iii. identifying, by a processor, the one or more pills by matching the one or more pills from the one or more images to the one or more known images of pills stored in the database; and iv. detecting, by a processor, a time that the one or more images were taken.

13. The computer-implemented method of claim 10, further comprising the steps of: a. detecting, by a processor, a level of one or more medicines in the user's bloodstream by at least one sensor of the computerized eyewear that monitors the level of one or more medicines in the user's bloodstream; b. comparing, by a processor, the level of the one or more medicines in the user's bloodstream to a predefined level for each of the one or more medicines stored in a database for the user; and c. at least partially in response to comparing the level of the one or more medicines in the user's bloodstream, notifying the at least one recipient when the level of the one or more medicines is below the predefined level for each of the one or more medicines, where the at least one recipient is selected from a group consisting of: the user and a third party.

14. The computer-implemented method of claim 10, further comprising the steps of: a. determining, by a processor, the user's one or more movements using the received data generated by the at least one sensor; b. analyzing, by a processor, the received data generated by the at least one sensor to determine the one or more movements associated with the user; c. at least partially in response to analyzing the received data, comparing, by a processor, the user's one or more movements to previously established one or more movement patterns for the user; d. detecting, by a processor, one or more inconsistencies between the user's one or more movements as compared to the previously established one or more movement patterns; and e. at least partially in response to detecting one or more inconsistencies between the current user's one or more movements as compared to previously established one or more movement patterns, notifying, by a processor, at least one recipient selected from a group consisting of: the user or a third party of the detected one or more inconsistencies.

15. A computer-implemented method of monitoring the wellbeing of an individual, the method comprising the steps of: a. providing a user with computerized eyewear comprising at least one sensor for monitoring actions taken by a user, the at least one sensor including: i) a sound-capturing unit that is a component of the computerized eyewear and that is configured to capture sound data, ii) a forward-facing camera that is a component of the computerized eyewear and that is configured to capture image data, iii) a location-determination unit that is a component of the computerized eyewear and that is configured to capture location data; b. receiving, by one or more processors, a medicine regime associated with the user, wherein the medicine regime is provided by: i) at least one of a group of users consisting of: (1) the user and (2) the user's physician, stating a name for each medicine in the medicine regime such that the sound-capturing unit of the computerized eyewear captures sound data for the name of each medicine in the medicine regime; and ii) the forward-facing camera of the computerized eyewear capturing image data of each medicine of the medicine regime and the medicine bottle for each medicine of the medicine regime; c. storing, by one or more processors, sound data and image data for the provided medicine regime in a database; d. receiving, by one or more processors, data generated by the at least one sensor including image data captured by the forward-facing camera of the computerized eyewear; e. analyzing, by one or more processors, the received data generated by the at least one sensor to determine at least image data that identifies: i) a first type of medicine taken by the user at a first geographic location, wherein the determination of the first type of medicine taken by the user is based on image data captured by the forward-facing camera and the determination of the first geographic location is based on location data captured by the location-determination unit; ii) a first time that the first type of medicine was taken by the user at the first geographic location; and iii) a first dose of the first type of medicine taken by the user at the first geographic location based on image data captured by the forward-facing camera that is a component of the computerized eyewear; f. analyzing, by one or more processors, the received data generated by the at least one sensor to determine at least image data that identifies: i) a second type of medicine taken by the user at a second geographic location, wherein the determination of the second type of medicine taken by the user is based on image data captured by the forward-facing camera and the determination of the second geographic location is based on location data captured by the location-determination unit, and the first type of medicine is different from the second type of medicine, and the first geographic location is different from the second geographic location; ii) a second time that the second type of medicine was taken by the user at the second geographic location; and iii) a second dose of the second type of medicine taken by the user at the second geographic location based on image data captured by the forward-facing camera that is a component of the computerized eyewear; g. at least partially in response to analyzing the received data generated by the at least one sensor, comparing, by one or more processors, the medicine regime for the user to: i) the determined first type of medicine taken by the user at the first geographic location, the first time that the first type of medicine was taken, and the first dose of the first type of medicine that was taken, and ii) the determined second type of medicine taken by the user at the second geographic location, the second time that the second type of medicine was taken, and the second dose of the second type of medicine that was taken; h. detecting, by one or more processors, one or more inconsistencies between the medicine regime associated with the user and i) the determined first type of medicine taken by the user at the first geographic location, the first time that the first type of medicine was taken, and the first dose of the first type of medicine that was taken, or ii) the determined second type of medicine taken by the user at the second geographic location, the second time that the second type of medicine was taken, and the second dose of the second type of medicine that was taken; i. at least partially in response to detecting the one or more inconsistencies, notifying, by one or more processors, at least one recipient of the one or more inconsistencies, where the at least one recipient is a recipient selected from a group consisting of: the user and a third party.

16. The computer-implemented method of claim 15, wherein the steps of determining the first type of medicine taken by the user at the first geographic location and determining the second type of medicine taken by the user at the second geographic location further comprise: a. detecting, by one or more processors, one or more pills from the image data; b. comparing, by one or more processors, the one or more detected pills found from the image data to one or more known images of pills stored in the database; and c. identifying, by one or more processors, the one or more pills by matching the one or more pills from the image data to the one or more known images of pills stored in the database.

17. The computer-implemented method of claim 15, further comprising the steps of: a. detecting, by one or more processors, a level of one or more medicines in the user's bloodstream by at least one sensor of the computerized eyewear that monitors the level of one or more medicines in the user's bloodstream; b. comparing, by one or more processors, the level of the one or more medicines in the user's bloodstream to a predefined level for each of the one or more medicines stored in a database for the user; and c. at least partially in response to comparing the level of the one or more medicines in the user's bloodstream, notifying the at least one recipient when the level of the one or more medicines is below the predefined level for each of the one or more medicines, where the at least one recipient is selected from a group consisting of: the user and a third party.

18. The computer-implemented method of claim 15, further comprising the steps of: a. determining, by one or more processors, the user's one or more movements using the received data generated by the at least one sensor; b. analyzing, by one or more processors, the received data generated by the at least one sensor to determine the one or more movements associated with the user; c. at least partially in response to analyzing the received data, comparing, by one or more processors, the user's one or more movements to previously established one or more movement patterns for the user; d. detecting, by one or more processors, one or more inconsistencies between the user's one or more movements as compared to the previously established one or more movement patterns; and e. at least partially in response to detecting one or more inconsistencies between the current user's one or more movements as compared to previously established one or more movement patterns, notifying, by one or more processors, at least one recipient selected from a group consisting of: the user or a third party of the detected one or more inconsistencies.

19. The computer-implemented method of claim 1, wherein the pre-defined command is defined by the user.
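The medicine-compliance check that claims 8 through 18 describe (compare the observed type, time, and dose of medicine against a stored regime, then alert the user or a third party on any inconsistency) could be sketched along these lines. The `Dose` data model, the ±1-hour tolerance, and all names are hypothetical assumptions; the patent's camera-based pill identification is reduced here to pre-identified dose events.

```python
# Hypothetical sketch of regime-vs-observed compliance checking.
# Assumes dose events have already been extracted (e.g., from camera data).
from dataclasses import dataclass

@dataclass(frozen=True)
class Dose:
    medicine: str   # type of medicine (in the patent, identified from image data)
    hour: int       # time the medicine was taken, as hour of day
    amount_mg: int  # dose taken

def find_inconsistencies(regime, observed):
    """Return regime entries with no matching observed dose (within 1 hour)."""
    missed = []
    for expected in regime:
        taken = any(d.medicine == expected.medicine
                    and abs(d.hour - expected.hour) <= 1
                    and d.amount_mg == expected.amount_mg
                    for d in observed)
        if not taken:
            missed.append(expected)
    return missed

# Stored regime: two daily doses; only the morning dose was observed.
regime = [Dose("metformin", 8, 500), Dose("metformin", 20, 500)]
observed = [Dose("metformin", 8, 500)]
for miss in find_inconsistencies(regime, observed):
    print(f"Alert: {miss.medicine} {miss.amount_mg} mg at {miss.hour}:00 not confirmed")
```

Claim 15 additionally keys each observation to a geographic location from the eyewear's location-determination unit; that would amount to one more field on the event record and one more clause in the match predicate.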
Patents cited by this patent (109)
Fletcher, James C., Administrator of the National Aeronautics and Space Administration, with respect to an invention of Konigsberg, Eph (Sierra Madre, CA), Accelerometer telemetry system.
Lamb, Mathew J.; Sugden, Ben J.; Crocco, Jr., Robert L.; Keane, Brian E.; Miles, Christopher E.; Perez, Kathryn Stone; Massey, Laura K.; Kipman, Alex Aben-Athar, Constraint based information inference.
Keal, William Kerry; Sachs, David; Lin, Shang-Hung; Anderson, Erik, Deduced reckoning navigation without a constraint relationship between orientation of a sensor platform and a direction of travel of an object.
Kang, Mingoo; Kang, Haengjoon; Park, Jongsoon; Park, Jinyung; Kim, Jongcheol; Park, Junho; Hwang, Sunjung, Digital data reproducing apparatus and corresponding method for reproducing content based on user characteristics.
Swab, Gregory; Malackowski, James E.; Greaves, Mikal; Milesi, Rolf; Ligtenberg, Christiaan; Meier, Thomas, Eyewear with exchangeable temples housing a transceiver forming ad hoc networks with other devices.
Jayaraman, Sundaresan; Park, Sungmee; Rajamanickam, Rangaswamy; Gopalsamy, Chandramohan, Fabric or garment with integrated flexible information infrastructure.
Yuen, Shelten Gee Jao; Park, James; Lee, Hans Christiansen, Methods and systems for generation and rendering interactive events having combined activity and location information.
Yuen, Shelten Gee Jao; Park, James; Lee, Hans Christiansen, Methods and systems for metrics analysis and interactive rendering, including events having combined activity and location information.
Yuen, Shelten Gee Jao; Park, James; Lee, Hans Christiansen, Methods and systems for processing social interactive data and sharing of tracked activity associated with locations.
Steuer, Robert R. (Salt Lake City, UT); Rogers, Robert K. (Salt Lake City, UT); Horne, Robert H. (Holladay, UT), Miniature physiological monitor with interchangeable sensors.
Yuen, Shelten Gee Jao; Park, James; Friedman, Eric Nathan; Martinez, Mark Manuel; Axely, Andrew Cole, Portable monitoring devices for processing applications and processing analysis of physiological conditions of a user associated with the portable monitoring device.
Goldman, David A.; Tomasiewicz, Stephen; Ioanna, Vincent; Lacker, Steven; Rice, Patrick; Wang, Hua, Shaft sensor assembly for angular velocity, torque, and power.
Van De Walle, Gerjan F. A. (Eindhoven, NL); Widdershoven, Franciscus P. (Eindhoven, NL), System and device with vertical and rotary wheel-velocity-measuring for determining vehicle displacement.
Krueger, Wesley W O, System and method for measuring and minimizing the effects of vertigo, motion sickness, motion intolerance, and/or spatial disorientation.
Molettiere, Peter Andrew; Yuen, Shelten Gee Jao; Hong, Jung Ook; Axley, Andrew Cole; Park, James, Tracking user physical activity with multiple devices.
Kim, Kwang-soo; Jung, Do-sung, User health monitoring system comprising 3D glasses and display apparatus, and display apparatus and control method thereof.