Devices, systems, and methods for empathetic computing
IPC Classification Information
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition): G06F-003/041; G06F-003/01; G06F-001/16; A61B-005/16
Application Number: US-0938105 (2015-11-11)
Registration Number: US-9830005 (2017-11-28)
Inventors / Address: Sakaguchi, Rikko; Ishikawa, Hidenori
Applicant / Address: SomniQ, Inc.
Agent / Address: Dorsey & Whitney LLP
Citation Information: Times cited: 12; Patents cited: 98
Abstract
Devices, systems, and methods for empathetic computing are described herein. An example empathetic computing device includes an enclosure configured to fit into the palm of a user. The enclosure may have an upper portion and a lower portion and comprise an interface board, a processor, and a touch sensor. The interface board may have a plurality of light emitting devices configured to provide a light pattern of a visual response. The processor may be coupled to a memory device. The touch sensor may be configured to detect the touch of the user. The touch sensor may comprise a touch belt disposed along the enclosure circumferentially and a touch pad.
Representative Claims
1. An empathetic computing device, comprising: an enclosure having a first portion and a second portion; an interface board arranged within the enclosure and comprising a light emitting device configured to provide a visual response, a sound generator configured to provide an auditory response, a vibrator configured to provide a vibrational response, or combinations thereof; a processor coupled to the interface board and a memory device; and a sensor configured to detect a proximity of the user prior to any physical contact between the empathetic computing device and the user and configured to transmit a proximity signal to the processor for providing a first response, based in part on a determination that the user is in proximity, and further configured to detect a natural action of the user and transmit a second signal to the processor for providing a second response to the user, wherein the second response is configured to emulate an action, expression, or emotion of the user.

2. The empathetic computing device of claim 1, wherein at least one of the first portion or the second portion of the enclosure is partially translucent.

3. The empathetic computing device of claim 1, wherein the sensor includes one or more sensors selected from a touch sensor, a proximity sensor, a motion sensor, an accelerometer, a gyroscope, a light sensor, a pressure sensor, a heat sensor, or combinations thereof.

4. The empathetic computing device of claim 1, wherein the sensor comprises a plurality of infrared sensors disposed around a circumference of the empathetic computing device for detecting the proximity of the user to the empathetic computing device.

5. The empathetic computing device of claim 1, wherein the empathetic computing device is configured to enter one of a plurality of different modes responsive to detection of one of a plurality of different natural actions of the user, and wherein the empathetic computing device is configured to provide a first visual, auditory, or vibrational response responsive to entering a first mode of the plurality of different modes and a second, different visual, auditory, or vibrational response responsive to entering a second mode of the plurality of different modes.

6. The empathetic computing device of claim 1, wherein the interface board comprises one or more light emitting diodes configured to provide a light expression.

7. The empathetic computing device of claim 1, wherein the empathetic computing device is configured to detect a facial expression of the user and generate a light pattern emulating the facial expression of the user.

8. The empathetic computing device of claim 1, wherein the empathetic computing device is configured to generate a relatively long vibration in response to the determination that the user is in proximity and further generate a relatively short vibration in response to the user touching the empathetic computing device.

9. An empathetic computing system, comprising: an empathetic computing device including a plurality of sensors, a processor, and a memory coupled to the processor, wherein the plurality of sensors includes at least one proximity sensor configured to detect a user in proximity but prior to touching the empathetic computing device, and at least one additional sensor configured to detect a natural action of the user following detection of the user in proximity to the empathetic computing device, and wherein the empathetic computing device is configured to provide a first response to the user upon the detection of the user in proximity and a second, different response to the user upon detection of the natural action of the user following the detection of the user in proximity.

10. The empathetic computing system of claim 9, wherein at least one of the first or second response is a visual response comprising a light pattern provided by a plurality of light emitting devices of the empathetic computing device.

11. The empathetic computing system of claim 9, wherein the natural action of the user following the detection of the user in proximity comprises touching the empathetic computing device, supporting the empathetic computing device, partially grasping the empathetic computing device, fully grasping the empathetic computing device, clasping the empathetic computing device, or a combination thereof.

12. The empathetic computing system of claim 9, wherein the processor is configured to identify an event and enter an initial mode associated with either the detection of the user proximity or the detection of the natural action of the user, and wherein the empathetic computing device is further configured to weight the first event to provide a weighted event and enter a weighted mode based on the weighted event and the initial mode.

13. The empathetic computing system of claim 9, wherein the empathetic computing device is further configured to selectively enable one or more features, one or more components, or a combination thereof, responsive to either the detection of the user in proximity or the detection of the natural action of the user.

14. The empathetic computing system of claim 13, wherein the empathetic computing device is configured to selectively enable a speech analysis feature responsive to the detection of the natural action of the user.

15. A method of interfacing with a user, comprising: receiving a first set of user data associated with a user; identifying a plurality of events associated with the user based on the first set of user data, wherein the identifying one or more events includes identifying a first event based on detection of the user in proximity to but without touching the empathetic computing device, and identifying a second event based on physical contact between the user and the empathetic computing device; selecting an initial mode based on the first event; and providing a plurality of responses using a light device, an audio device, a vibration device, or a combination thereof, wherein the providing a plurality of responses includes providing a first response to the user based on the initial mode, and providing a second response to the user based on a second mode associated with the second event, wherein the second response is configured to emulate an action, expression, or emotion of the user.

16. The method of claim 15, further comprising: receiving a second set of user data; weighting at least one of the plurality of events based on the second set of user data to provide a weighted event; selecting a weighted mode based on the weighted event and the initial mode; and providing, using one of the light device, the audio device, or the vibration device, another response to the user based on the weighted event.

17. The method of claim 16, wherein weighting the event based on the user data to provide a weighted event comprises weighting the event based on a duration of the event.

18. The method of claim 15, wherein the identifying a plurality of events comprises performing feature extraction on the first set of user data.

19. The method of claim 15, wherein the identifying a plurality of events comprises detecting a facial expression of the user and wherein the providing a plurality of responses comprises generating a response emulating the facial expression of the user.

20. The method of claim 15, wherein the providing a plurality of responses includes providing a light pattern by a plurality of LEDs in response to one of the plurality of events associated with the user.

21. The method of claim 15, wherein the providing a plurality of responses includes generating a relatively long vibration in response to the user approaching the empathetic computing device and generating a relatively short vibration in response to the user touching the empathetic computing device.

22. The method of claim 15, wherein identifying an event based on the user data comprises comparing the user data to a data model.

23. The method of claim 15, wherein the providing a plurality of responses includes generating vibration to cause an orientation of the empathetic computing device to be adjusted.
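The staged interaction the claims describe (claims 1, 9, 15-17, and 21) can be read as a small state machine: a proximity event selects an initial mode and triggers a first response, a contact event selects a second mode, and an event may be weighted by its duration to escalate the mode. The following is a minimal illustrative sketch of that flow, not the patented implementation; all names (`Mode`, `Event`, `select_mode`, `weighted_mode`) and the 2-second weighting threshold are hypothetical choices for exposition.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    APPROACH = auto()   # user detected in proximity, no contact yet
    CONTACT = auto()    # user has touched or grasped the device

@dataclass
class Event:
    kind: str           # e.g. "proximity" or "touch"
    duration_s: float   # used for duration-based weighting (claim 17)

def select_mode(event: Event) -> Mode:
    """Map a detected event to an initial interaction mode (claim 15)."""
    return {"proximity": Mode.APPROACH, "touch": Mode.CONTACT}.get(event.kind, Mode.IDLE)

def respond(mode: Mode) -> str:
    """Pick a per-mode response, e.g. long vs. short vibration (claims 8, 21)."""
    return {
        Mode.APPROACH: "long_vibration",
        Mode.CONTACT: "short_vibration",
        Mode.IDLE: "none",
    }[mode]

def weighted_mode(event: Event, initial: Mode, threshold_s: float = 2.0) -> Mode:
    """Weight an event by its duration (claims 16-17): a sustained touch
    escalates the mode; a brief one keeps the initial mode. The threshold
    is an assumed value for illustration."""
    if event.kind == "touch" and event.duration_s >= threshold_s:
        return Mode.CONTACT
    return initial

# Usage: the user approaches, then holds the device.
first = Event("proximity", 0.5)
mode = select_mode(first)            # Mode.APPROACH
print(respond(mode))                 # prints "long_vibration"
second = Event("touch", 3.0)
mode = weighted_mode(second, mode)   # escalates to Mode.CONTACT
print(respond(mode))                 # prints "short_vibration"
```

The two-dictionary split mirrors the claims' separation between event-to-mode selection and mode-to-response generation, which is what lets claim 5's "different response per mode" behavior be stated independently of how events are detected.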
Patents cited by this patent (98)
Vu, Sonny X.; Montgomery, Matthew Ryan; Ng Sheung-yan, Jessica, Activity monitoring device.
Jobs, Steven P.; Ive, Jonathan P.; Coster, Daniel J.; Stringer, Christopher J.; De Iuliis, Daniele; Andre, Bartley K.; Howarth, Richard P.; Seid, Calvin Q.; Satzger, Douglas B.; van de Loo, Marc J., Cursor control device.
Rouillac, Nichole Suzanne; Rowe, Luane; Dubreucq, Maxime David Eric; Fong, Lawrence Herman; Lu, Chi Jen; Lee, Jenny Yoon Joo; Ng, Christina L., Heart rate monitor.
Jobs, Steven P.; Andre, Bartley K.; Coster, Daniel J.; De Iuliis, Daniele; Howarth, Richard P.; Ive, Jonathan P.; Rohrbach, Matthew Dean; Satzger, Douglas B.; Seid, Calvin Q.; Stringer, Christopher J., Power adapter.