[US Patent]
User gesture input to wearable electronic device involving outward-facing sensor of device
Country/Type: United States (US) Patent; Status: Registered
International Patent Classification (IPC, 7th edition): G06F-003/01; G06F-003/0346; G06F-003/03; G06F-003/0481; G06F-001/16
Application Number: US-0015909 (2013-08-30)
Registration Number: US-9477313 (2016-10-25)
Inventors / Address: Mistry, Pranav; Sadi, Sajid; Yao, Lining; Snavely, John
Applicant / Address: Samsung Electronics Co., Ltd.
Attorney / Address: Baker Botts L.L.P.
Citation Information: Times cited: 21; Cited patents: 138
Abstract
In one embodiment, a wearable apparatus includes a sensor, a processor coupled to the sensor, and a memory coupled to the processor that includes instructions executable by the processor. When executing the instructions, the processor detects by the sensor movement of at least a portion of an arm of a user; detects, based at least in part on the movement, a gesture made by the user; and processes the gesture as input to the wearable apparatus.
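Purely as an illustration of the flow the abstract describes (sense movement of the user's arm, detect a gesture from that movement, process the gesture as input), the minimal Python sketch below shows one way such a loop could look. The sensor object, GestureClassifier, and handle_gesture callback are hypothetical names introduced here, not interfaces defined by the patent.

# Minimal sketch of the gesture-input loop described in the abstract.
# All names (sensor, GestureClassifier, handle_gesture) are hypothetical.
from collections import deque

class GestureClassifier:
    """Toy classifier: maps a window of motion samples to a gesture label."""
    def classify(self, samples):
        # Placeholder heuristic: large net horizontal displacement -> swipe.
        net = sum(s["dx"] for s in samples)
        if abs(net) > 1.0:
            return "swipe_right" if net > 0 else "swipe_left"
        return None

def run_gesture_loop(sensor, handle_gesture, window_size=32):
    """Read arm movement from the sensor, detect a gesture, and process it
    as input to the wearable device."""
    classifier = GestureClassifier()
    window = deque(maxlen=window_size)
    while True:
        sample = sensor.read()          # e.g. {"dx": ..., "dy": ..., "t": ...}
        window.append(sample)
        gesture = classifier.classify(list(window))
        if gesture is not None:
            handle_gesture(gesture)     # the gesture becomes device input
            window.clear()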
Representative Claims
1. A wearable apparatus comprising: an optical sensor; a processor coupled to the optical sensor; and memory coupled to the processor that comprises instructions executable by the processor, the processor being operable when executing the instructions to: determine an orientation of the wearable apparatus; compare the orientation of the wearable apparatus to a predetermined orientation; and in response to a determination that the orientation of the wearable apparatus corresponds to the predetermined orientation: detect by the optical sensor at least a portion of a hand of a user; determine, at least in part based on output from the optical sensor, that the hand detected by the optical sensor is holding a tangible object; detect, at least in part by the optical sensor, the tangible object held by the hand of the user; detect, at least in part by the optical sensor, a movement of the tangible object; and process the movement of the tangible object as input to the wearable apparatus.
2. The wearable apparatus of claim 1, wherein the wearable apparatus is worn on the arm attached to the other hand of the user.
3. The wearable apparatus of claim 1, wherein the processor is further operable when executing the instructions to provide, based on the input, output for display on a display of another device.
4. The wearable apparatus of claim 1, wherein the portion of the hand comprises at least one finger.
5. The wearable apparatus of claim 1, wherein the processor is further operable when executing the instructions to: detect by the optical sensor a lack of movement of at least a portion of the tangible object; and process the lack of movement of the tangible object as input to the wearable apparatus.
6. The wearable apparatus of claim 1, wherein the processor is further operable when executing the instructions to process the movement based at least in part on a duration of the movement.
7. The wearable apparatus of claim 1, wherein the processor is further operable when executing the instructions to: output, based on the input, content on a display of the wearable device.
8. The wearable apparatus of claim 1, wherein the processor is further operable when executing the instructions to: detect an orientation of the tangible object; and process the movement based at least in part on the orientation.
9. The wearable apparatus of claim 8, wherein detecting an orientation of the tangible object comprises: detecting an initial orientation of the tangible object; and detecting a subsequent orientation of the tangible object.
10. The wearable apparatus of claim 1, wherein the processor is further operable to process the input based at least in part on a setting specified by a software application associated with the wearable apparatus.
11. The wearable apparatus of claim 1, wherein the processor is further operable to: detect a tangible surface nearby the tangible object; detect the movement of the tangible object relative to the surface; and process the relative movement as the input.
12. The wearable apparatus of claim 11, wherein the processor is further operable to process the tangible surface as a virtual keyboard or a virtual display.
13. The wearable apparatus of claim 1, comprising: a device body comprising: a touch-sensitive display; a rotatable element about the touch-sensitive display; and an encoder for detecting rotation of the rotatable element; a band for affixing the device body to the user; and the optical sensor, comprising the optical sensor in or on the band communicably coupled to the touch-sensitive display.
14. One or more computer-readable non-transitory storage media embodying software that is operable when executed to: determine an orientation of a wearable apparatus; compare the orientation of the wearable apparatus to a predetermined orientation; and in response to a determination that the orientation of the wearable apparatus corresponds to the predetermined orientation: detect, by an optical sensor of the wearable computing apparatus, at least a portion of a hand of a user; determine, at least in part based on output from the optical sensor, that the hand detected by the optical sensor is holding a tangible object; detect, at least in part by the optical sensor, the tangible object held by the hand of the user; detect, at least in part by the optical sensor, a movement of the tangible object; and process the movement of the tangible object as input to the wearable apparatus.
15. The media of claim 14, wherein the wearable apparatus is worn on the arm attached to the other hand of the user.
16. The media of claim 14, wherein the software is further operable when executed to provide, based on the input, output for display on a display of another device.
17. The media of claim 14, wherein the portion of the hand comprises at least one finger.
18. The media of claim 14, wherein the software is further operable when executed to: detect by the optical sensor a lack of movement of at least a portion of the tangible object; and process the lack of movement of the tangible object as input to the wearable apparatus.
19. The media of claim 14, wherein the software is further operable when executed to process the movement based at least in part on a duration of the movement.
20. The media of claim 14, wherein the software is further operable when executed to: output, based on the input, content on a display of the wearable apparatus.
21. The media of claim 14, wherein the software is further operable when executed to: detect an orientation of the tangible object; and process the movement based at least in part on the orientation.
22. The media of claim 21, wherein, to detect an orientation of the tangible object, the software is operable when executed to: detect an initial orientation of the tangible object; and detect a subsequent orientation of the tangible object.
23. The media of claim 14, wherein the software is further operable when executed to process the input based at least in part on a setting specified by a software application associated with the wearable apparatus.
24. The media of claim 14, wherein the software is further operable when executed to: detect a tangible surface nearby the tangible object; detect the movement of the tangible object relative to the surface; and process the relative movement as the input.
25. The media of claim 24, wherein: the software is further operable when executed to process the tangible surface as a virtual keyboard or a virtual display.
26. The media of claim 14, wherein the wearable computing device comprises: a device body comprising: a touch-sensitive display; a rotatable element about the touch-sensitive display; and an encoder for detecting rotation of the rotatable element; a band for affixing the device body to the user; and the optical sensor in or on the band communicably coupled to the touch-sensitive display.
27. A method comprising: determining an orientation of a wearable apparatus; comparing the orientation of the wearable apparatus to a predetermined orientation; and in response to a determination that the orientation of the wearable apparatus corresponds to the predetermined orientation: detecting, by an optical sensor of the wearable computing device, at least a portion of a hand of a user; determining, at least in part based on output from the optical sensor, that the hand detected by the sensor is holding the tangible object; detecting, at least in part by the optical sensor, a tangible object held by the hand of the user; detecting, at least in part by the optical sensor, a movement of the tangible object; and processing the movement of the tangible object as input to the wearable device.
28. The method of claim 27, wherein the wearable computing device is worn on the arm attached to the other hand of the user.
29. The method of claim 27, further comprising providing, based on the input, output for display on a display of another device.
30. The method of claim 27, wherein the portion of the hand comprises at least one finger.
31. The method of claim 27, further comprising detecting by the optical sensor a lack of movement of at least some of the portion of the tangible object; and processing the lack of movement of the tangible object as input to the wearable apparatus.
32. The method of claim 27, wherein the movement is based at least in part on a duration of the movement.
33. The method of claim 27, further comprising outputting, based on the input, content on a display of the wearable device.
34. The method of claim 27, further comprising detecting an orientation of the portion of the tangible object, wherein the input is further based at least in part on the orientation.
35. The method of claim 34, wherein detecting an orientation of the portion of the arm of the user comprises: detecting an initial orientation of the tangible object; and detecting a subsequent orientation of the tangible object.
36. The method of claim 27, further comprising processing the input based at least in part on a setting specified by a software application associated with the wearable device.
37. The method of claim 27, further comprising: detecting a tangible surface nearby the tangible object; detecting the movement of the tangible object relative to the surface; and processing the relative movement as the input.
38. The method of claim 37, further comprising processing the tangible surface as a virtual keyboard or a virtual display.
39. The method of claim 27, wherein the wearable computing device comprises: a device body comprising: a touch-sensitive display; a rotatable element about the touch-sensitive display; and an encoder for detecting rotation of the rotatable element; a band for affixing the wearable computing device body to the user; and the optical sensor, comprising the optical sensor in or on the band communicably coupled to the touch-sensitive display.
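As a reading aid for independent claim 1, the Python sketch below outlines its orientation-gated flow: the device orientation is compared to a predetermined orientation, and only on a match does the outward-facing optical sensor look for a hand holding a tangible object and treat the object's movement as input. The IMU and OpticalSensor methods, the threshold values, and the process_input callback are assumed placeholder interfaces, not anything specified by the patent.

# Sketch of the claim-1 pipeline. Only when the device is in a predetermined
# orientation does the optical sensor look for a hand holding a tangible
# object and treat the object's movement as input.
# All helper names and numeric thresholds below are hypothetical.

PREDETERMINED_ORIENTATION = (0.0, 90.0)   # assumed roll/pitch target, degrees
TOLERANCE_DEG = 15.0

def orientation_matches(current, target=PREDETERMINED_ORIENTATION, tol=TOLERANCE_DEG):
    """Compare the device orientation to the predetermined orientation."""
    return all(abs(c - t) <= tol for c, t in zip(current, target))

def handle_frame(imu, optical_sensor, process_input):
    orientation = imu.read_orientation()            # e.g. (roll, pitch)
    if not orientation_matches(orientation):
        return                                      # gate: wrong orientation

    frame = optical_sensor.capture()
    hand = optical_sensor.detect_hand(frame)        # at least a portion of a hand
    if hand is None or not optical_sensor.is_holding_object(hand):
        return

    obj = optical_sensor.detect_held_object(frame, hand)
    movement = optical_sensor.track_movement(obj)   # movement of the tangible object
    if movement is not None:
        process_input(movement)                     # movement becomes device input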
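Claims 13, 26, and 39 additionally recite a device body with a rotatable element about the touch-sensitive display and an encoder for detecting its rotation. The short sketch below shows, under an assumed RotaryEncoder-style interface (hypothetical, not defined by the patent), how raw encoder counts might be turned into discrete rotation events.

# Sketch: surfacing the rotatable element's encoder as input events.
# The encoder object and on_rotate callback are hypothetical.
import time

def poll_rotatable_element(encoder, on_rotate, poll_interval=0.01):
    """Convert raw encoder counts into discrete rotation events."""
    last = encoder.read_count()
    while True:
        count = encoder.read_count()
        delta = count - last
        if delta != 0:
            on_rotate("clockwise" if delta > 0 else "counterclockwise", abs(delta))
            last = count
        time.sleep(poll_interval)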
Cited Patents:
Alameh, Rachid M; Forest, Francis W; Hede, William S; Lemke, Mark R; Witte, Robert S, Devices and methods for initiating functions based on movement characteristics relative to a reference.
Klein, Sandro David; Smith-Kielland, Ingvald Alain; Louie, Alex; Scott, Cheryl; Scott, Wayne, Hand held remote control device having an improved user interface.
Burrell, Jonathan C. (Olathe, KS); Beason, Lawrence W. (Olathe, KS); Burrell, Gary L. (Olathe, KS); Laverick, David J. (Overland Park, KS), Portable handheld combination GPS and communication transceiver.
Braun, Max; Geiss, Ryan; Ho, Harvey; Starner, Thad Eugene; Taubman, Gabriel, Systems and methods for controlling a cursor on a display using a trackpad input device.
Kawahara, Hideya; Hong, Yoojin; Byrne, Paul; Ludolph, Frank E.; Sasaki, Curtis J.; Nishijima, Eitaro, Visual representation and other effects for application management on a device with a small screen.
Fujisawa, Teruhiko; Chihara, Hiroyuki, Wrist-watch device having communication function, information display method, control program, and recording medium.
Chirakan, Jason; Hanthorn, Douglas; Herring, Dean F., Systems and methods for implementing retail processes based on machine-readable images and user gestures.
Bailey, Matthew; Grant, Aaron; Bouchard, Eric Philippe, Systems, devices, and methods effected in response to establishing and/or terminating a physical communications link.