| Country / Status | United States (US) Patent, Registered |
|---|---|
| International Patent Classification (IPC, 7th ed.) | |
| Application number | US-0017599 (2013-09-04) |
| Registration number | US-9123255 (2015-09-01) |
| Inventor / Address | |
| Applicant / Address | |
| Citation information | Times cited: 0; Patents cited: 81 |
One embodiment includes a computer-implemented method using a window environment of a display, with a detached imaging sensor, to enable a user to learn. Another embodiment includes a computer-implemented system helping a user learn using a detached imaging sensor. In yet another embodiment, a computer-implemented system monitors automatically more than once a user's behavior while the user is working on materials. Through monitoring the user's volitional or involuntary behavior, the system determines whether to change what is to be presented by the display. The change could include providing rewards, punishments, and stimulation; or changing the materials. The system can also react by asking the user a question. Based on the user's response, the system may change to more appropriate materials, or different presentation styles.
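The adaptive loop the abstract describes (monitor the user's behavior, decide whether attention remains on the material, then reward, stimulate, or switch materials, possibly after asking a question) can be illustrated with a minimal sketch. This is not the patented implementation; every class and method name here is hypothetical and chosen only to make the described flow concrete.

```python
# Illustrative sketch of the abstract's monitor-and-adapt loop.
# All names (AdaptivePresenter, update, etc.) are hypothetical;
# the patent does not prescribe this structure.

class AdaptivePresenter:
    def __init__(self, materials, easier_materials):
        self.materials = materials              # content currently presented
        self.easier_materials = easier_materials  # fallback content
        self.attentive = True

    def update(self, gaze_on_window, answered_correctly=None):
        """One monitoring step: react to gaze and, optionally, a quiz answer."""
        if not gaze_on_window:
            # User looked away: react (e.g. stimulation) rather than proceed.
            self.attentive = False
            return "stimulate"
        self.attentive = True
        if answered_correctly is False:
            # Wrong answer to an inquiry: switch to more appropriate materials.
            self.materials = self.easier_materials
            return "switched"
        return "continue"
```

A caller would invoke `update` once per sensing cycle, feeding it the attention estimate derived from the imaging-sensor measurements.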
1. A computer-implemented method using an apparatus with at least an imaging sensor, the method comprising: acquiring measurements, by the apparatus, regarding a behavior of a user, the measurements from sensing a feature of the user by the imaging sensor of the apparatus, the sensor being detached from the feature of the user at least when sensing the feature; analyzing, by the apparatus, the measurements; determining, by the apparatus, from the analyzing the measurements, at least the behavior being associated with a first window of a display, the display, in a window environment, to display at least the first window and a second window; identifying, by the apparatus, materials for presentation to the user, in the first window of the display at least in view of the behavior being associated with the first window, wherein at least the imaging sensor and the display to have a spatial relationship to help the determining; further determining, by the apparatus, the user changing from paying attention to not paying attention to the first window at least based on other measurements regarding at least another behavior of the user; and changing, by the apparatus, materials to be presented in the first window of the display at least in view of the further determining.

2. A computer-implemented method as recited in claim 1, wherein the acquiring measurements includes acquiring facial measurements of the user.

3. A computer-implemented method as recited in claim 2, wherein the facial measurements include measurements regarding at least an eye of the user.

4. A computer-implemented method as recited in claim 2, wherein the determining further comprises determining a facial orientation of the user based at least in part on the facial measurements of the user.

5. A computer-implemented method as recited in claim 2, wherein the analyzing includes identifying a vertical distance of the user.

6.
A computer-implemented method as recited in claim 2, wherein the analyzing includes identifying a horizontal distance of the user.

7. A computer-implemented method as recited in claim 1, wherein the method further comprises determining whether the user is looking away from the display.

8. A computer-implemented method as recited in claim 1, wherein the determining includes determining at least a speed at which the user views materials on the display to at least affect the identifying.

9. A computer-implemented method as recited in claim 1, wherein the imaging sensor includes an optical imaging sensor, and wherein the measurements include an image of at least a part of the user.

10. A computer-implemented method as recited in claim 1 further comprising sensing and analyzing measurements from another sensor of the apparatus, with the another sensor sensing another feature of the user to identify materials for presentation in the first window.

11. A computer-implemented method as recited in claim 10, wherein the imaging sensor includes an optical imaging sensor, wherein at least some of the measurements from the imaging sensor of the apparatus include an image regarding the head of the user, and wherein at least some of the measurements from the another sensor of the apparatus include measurements of the user, but not of the head of the user.

12. A computer-implemented method as recited in claim 1 further comprising sensing by the apparatus another feature of the user to produce measurements to be analyzed to identify materials for presentation in the first window.

13. A computer-implemented method as recited in claim 12, wherein the method further comprises alternating between sensing of the feature and sensing of the another feature to identify materials for presentation in the first window.

14.
A computer-implemented method as recited in claim 12, wherein the method further comprises alternating between the sensing and the analyzing of the feature and the another feature to identify materials for presentation in the first window.

15. A computer-implemented method using an apparatus with at least an imaging sensor, the method comprising: acquiring measurements, by the apparatus, regarding a behavior of a user, the measurements from sensing a feature of the user by the imaging sensor of the apparatus, the sensor being detached from the feature of the user at least when sensing the feature; analyzing, by the apparatus, the measurements; determining, by the apparatus, from the analyzing the measurements, at least a position in a first window of a display, the display, in a window environment, to display at least the first window and a second window, wherein at least the imaging sensor and the display to have a spatial relationship to help the determining; identifying, by the apparatus, materials for presentation to the user, in the first window of the display at least in view of the determining of the position in the first window; linking, by the apparatus, digital content based on at least some of the measurements to an identity of the user, with the at least some of the measurements being acquired in a first session; acquiring additional measurements, by the apparatus, in a second session regarding at least a behavior of the user in the second session, the additional measurements being acquired by the sensor; accessing, by the apparatus, in the second session, the digital content based on at least some of the measurements, at least based on the digital content being linked to the identity of the user; comparing, by the apparatus, the digital content based on at least some of the measurements from the first session and digital content based on at least some of the additional measurements from the second session; and identifying materials for presentation in the
first window in the second session at least based on the comparing.

16. A computer-implemented method as recited in claim 1, wherein the method further comprises identifying what is to be presented at least in view of subsequent measurements from the sensor regarding at least another behavior of the user, wherein the subsequent measurements provide an indication that the user is paying attention to the first window, with the subsequent measurements being subsequent to the other measurements.

17. A computer-implemented method as recited in claim 1, wherein the method further comprises continuing on, in the first window of the display, with a presentation that was stopped, at least in view of subsequent measurements from the sensor regarding the user, wherein the subsequent measurements provide an indication that the user is paying attention to the first window, with the subsequent measurements being subsequent to the other measurements.

18. A computer-implemented method as recited in claim 1, wherein the method further comprises: using, by the apparatus, at least a rule to identify what is to be presented by the display; and self-adapting, by the apparatus, to change the rule subsequent to the using of the rule, without requiring additional measurement from the user, so as to affect what is to be presented by the display.

19. A computer-implemented method as recited in claim 10, wherein measurements from one of the sensor or the another sensor include measurements regarding the head of the user, and wherein measurements from the other one of the sensor or the another sensor include measurements of the user, but not of the head of the user.

20. A computer-implemented method as recited in claim 19, wherein the measurements regarding the head of the user include measurements regarding at least an eye of the user.

21. A computer-implemented method as recited in claim 20, wherein the method further comprises determining whether the user is looking away from the display.
22. A computer-implemented method as recited in claim 20, wherein the determining includes determining at least a speed at which the user views materials on the display to at least affect the identifying.

23. A computer-implemented method as recited in claim 22, wherein the method further comprises identifying materials to be presented in the first window of the display at least in view of subsequent measurements from the apparatus regarding the user, wherein the subsequent measurements provide an indication that the user is paying attention to the display, with the subsequent measurements being subsequent to the other measurements.

24. A computer-implemented method as recited in claim 22, wherein the imaging sensor and the display to have a spatial relationship based at least partially on the imaging sensor being connected and adjacent to the display.

25. A computer-implemented method as recited in claim 24, wherein the method further comprises determining that at least another behavior of the user is not for the first window, but is for another window of the display to affect what is to be presented by the display.

26. A computer-implemented method as recited in claim 25, wherein the method further comprises: using, by the apparatus, at least a rule to identify what is to be presented by the display; and self-adapting, by the apparatus, to change the rule subsequent to the using of the rule, without requiring additional measurement from the user, so as to affect what is to be presented by the display.

27. A computer-implemented method as recited in claim 1, wherein the imaging sensor and the display to have a spatial relationship based at least partially on the imaging sensor being connected and adjacent to the display.

28. A computer-implemented method as recited in claim 1, wherein the determining includes determining that at least another behavior of the user is not for the first window, but is for another window of the display.

29.
An apparatus comprising: an imaging sensor to sense a feature of a user to produce measurements regarding a behavior of the user, the sensor to be detached from the feature of the user when sensing the feature; and a processor coupled to the imaging sensor and to a display, the display, in a window environment, to display at least a first window and a second window, the processor to: analyze the measurements; determine, from the analysis of the measurements, at least the behavior to be associated with the first window of the display; and identify materials for presentation to the user in the first window of the display at least in view of the behavior to be associated with the first window, wherein at least the imaging sensor and the display to have a spatial relationship to help the determination to be made; make another determination regarding the user changing from paying attention to not paying attention to the first window at least based on other measurements regarding at least another behavior of the user; and change materials to be presented in the first window of the display at least in view of the another determination to be made.

30. An apparatus as recited in claim 29, wherein the measurements to include facial measurements of the user.

31. An apparatus as recited in claim 29, wherein the processor further to determine whether the user is looking away from the display.

32. An apparatus as recited in claim 29, wherein the to determine includes to determine at least a speed at which the user views materials on the display.

33. An apparatus as recited in claim 29 further comprising another sensor to sense another feature of the user to produce measurements to be analyzed by the processor to identify materials for presentation in the first window.

34.
An apparatus as recited in claim 33, wherein the imaging sensor includes an optical imaging sensor, wherein at least some of the measurements from the imaging sensor of the apparatus to include an image regarding the head of the user, and wherein at least some of the measurements from the another sensor of the apparatus to include measurements of the user, but not of the head of the user.

35. An apparatus as recited in claim 29, wherein the imaging sensor to sense another feature of the user to produce measurements to be analyzed so as to identify materials for presentation in the first window.

36. An apparatus comprising: an imaging sensor to sense a feature of a user to produce measurements regarding a behavior of the user, the sensor to be detached from the feature of the user at least when sensing the feature; and a processor coupled to the imaging sensor and a display, the display, in a window environment, to display at least a first window and a second window, the processor to: analyze the measurements; determine, from the analysis of the measurements, at least a position in the first window of the display, wherein at least the imaging sensor and the display to have a spatial relationship to help the determination to be made; identify materials for presentation to the user in the first window of the display at least in view of the determination of the position in the first window; link digital content based on at least some of the measurements to an identity of the user, with the at least some of the measurements to be acquired in a first session; access, in a second session, the digital content based on at least some of the measurements, at least based on the digital content to be linked to the identity of the user; compare the digital content based on at least some of the measurements from the first session with digital content based on at least some additional measurements to be from the sensor in the second session, the additional measurements regarding at least
a behavior of the user in the second session; and identify materials for presentation in the first window in the second session at least based on the comparison.

37. An apparatus as recited in claim 29, wherein the processor further to identify materials to be presented in the first window at least in view of subsequent measurements from the sensor regarding the user, wherein the subsequent measurements to indicate the user is paying attention to the first window, with the subsequent measurements being subsequent to the other measurements.

38. An apparatus as recited in claim 29, wherein the processor further to continue on, in the first window of the display, with a presentation that was stopped, at least in view of subsequent measurements from the sensor regarding the user, wherein the subsequent measurements to suggest the user is paying attention to the first window, with the subsequent measurements being subsequent to the other measurements.

39. An apparatus as recited in claim 29, wherein the processor to: use at least a rule to identify what is to be presented by the display; and self-adapt to change the rule subsequent to the using of the rule, without requiring additional measurement from the user, so as to affect what is to be presented by the display.

40. An apparatus as recited in claim 29, wherein the imaging sensor and the display to have a spatial relationship based at least partially on the imaging sensor to be connected and adjacent to the display.

41. An apparatus as recited in claim 29, wherein the processor further to determine at least another behavior of the user to be for another window of the display so as to affect what is to be presented by the display.

42.
An apparatus as recited in claim 33, wherein measurements from one of the sensor or the another sensor to include measurements regarding the head of the user, and wherein measurements from the other one of the sensor or the another sensor to include measurements of the user, but not of the head of the user.

43. An apparatus as recited in claim 42, wherein the measurements regarding the head of the user to include measurements regarding at least an eye of the user.

44. An apparatus as recited in claim 43, wherein the processor further to determine whether the user is looking away from the display based on a behavior of the user.

45. An apparatus as recited in claim 43, wherein the processor further to determine at least a speed at which the user views materials on the display so as to affect what is to be presented by the display.

46. An apparatus as recited in claim 45, the processor further to identify materials for presentation in the first window at least in view of subsequent measurements from the apparatus regarding the user, wherein the subsequent measurements to provide an indication regarding the user is paying attention to the display, with the subsequent measurements being subsequent to the other measurements.

47. An apparatus as recited in claim 45, wherein the imaging sensor and the display to have a spatial relationship based at least partially on the imaging sensor to be connected and adjacent to the display.

48. An apparatus as recited in claim 47, wherein the processor further to determine that at least another behavior of the user to be for another window of the display so as to affect what is to be presented by the display.

49. An apparatus as recited in claim 48, wherein the processor to: use at least a rule to identify what is to be presented by the display; and self-adapt to change the rule subsequent to the using of the rule, without requiring additional measurement from the user, so as to affect what is to be presented by the display.

50.
A non-transitory computer readable storage medium comprising a plurality of instructions, the plurality of instructions executable by an apparatus to result in the apparatus: acquiring measurements regarding a behavior of a user, the measurements from sensing a feature of the user by an imaging sensor of the apparatus, the sensor being detached from the feature of the user at least when sensing the feature; analyzing the measurements; determining, from the analyzing the measurements, at least the behavior being associated with a first window of a display, with the display, in a window environment, to display at least a second window; identifying materials for presentation to the user in the first window of the display at least in view of the behavior being associated with the first window, wherein at least the imaging sensor and the display to have a spatial relationship to help the determining; further determining that the user changing from paying attention to not paying attention to the first window at least based on other measurements regarding at least another behavior of the user; and changing materials to be presented in the first window of the display at least in view of the further determining.

51. A computer-implemented method as recited in claim 10, wherein measurements from one of the sensor or the another sensor include an image regarding the head of the user, wherein measurements from the other one of the sensor or the another sensor include measurements of the user, but not of the head of the user, wherein the determining includes determining at least a speed at which the user views materials on the display to help the identifying, and wherein the imaging sensor and the display to have a spatial relationship based at least partially on the imaging sensor being connected and adjacent to the display.

52.
A computer-implemented method as recited in claim 51, wherein the method further comprises: using, by the apparatus, at least a rule to identify what is to be presented by the display; and self-adapting, by the apparatus, to change the rule subsequent to the using of the rule, without requiring additional measurement from the user, so as to affect what is to be presented by the display.

53. An apparatus as recited in claim 33, wherein measurements from one of the sensor or the another sensor of the apparatus to include an image regarding the head of the user, wherein measurements from the other one of the sensor or the another sensor to include measurements of the user, but not of the head of the user, wherein the to determine includes to determine at least a speed at which the user views materials on the display, and wherein the imaging sensor and the display to have a spatial relationship based at least partially on the imaging sensor to be connected and adjacent to the display.

54. An apparatus as recited in claim 53, wherein the processor to: use at least a rule to identify what is to be presented by the display; and self-adapt to change the rule subsequent to the using of the rule, without requiring additional measurement from the user, so as to affect what is to be presented by the display.

55. A computer-implemented method as recited in claim 1 further comprising determining, by the apparatus, based at least partially on measurements by the apparatus, a heart beat of the user to identify materials for presentation via the display.

56. A computer-implemented method as recited in claim 24 further comprising determining, by the apparatus, based at least partially on measurements by the apparatus, a heart beat of the user to identify materials for presentation via the display.

57. An apparatus as recited in claim 29, wherein the processor further to determine a heart beat of the user to identify materials for presentation via the display.

58.
An apparatus as recited in claim 47, wherein the processor further to determine a heart beat of the user to identify materials for presentation via the display.

59. An apparatus as recited in claim 29, wherein the processor further, after to make another determination, to wait for a preset amount of time before to change materials to be presented in the first window of the display.

60. An apparatus as recited in claim 32, wherein the speed to include a speed of a movement of the user, and wherein the movement of the user to include a movement of at least one of the eyes of the user.

61. A computer-implemented method as recited in claim 5, wherein the vertical distance of the user includes a vertical distance of the face of the user.

62. A computer-implemented method as recited in claim 6, wherein the horizontal distance of the user includes a horizontal distance of the face of the user.

63. A computer-implemented method as recited in claim 1, wherein the method further comprises, after determining the user changing from paying attention to not paying attention to the first window, waiting for a preset amount of time before changing, by the apparatus, materials to be presented in the first window of the display.

64. A non-transitory computer readable storage medium as recited in claim 50, wherein the plurality of instructions executable by the apparatus to further result in the apparatus, after determining the user changing from paying attention to not paying attention to the first window, waiting for a preset amount of time before changing materials to be presented in the first window of the display.

65. A computer-implemented method as recited in claim 8, wherein the speed includes a speed of a movement of the user, and wherein the movement of the user includes a movement of at least one of the eyes of the user.

66.
A computer-implemented method as recited in claim 22, wherein the method further comprises performing a calibration process on the user, and wherein the determining at least a speed depends at least on the calibration process.

67. An apparatus as recited in claim 45, wherein the processor further to perform a calibration process on the user, and wherein the to determine the at least a speed depends at least on the calibration process.

68. A computer-implemented method as recited in claim 22, wherein the determining at least a speed helps identify the user's paying of attention on materials on the display.

69. An apparatus as recited in claim 45, wherein the to determine at least a speed to help identify the user's paying of attention on materials on the display.

70. An apparatus as recited in claim 29, wherein the processor further to link digital content based on at least some of the measurements to an identity of the user.

71. An apparatus as recited in claim 29, wherein the processor to retrieve at least a portion of the materials for presentation to the user by the display, via a network external to the apparatus, with the network including the Internet.

72. An apparatus as recited in claim 36 wherein the processor to identify certain materials for presentation by the display to base at least partially on measurements regarding the user in the first session, additional measurements regarding the user in the second session, and subsequent additional measurements from the sensor regarding the user in a third session.

73. An apparatus as recited in claim 36 further comprising another sensor to sense another feature of the user to produce measurements to be analyzed by the processor to identify materials for presentation in the first window.

74. An apparatus as recited in claim 73, wherein the materials for presentation to be identified at least via the another sensor to sense the another feature includes to ask the user to respond to an inquiry.

75.
An apparatus as recited in claim 73, wherein the processor to retrieve at least a portion of the materials for presentation in the second session, via a network external to the apparatus, with the network including the Internet.

76. An apparatus as recited in claim 75, wherein the to determine includes to determine at least a speed at which the user views materials on the display.

77. An apparatus as recited in claim 76, wherein the processor further to determine at least another behavior of the user to be for another window of the display so as to affect what is to be presented by the display.

78. An apparatus as recited in claim 77, wherein the to identify materials in the second session includes to identify training materials for a product for presentation to the user.

79. An apparatus as recited in claim 77, wherein the processor further to determine measurements regarding the user, change materials to be presented in the first window at least in view of the determination to be made, and continue on, in the first window of the display, with a presentation that was stopped, at least in view of subsequent measurements regarding the user, wherein the subsequent measurements to indicate the user is paying attention to the first window.

80.
A non-transitory computer readable storage medium comprising a plurality of instructions, the plurality of instructions executable by an apparatus to result in the apparatus: acquiring measurements regarding a behavior of a user, the measurements from sensing a feature of the user by an imaging sensor of the apparatus, the sensor being detached from the feature of the user at least when sensing the feature; analyzing the measurements; determining, from the analyzing the measurements, at least a position in a first window of a display, with the display, in a window environment, to display at least a second window, wherein at least the imaging sensor and the display to have a spatial relationship to help the determining; identifying materials for presentation to the user in the first window of the display at least in view of the determining the position in the first window; linking digital content based on at least some of the measurements to an identity of the user, with the at least some of the measurements being acquired in a first session; accessing, in a second session, the digital content based on at least some of the measurements, at least based on the digital content being linked to the identity of the user; comparing the digital content based on at least some of the measurements with digital content based on at least some additional measurements from the sensor in the second session, the additional measurements regarding at least a behavior of the user in the second session; and identifying materials for presentation in the first window in the second session at least in view of the comparing.

81.
A non-transitory computer readable storage medium as recited in claim 50, wherein the plurality of instructions executable by the apparatus to further result in the apparatus acquiring measurements regarding at least another behavior of the user from sensing another feature of the user by another sensor of the apparatus, to identify materials for presentation in the first window of the display.

82. A non-transitory computer readable storage medium as recited in claim 50, wherein the plurality of instructions executable by the apparatus to further result in the apparatus linking digital content based on at least some of the measurements to an identity of the user.

83. A non-transitory computer readable storage medium as recited in claim 50, wherein the plurality of instructions executable by the apparatus to further result in the apparatus retrieving at least a portion of the materials for presentation to the user by the display, via a network external to the apparatus, with the network including the Internet.

84. A non-transitory computer readable storage medium as recited in claim 50, wherein the determining includes determining at least a speed at which the user views materials on the display.

85. A non-transitory computer readable storage medium as recited in claim 50, wherein the plurality of instructions executable by the apparatus to further result in the apparatus determining at least another behavior of the user to be for another window of the display to affect what is to be presented by the display.

86. A non-transitory computer readable storage medium as recited in claim 80 wherein certain materials for presentation by the display to base at least in part on measurements regarding the user in the first session, additional measurements regarding the user in the second session, and subsequent additional measurements from the sensor regarding the user in a third session.

87.
A non-transitory computer readable storage medium as recited in claim 50, wherein the plurality of instructions executable by the apparatus to further result in the apparatus continuing on, in the first window of the display, with a presentation that was stopped, at least in view of subsequent measurements regarding the user, wherein the subsequent measurements to indicate the user paying attention to the first window, with the subsequent measurements being subsequent to the other measurements.

88. A non-transitory computer readable storage medium as recited in claim 81, wherein measurements from one of the sensor or the another sensor to include an image regarding the head of the user, and wherein measurements from the other one of the sensor or the another sensor to include measurements of the user, but not of the head of the user.

89. A non-transitory computer readable storage medium as recited in claim 80, wherein the plurality of instructions executable by the apparatus to further result in the apparatus acquiring measurements regarding at least another behavior of the user from sensing another feature of the user by another sensor of the apparatus, to identify materials for presentation in the first window of the display.

90. A non-transitory computer readable storage medium as recited in claim 89, wherein the materials for presentation identified at least via the another sensor sensing the another feature includes asking the user to respond to an inquiry.

91. A non-transitory computer readable storage medium as recited in claim 89, wherein the plurality of instructions executable by the apparatus to further result in the apparatus retrieving at least a portion of the materials for presentation in the second session, via a network external to the apparatus, with the network including the Internet.

92.
A non-transitory computer readable storage medium as recited in claim 91, wherein the plurality of instructions executable by the apparatus to further result in the apparatus determining at least a speed at which the user views materials on the display to affect what is to be presented by the display.

93. A non-transitory computer readable storage medium as recited in claim 92, wherein the plurality of instructions executable by the apparatus to further result in the apparatus determining at least another behavior of the user to be for another window of the display to affect what is to be presented by the display.

94. A non-transitory computer readable storage medium as recited in claim 93, wherein the plurality of instructions executable by the apparatus to further result in the apparatus identifying training materials for a product for presentation to the user in view of determining by the apparatus at least another behavior of the user for the first window of the display.

95. A non-transitory computer readable storage medium as recited in claim 93, wherein the plurality of instructions executable by the apparatus to further result in the apparatus determining the user not paying attention to the first window at least based on measurements regarding the user, changing materials for presentation in the first window at least in view of the determining, and continuing on, in the first window of the display, with a presentation that was stopped, at least in view of subsequent measurements regarding the user, wherein the subsequent measurements to indicate the user paying attention to the first window.

96. A computer-implemented method as recited in claim 22, wherein the materials for presentation identified at least via the another sensor sensing the another feature includes asking the user to respond to an inquiry.

97.
A computer-implemented method as recited in claim 22, wherein the identifying materials for presentation at least via the imaging sensor sensing the feature depends at least on comparing the speed with a reference speed.

98. A computer-implemented method as recited in claim 97, wherein the reference speed comprises a dynamic reference speed.

99. A computer-implemented method as recited in claim 97, wherein the reference speed depends at least on a speed of another user.

100. An apparatus as recited in claim 45, wherein the materials for presentation to be identified at least via the another sensor to sense the another feature includes to ask the user to respond to an inquiry.

101. An apparatus as recited in claim 45, wherein the materials for presentation to be identified at least via the imaging sensor to sense the feature depends at least on the processor to compare the speed with a reference speed.

102. An apparatus as recited in claim 101, wherein the reference speed comprises a dynamic reference speed.

103. An apparatus as recited in claim 101, wherein the reference speed depends at least on a speed of another user.

104. A computer-implemented method as recited in claim 1, wherein the changing materials to be presented includes linking to a web location to be presented by the display.

105. A computer-implemented method as recited in claim 1, wherein the method further comprises acquiring certain measurements, by the apparatus, regarding the user; and not using, by the apparatus, the certain measurements to identify what to present in the first window of the display, in view of the further determining of the user changing from paying attention to not paying attention.
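Claims 87 and 95 recite stopping a presentation in a window when measurements indicate the user is not paying attention, and continuing the stopped presentation when subsequent measurements indicate attention has returned. The claims do not specify any implementation; the following is a minimal illustrative sketch of that pause/resume state machine, where the class name, method names, and the boolean `attending` signal are all hypothetical and not taken from the patent.

```python
class AttentionPresenter:
    """Hypothetical sketch of the claimed behavior: pause the materials
    presented in a window when the user stops paying attention, and
    continue the stopped presentation where it left off when subsequent
    measurements indicate attention has returned."""

    def __init__(self):
        self.playing = True
        self.position = 0  # index into the materials being presented

    def on_measurement(self, attending: bool) -> str:
        """React to one analyzed measurement of the user's behavior."""
        if self.playing and not attending:
            self.playing = False   # stop the presentation where it is
            return "paused"
        if not self.playing and attending:
            self.playing = True    # continue the stopped presentation
            return "resumed"
        return "unchanged"

    def tick(self):
        """Advance the presentation only while the user is attending."""
        if self.playing:
            self.position += 1
```

In this sketch the presentation position is retained across the pause, so resuming "continues on with the presentation that was stopped" rather than restarting it, which is the distinction claims 87 and 95 draw.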
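Claims 97 through 103 recite identifying materials by comparing the user's viewing speed with a reference speed, where the reference may be dynamic and may depend on a speed of another user. As a sketch only, the comparison could look like the function below; the threshold factor, the peer-average reference, and the material labels are assumptions for illustration, not limitations from the claims.

```python
def select_materials(user_speed: float, peer_speeds: list[float],
                     slow_factor: float = 0.75) -> str:
    """Illustrative sketch: pick materials by comparing the user's
    viewing speed against a dynamic reference speed derived from the
    speeds of other users (one way a reference could 'depend at least
    on a speed of another user')."""
    # Dynamic reference: recomputed from whatever peer speeds are known.
    reference = sum(peer_speeds) / len(peer_speeds)
    if user_speed < slow_factor * reference:
        return "remedial"   # user views markedly slower than the reference
    if user_speed > reference / slow_factor:
        return "advanced"   # user views markedly faster than the reference
    return "standard"
```

A static reference speed would simply replace the peer average with a constant; the claims cover both, with claim 98/102 singling out the dynamic case.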