| Country / Type | United States (US) Patent, granted |
|---|---|
| International Patent Classification (IPC, 7th ed.) | |
| Application number | US-0063223 (2016-03-07) |
| Registration number | US-9953088 (2018-04-24) |
| Inventor / Address | |
| Applicant / Address | |
| Agent / Address | |
| Citation information | Times cited: 0; Patents cited: 2040 |
A user request is received from a mobile client device, where the user request includes at least a speech input and seeks an informational answer or performance of a task. A failure to provide a satisfactory response to the user request is detected. In response to detection of the failure, information relevant to the user request is crowd-sourced by querying one or more crowd sourcing information sources. One or more answers are received from the crowd sourcing information sources, and the response to the user request is generated based on at least one of the one or more answers received from the one or more crowd sourcing information sources.
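The flow described in the abstract (answer normally, detect an unsatisfactory response, query crowd sourcing sources, build a response from the returned answers) can be sketched in Python. This is a minimal illustration of the described control flow, not the patented implementation; the `Response` type, handler signatures, and fallback messages are all assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Response:
    text: str
    satisfactory: bool  # whether this response satisfies the user request

def crowd_sourced_fallback(request, primary_handler, crowd_sources):
    """Answer `request`; if the primary handler fails to produce a
    satisfactory response, crowd-source an answer instead."""
    # First try the normal (e.g. web-search based) handler.
    primary = primary_handler(request)
    if primary.satisfactory:
        return primary
    # Failure detected: query each crowd sourcing information source.
    answers = []
    for query_source in crowd_sources:
        answers.extend(query_source(request))
    if not answers:
        # Remedial response when no source yields any answer.
        return Response("Sorry, I could not find an answer.", False)
    # Generate the response based on at least one received answer.
    return Response(answers[0], True)
```

In a real assistant the crowd sources would be asynchronous and possibly non-real-time (e.g. expert services that respond later, as in claim 5); the synchronous loop here only mirrors the logical ordering of the steps.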
1. A method for providing a response to a user request, comprising: at an electronic device with one or more processors and memory: receiving a user request, the user request including at least a speech input and seeks an informational answer or performance of a task, wherein: the user request is associated with a detected failure to provide a satisfactory response to the user request; and one or more crowd sourcing information sources relevant to the user request are queried in response to the detected failure to provide a satisfactory response to the user request; and generating a response to the user request based on the one or more answers obtained from querying the one or more crowd sourcing information sources.

2. The method of claim 1, wherein: one or more queries are generated based on the user request; and the one or more queries are sent to the one or more crowd sourcing information sources to obtain the one or more answers.

3. The method of claim 1, wherein: the one or more crowd sourcing information sources are identified from a set of crowd sourcing information sources.

4. The method of claim 1, further comprising: prior to the one or more crowd sourcing information sources being queried: requesting user permission to send the information contained in the user request to the one or more crowd sourcing information sources; and receiving user permission to send the information contained in the user request to the one or more crowd sourcing information sources.

5. The method of claim 1, further comprising: presenting at least one real-time answer to the user request, wherein the real-time answer is obtained from a real-time answer-lookup database; and after presenting the at least one real-time answer, presenting at least one non-real-time answer to the user request, wherein the non-real-time answer is obtained from a non-real-time expert service.

6. The method of claim 1, further comprising: presenting a remedial response to the user request, wherein the remedial response is presented upon failing to obtain any answer from at least one of the one or more crowd sourcing information sources.

7. The method of claim 1, wherein: more than one answer is obtained from the one or more crowd sourcing information sources; and the more than one answer is ranked in accordance with predetermined criteria.

8. The method of claim 1, wherein at least one of the one or more answers from the crowd sourcing information sources is obtained from individual members of the public in non-real-time.

9. The method of claim 1, wherein the detected failure to provide a satisfactory response to the user request comprises a determination that a web-search based on information contained in the user request is unsatisfactory to the user.

10. The method of claim 1, further comprising: prior to generating the response to the user request, providing a second response to the user request; and receiving user input indicating that the second response is unsatisfactory, wherein the detected failure to provide a satisfactory response to the user request comprises the received user input.

11. The method of claim 1, wherein the detected failure to provide a satisfactory response to the user request is based on usage logs associated with a user.

12. The method of claim 1, wherein querying the one or more crowd sourcing information sources includes querying one or more remote sources.

13. A non-transitory computer-readable medium storing instructions, the instructions, when executed by one or more processors, cause the processors to perform operations comprising: receiving a user request, the user request including at least a speech input and seeks an informational answer or performance of a task, wherein: the user request is associated with a detected failure to provide a satisfactory response to the user request; and one or more crowd sourcing information sources relevant to the user request are queried in response to detecting the failure to provide a satisfactory response to the user request; and generating a response to the user request based on the one or more answers obtained from querying the one or more crowd sourcing information sources.

14. The non-transitory computer-readable medium of claim 13, wherein: one or more queries are generated based on the user request; and the one or more queries are sent to the one or more crowd sourcing information sources to obtain the one or more answers.

15. The non-transitory computer-readable medium of claim 13, wherein: the one or more crowd sourcing information sources are identified from a set of crowd sourcing information sources.

16. The non-transitory computer-readable medium of claim 13, wherein the operations further comprise: prior to the one or more crowd sourcing information sources being queried: requesting user permission to send the information contained in the user request to the one or more crowd sourcing information sources; and receiving user permission to send the information contained in the user request to the one or more crowd sourcing information sources.

17. The non-transitory computer-readable medium of claim 13, wherein the operations further comprise: presenting at least one real-time answer to the user request, wherein the real-time answer is obtained from a real-time answer-lookup database; and after presenting the at least one real-time answer, presenting at least one non-real-time answer to the user request, wherein the non-real-time answer is obtained from a non-real-time expert service.

18. The non-transitory computer-readable medium of claim 13, wherein the operations further comprise: presenting a remedial response to the user request, wherein the remedial response is presented upon failing to obtain any answer from at least one of the one or more crowd sourcing information sources.

19. The non-transitory computer-readable medium of claim 13, wherein: more than one answer is obtained from the one or more crowd sourcing information sources; and the more than one answer is ranked in accordance with predetermined criteria.

20. The non-transitory computer-readable medium of claim 13, wherein at least one of the one or more answers from the crowd sourcing information sources is obtained from individual members of the public in non-real-time.

21. The non-transitory computer-readable medium of claim 13, wherein querying the one or more crowd sourcing information sources includes querying one or more remote sources.

22. A system, comprising: one or more processors; and memory storing instructions, the instructions, when executed by the one or more processors, cause the processors to perform operations comprising: receiving a user request, the user request including at least a speech input and seeks an informational answer or performance of a task, wherein: the user request is associated with a detected failure to provide a satisfactory response to the user request; and one or more crowd sourcing information sources relevant to the user request are queried in response to detecting the failure to provide a satisfactory response to the user request; and generating a response to the user request based on the one or more answers obtained from querying the one or more crowd sourcing information sources.

23. The system of claim 22, wherein: one or more queries are generated based on the user request; and the one or more queries are sent to the one or more crowd sourcing information sources to obtain the one or more answers.

24. The system of claim 22, wherein: the one or more crowd sourcing information sources are identified from a set of crowd sourcing information sources.

25. The system of claim 22, wherein the operations further comprise: prior to the one or more crowd sourcing information sources being queried: requesting user permission to send the information contained in the user request to the one or more crowd sourcing information sources; and receiving user permission to send the information contained in the user request to the one or more crowd sourcing information sources.

26. The system of claim 22, wherein the operations further comprise: presenting at least one real-time answer to the user request, wherein the real-time answer is obtained from a real-time answer-lookup database; and after presenting the at least one real-time answer, presenting at least one non-real-time answer to the user request, wherein the non-real-time answer is obtained from a non-real-time expert service.

27. The system of claim 22, wherein the operations further comprise: presenting a remedial response to the user request, wherein the remedial response is presented upon failing to obtain any answer from at least one of the one or more crowd sourcing information sources.

28. The system of claim 22, wherein: more than one answer is obtained from the one or more crowd sourcing information sources; and the more than one answer is ranked in accordance with predetermined criteria.

29. The system of claim 22, wherein at least one of the one or more answers from the crowd sourcing information sources is obtained from individual members of the public in non-real-time.

30. The system of claim 22, wherein querying the one or more crowd sourcing information sources includes querying one or more remote sources.

31. A method for providing a response to a user request, comprising: at one or more electronic devices with one or more processors and memory: receiving a user request, the user request including at least a speech input and seeks an informational answer or performance of a task; detecting a failure to provide a satisfactory response to the user request; in response to detecting the failure, crowd-sourcing information relevant to the user request by querying one or more crowd sourcing information sources; receiving one or more answers from the crowd sourcing information sources; and generating a response to the user request based on at least one of the one or more answers received from the one or more crowd sourcing information sources.
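Several of the claims (7, 19, and 28) require that, when more than one answer is obtained, the answers be ranked "in accordance with predetermined criteria", without specifying what those criteria are. A minimal sketch of one way such ranking could work, assuming weighted numeric criteria such as vote counts and source reputation (the criteria, weights, and function name below are illustrative assumptions, not taken from the patent):

```python
def rank_answers(answers, criteria_weights):
    """Order crowd-sourced answers by a weighted sum over predetermined
    criteria. Each answer is a dict mapping criterion names to numeric
    values (plus a "text" field); missing criteria count as 0."""
    def score(answer):
        return sum(weight * answer.get(criterion, 0.0)
                   for criterion, weight in criteria_weights.items())
    # Highest-scoring answer first.
    return sorted(answers, key=score, reverse=True)
```

The claims only constrain that some predetermined criteria exist; any deterministic scoring scheme (recency, answerer expertise, agreement among sources) would fit the same shape.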
Copyright KISTI. All Rights Reserved.