Country / Type | United States (US) Patent, Granted |
---|---|
IPC (7th edition) | |
Application No. | US-0503370 (2014-09-30) |
Registration No. | US-9606986 (2017-03-28) |
Inventors / Address | |
Applicant / Address | |
Attorney / Address | |
Citation Info | Times cited: 0, Patents cited: 393 |
Systems and processes for discourse input processing are provided. In one example process, a discourse input can be received from a user. An integrated probability of a candidate word in the discourse input and one or more subclasses associated with the candidate word can be determined based on a conditional probability of the candidate word given one or more words in the discourse input, a probability of the candidate word within a corpus, and a conditional probability of the candidate word given one or more classes associated with the one or more words. A text string corresponding to the discourse input can be determined based on the integrated probability. An output based on the text string can be generated.
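The abstract (together with claim 7) describes scoring each candidate word by combining three model outputs: the contextual probability P(w | history), the corpus probability P(w), and the class probability P(w | classes), with the class probability divided by the corpus probability. A minimal sketch of that scoring rule, using illustrative toy numbers rather than any real model:

```python
# Hedged sketch of the integrated-probability scoring described in the
# abstract: combine (1) a context model P(w | history), (2) a corpus
# unigram model P(w), and (3) a class model P(w | classes), scoring each
# candidate as P(w | history) * P(w | classes) / P(w), per claim 7.
# All probability values below are illustrative, not real model output.

def integrated_probability(p_word_given_history: float,
                           p_word_in_corpus: float,
                           p_word_given_classes: float) -> float:
    """Weight the contextual probability by the class-to-corpus ratio."""
    return p_word_given_history * p_word_given_classes / p_word_in_corpus

# Toy candidates for the next word after a partial utterance.
candidates = {
    # word: (P(w | history), P(w), P(w | classes))
    "music": (0.20, 0.010, 0.050),
    "games": (0.15, 0.012, 0.006),
}

scores = {w: integrated_probability(*p) for w, p in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))  # "music" wins: its class probability
# exceeds its corpus baseline, so the class model boosts it.
```

Note how the division by P(w) makes the class model's influence weaker for words that are already common in the corpus, which matches claim 8.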
1. A method for processing discourse input comprising: at an electronic device with a processor and memory storing one or more programs for execution by the processor: receiving a discourse input from a user; determining a text string corresponding to the discourse input, wherein determining the text string comprises: determining, using a first language model, a conditional probability of a candidate word in the discourse input given one or more words in the discourse input; determining, using a second language model, a probability of the candidate word within a corpus; determining, using a third language model, a conditional probability of the candidate word given one or more classes associated with the one or more words; and determining an integrated probability of the candidate word and one or more subclasses associated with the candidate word based on: the conditional probability of the candidate word given the one or more words; the probability of the candidate word within the corpus; and the conditional probability of the candidate word given the one or more classes, wherein the text string is based on the integrated probability; and generating an output based on the text string.

2. The method of claim 1, wherein the integrated probability is a joint probability of the candidate word and the one or more subclasses associated with the candidate word given the one or more words and the one or more classes.

3. The method of claim 1, wherein the integrated probability is determined based on assuming that a conditional probability of a subclass associated with the candidate word given the candidate word, the one or more words, and the one or more classes is equal to a conditional probability of the subclass associated with the candidate word given the candidate word and the one or more classes.

4. The method of claim 1, wherein the integrated probability is determined based on assuming that a conditional probability of the candidate word given a subclass associated with the candidate word and the one or more classes is equal to the conditional probability of the candidate word given the subclass associated with the candidate word.

5. The method of claim 1, wherein the integrated probability is determined based on assuming that a conditional probability of the one or more classes given the candidate word and the one or more words is equal to a conditional probability of the one or more classes given the one or more words.

6. The method of claim 1, wherein a class of the one or more classes includes one or more subclasses.

7. The method of claim 1, wherein the integrated probability is determined at least in part by dividing the conditional probability of the candidate word given the one or more classes by the probability of the candidate word within the corpus.

8. The method of claim 1, wherein the integrated probability is less dependent on the conditional probability of the candidate word given the one or more classes when the probability of the candidate word within the corpus is higher than when the probability of the candidate word within the corpus is lower.

9. The method of claim 1, wherein the first language model and the second language model are a same language model.

10. A method for processing discourse input comprising: at an electronic device with a processor and memory storing one or more programs for execution by the processor: receiving a discourse input from a user; determining a text string corresponding to the discourse input, wherein determining the text string comprises: determining, using a first language model, a conditional probability of a candidate word in the discourse input given one or more words in the discourse input; and applying a weight to the conditional probability of the candidate word given the one or more words to obtain a weighted conditional probability of the candidate word given the one or more words, wherein the weight is based on a conditional probability of the candidate word given one or more classes associated with the one or more words, and wherein the text string is based on the weighted conditional probability of the candidate word; and generating an output based on the text string.

11. The method of claim 10, wherein the weight is further based on a probability of the candidate word within a corpus.

12. The method of claim 11, further comprising: determining, using a second language model, the probability of the candidate word within the corpus.

13. The method of claim 12, wherein the first language model and the second language model are a same language model.

14. The method of claim 11, wherein the weight is inversely proportional to the probability of the candidate word within the corpus.

15. The method of claim 11, wherein the weight comprises the conditional probability of the candidate word given the one or more classes divided by the probability of the candidate word within the corpus.

16. The method of claim 15, wherein the weighted conditional probability of the candidate word comprises a dot product of the weight and the conditional probability of the candidate word given the one or more words.

17. The method of claim 10, further comprising: determining, using a third language model, the conditional probability of the candidate word given the one or more classes.

18. The method of claim 10, wherein the weight is proportional to the conditional probability of the candidate word given the one or more classes.

19. The method of claim 10, wherein the weighted conditional probability of the candidate word is proportional to the conditional probability of the candidate word given the one or more words.

20. The method of claim 10, wherein the weighted conditional probability of the candidate word is based on an assumption that a conditional probability of a subclass associated with the candidate word given the candidate word, the one or more words, and the one or more classes is equal to a conditional probability of the subclass associated with the candidate word given the candidate word and the one or more classes.

21. The method of claim 10, wherein the weighted conditional probability of the candidate word is based on an assumption that a conditional probability of the candidate word given a subclass associated with the candidate word and the one or more classes is equal to a conditional probability of the candidate word given the subclass associated with the candidate word.

22. The method of claim 10, wherein the weighted conditional probability of the candidate word is based on an assumption that a conditional probability of the one or more classes given the candidate word and the one or more words is equal to a conditional probability of the one or more classes given the one or more words.

23. A non-transitory computer-readable storage medium comprising instructions for causing one or more processors to: receiving a discourse input from a user; determining a text string corresponding to the discourse input, wherein determining the text string comprises: determining, using a first language model, a conditional probability of a candidate word in the discourse input given one or more words in the discourse input; and applying a weight to the conditional probability of the candidate word given the one or more words to obtain a weighted conditional probability of the candidate word given the one or more words, wherein the weight is based on a conditional probability of the candidate word given one or more classes associated with the one or more words, and wherein the text string is based on the weighted conditional probability of the candidate word; and generating an output based on the text string.

24. An electronic device comprising: one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for: receiving a discourse input from a user; determining a text string corresponding to the discourse input, wherein determining the text string comprises: determining, using a first language model, a conditional probability of a candidate word in the discourse input given one or more words in the discourse input; and applying a weight to the conditional probability of the candidate word given the one or more words to obtain a weighted conditional probability of the candidate word given the one or more words, wherein the weight is based on a conditional probability of the candidate word given one or more classes associated with the one or more words, and wherein the text string is based on the weighted conditional probability of the candidate word; and generating an output based on the text string.
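Reading claims 10, 11, 14, 15, 16, and 18 together, the weighted formulation can be summarized as follows (writing h for the one or more preceding words and C for their associated classes; this notation is ours, not the patent's):

```latex
% Weight per claim 15: class probability divided by corpus probability.
\mathrm{weight}(w) \;=\; \frac{P(w \mid C)}{P(w)}

% Weighted conditional probability per claims 16, 18, and 19:
% proportional to P(w | C) and to P(w | h), inversely proportional to P(w).
\tilde{P}(w \mid h) \;=\; \mathrm{weight}(w)\cdot P(w \mid h)
\;=\; \frac{P(w \mid C)}{P(w)}\,P(w \mid h)
```

This is the same quantity claim 7 describes for the first method family, restated as a weight applied to the contextual probability.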
Copyright KISTI. All Rights Reserved.