Country / Type | United States (US) Patent, Granted
---|---
International Patent Classification (IPC, 7th ed.) |
Application No. | US-0724641 (2015-05-28)
Registration No. | US-9785630 (2017-10-10)
Inventor / Address |
Applicant / Address |
Agent / Address |
Citation Info | Cited by: 1 / References cited: 1885
Systems and processes are disclosed for predicting words in a text entry environment. Candidate words and probabilities associated therewith can be determined by combining a word n-gram language model and a unigram language model. Using the word n-gram language model, based on previously entered words, candidate words can be identified and a probability can be calculated for each candidate word. Using the unigram language model, based on a character entered for a new word, candidate words beginning with the character can be identified along with a probability for each candidate word. In some examples, a geometry score can be included in the unigram probability related to typing geometry on a virtual keyboard. The probabilities of the n-gram language model and unigram model can be combined, and the candidate word or words having the highest probability can be displayed for a user.
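As a rough illustration of the combination the abstract describes, the Python sketch below scores candidates by multiplying a word n-gram probability (given the previous word) with a prefix-filtered unigram probability. The toy model tables, the unseen-word floor, and the `predict` helper are hypothetical placeholders for illustration only; they are not part of the patent, and the geometry-score term is omitted.

```python
# Hypothetical toy data standing in for trained language models.
# Bigram model: P(word | previous word), keyed by the context word.
BIGRAM = {
    "the": {"quick": 0.4, "question": 0.3, "queen": 0.3},
}

# Unigram model: P(word), independent of context.
UNIGRAM = {"quick": 0.05, "question": 0.03, "queen": 0.01, "zebra": 0.02}

def predict(previous_word, typed_prefix, top_k=1):
    """Combine n-gram and unigram probabilities for prefix-matching words.

    Each candidate's combined score is the product of its n-gram
    probability given the previous word and its unigram probability,
    restricted to words that start with the typed character(s).
    """
    context = BIGRAM.get(previous_word, {})
    combined = {}
    for word, p_uni in UNIGRAM.items():
        if not word.startswith(typed_prefix):
            continue  # prune candidates whose prefix doesn't match
        p_ngram = context.get(word, 1e-6)  # small floor for unseen words
        combined[word] = p_ngram * p_uni   # product of the two model scores
    return sorted(combined, key=combined.get, reverse=True)[:top_k]

print(predict("the", "qu"))  # → ['quick']
```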
1. A method for predicting words, the method comprising: at an electronic device: receiving typed input from a user, wherein the typed input comprises a character associated with a new word; determining, using a word n-gram model, a first probability of a predicted word based on a previously entered word in the typed input; determining, using a unigram model, a second probability of the predicted word based on the character associated with the new word in the typed input, wherein the second probability is determined based on a geometry score associated with the typed input; determining a combined probability of the predicted word based on the first probability and the second probability; and causing the predicted word to be displayed based on the combined probability.

2. The method of claim 1, wherein determining the first probability comprises: determining a set of predicted words and associated probabilities based on the previously entered word in the typed input.

3. The method of claim 2, further comprising: generating a subset of the set of predicted words by removing words from the set based on the character associated with the new word in the typed input.

4. The method of claim 3, wherein the subset of predicted words comprises words having prefixes that comprise the character associated with the new word in the typed input; and wherein the subset of predicted words comprises the predicted word.

5. The method of claim 1, wherein the geometry score is determined based on a key selection.

6. The method of claim 5, wherein the geometry score comprises a likelihood of a sequence of characters given the key selection.

7. The method of claim 5, wherein the key selection comprises key selection on a virtual keyboard.

8. The method of claim 1, wherein determining the second probability comprises: determining a set of predicted words and associated probabilities based on the character associated with the new word in the typed input.

9. The method of claim 8, wherein the set of predicted words comprises words having prefixes that comprise the character associated with the new word in the typed input; and wherein the set of predicted words comprises the predicted word.

10. The method of claim 1, wherein determining the second probability comprises: traversing a unigram trie to determine the second probability.

11. The method of claim 1, wherein determining the first probability comprises: determining the first probability based on a sequence of previously entered words.

12. The method of claim 1, wherein determining the second probability comprises: determining the second probability based on a sequence of characters in the typed input associated with the new word.

13. The method of claim 1, wherein determining the combined probability comprises: determining the product of the first probability and the second probability.

14. The method of claim 1, wherein the electronic device comprises a phone, a desktop computer, a laptop computer, a tablet computer, a television, a television set top box, or a wearable electronic device.

15. A non-transitory computer-readable storage medium comprising instructions for causing one or more processors to: receive typed input from a user, wherein the typed input comprises a character associated with a new word; determine, using a word n-gram model, a first probability of a predicted word based on a previously entered word in the typed input; determine, using a unigram model, a second probability of the predicted word based on the character associated with the new word in the typed input, wherein the second probability is determined based on a geometry score associated with the typed input; determine a combined probability of the predicted word based on the first probability and the second probability; and cause the predicted word to be displayed based on the combined probability.

16. A system comprising: one or more processors; memory; one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for: receiving typed input from a user, wherein the typed input comprises a character associated with a new word; determining, using a word n-gram model, a first probability of a predicted word based on a previously entered word in the typed input; determining, using a unigram model, a second probability of the predicted word based on the character associated with the new word in the typed input, wherein the second probability is determined based on a geometry score associated with the typed input; determining a combined probability of the predicted word based on the first probability and the second probability; and causing the predicted word to be displayed based on the combined probability.

17. The computer-readable storage medium of claim 15, wherein determining the first probability comprises: determining a set of predicted words and associated probabilities based on the previously entered word in the typed input.

18. The computer-readable storage medium of claim 17, further comprising instructions for causing the one or more processors to: generate a subset of the set of predicted words by removing words from the set based on the character associated with the new word in the typed input.

19. The system of claim 16, wherein determining the first probability comprises: determining a set of predicted words and associated probabilities based on the previously entered word in the typed input.

20. The system of claim 19, wherein the one or more programs further include instructions for: generating a subset of the set of predicted words by removing words from the set based on the character associated with the new word in the typed input.
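Claim 10 recites traversing a unigram trie to determine the second probability. A minimal sketch of such a structure is shown below; the node layout, word list, and probability values are hypothetical and stand in for whatever representation an actual implementation would use.

```python
class TrieNode:
    """A trie node; `prob` holds a unigram probability if a word ends here."""
    def __init__(self):
        self.children = {}
        self.prob = None

def insert(root, word, prob):
    """Walk/create the path for `word` and store its probability at the end."""
    node = root
    for ch in word:
        node = node.children.setdefault(ch, TrieNode())
    node.prob = prob

def words_with_prefix(root, prefix):
    """Traverse to the prefix node, then collect all (word, prob) below it."""
    node = root
    for ch in prefix:
        if ch not in node.children:
            return []  # no word in the model starts with this prefix
        node = node.children[ch]
    results, stack = [], [(node, prefix)]
    while stack:
        cur, word = stack.pop()
        if cur.prob is not None:
            results.append((word, cur.prob))
        for ch, child in cur.children.items():
            stack.append((child, word + ch))
    return sorted(results, key=lambda pair: -pair[1])

root = TrieNode()
for w, p in [("quick", 0.05), ("question", 0.03), ("queen", 0.01)]:
    insert(root, w, p)
print(words_with_prefix(root, "qu"))
```

Restricting the traversal to the subtree under the typed prefix is what realizes claims 4 and 9: every word returned necessarily has the entered character(s) as a prefix.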
Copyright KISTI. All Rights Reserved.