[US Patent]
Training parsers to approximately optimize NDCG
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition): G06F-007/00, G06F-017/30
Application number: US-0962751 (2010-12-08)
Registration number: US-8473486 (2013-06-25)
Inventors / Address:
He, Xiaodong
Gao, Jianfeng
Gillenwater, Jennifer
Applicant / Address: Microsoft Corporation
Agent / Address: Turk IP Law, LLC
Citation information:
Times cited: 0
Patents cited: 3
Abstract
A supervised technique uses relevance judgments to train a dependency parser such that it approximately optimizes Normalized Discounted Cumulative Gain (NDCG) in information retrieval. A weighted tree edit distance between the parse tree for a query and the parse tree for a document is added to a ranking function, where the edit distance weights are parameters from the parser. Using parser parameters in the ranking function enables approximate optimization of the parser's parameters for NDCG by adding some constraints to the objective function.
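As an illustration of the idea in the abstract, the sketch below adds a weighted tree edit distance to a base ranking score, with per-operation costs drawn from a table of parser parameters. Everything here (the function names, the `alpha` weight, and the toy parameter values) is an assumption for illustration, and the node-set comparison is a crude stand-in rather than a true tree edit distance as the patent would compute; it is a minimal sketch, not the patent's implementation.

```python
# Illustrative sketch only: mirrors the abstract's idea of adding a weighted
# tree edit distance to a ranking function, with edit weights drawn from
# parser parameters. The node-set approximation below is NOT a real tree edit
# distance; it just keeps the example self-contained and runnable.

def edit_cost(op, word, parser_params):
    # In the patent, each insertion/deletion/substitution cost is a function of
    # the parser parameters that produced the node; a flat lookup stands in here.
    return parser_params.get((op, word), 1.0)

def approx_tree_edit_distance(query_nodes, doc_nodes, parser_params):
    # Crude stand-in: words only in the query tree must be "deleted",
    # words only in the document tree must be "inserted".
    q, d = set(query_nodes), set(doc_nodes)
    deletions = sum(edit_cost("delete", w, parser_params) for w in q - d)
    insertions = sum(edit_cost("insert", w, parser_params) for w in d - q)
    return deletions + insertions

def ranking_score(base_score, query_nodes, doc_nodes, parser_params, alpha=0.5):
    # Smaller edit distance (more similar parses) -> higher ranking score.
    return base_score - alpha * approx_tree_edit_distance(query_nodes, doc_nodes, parser_params)

# Toy usage with hypothetical values:
params = {("delete", "cheap"): 0.2, ("insert", "affordable"): 0.3}
score = ranking_score(
    base_score=1.0,
    query_nodes=["cheap", "flights", "boston"],
    doc_nodes=["affordable", "flights", "boston"],
    parser_params=params,
)
print(round(score, 2))  # 0.75
```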
Representative claims
1. A method to be executed at least in part in a computing device for training a parser to optimize search operations, the method comprising: receiving a query and a plurality of returned documents with relevance judgments; parsing the query and the documents to obtain parse trees; computing tree edit distances for each query-document pair based on the parse trees; incorporating the tree edit distances to a ranking function employed for ranking the documents for the received query; determining updated parser parameters from the ranking function; and updating the parser with the updated parser parameters in an iterative manner until a predefined threshold is reached.

2. The method of claim 1, further comprising: initializing the parser parameters heuristically prior to a first parsing operation.

3. The method of claim 1, further comprising: collecting counts for the parser parameter gradient descent for determining the updated parser parameters.

4. The method of claim 1, wherein the parser parameters are updated to optimize a Normalized Discounted Cumulative Gain (NDCG) of an information retrieval engine.

5. The method of claim 1, wherein a tree edit distance between a query tree and a corresponding document tree is a combination of at least one from a set of: node insertion, deletion, and substitution operations.

6. The method of claim 5, wherein a cost associated with each of the operations is a function of the parsing parameters that created respective nodes associated with the node insertion, deletion, and substitution operations.

7. The method of claim 1, wherein the ranking function includes a RankNet convex cross-entropy objective function and at least one of its derivatives.

8. The method of claim 1, wherein the parsing is performed employing an algorithm that chooses a parent for each child based on a maximum posterior probability of all possible parents for that child.

9. The method of claim 1, wherein the parser parameters include at least one from a set of: root, child, and stop parameters.

10. The method of claim 1, wherein the query is a long query.

11. A computing device for training a parser to optimize search operations, the computing device comprising: a memory storing instructions; a processor coupled to the memory, the processor executing a search engine in conjunction with the instructions stored in the memory, wherein the search engine is configured to: receive a query and a plurality of returned documents with relevance judgments; initialize parser parameters of the parser heuristically; parse the query and the documents to obtain parse trees; compute tree edit distances for each query-document pair based on the parse trees; incorporate the tree edit distances to a ranking function employed for ranking the documents for the received query; collect counts for a parser parameter gradient descent to determine updated parser parameters for optimizing a Normalized Discounted Cumulative Gain (NDCG) of the search engine; and update the parser with the updated parser parameters in an iterative manner until a predefined threshold is reached.

12. The computing device of claim 11, wherein the search engine is further configured to scale the gradient by a selected function of the NDCG.

13. The computing device of claim 11, wherein the search engine employs a dependency model with valence (DMV) for the parser.

14. The computing device of claim 11, wherein the search engine is further configured to: employ at least one from a set of: node insertion, deletion, and substitution operation costs in tree edit distance based ranking of the returned documents.

15. The computing device of claim 14, wherein the search engine is further configured to: employ a cost function conditioned on at least one from a set of: a relevance of words in a returned document, a node's height in a parse tree, and a number of children of a node in a parse tree.

16. The computing device of claim 11, wherein the NDCG for the top L returned documents is defined as: \mathrm{NDCG@}L = \frac{1}{Z} \sum_{i=1}^{L} \frac{2^{v_i} - 1}{\log_2(1 + i)}, where v is a vector of relevance labels corresponding to the returned documents and Z is a normalization factor substantially equal to an ideal NDCG at cutoff L ("INDCG@L").

17. The computing device of claim 11, wherein the parser examines at least one from a set of: a title, a body text, and a metadata of each returned document.

18. A computer-readable storage medium with instructions stored thereon for supervised training of a dependency parser to optimize search operations, the instructions comprising: receiving a query and a plurality of returned documents with relevance judgments; initializing parser parameters of the parser heuristically; parsing the query and the documents to obtain parse trees; computing tree edit distances for each query-document pair based on the parse trees, wherein each tree edit distance between a query tree and a corresponding document tree is a combination of at least one from a set of: node insertion, deletion, and substitution operations; incorporating the tree edit distances to a ranking function employed for ranking the documents for the received query; collecting counts for a parser parameter gradient descent to determine updated parser parameters for optimizing a Normalized Discounted Cumulative Gain (NDCG) of the search engine; and updating the parser with the updated parser parameters in an iterative manner until a predefined threshold for NDCG optimization is reached.

19. The computer-readable medium of claim 18, wherein the instructions further comprise: adding at least one of a normalization constraint and a positivity constraint to a cross-entropy based convex objective function to ensure that the parser parameters normalize to approximately 1 and stay non-negative.

20. The computer-readable medium of claim 19, wherein the objective function includes at least one from a set of: a root parsing model parameter, a child parsing model parameter, a stop parsing model parameter, a vocabulary of observed words in the returned documents, and a set of the returned documents for the query.
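Claim 16 spells out the NDCG@L formula used as the optimization target. A minimal Python sketch of that computation follows; the relevance labels and cutoff in the usage line are hypothetical.

```python
import math

def ndcg_at_l(relevance_labels, cutoff):
    """NDCG@L per claim 16: (1/Z) * sum_{i=1..L} (2^{v_i} - 1) / log2(1 + i),
    where Z is the ideal DCG at cutoff L (the "INDCG@L" normalization)."""
    def dcg(labels):
        return sum((2 ** v - 1) / math.log2(1 + i)
                   for i, v in enumerate(labels[:cutoff], start=1))
    ideal = dcg(sorted(relevance_labels, reverse=True))  # Z: DCG of the ideal ordering
    return dcg(relevance_labels) / ideal if ideal > 0 else 0.0

# Hypothetical relevance labels for returned documents, in ranked order.
print(round(ndcg_at_l([3, 2, 3, 0, 1, 2], cutoff=6), 4))
```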
Cited patents:
Daily, Daniel J.; Gentle, Christopher R.; Kawahara, Lisa Y.; Maity, Ashis; Thomas, Michael J., Computer mediated natural language based communication augmented by arbitrary and flexibly assigned personality classification systems.
Bengio, Yoshua; Bottou, Leon; LeCun, Yann Andre, Module for constructing trainable modular network in which each module inputs and outputs data structured as a graph.