IPC Classification Information

Country/Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition) |
Application Number | US-0374406 (2003-02-26)
Registration Number | US-7483868 (2009-01-27)
Inventors / Address |
Applicant / Address | Computer Associates Think, Inc.
Agent / Address |
Citation Information | Times cited: 2 / Patents cited: 37
Abstract
A method of incrementally forming and adaptively updating a neural net model is provided. A function approximation node is incrementally added to the neural net model. Function parameters for the function approximation node are determined, and function parameters of other nodes in the neural network model are updated by using the function parameters of the other nodes prior to addition of the function approximation node to the neural network model.
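As a concrete illustration of the abstract, the following is a minimal sketch in Python/NumPy, not the patent's implementation: a radial basis function net grows one Gaussian node at a time, and after each addition the output weights of all nodes are refit. The names (IncrementalRBFNet, add_node, predict) are illustrative, and the plain least-squares refit here merely stands in for the patent's incremental update of the other nodes' parameters.

```python
import numpy as np

class IncrementalRBFNet:
    """Illustrative sketch: a neural net model grown one node at a time."""

    def __init__(self):
        self.centers = []    # center of each function approximation node
        self.radii = []      # radius of each node
        self.weights = None  # output weights, refit after each addition

    def _activations(self, X):
        # Gaussian response of every node to every sample; rows are samples.
        cols = [np.exp(-np.sum((X - c) ** 2, axis=1) / (2.0 * r ** 2))
                for c, r in zip(self.centers, self.radii)]
        return np.column_stack(cols)

    def add_node(self, center, radius, X, y):
        # Incrementally add a function approximation node, then refit the
        # output weights of all nodes over the enlarged net (a stand-in for
        # the patent's update of the other nodes' parameters).
        self.centers.append(np.asarray(center, dtype=float))
        self.radii.append(float(radius))
        Phi = self._activations(X)
        self.weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    def predict(self, X):
        return self._activations(X) @ self.weights
```

Per claim 2 of the representative claims below, growth of this kind would continue until the net's accuracy reaches a predetermined level.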
Representative Claims
What is claimed is:

1. A computer-implemented method of incrementally forming and adaptively updating a neural net comprising: (a) using a set of sample data patterns to form a hierarchical list of function approximation node candidates, each function approximation node candidate located at the center of a hierarchically arranged cluster; (b) incrementally adding to the neural net a function approximation node selected from the list of function approximation node candidates; (c) computing function parameters for the function approximation node and updating function parameters of other nodes in the neural network by using the function parameters of the other nodes prior to addition of the function approximation node to the neural network; (d) storing an updated neural net including the function approximation node and the updated function parameters for use during the recognition of one or more patterns in a new set of data; and (e) using the updated neural net to improve the performance of a system, wherein the new set of data comprises data that describes a behavior of the system.

2. The method of claim 1, wherein if an accuracy level of the neural net with the function approximation node added thereto is below a predetermined accuracy level, steps (b) and (c) are repeated.

3. The method of claim 1, wherein the list of function approximation node candidates is formed by splitting the set of sample data patterns into a plurality of clusters in a first level of a cluster hierarchy, determining that a selected cluster in the first level has a population exceeding a predetermined size, and splitting the selected cluster into two or more clusters and replacing the selected cluster with the two or more clusters in a next level of the cluster hierarchy.

4. The method of claim 3, further comprising sorting the clusters on each level of the cluster hierarchy based on cluster size, to form a sorted list of function approximation node candidates.

5. The method of claim 1, wherein the neural network is adaptively updated by incrementally adding one or more additional nodes to the neural net, to represent new data corresponding to a data range not represented in the set of sample data patterns.

6. The method of claim 1, further comprising: monitoring an accuracy level of the neural net while the neural net is used on-line; and adaptively updating the neural net if the accuracy level of the neural net is below a predetermined threshold.

7. The method of claim 6, wherein the adaptive update includes incrementally adding one or more additional nodes to the neural net, to represent new data.

8. The method of claim 7, wherein the new data corresponds to a change in system dynamics.

9. The method of claim 6, wherein the adaptive update includes updating the function parameters of the nodes in the neural net.

10. The method of claim 6, wherein if the adaptive updating reaches a limit, a full retrain of the neural net is performed.

11. The method of claim 1, further comprising adaptively updating the neural net by adding one or more additional nodes to the neural net, based on new data patterns.

12. The method of claim 11, wherein the additional nodes are formed by applying a clustering methodology to the new data patterns.
13. The method of claim 12, wherein the clustering methodology includes clustering the new data patterns into a number of clusters which is approximately a number of the nodes in the neural net; determining that a selected cluster is far away from positions associated with the respective nodes in the neural net; and adding to the neural net an additional node associated with the selected cluster and a center of the selected cluster.

14. The method of claim 11, wherein a set of initial weights is determined for the nodes in the neural net when the neural net is formed, and when the additional nodes are added during adaptive update, a set of new weights for the nodes in the neural net is computed, and the initial weights are combined with the new weights for the nodes based on a forgetting factor.

15. The method of claim 14, wherein the forgetting factor is determined based on a cause of neural net degradation.

16. The method of claim 1, further comprising applying an orthogonal least squares methodology to determine a set of weights for the neural net.

17. The method of claim 16, wherein the set of weights is adaptively updated by using new data patterns.

18. The method of claim 16, wherein the set of weights is updated to compensate for system drift.

19. The method of claim 1, wherein the function parameters for the nodes in the neural net are determined by applying a hierarchical k-means clustering methodology to a set of sample data patterns.

20. The method of claim 1, wherein the function approximation node is a radial basis node, and a center and radius of the radial basis node are determined through a hierarchical k-means clustering methodology.

21. The method of claim 1, wherein the function approximation node is a Gaussian node.

22. The method of claim 1, wherein the function approximation node is a sigmoidal basis node.

23. The method of claim 1, wherein the function approximation node is a wavelet basis node.

24. The method of claim 1, wherein the function approximation node is non-linear.

25. A computer-implemented method of incrementally forming a supervised learning neural net from data in the form of input-output pairs, comprising: applying a hierarchical clustering methodology to a set of sample data patterns to form a list of function approximation node candidates; incrementally adding one or more function approximation nodes to the supervised learning neural net until the supervised learning neural net has an accuracy level at or above a predetermined accuracy level, wherein the function approximation nodes are selected from the list of function approximation node candidates; computing function parameters for the function approximation node and updating function parameters of other nodes in the neural network, by using the function parameters of the other nodes prior to addition of the function approximation node to the neural network; storing an updated supervised learning neural net, including the function approximation node and the updated function parameters, for use during the recognition of one or more patterns in a new set of data; and using the updated neural net to improve the performance of a system, wherein the new set of data comprises data that describes a behavior of the system.
26. A computer system, comprising: a processor; and a program storage device readable by the computer system, tangibly embodying a program of instructions executable by the processor to perform a method of incrementally forming and adaptively updating a supervised learning neural net formed from data in the form of input-output pairs, the method comprising: (a) using a set of sample data patterns to form a hierarchical list of function approximation node candidates, each function approximation node candidate located at the center of a hierarchically arranged cluster; (b) incrementally adding to the supervised learning neural net a function approximation node selected from the list of function approximation node candidates; (c) determining function parameters for the function approximation node and updating function parameters of other nodes in the supervised learning neural network, by using the function parameters of the other nodes prior to addition of the function approximation node to the supervised learning neural network; (d) storing the updated supervised learning neural net including the function approximation node and the updated function parameters for use during the recognition of one or more patterns in a new set of data; and (e) using the updated neural net to improve the performance of a system, wherein the new set of data comprises data that describes a behavior of the system.

27. A program storage device readable by a machine, tangibly embodying a program of instructions executable by the machine to perform a method of incrementally forming and adaptively updating a supervised learning neural net from data in the form of input-output pairs, the method comprising: (a) using a set of sample data patterns to form a hierarchical list of function approximation node candidates, each function approximation node candidate located at the center of a hierarchically arranged cluster; (b) incrementally adding to the supervised learning neural net a function approximation node selected from the list of function approximation node candidates; (c) determining function parameters for the function approximation node and updating function parameters of other nodes in the supervised learning neural network, by using the function parameters of the other nodes prior to addition of the function approximation node to the supervised learning neural network; (d) storing the updated supervised learning neural net including the function approximation node and the updated function parameters for use during the recognition of one or more patterns in a new set of data; and (e) using the updated neural net to improve the performance of a system, wherein the new set of data comprises data that describes a behavior of the system.

28. The method of claim 1, wherein updating the function parameters of other nodes in the neural network comprises computing a set of new weights for each other node in the neural network.

29. The method of claim 1, wherein the hierarchical list of function approximation node candidates comprises a plurality of levels, each level including a plurality of clusters.

30. The method of claim 1, wherein each function approximation node candidate is located at the center of a hierarchically arranged cluster, and each hierarchically arranged cluster comprises a population not exceeding a predetermined size threshold.
31. The method of claim 25, wherein updating the function parameters of the other nodes in the neural network comprises computing a set of new weights for each other node in the neural network.

32. The method of claim 25, wherein the list of function approximation node candidates comprises a plurality of levels, each level including a plurality of clusters.

33. The method of claim 25, wherein each hierarchically arranged cluster comprises a population not exceeding a predetermined size threshold.

34. The method of claim 26, wherein updating the function parameters of the other nodes in the neural network comprises computing a set of new weights for each other node in the neural network.

35. The method of claim 26, wherein the list of function approximation node candidates comprises a plurality of levels, each level including a plurality of clusters.

36. The method of claim 26, wherein each hierarchically arranged cluster comprises a population not exceeding a predetermined size threshold.

37. The program storage device of claim 27, wherein updating the function parameters of the other nodes in the neural network comprises computing a set of new weights for each other node in the neural network.

38. The program storage device of claim 27, wherein the list of function approximation node candidates comprises a plurality of levels, each level including a plurality of clusters.

39. The program storage device of claim 27, wherein each hierarchically arranged cluster comprises a population not exceeding a predetermined size threshold.
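Claims 3, 4, and 19-20 describe forming the candidate list by hierarchical k-means clustering: split the sample data into first-level clusters, split any cluster whose population exceeds a predetermined size into clusters that replace it on the next level, and sort each level by cluster size. The sketch below is one hedged reading of that procedure; the helper _kmeans, the parameters first_level_k and max_population, and the split-into-two policy are assumptions, since this record does not fix them.

```python
import numpy as np

def _kmeans(X, k, iters=20, seed=0):
    # Tiny k-means helper (assumed; any k-means implementation would do).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for i in range(k):
            if np.any(labels == i):
                centers[i] = X[labels == i].mean(axis=0)
    return centers, labels

def candidate_list(X, first_level_k=4, max_population=50):
    """Hierarchical list of function approximation node candidates; each
    candidate is the center of a hierarchically arranged cluster."""
    centers, labels = _kmeans(X, first_level_k)
    level = [(centers[i], X[labels == i]) for i in range(first_level_k)]

    candidates = []
    while level:
        # Sort the clusters on this level by population (claim 4).
        level.sort(key=lambda cm: len(cm[1]), reverse=True)
        candidates.extend(center for center, _ in level)
        next_level = []
        for center, members in level:
            if len(members) > max_population:
                # Split the oversized cluster in two; its children replace
                # it on the next level of the cluster hierarchy (claim 3).
                sub_c, sub_l = _kmeans(members, 2)
                next_level.extend((sub_c[i], members[sub_l == i])
                                  for i in range(2))
        level = next_level
    return candidates
```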
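Claims 14-15 combine the initial weights with newly computed weights based on a forgetting factor chosen by the cause of degradation. The exact combination rule is not given in this record; a convex combination is one natural reading, sketched below with an illustrative function name.

```python
import numpy as np

def combine_weights(initial_w, new_w, forgetting_factor):
    """One plausible reading of claim 14: blend pre-update and new weights.

    forgetting_factor lies in [0, 1]: a value near 1 mostly discards the old
    weights (e.g., after a change in system dynamics, claim 8), while a value
    near 0 mostly preserves them (e.g., for gradual system drift, claim 18).
    """
    lam = float(forgetting_factor)
    return (1.0 - lam) * np.asarray(initial_w) + lam * np.asarray(new_w)
```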
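Claim 16 applies an orthogonal least squares methodology to determine the net's weights. The classical OLS forward-selection procedure (greedy choice of regressors by error reduction ratio, with Gram-Schmidt orthogonalization) is a standard fit for that language; the sketch below assumes that reading, and the name ols_select and its parameters are illustrative, not the patent's.

```python
import numpy as np

def ols_select(P, y, n_select):
    """Greedy orthogonal least squares over candidate regressors.

    P is an (n_samples, n_candidates) matrix whose columns could be the
    activations of candidate function approximation nodes; y is the target.
    """
    n, m = P.shape
    selected, basis = [], []      # chosen column indices, orthogonal basis
    yy = float(y @ y)
    for _ in range(n_select):
        best, best_err, best_w = None, -1.0, None
        for j in range(m):
            if j in selected:
                continue
            w = P[:, j].astype(float)
            for q in basis:       # orthogonalize against chosen columns
                w -= (q @ P[:, j]) / (q @ q) * q
            ww = float(w @ w)
            if ww < 1e-12:        # candidate is (nearly) linearly dependent
                continue
            g = float(w @ y) / ww
            err = g * g * ww / yy  # error reduction ratio of this column
            if err > best_err:
                best, best_err, best_w = j, err, w
        if best is None:
            break
        selected.append(best)
        basis.append(best_w)
    # Final output weights for the selected columns, by least squares.
    weights, *_ = np.linalg.lstsq(P[:, selected], y, rcond=None)
    return selected, weights
```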