IPC Classification Information
Country / Type | United States (US) Patent, Granted
IPC (7th edition) |
Application No. | UP-0786949 (2007-04-12)
Registration No. | US-7797259 (2010-10-04)
Inventors / Address |
- Jiang, Qin
- Srinivasa, Narayan
Applicant / Address |
Agent / Address |
Citation Info | Cited by: 13 / Patents cited: 3
Abstract
Described is a system for temporal prediction. The system includes an extraction module, a mapping module, and a prediction module. The extraction module is configured to receive X(1), …, X(n) historical samples of a time series and utilize a genetic algorithm to extract deterministic features in the time series. The mapping module is configured to receive the deterministic features and utilize a learning algorithm to map the deterministic features to a predicted x̂(n+1) sample of the time series. Finally, the prediction module is configured to utilize a cascaded computing structure having k levels of prediction to generate a predicted x̂(n+k) sample. The predicted x̂(n+k) sample is the final temporal prediction for k future samples.
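The three-module pipeline of the abstract can be sketched as a feedback cascade: each of the k prediction levels re-runs feature extraction and mapping on the series extended with the predictions already produced. The `extract` and `mapper` stand-ins below are deliberately trivial placeholders, not the patent's genetic-algorithm extractor or trained learning module:

```python
def cascaded_predict(x, k, extract_features, map_features):
    """Illustrative k-level cascade: level j predicts x(n+j) from the
    historical samples plus the j-1 predictions already produced."""
    series = list(x)                        # X(1), ..., X(n)
    predictions = []
    for _ in range(k):
        features = extract_features(series)  # deterministic (PSR) features
        x_next = map_features(features)      # learned map -> next sample
        predictions.append(x_next)
        series.append(x_next)                # feed the prediction back in
    return predictions                       # [x-hat(n+1), ..., x-hat(n+k)]

# toy stand-ins for the two learned components (NOT the patent's modules)
extract = lambda s: s[-3:]                   # last 3 samples as "features"
mapper  = lambda f: sum(f) / len(f)          # trivial averaging map

print(cascaded_predict([1.0, 2.0, 3.0], 2, extract, mapper))
```

Because each level consumes earlier predictions, errors can compound with the horizon k; the patent's formulation keeps the feature parameters fixed across levels rather than retraining per horizon.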
Representative Claims
What is claimed is:

1. A system for temporal prediction, comprising one or more processors having: an extraction module, the extraction module being configured to receive X(1), …, X(n) historical samples of a time series and utilize a search and optimization algorithm to extract deterministic features in the time series, wherein the deterministic features are phase-space representations (PSR) of the time series; a mapping module, the mapping module being configured to receive the deterministic features and utilize a learning algorithm to map the deterministic features to a predicted x̂(n+1) sample of the time series; and a prediction module, the prediction module being configured to utilize a cascaded computing structure having k levels of prediction to generate a predicted x̂(n+k) sample, the predicted x̂(n+k) sample being a final temporal prediction for k future samples, wherein the prediction module is configured to utilize a cascaded computing structure having k levels of prediction, wherein each level of prediction is configured to receive the X(1) through X(n) historical samples and the past x̂(n+1) sample through a x̂(n+k−1) sample, and wherein the prediction module further utilizes the extraction module and mapping module to generate a predicted x̂(n+k) sample, the predicted x̂(n+k) sample being a final temporal prediction for k future samples; and wherein x̂(n+k) = G(P_{n+k−1}) and P_{n+k−1} = {w_i·x(n+k−1−d_i)}_{i=1}^{m}, where w_i is a weight factor, d_i is a delay factor, and m is an embedded dimension, with the parameters {w_i, d_i, m} being independent of the prediction horizon.

2.
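The predictor input in claim 1, P_{n+k−1} = {w_i·x(n+k−1−d_i)}_{i=1}^{m}, is a weighted delay (phase-space) vector. A minimal sketch, with hypothetical weights and delays — in the patent these parameters come out of the search and optimization algorithm, not hand-picking:

```python
def psr_vector(x, t, weights, delays):
    """Build P_t = { w_i * x(t - d_i) }, i = 1..m: the weighted-delay
    (phase-space) vector fed to the mapping G. The weights, delays, and
    dimension m are fixed, independent of the prediction horizon k."""
    # x is 1-indexed conceptually: x[0] holds X(1), so X(t) is x[t - 1]
    return [w * x[t - 1 - d] for w, d in zip(weights, delays)]

# hypothetical parameters {w_i, d_i} with embedded dimension m = 3
weights = [0.5, 0.3, 0.2]
delays  = [0, 1, 3]

x = [1.0, 2.0, 4.0, 8.0, 16.0]               # X(1)..X(5)
print(psr_vector(x, 5, weights, delays))      # P_5 from X(5), X(4), X(2)
```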
A system as set forth in claim 1, wherein the extraction module is configured to extract the deterministic features as a multi-dimensional feature subset using the search and optimization algorithm, wherein each subset is extracted according to how many past samples are needed, a relative time sample number of each of the past samples with respect to a current time sample, and a weight of each of the past samples.

3. A system as set forth in claim 2, wherein the mapping module is configured to use the deterministic features from the extraction module to construct a training set having elements, where each element in the training set comprises the multi-dimensional feature subset and a corresponding next sample from the known historical time series, and being further configured to use the training set to train the mapping module to transform the deterministic features into the predicted x̂(n+1) sample of the time series.

4. A system as set forth in claim 3, wherein the learning algorithm is a neural network.

5. A system as set forth in claim 4, wherein the search and optimization algorithm is a genetic algorithm.

6. A system as set forth in claim 5, wherein the genetic algorithm is a nested genetic algorithm.

7. A system as set forth in claim 1, wherein the mapping module is configured to use the deterministic features from the extraction module to construct a training set having elements, where each element in the training set comprises the multi-dimensional feature subset and a corresponding next sample from the known historical time series, and being further configured to use the training set to train the mapping module to transform the deterministic features into the predicted x̂(n+1) sample of the time series.

8. A system as set forth in claim 1, wherein the learning algorithm is a neural network.

9. A system as set forth in claim 1, wherein the search and optimization algorithm is a genetic algorithm.

10.
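Claims 3 and 7 describe building a training set whose elements pair a multi-dimensional feature subset with the corresponding next historical sample. A sketch under the same weighted-delay feature assumption as above — the weights and delays here are illustrative, not the patent's extracted values:

```python
def build_training_set(x, weights, delays):
    """Each training element pairs a weighted-delay feature vector built
    from past samples with the corresponding next sample X(t+1), as in
    claims 3 and 7. Indexing: x[0] holds X(1), so X(t) is x[t - 1]."""
    max_delay = max(delays)
    pairs = []
    # t runs over 1-based sample times; x[t] (0-based) is the target X(t+1)
    for t in range(max_delay + 1, len(x)):
        features = [w * x[t - 1 - d] for w, d in zip(weights, delays)]
        pairs.append((features, x[t]))
    return pairs

# illustrative parameters and series
pairs = build_training_set([1.0, 2.0, 3.0, 4.0, 5.0],
                           weights=[1.0, 1.0, 1.0], delays=[0, 1, 2])
print(pairs)
```

Each pair could then be used to fit the mapping module (a neural network per claims 4 and 8) by ordinary supervised training.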
A system as set forth in claim 9, wherein the genetic algorithm is a nested genetic algorithm.

11. A computer program product for temporal prediction, the computer program product comprising computer-readable instruction means encoded on a computer-readable medium that are executable by a computer for causing a computer to perform operations of: receiving X(1), …, X(n) historical samples of a time series and extracting deterministic features in the time series utilizing a search and optimization algorithm, wherein the deterministic features are phase-space representations (PSR) of the time series; mapping the deterministic features to a predicted x̂(n+1) sample of the time series utilizing a learning algorithm; and generating a predicted x̂(n+k) sample using a cascaded computing structure having k levels of prediction, the predicted x̂(n+k) sample being a final temporal prediction for k future samples; further comprising instruction means for causing a computer to operate as a cascaded computing structure having k levels of prediction, wherein each level of prediction is configured to receive the X(1) through X(n) historical samples and the past x̂(n+1) sample through a x̂(n+k−1) sample, and wherein the prediction module further utilizes the extraction module and mapping module to generate a predicted x̂(n+k) sample, the predicted x̂(n+k) sample being a final temporal prediction for k future samples; and wherein x̂(n+k) = G(P_{n+k−1}) and P_{n+k−1} = {w_i·x(n+k−1−d_i)}_{i=1}^{m}, where w_i is a weight factor, d_i is a delay factor, and m is an embedded dimension, with the parameters {w_i, d_i, m} being independent of the prediction horizon.

12.
A computer program product as set forth in claim 11, further comprising instruction means for causing a computer to extract the deterministic features as a multi-dimensional feature subset using the search and optimization algorithm, wherein each subset is extracted according to how many past samples are needed, a relative time sample number of each of the past samples with respect to a current time sample, and a weight of each of the past samples.

13. A computer program product as set forth in claim 12, further comprising instruction means for causing a computer to use the deterministic features from the extraction module to construct a training set having elements, where each element in the training set comprises the multi-dimensional feature subset and a corresponding next sample from the known historical time series, and further comprising instruction means to cause a computer to use the training set to train the mapping module to transform the deterministic features into the predicted x̂(n+1) sample of the time series.

14. A computer program product as set forth in claim 13, further comprising instruction means for causing a computer to use a neural network as the learning algorithm.

15. A computer program product as set forth in claim 14, further comprising instruction means for causing a computer to use a genetic algorithm as the search and optimization algorithm.

16. A computer program product as set forth in claim 15, further comprising instruction means for causing a computer to use a nested genetic algorithm as the genetic algorithm.

17.
A computer program product as set forth in claim 11, further comprising instruction means for causing a computer to use the deterministic features from the extraction module to construct a training set having elements, where each element in the training set comprises the multi-dimensional feature subset and a corresponding next sample from the known historical time series, and further comprising instruction means to cause a computer to use the training set to train the mapping module to transform the deterministic features into the predicted x̂(n+1) sample of the time series.

18. A computer program product as set forth in claim 11, further comprising instruction means for causing a computer to use a neural network as the learning algorithm.

19. A computer program product as set forth in claim 11, further comprising instruction means for causing a computer to use a genetic algorithm as the search and optimization algorithm.

20. A computer program product as set forth in claim 19, further comprising instruction means for causing a computer to use a nested genetic algorithm as the genetic algorithm.

21. A method for temporal prediction, comprising acts of: receiving X(1), …, X(n) historical samples of a time series and extracting deterministic features in the time series utilizing a search and optimization algorithm, wherein the deterministic features are phase-space representations (PSR) of the time series; mapping the deterministic features to a predicted x̂(n+1) sample of the time series utilizing a learning algorithm; generating a predicted x̂(n+k) sample using a cascaded computing structure having k levels of prediction, the predicted x̂(n+k) sample being a final temporal prediction for k future samples; operating a cascaded computing structure having k levels of prediction, wherein each level of prediction is configured to receive the X(1) through X(n) historical samples and the past x̂(n+1) sample through a x̂(n+k−1) sample; and generating a predicted x̂(n+k) sample, the predicted x̂(n+k) sample being a final temporal prediction for k future samples; and wherein x̂(n+k) = G(P_{n+k−1}) and P_{n+k−1} = {w_i·x(n+k−1−d_i)}_{i=1}^{m}, where w_i is a weight factor, d_i is a delay factor, and m is an embedded dimension, with the parameters {w_i, d_i, m} being independent of the prediction horizon.

22. A method as set forth in claim 21, further comprising an act of extracting the deterministic features as a multi-dimensional feature subset using the search and optimization algorithm, wherein each subset is extracted according to how many past samples are needed, a relative time sample number of each of the past samples with respect to a current time sample, and a weight of each of the past samples.

23.
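The genetic-algorithm selection of delay parameters named in the dependent claims can be illustrated with a deliberately simplified, non-nested GA. The fitness function, mutation scheme, and averaging predictor below are all stand-ins for illustration, not the patent's formulation:

```python
import math
import random

def ga_select_delays(x, m, pop=20, gens=30, max_delay=8, seed=0):
    """Toy genetic-algorithm search (a stand-in for the patent's nested GA)
    for an m-dimensional delay set d_1..d_m minimizing the one-step squared
    error of a naive averaging predictor over the training series."""
    rng = random.Random(seed)

    def fitness(delays):
        err = 0.0
        for t in range(max(delays) + 1, len(x) - 1):
            pred = sum(x[t - 1 - d] for d in delays) / m   # crude predictor
            err += (x[t] - pred) ** 2
        return err

    # initial population: random distinct delay sets
    population = [sorted(rng.sample(range(max_delay), m)) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[: pop // 2]                  # elitist selection
        children = []
        for parent in survivors:
            child = list(parent)
            child[rng.randrange(m)] = rng.randrange(max_delay)  # point mutation
            children.append(child)
        population = survivors + children
    return min(population, key=fitness)

x_demo = [math.sin(0.3 * t) for t in range(50)]
print(ga_select_delays(x_demo, 3))
```

A nested GA, as claimed, would additionally search over the embedded dimension m and the weights w_i in an outer loop; here m and unit weights are fixed for brevity.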
A method as set forth in claim 22, further comprising acts of: using the deterministic features from the extraction module to construct a training set having elements, where each element in the training set comprises the multi-dimensional feature subset and a corresponding next sample from the known historical time series; and using the training set to train the mapping module to transform the deterministic features into the predicted x̂(n+1) sample of the time series.

24. A method as set forth in claim 23, further comprising an act of using a neural network as the learning algorithm.

25. A method as set forth in claim 24, further comprising an act of using a genetic algorithm as the search and optimization algorithm.

26. A method as set forth in claim 25, further comprising an act of using a nested genetic algorithm as the genetic algorithm.

27. A method as set forth in claim 21, further comprising acts of: using the deterministic features from the extraction module to construct a training set having elements, where each element in the training set comprises the multi-dimensional feature subset and a corresponding next sample from the known historical time series; and using the training set to train the mapping module to transform the deterministic features into the predicted x̂(n+1) sample of the time series.

28. A method as set forth in claim 21, further comprising an act of using a neural network as the learning algorithm.

29. A method as set forth in claim 21, further comprising an act of using a genetic algorithm as the search and optimization algorithm.

30. A method as set forth in claim 29, further comprising an act of using a nested genetic algorithm as the genetic algorithm.