Journal of Korean Institute of Information Technology (韓國情報技術學會論文誌), Vol. 21, No. 12, 2023, pp. 35-45
Lee, Jehyuk
No abstract is available.
O. Serradilla, E. Zugasti, J. Rodriguez, and U. Zurutuza, "Deep learning models for predictive maintenance: a survey, comparison, challenges and prospects", Applied Intelligence, Vol. 52, No. 10, pp. 10934-10964, 2022.
L. Liao and F. Kottig, "A hybrid framework combining data-driven and model-based methods for system remaining useful life prediction", Applied Soft Computing, Vol. 44, pp. 191-199, 2016.
M.-G. Kim, W. S. Jeon, and S.-Y. Rhee, "Artificial Intelligence-based Monitoring Application for Production Line Failure/Predictive Maintenance", Journal of Korean Institute of Information Technology, Vol. 21, No. 10, pp. 147-158, 2023.
G.-J. Lee and G. T. Heydt, "An interactive-dynamic mechanism conceptualizing the cost and benefit of electric power quality", Electric Power Systems Research, Vol. 69, No. 1, pp. 69-75, 2004.
G. Chevillon, "Direct multi-step estimation and forecasting", Journal of Economic Surveys, Vol. 21, No. 4, pp. 746-785, 2007.
D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors", Nature, Vol. 323, No. 6088, pp. 533-536, 1986.
I. Sutskever, O. Vinyals, and Q. V. Le, "Sequence to sequence learning with neural networks", Advances in neural information processing systems, Vol. 27, 2014.
S. Hochreiter and J. Schmidhuber, "Long short-term memory", Neural Computation, Vol. 9, No. 8, pp. 1735-1780, Nov. 1997. doi:10.1162/neco.1997.9.8.1735
A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin, "Attention Is All You Need", Advances in Neural Information Processing Systems, Vol. 30, 2017.
A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, M. Dehghani, M. Minderer, G. Heigold, S. Gelly, J. Uszkoreit, and N. Houlsby, "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale", International Conference on Learning Representations, 2021.
H. Zhou, S. Zhang, J. Peng, S. Zhang, J. Li, H. Xiong, and W. Zhang, "Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting", Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 35, No. 12, pp. 11106-11115, 2021.
Y. Zhang and J. Yan, "Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting", The Eleventh International Conference on Learning Representations, 2023.
T. Zhou, Z. Ma, Q. Wen, X. Wang, L. Sun, and R. Jin, "FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting", International Conference on Machine Learning, pp. 27268-27286, Jun. 2022.
Y. Liu, H. Wu, J. Wang, and M. Long, "Non-stationary transformers: Exploring the stationarity in time series forecasting", Advances in Neural Information Processing Systems, Vol. 35, pp. 9881-9893, 2022.
S. Liu, H. Yu, C. Liao, J. Li, W. Lin, A. X. Liu, and S. Dustdar, "Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting", International Conference on Learning Representations, 2022.
V. Le Guen and N. Thome, "Shape and Time Distortion Loss for Training Deep Time Series Forecasting Models", Advances in Neural Information Processing Systems, pp. 4191-4203, 2019.
A. Sannino, J. Svensson, and T. Larsson, "Power-electronic solutions to power quality problems", Electric Power Systems Research, Vol. 66, No. 1, pp. 71-82, 2003.
S. H. Ko and C. S. Hong, "Anomaly Detection in Power Quality using LSTM-based Variational Autoencoder", In Korea Computer Congress 2021, pp. 1016-1018, Jun. 2021.
AI-Hub dataset, https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=239 [accessed: Mar. 27, 2023]