Journal of Convergence for Information Technology, v.11, no.7, 2021, pp.31-38
Jun-Yong Kim (Department of IT Convergence Software, Seoul Theological University), Koo-Rack Park (Division of Computer Science and Engineering, Kongju National University)
In this paper, to optimize the RNN model used for sentiment analysis, the correlation of each model was studied by observing trends in loss and accuracy under hyperparameter tuning. As a research method, after configuring the hidden layer with LSTM and the embedd...
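The abstract describes sweeping LSTM hyperparameters and tracking loss and accuracy for each configuration. A minimal sketch of such a grid search is shown below; the `train_and_evaluate` stub is a hypothetical placeholder (the actual study trains an Embedding + LSTM network), and the grid values are illustrative only, not the paper's settings.

```python
import itertools

# Hypothetical stand-in for training an Embedding+LSTM sentiment model
# and returning its (final_loss, final_accuracy). In the real study this
# would build and fit the network; here a toy score keeps the sketch runnable.
def train_and_evaluate(units, dropout, learning_rate):
    loss = abs(units - 64) / 64 + dropout + learning_rate * 10
    accuracy = max(0.0, 1.0 - loss / 4)
    return loss, accuracy

# Hyperparameter grid to sweep (values are illustrative, not from the paper).
grid = {
    "units": [32, 64, 128],
    "dropout": [0.2, 0.5],
    "learning_rate": [0.001, 0.01],
}

def grid_search(grid):
    """Try every combination and keep the one with the highest accuracy."""
    best = None
    for units, dropout, lr in itertools.product(
        grid["units"], grid["dropout"], grid["learning_rate"]
    ):
        loss, acc = train_and_evaluate(units, dropout, lr)
        if best is None or acc > best["accuracy"]:
            best = {"units": units, "dropout": dropout,
                    "learning_rate": lr, "loss": loss, "accuracy": acc}
    return best

best = grid_search(grid)
```

The same loop structure applies when `train_and_evaluate` is replaced with real model training; recording `(loss, accuracy)` per configuration is what lets the trends described in the abstract be compared across models.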