KIPS Transactions on Software and Data Engineering, Vol.11, No.4, 2022, pp.169-178
Changjae Lee (Division of Software, Yonsei University), Dongyul Ra (Division of Software, Yonsei University)
Morphemes are the most primitive units in a language: they lose their original meaning when segmented into smaller parts. In Korean, a sentence is a sequence of eojeols (words) separated by spaces, and each eojeol comprises one or more morphemes. Korean morphological analysis (KMA) divides the eojeols in a ...
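The structure described above (sentence → eojeols → morphemes with POS tags) can be sketched in a few lines. This is an illustrative toy example, not the paper's method: the sentence, its gold analysis, and the use of Sejong-style POS tags are assumptions made for the sketch.

```python
# A sentence is a sequence of eojeols (words) separated by spaces.
sentence = "나는 학교에 간다"  # "I go to school"
eojeols = sentence.split(" ")
assert eojeols == ["나는", "학교에", "간다"]

# Hypothetical gold analysis for this toy sentence, using Sejong-style POS
# tags. KMA maps each eojeol to one or more (morpheme, tag) pairs; note the
# surface form can change under analysis (e.g. 간다 -> 가 + ㄴ다).
gold = {
    "나는":  [("나", "NP"), ("는", "JX")],    # pronoun + topic particle
    "학교에": [("학교", "NNG"), ("에", "JKB")],  # noun + adverbial particle
    "간다":  [("가", "VV"), ("ㄴ다", "EF")],   # verb stem + final ending
}

for e in eojeols:
    print(e, "->", " + ".join(f"{m}/{t}" for m, t in gold[e]))
```

Running the loop prints one analyzed eojeol per line, e.g. `간다 -> 가/VV + ㄴ다/EF`, which is exactly the output format a KMA system must produce for each input eojeol.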
D. Ra, M. Cho, and Y. Kim, "Enhancing a Korean part-of-speech tagger based on a maximum entropy model," Journal of the Korean Data Analysis Society, Vol.9, No.4, pp.1623-1638, 2007.
K. Cho, et al., "Learning phrase representations using RNN encoder-decoder for statistical machine translation," in Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp.1724-1734, 2014.
I. Sutskever, O. Vinyals, and Q. V. Le, "Sequence to sequence learning with neural networks," in Advances in Neural Information Processing Systems, pp.3104-3112, 2014.
D. Bahdanau, K. Cho, and Y. Bengio, "Neural machine translation by jointly learning to align and translate," in Proceedings of the International Conference on Learning Representations, San Diego, California, 2015.
T. Luong, H. Pham, and C. D. Manning, "Effective approaches to attention-based neural machine translation," in Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp.1412-1421, 2015.
A. Vaswani, et al., "Attention is all you need," in Advances in Neural Information Processing Systems, pp.6000-6010, 2017.
J. Devlin, M. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," in Proceedings of NAACL-HLT, Minneapolis, Minnesota, pp.4171-4186, 2019.
J. Zhu, et al., "Incorporating BERT into neural machine translation," in Proceedings of the International Conference on Learning Representations, Addis Ababa, Ethiopia, 2020.
Q. Wang, et al., "Learning deep transformer models for machine translation," in Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp.1810-1822, 2019.
T. Nguyen and J. Salazar, "Transformers without tears: Improving the normalization of self-attention," in Proceedings of the 16th International Workshop on Spoken Language Translation, 2019.
A. Graves, "Sequence transduction with recurrent neural networks," in Proceedings of the 29th International Conference on Machine Learning Workshop on Representation Learning, Edinburgh, Scotland, 2012.
M. Freitag and Y. Al-Onaizan, "Beam search strategies for neural machine translation," in Proceedings of the First Workshop on Neural Machine Translation, Vancouver, Canada, pp.56-60, 2017.
E. Battenberg, et al., "Exploring neural transducers for end-to-end speech recognition," in Proceedings of 2017 IEEE Automatic Speech Recognition and Understanding Workshop, Okinawa, Japan, pp.206-213, 2017.
H. S. Hwang and C. K. Lee, "Korean morphological analysis using sequence-to-sequence learning with copying mechanism," in Proceedings of the Korea Computer Congress 2016, pp.443-445, 2016.
J. Li, E. H. Lee, and J.-H. Lee, "Sequence-to-sequence based morphological analysis and part-of-speech tagging for Korean language with convolutional features," Journal of Korean Institute of Information Scientists and Engineers, Vol.44, No.1, pp.57-62, 2017.
S.-W. Kim and S.-P. Choi, "Research on joint models for Korean word spacing and POS (Part-Of-Speech) tagging based on bidirectional LSTM-CRF," Journal of Korean Institute of Information Scientists and Engineers, Vol.45, No.8, pp.792-800, 2018.
B. Choe, I.-h. Lee, and S.-g. Lee, "Korean morphological analyzer for neologism and spacing error based on sequence-to-sequence," Journal of Korean Institute of Information Scientists and Engineers, Vol.47, No.1, pp.70-77, 2020.
J. Min, S.-H. Na, J.-H. Shin, and Y.-K. Kim, "Stack pointer network for Korean morphological analysis," in Proceedings of the Korea Computer Congress 2020, pp.371-373, 2020.
Y. Choi and K. J. Lee, "Performance analysis of Korean morphological analyzer based on transformer and BERT," Journal of Korean Institute of Information Scientists and Engineers, Vol.47, No.8, pp.730-741, 2020.
J. Y. Youn and J. S. Lee, "A pipeline model for Korean morphological analysis and part-of-speech tagging using sequence-to-sequence and BERT-LSTM," in Proceedings of the 32nd Annual Conference on Human & Cognitive Language Technology, pp.414-417, 2020.