Journal of the Korea Convergence Society, v.12 no.5, 2021, pp.23-29
Chanjun Park (Department of Computer Science, Korea University), Kinam Park (Research Institute for Information and Creative Education, Korea University), Hyeonseok Moon (Department of Computer Science, Korea University), Sugyeong Eo (Department of Computer Science, Korea University), Heuiseok Lim (Department of Computer Science, Korea University)
Recent deep learning-based natural language processing studies seek to improve performance by jointly training on large amounts of data drawn from various sources. However, combining data from various sources into a single training set may prevent ...
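The abstract's "combining data from various sources into one" can be sketched as plain corpus concatenation, optionally with a source tag prepended to each example so a model can still distinguish origins. This is a minimal illustration, not the paper's actual code; the function name and corpus names are hypothetical.

```python
def combine_corpora(corpora, tag_sources=True):
    """Merge {source_name: [sentence, ...]} into one training list.

    If tag_sources is True, each sentence is prefixed with a token
    naming its source corpus (a common trick in multi-source NMT).
    """
    merged = []
    for name, sentences in corpora.items():
        for s in sentences:
            merged.append(f"<{name}> {s}" if tag_sources else s)
    return merged


# Hypothetical example corpora from two different sources.
corpora = {
    "news": ["the markets rose today"],
    "subtitles": ["see you tomorrow"],
}
print(combine_corpora(corpora))
```

Training on the merged list treats all sources as one dataset; dropping the tags (`tag_sources=False`) gives the fully blended setup whose possible downsides the abstract raises.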