한국정보통신학회논문지 = Journal of the Korea Institute of Information and Communication Engineering, v.26 no.7, 2022, pp. 949-955
조준희 (Web Programming, Korea Digital Media High School), 오하영 (College of Computing and Informatics, Sungkyunkwan University)
Deep learning-based text summarization models are not free from the datasets they are trained on. For example, a summarization model trained with a news summarization dataset performs poorly at summarizing other types of text, such as internet posts and papers. In this study, we define this phenomenon as the Data Bias Problem (DBP...
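The Data Bias Problem described in the abstract can be illustrated with a minimal sketch, assuming the Hugging Face `transformers` library and the checkpoint `facebook/bart-large-cnn` (an illustrative stand-in, not the model or data used in this paper): a summarizer fine-tuned on news articles is applied both to a news-style paragraph and to an informal internet post, the out-of-domain setting where its output typically degrades.

```python
# Minimal sketch of the Data Bias Problem (DBP): a news-trained summarizer
# applied to out-of-domain text. Assumes the Hugging Face `transformers`
# package; "facebook/bart-large-cnn" is BART fine-tuned on the CNN/DailyMail
# *news* summarization dataset (an illustrative choice, not the paper's model).
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# In-domain input: a news-style paragraph (hypothetical example text).
news_text = (
    "The city council approved a new transit budget on Tuesday, allocating "
    "$120 million to expand bus routes and modernize subway signals over "
    "the next five years, officials said at a press conference."
)

# Out-of-domain input: an informal internet post, the kind of text on which
# a news-trained summarizer tends to do poorly.
post_text = (
    "ok so funny story, i tried to bake bread for the first time yesterday "
    "and totally forgot the yeast lol. ended up with a sad dense brick. "
    "gonna try again this weekend, any tips appreciated!!"
)

for label, text in [("news", news_text), ("internet post", post_text)]:
    out = summarizer(text, max_length=40, min_length=5, do_sample=False)
    print(f"[{label}] {out[0]['summary_text']}")
```

In a full experiment, the gap between the two settings would be quantified against reference summaries with a metric such as ROUGE (Lin, 2004, cited below) rather than judged by inspection.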
C. Y. Lin, "ROUGE: A Package for Automatic Evaluation of Summaries," in Proceedings of the Workshop on Text Summarization Branches Out, Barcelona, Spain, pp. 74-81, 2004.
J. Wei and K. Zou, "EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks," in Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, Hong Kong, China, pp. 6382-6388, 2019. DOI: 10.18653/v1/D19-1670.
A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin, "Attention Is All You Need," in 31st Conference on Neural Information Processing Systems, Long Beach, CA, USA, 2017.
S. Narayan, S. B. Cohen, and M. Lapata, "Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization," in Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, pp. 1797-1807, 2018. DOI: 10.18653/v1/D18-1206.
B. Kim, H. Kim, and G. Kim, "Abstractive Summarization of Reddit Posts with Multi-level Memory Networks," in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, MN, USA, pp. 2519-2531, 2019. DOI: 10.18653/v1/N19-1260.
M. Lewis, Y. Liu, N. Goyal, M. Ghazvininejad, A. Mohamed, O. Levy, V. Stoyanov, and L. Zettlemoyer, "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension," in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, pp. 7871-7880, 2020. DOI: 10.18653/v1/2020.acl-main.703.
C. Raffel, N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li, and P. J. Liu, "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer," Journal of Machine Learning Research, vol. 21, pp. 1-67, Jun. 2020.
Published in an open-access journal.