한국융합학회논문지 = Journal of the Korea Convergence Society, v.12 no.1, 2021, pp.57-63
한승규 (Dept. of Computer Science and Engineering, Korea University), 임희석 (Dept. of Computer Science and Engineering, Korea University)
Spoken language understanding, which aims to understand an utterance as naturally as a human would, has mostly focused on the English language. In this paper, we construct a Korean-language dataset for spoken language understanding, based on a conversational corpus between a reservation system and its...
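The dataset described above pairs each utterance with an intent label and slot annotations. As a minimal sketch of what one such record might look like, the snippet below models an SLU example with per-token BIO slot tags; the class name, field names, and the sample utterance are illustrative assumptions, not the paper's actual schema.

```python
from dataclasses import dataclass


@dataclass
class SLUExample:
    """One spoken-language-understanding example: tokens, an intent
    label, and one BIO slot tag per token (illustrative format only)."""
    tokens: list
    intent: str
    slots: list

    def __post_init__(self):
        # BIO tagging requires exactly one tag per token.
        if len(self.tokens) != len(self.slots):
            raise ValueError("one slot tag per token is required")


# A hypothetical restaurant-reservation utterance
# ("book a table for two tomorrow evening at 7").
example = SLUExample(
    tokens=["내일", "저녁", "7시에", "2명", "예약해줘"],
    intent="make_reservation",
    slots=["B-date", "B-time", "I-time", "B-party_size", "O"],
)
```

In this convention, `B-` opens a slot span, `I-` continues it, and `O` marks tokens outside any slot; an intent classifier and a slot tagger (or a joint model) would be trained to predict these two fields.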