IEEE Access: Practical Research, Open Solutions, vol. 9, 2021, pp. 84067-84078
Seo, Hyein (Chungnam National University, Daejeon, South Korea); Jung, Sangkeun (Chungnam National University, Daejeon, South Korea); Hwang, Taewook (Chungnam National University, Daejeon, South Korea); Kim, Hyunji (Chungnam National University, Daejeon, South Korea); Roh, Yoon-Hyung (Electronics and Telecommunications Research Institute, Daejeon, South Korea)
Natural language understanding (NLU) is a core technique for implementing natural user interfaces. In this study, we propose a neural network architecture to learn syntax vector representation by employing the correspondence between texts and syntactic structures. For representing the syntactic stru...
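The abstract describes learning a syntax vector representation from the correspondence between a sentence and its syntactic structure. Purely as an illustrative assumption, and not the architecture actually proposed in the paper, the PyTorch sketch below shows one minimal way such a correspondence could be trained: a dual encoder in which one LSTM embeds the token sequence, another embeds a dependency-label sequence, and a cosine objective pulls matched pairs together. All names, dimensions, and the choice of loss are hypothetical.

```python
# Hypothetical dual-encoder sketch (not the paper's model): align a text
# encoder and a syntax encoder so that a sentence and its dependency
# structure map to nearby vectors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualEncoder(nn.Module):
    """Toy dual encoder: one LSTM over word ids, one over dependency-label ids."""
    def __init__(self, text_vocab, syntax_vocab, dim=128):
        super().__init__()
        self.text_emb = nn.Embedding(text_vocab, dim)
        self.syntax_emb = nn.Embedding(syntax_vocab, dim)
        self.text_rnn = nn.LSTM(dim, dim, batch_first=True)
        self.syntax_rnn = nn.LSTM(dim, dim, batch_first=True)

    def _encode(self, ids, emb, rnn):
        _, (h, _) = rnn(emb(ids))          # final hidden state as the sentence vector
        return F.normalize(h[-1], dim=-1)

    def forward(self, text_ids, syntax_ids):
        return (self._encode(text_ids, self.text_emb, self.text_rnn),
                self._encode(syntax_ids, self.syntax_emb, self.syntax_rnn))

model = DualEncoder(text_vocab=10000, syntax_vocab=50)
text_ids = torch.randint(0, 10000, (4, 12))   # toy batch: 4 sentences, 12 tokens each
syntax_ids = torch.randint(0, 50, (4, 12))    # toy dependency-label sequences
text_vec, syntax_vec = model(text_ids, syntax_ids)

# Pull each sentence vector toward the vector of its own syntactic structure.
loss = nn.CosineEmbeddingLoss()(text_vec, syntax_vec, torch.ones(4))
loss.backward()
```

In practice the text side could be replaced with a pretrained encoder such as BERT (as the cited related work does), while the syntax side encodes a serialized dependency parse; the cosine objective here is only one of several plausible alignment losses.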