Journal of the Korea Convergence Society, v.12 no.4, 2021, pp. 23-30
Jungwoo Lim (Department of Computer Science, Korea University), Hyeonseok Moon (Department of Computer Science, Korea University), Chanhee Lee (Department of Computer Science, Korea University), Chankyun Woo (Survey System Management Division, Statistics Korea), Heuiseok Lim (Department of Computer Science, Korea University)
An Automated Industry and Occupation Coding System assigns statistical classification codes to the enormous amount of natural language data collected from people who write about their industry and occupation. Unlike previous studies that applied information retrieval, we propose a system that does no...
Y. K. Kang. (2001). Automatic coding system for industry and occupation classification. The Korean Association for Survey Research. Fall Conference 2001, 33-45.
Population and Housing Census. (2020). Understanding of the Census. https://www.census.go.kr/cui/cuiDefView.do?q_menu3&q_sub1
Statistics Korea. (Year Unknown). Statistics Korea Census on Establishments. https://kostat.go.kr/understand/info/info_kost/1/index.action?bmoderead&cdS010004
H. S. Lim. (2004). An Automated Classification System of Standard Industry and Occupation Codes by Using Information Retrieval Techniques. The Journal of Korean Association of Computer Education, 7(4), 51-60.
C. K. Woo. (2020). A Study on Automatic Coding of Korean Standard Industrial Classification Based on Deep Learning. Master's dissertation. Korea University, Seoul.
H. D. Cheol. (2007). A Research on the Design and Implementation of the Automated Industry and Occupation Coding System. Master's dissertation. Hannam University, Daejeon.
A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez & I. Polosukhin. (2017, December). Attention is all you need. In Proceedings of the 31st International Conference on Neural Information Processing Systems (pp. 6000-6010).
M. Thompson, M. E. Kornbau & J. Vesely. (2012). Creating an Automated Industry and Occupation Coding Process for the American Community Survey. Seattle : U.S. Census Bureau.
S. Hochreiter & J. Schmidhuber. (1997). Long short-term memory. Neural computation, 9(8), 1735-1780. DOI : 10.1162/neco.1997.9.8.1735
M. S. Choi, & B. W. On. (2019). A Comparative Study on the Accuracy of Sentiment Analysis of Bi-LSTM Model by Morpheme Feature. Proceedings of KIIT Conference, 2019(6), 307-309.
Y. T. Oh, M. T. Kim & W. J. Kim. (2019). Korean Movie-review Sentiment Analysis Using Parallel Stacked Bidirectional LSTM Model. Journal of KIISE, 46(1), 45-49. DOI : 10.5626/JOK.2019.46.1.45
J. Devlin, M. W. Chang, K. Lee & K. Toutanova. (2019, June). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (pp. 4171-4186). DOI : 10.18653/v1/N19-1423
K. H. Kim, C. E. Park, C. K. Lee, & H. K. Kim. (2020). Korean End-to-end Neural Coreference Resolution with BERT. Journal of KIISE, 47(10), 942-947. DOI : 10.5626/JOK.2020.47.10.942
Y. S. Choi & K. J. Lee. (2020). Performance Analysis of Korean Morphological Analyzer based on Transformer and BERT. Journal of KIISE, 47(8), 730-741. DOI : 10.5626/JOK.2020.47.8.730
T. Kudo & J. Richardson. (2018, November). SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations (pp. 66-71). DOI : 10.18653/v1/D18-2012