한국문헌정보학회지 = Journal of the Korean Society for Library and Information Science, v.58 no.1, 2024, pp.5-30
이용구 (Department of Library and Information Science, College of Social Sciences, Kyungpook National University), 김선욱 (Department of Library and Information Science, College of Social Sciences, Daegu Catholic University)
The purpose of this study is to extract topics from experimental data using topic modeling methods (LDA, Top2Vec, and BERTopic) and to compare the characteristics and differences between these models. The experimental data consist of 55,442 papers published in 85 academic journals in the field of li...
Blei, D. & Lafferty, J. (2005). Correlated topic models. Advances in Neural Information Processing Systems, 18, 147-154.
Blei, D. M., Ng, A. Y., & Jordan, M. I. (2003). Latent dirichlet allocation. Journal of Machine Learning Research, 3, 993-1022.
Chen, A. T., Sheble, L., & Eichler, G. (2013). Topic modeling and network visualization to explore patient experiences. In Visual Analytics in Healthcare Workshop 2013.
Deerwester, S., Dumais, S. T., Furnas, G. W., Landauer, T. K., & Harshman, R. (1990). Indexing by latent semantic analysis. Journal of the American Society for Information Science, 41(6), 391-407. https://doi.org/10.1002/(SICI)1097-4571(199009)41:6<391::AID-ASI1>3.0.CO;2-9
Egger, R. & Yu, J. (2022). A topic modeling comparison between LDA, NMF, Top2Vec, and BERTopic to demystify twitter posts. Frontiers in Sociology, 7, 886498. https://doi.org/10.3389/fsoc.2022.886498
Gao, Q., Huang, X., Dong, K., Liang, Z., & Wu, J. (2022). Semantic-enhanced topic evolution analysis: a combination of the dynamic topic model and word2vec. Scientometrics, 127, 1543-1563. https://doi.org/10.1007/s11192-022-04275-z
Hofmann, T. (1999). Probabilistic latent semantic indexing. In Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 50-57. https://doi.org/10.1145/3130348.3130370
Jelodar, H., Wang, Y., Yuan, C., Feng, X., Jiang, X., Li, Y., & Zhao, L. (2019). Latent Dirichlet allocation (LDA) and topic modeling: models, applications, a survey. Multimedia Tools and Applications, 78(11), 15169-15211. https://doi.org/10.1007/s11042-018-6894-4
Jing, X. Y., Zhang, D., & Tang, Y. Y. (2004). An improved LDA approach. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 34(5), 1942-1951. https://doi.org/10.1109/tsmcb.2004.831770
Mehrotra, R., Sanner, S., Buntine, W., & Xie, L. (2013). Improving LDA topic models for microblogs via tweet pooling and automatic labeling. In Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval, 889-892. https://doi.org/10.1145/2484028.2484166
Reimers, N. & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using Siamese BERT-networks. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3982-3992. https://doi.org/10.18653/v1/D19-1410
Vayansky, I. & Kumar, S. A. (2020). A review of topic modeling methods. Information Systems, 94, 1-15. https://doi.org/10.1016/j.is.2020.101582
Yuan, C. & Yang, H. (2019). Research on K-value selection method of K-means clustering algorithm. J, 2(2), 226-235. https://doi.org/10.3390/j2020016
Published in an open-access journal.