Pattern Recognition Letters, v.135, 2020, pp.15-21
Park, Cheong Hee (corresponding author), Lee, Gyeong-Hoon
Abstract: Traditional linear dimension reduction methods such as principal component analysis (PCA) and linear discriminant analysis (LDA) have been used in many application areas due to their simplicity and high performance. However, in data streams where data instances are generated continuously over time...
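The abstract contrasts batch methods such as PCA with the streaming setting, where the projection must be updated as new instances arrive rather than recomputed from scratch. As a minimal sketch of that general idea only (not the algorithm proposed in this paper), scikit-learn's IncrementalPCA can refine a linear projection chunk by chunk; the chunk size, feature count, and number of components below are arbitrary assumptions for illustration.

# A minimal sketch, assuming synthetic data: incremental update of a linear
# projection as chunks of a data stream arrive. This illustrates the generic
# streaming setting, not the method proposed in the paper.
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)
ipca = IncrementalPCA(n_components=5)    # target dimension (arbitrary choice)

for _ in range(20):                      # simulate 20 chunks of a stream
    chunk = rng.normal(size=(100, 50))   # 100 new instances, 50 features each
    ipca.partial_fit(chunk)              # update the projection without revisiting old data

new_batch = rng.normal(size=(10, 50))
reduced = ipca.transform(new_batch)      # project new instances into the reduced space
print(reduced.shape)                     # (10, 5)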
Duda, Pattern Classification, 2001.
Weng, Candid covariance-free incremental principal component analysis, IEEE Trans. Pattern Anal. Mach. Intell., 25(8), 1034, 2003. doi:10.1109/TPAMI.2003.1217609
Pang, Incremental linear discriminant analysis for classification of data streams, IEEE Trans. Syst. Man Cybern. Part B (Cybernetics), 35(5), 905, 2005. doi:10.1109/TSMCB.2005.847744
Yan, IMMC: incremental maximum margin criterion, Proceedings of KDD, 2004.
Yan, Effective and efficient dimensionality reduction for large-scale and streaming data processing, IEEE Trans. Knowl. Data Eng., 18(3), 320, 2006. doi:10.1109/TKDE.2006.45
Li, Efficient and robust feature extraction by maximum margin criterion, Proceedings of Advances in Neural Information Processing Systems 16, 2004.
Liu, Least squares incremental linear discriminant analysis, Proceedings of the International Conference on Data Mining, 2009.
Yeh, A rank-one update method for least squares linear discriminant analysis with concept drift, Pattern Recognit., 46, 1267, 2013. doi:10.1016/j.patcog.2012.11.008
Zeng, X., Incremental partial least squares analysis of big streaming data, Pattern Recognit., 47, 2014.
Park, A comparison of generalized linear discriminant analysis algorithms, Pattern Recognit., 41, 1083, 2008. doi:10.1016/j.patcog.2007.07.022
Oja, On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix, J. Math. Anal. Appl., 106, 69, 1985. doi:10.1016/0022-247X(85)90131-3
Sanger, Optimal unsupervised learning in a single-layer linear feedforward neural network, Neural Netw., 2, 459, 1989. doi:10.1016/0893-6080(89)90044-0
Zhang, Convergence analysis of complementary candid incremental principal component analysis, Technical Report MSU-CSE-01-23, Dept. of Computer Science and Eng., 2001.
Helland, On the structure of partial least squares regression, Commun. Stat. Simul. Comput., 17(2), 581, 1988. doi:10.1080/03610918808812681
Ye, Least squares linear discriminant analysis, Proceedings of the International Conference on Machine Learning, 2007.
Park, A relationship between linear discriminant analysis and the generalized minimum squared error solution, SIAM J. Matrix Anal. Appl., 27(2), 474, 2005. doi:10.1137/040607599
Hager, Updating the inverse of a matrix, SIAM Rev., 31(2), 221, 1989. doi:10.1137/1031049
Ye, IDR/QR: an incremental dimension reduction algorithm via QR decomposition, IEEE Trans. Knowl. Data Eng., 17(9), 1208, 2005. doi:10.1109/TKDE.2005.148
Lu, Incremental learning from chunk data for IDR/QR, Image Vis. Comput., 36, 1, 2015. doi:10.1016/j.imavis.2015.01.002
Chu, Incremental linear discriminant analysis: a fast algorithm and comparison, IEEE Trans. Neural Netw. Learn. Syst., 26(11), 2716, 2015. doi:10.1109/TNNLS.2015.2391201
Zhu, An online incremental orthogonal component analysis method for dimensionality reduction, Neural Netw., 85, 33, 2017. doi:10.1016/j.neunet.2016.10.001
Zhang, Incremental regularized least squares for dimensionality reduction of large-scale data, SIAM J. Sci. Comput., 38(3), 2016. doi:10.1137/15M1035653
Bifet, MOA: massive online analysis, J. Mach. Learn. Res., 11, 1601, 2010.
Hulten, Mining time-changing data streams, Proceedings of KDD, 2001.