한국해양정보통신학회논문지 = The Journal of the Korea Institute of Maritime Information & Communication Sciences, v.7 no.5, 2003, pp. 1044-1051
곽영태 (Department of Computer Science, Iksan National College)
This paper surveys EBP (error back-propagation) learning, the cross-entropy error function, and LBL (layer-by-layer) learning, which are used for training MLPs (multilayer perceptrons). We compare the merits and demerits of each learning method on handwritten digit recognition. Although the sp...
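The contrast the abstract draws between squared-error EBP and the cross-entropy objective can be sketched with the output-layer gradients of a sigmoid unit. This is an illustrative example, not code from the paper; all function names are hypothetical:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def output_delta_mse(y, t):
    # Squared error E = 0.5*(y - t)^2 with a sigmoid output:
    # delta = dE/dnet = (y - t) * y * (1 - y). The y*(1 - y) factor
    # shrinks toward zero for saturated outputs, which slows EBP learning.
    return (y - t) * y * (1.0 - y)

def output_delta_ce(y, t):
    # Cross entropy E = -[t*log(y) + (1 - t)*log(1 - y)] with a sigmoid
    # output: the sigmoid derivative cancels, leaving delta = y - t,
    # so the error signal stays large even when the output saturates.
    return y - t

y = sigmoid(4.0)   # a nearly saturated output, far from the target
t = 0.0            # target
print(output_delta_mse(y, t))  # small gradient
print(output_delta_ce(y, t))   # much larger gradient
```

The cancellation of the sigmoid derivative under cross entropy is the usual argument for its faster convergence relative to plain squared-error EBP.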
D. E. Rumelhart and J. L. McClelland, Parallel Distributed Processing, MIT Press, Cambridge, MA, pp. 318-362, 1986
R. P. Lippmann, "An Introduction to Computing with Neural Nets," IEEE ASSP Magazine, vol. 4, no. 2, pp. 4-22, April 1987
J. M. Zurada, Introduction to Artificial Neural Systems, West Publishing Co., 1992
Simon Haykin, Neural Networks: A Comprehensive Foundation, Macmillan College Publishing Co., 1994
J. R. Chen and P. Mars, "Stepsize variation methods for accelerating the backpropagation algorithm," Proc. IJCNN, Washington, DC, USA, Jan. 15-19, 1990, vol. I, pp. 601-604
A. Rezgui and N. Tepedelenlioglu, "The effect of the slope of the activation function on the back propagation algorithm," Proc. IJCNN '90, Washington, DC, vol. 1, pp. 707-710, 1990
V. P. Plagianakos, G. D. Magoulas, and M. N. Vrahatis, "Deterministic nonmonotone strategies for effective training of multilayer perceptrons," IEEE Trans. Neural Networks, vol. 13, pp. 1268-1284, 2002
A. van Ooyen and B. Nienhuis, "Improving the convergence of the back-propagation algorithm," Neural Networks, vol. 5, pp. 465-471, 1992
G.-J. Wang and C.-C. Chen, "A Fast Multilayer Neural-Network Training Algorithm Based on the Layer-By-Layer Optimizing Procedures," IEEE Trans. Neural Networks, vol. 7, pp. 768-775, May 1996
R. Lengelle and T. Denoeux, "Training MLPs Layer by Layer Using an Objective Function for Internal Representations," Neural Networks, vol. 9, January 1996
J. Y. F. Yam and T. W. S. Chow, "Extended Least Squares Based Algorithm for Training Feedforward Networks," IEEE Trans. Neural Networks, vol. 8, pp. 806-810, May 1997
C. M. Bishop, Neural Networks for Pattern Recognition, Clarendon Press, Oxford, 1997
N. Ampazis and S. J. Perantonis, "Two highly efficient second-order algorithms for training feedforward networks," IEEE Trans. Neural Networks, vol. 13, pp. 1064-1074, Sep. 2002
J. J. Hull, "A database for handwritten text recognition research," IEEE Trans. Pattern Anal. Machine Intell., vol. 16, pp. 550-554, 1994
J. de Villiers and E. Barnard, "Backpropagation Neural Nets with One and Two Hidden Layers," IEEE Trans. Neural Networks, vol. 4, no. 1, pp. 136-141, 1993
M. M. Islam and K. Murase, "A new algorithm to design compact two-hidden-layer artificial neural networks," Neural Networks, vol. 14, 2001
K. Hornik, M. Stinchcombe, and H. White, "Multilayer feedforward networks are universal approximators," Neural Networks, vol. 2, pp. 359-366, 1989
J. V. Shah and C.-S. Poon, "Linear independence of internal representations in multilayer perceptrons," IEEE Trans. Neural Networks, vol. 10, no. 1, pp. 10-18, 1999
David J. Winter, Matrix Algebra, Macmillan Publishing Company, 1992