농촌지도와 개발 = Journal of Agricultural Extension & Community Development, v.16 no.4, 2009, pp.939-965
이민수 (전북발전연구원 / Jeonbuk Development Institute), 최영찬 (서울대학교 지역정보 / Seoul National University)
Machine learning has been identified as a promising approach to knowledge-based system development. This study aims to examine the ability of machine learning techniques to support farmers' decision making and to develop a reference model using pig farm data. We compared five machine learning tec...
* The sentences below were identified automatically by AI and may contain inaccuracies; please use them with caution.
| Question | Answer extracted from the paper |
|---|---|
| Into which three main branches are machine learning techniques divided? | The first is symbolic learning, presented by Hunt et al. (1966); its main algorithms include induction of decision trees, decision rules, and induction of logic programs. The second is statistical methods, presented by Nilsson (1965), also called statistical or pattern-recognition techniques; its main algorithms include k-NN (k-nearest neighbors), discriminant analysis, and Bayesian classifiers. The third is the neural-network approach, presented by Hunt et al. (1962); its algorithms include backpropagation learning, Kohonen's self-organizing map (SOM), and Hopfield's associative memory. |
| What mainly determines a sow's total litter size? | To design the sow-productivity prediction model, the variables determining total litter size were identified first. A sow's total litter size is determined mainly by the sow's genetic factors and by the farm's feeding and environmental management. In this study, performance records up to the previous three parities were used as input variables. |
| What is the CHAID algorithm? | For the decision-tree model, the CHAID algorithm was used as the optimal splitting criterion. CHAID uses the chi-square (χ²) test to make the child nodes split from a parent node as different from one another as possible. The optimal split is the one with the smallest p-value computed by the χ² test. |
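The CHAID split rule described above can be sketched in a few lines. This is a minimal, illustrative sketch, not the paper's implementation; the function names (`chi_square`, `best_split`) are ours. CHAID cross-tabulates each candidate grouping of a predictor against the class label and keeps the split whose χ² test gives the smallest p-value; when all candidates have the same degrees of freedom, that is equivalently the split with the largest χ² statistic, which we compute here by hand.

```python
# Illustrative CHAID-style split selection (not the paper's code).
from collections import Counter

def chi_square(groups, labels):
    """Pearson chi-square statistic of the groups x labels contingency table."""
    n = len(labels)
    obs = Counter(zip(groups, labels))  # observed cell counts
    row = Counter(groups)               # marginal counts per candidate group
    col = Counter(labels)               # marginal counts per class
    stat = 0.0
    for g in row:
        for y in col:
            expected = row[g] * col[y] / n
            stat += (obs[(g, y)] - expected) ** 2 / expected
    return stat

def best_split(candidate_splits, labels):
    """Pick the grouping most associated with the label (largest chi-square,
    i.e. smallest p-value when degrees of freedom are equal)."""
    return max(candidate_splits, key=lambda g: chi_square(g, labels))
```

For example, a grouping that perfectly separates the classes yields a larger χ² statistic (smaller p-value) than an uninformative one, so `best_split` selects it.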
대한양돈협회 [Korea Swine Association]. (2005). 전업 양돈농가 실태보고서 [Survey report on full-time pig farms]. 대한양돈협회.
대한양돈협회 [Korea Swine Association]. (2007). 2007년 양돈장 질병보고서 [2007 pig farm disease report]. Pig & Pork.
Bentz, Y., and Merunkay, D. (2000). Neural Networks and the Multinomial Logit for Branch Choice Modeling: a Hybrid Approach. Journal of Forecasting 19(3): 177-200.
Bichler, M. and Kiss, C. (2004). A Comparison of Logistic Regression, k-Nearest Neighbor, and Decision Tree Induction for Campaign Management. Proceedings of the Tenth Americas Conference on Information Systems. New York. August: 1918-1925.
Bound, D., and Ross, D. (1997). Forecasting Customer Response with Neural Network. Handbook of Neural Computation. G6.2. 1-7.
Breiman, L. (1996). Heuristics of instability and stabilization in model selection. Annals of Statistics 24(6): 2350-2383.
Breiman, L., J. Friedman, Olshen, R., and Stone, C. (1984). Classification and Regression Trees. Belmont, CA: Wadsworth.
Cho, S., M. Jang, et al. (1997). Virtual sample generation using a population of networks. Neural Processing Letters 12: 88-89.
Chung, H. M. and P. Gray. (1999). Data Mining. Journal of Management Information Systems 16(1): 11-17.
Cybenko, G. (1989). Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and Systems 2(4): 303-314.
Freund, Y. and R. E. Schapire. (1996). Game theory, on-line prediction and boosting. Proceedings of the Annual ACM Conference on Computational Learning Theory.
Gray, P. and H. J. Watson. (1998). Professional Briefings...Present and Future Directions in Data Warehousing. Database for Advances in Information Systems 29(3): 83-90.
Gray, P. and H. J. Watson. (1998). Decision Support in the Data Warehouse. Upper Saddle River, N.J.: Prentice Hall.
Han, J. and M. Kamber. (2001). Data Mining: Concepts and Techniques. San Francisco: Morgan Kaufmann.
Hand, D. J. (1998). Data Mining: Statistics and More?. The American Statistician 52(2): 112-118.
Hornik, K., M. Stinchcombe, et al. (1990). Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks. Neural Networks 3(5): 551-560.
Hunt, E., J. Martin, et al. (1966). Experiments in induction. New York: Academic Press.
Iddings, R. K., and Apps, J. W. (1990). What Influences Farmers' Computer Use?. Journal of Extension 28(1). http://www.joe.org/joe/1990spring/a4.html (accessed 2004/10/1).
Jayas, D. S., J. Paliwal, et al. (2000). Multi-layer neural networks for image analysis of agricultural products. Journal of Agricultural and Engineering Research 77(2): 119-128.
Kass, G. (1980). An Exploratory Technique for Investigating Large Quantities of Categorical Data. Applied Statistics 29(2): 119-127.
Kirchner, K., K. H. Tolle, et al. (2004a). Decision tree technique applied to pig farming datasets. Livestock Production Science 90(2-3): 191-200.
Kirchner, K., K. H. Tolle, et al. (2004b). The analysis of simulated sow herd datasets using decision tree technique. Computers and Electronics in Agriculture 42(2): 111-127.
Kononenko, I. (2001). Machine learning for medical diagnosis: History, state of the art and perspective. Artificial Intelligence in Medicine 23(1): 89-109.
Kuhlmann, F., and Brodersen, C. (2001). Information technology and farm management: developments and perspectives. Computers and Electronics in Agriculture 30: 71-83.
Langley, P. and H. A. Simon. (1995). Applications of machine learning and rule induction. Communications of the ACM 38(11): 54-64.
Levenberg, K. (1944). A Method for the Solution of Certain Non-Linear Problems in Least Squares. Quarterly of Applied Mathematics 2(2): 164-168.
Levin, N. and Zahavi, J. (2001). Predictive Modeling Using Segmentation. Journal of Interactive marketing 15: 2-22.
McQueen, R. J., S. R. Garner, et al. (1995). Applying machine learning to agricultural data. Comput. Electron. Agric. 12(4): 275-293.
Mitchell, T. M. (1997). Machine Learning. New York: McGraw-Hill.
Moutinho, L., Curry, B., Davies, F., and Rita, P. (1994). Neural Networks in Marketing. New York: Routledge.
Murthy, K. S. (1998). Automatic Construction of Decision Trees from Data: A Multi-disciplinary Survey. Data Mining and Knowledge Discovery 2: 345-389.
Nilsson, N. (1965). Learning machines. New York: McGraw-Hill.
Peacock, P. R. (1998). Data mining in marketing: Part 1. Marketing Management 6(4): 9.
Peacock, P. R. (1998). Data mining in marketing: Part 2. Marketing Management 7(1): 15.
Pietersma, D., R. Lacroix, et al. (2003). Induction and evaluation of decision trees for lactation curve analysis. Computers and Electronics in Agriculture 38(1): 19-32.
Quinlan, J. R. (1993). C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann.
Rumelhart, D. E., B. Widrow, et al. (1994). Basic ideas in neural networks. Communications of the ACM 37(3): 87-92.
Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). Learning Internal Representations by Error Propagation. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition. D. E. Rumelhart and J. L. McClelland (Eds.). Cambridge, MA: MIT Press.
Schultz, A., R. Wieland, et al. (2000). Neural networks in agroecological modelling-Stylish application or helpful tool?. Computers and Electronics in Agriculture 29(1-2): 73-97.
Scott Mitchell, R., L. A. Smith, et al. (1996). An investigation into the use of machine learning for determining oestrus in cows. Computers and Electronics in Agriculture 15(3): 195-213.
Sonquist, J., Baker, E., and Morgan, J. N. (1971). Searching for Structure, Survey Research Center, Ann Arbor: University of Michigan.