IPC Classification Information
Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application No. | UP-0561285 (2004-07-01)
Registration No. | US-7742806 (2010-07-12)
International Application No. | PCT/US2004/021307 (2004-07-01)
§371/§102 Date | 2005-12-20
International Publication No. | WO05/002313 (2005-01-13)
Inventors / Address |
- Sternickel, Karsten
- Szymanski, Boleslaw
- Embrechts, Mark
Applicant / Address |
Agent / Address |
Citation Information | Times cited: 7; Patents cited: 16
Abstract
The use of machine learning for pattern recognition in magnetocardiography (MCG), which measures magnetic fields emitted by the electrophysiological activity of the heart, is disclosed herein. Direct kernel methods are used to separate abnormal MCG heart patterns from normal ones. For unsupervised learning, Direct Kernel based Self-Organizing Maps are introduced. For supervised learning, Direct Kernel Partial Least Squares and (Direct) Kernel Ridge Regression are used. These results are then compared with classical Support Vector Machines and Kernel Partial Least Squares. The hyper-parameters for these methods are tuned on a validation subset of the training data before testing. Also investigated is the most effective pre-processing, using local, vertical, horizontal and two-dimensional (global) Mahalanobis scaling, wavelet transforms, and variable selection by filtering. The results, similar for all three methods, were encouraging, exceeding the quality of classification achieved by the trained experts. Thus, a device and associated method for classifying cardiography data is disclosed, comprising applying a kernel transform to sensed data acquired from sensors sensing electromagnetic heart activity, resulting in transformed data, prior to classifying the transformed data using machine learning.
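The processing chain the abstract describes (wavelet transform, then kernel transform, then a machine-learned classifier) can be sketched roughly as follows. This is an illustrative toy pipeline, not the patented implementation: a one-level Haar transform stands in for the Daubechies wavelets, kernel ridge regression stands in for the full set of classifiers compared in the patent, and all function names, data, and parameters are invented for the sketch.

```python
import numpy as np

def haar_wavelet_transform(x):
    """One level of a Haar wavelet transform (a stand-in for the
    Daubechies transform the patent mentions)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2  # drop a trailing odd sample, if any
    approx = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)
    detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)
    return np.concatenate([approx, detail])

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
    sq = (np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-sq / (2.0 * sigma**2))

def fit_kernel_ridge(K_train, y, lam):
    """Solve (K + lam*I) alpha = y for the dual weights alpha."""
    n = K_train.shape[0]
    return np.linalg.solve(K_train + lam * np.eye(n), y)

# Toy data: two well-separated classes of 8-sample "traces".
rng = np.random.default_rng(0)
X_raw = np.vstack([rng.normal(0, 1, (10, 8)), rng.normal(3, 1, (10, 8))])
y = np.array([-1.0] * 10 + [1.0] * 10)

X = np.array([haar_wavelet_transform(row) for row in X_raw])  # wavelet step
K = rbf_kernel(X, X, sigma=2.0)                               # kernel step
alpha = fit_kernel_ridge(K, y, lam=0.1)                       # learning step
pred = np.sign(K @ alpha)                                     # classification
```

The point of the sketch is only the ordering of the steps: the kernel is applied to wavelet-domain data, and the learner operates on the kernel, exactly as claim 1 below recites.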
Representative Claims
We claim:
1. A method for automating the identification of meaningful features and the formulation of expert rules for classifying magnetocardiography data, comprising: applying a wavelet transform to sensed data acquired from sensors sensing magnetic fields generated by a patient's heart activity, resulting in wavelet domain data; applying a kernel transform to said wavelet domain data, resulting in transformed data; and identifying said meaningful features and formulating said expert rules from said transformed data, using machine learning.
2. The method of claim 1, further comprising: acquiring said sensed data from magnetic sensors proximate a patient's heart.
3. The method of claim 2, further comprising: classifying said transformed data using machine learning.
4. The method of claim 1, further comprising: classifying said transformed data using machine learning.
5. The method of claim 1, further comprising: classifying said transformed data using machine learning.
6. The method of claim 1, said kernel transform satisfying Mercer conditions.
7. The method of claim 1, said kernel transform comprising a radial basis function.
8. The method of claim 1, said applying a kernel transform comprising: assigning said transformed data to a first hidden layer of a neural network; applying training data descriptors as weights of said first hidden layer of said neural network; and calculating weights of a second hidden layer of said neural network numerically.
9. The method of claim 8, said calculating said weights of said second hidden layer numerically further comprising: calculating said weights of said second hidden layer using kernel ridge regression.
10. The method of claim 1, said applying a kernel transform comprising: applying a direct kernel transform.
11. The method of claim 1, further comprising: classifying said transformed data using a self-organizing map (SOM).
12.
The method of claim 1, further comprising: classifying said transformed data using a direct kernel self-organizing map (DK-SOM).
13. The method of claim 1, further comprising: classifying said transformed data using kernel partial least square (K-PLS) machine learning.
14. The method of claim 1, further comprising: classifying said transformed data using direct kernel partial least square (DK-PLS) machine learning.
15. The method of claim 1, further comprising: classifying said transformed data using a least-squares support vector machine (LS-SVM).
16. The method of claim 1, further comprising: classifying said transformed data using a direct kernel principal component analysis (DK-PCA).
17. The method of claim 1, further comprising: classifying said transformed data using a support vector machine (SVM/SVMLib).
18. The method of claim 17, said classifying said transformed data using a support vector machine (SVM/SVMLib) further comprising: setting an SVMLib regularization parameter, C, to C=1/λ, for an n data kernel, wherein: said λ is proportional to said n to a power of 3/2.
19. The method of claim 17, said classifying said transformed data using a support vector machine (SVM/SVMLib) further comprising: setting an SVMLib regularization parameter, C, to C=1/λ, for an n data kernel, wherein: λ = min{1, (n/1500)^(3/2)}.
20. The method of claim 1, said transforming said sensed data into said wavelet domain data comprising: applying a Daubechies wavelet transform to said sensed data.
21. The method of claim 1, further comprising: selecting features from said wavelet domain data which improve said classification of magnetocardiography data.
22. The method of claim 21, said selecting said features further comprising: eliminating selected undesirable features from said wavelet data.
23. The method of claim 22, said eliminating selected undesirable features comprising: eliminating outlying data from said wavelet data.
24.
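Claims 18 and 19 fix the SVMLib regularization parameter as C = 1/λ, with λ = min{1, (n/1500)^(3/2)} for an n-sample kernel, consistent with claim 18's statement that λ scales as n to the 3/2 power. A direct transcription (the function name is illustrative, not from the patent):

```python
def svmlib_C(n):
    """Regularization heuristic of claims 18-19:
    lambda = min{1, (n/1500)^(3/2)}, and C = 1/lambda."""
    lam = min(1.0, (n / 1500.0) ** 1.5)
    return 1.0 / lam
```

For n >= 1500 the cap applies and C = 1; for smaller kernels, C grows as (1500/n)^(3/2), i.e. less regularization for less data.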
The method of claim 22, said eliminating selected undesirable features comprising: eliminating cousin descriptors from said wavelet data.
25. The method of claim 21, said selecting said features further comprising: retaining only selected desirable features from said wavelet data.
26. The method of claim 25, said retaining only selected desirable features further comprising: using a training data set; and using a validation data set for confirming an absence of over-training of said training set.
27. The method of claim 26, said retaining only selected desirable features further comprising: using a genetic algorithm to obtain an optimal subset of features from said training data set; and using said genetic algorithm for evaluating performance on said validation data set.
28. The method of claim 26, said retaining only selected desirable features further comprising: measuring sensitivities of said features from said wavelet data in relation to predicted responses of said features; and eliminating lower-sensitivity features from among said features with comparatively lower sensitivity than other, higher-sensitivity features from among said features.
29. The method of claim 21, said selecting said features further comprising: eliminating selected undesirable features from said wavelet data; and retaining only selected desirable features from said wavelet data.
30. The method of claim 1, further comprising: normalizing said sensed data.
31. The method of claim 30, said normalizing said sensed data comprising: Mahalanobis scaling said sensed data.
32. The method of claim 1, further comprising: centering a kernel of said kernel transform.
33. The method of claim 32, said centering said kernel comprising: subtracting a column average from each column of a training data kernel; storing said column average for later recall when centering a test data kernel; and subtracting a row average from each row of said training data kernel.
34.
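Claim 33 centers the training kernel in two passes: subtract each column's average (storing it for test time), then subtract each row's average. A minimal sketch of that recipe, assuming the conventional direct-kernel variant in which the stored training column averages are also subtracted from the test kernel before its own row averages are removed (function names are ours):

```python
import numpy as np

def center_train_kernel(K):
    """Center a training kernel per claim 33: subtract each column's
    average (kept for test-time use), then each row's average."""
    col_avg = K.mean(axis=0)
    Kc = K - col_avg[None, :]
    Kc = Kc - Kc.mean(axis=1)[:, None]
    return Kc, col_avg

def center_test_kernel(K_test, col_avg):
    """Center a test kernel with the stored training column averages,
    then subtract each test row's own average."""
    Kc = K_test - col_avg[None, :]
    return Kc - Kc.mean(axis=1)[:, None]
```

For a symmetric training kernel this two-pass procedure leaves every row and column of the centered kernel with zero mean, which is the property kernel centering is meant to achieve.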
The method of claim 33, said centering said kernel further comprising: adding said stored column average to each column of said test data kernel; for each row, calculating an average of said test data kernel; and subtracting said row average from each horizontal entry of said test data kernel.
35. An apparatus for automating the identification of meaningful features and the formulation of expert rules for classifying magnetocardiography data, comprising computerized storage, processing and programming for: applying a wavelet transform to sensed data acquired from sensors sensing magnetic fields generated by a patient's heart activity, resulting in wavelet domain data; applying a kernel transform to said wavelet domain data, resulting in transformed data; and identifying said meaningful features and formulating said expert rules from said transformed data, using machine learning.
36. The apparatus of claim 35, further comprising an input for: acquiring said sensed data from magnetic sensors proximate a patient's heart.
37. The apparatus of claim 36, further comprising computerized storage, processing and programming for: classifying said transformed data using machine learning.
38. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using machine learning.
39. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using machine learning.
40. The apparatus of claim 35, wherein said kernel transform satisfies Mercer conditions.
41. The apparatus of claim 35, said kernel transform comprising a radial basis function.
42.
The apparatus of claim 35, said computerized storage, processing and programming for applying a kernel transform further comprising computerized storage, processing and programming for: assigning said transformed data to a first hidden layer of a neural network; applying training data descriptors as weights of said first hidden layer of said neural network; and calculating weights of a second hidden layer of said neural network numerically.
43. The apparatus of claim 42, said computerized storage, processing and programming for calculating said weights of said second hidden layer numerically further comprising computerized storage, processing and programming for: calculating said weights of said second hidden layer using kernel ridge regression.
44. The apparatus of claim 35, said computerized storage, processing and programming for applying a kernel transform further comprising computerized storage, processing and programming for: applying a direct kernel transform.
45. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using a self-organizing map (SOM).
46. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using a direct kernel self-organizing map (DK-SOM).
47. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using kernel partial least square (K-PLS) machine learning.
48. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using direct kernel partial least square (DK-PLS) machine learning.
49. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using a least-squares support vector machine (LS-SVM).
50.
The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using a direct kernel principal component analysis (DK-PCA).
51. The apparatus of claim 35, further comprising computerized storage, processing and programming for: classifying said transformed data using a support vector machine (SVM/SVMLib).
52. The apparatus of claim 51, said computerized storage, processing and programming for classifying said transformed data using a support vector machine (SVM/SVMLib) further comprising computerized storage, processing and programming for: setting an SVMLib regularization parameter, C, to C=1/λ, for an n data kernel, wherein: said λ is proportional to said n to a power of 3/2.
53. The apparatus of claim 51, said computerized storage, processing and programming for classifying said transformed data using a support vector machine (SVM/SVMLib) further comprising computerized storage, processing and programming for: setting an SVMLib regularization parameter, C, to C=1/λ, for an n data kernel, wherein: λ = min{1, (n/1500)^(3/2)}.
54. The apparatus of claim 35, said computerized storage, processing and programming for transforming said sensed data into said wavelet domain data comprising computerized storage, processing and programming for: applying a Daubechies wavelet transform to said sensed data.
55. The apparatus of claim 35, further comprising computerized storage, processing and programming for: selecting features from said wavelet domain data which improve said classification of magnetocardiography data.
56. The apparatus of claim 55, said computerized storage, processing and programming for selecting said features further comprising computerized storage, processing and programming for: eliminating selected undesirable features from said wavelet data.
57.
The apparatus of claim 56, said computerized storage, processing and programming for eliminating selected undesirable features comprising computerized storage, processing and programming for: eliminating outlying data from said wavelet data.
58. The apparatus of claim 56, said computerized storage, processing and programming for eliminating selected undesirable features comprising computerized storage, processing and programming for: eliminating cousin descriptors from said wavelet data.
59. The apparatus of claim 55, said computerized storage, processing and programming for selecting said features further comprising computerized storage, processing and programming for: retaining only selected desirable features from said wavelet data.
60. The apparatus of claim 59, said computerized storage, processing and programming for retaining only selected desirable features further comprising computerized storage, processing and programming for: using a training data set; and using a validation data set for confirming an absence of over-training of said training set.
61. The apparatus of claim 60, said computerized storage, processing and programming for retaining only selected desirable features further comprising computerized storage, processing and programming for: using a genetic algorithm to obtain an optimal subset of features from said training data set; and using said genetic algorithm for evaluating performance on said validation data set.
62. The apparatus of claim 60, said computerized storage, processing and programming for retaining only selected desirable features further comprising computerized storage, processing and programming for: measuring sensitivities of said features from said wavelet data in relation to predicted responses of said features; and eliminating lower-sensitivity features from among said features with comparatively lower sensitivity than other, higher-sensitivity features from among said features.
63.
The apparatus of claim 55, said computerized storage, processing and programming for selecting said features further comprising computerized storage, processing and programming for: eliminating selected undesirable features from said wavelet data; and retaining only selected desirable features from said wavelet data.
64. The apparatus of claim 35, further comprising computerized storage, processing and programming for: normalizing said sensed data.
65. The apparatus of claim 64, said computerized storage, processing and programming for normalizing said sensed data comprising computerized storage, processing and programming for: Mahalanobis scaling said sensed data.
66. The apparatus of claim 35, further comprising computerized storage, processing and programming for: centering a kernel of said kernel transform.
67. The apparatus of claim 66, said computerized storage, processing and programming for centering said kernel comprising computerized storage, processing and programming for: subtracting a column average from each column of a training data kernel; storing said column average for later recall when centering a test data kernel; and subtracting a row average from each row of said training data kernel.
68. The apparatus of claim 67, said computerized storage, processing and programming for centering said kernel further comprising computerized storage, processing and programming for: adding said stored column average to each column of said test data kernel; for each row, calculating an average of said test data kernel; and subtracting said row average from each horizontal entry of said test data kernel.
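Claims 31 and 65 call for Mahalanobis scaling of the sensed data as the normalization step. In the direct-kernel literature this term typically means column-wise standardization, i.e. shifting each feature to zero mean and scaling it to unit variance using training-set statistics. A sketch under that assumption (function names are ours):

```python
import numpy as np

def mahalanobis_scale(X):
    """Column-wise standardization ("Mahalanobis scaling" in the
    direct-kernel sense): zero mean, unit variance per feature.
    Returns the scaled data plus the statistics for reuse."""
    mu = X.mean(axis=0)
    sd = X.std(axis=0)
    sd[sd == 0] = 1.0  # guard against constant columns
    return (X - mu) / sd, mu, sd

def apply_scaling(X_new, mu, sd):
    """Scale new (e.g. test) data with the stored training statistics."""
    return (X_new - mu) / sd
```

Reusing the training-set mean and standard deviation on test data mirrors the stored-statistics pattern of the kernel-centering claims above.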