IPC Classification Information
Country/Type | United States (US) Patent, Granted
IPC (7th ed.) |
Application No. | US-0779766 (2010-05-13)
Registration No. | US-8102306 (2012-01-24)
Inventors / Address |
- Smith, Jr., Jerry Rosson
- Krycia, Joseph R.

Applicant / Address |
- The United States of America as represented by the Secretary of the Navy

Agent / Address |
Citation Info | Cited by: 22 / Cited patents: 18
Abstract
Typical inventive practice provides for electronic communication of a computer with a display, an active radar device (for transmitting target-location data and environmental data), a light measurement device (for transmitting visual light data), and passive imaging devices covering bands in the visual, infrared (MWIR and/or LWIR), and millimeter wave regions of the electromagnetic spectrum. Inventive software in the computer's memory establishes “operational modes.” Each operational mode is defined at least by a predominant environmental (obscuration and lighting) character, ascribes “modal indices” to individual imaging devices, and carries its own multispectral image fusion algorithm (which, pursuant to the ascribed modal indices, attributes weights to the imaging data from the respective imaging devices). The inventive software aims the imaging devices toward the target, selects an operational mode (based on the environmental data), and executes the image fusion algorithm associated with the selected operational mode so that a fused multispectral image is displayed.
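As a rough illustration of the mode-dependent weighted fusion the abstract describes, the sketch below implements a simple per-pixel weighted average in which each operational mode's modal indices act as weights on the respective imaging bands. The mode names, index values, and the weighted-average fusion rule are assumptions for illustration, not the patented algorithm:

```python
import numpy as np

# Hypothetical modal indices: per-mode weights ascribed to each imaging
# device (visual, MWIR, LWIR, millimeter wave). The values here are
# illustrative assumptions, not taken from the patent.
OPERATIONAL_MODES = {
    "clear_day":   {"visual": 0.6, "mwir": 0.2, "lwir": 0.1, "mmw": 0.1},
    "clear_night": {"visual": 0.0, "mwir": 0.4, "lwir": 0.4, "mmw": 0.2},
    "fog_day":     {"visual": 0.1, "mwir": 0.2, "lwir": 0.2, "mmw": 0.5},
}

def fuse_images(mode, images):
    """Weighted per-pixel fusion of co-registered single-band images.

    `images` maps device name -> 2-D numpy array (same shape, values
    normalized to [0, 1]). Devices absent from `images` contribute
    nothing; the remaining weights are renormalized to sum to 1.
    """
    indices = OPERATIONAL_MODES[mode]
    total = sum(indices[d] for d in images if indices.get(d, 0) > 0)
    fused = np.zeros(next(iter(images.values())).shape)
    for device, img in images.items():
        weight = indices.get(device, 0.0)
        if weight > 0:
            fused += (weight / total) * img
    return fused

# Example: in the assumed night mode the visual band is weighted zero,
# so the fused image is built from the MWIR and millimeter-wave data only.
shape = (4, 4)
imgs = {"visual": np.zeros(shape),
        "mwir": np.ones(shape),
        "mmw": np.full(shape, 0.5)}
out = fuse_images("clear_night", imgs)
```

Switching the mode argument changes the relative contribution of each band without touching the fusion code itself, which mirrors the abstract's point that each operational mode carries its own weighting of the imaging devices.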
Representative Claims
1. A computer-implemented multispectral imaging method comprising: establishing a set of operational modes, each said operational mode corresponding to circumstance including a prevailing environmental character as generally exists between a target and plural imaging devices, each said operational mode being characterized by modal indices individually assigned to said imaging devices, each said operational mode having associated therewith an image fusion algorithm for performing multispectral imaging of said target, said prevailing environmental character including a prevailing obscuration character and a prevailing visual lighting character, said imaging devices including a visual said imaging device, an infrared said imaging device, and a millimeter wave said imaging device, said modal indices being indicative of weighting, by the associated said image fusion algorithm, of imaging data respectively obtainable from said imaging devices; obtaining target location data from a radar device, said target location data being informative of the location of said target; obtaining visual light data from a visual light measurement device, said visual light data being informative of said prevailing visual lighting character; obtaining environmental data from a radar device, said environmental data being informative of said prevailing environmental character; aiming at least two said imaging devices toward said target, said aiming of said imaging devices being based on the obtained said target location data; selecting a said operational mode, said selection of said operational mode being based on the obtained said environmental data; and executing said image fusion algorithm associated with the selected said operational mode, said execution of the associated said operational mode including obtaining imaging data from at least two of the aimed said imaging devices, fusing the obtained said imaging data, and displaying an image representative of said fusion of the obtained said imaging data.

2. The computer-implemented method of claim 1, wherein said prevailing obscuration character describes the nature and degree of environmental obscuration.

3. The computer-implemented method of claim 1, wherein the infrared said imaging device includes a mid-wave infrared device and a long-wave infrared device.

4. The computer-implemented method of claim 1, wherein said visual light measurement device includes at least one of a timekeeping device and a light meter.

5. The computer-implemented method of claim 1, wherein said circumstance to which each said operational mode corresponds further includes the distance of said target from said imaging devices, and wherein said selection of said operational mode is further based on the obtained said target location data.

6. The computer-implemented method of claim 1, wherein said aiming is of all said imaging devices toward said target, and wherein said obtaining of said imaging data is from all said imaging devices.

7. A multispectral imaging system comprising a computer, a display, a radar device, a visual light measurement device, and plural imaging devices, said imaging devices including a visual said imaging device, an infrared said imaging device, and a millimeter wave said imaging device, said computer communicating with said display, said radar device, said visual light measurement device, and said imaging devices, said computer being configured to perform a method including: establishing a set of operational modes, each said operational mode corresponding to circumstance including a prevailing environmental character as generally exists between a target and said imaging devices, each said operational mode being characterized by modal indices individually assigned to said imaging devices, each said operational mode having associated therewith an image fusion algorithm for performing multispectral imaging of said target, said prevailing environmental character including a prevailing obscuration character and a prevailing visual lighting character, said modal indices being indicative of weighting, by the associated said image fusion algorithm, of imaging data respectively obtainable from said imaging devices; obtaining target location data from said radar device, said target location data being informative of the location of a target; obtaining visual light data from said visual light measurement device, said visual light data being informative of said prevailing visual lighting character; obtaining environmental data from said radar device, said environmental data being informative of said prevailing environmental character; aiming at least two said imaging devices toward said target, said aiming of said imaging devices being based on the obtained said target location data; selecting a said operational mode, said selection of said operational mode being based on the obtained said environmental data; and executing said image fusion algorithm associated with the selected said operational mode, said execution of the associated said operational mode including obtaining imaging data from at least two of the aimed said imaging devices, fusing the obtained said imaging data, and displaying on said display an image representative of said fusion of the obtained said imaging data.

8. The multispectral imaging system of claim 7, wherein said prevailing obscuration character describes the nature and degree of environmental obscuration.

9. The multispectral imaging system of claim 7, wherein the infrared said imaging device includes a mid-wave infrared device and a long-wave infrared device.

10. The multispectral imaging system of claim 7, wherein said visual light measurement device includes at least one of a timekeeping device and a light meter.

11. The multispectral imaging system of claim 7, wherein said circumstance to which each said operational mode corresponds further includes the distance of said target from said imaging devices, and wherein said selection of said operational mode is further based on the obtained said target location data.

12. The multispectral imaging system of claim 7, wherein said aiming is of all said imaging devices toward said target, and wherein said obtaining of said imaging data is from all said imaging devices.

13. A computer program product for use in association with sensory apparatus in order to effect multispectral imaging, the computer program product comprising a computer-readable storage medium having computer-readable program code portions stored therein, the computer-readable program code portions including: a first executable portion for establishing a set of operational modes, each said operational mode corresponding to circumstance including a prevailing environmental character as generally exists between a target and plural imaging devices, each said operational mode being characterized by modal indices individually assigned to said imaging devices, each said operational mode having associated therewith an image fusion algorithm for performing multispectral imaging of said target, said prevailing environmental character including a prevailing obscuration character and a prevailing visual lighting character, said imaging devices including a visual said imaging device, an infrared said imaging device, and a millimeter wave said imaging device, said modal indices being indicative of weighting, by the associated said image fusion algorithm, of imaging data respectively obtainable from said imaging devices; a second executable portion for obtaining target location data from a radar device, said target location data being informative of the location of said target; a third executable portion for obtaining visual light data from a visual light measurement device, said visual light data being informative of said prevailing visual lighting character; a fourth executable portion for obtaining environmental data from a radar device, said environmental data being informative of said prevailing environmental character; a fifth executable portion for aiming at least two said imaging devices toward said target, said aiming of said imaging devices being based on the obtained said target location data; a sixth executable portion for selecting a said operational mode, said selection of said operational mode being based on the obtained said environmental data; and a seventh executable portion for obtaining imaging data from at least two of the aimed said imaging devices; and an eighth executable portion for fusing the obtained said imaging data in accordance with said image fusion algorithm associated with the selected said operational mode.

14. The computer program product of claim 13, wherein said prevailing obscuration character describes the nature and degree of environmental obscuration.

15. The computer program product of claim 13, wherein the infrared said imaging device includes a mid-wave infrared device and a long-wave infrared device.

16. The computer program product of claim 13, wherein said visual light measurement device includes at least one of a timekeeping device and a light meter.

17. The computer program product of claim 13, wherein said circumstance to which each said operational mode corresponds further includes the distance of said target from said imaging devices, and wherein said selection of said operational mode is further based on the obtained said target location data.

18. The computer program product of claim 13, wherein said aiming is of all said imaging devices toward said target, and wherein said obtaining of said imaging data is from all said imaging devices.

19. The computer program product of claim 13, wherein said fifth executable portion is for aiming only those said imaging devices whose imaging data are fused in accordance with said image fusion algorithm associated with the selected said operational mode, and wherein said seventh executable portion is for obtaining imaging data from all of the aimed said imaging devices.

20. The computer program product of claim 13, wherein the computer program product is additionally for use in association with a display, and wherein the computer-readable program code portions include a ninth executable portion for displaying an image representative of said fusion of the obtained said imaging data.
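The mode-selection step recited in the claims (selecting an operational mode based on the obtained environmental data and visual light data) can be sketched as a simple decision rule. The threshold values, mode names, and the idea of reducing the environmental data to a single obscuration level are assumptions for illustration only; the patent does not disclose these specifics:

```python
def select_mode(obscuration_level, illuminance_lux):
    """Pick an operational mode from prevailing obscuration and lighting.

    `obscuration_level` is a hypothetical value in [0, 1] summarizing
    environmental data (e.g. derived from radar returns);
    `illuminance_lux` is ambient light from a light meter or inferred
    from a timekeeping device. Both thresholds below are assumed.
    """
    obscured = obscuration_level > 0.5    # assumed obscuration threshold
    daylight = illuminance_lux > 100.0    # assumed daylight threshold (lux)
    if obscured:
        return "fog_day" if daylight else "fog_night"
    return "clear_day" if daylight else "clear_night"

# Heavy obscuration sensed by radar during daylight hours selects a
# fog/day mode, whose modal indices would favor the millimeter-wave band.
mode = select_mode(0.7, 500.0)
```

Each returned mode name would then index into the table of modal indices, so the fusion weighting changes automatically as the sensed environment changes.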