IPC Classification Information
Country / Type | United States (US) Patent, Registered
International Patent Classification (IPC, 7th edition) |
Application Number | US-0539835 (2009-08-12)
Registration Number | US-8319666 (2012-11-27)
Inventors / Address |
- Weinmann, Robert V.
- Gelinske, Joshua N.
- Allen, Robert M.
- Wiig, Johan A.
Applicant / Address |
Agent / Address | Law Office of Mark Brown, LLC
Citation Information | Cited by: 52 / Patents cited: 19
Abstract
A system and method of acquiring information from an image of a vehicle in real time wherein at least one imaging device with advanced light metering capabilities is placed aboard a vehicle, a computer processor means is provided to control the imaging device and the advanced light metering capabilities, the advanced light metering capabilities are used to capture an image of at least a portion of the vehicle, and image recognition algorithms are used to identify the current state or position of the corresponding portion of the vehicle.
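The abstract names three "advanced light metering" modes: spot, average, and center-weighted average metering. These are standard exposure-metering strategies, and their difference is simply how pixel luminance is weighted. The following is a minimal pure-Python sketch of the three modes on a grayscale frame (a list of rows of 0-255 values); all function names and the example frame are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of the three metering modes named in the abstract.
# A "frame" is a list of rows of 0-255 grayscale luminance values.

def average_metering(frame):
    """Mean luminance over the whole frame."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def spot_metering(frame, cx, cy, radius=1):
    """Mean luminance in a small window centered on pixel (cx, cy)."""
    h, w = len(frame), len(frame[0])
    window = [frame[y][x]
              for y in range(max(0, cy - radius), min(h, cy + radius + 1))
              for x in range(max(0, cx - radius), min(w, cx + radius + 1))]
    return sum(window) / len(window)

def center_weighted_metering(frame, center_weight=3.0):
    """Average metering with extra weight on the central region."""
    h, w = len(frame), len(frame[0])
    total = weight_sum = 0.0
    for y, row in enumerate(frame):
        for x, p in enumerate(row):
            # Pixels in the middle half of the frame get the higher weight.
            central = h // 4 <= y < 3 * h // 4 and w // 4 <= x < 3 * w // 4
            wgt = center_weight if central else 1.0
            total += wgt * p
            weight_sum += wgt
    return total / weight_sum

# Hypothetical frame: a bright gauge region in the middle of a dark panel.
panel = [[10, 10, 10, 10],
         [10, 200, 200, 10],
         [10, 200, 200, 10],
         [10, 10, 10, 10]]
```

Spot metering on the bright region reads higher than the full-frame average, which is the property the claimed system would exploit when an instrument of interest occupies only part of the image.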
Representative Claims
1. A method of acquiring information from an image of at least a portion of a vehicle comprising the steps of: providing at least one imaging device aboard said vehicle; providing a computer processor connected to and controlling said imaging device; capturing an image of said at least a portion of a vehicle with said imaging device; inputting said image to said computer processor; identifying with said computer processor a state of said image; said computer processor providing an output corresponding to said image state; providing said imaging device with advanced light metering capabilities chosen from among the group comprising spot metering, average metering and center-weighted average metering; and controlling said light metering capabilities with said computer processor.

2. The method of claim 1 wherein all steps are performed in real time while said vehicle is in operation.

3. The method of claim 1 further comprising the steps of: analyzing said image state with a rules engine executing on said computer processor; and determining if said image state indicates that said vehicle is in violation of a condition defined by said rules engine and, if so, initiating an appropriate response to said violation.

4. The method of claim 3 wherein said rules engine comprises aircraft flight profile rules as used by a flight operations quality assurance (FOQA) program.

5. The method of claim 3 wherein said step of initiating an appropriate response includes at least one of: sounding an aural alarm; displaying a visual alarm; and reporting the condition to an off-board station by means of a telemetry device.

6. The method of claim 1 wherein said step of using advanced light metering capabilities to capture an image further includes applying said advanced light metering capabilities directly to raw pixel data before an image file is created.

7. The method of claim 1 wherein said image includes an object of interest comprising at least a portion of an instrument panel of said vehicle.

8. The method of claim 7 wherein said at least a portion of said instrument panel is a feature selected from the group comprising: mechanical gauge, digital readout, status light, functional switch, computer display, and operator control.

9. The method of claim 1, which includes the steps of: calibrating the imaging device with an adaptive imaging module; acquiring a test image of an object of interest with said adaptive imaging module; and identifying coordinates for said object of interest in said test image.

10. The method of claim 9, which includes the steps of: providing an object library comprising information corresponding to images of pre-identified objects of interest; determining if an object of interest is included in the object library; if said object of interest is found in the object library, storing the object of interest configuration from said object library with said computer processor; and if said object of interest is not found in the object library, identifying and storing its configuration and operational characteristics with said computer processor.

11. The method of claim 10 wherein said object library includes pre-identified objects of interest corresponding to specific types of vehicles and said method includes the step of selecting the vehicle type from said object library.

12. The method of claim 1, which includes the steps of: providing raw image data of an object of interest from said imaging device; applying a low-pass filter to said raw image data to remove image noise therefrom; using an edge detection algorithm for identifying points in said image data at which the image brightness changes sharply or has detectable discontinuities; applying a binary hard limiter to convert an edge-only image to a binary image of said object of interest; and providing an output corresponding to a state of said object of interest binary image.

13. The method of claim 12, which includes the step of: applying a high-pass filter to perform edge detection on said image data.

14. The method of claim 12, which includes the step of: applying an image differentiator to perform edge detection on said image data.

15. The method of claim 12, which includes the steps of: creating a fiducial image of an object of interest; storing with said computer processor said fiducial image; providing a binary image with said imaging device; and aligning said binary image detected by said imaging device by comparing and cross-correlating said binary image with said fiducial image.

16. The method of claim 15, which includes the steps of: applying a mask to said aligned binary image to isolate a portion of said object of interest; said imaging device providing an input to said computer processor corresponding to said isolated portion of said object of interest; and said computer processor analyzing a state of said isolated portion of said object of interest and providing a corresponding output.

17. The method of claim 16, which includes the steps of: determining a state of the object of interest using either: synthetic images of the isolated portion of the object of interest for comparison with the masked binary image; or a linear regression to fit the points (pixels) from the masked binary image to determine the object of interest state.

18. The method of claim 1, which includes the steps of: providing a configuration file for an object of interest with limits of travel of a moving or changing part of the object of interest; and determining an operating condition of the vehicle by comparing the object of interest binary file with the configuration file.

19. The method of claim 1, which includes the steps of: using a set-up utility to create a fiducial image comprising multiple individual fiducial images of multiple objects of interest; and using said set-up utility to create a feature mask for each individual object of interest.

20. The method of claim 1, which includes the additional steps of: receiving geospatial data corresponding to a geospatial position, velocity or attitude of said vehicle; combining said geospatial data with data from said imaging device corresponding to a state of an object of interest in said vehicle to create a fused sensor value; providing a rules engine corresponding to operating characteristics of said vehicle; comparing with said computer processor said fused sensor value with said rules engine; detecting an exceedance of said rules engine based on said comparison with said fused sensor value; and providing an event response comprising at least one of: recording said fused sensor value corresponding to said event; recording a video from said imaging device output; communicating the event and/or the fused sensor value offboard the vehicle via a telemetry device; and communicating the event and/or the fused sensor value offboard the vehicle via a wide-area network.

21. The method of claim 20, which includes the steps of: providing a global navigation satellite system (GNSS) receiver including a GNSS antenna mounted on said vehicle; providing an inertial measurement unit (IMU) mounted on said vehicle; connecting said GNSS receiver and said IMU to said computer processor; providing input signals to said computer processor from said GNSS receiver and said IMU corresponding to geospatial positions and attitudes of said vehicle; said computer processor calculating the location and orientation of said vehicle using said GNSS receiver and said IMU signals; and said computer processor combining said vehicle location and orientation information with said extracted information to create said fused sensor value.

22. The method of claim 1 wherein said image includes an object of interest comprising at least a portion of an exterior surface of said vehicle.

23.
The method of claim 22 wherein said vehicle is an aircraft and said at least a portion of an exterior surface of said vehicle is a feature selected from the group comprising: wing, strut, vertical stabilizer, horizontal stabilizer, aileron, flap, rudder, elevator, landing gear, and exterior light.

24. A method of acquiring information pertaining to the operation of a vehicle from images of objects of interest within the vehicle, from the exterior surface of the vehicle, or a combination thereof, which method comprises the steps of: providing at least one imaging device aboard said vehicle; providing a computer processor connected to and controlling said imaging device; calibrating the imaging device with an adaptive imaging module; acquiring a test image of said object of interest with said adaptive imaging module; identifying coordinates for said object of interest in said test image; providing an object library comprising information corresponding to images of pre-identified objects corresponding to specific types of vehicles; determining if an object of interest is included in the object library; selecting the vehicle type from said object library; if said object of interest is found in the object library, storing the object of interest configuration from said object library with said computer processor; if said object of interest is not found in the object library, identifying and storing its configuration and operational characteristics with said computer processor; providing raw image data of objects of interest from said imaging device; applying a low-pass filter to said raw image data to remove image noise therefrom; using an edge detection algorithm for identifying points in said image data at which the image brightness changes sharply or has detectable discontinuities; applying a binary hard limiter to convert edge-only images to binary images of said objects of interest; providing an output from said computer processor corresponding to a state of said objects of interest binary images; applying either a high-pass filter or an image differentiator to perform edge detection on said image data; using a set-up utility to create multiple individual fiducial images of multiple objects of interest; storing with said computer processor said fiducial images; providing binary images of said objects of interest with said imaging device; aligning said binary images detected by said imaging device by comparing and cross-correlating said binary images with said fiducial images; using said set-up utility to create a feature mask for each individual object of interest; applying a mask to said aligned binary images to isolate portions of said objects of interest; said imaging device providing an input to said computer processor corresponding to said isolated portions of said objects of interest; said computer processor analyzing a state of said isolated portions of said objects of interest and providing a corresponding output; determining states of the objects of interest using either: synthetic images of the isolated portions of the objects of interest for comparison with the masked binary images; or linear regressions to fit the points (pixels) from the masked binary images to determine the objects of interest states; providing configuration files for the objects of interest with limits of travel of moving or changing parts of the objects of interest; determining operating conditions of the vehicle by comparing the objects of interest binary files with the configuration files; receiving geospatial data corresponding to a geospatial position, velocity or attitude of said vehicle; combining said geospatial data with data from said imaging device corresponding to states of objects of interest in said vehicle to create fused sensor values; providing a rules engine corresponding to operating characteristics of said vehicle; comparing with said computer processor said fused sensor values with said rules engine; detecting an exceedance(s) of said rules engine based on said comparisons with said fused sensor values; providing an event response(s) comprising at least one of: recording said fused sensor value corresponding to said event; recording a video from said imaging device output; communicating the event and/or the fused sensor value offboard the vehicle via a telemetry device; and communicating the event and/or the fused sensor value offboard the vehicle via a wide-area network; a software-controlled imaging device mounted on an interior or exterior surface of said vehicle; a computer processor connected to and adapted for controlling said imaging device; said computer processor including a memory module; said imaging device capturing an image of said object of interest and providing imaging data as an input to said computer processor; storing said imaging data in said memory module; said computer processor using said imaging data to determine an image state of said vehicle and providing an output corresponding thereto, wherein said imaging device includes advanced light metering capabilities chosen from among the group comprising spot metering, average metering and center-weighted average metering; and said computer processor being adapted for controlling said light metering capabilities.

25. The system of claim 24 further comprising: a rules engine executing on said computer processor and adapted for analyzing said image state; and said computer processor being adapted for determining if said image state indicates that said vehicle is in violation of a condition defined by said rules engine and initiating an appropriate response to said violation.

26. The system of claim 25 wherein said rules engine comprises aircraft flight profile rules as used by a flight operations quality assurance (FOQA) program.

27. The system of claim 25 wherein said appropriate response includes at least one of: sounding an aural alarm; displaying a visual alarm; and reporting the condition to an off-board station by means of a telemetry device.
28. The system of claim 24 wherein said imaging device is adapted for using advanced light metering capabilities to capture an image and for applying said advanced light metering capabilities directly to raw pixel data before an image file is created.

29. The system of claim 24 wherein said image includes an object of interest comprising at least a portion of an instrument panel of said vehicle.

30. The system of claim 29 wherein said at least a portion of said instrument panel is a feature selected from the group comprising: mechanical gauge, digital readout, status light, functional switch, and operator control.

31. The system of claim 24, which includes: an adaptive imaging module adapted for calibrating the imaging device; said adaptive imaging module being adapted for acquiring a test image of an object of interest and identifying coordinates for said object of interest in said test image.

32. The system of claim 31, which includes: an object library stored in said memory module and comprising information corresponding to images of pre-identified objects of interest; means for determining if an object of interest is included in the object library; means for storing the object of interest configuration from said object library with said computer processor if said object of interest is found in the object library; and means for identifying and storing its configuration and operational characteristics with said computer processor if said object of interest is not found in the object library.

33. The system of claim 32 wherein said object library includes pre-identified objects of interest corresponding to specific types of vehicles and said computer processor is adapted for selecting the vehicle type from said object library.

34. The system of claim 24, which includes: said imaging device being adapted for providing raw image data of an object of interest; a low-pass filter connected to said imaging device and adapted for removing image noise from said raw image data; said computer processor including an edge detection algorithm for identifying points in said image data at which the image brightness changes sharply or has detectable discontinuities; said computer processor including a binary hard limiter adapted for converting an edge-only image to a binary image of said object of interest; and said computer processor adapted for providing an output corresponding to a state of said object of interest binary image.

35. The system of claim 34, which includes: a high-pass filter connected to said imaging device and adapted for performing edge detection on said image data.

36. The system of claim 34, which includes: an image differentiator connected to said imaging device and adapted for performing edge detection on said image data.

37. The system of claim 24, which includes: said computer processor being adapted for creating and storing a fiducial image of an object of interest; said imaging device being adapted for providing a binary image of said object of interest; and said computer processor being adapted for aligning said binary image detected by said imaging device by comparing and cross-correlating said binary image with said fiducial image.

38. The system of claim 37, which includes: said computer processor being adapted for applying a mask to said aligned binary image to isolate a portion of said object of interest; said imaging device being adapted for providing an input to said computer processor corresponding to said isolated portion of said object of interest; and said computer processor being adapted for analyzing a state of said isolated portion of said object of interest and providing a corresponding output.

39. The system of claim 38, which includes: means for determining a state of the object of interest using either: synthetic images of the isolated portion of the object of interest for comparison with the masked binary image; or a linear regression to fit the points (pixels) from the masked binary image to determine the object of interest state.

40. The system of claim 24, which includes: means for providing a configuration file for an object of interest with limits of travel of a moving or changing part of the object of interest; and means for determining an operating condition of the vehicle by comparing the object of interest binary file with the configuration file.

41. The system of claim 24, which includes: a set-up utility adapted for creating a fiducial image comprising multiple individual fiducial images of multiple objects of interest; and said set-up utility being adapted for creating a feature mask for each individual object of interest.

42. The system of claim 24, which includes: means for receiving geospatial data corresponding to a geospatial position, velocity or attitude of said vehicle; means for combining said geospatial data with data from said imaging device corresponding to a state of an object of interest in said vehicle to create a fused sensor value; means for providing a rules engine corresponding to operating characteristics of said vehicle; means for comparing with said computer processor said fused sensor value with said rules engine; means for detecting an exceedance of said rules engine based on said comparison with said fused sensor value; and means for providing an event response comprising at least one of: recording said fused sensor value corresponding to said event; recording a video from said imaging device output; communicating the event and/or the fused sensor value offboard the vehicle via a telemetry device; and communicating the event and/or the fused sensor value offboard the vehicle via a wide-area network.

43. The system of claim 42, further comprising: a global navigation satellite system (GNSS) receiver including a GNSS antenna, said receiver being mounted on or associated with said vehicle; an inertial measurement unit (IMU) mounted on said vehicle; said GNSS receiver and said IMU being connected to said computer processor and adapted for providing input signals thereto corresponding to geospatial positions and attitudes of said vehicle; said computer processor being adapted for calculating the location and orientation of said vehicle using said GNSS receiver and said IMU signals; and wherein said location and orientation are combined with said extracted information to create a fused sensor value.

44. The system of claim 43 further comprising a rules engine executing on said computer processing means, wherein said rules engine determines if said fused sensor value indicates that said vehicle is in violation of a condition defined by said rules engine.

45. The system of claim 44 further comprising a telemetry device for transmitting said extracted information and said violation from said vehicle.

46. The system of claim 24 wherein said image includes an object of interest comprising at least a portion of an exterior surface of said vehicle.

47. The system of claim 46 wherein said vehicle is an aircraft and said at least a portion of an exterior surface of said vehicle is a feature selected from the group comprising: wing, strut, vertical stabilizer, horizontal stabilizer, aileron, flap, rudder, elevator, landing gear, and exterior light.

48.
A storage medium encoded with a machine-readable computer program code, the code including instructions for causing a computer to implement a method for acquiring information from an image of at least a portion of a vehicle, which method comprises the steps of: providing at least one imaging device aboard said vehicle; providing a computer processor to control said imaging device; capturing an image of said at least a portion of said vehicle with said imaging device; inputting said image to said computer processor; identifying with said computer processor a state of said image; said computer processor providing an output corresponding to said image state; calibrating said imaging device with an adaptive imaging module; acquiring a test image of an object of interest with said adaptive imaging module; identifying coordinates for said object of interest in said test image; creating a fiducial image of the object of interest; storing with said computer processor said fiducial image; providing a binary image with said imaging device; aligning said binary image detected by said imaging device by comparing and cross-correlating said binary image with said fiducial image; applying a mask to said aligned binary image to isolate a portion of said object of interest; said imaging device providing an input to said computer processor corresponding to said isolated portion of said object of interest; said computer processor analyzing a state of said isolated portion of said object of interest and providing a corresponding output; providing said imaging device with advanced light metering capabilities chosen from among the group comprising spot metering, average metering and center-weighted average metering; and controlling said light metering capabilities with said computer processor.

49.
A computer data signal comprising code configured to cause a processor to implement a method for acquiring information from an image of at least a portion of a vehicle, which method comprises the steps of: providing at least one imaging device aboard said vehicle; providing a computer processor to control said imaging device; capturing an image of said at least a portion of said vehicle with said imaging device; inputting said image to said computer processor; identifying with said computer processor a state of said image; said computer processor providing an output corresponding to said image state; calibrating said imaging device with an adaptive imaging module; acquiring a test image of an object of interest with said adaptive imaging module; identifying coordinates for said object of interest in said test image; creating a fiducial image of the object of interest; storing with said computer processor said fiducial image; providing a binary image with said imaging device; aligning said binary image detected by said imaging device by comparing and cross-correlating said binary image with said fiducial image; applying a mask to said aligned binary image to isolate a portion of said object of interest; said imaging device providing an input to said computer processor corresponding to said isolated portion of said object of interest; said computer processor analyzing a state of said isolated portion of said object of interest and providing a corresponding output; providing said imaging device with advanced light metering capabilities chosen from among the group comprising spot metering, average metering and center-weighted average metering; and controlling said light metering capabilities with said computer processor.
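Claim 12 recites a three-stage chain: low-pass filter the raw image, run edge detection, then apply a binary hard limiter. The sketch below illustrates that chain in pure Python with a 3x3 box blur, a central-difference gradient magnitude, and a fixed threshold; the kernel size and threshold are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of the claim-12 chain: low-pass filter -> edge
# detection -> binary hard limiter. Images are lists of rows of numbers.

def low_pass(img):
    """3x3 box blur (a simple low-pass filter) with edge clamping."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(vals) / 9.0
    return out

def edge_magnitude(img):
    """Central-difference gradient magnitude (a basic edge detector)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def hard_limit(img, threshold):
    """Binary hard limiter: 1 where edge strength exceeds the threshold."""
    return [[1 if v > threshold else 0 for v in row] for row in img]

def binarize(raw, threshold=30.0):
    return hard_limit(edge_magnitude(low_pass(raw)), threshold)

# Hypothetical input: a vertical brightness step (dark left, bright right).
step = [[0, 0, 0, 255, 255, 255] for _ in range(6)]
edges = binarize(step)  # 1s appear along the brightness step
```

In practice the claim's alternatives (high-pass filter or image differentiator, claims 13-14) would replace `edge_magnitude`; the surrounding stages are unchanged.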
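Claim 15 aligns the live binary image against a stored fiducial image "by comparing and cross-correlating." One simple reading of that step is an exhaustive search over small integer shifts, scoring each shift by the overlap of set pixels. The sketch below assumes that reading; the shift range and function names are hypothetical.

```python
# Illustrative cross-correlation alignment, as one reading of claim 15.
# Both images are binary (0/1) lists of rows of equal size.

def correlate_at(fiducial, image, dx, dy):
    """Count fiducial pixels that match a set image pixel under shift (dx, dy)."""
    h, w = len(fiducial), len(fiducial[0])
    score = 0
    for y in range(h):
        for x in range(w):
            sy, sx = y + dy, x + dx
            if 0 <= sy < h and 0 <= sx < w:
                score += fiducial[y][x] * image[sy][sx]
    return score

def align(fiducial, image, max_shift=2):
    """Return the (dx, dy) shift maximizing the cross-correlation score."""
    return max(((dx, dy) for dx in range(-max_shift, max_shift + 1)
                         for dy in range(-max_shift, max_shift + 1)),
               key=lambda s: correlate_at(fiducial, image, s[0], s[1]))

# Hypothetical example: the image's feature sits one pixel down-right
# of where the fiducial expects it.
fid = [[0] * 5 for _ in range(5)]
fid[1][1] = 1
img = [[0] * 5 for _ in range(5)]
img[2][2] = 1
```

A production system would more likely use FFT-based correlation for speed, but the exhaustive form shows the idea in a few lines.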
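Claim 17 offers linear regression over the set pixels of the masked binary image as one way to read an object's state, which fits a gauge needle well: a least-squares line through the needle pixels yields its angle, and hence the instrument reading. The sketch below is an assumed illustration of that option; it is not code from the patent, and the angle is in image coordinates (y grows downward).

```python
# Illustrative least-squares fit of a gauge-needle angle, per claim 17's
# linear-regression alternative. Input is a masked binary image.

import math

def needle_angle_deg(binary):
    """Fit a least-squares line through the '1' pixels; return its angle."""
    pts = [(x, y) for y, row in enumerate(binary)
                  for x, v in enumerate(row) if v]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    # atan2 handles a vertical needle (sxx == 0) without dividing by zero.
    return math.degrees(math.atan2(sxy, sxx))

# Hypothetical masked needle pixels along the main diagonal.
diag = [[1, 0, 0],
        [0, 1, 0],
        [0, 0, 1]]
```

The configuration file of claim 18 (limits of travel of the moving part) would then map this angle onto the gauge's value range.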
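Claims 20-21 combine an image-derived object state with GNSS/IMU geospatial data into a "fused sensor value," which a rules engine then checks for exceedances. The sketch below illustrates that flow with one invented rule (flaps extended above a flap-limit speed); the rule, the field names, and the numbers are hypothetical illustrations, not from the patent or any FOQA profile.

```python
# Illustrative fused-sensor-value / rules-engine flow per claims 20-21.
# All fields, rules, and limits here are invented for illustration.

from dataclasses import dataclass

@dataclass
class FusedSensorValue:
    airspeed_kts: float    # from GNSS/IMU-derived velocity
    flap_angle_deg: float  # from image recognition of the flap position

def flap_overspeed_rule(fused, max_flap_speed_kts=140.0):
    """Exceedance if flaps are deployed above the flap-limit speed."""
    return fused.flap_angle_deg > 0 and fused.airspeed_kts > max_flap_speed_kts

events = [FusedSensorValue(120.0, 15.0),  # flaps out, below limit: OK
          FusedSensorValue(165.0, 15.0),  # flaps out, above limit: exceedance
          FusedSensorValue(165.0, 0.0)]   # clean configuration: OK
exceedances = [f for f in events if flap_overspeed_rule(f)]
```

On an exceedance, the claimed event response would record the fused value and video and/or telemeter the event off-board, as enumerated in claim 20.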