IPC Classification Information

Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.):
Application number: US-0711252 (2004-09-03)
Registration number: US-7295106 (2007-11-13)
Inventors / Address:
- Jackson, John
- Ershtein, Maksim
- Danileiko, Alexandre
- Ide, Curtis Evan
- Heller, Jonathan
Applicant / Address:
Citation information: cited by 20 patents; cites 4 patents
Abstract
A method and system for the classification of an object within a zone of a specified area with multiple surveillance means. The invention performs the steps of receiving a set of objects within a predefined zone area from each of at least a first and second surveillance means. Subsequently, each received set of objects is filtered to ensure that the objects in the set are comparable to the objects in the other received set. Characteristics of the received sets of objects are compared and characteristics of the objects within a received set of objects are compared to characteristics of the objects within a different set of received objects, wherein the characteristics are based upon a set of predetermined characteristics. It is determined if each object or set identified by the first surveillance means corresponds to an object or set identified by the second surveillance means.
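The pipeline in the abstract (receive a set of detections from each device, filter each set down to comparable objects, then match across devices by comparing characteristics) can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Detection` schema, the object kinds, and the use of planar distance as the compared characteristic are all assumptions for the example.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Detection:
    """One object reported by a surveillance device (hypothetical schema)."""
    obj_id: str
    x: float
    y: float
    kind: str  # e.g. "person", "vehicle"

def filter_comparable(dets, allowed_kinds):
    """Filtering step: keep only detections both devices can report on."""
    return [d for d in dets if d.kind in allowed_kinds]

def match_sets(first, second, max_dist):
    """Pair each first-device detection with the nearest unclaimed
    second-device detection within max_dist; unmatched entries get None."""
    pairs = []
    free = list(second)
    for a in first:
        best = min(free, key=lambda b: hypot(a.x - b.x, a.y - b.y), default=None)
        if best is not None and hypot(a.x - best.x, a.y - best.y) <= max_dist:
            free.remove(best)
            pairs.append((a, best))
        else:
            pairs.append((a, None))
    return pairs

# Example: one RFID-style reading, two camera objects (one comparable).
first = [Detection("tag-1", 0.0, 0.0, "person")]
second = [Detection("cam-7", 0.5, 0.0, "person"),
          Detection("cam-8", 9.0, 9.0, "vehicle")]
kinds = {"person"}
pairs = match_sets(filter_comparable(first, kinds),
                   filter_comparable(second, kinds), max_dist=1.0)
```

The greedy nearest-neighbor pairing above stands in for the patent's "comparing characteristics" step; any matching policy over the predetermined characteristics would fit the same skeleton.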
Representative Claims
What is claimed is:

1. A method for the classification of an individual or object within a zone of a specified area with multiple surveillance devices, wherein the method comprises the steps of: receiving a first set of objects from a first surveillance device and a second set of objects from a second surveillance device within a predefined zone area; filtering the first and second sets of objects according to a set of predetermined characteristics to ensure that the objects in the first set are comparable to the objects in the second set of objects; comparing characteristics of the first and second sets of objects, wherein the characteristics are based upon the set of predetermined characteristics; further comparing characteristics of the objects within the first set of objects to characteristics of the objects within the second set of received objects, wherein the characteristics are based upon the set of predetermined characteristics; determining if each object identified by the first surveillance device corresponds to an object identified by the second surveillance device; and classifying each object identified by the first surveillance device according to whether it corresponds to an object identified by the second surveillance device; wherein the second surveillance device provides a video feed of a field-of-view of the predefined zone area; the objects identified by the first surveillance device comprise at least one of an active identification device or a passive identification device, wherein each first surveillance device comprises an associated profile.

2. The method of claim 1, further including the step of the first and second surveillance devices determining the location of the objects within the first and second sets of objects.

3. The method of claim 1, wherein the video feed is used to construct a 2D map of the predetermined zone area featuring the location of the objects present in the video feed.

4.
The method of claim 1, further including the step of utilizing object location data acquired from the first and second surveillance devices in conjunction with video feed data of the objects received at the second surveillance device in order to construct a 3D map of the predetermined zone area, the friendly and unfriendly objects situated within the zone area being displayed upon the 3D map.

5. The method of claim 1, wherein the step of comparing the characteristics of the filtered first and second sets of objects further comprises the step of determining if an object received by the first surveillance device is within a predetermined measure of distance from an object received by the second surveillance device.

6. The method of claim 5, wherein if it is determined that an object received by the first surveillance device is within a predetermined distance from an object identified by the second surveillance device, then the two objects are assumed to be the same object.

7. The method of claim 6, further including the step of assigning and identifying an object received by the second surveillance device with a profile of an object received by the first surveillance device if the two objects are determined to be the same object.

8. The method of claim 7, wherein if an object is identified, then no action is taken and the identified object is classified as a friendly object.

9. The method of claim 7, wherein if an object is not identified, then an alarm condition is initiated and the object is classified as an unfriendly object.

10. The method of claim 1, wherein the first surveillance device and the second surveillance device are different types of devices.

11.
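Claims 5 through 9 describe a concrete decision rule: if a camera-observed object lies within a predetermined distance of an identification-device reading, the two are assumed to be the same object, the camera object inherits that reading's profile, and it is classified as friendly; otherwise an alarm condition is initiated and it is classified as unfriendly. A sketch of that rule, with the threshold value, data shapes, and return labels chosen for illustration:

```python
from math import hypot

ALARM_DISTANCE = 2.0  # assumed threshold; the claims leave the measure unspecified

def classify(tag_readings, camera_objects, profiles):
    """Classify each camera-observed object as friendly (a tag reading within
    ALARM_DISTANCE supplies a profile) or unfriendly (no tag nearby).
    tag_readings: list of (tag_id, x, y); camera_objects: list of (x, y);
    profiles: dict mapping tag_id to its associated profile."""
    results = []
    for cx, cy in camera_objects:
        match = next((tid for tid, tx, ty in tag_readings
                      if hypot(cx - tx, cy - ty) <= ALARM_DISTANCE), None)
        if match is not None:
            results.append(("friendly", profiles.get(match)))    # claims 7-8
        else:
            results.append(("unfriendly", None))  # claim 9: alarm condition
    return results

out = classify([("badge-42", 1.0, 1.0)],
               [(1.5, 1.0), (20.0, 20.0)],
               {"badge-42": "employee"})
```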
A method for the classification of an individual or object within a zone of a specified area with multiple surveillance devices, wherein the method comprises the steps of: receiving a first set of comparable objects from a first surveillance device and a second set of comparable objects from a second surveillance device within a predefined zone area; comparing characteristics of the first and second sets of objects, wherein the characteristics are based upon a set of predetermined characteristics; comparing characteristics of the objects within the first set of objects to characteristics of the objects within the second set of objects, wherein the characteristics are based upon a set of predetermined characteristics; determining if the first set of objects identified by the first surveillance device corresponds to the second set of objects identified by the second surveillance device; and classifying each object identified by the first surveillance device according to whether it corresponds to an object identified by the second surveillance device; wherein the second surveillance device provides a video feed of a field-of-view of the predefined zone area; the objects identified by the first surveillance device comprise at least one of an active identification device or a passive identification device, wherein each first surveillance device comprises an associated profile.

12. The method of claim 11, wherein the video feed is used to construct a 2D map of the predetermined zone area featuring the location of the objects present in the video feed.

13. The method of claim 11, further including the step of utilizing object location data acquired from the first surveillance device in conjunction with video feed data of the objects acquired from the second surveillance device in order to construct a 3D map of the predetermined zone area.

14.
The method of claim 13, further including the step of tracking the movements of the received objects and displaying each object's, or a compilation of at least two objects', respective path and time of movement on the 3D map of the zone area.

15. The method of claim 11, wherein the step of receiving objects or sets of objects comprises receiving information or data corresponding to objects or sets of real objects from sensors, or from data storage means or communication means operatively associated with such sensors, and processing such information in a computer system.

16. The method of claim 11, further including the step of determining the number of objects within the first set of objects that have been received by the first surveillance device.

17. The method of claim 16, further including the step of determining the number of objects within the second set of objects that have been identified by the second surveillance device.

18. The method of claim 17, further including the step of comparing the number of objects received by the first surveillance device to the number of objects received by the second surveillance device in order to determine if the numbers of received objects are equal or not equal.

19. The method of claim 18, wherein if it is determined that the numbers of objects received at the first and second surveillance devices are equal, then no action is taken.

20. The method of claim 18, wherein if it is determined that the numbers of objects received at the first and second surveillance devices are not equal, then an alarm condition is initiated.

21. The method of claim 11, wherein the first surveillance device and the second surveillance device are different types of devices.

22.
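Claims 16 through 20 add a coarser cross-check that needs no per-object matching: count the objects each device received and compare the totals; equal counts mean no action, while a mismatch (for instance, a camera seeing more bodies than badges were read) initiates an alarm. A sketch, with the predicate and the return labels as illustrative assumptions:

```python
def count_objects(detections, predicate=lambda d: True):
    """Claims 16-17: number of (comparable) objects a device received."""
    return sum(1 for d in detections if predicate(d))

def compare_counts(n_first, n_second):
    """Claims 18-20: equal counts -> no action; mismatch -> alarm condition."""
    return "no_action" if n_first == n_second else "alarm"

# Two badge reads vs. three camera-observed people -> alarm.
status = compare_counts(count_objects(["badge-1", "badge-2"]),
                        count_objects(["p1", "p2", "p3"]))
```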
A system for the classification of an individual or object within a zone of a specified area with multiple surveillance devices, the system comprising: a first surveillance device operative for observing a first set of objects within a predefined zone area; a second surveillance device operative for observing a second set of objects within the predefined zone area; and a processor in communication with the first surveillance device and the second surveillance device, the processor operative for receiving the first and second sets of objects, filtering out any incomparable objects, comparing characteristics of the filtered first and second sets of objects, and further comparing characteristics of the objects within the first set of objects to characteristics of the objects within the second set of objects, wherein the characteristics are based upon a set of predetermined characteristics, the processor further determining if each object in the first filtered set of objects identified by the first surveillance device corresponds to an object in the second filtered set of objects identified by the second surveillance device, and classifying each object identified by the first surveillance device according to whether it corresponds to an object identified by the second surveillance device; wherein the second surveillance device provides a video feed of a field-of-view of the predefined zone area; the objects identified by the first surveillance device comprise at least one of an active identification device or a passive identification device, wherein each first surveillance device comprises an associated profile.

23. The system of claim 22, wherein the first and second surveillance devices determine the location of the received objects within the predefined zone area.

24. The system of claim 22, wherein the video feed is used to construct a 2D map of the predetermined zone area featuring the location of the objects present in the video feed.

25.
The system of claim 22, wherein object location data acquired from the first and second surveillance devices is used in conjunction with video feed data of the objects acquired from the second surveillance device in order to construct a 3D map of the predetermined zone area, the friendly and unfriendly objects situated within the zone area being displayed upon the 3D map.

26. The system of claim 25, wherein each friendly and unfriendly object's respective path of movement and the time of the object's movement are tracked and displayed on the 3D map of the zone area.

27. The system of claim 22, wherein the processor is further operative for receiving information or data corresponding to the first and second sets of objects or sets of real objects from the first and second surveillance devices, or from data storage means or communication means operatively associated with the first and second surveillance devices.

28. The system of claim 22, wherein it is determined if an object received by the first surveillance device is within a predetermined distance from an object received by the second surveillance device.

29. The system of claim 22, wherein if it is determined that an object received by the first surveillance device is within a predetermined distance from an object received by the second surveillance device, then the two objects are assumed to be the same object.

30. The system of claim 29, wherein if the two objects are determined to be the same object, then the object received by the second surveillance device is assigned the profile of the object identified by the first surveillance device.

31. The system of claim 30, wherein if it is determined that an object has a corresponding profile, then no action is taken and the object is classified as a friendly object.

32.
The system of claim 30, wherein if it is determined that an object does not have a corresponding profile, then an alarm condition is initiated and the object is classified as an unfriendly object.

33. The system of claim 22, wherein the first surveillance device and the second surveillance device are different types of devices.

34. A system for the classification of an individual or object within a zone of a specified area with multiple surveillance devices, the system comprising: a first surveillance device operative for observing a first set of objects within a predefined zone area; a second surveillance device operative for observing a second set of objects within the predefined zone area; and a processor in communication with the first surveillance device and the second surveillance device, the processor operative for receiving the first and second sets of objects, filtering out any incomparable objects, comparing characteristics of the filtered first and second sets of objects, and further comparing characteristics of the objects within the first set of objects to characteristics of the objects within the second set of objects, wherein the characteristics are based upon a set of predetermined characteristics, the processor further determining if the filtered first set of objects identified by the first surveillance device corresponds to the filtered second set of objects identified by the second surveillance device, and classifying each object identified by the first surveillance device according to whether it corresponds to an object identified by the second surveillance device; wherein the second surveillance device provides a video feed of a field-of-view of the predefined zone area; the objects identified by the first surveillance device comprise at least one of an active identification device or a passive identification device, wherein each first surveillance device comprises an associated profile.

35.
The system of claim 34, wherein the video feed is used to construct a 2D map of the predetermined zone area featuring the location of the objects present in the video feed.

36. The system of claim 35, wherein object location data acquired from the first and second surveillance devices is used in conjunction with video feed data of the objects acquired from the second surveillance device in order to construct a 3D map of the predetermined zone area.

37. The system of claim 36, wherein each object's respective path of movement and the time of the received object's movements, or a compilation of at least two objects' movements, are tracked and displayed on the 3D map of the zone area.

38. The system of claim 34, wherein the number of objects within the filtered first set of objects that have been received by the first surveillance device is determined.

39. The system of claim 38, wherein the number of objects received by the first surveillance device is compared to the number of objects received by the second surveillance device in order to determine if the numbers of received objects are equal or not equal.

40. The system of claim 39, wherein if it is determined that the numbers of objects received at the first and second surveillance devices are equal, then no action is taken.

41. The system of claim 39, wherein if it is determined that the numbers of objects received at the first and second surveillance devices are not equal, then an alarm condition is initiated.

42. The system of claim 41, wherein if it is determined that the numbers of objects received at the first and second surveillance devices are not equal, then an alarm condition is initiated.

43. The system of claim 42, wherein the video feed is used to construct a 2D map of the predetermined zone area featuring the location of the objects present in the video feed.

44.
The system of claim 43, wherein object location data acquired from the first and second surveillance devices is used in conjunction with video feed data of the objects acquired from the second surveillance device in order to construct a 3D map of the predetermined zone area.

45. The system of claim 44, wherein each object's respective path of movement and the time of the received object's movements, or a compilation of at least two objects' movements, are tracked and displayed on the 3D map of the zone area.

46. The system of claim 34, wherein the processor is further operative for receiving information or data corresponding to the first and second sets of objects or sets of real objects from the first and second surveillance devices, or from data storage means or communication means operatively associated with the first and second surveillance devices.

47. The system of claim 34, wherein the first surveillance device and the second surveillance device are different types of devices.