The presently disclosed subject matter includes a tracking system and method for tracking objects by a sensing unit operable to communicate over a communication link with a control center. The system enables execution of a command generated at the control center with respect to a selected object in an image captured by the sensing unit, notwithstanding a time-delay between the time when the sensing unit acquires the image containing the selected object and the time when the corresponding command is received at the sensing unit.
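The abstract's latency-compensation idea can be sketched in code: the sensing unit keeps a repository mapping object-tags to up-to-date positions, so a command that was issued against an old frame can still be resolved against the latest frame. This is an illustrative sketch only; the class and method names (`SensingUnit`, `tag_object`, `trace`, `execute_command`) are hypothetical and do not come from the patent.

```python
# Hypothetical sketch of the tag-based latency compensation described above.
# All names are illustrative; the patent does not specify an implementation.

class SensingUnit:
    """Tags objects in captured frames and resolves delayed commands by tag."""

    def __init__(self):
        self.repository = {}  # object-tag -> latest known position of the object
        self.next_tag = 0

    def tag_object(self, position):
        """Assign a new object-tag and store the object's current position."""
        tag = self.next_tag
        self.next_tag += 1
        self.repository[tag] = position
        return tag

    def trace(self, tag, new_position):
        """Update the stored position as the tagged object moves frame to frame."""
        if tag in self.repository:
            self.repository[tag] = new_position

    def execute_command(self, command_tag):
        """Resolve a (possibly stale) command against the latest frame.

        The control center selected the object in an older frame, but the
        object-tag identifies the same object's *current* position, so the
        time-delay on the link does not prevent execution.
        """
        return self.repository.get(command_tag)


unit = SensingUnit()
tag = unit.tag_object((10, 20))   # object first detected at (10, 20)
unit.trace(tag, (14, 25))         # object moved while the command was in flight
print(unit.execute_command(tag))  # -> (14, 25), the up-to-date position
```

The key design point mirrored from the claims is that commands carry a tag rather than image coordinates, so stale coordinates never reach the executor.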
Representative Claims
1. A tracking system comprising: a sensing unit operable to communicate over a communication link with a control center located remotely from the sensing unit, the sensing unit comprising: a processor; a data-repository; and an image sensor operable to capture a succession of images of a scene; the sensing unit being operable to: identify, using the processor, one or more objects in a first image in said succession of images; tag, using the processor, at least one of said objects with a first object-tag, thereby generating a respective tagged object; transmit, using the processor, sensing-data to said control center, said sensing-data including at least said first image; store, using the data-repository, said first object-tag along with data indicative of a position of the respective tagged object; trace, using the processor, the stored tagged object, from said first image in said succession of images to a later image in said succession of images, thereby maintaining a given object-tag associated with its respective tagged object along said succession of images; in case said position of the respective tagged object in said first image is changed in the later image, update, using the processor, the stored data indicative of the position of the respective tagged object stored in the data-repository; receive, using the processor, a command from said control center, the command including a second object-tag incorporated in the command, the second object-tag being indicative of a selected object, and the command also including at least one instruction related to the selected object; identify, using the processor, with the help of said first object-tag stored in the data-repository and the second object-tag received with the command, said selected object in a latest available captured image in said succession of images, wherein said identifying includes using the received second object-tag to search the data-repository for the stored first object-tag that corresponds to the second object-tag, and using the updated position data stored in the data-repository associated with the first object-tag to locate the selected object in the latest available captured image; and execute, using the processor, said at least one instruction with respect to the selected object without being prevented by a time-delay between a time when the sensing unit acquires the latest available captured image with the selected object and a time when the corresponding command is received at the sensing unit, due to the identifying of the selected object in the latest available captured image by using the first and second object-tags.

2. The tracking system according to claim 1, wherein said sensing-data further includes said first object-tag.

3. The tracking system according to claim 1, further comprising said control center, the control center being operable to: receive said first image from the sensing unit; identify one or more objects in said first image; assign an object-tag to at least one of said objects, thereby generating a parallel tagged object; and store said object-tag in association with the parallel tagged object; wherein said identification and assigning are performed in the sensing unit and the control center according to identical principles.

4. The tracking system according to claim 1, further operable, in case said selected object is not located inside a field of view of said image sensor, to: estimate a real-time location of the selected object; and generate direction commands to said image sensor, such that said selected object is located in a real-time image of the scene generated by the image sensor.

5. The tracking system according to claim 1, wherein said command is a tracking command, instructing said sensing unit to track the selected object.

6. The tracking system according to claim 1, further operable to: determine one or more characteristics of said one or more identified objects; and select an object to be assigned with a respective object-tag, in case one or more characteristics of the object match a predefined criterion.

7. The tracking system according to claim 1, further operable, responsive to a received screening command including data indicative of at least one selected object, to: determine object-data of said at least one selected object; and select one or more objects to be assigned with a respective object-tag from among said one or more identified objects, in case object-data of said one or more identified objects match the object-data of said at least one selected object.

8. The tracking system according to claim 1, wherein said image sensor is a camera.

9. The tracking system according to claim 1, further comprising said control center; said control center being operable to receive a captured image from said sensing unit, and display said image on a display operatively connected to the control center; said control center being further operable, responsive to a selection of an object in said image, to identify an object-tag associated with said object, generate said command, and transmit said command to said sensing unit.

10. The tracking system according to claim 1, wherein said sensing unit is located on an airborne vehicle and said control center is located on the ground.

11. A computer-implemented method of tracking objects by a sensing unit operable to communicate over a communication link with a control center located remotely from the sensing unit, the method comprising: capturing a succession of images of a scene; identifying one or more objects in a first image in said succession of images; tagging at least one of said objects with a first object-tag, thereby generating a respective tagged object; transmitting sensing-data to said control center, said sensing-data including at least said first image; storing said first object-tag along with data indicative of a position of the respective tagged object; tracing the stored tagged object, from said first image in said succession of images to a later image in said succession of images, thereby maintaining a given object-tag associated with its respective tagged object along said succession of images; in case said position of the respective tagged object in said first image is changed in the later image, updating the stored data indicative of the position of the respective tagged object; receiving a command from said control center, the command including a second object-tag incorporated in the command, the second object-tag being indicative of a selected object, and the command also including at least one instruction related to the selected object; identifying, with the help of said stored first object-tag and the second object-tag, said selected object in a latest available captured image in said succession of images, wherein said identifying includes using the received second object-tag to search for the stored first object-tag that corresponds to the second object-tag, and using the updated position data associated with the first object-tag to locate the selected object in the latest available captured image; and executing said at least one instruction with respect to the selected object without being prevented by a time-delay between a time when the sensing unit acquires the latest available captured image with the selected object and a time when the corresponding command is received at the sensing unit, due to the identifying of the selected object in the latest available captured image by using the first and second object-tags.

12. The method according to claim 11, wherein said sensing-data further includes said first object-tag.

13. The method according to claim 11, further comprising, at the control center: receiving said first image; identifying one or more objects in said first image; assigning an object-tag to at least one of said objects, thereby generating a parallel tagged object; and storing said object-tag in association with the parallel tagged object; wherein said identification and assigning are performed in the sensing unit and the control center according to identical principles.

14. The method according to claim 11, further comprising: estimating a real-time location of the selected object, in case said selected object is not located inside a field of view of said image sensor; and generating direction commands to said image sensor, such that said selected object is located in a real-time image of the scene generated by the image sensor.

15. The method according to claim 11, further comprising: determining one or more characteristics of said one or more identified objects; and selecting an object to be assigned with a respective object-tag, in case one or more characteristics of the object match a predefined criterion.

16. The method according to claim 11, further comprising, responsive to a received command including data indicative of at least one selected object: determining object-data of said at least one selected object; and selecting one or more objects to be assigned with a respective object-tag from among said one or more identified objects, in case object-data of said one or more identified objects match the object-data of said at least one selected object.

17. The method according to claim 11, further comprising: receiving at the control center a captured image from said sensing unit; displaying said image on a display; and, responsive to a selection of an object in said image: identifying an object-tag associated with said object; generating said command; and transmitting said command to said sensing unit.

18. The method according to claim 11, wherein images in said succession of images are captured by an image sensor in said sensing unit.

19. The method according to claim 11, wherein said sensing unit comprises a processor and a data-repository.

20. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform computer-implemented method steps of tracking objects by a sensing unit operable to communicate over a communication link with a control center located remotely from the sensing unit, the method comprising: capturing a succession of images of a scene; identifying one or more objects in a first image in said succession of images; tagging at least one of said objects with a first object-tag, thereby generating a respective tagged object; transmitting sensing-data to said control center, said sensing-data including at least said first image; storing said first object-tag along with data indicative of a position of the respective tagged object; tracing the stored tagged object, from said first image in said succession of images to a later image in said succession of images, thereby maintaining a given object-tag associated with its respective tagged object along said succession of images; in case said position of the respective tagged object in said first image is changed in the later image, updating the stored data indicative of the position of the respective tagged object; receiving a command from said control center, the command including a second object-tag incorporated in the command, the second object-tag being indicative of a selected object, and the command also including at least one instruction related to the selected object; identifying, with the help of said stored first object-tag and the second object-tag, said selected object in a latest available captured image in said succession of images, wherein said identifying includes using the received second object-tag to search for the stored first object-tag that corresponds to the second object-tag, and using the updated position data associated with the first object-tag to locate the selected object in the latest available captured image; and executing said at least one instruction with respect to the selected object without being prevented by a time-delay between a time when the sensing unit acquires the latest available captured image with the selected object and a time when the corresponding command is received at the sensing unit, due to the identifying of the selected object in the latest available captured image by using the first and second object-tags.

21. The program storage device readable by machine according to claim 20, wherein said sensing-data further includes said first object-tag.

22. The program storage device readable by machine according to claim 20, wherein the method further comprises, in case said selected object is not located: estimating a real-time location of the selected object; and generating direction commands for an image sensor.
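Claims 3 and 13 state that the sensing unit and the control center identify and tag objects "according to identical principles", which is what lets a tag chosen at one end resolve to the same object at the other. A minimal sketch of that idea, assuming a deterministic tagging rule (the function name `tag_objects` and the sort-by-position rule are illustrative, not from the patent):

```python
# Hypothetical sketch of claims 3/13: both ends run the same deterministic
# tagging procedure, so independently assigned tags agree without being
# transmitted over the link.

def tag_objects(detections):
    """Assign object-tags deterministically: sort detections and number them.

    Any deterministic rule would do; the requirement is only that both the
    sensing unit and the control center apply the identical rule, so that a
    second object-tag chosen at the control center matches the first
    object-tag stored at the sensing unit.
    """
    ordered = sorted(detections)  # canonical ordering of detected positions
    return {tag: pos for tag, pos in enumerate(ordered)}


frame_detections = [(40, 12), (3, 7), (25, 30)]  # illustrative positions

sensor_tags = tag_objects(frame_detections)        # computed aboard the vehicle
center_tags = tag_objects(list(frame_detections))  # computed on the ground

assert sensor_tags == center_tags  # parallel tagged objects carry equal tags
```

Under this scheme a command need only carry the tag and an instruction; the sensing unit looks the tag up against its own (continuously traced) repository, as recited in claims 1 and 11.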