A number of roadway sensing systems are described herein. An example of such is an apparatus to detect and/or track objects at a roadway with a plurality of sensors. The plurality of sensors can include a first sensor that is a radar sensor having a first field of view that is positionable at the roadway and a second sensor that is a machine vision sensor having a second field of view that is positionable at the roadway, where the first and second fields of view at least partially overlap in a common field of view over a portion of the roadway. The example system includes a controller configured to combine sensor data streams for at least a portion of the common field of view from the first and second sensors to detect and/or track the objects.
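The abstract does not specify how the controller combines the radar and machine vision data streams. As a rough illustration of one common textbook approach to the per-object confidence part of such fusion (not a method prescribed by the patent), the sketch below combines two detection probabilities by independent-evidence (log-odds) fusion, assuming the sensors err independently and the prior is uniform; the function name is illustrative only:

```python
def fuse_detection_probabilities(p_radar: float, p_vision: float) -> float:
    """Combine two detection probabilities for the same object.

    Assumes the two sensors err independently and the prior is uniform,
    so the fused odds are the product of the individual odds.
    """
    odds = (p_radar / (1.0 - p_radar)) * (p_vision / (1.0 - p_vision))
    return odds / (1.0 + odds)

# Two moderately confident sensors yield a more confident fused estimate.
print(fuse_detection_probabilities(0.9, 0.8))  # ~0.973
```

Note that a neutral report from one sensor (probability 0.5) leaves the other sensor's estimate unchanged under this rule, which is one reason the log-odds form is a common default.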
Representative Claims
1. An apparatus to detect or track objects at a roadway, the apparatus comprising: a plurality of sensors, comprising a first sensor that is a radar sensor having a first field of view that is installed at a stationary position in association with a roadway and a second sensor that is a machine vision sensor having a second field of view that is installed at a collocated stationary position in association with the roadway, wherein the first and second fields of view at least partially overlap in a common field of view over a portion of the roadway; and a controller configured to combine sensor data streams for at least a portion of the common field of view from the first and second sensors to detect or track objects.

2. The apparatus of claim 1, comprising two different coordinate systems for at least a portion of the common field of view of the first sensor and the second sensor that are transformed to a homographic matrix by correspondence of points of interest between the two different coordinate systems.

3. The apparatus of claim 2, wherein the correspondence of the points of interest comprises at least one synthetic target generator device positioned in the coordinate system of the radar sensor that is correlated to a position observed for the at least one synthetic target generator device in the coordinate system of the machine vision sensor.

4. The apparatus of claim 2, wherein the correspondence of the points of interest comprises an application to simultaneously accept a first data stream from the radar sensor and a second data stream from the machine vision sensor, display an overlay of at least one detected point of interest in the different coordinate systems of the radar sensor and the machine vision sensor, and enable alignment of the points of interest.

5. The apparatus of claim 1, wherein the first and second sensors are collocated adjacent to one another and are both commonly supported by a stationary support structure installed in association with the roadway.

6. A system to detect or track objects in a roadway area, comprising: a radar sensor having a first field of view as a first sensing modality that is installed at a stationary position in association with a roadway; a first machine vision sensor having a second field of view as a second sensing modality that is installed at a collocated stationary position in association with the roadway; and a communication device configured to communicate data from the first and second sensors to a processing resource that is remote from the collocated first and second sensors.

7. The system of claim 6, wherein the processing resource comprises cloud-based processing.

8. The system of claim 6, wherein the second field of view of the first machine vision sensor has a horizontal field of view of 100 degrees or less.

9. The system of claim 6, comprising a second machine vision sensor having a wide angle horizontal field of view that is greater than 100 degrees and that is positionable at the roadway.

10. The system of claim 9, wherein the second machine vision sensor having the wide angle horizontal field of view is a third sensing modality that is positioned to simultaneously detect a number of objects positioned within two crosswalks or a number of objects traversing at least two stop lines at an intersection.

11. The system of claim 9, wherein at least one sensor selected from the radar sensor, the first machine vision sensor, and the second machine vision sensor is configured or positioned to detect and track objects within 100 to 300 feet of a stop line at an intersection, a dilemma zone up to 300 to 600 feet distal from the stop line, and an advanced zone greater than 300 to 600 feet distal from the stop line.

12. The system of claim 10, wherein the radar sensor and the first machine vision sensor are collocated in an integrated assembly and the second machine vision sensor is mounted in a location separate from the integrated assembly and communicates data to the processing resource.

13. The system of claim 6, comprising an automatic license plate recognition (ALPR) sensor that is positionable at the roadway and that senses visible or infrared light reflected or emitted by a vehicle license plate.

14. The system of claim 13, wherein the radar sensor, the first machine vision sensor, and the ALPR sensor are collocated in an integrated assembly that communicates data to the processing resource via the communication device.

15. The system of claim 13, wherein the ALPR sensor captures an image of a license plate as determined by input from at least one of the radar sensor, the first machine vision sensor having the horizontal field of view of 100 degrees or less, and the second machine vision sensor having the wide angle horizontal field of view that is greater than 100 degrees.

16. A non-transitory machine-readable medium storing instructions executable by a processing resource to detect or track objects in a roadway area, the instructions executable to: receive data input from a first discrete sensor type having a first sensor coordinate system; receive data input from a second discrete sensor type having a second sensor coordinate system; assign a time stamp from a common clock to each of a number of putative points of interest in the data input from the first discrete sensor type and the data input from the second discrete sensor type; determine a location and motion vector for each of the number of putative points of interest in the data input from the first discrete sensor type and the data input from the second discrete sensor type; match multiple pairs of the putative points of interest in the data input from the first discrete sensor type and the data input from the second discrete sensor type based upon similarity of the assigned time stamps and the location and motion vectors to determine multiple matched points of interest; and compute a two-dimensional homography between the first sensor coordinate system and the second sensor coordinate system based on the multiple matched points of interest.

17. The medium of claim 16, the instructions further executable to: calculate a first probability of accuracy of an object attribute detected by the first discrete sensor type by a first numerical representation of the attribute for probability estimation; calculate a second probability of accuracy of the object attribute detected by the second discrete sensor type by a second numerical representation of the attribute for probability estimation; and fuse the first probability and the second probability of accuracy of the object attribute to provide a single estimate of the accuracy of the object attribute.

18. The medium of claim 17, the instructions further executable to: estimate a probability of presence or velocity of a vehicle by fusion of the first probability and the second probability of accuracy to the single estimate of the accuracy, wherein the first discrete sensor type is a radar sensor and the second discrete sensor type is a machine vision sensor, and wherein the numerical representation of the first probability and the numerical representation of the second probability of accuracy of presence or velocity of the vehicle are dependent upon the sensing environment.

19. The medium of claim 16, the instructions further executable to: monitor traffic behavior in the roadway area by data input from at least one of the first discrete sensor type and the second discrete sensor type related to vehicle position and velocity; compare the vehicle position and velocity input to a number of predefined statistical models of the traffic behavior to cluster similar traffic behaviors; and, if incoming vehicle position and velocity input does not match at least one of the number of predefined statistical models, generate a new model to establish a new pattern of traffic behavior.

20. The medium of claim 19, the instructions further executable to: repeatedly receive the data input from at least one of the first discrete sensor type and the second discrete sensor type related to vehicle position and velocity; classify lane types or geometries in the roadway area based on vehicle position and velocity orientation within one or more models; and predict behavior of at least one vehicle based on a match of the vehicle position and velocity input with at least one model.
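Claim 16 ends by computing a two-dimensional homography from the matched points of interest but leaves the computation unspecified. One standard way to compute such a homography from four or more matched pairs is the direct linear transform (DLT); the sketch below is a minimal unnormalized DLT, offered only as an illustration of the final step (function names are ours, not the patent's):

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate a 3x3 homography H mapping src -> dst via the DLT.

    src_pts, dst_pts: (N, 2) arrays of matched points, with N >= 4
    and no degenerate (e.g. mostly collinear) configurations.
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    # The homography is the right singular vector of A with the
    # smallest singular value, reshaped to 3x3 and rescaled.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Map (N, 2) points through H, returning (N, 2) points."""
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

In a deployment along the lines of claims 2 and 16, `src_pts` would be radar-plane coordinates and `dst_pts` the corresponding machine vision (image) coordinates of the time-stamp-matched points; a robust estimator such as RANSAC would typically wrap the DLT to reject bad matches, though the claim does not require any particular method.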