System and method for multipurpose traffic detection and characterization
IPC classification information
Country / Type: United States (US) patent, granted
International Patent Classification (IPC, 7th edition):
G08G-001/01
G08G-001/065
G01S-017/58
G01S-017/66
G01S-017/88
G01S-017/89
G01S-007/48
G01S-007/484
G08G-001/04
G06K-009/00
G08G-001/015
G08G-001/017
G08G-001/054
G06K-009/32
G06T-007/20
G01S-017/02
Application number: US-0115244 (2013-03-01)
Registration number: US-9235988 (2016-01-12)
International application number: PCT/IB2013/051667 (2013-03-01)
International publication number: WO2013/128427 (2013-09-06)
Inventors: Mimeault, Yvan; Gidel, Samuel
Applicant: LEDDARTECH INC.
Agent: Robert Plotkin, P.C.
Citation information
Times cited: 2
Patents cited: 149
Abstract
A method for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, comprising: providing a 3D optical emitter; providing a 3D optical receiver with a wide and deep field of view; driving the 3D optical emitter into emitting short light pulses; receiving a reflection/backscatter of the emitted light, thereby acquiring an individual digital full-waveform LIDAR trace for each detection channel of the 3D optical receiver; using the individual digital full-waveform LIDAR trace and the emitted light waveform, detecting a presence of a plurality of vehicles, a position of at least part of each vehicle and a time at which the position is detected; assigning a unique identifier to each vehicle; repeating the steps of driving, receiving, acquiring and detecting, at a predetermined frequency; tracking and recording an updated position of each vehicle and an updated time at which the updated position is detected.
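The detection step above compares each acquired digital full-waveform LIDAR trace against the emitted light waveform. One conventional way to do this is matched filtering: cross-correlate the trace with the emitted pulse shape and pick correlation peaks, converting peak positions to distances via the speed of light. The sketch below illustrates that idea only; the function name, the relative threshold, and the peak-picking rule are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def detect_echoes(trace, emitted, sample_period_s, threshold=0.5, c=299_792_458.0):
    """Locate echoes in a digitized full-waveform LIDAR trace by
    cross-correlating it with the emitted pulse shape (matched filter),
    then converting correlation-peak positions to one-way distances.

    trace           -- sampled received waveform for one detection channel
    emitted         -- sampled emitted pulse shape
    sample_period_s -- sampling period of the digitizer, in seconds
    threshold       -- keep peaks above this fraction of the strongest return
    """
    # Matched filter: correlation of the trace with the emitted waveform.
    corr = np.correlate(trace, emitted, mode="valid")
    strongest = corr.max()
    echoes = []
    for i in range(1, len(corr) - 1):
        # Local maxima above a fraction of the strongest return.
        if corr[i] >= threshold * strongest and corr[i] > corr[i - 1] and corr[i] >= corr[i + 1]:
            t = i * sample_period_s        # round-trip time of flight
            echoes.append(0.5 * c * t)     # one-way distance in metres
    return echoes
```

With a 1 ns sampling period, a pulse echoed 40 samples into the trace maps to roughly 6 m of range; repeating this per detection channel yields the per-channel observations the abstract's detection step operates on.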
Representative claims
1. A method for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, the method comprising: providing a 3D optical emitter at an installation height oriented to allow illumination of a 3D detection zone in said environment; providing a 3D optical receiver oriented to have a wide and deep field of view within said 3D detection zone, said 3D optical receiver having a plurality of detection channels in said field of view; driving the 3D optical emitter into emitting short light pulses toward the detection zone, said light pulses having an emitted light waveform; receiving a reflection/backscatter of the emitted light on the vehicles in the 3D detection zone at said 3D optical receiver, thereby acquiring an individual digital full-waveform LIDAR trace for each detection channel of said 3D optical receiver; using said individual digital full-waveform LIDAR trace and said emitted light waveform, detecting a presence of a plurality of vehicles in said 3D detection zone, a position of at least part of each said vehicle in said 3D detection zone and a time at which said position is detected; assigning a unique identifier to each vehicle of said plurality of vehicles detected; repeating said steps of driving, receiving, acquiring and detecting, at a predetermined frequency; at each instance of said repeating step, tracking and recording an updated position of each vehicle of said plurality of vehicles detected and an updated time at which said updated position is detected, with said unique identifier; wherein said detecting said presence includes: extracting observations in the individual digital full-waveform LIDAR trace; using the location for the observations to remove observations coming from a surrounding environment; extracting lines using an estimate line and a covariance matrix using polar coordinates; removing observations located on lines parallel to the x axis.

2. The method as claimed in claim 1, wherein said traffic control environment is at least one of a traffic management environment and a traffic enforcement environment.

3. The method as claimed in claim 1, wherein said detecting said presence includes extracting observations in the individual digital full-waveform LIDAR trace and intensity data for the observations; finding at least one blob in the observations; computing an observation weight depending on the intensity of the observations in the blob; computing a blob gravity center based on the weight and a position of the observations in the blob.

4. The method as claimed in claim 1, further comprising setting at least one trigger line location and recording trigger line trespassing data with the unique identifier.

5. The method as claimed in claim 4, further comprising setting said trigger line location relative to a visible landmark in said environment.

6. The method as claimed in claim 1, wherein said detecting said time at which said position is detected includes assigning a timestamp for said detecting said presence and wherein said timestamp is adapted to be synchronized with an external controller.

7. The method as claimed in claim 1, further comprising obtaining a classification for each detected vehicle using a plurality of detections in the 3D detection zone caused by the same vehicle.

8. The method as claimed in claim 1, wherein said detecting said presence further comprises detecting a presence of a pedestrian in said environment.

9. The method as claimed in claim 1, wherein said part of said vehicle is one of a front, a side and a rear of the vehicle.

10. The method as claimed in claim 1, wherein emitting short light pulses includes emitting short light pulses of a duration of less than 50 ns.

11. The method as claimed in claim 1, wherein said 3D optical emitter is at least one of an infrared LED source, a visible-light LED source and a laser.

12. The method as claimed in claim 1, wherein said providing said 3D optical receiver to have a wide and deep field of view includes providing said 3D optical receiver to have a horizontal field of view angle of at least 20° and a vertical field of view angle of at least 4°.

13. The method as claimed in claim 1, further comprising determining and recording a speed for each said vehicle using said position and said updated position of one of said instances of said repeating step and an elapsed time between said time of said position and said updated time of said updated position, with said unique identifier.

14. The method as claimed in claim 13, further comprising using a Kalman filter to determine an accuracy for said speed to validate said speed; comparing said accuracy to a predetermined accuracy threshold; and, if said accuracy is lower than said predetermined accuracy threshold, rejecting said speed.

15. The method as claimed in claim 14, further comprising retrieving a speed limit and identifying a speed limit infraction by comparing said speed recorded for each said vehicle to said speed limit.

16. The method as claimed in claim 1, further comprising: providing a 2D optical receiver, said 2D optical receiver being an image sensor adapted to provide images of said 2D detection zone; driving the 2D optical receiver to capture a 2D image; using image registration to correlate corresponding locations between said 2D image and said detection channels; extracting vehicle identification data from said 2D image at a location corresponding to said location for said detected vehicle; assigning said vehicle identification data to said unique identifier.

17. The method as claimed in claim 16, wherein the vehicle identification data is at least one of a picture of the vehicle and a license plate alphanumerical code present on the vehicle.

18. The method as claimed in claim 17, wherein the vehicle identification data includes said 2D image showing a traffic violation.

19. The method as claimed in claim 17, further comprising extracting at least one of a size of characters on the license plate and a size of the license plate and comparing one of said sizes among different instances of the repeating to determine an approximate speed value.

20. The method as claimed in claim 16, further comprising providing a 2D illumination source oriented to allow illumination of a 2D detection zone in said 3D detection zone, driving the 2D illumination source to emit pulses to illuminate said 2D detection zone, and synchronizing said driving the 2D optical receiver to capture images with said driving the 2D illumination source to emit pulses to allow capture of said images during said illumination.

21. The method as claimed in claim 20, wherein driving the 2D illumination source includes driving the 2D illumination source to emit pulses of a duration between 10 μs and 10 ms.

22. The method as claimed in claim 19, wherein the 2D illumination source is at least one of a visible light LED source, an infrared LED light source and a laser.

23. The method as claimed in claim 19, wherein the 3D optical emitter and the 2D illumination source are provided by a common infrared LED light source.

24. The method as claimed in claim 19, wherein the vehicle identification data is at least two areas of high retroreflectivity apparent on the images, said detecting a presence includes extracting observations in the individual digital signals and intensity data for the observations, the method further comprising correlating locations for the areas of high retroreflectivity and high intensity data locations in the observations, wherein each said area of high retroreflectivity is created from one of a retroreflective license plate, a retro-reflector affixed on a vehicle and a retro-reflective lighting module provided on a vehicle.

25. The method as claimed in claim 16, further comprising combining multiple ones of said captured images into a combined image with the vehicle and the vehicle identification data apparent.

26. A system for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, the system comprising: a 3D optical emitter provided at an installation height and oriented to allow illumination of a 3D detection zone in the environment; a 3D optical receiver provided and oriented to have a wide and deep field of view within the 3D detection zone, the 3D optical receiver having a plurality of detection channels in said field of view; a controller for driving the 3D optical emitter into emitting short light pulses toward the detection zone, the light pulses having an emitted light waveform; the 3D optical receiver receiving a reflection/backscatter of the emitted light on the vehicles in the 3D detection zone, thereby acquiring an individual digital full-waveform LIDAR trace for each channel of the 3D optical receiver; a processor for detecting a presence of a plurality of vehicles in the 3D detection zone using the individual digital full-waveform LIDAR trace and the emitted light waveform, detecting a position of at least part of each vehicle in the 3D detection zone, recording a time at which the position is detected, assigning a unique identifier to each vehicle of the plurality of vehicles detected and tracking and recording an updated position of each vehicle of the plurality of vehicles detected and an updated time at which the updated position is detected, with the unique identifier; a 2D optical receiver, wherein the 2D optical receiver is an image sensor adapted to provide images of the 2D detection zone; and a driver for driving the 2D optical receiver to capture a 2D image; the processor being further adapted for using image registration to correlate corresponding locations between said 2D image and said detection channels, extracting vehicle identification data from the 2D image at a location corresponding to the location for the detected vehicle, and assigning the vehicle identification data to the unique identifier.

27. The system as claimed in claim 26, wherein said processor is further for determining and recording a speed for each vehicle using the position and the updated position of one of the instances of the repeating step and an elapsed time between the time of the position and the updated time of the updated position, with the unique identifier.

28. The system as claimed in claim 26, further comprising a 2D illumination source provided and oriented to allow illumination of a 2D detection zone in the 3D detection zone; a source driver for driving the 2D illumination source to emit pulses; and a synchronization module for synchronizing said source driver and said driver to allow capture of said images while said 2D detection zone is illuminated.

29. A method for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, the method comprising: providing a 3D optical emitter at an installation height oriented to allow illumination of a 3D detection zone in said environment; providing a 3D optical receiver oriented to have a wide and deep field of view within said 3D detection zone, said 3D optical receiver having a plurality of detection channels in said field of view; driving the 3D optical emitter into emitting short light pulses toward the detection zone, said light pulses having an emitted light waveform; receiving a reflection/backscatter of the emitted light on the vehicles in the 3D detection zone at said 3D optical receiver, thereby acquiring an individual digital full-waveform LIDAR trace for each detection channel of said 3D optical receiver; using said individual digital full-waveform LIDAR trace and said emitted light waveform, detecting a presence of a plurality of vehicles in said 3D detection zone, a position of at least part of each said vehicle in said 3D detection zone and a time at which said position is detected; assigning a unique identifier to each vehicle of said plurality of vehicles detected; repeating said steps of driving, receiving, acquiring and detecting, at a predetermined frequency; at each instance of said repeating step, tracking and recording an updated position of each vehicle of said plurality of vehicles detected and an updated time at which said updated position is detected, with said unique identifier; wherein said detecting said presence includes: extracting observations in the individual digital full-waveform LIDAR trace and intensity data for the observations; finding at least one blob in the observations; computing an observation weight depending on the intensity of the observations in the blob; computing a blob gravity center based on the weight and a position of the observations in the blob.
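Claims 3 and 29 describe grouping observations into blobs, weighting each observation by its return intensity, and computing a blob gravity center from the weights and positions. A minimal sketch of that idea follows; the greedy single-link grouping and the weights-proportional-to-intensity rule are illustrative assumptions, since the patent text fixes neither.

```python
import numpy as np

def find_blobs(points, max_gap=0.5):
    """Greedy single-link grouping: a point joins an existing blob when it
    lies within max_gap metres of any member; otherwise it starts a new
    blob. (An illustrative stand-in for the claims' unspecified blob
    finding step.)"""
    blobs = []
    for p in map(np.asarray, points):
        for blob in blobs:
            if any(np.hypot(*(p - q)) <= max_gap for q in blob):
                blob.append(p)
                break
        else:
            blobs.append([p])
    return blobs

def blob_gravity_center(points, intensities):
    """Intensity-weighted centre of gravity of one blob: each observation's
    weight is proportional to its return intensity (an assumed weighting).

    points      -- (N, 2) array-like of (x, y) observation positions
    intensities -- (N,) array-like of return intensities
    """
    pts = np.asarray(points, dtype=float)
    w = np.asarray(intensities, dtype=float)
    w = w / w.sum()                # normalize weights to sum to 1
    return tuple(w @ pts)          # weighted mean position
```

Weighting by intensity pulls the gravity center toward strong reflectors (license plates, retroreflectors), which tends to stabilize the tracked point on a vehicle between frames.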
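Claims 13 and 14 determine a per-vehicle speed from successive tracked positions and use a Kalman filter to derive an accuracy for that speed, rejecting it when the accuracy falls below a predetermined threshold. The sketch below assumes a 1D constant-velocity state model with white-acceleration process noise; the noise values, and the reading of "accuracy below a threshold" as "velocity standard deviation above a bound", are assumptions, not details from the patent.

```python
import numpy as np

def estimate_speed(positions, times, meas_var=0.04, accel_var=1.0):
    """Run a 1D constant-velocity Kalman filter over successive
    (position, time) samples of one tracked vehicle and return
    (speed, speed_std), where speed_std is the filter's own estimate
    of the speed's standard deviation."""
    x = np.array([positions[0], 0.0])   # state: [position, velocity]
    P = np.diag([meas_var, 100.0])      # large initial velocity uncertainty
    H = np.array([[1.0, 0.0]])          # we observe position only
    for k in range(1, len(positions)):
        dt = times[k] - times[k - 1]
        F = np.array([[1.0, dt], [0.0, 1.0]])
        # Process noise for a white-acceleration motion model.
        Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                                  [dt**3 / 2, dt**2]])
        x = F @ x                       # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + meas_var      # innovation covariance
        K = P @ H.T / S                 # Kalman gain
        x = x + (K * (positions[k] - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x[1], float(np.sqrt(P[1, 1]))

def validated_speed(positions, times, max_std=1.0):
    """Reject the speed when its estimated accuracy is poorer than a
    predetermined threshold, in the spirit of claim 14."""
    v, std = estimate_speed(positions, times)
    return v if std <= max_std else None
```

For a vehicle tracked over several frames at a fixed repetition frequency, the filter both smooths the raw position differences and supplies the variance used as the validation criterion.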