Collision avoidance using limited range gated video
IPC classification information
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): G08G-001/16; B60T-007/16
Application number: US-0436695 (2003-05-13)
Inventor / Address: Trudeau, Tim K.
Applicant / Address: ITT Manufacturing Enterprises, Inc.
Agent / Address: RatnerPrestia
Citation information: cited by 177 patents; cites 4 patents
Abstract
A collision avoidance system for a vehicle includes a collision avoidance processor, and an active sensor for obtaining successive range measurements to objects along a path traversed by the vehicle. The active sensor includes a near range gate for obtaining a near range measurement to objects located at a range less than a first predetermined range, and a far range gate for obtaining a far range measurement to objects located at a range greater than a second predetermined range. The near and far range measurements are provided to the collision avoidance processor for maneuvering of the vehicle.
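The gate timing implied by this two-gate scheme can be sketched numerically: each gate window is the round-trip light travel time to its predetermined range (claim 8 below states the far-gate opening as 2 times the second predetermined range divided by the speed of light). The function name and the 30 m / 150 m example ranges are illustrative assumptions, not values from the patent:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def gate_windows(near_range_m: float, far_range_m: float) -> tuple[float, float]:
    """Round-trip timing for the two range gates.

    The near gate is ON from pulse transmission (t = 0) until the
    round-trip time to the first predetermined range; the far gate
    turns ON at the round-trip time to the second predetermined range
    (2 * range / c, per claim 8).
    """
    t_near_off = 2.0 * near_range_m / C  # near gate ON for 0 <= t < t_near_off
    t_far_on = 2.0 * far_range_m / C     # far gate ON for t >= t_far_on
    return t_near_off, t_far_on

# Example: near gate out to 30 m, far gate beyond 150 m.
t_near, t_far = gate_windows(30.0, 150.0)
print(f"near gate closes {t_near * 1e9:.1f} ns after pulse transmission")
print(f"far gate opens {t_far * 1e9:.1f} ns after pulse transmission")
```

Note that the interval between the two windows (claim 3's gated-OFF period) is exactly the band of ranges the sensor deliberately ignores.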
Representative claims
1. A collision avoidance system for a vehicle comprising a collision avoidance processor, a single active sensor for obtaining successive range measurements to objects along a path traversed by the vehicle, the single active sensor including a near range gate for obtaining a near range measurement to objects located at a range less than a first predetermined range, and a far range gate for obtaining a far range measurement to objects located at a range greater than a second predetermined range, wherein the near and far range measurements are provided to the collision avoidance processor from the single active sensor for maneuvering of the vehicle.

2. The collision avoidance system of claim 1 wherein the active sensor includes a transmitter for transmitting pulses toward the objects, and a receiver for receiving the pulses reflected from the objects, the near range gate is configured to be gated ON between a time of transmission of a pulse and a time of reception of the pulse reflected from an object located at a range less than the first predetermined range, and the far range gate is configured to be gated ON at a time of reception of a pulse reflected from an object located at a range greater than the second predetermined range.

3. The collision avoidance system of claim 2 wherein the near and far range gates are both configured to be gated OFF between a time of reception of a pulse reflected from an object located at a range greater than the first predetermined range and a time of reception of a pulse reflected from an object located at a range less than the second predetermined range.

4. The collision avoidance system of claim 2 wherein the active sensor is configured to transmit pulses having leading and trailing edges, the near range gate is configured to receive the leading edges of the pulses, and the far range gate is configured to receive the trailing edges of the pulses.

5. The collision avoidance system of claim 1 wherein the active sensor includes a laser light transmitted toward objects, and a camera for receiving light reflected from the objects to form an image.

6. The collision avoidance system of claim 5 wherein the laser light includes pulses having leading and trailing edges, the near range gate is configured to receive the leading edges of the pulses and ignore the trailing edges of the pulses, and the far range gate is configured to receive the trailing edges of the pulses and ignore the leading edges of the pulses.

7. The collision avoidance system of claim 6 wherein the near range gate is configured to receive a leading edge of a respective pulse transmitted concurrently with a shutter opening of the camera, and a trailing edge of the respective pulse is configured to occur after a shutter closing of the camera.

8. The collision avoidance system of claim 6 wherein the far range gate is configured to receive a trailing edge of a respective pulse transmitted prior to a shutter opening of the camera, and the shutter opening of the camera is configured to occur at 2 times the second predetermined range divided by the speed of light.

9. The collision avoidance system of claim 5 wherein the first predetermined range is based on turning velocity of the vehicle, and the second predetermined range is based on a difference between a maximum range of the camera and a depth of field of the camera.

10. A method of collision avoidance for a vehicle comprising the steps of: (a) measuring, using a near range gate of a single sensor, a near range to objects located at a range less than a first predetermined range; (b) measuring, using a far range gate of the same single sensor, a far range to objects located at a range greater than a second predetermined range; (c) steering the vehicle in response to the single sensor measuring the objects located at the range greater than the second predetermined range; and (d) modifying step (c) in response to the single sensor measuring the objects located at the range less than the first predetermined range.

11. The method of claim 10 in which steps (a) and (b) include transmitting pulses toward the objects, receiving reflected pulses from the objects, and measuring the range to the objects using a near range gate and a far range gate.

12. The method of claim 11 in which step (a) includes gating the near range gate ON between a time of transmission of a leading edge of a pulse and a time of reception of a leading edge of a reflected pulse, and step (b) includes gating the far range gate ON at a time of reception of a trailing edge of a pulse reflected off an object located at the range greater than the second predetermined range.

13. The method of claim 10 including the steps of: (e) transmitting a laser light toward the objects; and (f) receiving the laser light reflected from the objects with a camera.

14. The method of claim 13 in which step (e) includes transmitting the laser light as pulses having leading edges and trailing edges, and step (f) includes gating ON a shutter of the camera between a time of transmission of a leading edge of a pulse and a time of reception of a leading edge of a pulse reflected off an object located at the range less than the first predetermined range, and gating ON the shutter of the camera at a time of reception of a trailing edge of a reflected pulse located at a range greater than the second predetermined range.

15. The method of claim 14 in which step (f) includes gating OFF the shutter of the camera between a time of reception of a pulse reflected from an object located at a range greater than the first predetermined range and a time of reception of a pulse reflected from an object located at a range less than the second predetermined range.

16. The method of claim 10 in which step (c) includes steering the vehicle to an aim point located at the range greater than the second predetermined range, and step (d) includes modifying at least one of direction and velocity of the vehicle to avoid an object located at the range less than the first predetermined range.

17. A method of collision avoidance for a vehicle comprising the steps of: (a) imaging, using a near range gate of a sensor, a scene located at a first predetermined range; (b) imaging, using a far range gate of the same sensor, a scene located at a second predetermined range, the second predetermined range being greater than the first predetermined range; (c) locating at least one low density region in the scene imaged by the sensor at the second predetermined range; (d) selecting a low density region from low density regions located in step (c), and using the selected low density region as a steering aim point for the vehicle; and (e) modifying at least one of direction and velocity of the vehicle, while being steered toward the steering aim point, to avoid objects located by the sensor at the first predetermined range.

18. The method of claim 17 including the steps of: (f) detecting edges of objects imaged in step (b); (g) identifying interrupted edges of objects detected in step (f); (h) interpolating interrupted edges of objects identified in step (g); and (i) updating a map scene density using the interpolated interrupted edges of objects; and step (c) includes locating the low density region in the scene imaged at the second predetermined range using the updated map scene density.

19. The method of claim 17 including the steps of: (f) steering the vehicle to a global goal using a velocity vector; (g) selecting a velocity tuple from a set of velocity tuples that satisfy the conditions of (i) steering the vehicle to the low density region selected in step (d) and (ii) steering the vehicle to the global goal using the velocity vector of step (f).

20. The method of claim 17 including the step of: (f) detecting objects imaged in the scene located at the first predetermined range; and step (e) includes modifying the at least one of direction and velocity of the vehicle to avoid the objects detected in step (f).
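The steering logic of claims 17-20 (pick the lowest-density region of the far-gate image as the aim point, while the near gate guards against close obstacles) can be sketched minimally. The block-tiling approach and the use of mean pixel intensity as the "scene density" metric are assumptions for illustration; the claims do not prescribe a specific metric:

```python
import numpy as np

def select_aim_point(far_image: np.ndarray, block: int = 16) -> tuple[int, int]:
    """Return pixel coordinates of a steering aim point.

    Scans the far-gate image in block-sized tiles and picks the centre
    of the tile with the lowest scene density (here, simply the mean
    pixel intensity -- an assumed metric). This corresponds to steps
    (c)-(d) of claim 17.
    """
    h, w = far_image.shape
    best = None  # (density, x_centre, y_centre)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            density = float(far_image[y:y + block, x:x + block].mean())
            if best is None or density < best[0]:
                best = (density, x + block // 2, y + block // 2)
    return best[1], best[2]

# Example: a mostly cluttered scene with one empty tile.
scene = np.ones((64, 64))
scene[16:32, 16:32] = 0.0  # open region
print(select_aim_point(scene))  # -> (24, 24), centre of the empty tile
```

Per step (e), a controller would steer toward this aim point and override direction or velocity whenever the near-gate image reports an obstacle inside the first predetermined range.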
Cited-by patents:
Stark, Wayne E.; Bordes, Jean P.; Davis, Curtis; Rao, Raghunath K.; Maher, Monier; Hegde, Manju; Schmid, Otto A., Adaptive transmission and interference cancellation for MIMO radar.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Adjusting a consumer experience based on a 3D captured image stream of a consumer response.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Assisting a vision-impaired user with navigation based on a 3D captured image stream.
Duggan, David S.; Felio, David A.; Pate, Billy B.; Longhi, Vince R.; Petersen, Jerry L.; Bergee, Mark J., Autonomous control of unmanned aerial vehicles.
Gilbert, Jr., Duane L.; Williams, Marcus R.; Okerholm, Andrea M.; Kristant, Elaine H.; Longo, Sheila A.; Kee, Daniel E.; Strauss, Marc D., Autonomous coverage robot.
Ozick, Daniel N.; Okerholm, Andrea M.; Mammen, Jeffrey W.; Halloran, Michael J.; Sandin, Paul E.; Won, Chikyung, Autonomous coverage robot navigation system.
Gilbert, Jr., Duane L.; Williams, Marcus R.; Okerholm, Andrea M.; Kristant, Elaine H.; Longo, Sheila A.; Kee, Daniel E.; Strauss, Marc D., Autonomous coverage robot sensing.
Hussey, Patrick Alan; Roy, Robert Paul; Neumann, Rogelio Manfred; Svendsen, Selma; Ozick, Daniel N.; Casey, Christopher M.; Kapoor, Deepak Ramesh; Campbell, Tony L.; Won, Chikyung; Morse, Christopher John; Burnett, Scott Thomas, Autonomous coverage robots.
Svendsen, Selma; Ozick, Daniel N.; Casey, Christopher M.; Kapoor, Deepak Ramesh; Campbell, Tony L.; Won, Chikyung; Morse, Christopher John; Burnett, Scott Thomas, Coverage robot mobility.
Petrini, Erik; Sundqvist, Bengt-Göran; Persson, Andreas; Pellebergs, Johan, Method and arrangement for estimating at least one parameter of an intruder.
Joh, Gyu Myeong; King, Anthony Gerald; Luo, Wangdong; Shehan, Mark Alan; Yung Kang Hung, Charlie, Method and system for impact time and velocity prediction.
Beggs, Ryan P.; Boerger, James C.; Hoffmann, David J.; Markham, Ken; McNeill, Matthew; Muhl, Timothy; Nelson, Kyle; Oates, James; Senfleben, Jason, Methods and apparatus to detect and warn proximate entities of interest.
Kokkeby, Kristen L.; Lutter, Robert P.; Munoz, Michael L.; Cathey, Frederick W.; Hilliard, David J.; Olson, Trevor L., Methods for autonomous tracking and surveillance.
Duggan, David S.; Felio, David A.; Pate, Billy B.; Longhi, Vince R.; Petersen, Jerry L.; Bergee, Mark J., Multi-sensor autonomous control of unmanned aerial vehicles.
Ozick, Daniel N.; Okerholm, Andrea M.; Mammen, Jeffrey W.; Halloran, Michael J.; Sandin, Paul E.; Won, Chikyung, Navigating autonomous coverage robots.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Predicting adverse behaviors of others within an environment based on a 3D captured image stream.
Halloran, Michael J.; Mammen, Jeffrey W.; Campbell, Tony L.; Walker, Jason S.; Sandin, Paul E.; Billington, Jr., John N.; Ozick, Daniel N., Robot system.
Sweet, III, Charles Wheeler; Swart, Hugo, System and method of dynamically controlling parameters for processing sensor output data for collision avoidance and path planning.
Kokkeby, Kristen L.; Lutter, Robert P.; Munoz, Michael L.; Cathey, Frederick W.; Hilliard, David J.; Olson, Trevor L., System and methods for autonomous tracking and surveillance.
Norris, William Robert; Allard, James; Filippov, Mikhail O.; Haun, Robert Dale; Turner, Christopher David Glenn; Gilbertson, Seth; Norby, Andrew Julian, Systems and methods for switching between autonomous and manual operation of a vehicle.
Duggan, David S.; Felio, David A.; Pate, Billy B.; Longhi, Vince R.; Petersen, Jerry L.; Bergee, Mark J., Unmanned aerial vehicle take-off and landing systems.
Duggan, David S.; Felio, David A.; Pate, Billy B.; Longhi, Vince R.; Petersen, Jerry L.; Bergee, Mark J., Vehicle control system including related methods and components.
Davis, Curtis; Hegde, Manju; Stark, Wayne E.; Eshraghi, Aria; Goldenberg, Marius; Ali, Murtaza, Vehicle radar system with a shared radar and communication system.
Eshraghi, Aria; Bordes, Jean P.; Davis, Curtis; Rao, Raghunath K.; Ali, Murtaza; Dent, Paul W., Vehicular radar system with self-interference cancellation.
Filippov, Mikhail O.; Fitch, Osa; Keller, Scott P.; O'Connor, John; Zendzian, David S.; El Fata, Nadim; Larsen, Kevin; Meuchel, Arlen Eugene; Schmaltz, Mark David; Allard, James; De Roo, Chris A.; Norris, William Robert; Norby, Andrew Julian; Turner, Christopher David Glenn, Versatile robotic control module.
Albertson, Jacob C.; Arnold, Kenneth C.; Goldman, Steven D.; Paolini, Michael A.; Sessa, Anthony J., Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream.