IPC Classification Information
Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.) |
Application No. | US-0544757 (2012-07-09)
Registration No. | US-9558667 (2017-01-31)
Inventors / Address |
- Bowers, Jeffrey A.
- Deane, Geoffrey F.
- Hyde, Roderick A.
- Kundtz, Nathan
- Myhrvold, Nathan P.
- Smith, David R.
- Tegreene, Clarence T.
- Wood, Jr., Lowell L.
Applicant / Address |
Citation Info | Cited by: 20 / Cited patents: 37
Abstract
A vehicle collision detection system may be configured to coordinate with collision detection systems of other vehicles. The coordination may comprise sharing sensor data with other vehicles, receiving sensor information from other vehicles, using sensor information to generate a collision detection
A vehicle collision detection system may be configured to coordinate with collision detection systems of other vehicles. The coordination may comprise sharing sensor data with other vehicles, receiving sensor information from other vehicles, using sensor information to generate a collision detection model, sharing the collision detection model with other vehicles, receiving a collision detection model from other vehicles, and the like. In some embodiments, vehicles may coordinate sensor operation to form a bistatic and/or multistatic sensor configuration, in which a detection signal generated at a first land vehicle is detected at a sensing system at a second land vehicle.
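The coordination the abstract describes ultimately combines a host vehicle's own measurement of an object with a measurement received from a peer vehicle. The patent does not prescribe a particular fusion rule; the sketch below uses inverse-variance weighting, a standard choice, and all names are assumptions for illustration.

```python
# Hypothetical sketch of the fusion step: a host vehicle combines its own
# measurement of an object's kinematic component (here, a 2-D velocity
# vector) with one received from a peer vehicle over the communication
# link. Inverse-variance weighting is an assumed fusion rule, not one
# specified by the patent.

def fuse_measurements(own_value, own_var, peer_value, peer_var):
    """Inverse-variance weighted fusion of two vector measurements."""
    w_own = 1.0 / own_var
    w_peer = 1.0 / peer_var
    fused = [(w_own * a + w_peer * b) / (w_own + w_peer)
             for a, b in zip(own_value, peer_value)]
    fused_var = 1.0 / (w_own + w_peer)
    return fused, fused_var

# Host radar sees the object moving at roughly (10, 2) m/s; the peer,
# with a better vantage point, reports (11, 1.6) m/s at lower variance.
v, var = fuse_measurements([10.0, 2.0], 1.0, [11.0, 1.6], 0.25)
print(v, var)  # fused estimate is pulled toward the lower-variance peer
```

Because the weights are 1 and 4, the fused velocity lands at (10.8, 1.68) m/s with combined variance 0.2, closer to the more confident peer measurement.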
Representative Claims
1. A method, comprising: acquiring first sensor data pertaining to a particular object at a first land vehicle by use of a sensing system of the first land vehicle; using a communication module of the first land vehicle to acquire second sensor data pertaining to the particular object from a second land vehicle, wherein the second land vehicle comprises a sensing system, wherein the second sensor data comprises sensor data obtained by use of the sensing system of the second land vehicle, and wherein the particular object is external to the second land vehicle and the first land vehicle; and determining a kinematic component for a kinematic model of the particular object using a processor of the first land vehicle, wherein determining the kinematic component comprises, calculating a first measurement value pertaining to the kinematic component from the first sensor data pertaining to the particular object, determining a second measurement value pertaining to the kinematic component from the second sensor data pertaining to the particular object, and deriving the kinematic component for the kinematic model of the particular object such that the derived kinematic component incorporates the first measurement value calculated from the first sensor data and the second measurement value determined from the second sensor data.

2. The method of claim 1, further comprising translating the second sensor data into a frame of reference of the first land vehicle.

3. The method of claim 1, further comprising translating the second sensor data into another coordinate system.

4. The method of claim 1, further comprising generating a collision detection model for the first land vehicle that comprises the kinematic model of the particular object.

5. The method of claim 1, wherein the determined kinematic component of the particular object comprises a position of the particular object relative to the first land vehicle.

6. The method of claim 1, wherein the determined kinematic component of the particular object comprises an orientation of the particular object relative to the first land vehicle.

7. The method of claim 1, wherein the first measurement value comprises a first vector quantity, wherein the second measurement value comprises a second vector quantity, and wherein deriving the kinematic component comprises combining the first vector quantity and the second vector quantity.

8. The method of claim 7, wherein the determined kinematic component comprises an acceleration vector of the particular object relative to the first land vehicle.

9. The method of claim 1, further comprising determining another kinematic component for the kinematic model of the particular object by use of the first sensor data.

10. The method of claim 1, further comprising determining another kinematic component for the kinematic model of the particular object by use of the second sensor data.

11. The method of claim 1, wherein the first sensor data and the second sensor data comprise angle information pertaining to the particular object.

12. The method of claim 11, wherein the kinematic component comprises a position of the particular object relative to the first land vehicle at a time the first sensor data was acquired, and wherein determining the position of the particular object comprises triangulating the angle information of the first sensor data with the angle information of the second sensor data.

13. The method of claim 1, wherein the first sensor data and the second sensor data comprise range information pertaining to the particular object.

14. The method of claim 13, wherein the kinematic component comprises an angular orientation of the particular object relative to the first land vehicle, and wherein determining the angular orientation comprises identifying intersecting range radii of the first sensor data and the second sensor data.

15. The method of claim 1, wherein the first sensor data and the second sensor data comprise both range and angle information pertaining to the particular object.

16. The method of claim 15, wherein the kinematic component of the particular object comprises a position of the particular object relative to the first land vehicle at a time the first sensor data was acquired, and wherein determining the position of the particular object comprises combining range and angle information of the first sensor data and the second sensor data.

17. The method of claim 1, wherein the first sensor data and the second sensor data comprise angle information pertaining to the particular object, the method further comprising: acquiring third sensor data comprising range information pertaining to the particular object from a third land vehicle; and generating the kinematic model for the particular object by use of the angle information pertaining to the particular object in the first sensor data and the second sensor data and the range information acquired from the third land vehicle.

18. The method of claim 1, the method further comprising: determining one of an orientation, a position, a velocity, and an acceleration of the particular object in the kinematic model of the particular object using the first sensor data; and refining one of the determined orientation, position, velocity, and acceleration using the second sensor data.

19. A collision detection system, comprising: a sensor of a first land vehicle configured to capture first sensor data pertaining to objects external to the first land vehicle; a coordination module of the first land vehicle configured to acquire second sensor data from a second land vehicle, wherein the second land vehicle comprises a sensing system, wherein the second sensor data acquired from the second land vehicle comprises sensor data obtained by use of the sensing system of the second land vehicle that pertains to objects external to the second land vehicle, and wherein the first sensor data and the second sensor data comprise sensor data pertaining to a particular object, the particular object external to the first land vehicle and the second land vehicle; and a processing module configured to calculate a first measurement quantity from the first sensor data, to determine a second measurement quantity from the second sensor data, and to derive a value of a component of a kinematic model of the particular object that incorporates both of the first measurement quantity, calculated from the first sensor data, and the second measurement quantity, calculated from the second sensor data.

20. The collision detection system of claim 19, wherein the processing module is configured to detect a potential collision based on the kinematic model of the particular object.

21. The collision detection system of claim 20, wherein the processing module is configured to generate an alert in response to detecting the potential collision.

22. The collision detection system of claim 21, wherein the coordination module is further configured to provide the alert to another land vehicle.

23. The collision detection system of claim 20, further comprising a vehicle interface module configured to activate a collision avoidance system of the first land vehicle in response to detecting the potential collision.

24. The collision detection system of claim 20, further comprising a vehicle interface module configured to activate a collision warning system of the first land vehicle in response to detecting the potential collision.

25. The collision detection system of claim 24, wherein the vehicle interface module is configured to activate an electro-optical emitter of the first land vehicle in response to detecting the potential collision.

26. The collision detection system of claim 24, wherein the vehicle interface module is configured to display an alert in response to detecting the potential collision.

27. The collision detection system of claim 24, wherein the vehicle interface module is configured to display a visual indication of the potential collision on a heads-up display of the first land vehicle.

28. The collision detection system of claim 20, wherein the processing module is configured to generate a collision avoidance instruction by use of the collision detection model in response to detecting the potential collision.

29. The collision detection system of claim 20, wherein the processing module is configured to predict a result of the potential collision by use of the collision detection model, and to generate a collision avoidance instruction by use of the predicted result.

30. The collision detection system of claim 29, wherein the collision avoidance instruction is configured for use by the first land vehicle.

31. A non-transitory machine-readable storage medium comprising instructions configured to cause a collision detection system to perform operations, comprising: capturing first sensor data pertaining to a particular object at a first land vehicle by use of a sensor of the first land vehicle; acquiring second sensor data pertaining to the particular object from a second land vehicle at the first land vehicle, wherein the second sensor data acquired from the second land vehicle comprises sensor data captured by a sensing system of the second land vehicle, and wherein the particular object is external to the second land vehicle; and generating a kinematic model of the object at the first land vehicle, wherein generating the kinematic model comprises calculating a component value for the kinematic model using the first sensor data and the second sensor data, wherein the component value models a kinematic component of the object relative to the first land vehicle at a capture time of the first sensor data, and wherein calculating the component value comprises, deriving a first value pertaining to the kinematic component of the particular object from the first sensor data, determining a second value pertaining to the kinematic component of the particular object from the second sensor data, and calculating the component value for the kinematic model of the particular object such that the calculated component value includes the first value derived from the first sensor data and the second value determined from the second sensor data.

32. The non-transitory machine-readable storage medium of claim 31, the operations further comprising calculating another component value for the kinematic model of the object by use of the first sensor data.

33. The non-transitory machine-readable storage medium of claim 32, the operations further comprising generating a collision detection module comprising the kinematic model of the object.

34. The non-transitory machine-readable storage medium of claim 31, wherein the first sensor data captured at the first land vehicle and the second sensor data acquired from the second land vehicle comprise angle information pertaining to the object.

35. The non-transitory machine-readable storage medium of claim 34, wherein determining the component value for the kinematic model of the object further comprises determining a position of the object by triangulating angle information of the first sensor data captured at the first land vehicle with angle information of the second sensor data acquired from the second land vehicle.

36. The non-transitory machine-readable storage medium of claim 31, wherein the first sensor data and the second sensor data comprise range information pertaining to the object, and wherein calculating the component value for the kinematic model of the object comprises identifying intersecting range radii in the first sensor data and the second sensor data to calculate one or more of a position of the object relative to the first land vehicle, and an angular orientation of the object relative to the first land vehicle.

37. The non-transitory machine-readable storage medium of claim 31, wherein the first sensor data is captured at the first land vehicle concurrently with acquiring the second sensor data from the second land vehicle.

38. The non-transitory machine-readable storage medium of claim 31, wherein the first sensor data and the second sensor data comprise both range and angle information pertaining to the object, and wherein calculating the component value for the kinematic model of the object comprises determining a position of the object by combining range and angle information of the first sensor data with range and angle information of the second sensor data.

39. The non-transitory machine-readable storage medium of claim 31, the operations further comprising transmitting a portion of the first sensor data captured at the first land vehicle to the second land vehicle in response to acquiring the second sensor data from the second land vehicle.

40. The non-transitory machine-readable storage medium of claim 31, the operations further comprising: determining one of an orientation, a position, a velocity, and an acceleration of the object in the collision detection model using the first sensor data; and refining one of the determined orientation, position, velocity, and acceleration of the object in the collision detection model using the second sensor data.

41. The non-transitory machine-readable storage medium of claim 31, wherein at least a portion of the second sensor data pertains to a particular object that is outside of a detection range of the sensor of the first land vehicle, the operations further comprising: including the particular object in a collision detection model of the first land vehicle by use of the second sensor data.

42. The non-transitory machine-readable storage medium of claim 31, the operations further comprising aligning the first sensor data captured at the first land vehicle with the second sensor data acquired from the second land vehicle.

43. The non-transitory machine-readable storage medium of claim 31, the operations further comprising requesting access to the sensor data of the second land vehicle.

44. The non-transitory machine-readable storage medium of claim 31, wherein the first sensor data captured at the first land vehicle and the second sensor data acquired from the second land vehicle comprise angle information pertaining to the object, the operations further comprising: acquiring third sensor data comprising range information pertaining to the object from a third land vehicle; and generating the kinematic model for the object by use of the angle information pertaining to the object in the first sensor data and the second sensor data and the range information acquired from the third land vehicle.
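Claim 12 determines an object's position by triangulating the angle (bearing) information measured at two vehicles. The claims do not give an implementation; the sketch below is a minimal 2-D ray-intersection version under assumed names and a shared coordinate frame (bearings measured from the +x axis).

```python
import math

# Hypothetical illustration of claim 12: two land vehicles at known
# positions each measure only a bearing to the same object; intersecting
# the two bearing rays yields the object's position. The 2-D setup and
# all names are assumptions for this sketch.

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays (angles in radians from the +x axis)."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 using 2-D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Vehicle A at the origin sees the object at 45 degrees; vehicle B at
# (10, 0) sees it at 135 degrees. The rays cross at (5, 5).
x, y = triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135))
print(round(x, 6), round(y, 6))  # 5.0 5.0
```

The same ray-intersection idea extends to claim 14's "intersecting range radii" variant, where each vehicle contributes a range circle instead of a bearing ray and the object lies at a circle intersection.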