Country / Type | United States (US) Patent, Granted
---|---
IPC (7th edition) |
Application No. | US-0393137 (2016-12-28)
Registration No. | US-10031523 (2018-07-24)
Inventor / Address |
Applicant / Address |
Agent / Address |
Citation info | Cited by: 0; Cites: 259 patents
Systems for an electric vehicle, and operations thereof, are provided that augment the learned behaviors of one vehicle with the learned behaviors of other vehicles.
1. An autonomous vehicle, comprising:
a vehicle interior for receiving one or more occupants;
a plurality of sensors to collect vehicle-related information, occupant-related information, and exterior environmental and object information associated with the vehicle;
an automatic vehicle location system to determine a current spatial location of the vehicle;
a computer readable medium to store selected information;
an arithmetic logic unit that performs mathematical operations;
a data bus that, at the request of the arithmetic logic unit, sends data to or receives data from the computer readable medium;
an address bus that, at the request of the arithmetic logic unit, sends an address to the computer readable medium;
a read and write line that, at the request of the arithmetic logic unit, commands the computer readable medium whether to set or retrieve a location corresponding to the address;
one or more registers to latch a value on the data bus or output by the arithmetic logic unit; and
one or more buffers,
wherein the arithmetic logic unit is coupled to the plurality of sensors, automatic vehicle location system, and computer readable medium, and:
determines a current spatial location of the vehicle,
receives current vehicle-related information, current occupant-related information, and exterior environmental and object information,
generates, from the exterior environmental and object information, a three-dimensional map comprising exterior animate objects in spatial proximity to the vehicle, the exterior animate objects comprising a selected exterior animate object,
models from the three-dimensional map a first predicted behavior of the selected exterior animate object,
receives a different second predicted behavior of the selected exterior animate object generated by another vehicle, and
based on the three-dimensional map and the first and second predicted behaviors of the selected exterior animate object, issues a command to a vehicle component to perform a vehicle driving operation,
wherein the first and second predicted behaviors, each executed alone by the arithmetic logic unit, cause the arithmetic logic unit to produce different commands.

2. The vehicle of claim 1, wherein the different commands are different than the issued command, wherein the behaviors correspond to one or more of an acceleration event, an acceleration rate, a deceleration event, a deceleration rate, a steering angle relative to a reference axis, and a spacing distance between an exterior surface of the vehicle and a nearby exterior object in the three-dimensional map.

3. The vehicle of claim 1, wherein the second predicted behavior has a corresponding application limitation defining when the second predicted behavior is to be used instead of the first predicted behavior, and wherein the corresponding application limitation is one or more of a temporal limitation, a spatial limitation, and an event duration limitation.

4. The vehicle of claim 1, wherein the second predicted behavior is received as part of navigation information comprising a dimensional array of features, each feature having an attribute of location and category, and wherein the navigation information comprises or references the second predicted behavior, wherein the navigation information further comprises one or more of a command to the vehicle arithmetic logic unit, request to the vehicle arithmetic logic unit, warning of a hazard, instruction to be performed by the vehicle arithmetic logic unit, rule to be applied by the vehicle arithmetic logic unit, and link to one or more of the command, request, warning, instruction, or rule, wherein the navigation information comprises a flag field to indicate whether or not the navigation information comprises a predicted behavior to be considered by the vehicle, and wherein the microprocessor forwards the identified autonomous driving information to at least one other autonomous vehicle in spatial proximity to the autonomous vehicle for possible execution by a microprocessor of the at least one other autonomous vehicle, and wherein an autonomous mode of operation of the vehicle is at least level 2 or higher.

5. The vehicle of claim 1, wherein the vehicle forms an ad hoc network with a different vehicle, wherein the vehicle communicates to the different vehicle not to transmit to a remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle in temporal proximity to the vehicle collection of the exterior environmental and object information, and wherein the vehicle transmits, to a remote control source or navigation source, the current vehicle-related information and current occupant-related information, three-dimensional map, current spatial location of the vehicle, and predicted behavior of one or more of the exterior animate objects and/or one or more of the vehicle occupants.

6. The vehicle of claim 5, wherein the different vehicle transmits current vehicle-related information and current occupant-related information collected by the respective plurality of sensors of the different vehicle to the remote control source or navigation source but does not transmit to the remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle.

7. The vehicle of claim 5, wherein the microprocessor transmits the predicted behaviors of the one or more exterior animate objects to the different vehicle for execution by a microprocessor of the different vehicle.

8. The vehicle of claim 5, wherein the microprocessor transmits the predicted behaviors of the one or more occupants of the vehicle to the different vehicle for execution by a microprocessor of the different vehicle.

9. The vehicle of claim 1, wherein the plurality of sensors comprise a lidar sensor, radar sensor, ultrasonic sensor, camera, and infrared sensor, wherein the command is one or more of an acceleration rate of the vehicle, a deceleration rate of the vehicle, a steering angle of the vehicle, and an inter-object spacing of the vehicle relative to an exteriorly located object, and wherein the vehicle transmits to the control source or navigation source an identifier associated with the vehicle and different vehicle in connection with the exterior environmental and object information.

10. The vehicle of claim 1, wherein what vehicles are in the ad hoc network is based on one or more of spatial location or proximity of potentially networked vehicles, received signal strength by one potentially networked vehicle of a signal transmitted by another potentially networked vehicle, directions of travel of potentially networked vehicles, roadway type traveled by potentially networked vehicles, types of potentially networked vehicles, models of potentially networked vehicles, and manufacturers of potentially networked vehicles, and wherein which vehicles are members of the ad hoc network change in response to vehicle movement.

11.
A method for autonomous operation of a vehicle, comprising:
providing a vehicle comprising a vehicle interior for receiving one or more occupants, a plurality of sensors to collect vehicle-related information, occupant-related information, and exterior environmental and object information associated with the vehicle, an automatic vehicle location system to determine a current spatial location of the vehicle, a computer readable medium to store selected information, an arithmetic logic unit that performs mathematical operations, a data bus that, at the request of the arithmetic logic unit, sends data to or receives data from the computer readable medium, an address bus that, at the request of the arithmetic logic unit, sends an address to the computer readable medium, a read and write line that, at the request of the arithmetic logic unit, commands the computer readable medium whether to set or retrieve a location corresponding to the address, one or more registers to latch a value on the data bus or output by the arithmetic logic unit, and one or more buffers;
determining, by the arithmetic logic unit, a current spatial location of the vehicle;
receiving, by the arithmetic logic unit, current vehicle-related information, current occupant-related information, and exterior environmental and object information;
generating, by the arithmetic logic unit and from the exterior environmental and object information, a three-dimensional map comprising exterior animate objects in spatial proximity to the vehicle, the exterior animate objects comprising a selected exterior animate object;
modeling, by the arithmetic logic unit and from the three-dimensional map, a first predicted behavior of the selected exterior animate object;
receiving, by the arithmetic logic unit, a different second predicted behavior of the selected exterior animate object generated by another vehicle; and
based on the three-dimensional map and the first and second predicted behaviors of the selected exterior animate object, issuing, by the arithmetic logic unit, a command to a vehicle component to perform a vehicle driving operation,
wherein the first and second predicted behaviors, each executed alone by the arithmetic logic unit, cause the arithmetic logic unit to produce different commands.

12. The method of claim 11, wherein different commands are different than the issued command, wherein the behaviors correspond to one or more of an acceleration event, an acceleration rate, a deceleration event, a deceleration rate, a steering angle relative to a reference axis, and a spacing distance between an exterior surface of the vehicle and a nearby exterior object in the three-dimensional map.

13. The method of claim 11, wherein the second predicted behavior has a corresponding application limitation defining when the second predicted behavior is to be used instead of the first predicted behavior, and wherein the corresponding application limitation is one or more of a temporal limitation, a spatial limitation, and an event duration limitation.

14. The method of claim 11, wherein the second predicted behavior is received as part of navigation information comprising a dimensional array of features, each feature having an attribute of location and category, and wherein the navigation information comprises or references the second predicted behavior, wherein the navigation information further comprises one or more of a command to the vehicle arithmetic logic unit, request to the vehicle arithmetic logic unit, warning of a hazard, instruction to be performed by the vehicle arithmetic logic unit, rule to be applied by the vehicle arithmetic logic unit, and link to one or more of the command, request, warning, instruction, or rule, wherein the navigation information comprises a flag field to indicate whether or not the navigation information comprises a predicted behavior to be considered by the vehicle, and wherein the microprocessor forwards the identified autonomous driving information to at least one other autonomous vehicle in spatial proximity to the autonomous vehicle for possible execution by a microprocessor of the at least one other autonomous vehicle, and wherein an autonomous mode of operation of the vehicle is at least level 2 or higher.

15. The method of claim 11, wherein the vehicle forms an ad hoc network with a different vehicle, wherein the vehicle communicates to the different vehicle not to transmit to a remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle in temporal proximity to the vehicle collection of the exterior environmental and object information, and wherein the vehicle transmits, to a remote control source or navigation source, the current vehicle-related information and current occupant-related information, three-dimensional map, current spatial location of the vehicle, and predicted behavior of one or more of the exterior animate objects and/or one or more of the vehicle occupants.

16. The method of claim 15, wherein the different vehicle transmits current vehicle-related information and current occupant-related information collected by the respective plurality of sensors of the different vehicle to the remote control source or navigation source but does not transmit to the remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle.

17. The method of claim 15, wherein the microprocessor transmits the predicted behaviors of the one or more exterior animate objects to the different vehicle for execution by a microprocessor of the different vehicle, and wherein the microprocessor transmits the predicted behaviors of the one or more occupants of the vehicle to the different vehicle for execution by a microprocessor of the different vehicle.

18. The method of claim 15, wherein the plurality of sensors comprise a lidar sensor, radar sensor, ultrasonic sensor, camera, and infrared sensor, wherein the command is one or more of an acceleration rate of the vehicle, a deceleration rate of the vehicle, a steering angle of the vehicle, and an inter-object spacing of the vehicle relative to an exteriorly located object, and wherein the vehicle transmits to the control source or navigation source an identifier associated with the vehicle and different vehicle in connection with the exterior environmental and object information, wherein what vehicles are in the ad hoc network is based on one or more of spatial location or proximity of potentially networked vehicles, received signal strength by one potentially networked vehicle of a signal transmitted by another potentially networked vehicle, directions of travel of potentially networked vehicles, roadway type traveled by potentially networked vehicles, types of potentially networked vehicles, models of potentially networked vehicles, and manufacturers of potentially networked vehicles, and wherein which vehicles are members of the ad hoc network change in response to vehicle movement.

19.
A method for autonomous operation of a vehicle, comprising:
providing a vehicle comprising a vehicle interior for receiving one or more occupants, a plurality of sensors to collect vehicle-related information, occupant-related information, and exterior environmental and object information associated with the vehicle, an automatic vehicle location system to determine a current spatial location of the vehicle, a computer readable medium to store selected information, an arithmetic logic unit that performs mathematical operations, a data bus that, at the request of the arithmetic logic unit, sends data to or receives data from the computer readable medium, an address bus that, at the request of the arithmetic logic unit, sends an address to the computer readable medium, a read and write line that, at the request of the arithmetic logic unit, commands the computer readable medium whether to set or retrieve a location corresponding to the address, one or more registers to latch a value on the data bus or output by the arithmetic logic unit, and one or more buffers;
determining, by the arithmetic logic unit, a current spatial location of the vehicle;
receiving, by the arithmetic logic unit, current vehicle-related information, current occupant-related information, and exterior environmental and object information;
generating, by the arithmetic logic unit and from the exterior environmental and object information, a three-dimensional map comprising exterior animate objects in spatial proximity to the vehicle, the exterior animate objects comprising the selected exterior animate object;
modeling, by the arithmetic logic unit and from the three-dimensional map, a first predicted behavior of a selected exterior animate object;
receiving, by the arithmetic logic unit, a different second predicted behavior of the selected exterior animate object generated by another vehicle; and
based on the three-dimensional map and the first and second predicted behaviors of the selected exterior animate object, issuing, by the arithmetic logic unit, a command to a vehicle component to perform a vehicle driving operation,
wherein the first and second predicted behaviors, each executed alone by the arithmetic logic unit, produce different commands, wherein the behaviors correspond to one or more of an acceleration event, an acceleration rate, a deceleration event, a deceleration rate, a steering angle relative to a reference axis, and a spacing distance between an exterior surface of the vehicle and a nearby exterior object in the three-dimensional map.

20. The method of claim 19, wherein the second predicted behavior has a corresponding application limitation defining when the second predicted behavior is to be used instead of the first predicted behavior, and wherein the corresponding application limitation is one or more of a temporal limitation, a spatial limitation, and an event duration limitation, wherein the second predicted behavior is received as part of navigation information comprising a dimensional array of features, each feature having an attribute of location and category, and wherein the navigation information comprises or references the second predicted behavior, wherein the navigation information further comprises one or more of a command to the vehicle arithmetic logic unit, request to the vehicle arithmetic logic unit, warning of a hazard, instruction to be performed by the vehicle arithmetic logic unit, rule to be applied by the vehicle arithmetic logic unit, and link to one or more of the command, request, warning, instruction, or rule, wherein the navigation information comprises a flag field to indicate whether or not the navigation information comprises a predicted behavior to be considered by the vehicle, and wherein the microprocessor forwards the identified autonomous driving information to at least one other autonomous vehicle in spatial proximity to the autonomous vehicle for possible execution by a microprocessor of the at least one other autonomous vehicle, wherein an autonomous mode of operation of the vehicle is at least level 2 or higher, wherein the vehicle forms an ad hoc network with a different vehicle, wherein the vehicle communicates to the different vehicle not to transmit to a remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle in temporal proximity to the vehicle collection of the exterior environmental and object information, and wherein the vehicle transmits, to a remote control source or navigation source, the current vehicle-related information and current occupant-related information, three-dimensional map, current spatial location of the vehicle, and predicted behavior of one or more of the exterior animate objects and/or one or more of the vehicle occupants.
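Strictly as an illustration, not part of the patent text: the core of claims 1 and 3 is selecting between a locally modeled predicted behavior and one received from another vehicle, where the received behavior is used only while its "application limitation" (temporal or spatial) holds. The sketch below captures that selection logic; all names, fields, and thresholds are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PredictedBehavior:
    """A predicted behavior of an exterior animate object (e.g. a pedestrian).
    Field names are illustrative, not taken from the patent."""
    source: str                    # "local" (modeled on-vehicle) or "remote"
    speed_mps: float               # predicted speed of the object
    # Claim 3: application limitations saying when a remote behavior
    # should be used instead of the locally modeled one.
    valid_until_s: Optional[float] = None   # temporal limitation
    valid_radius_m: Optional[float] = None  # spatial limitation

def applicable(b: PredictedBehavior, now_s: float, dist_m: float) -> bool:
    """True when the behavior's application limitations (if any) are satisfied."""
    if b.valid_until_s is not None and now_s > b.valid_until_s:
        return False
    if b.valid_radius_m is not None and dist_m > b.valid_radius_m:
        return False
    return True

def select_behavior(first: PredictedBehavior, second: PredictedBehavior,
                    now_s: float, dist_m: float) -> PredictedBehavior:
    """Use the remote (second) behavior only while its limitation applies."""
    return second if applicable(second, now_s, dist_m) else first

def command_for(b: PredictedBehavior) -> str:
    """Map the selected behavior to a driving command (grossly simplified)."""
    return "decelerate" if b.speed_mps > 1.0 else "maintain_speed"
```

As in claim 1, the two behaviors would produce different commands if each were used alone: a fast-moving remote prediction yields `"decelerate"` while the slow local one yields `"maintain_speed"`; the limitation check decides which is issued.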
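Also purely illustrative: claims 4, 14, and 20 describe navigation information as a dimensional array of features (each with location and category attributes) plus a flag field signaling whether a predicted behavior is included for the receiver to consider. A minimal sketch of such a record, with hypothetical names and an assumed payload shape:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Feature:
    """One entry in the claimed 'dimensional array of features'."""
    location: tuple      # e.g. (lat, lon); representation is assumed
    category: str        # e.g. "pedestrian", "hazard"

@dataclass
class NavigationInfo:
    features: List[Feature] = field(default_factory=list)
    # The claimed flag field: does this record carry a predicted behavior?
    has_predicted_behavior: bool = False
    predicted_behavior: Optional[dict] = None  # payload shape is illustrative

def ingest(nav: NavigationInfo) -> Optional[dict]:
    """Return the predicted behavior only when the flag says one is present."""
    return nav.predicted_behavior if nav.has_predicted_behavior else None
```

The flag lets a receiving vehicle skip behavior processing entirely when the record carries only features, commands, or warnings.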
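Finally, claims 10 and 18 list criteria for ad hoc network membership (proximity, received signal strength, direction of travel, and so on) and note that membership changes as vehicles move. The filter below sketches that idea; the specific thresholds and the choice of criteria retained are assumptions for illustration.

```python
from dataclasses import dataclass
from math import hypot
from typing import List

@dataclass
class Candidate:
    """A potentially networked vehicle as seen by the host vehicle."""
    vehicle_id: str
    x_m: float
    y_m: float
    heading_deg: float
    rssi_dbm: float   # received signal strength of its broadcast

def network_members(own_x: float, own_y: float, own_heading_deg: float,
                    candidates: List[Candidate],
                    max_range_m: float = 300.0,
                    min_rssi_dbm: float = -90.0,
                    max_heading_diff_deg: float = 45.0) -> List[str]:
    """Filter candidates by proximity, signal strength, and direction of travel.
    Re-running this as positions update makes membership change with movement."""
    members = []
    for c in candidates:
        in_range = hypot(c.x_m - own_x, c.y_m - own_y) <= max_range_m
        strong = c.rssi_dbm >= min_rssi_dbm
        # Smallest signed angle between headings, compared to the tolerance.
        same_way = abs((c.heading_deg - own_heading_deg + 180) % 360 - 180) \
                   <= max_heading_diff_deg
        if in_range and strong and same_way:
            members.append(c.vehicle_id)
    return members
```

A vehicle driving the opposite direction or beyond radio range drops out of the returned membership list on the next evaluation, matching the claim's note that membership changes in response to vehicle movement.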
Copyright KISTI. All Rights Reserved.