Method and system for collective autonomous operation database for autonomous vehicles
IPC Classification Information
Country / Type: United States (US) Patent — Granted
International Patent Classification (IPC, 7th ed.): G08G-001/00; G05D-001/00; G08G-001/01; G08G-001/017
Application No.: US-0394489 (2016-12-29)
Registration No.: US-10083604 (2018-09-25)
Inventors / Address: Ricci, Christopher P.; Singhal, Abhishek
Applicant / Address: NIO USA, Inc.
Agent / Address: Sheridan Ross PC
Citation Information: Cited-by count: 0; Cited patents: 199
Abstract
Systems of an electric vehicle, and the operations thereof, are provided that form ad hoc autonomous vehicle networks to conserve bandwidth when receiving information from, and sending information to, a remote source.
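The bandwidth-conserving idea in the abstract can be sketched in Python. This is a minimal illustration with hypothetical names (`Vehicle`, `form_ad_hoc_network`, the string message labels), not the patented implementation: one vehicle in the ad hoc network keeps uploading the shared exterior environment data, while it commands its peers to suppress their duplicate exterior uploads; every vehicle still transmits its own vehicle- and occupant-related information, which is vehicle-specific.

```python
from dataclasses import dataclass


@dataclass
class Vehicle:
    vehicle_id: str
    suppress_exterior_upload: bool = False  # set by a peer's "do not transmit" message

    def messages_for_remote_source(self):
        # Vehicle- and occupant-related data are always vehicle-specific,
        # so every vehicle transmits them.
        msgs = ["vehicle_related", "occupant_related"]
        # Exterior environment/object data overlap between nearby vehicles;
        # a suppressed peer omits them to conserve uplink bandwidth.
        if not self.suppress_exterior_upload:
            msgs.append("exterior_environment")
        return msgs


def form_ad_hoc_network(vehicles):
    # Hypothetical election rule: the first vehicle acts as reporter and
    # commands the others not to transmit their overlapping exterior data.
    reporter, *peers = vehicles
    for peer in peers:
        peer.suppress_exterior_upload = True
    return reporter


cars = [Vehicle("A"), Vehicle("B"), Vehicle("C")]
reporter = form_ad_hoc_network(cars)
print(reporter.messages_for_remote_source())  # all three streams
print(cars[1].messages_for_remote_source())   # exterior data omitted
```

Here only one of the three vehicles uploads the redundant exterior stream, so uplink traffic for that stream drops by two-thirds in this toy network.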
Representative Claims
1. An autonomous vehicle, comprising: a vehicle interior for receiving one or more occupants; a plurality of sensors to collect vehicle-related information, occupant-related information, and exterior environmental and object information associated with the vehicle; an automatic vehicle location system to determine a current spatial location of the vehicle; a computer readable medium to store networking instructions to form an ad hoc network comprising the vehicle and a different vehicle; an arithmetic logic unit that performs mathematical operations; a data bus that, at the request of the arithmetic logic unit, sends data to or receives data from the computer readable medium; an address bus that, at the request of the arithmetic logic unit, sends an address to the computer readable medium; a read and write line that, at the request of the arithmetic logic unit, commands the computer readable medium whether to set or retrieve a location corresponding to the address; one or more registers to latch a value on the data bus or output by the arithmetic logic unit; and one or more buffers, wherein the arithmetic logic unit is coupled to the plurality of sensors, automatic vehicle location system, and computer readable medium, and determines a current spatial location of the vehicle, receives current vehicle-related information, current occupant-related information, and exterior environmental and object information, generates, from the exterior environmental and object information, a three-dimensional map comprising exterior animate objects in spatial proximity to the vehicle, models, from the three-dimensional map, predicted behavior of one or more of the exterior animate objects and, from the occupant-related information, predicted behavior of one or more vehicle occupants, based on the three-dimensional map and predicted behaviors of the one or more exterior animate objects and one or more vehicle occupants, issues a command to a vehicle component to perform a vehicle driving operation, and communicates to the different vehicle not to transmit to a remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle in temporal proximity to the vehicle collection of the exterior environmental and object information, wherein the vehicle transmits, to a remote control source or navigation source, the current vehicle-related information and current occupant-related information, three-dimensional map, current spatial location of the vehicle, and predicted behavior of one or more of the exterior animate objects and/or one or more of the vehicle occupants.

2. The vehicle of claim 1, wherein the arithmetic logic unit transmits, to a remote control source or navigation source, the exterior environmental and object information collected by the plurality of sensors, wherein the different vehicle transmits current vehicle-related information and current occupant-related information collected by the respective plurality of sensors of the different vehicle to the remote control source or navigation source but does not transmit the exterior environmental and object information collected by the respective plurality of sensors of the different vehicle.

3. The vehicle of claim 1, wherein the microprocessor transmits the predicted behaviors of the one or more exterior animate objects to the different vehicle for execution by a microprocessor of the different vehicle.

4. The vehicle of claim 1, wherein the microprocessor transmits the predicted behaviors of the one or more occupants of the vehicle to the different vehicle for execution by a microprocessor of the different vehicle.

5.
The vehicle of claim 1, wherein the plurality of sensors comprise a lidar sensor, radar sensor, ultrasonic sensor, camera, and infrared sensor, wherein the command is one or more of an acceleration rate of the vehicle, a deceleration rate of the vehicle, a steering angle of the vehicle, and an inter-object spacing of the vehicle relative to an exteriorly located object, and wherein the vehicle transmits to the control source or navigation source an identifier associated with the vehicle and different vehicle in connection with the three-dimensional map.

6. The vehicle of claim 1, wherein what vehicles are in the ad hoc network is based on one or more of spatial location or proximity of potentially networked vehicles, received signal strength by one potentially networked vehicle of a signal transmitted by another potentially networked vehicle, directions of travel of potentially networked vehicles, roadway type traveled by potentially networked vehicles, types of potentially networked vehicles, models of potentially networked vehicles, and manufacturers of potentially networked vehicles, and wherein which vehicles are members of the ad hoc network changes in response to vehicle movement.

7.
The vehicle of claim 1, wherein the predicted behavior is a first predicted behavior for a selected exterior animate object in the three-dimensional map, wherein the vehicle receives a different second predicted behavior of the selected exterior animate object generated by another vehicle and, based on the three-dimensional map and the first and second predicted behaviors of the selected exterior animate object, issues a command to a vehicle component to perform a vehicle driving operation, wherein the first and second predicted behaviors, each executed alone by the arithmetic logic unit, produce different commands, and wherein the command corresponds to one or more of an acceleration event, an acceleration rate, a deceleration event, a deceleration rate, a steering angle relative to a reference axis, and a spacing distance between an exterior surface of the vehicle and a nearby object.

8. The vehicle of claim 7, wherein the second predicted behavior has a corresponding application limitation defining when the second predicted behavior is to be used instead of the first predicted behavior, and wherein the corresponding application limitation is one or more of a temporal limitation, a spatial limitation, and an event duration limitation.

9.
The vehicle of claim 8, wherein the second predicted behavior is received as part of navigation information comprising a dimensional array of features, each feature having an attribute of location and category, and wherein the navigation information comprises or references the second predicted behavior, wherein the navigation information further comprises one or more of a command to the vehicle arithmetic logic unit, request to the vehicle arithmetic logic unit, warning of a hazard, instruction to be performed by the vehicle arithmetic logic unit, rule to be applied by the vehicle arithmetic logic unit, and link to one or more of the command, request, warning, instruction, or rule, wherein the navigation information comprises a flag field to indicate whether or not the navigation information comprises a predicted behavior to be considered by the vehicle, wherein the microprocessor forwards the identified autonomous driving information to at least one other autonomous vehicle in spatial proximity to the autonomous vehicle for possible execution by a microprocessor of the at least one other autonomous vehicle, and wherein an autonomous mode of operation of the vehicle is at least level 2 or higher.

10.
A method for autonomous operation of a vehicle, comprising: providing a vehicle comprising a vehicle interior for receiving one or more occupants, a plurality of sensors to collect vehicle-related information, occupant-related information, and exterior environmental and object information associated with the vehicle, an automatic vehicle location system to determine a current spatial location of the vehicle, a computer readable medium to store networking instructions to form an ad hoc network comprising the vehicle and a different vehicle, an arithmetic logic unit that performs mathematical operations, a data bus that, at the request of the arithmetic logic unit, sends data to or receives data from the computer readable medium, an address bus that, at the request of the arithmetic logic unit, sends an address to the computer readable medium, a read and write line that, at the request of the arithmetic logic unit, commands the computer readable medium whether to set or retrieve a location corresponding to the address, one or more registers to latch a value on the data bus or output by the arithmetic logic unit, and one or more buffers; determining, by the arithmetic logic unit, a current spatial location of the vehicle; receiving, by the arithmetic logic unit and from the plurality of sensors, current vehicle-related information, current occupant-related information, and exterior environmental and object information; generating, by the arithmetic logic unit, from the exterior environmental and object information, a three-dimensional map comprising exterior animate objects in spatial proximity to the vehicle; determining, by the arithmetic logic unit executing one or more behavioral models and based on the three-dimensional map, a predicted behavior of one or more of the exterior animate objects and, from the occupant-related information, a predicted behavior of one or more vehicle occupants; based on the three-dimensional map and predicted behaviors of the one or more exterior animate objects and one or more vehicle occupants, issuing, by the arithmetic logic unit, a command to a vehicle component to perform a vehicle driving operation; causing, by the arithmetic logic unit, the different vehicle not to transmit to a remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle in temporal proximity to the vehicle collection of the exterior environmental and object information; and causing, by the arithmetic logic unit, transmission, to a remote control source or navigation source, of the current vehicle-related information and current occupant-related information, three-dimensional map, current spatial location of the vehicle, and predicted behavior of one or more of the exterior animate objects and/or one or more of the vehicle occupants.

11. The method of claim 10, further comprising: transmitting, by the arithmetic logic unit and to a remote control source or navigation source, the exterior environmental and object information collected by the plurality of sensors, wherein the different vehicle transmits current vehicle-related information and current occupant-related information collected by the respective plurality of sensors of the different vehicle to the remote control source or navigation source but does not transmit the exterior environmental and object information collected by the respective plurality of sensors of the different vehicle.

12. The method of claim 10, wherein the arithmetic logic unit causes transmission of the predicted behaviors of the one or more exterior animate objects to the different vehicle for execution by a microprocessor of the different vehicle.

13. The method of claim 10, wherein the arithmetic logic unit causes transmission of the predicted behaviors of the one or more occupants of the vehicle to the different vehicle for execution by a microprocessor of the different vehicle.

14.
The method of claim 10, wherein the plurality of sensors comprise a lidar sensor, radar sensor, ultrasonic sensor, camera, and infrared sensor, wherein the command is one or more of an acceleration rate of the vehicle, a deceleration rate of the vehicle, a steering angle of the vehicle, and an inter-object spacing of the vehicle relative to an exteriorly located object, and wherein the vehicle transmits to the control source or navigation source an identifier associated with the vehicle and different vehicle in connection with the three-dimensional map.

15. The method of claim 10, wherein what vehicles are in the ad hoc network is based on one or more of spatial location or proximity of potentially networked vehicles, received signal strength by one potentially networked vehicle of a signal transmitted by another potentially networked vehicle, directions of travel of potentially networked vehicles, roadway type traveled by potentially networked vehicles, types of potentially networked vehicles, models of potentially networked vehicles, and manufacturers of potentially networked vehicles, and wherein which vehicles are members of the ad hoc network changes in response to vehicle movement.

16.
The method of claim 10, wherein the predicted behavior is a first predicted behavior for a selected exterior animate object in the three-dimensional map, wherein the vehicle receives a different second predicted behavior of the selected exterior animate object generated by another vehicle and, based on the three-dimensional map and the first and second predicted behaviors of the selected exterior animate object, issues a command to a vehicle component to perform a vehicle driving operation, wherein the first and second predicted behaviors, each executed alone by the arithmetic logic unit, produce different commands, and wherein the command corresponds to one or more of an acceleration event, an acceleration rate, a deceleration event, a deceleration rate, a steering angle relative to a reference axis, and a spacing distance between an exterior surface of the vehicle and a nearby object.

17. The method of claim 16, wherein the second predicted behavior has a corresponding application limitation defining when the second predicted behavior is to be used instead of the first predicted behavior, and wherein the corresponding application limitation is one or more of a temporal limitation, a spatial limitation, and an event duration limitation.

18.
The method of claim 17, wherein the second predicted behavior is received as part of navigation information comprising a dimensional array of features, each feature having an attribute of location and category, and wherein the navigation information comprises or references the second predicted behavior, wherein the navigation information further comprises one or more of a command to the vehicle arithmetic logic unit, request to the vehicle arithmetic logic unit, warning of a hazard, instruction to be performed by the vehicle arithmetic logic unit, rule to be applied by the vehicle arithmetic logic unit, and link to one or more of the command, request, warning, instruction, or rule, wherein the navigation information comprises a flag field to indicate whether or not the navigation information comprises a predicted behavior to be considered by the vehicle, wherein the microprocessor forwards the identified autonomous driving information to at least one other autonomous vehicle in spatial proximity to the autonomous vehicle for possible execution by a microprocessor of the at least one other autonomous vehicle, and wherein an autonomous mode of operation of the vehicle is at least level 2 or higher.

19.
A method for autonomous operation of a vehicle, comprising: providing a vehicle comprising a vehicle interior for receiving one or more occupants, a plurality of sensors to collect vehicle-related information, occupant-related information, and exterior environmental and object information associated with the vehicle, an automatic vehicle location system to determine a current spatial location of the vehicle, a computer readable medium to store networking instructions to form an ad hoc network comprising the vehicle and a different vehicle, an arithmetic logic unit that performs mathematical operations, a data bus that, at the request of the arithmetic logic unit, sends data to or receives data from the computer readable medium, an address bus that, at the request of the arithmetic logic unit, sends an address to the computer readable medium, a read and write line that, at the request of the arithmetic logic unit, commands the computer readable medium whether to set or retrieve a location corresponding to the address, and one or more registers that enable the arithmetic logic unit to perform mathematical operations; determining, by the arithmetic logic unit, a current spatial location of the vehicle; receiving, by the arithmetic logic unit, current vehicle-related information, current occupant-related information, and exterior environmental and object information; generating, by the arithmetic logic unit, from the exterior environmental and object information, a three-dimensional map comprising exterior animate objects in spatial proximity to the vehicle; determining, by the arithmetic logic unit executing one or more behavioral models and based on the three-dimensional map, a predicted behavior of one or more of the exterior animate objects and, from the occupant-related information, a predicted behavior of one or more vehicle occupants; based on the three-dimensional map and predicted behaviors of the one or more exterior animate objects and one or more vehicle occupants, issuing, by the arithmetic logic unit, a command to a vehicle component to perform a vehicle driving operation; causing, by the arithmetic logic unit, a communication to be transmitted to the different vehicle commanding the different vehicle not to transmit to a remote control source or navigation source exterior environmental and object information collected by a respective plurality of sensors of the different vehicle in temporal proximity to the vehicle collection of the exterior environmental and object information; causing, by the arithmetic logic unit, transmission, to a remote control source or navigation source, of the current vehicle-related information and current occupant-related information, three-dimensional map, current spatial location of the vehicle, and predicted behavior of one or more of the exterior animate objects and/or one or more of the vehicle occupants; and causing, by the arithmetic logic unit, transmission of the predicted behaviors of the one or more exterior animate objects and one or more occupants of the vehicle to the different vehicle for execution by a microprocessor of the different vehicle.

20. The method of claim 19, further comprising: transmitting, by the arithmetic logic unit and to a remote control source or navigation source, the exterior environmental and object information collected by the plurality of sensors, wherein the different vehicle transmits current vehicle-related information and current occupant-related information collected by the respective plurality of sensors of the different vehicle to the remote control source or navigation source but does not transmit the exterior environmental and object information collected by the respective plurality of sensors of the different vehicle.

21. The method of claim 19, wherein the vehicle transmits to the control source or navigation source an identifier associated with the vehicle and different vehicle in connection with the three-dimensional map.
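The navigation information of claims 8-9 and 17-18 is essentially a data structure: an array of features (each with a location and a category), a flag field signaling whether a peer's predicted behavior is attached, and an application limitation that decides when that second prediction overrides the vehicle's own. The following Python sketch illustrates that structure with hypothetical field names and a simplified limitation (temporal plus spatial only); it is not the claimed encoding.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Feature:
    location: tuple   # e.g. (lat, lon)
    category: str     # e.g. "pedestrian", "cyclist"


@dataclass
class ApplicationLimitation:
    # Claim 8: defines when the received (second) prediction is used
    # instead of the locally generated (first) prediction.
    valid_until: float   # temporal limitation (hypothetical epoch seconds)
    radius_m: float      # spatial limitation around the object


@dataclass
class NavigationInformation:
    features: list
    has_predicted_behavior: bool               # claim 9's flag field
    predicted_behavior: Optional[str] = None   # the second predicted behavior
    limitation: Optional[ApplicationLimitation] = None


def select_behavior(local, nav, now, distance_m):
    # Use the peer's prediction only if the flag is set and its
    # application limitation still holds; otherwise keep the local one.
    if (nav.has_predicted_behavior
            and nav.limitation is not None
            and now <= nav.limitation.valid_until
            and distance_m <= nav.limitation.radius_m):
        return nav.predicted_behavior
    return local


nav = NavigationInformation(
    features=[Feature((47.6, -122.3), "pedestrian")],
    has_predicted_behavior=True,
    predicted_behavior="will_cross_street",
    limitation=ApplicationLimitation(valid_until=100.0, radius_m=50.0),
)
print(select_behavior("continues_on_sidewalk", nav, now=90.0, distance_m=20.0))
print(select_behavior("continues_on_sidewalk", nav, now=200.0, distance_m=20.0))
```

The first call falls inside both limitations, so the peer's prediction wins; the second call is past the temporal limitation, so the vehicle falls back to its own prediction — mirroring how claims 7-8 let two conflicting predictions coexist with a defined precedence rule.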
Patents cited by this patent (199)
McWalter, William F.; Kelly, Lisa M.; Decristo, Dianna L.; Razavi, Behfar, Abstract user interface manager with prioritization.
Abrams, Vincent D.; Izzard, III, Alexander Edwin; Cunningham, Glen; Parker, Kenneth R.; Woods, Timothy E., Appliance communication and control system and appliances for use in same.
Berenz, John J.; McIver, George W.; Niesen, Joseph W.; Dunbridge, Barry; Shreve, Gregory A., Application of human facial features recognition to automobile safety.
Clanton, Charles H.; Ventrella, Jeffrey J.; Paiz, Fernando J., Cinematic techniques in avatar-centric communication during a multi-user online simulation.
Hayes, Kent Fillmore, Jr.; King, Brett Graham, Client-server system for maintaining a user desktop consistent with server application user access permissions.
Ohta, Takashi (JP); Iwatsuki, Kunihiro (JP); Fukumura, Kagenori (JP), Control system for controlling the behavior of a vehicle based on accurately detected route information.
Aaron, Jeffrey; Streijl, Robert, Devices, methods, and computer-readable media for providing sevices based upon identification of decision makers and owners associated with communication services.
Ashihara, Jun, Driver authentication apparatus and method for identifying automatically-extracted driver's operation feature data with already-registered feature data.
Lepley, Geoffrey Peter; Miles, Dean Anthony; Gallichan, Kevin Langley; Robberts, Nicholas James, Dual-function removable reversable unit for radio and telephone.
Kashima, Koji; Sakaguchi, Tatsumi; Oryoji, Hiroshi; Eshima, Masashi, Electronic apparatus, reproduction control system, reproduction control method, and program therefor.
Leising Maurice B. (Clawson MI) Benford Howard L. (Bloomfield Hills MI) Holbrook Gerald L. (Rochester Hills MI), Electronically-controlled, adaptive automatic transmission system.
Filev, Dimitar Petrov; Gusikhin, Oleg Yurievitch; Syed, Fazal Urrahman; Klampfl, Erica; Giuli, Thomas J.; Chen, Yifan, Emotive engine and method for generating a simulated emotion for an information system.
Chatham Michael D. (Bloomington IL) Fotsch Paul D. (Dunlap IL) Heyveld Doyle G. (Peoria IL) Kelley Edward P. (Chillicothe IL) Lohmann ; Jr. Walter E. (Decatur IL) Roley David R. (Morton IL) Sieck Cha, Fatigue analysis and warning system.
Waeller, Christoph; Wu, Yongmei; Bohnenberger, Thorsten, Information device, preferably in a motor vehicle, and method for supplying information about vehicle data, in particular vehicle functions and their operation.
Shuman, Valerie; Paulauskas, Cynthia; Shields, T. Russell; Weiland, Richard J.; Jasper, John C., Method and system for an in-vehicle computing architecture.
Cataldo, Anthony Joseph; Haggerty, Terry; Ubik, Henry Thomas; Patel, Mona; Harrington, Tim; Bacon, Tom; Wisherd, Dave; Bowman, Doug, Method and system for capturing vehicle data using an RF transmitter.
Uyeki, Robert; Tamura, Kazuya; Ohki, Eric Shigeru; Kurciska, Maja, Method and system for using traffic flow data to navigate a vehicle to a destination.
McLaughlin Paul F. (Hatfield PA) Bristow Robert W. (Hatboro PA) Kummer Karl T. (Doylestown PA), Method for enacting failover of a 1:1 redundant pair of slave processors.
Davis, Terry L., Method to use empty slots in onboard aircraft servers and communication devices to install non-proprietary servers and communications interfaces.
Rhoads, Geoffrey B.; Rodriguez, Tony F.; Lord, John D.; MacIntosh, Brian T.; Rhoads, Nicole; Conwell, William Y., Methods and systems for content processing.
Penilla, Angel A.; Penilla, Albert S., Methods and systems for defining vehicle user profiles and managing user profiles via cloud systems and applying learned settings to user profiles.
Baldas Jason Paul ; Simpson Tracy Lee ; Ohashi Hitoshi ; Morrison Gerald Oscar ; Alfano Gregory W. ; Palaski William Edwin ; George Richard David ; Avram Eileen Marie, Overhead console for motor vehicle.
Carnevali, Jeffrey D., Reconfigurable console mount having a plurality of interchangeable tongue-and-groove blank and equipment mounting panels and quick disconnect clamps.
Addepalli, Sateesh K.; Dai, Lillian Lei; Sudhaakar, Raghuram S.; Somers, Robert Edward, System and method for establishing communication channels between on-board unit of vehicle and plurality of nodes.
Addepalli, Sateesh K.; Moghe, Ashok K.; Bonomi, Flavio; Girardot, Marc Jean-Philippe; Thubert, Pascal, System and method for internal networking, data optimization and dynamic frequency selection in a vehicular environment.
Jensen, Peter Strarup; Veselov, Pavel S.; Ayyagari, Venkata S.; Grigoryev, Nikolay G., System and method for managing and deploying functional services to a vehicle client.
Van Wiemeersch, John Robert; Kleve, Robert Bruce; Schondorf, Steven Yellin; Miller, Thomas Lee; Bennie, Brian; Kwon, Dae Wook; Aldighieri, Paul, System and method for remotely controlling vehicle components from a nomadic communication device or computer.
Tomkins, Steve; Dodge, Dan; Van Der Veen, Peter; Tang, Xiaodan; Burgess, Colin, System having user interface using motion based object selection and mouse movement.
Michmerhuizen, Mark; Syfert, Timothy J.; Spencer, John D.; Strazanac, Joseph W., System, method and device for providing communication between a vehicle and a plurality of wireless devices having different communication standards.
Jackson, Dean Kenneth; Klein, Daniel Victor, Systems and methods for updating vehicle behavior and settings based on the locations of vehicle passengers.
Sanders Rudy T. (9520 Rhea Ave. Northridge CA 91324) Fleishman Lee (2169 Brookfield Dr. Thousand Oaks CA 91362), User identifying vehicle control and security device.
Shaw David C. H. (3312 E. Mandeville Pl. Orange CA 92667) Shaw Judy Z. Z. (3312 E. Mandeville Pl. Orange CA 92667), Vehicle collision avoidance system.
Zyburt Jeffrey P. ; Cowan Allan L. ; Grimaudo Donald W. ; Shaffer Frederick J. ; Muzzell Jeffrey ; Nelson James G. ; Gu Zhengang ; Frinkle Marvin L. ; Robinson David T. ; Zuo Kai, Vehicle control system for automated durability road (ADR) facility.
Sato Koji (Mishima JPX) Morita Makoto (Mishima JPX) Kizu Masafumi (Toyota JPX), Vehicle data processing system which can communicate with information center.
Filippov, Mikhail O.; Fitch, Osa; Keller, Scott P.; O'Connor, John; Zendzian, David S.; El Fata, Nadim; Larsen, Kevin; Meuchel, Arlen Eugene; Schmaltz, Mark David; Allard, James; De Roo, Chris A.; Norris, William Robert; Norby, Andrew Julian; Turner, Christopher David Glenn, Versatile robotic control module.