A system for controlling a vehicle navigating a roadway, including a perception module that generates sensor data and outputs a cost map and traffic data associated with traffic objects, a behavior planning module that receives the cost map and the traffic data from the perception module and generates planner primitives, a training module that receives the cost map and the traffic data from the perception module, receives driver input from a vehicle operator, and trains the behavior planning module, a local planning module comprising a set of task blocks that receives the cost map from the perception module and the planner primitives from the behavior planning module, selects a task block, and generates control commands using the selected task block; and a control module comprising an actuation subsystem, wherein the control module receives the control commands from the local planning module and controls the actuation subsystem.
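The abstract describes a pipeline of modules: perception produces a cost map and traffic data, behavior planning turns these into planner primitives, local planning selects a rule-based task block per primitive to generate control commands, and the control module actuates them. The following is a minimal illustrative sketch of that data flow; all class names, field names, and the toy decision rule are assumptions for illustration only, not the patented implementation.

```python
from dataclasses import dataclass
from typing import Dict, List

# Illustrative sketch of the claimed module pipeline:
# perception -> behavior planning -> local planning -> control.

@dataclass
class PerceptionOutput:
    cost_map: List[List[float]]   # 2-D grid of risk weights over roadway positions
    traffic_data: List[str]       # simplified: labels of detected traffic objects

def behavior_planning(p: PerceptionOutput) -> str:
    """Return a planner primitive based on the cost map and traffic data."""
    # Toy rule standing in for the trained machine-learning model.
    return "lane_change_left" if "neighboring_vehicle" in p.traffic_data else "lane_keep"

TASK_BLOCKS = {
    # Each task block is an explicitly programmed rule set that maps the
    # cost map to control commands (here: fixed steering/throttle values).
    "lane_keep": lambda cost_map: {"steering": 0.0, "throttle": 0.5},
    "lane_change_left": lambda cost_map: {"steering": -0.2, "throttle": 0.4},
}

def local_planning(primitive: str, cost_map) -> Dict[str, float]:
    """Select a task block from the primitive and generate control commands."""
    return TASK_BLOCKS[primitive](cost_map)

def control_module(commands: Dict[str, float]) -> str:
    """Drive the actuation subsystem (here, just report the actuation)."""
    return f"actuate steering={commands['steering']} throttle={commands['throttle']}"

perception = PerceptionOutput(cost_map=[[0.1, 0.9], [0.2, 0.8]],
                              traffic_data=["neighboring_vehicle", "lane_marking"])
primitive = behavior_planning(perception)
commands = local_planning(primitive, perception.cost_map)
print(control_module(commands))
```

The split mirrors the claims: the learned component is confined to primitive selection, while each task block remains an auditable, explicitly programmed rule set.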
Representative Claims
1. A system for controlling a vehicle navigating a roadway, comprising: a perception module comprising a sensor subsystem, wherein the sensor subsystem generates sensor data, and wherein the perception module outputs a cost map of the area proximal the vehicle, and traffic data associated with traffic objects proximal the vehicle based on an analysis of the sensor data; a behavior planning module that receives the cost map and the traffic data from the perception module and generates planner primitives based on the cost map and traffic data, wherein the behavior planning module comprises a decision-making block that consists essentially of a trained machine-learning model; a training module that receives the cost map and the traffic data from the perception module, receives driver input from a vehicle operator, and trains the behavior planning module based on the driver input, the cost map, and the traffic data; a local planning module comprising a set of task blocks, each of the set of task blocks consisting essentially of an explicitly-programmed set of rules, wherein the local planning module receives the cost map from the perception module and the planner primitives from the behavior planning module, selects a task block based on the planner primitives, and uses the selected task block to generate control commands based on the cost map; and a control module comprising an actuation subsystem, wherein the control module receives the control commands from the local planning module and controls the actuation subsystem based on the control commands.

2. The system of claim 1, wherein the sensor data comprises image data and range data corresponding to the environment surrounding the vehicle.

3. The system of claim 1, wherein the traffic data comprises at least one of a set of positions and a set of trajectories, the set of positions and the set of trajectories associated with a set of traffic objects classified by an object analysis block of the perception module.

4. The system of claim 3, wherein the set of traffic objects comprises at least one of a neighboring vehicle, a lane marking, and a roadway edge.

5. The system of claim 1, wherein the perception module outputs a geographic location of the vehicle, and further comprising a mission planning module that receives the geographic location from the perception module and generates a route plan based on the geographic location and a predetermined destination, wherein the behavior planning module receives the route plan from the mission planning module and generates the planner primitives based on the route plan in combination with the cost map and traffic data.

6. The system of claim 1, wherein the perception module comprises: a cost mapping block that generates the cost map based on an analysis of the sensor data, wherein the cost map comprises a two-dimensional mapping of a set of weights to an associated set of positions on the roadway, wherein each weight corresponds to a risk value of occurrence of an adverse event were the vehicle to be located at the associated position; and at least one of a lane identification block, a lane tracking block, an object identification block, and an object tracking block.

7. The system of claim 1, further comprising a prediction block that estimates future positions of traffic objects relative to the vehicle, based on computed trajectories of the traffic objects and the traffic data, and wherein the behavior planning module generates planner primitives based on the estimated future positions of traffic objects.

8. The system of claim 1, further comprising a finite-state automaton that selects a subset of allowed planner primitives from a set of planner primitives, based on the cost map and traffic data.

9. The system of claim 8, wherein the subset of allowed planner primitives excludes a lane-change primitive, based on traffic data indicating that the roadway is a single-lane roadway.

10. The system of claim 1, wherein the vehicle operator is a remote vehicle operator residing outside the vehicle, further comprising a remote teleoperation interface, comprising a display and a set of inputs, that renders the sensor data to the remote vehicle operator at the display and receives the driver input from the remote vehicle operator at the set of inputs, wherein the set of inputs comprises a steering wheel input, a gas pedal input, a brake pedal input, and a transmission input.

11. The system of claim 10, wherein the system is operable between a teleoperation mode and an autonomous mode, wherein in the teleoperation mode, the behavior planning module generates the planner primitives based on a directive received from the remote vehicle operator; wherein in the autonomous mode, the behavior planning module generates the planner primitives at the decision-making block independently of the remote vehicle operator; and wherein the system transitions between the teleoperation mode and the autonomous mode based on a geographic location of the vehicle.

12. The system of claim 1, wherein the control module further comprises a speed control block and a steering control block, wherein the speed control block outputs a throttle actuator position and a brake actuator position based on the control commands associated with the selected task block, and wherein the steering control block outputs a steering angle based on the control commands associated with the selected task block.

13. A method for controlling a vehicle, comprising: continuously sampling, at a sensor subsystem of the vehicle, sensor data comprising an image stream, a localization signal, and operational data; transmitting the image stream, the localization signal, and the operational data to a remote teleoperation interface associated with a teleoperator; receiving, at a behavior planning module of the vehicle, a first directive from the teleoperator by way of the remote teleoperation interface, wherein the behavior planning module consists essentially of a trained machine-learning model; generating a planner primitive at the behavior planning module based on the first directive; selecting, at a local planning module of the vehicle, a task block based on the planner primitive, wherein the task block consists essentially of an explicitly programmed set of rules; controlling the vehicle, at a control module of the vehicle, based on the selected task block in combination with the sensor data; receiving a second directive from the teleoperator, in response to the vehicle entering a geographic region having a predetermined characteristic; transferring planning authority to the behavior planning module of the vehicle, in response to receiving the second directive; automatically generating a second planner primitive at the behavior planning module based on the sensor data; automatically selecting a second task block, based on the second planner primitive; controlling the vehicle based on the selected second task block in combination with the sensor data; and automatically transferring planning authority to the teleoperator, in response to the vehicle reaching a predetermined geographic location.

14. The method of claim 13, wherein the operational data comprises at least one of the current vehicle speed, current vehicle steering angle, current throttle status, and instantaneous estimated vehicle range.

15. The method of claim 14, further comprising rendering the operational data at the remote teleoperation interface, wherein the remote teleoperation interface is configured to visually simulate the interior of a commercial trucking cabin.

16. The method of claim 13, further comprising training the behavior planning module based on the first directive in combination with the sensor data.

17. The method of claim 13, wherein the geographic region having the predetermined characteristic comprises an end region of a highway on-ramp, and wherein the predetermined geographic location comprises a highway off-ramp.

18. The method of claim 13, wherein the first directive comprises a directive to change lanes to the left, wherein the planner primitive comprises a left-lane-change primitive, wherein the task block comprises a left-lane-change block, and wherein controlling the vehicle comprises generating steering angle instructions, throttling instructions, and braking instructions corresponding to a lane change to a lane at a left side of the vehicle, based on the sensor data and a set of rules of the left-lane-change block, and actuating a steering wheel, gas pedal, and brake pedal of the vehicle according to the respective steering angle instructions, throttling instructions, and braking instructions.

19. The method of claim 13, further comprising receiving a third directive from a local operator residing inside the vehicle, transferring planning authority to the local operator in response to receiving the third directive, and controlling the vehicle based on direct control inputs generated by the local operator.

20. The method of claim 19, further comprising training the behavior planning module based on the direct control inputs in combination with the sensor data.
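Claims 8 and 9 describe a finite-state selector that prunes the set of allowed planner primitives from road context, e.g. excluding lane changes on a single-lane roadway. A minimal sketch of that filtering step follows; the primitive names, the `lane_count` field, and the rule itself are illustrative assumptions, not the patent's implementation.

```python
# Hedged sketch of the allowed-primitive filter from claims 8-9: prune the
# planner primitives the behavior planning module may emit, based on traffic
# data. All names here are illustrative assumptions.

ALL_PRIMITIVES = {"lane_keep", "lane_change_left", "lane_change_right", "stop"}

def allowed_primitives(traffic_data: dict) -> set:
    """Select the subset of planner primitives permitted by the road context."""
    allowed = set(ALL_PRIMITIVES)
    if traffic_data.get("lane_count", 1) < 2:
        # Claim 9: exclude lane-change primitives on a single-lane roadway.
        allowed -= {"lane_change_left", "lane_change_right"}
    return allowed

print(sorted(allowed_primitives({"lane_count": 1})))
```

Filtering before primitive selection keeps the learned decision-making block from ever committing to a maneuver the road geometry forbids.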
Patents cited in this patent (7)
Anderson, Noel Wayne. "Leader-follower fully-autonomous vehicle with operator on side."
Levinson, Jesse Sol; Sibley, Gabriel Thurston; Rege, Ashutosh Gajanan. "Machine-learning systems and techniques to optimize teleoperation and/or planner decisions."
Kleimenhagen, Karl W. (Peoria, IL); Kemner, Carl A. (Peoria Heights, IL); Bradbury, Walter J. (Peoria, IL); Koehrsen, Craig L. (Peoria, IL); Peterson, Joel L. (Peoria, IL); Schmidt, Larry E. (Peoria, IL); Stafford, Dar. "System and method for operating an autonomous navigation system."