Unmanned vehicle (UV) control may include receiving a UV work order and generating a mission request based on the UV work order. The mission request may identify an objective of a mission, assign a UV and a sensor to the mission from a fleet of UVs and sensors, and assign a first movement plan to the mission based on the identified objective of the mission. The assigned UV may be controlled according to the assigned first movement plan, and communication data may be received from the assigned sensor. The communication data may be analyzed to identify an event related to the mission. The identified event and the first movement plan may be analyzed to assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission.
Representative Claims
1. An unmanned vehicle (UV) control system comprising:
a fleet and mission operations controller, executed by at least one hardware processor, to receive a UV work order and to generate a mission request based on the UV work order, the mission request identifying an objective of a mission, assigning a UV and a sensor to the mission from a fleet of UVs and sensors, and assigning a first movement plan to the mission based on the identified objective of the mission, wherein the first movement plan includes predefined way points and alternate points for the UV based on the identified objective of the mission;
a mission controller, executed by the at least one hardware processor, to control the assigned UV according to the assigned first movement plan, and receive communication data from the assigned sensor; and
an event detector, executed by the at least one hardware processor, to receive the communication data, analyze the communication data to identify an event related to the mission, and forward the identified event to the fleet and mission operations controller to analyze the identified event and the first movement plan, and assign a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan, and the second movement plan includes at least one different predefined way point from the predefined way points and at least one different alternate point from the alternate points for the UV based on the analysis of the identified event to meet the identified objective of the mission, and wherein the mission controller is to control the assigned UV according to the assigned second movement plan.

2. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event related to the mission by combining telemetry data and video stream data of the communication data by determining a number of frames per second for the video stream data, determining, from the telemetry data, a time and a location associated with the video stream data, and generating a meta tag for each of the frames of the video stream data, wherein the meta tag includes the time and the location associated with the video stream data.

3. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event that includes a potential leak by converting video stream data of the communication data from red, green, and blue (RGB) into corresponding hue-saturation-values (HSVs) to adjust for variations in lighting conditions and shadows, defining lower and upper bounds of the HSVs based on a type of material associated with the potential leak, analyzing, based on the defined lower and upper bounds, each video frame associated with the video stream data to overlay shapes on each of the video frames associated with the video stream data, and applying parameter constraints to determine whether an area of a video frame associated with the overlayed shapes represents the potential leak.

4. The UV control system according to claim 1, wherein the fleet of UVs includes unmanned aerial vehicles (UAVs).

5. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event that includes a potential leak by implementing a cascade-of-rejecters model that includes a concatenation of a plurality of classifiers to analyze different features associated with the communication data, and identifying the event that includes a potential leak based on a determination of whether the different features associated with the communication data pass all of the plurality of classifiers within the cascade-of-rejecters model.

6. The UV control system according to claim 1, wherein the sensor includes a video camera, and the mission controller is to generate a real-time display from the video camera, receive instructions to modify movement of the UV based on an analysis of the real-time display from the video camera, and modify movement of the UV based on the received instructions.

7. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event that includes an intruder related to a pipeline by accounting for a proportional increase in a dimension associated with a person based on whether the person is moving towards the pipeline or moving away from the pipeline.

8. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event that includes an intruder related to a pipeline by accounting for a decrease in probability of a position of a person transitioning to a next position within a specified time period when a dimensional difference in two positions associated with the person is above a specified threshold.

9. The UV control system according to claim 1, wherein the event detector is to analyze the communication data to identify the event related to a pipeline, and generate instructions for preventative actions with respect to the pipeline based on the identification of the event.

10. The UV control system according to claim 1, wherein the mission controller is to analyze the event to determine a severity level of the event, and generate a real-time display related to the event, wherein the real-time display includes a characterization of a type and the severity level of the event.

11. A method for unmanned vehicle (UV) control, the method comprising:
generating, by a fleet and mission operations controller that is executed by at least one hardware processor, a mission request to identify an objective of a mission, assign a UV and a sensor to the mission from a fleet of UVs and sensors, and assign a first movement plan to the mission based on the identified objective of the mission;
controlling, by a mission controller that is executed by the at least one hardware processor, the assigned UV according to the assigned first movement plan;
receiving, by an event detector that is executed by the at least one hardware processor, communication data;
analyzing, by the event detector, the communication data from the assigned sensor to identify an event that includes a potential leak or an intruder related to a pipeline;
analyzing, by the fleet and mission operations controller, the identified event and the first movement plan;
assigning, by the fleet and mission operations controller, a second movement plan to the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan; and
controlling, by the mission controller, the assigned UV according to the assigned second movement plan.

12. The method for UV control according to claim 11, further comprising: determining, by a compliance evaluator that is executed by the at least one hardware processor, whether the mission request is compliant with regulations, and in response to a determination that the mission request is compliant with regulations, forwarding, from the compliance evaluator, the mission request to the mission controller.

13. The method for UV control according to claim 11, further comprising: generating, by the mission controller, a real-time display related to the event, wherein the real-time display includes a characterization of a type and a severity level of the event.

14. The method for UV control according to claim 11, further comprising: analyzing, by the event detector, the communication data to identify the event related to the pipeline, and generating, by the event detector, instructions for preventative actions with respect to the pipeline based on the identification of the event.

15. The method for UV control according to claim 11, wherein the sensor includes a video camera, the method further comprises: generating, by the mission controller, a real-time display from the video camera, receiving, by the mission controller, instructions to modify movement of the UV based on an analysis of the real-time display from the video camera, and modifying, by the mission controller, movement of the UV based on the received instructions.

16. A non-transitory computer readable medium having stored thereon machine readable instructions for UV control, the machine readable instructions when executed cause at least one hardware processor to:
receive, at a mission controller that is executed by the at least one hardware processor, a mission request that identifies an objective of a mission, assigns a UV and a sensor to the mission from a fleet of UVs and sensors, and assigns a first movement plan to the mission based on the identified objective of the mission;
control, by the mission controller, the assigned UV according to the assigned first movement plan;
receive, by an event detector that is executed by the at least one hardware processor, communication data;
analyze, by the event detector that is executed by the at least one hardware processor, the communication data from the assigned sensor to identify an event related to the mission;
analyze, by a fleet and mission operations controller, the identified event and the first movement plan;
receive, at the mission controller, a second movement plan for the mission based on the analysis of the identified event and the first movement plan to meet the identified objective of the mission, wherein the second movement plan is different than the first movement plan;
analyze, by the mission controller, the event to determine a severity level of the event; and
generate, by the mission controller, a real-time display related to the event, wherein the real-time display includes a characterization of a type and the severity level of the event.

17. The non-transitory computer readable medium according to claim 16, wherein the machine readable instructions to analyze, by the event detector that is executed by the at least one hardware processor, the communication data from the assigned sensor to identify the event related to the mission, when executed, further cause the at least one hardware processor to combine telemetry data and video stream data of the communication data by determining a number of frames per second for the video stream data, determining, from the telemetry data, a time and a location associated with the video stream data, and generating a meta tag for each of the frames of the video stream data, wherein the meta tag includes the time and the location associated with the video stream data.

18. The non-transitory computer readable medium according to claim 16, wherein the machine readable instructions to analyze, by the event detector that is executed by the at least one hardware processor, the communication data from the assigned sensor to identify the event related to the mission, when executed, further cause the at least one hardware processor to identify the event that includes a potential leak by converting video stream data of the communication data from red, green, and blue (RGB) into corresponding hue-saturation-values (HSVs) to adjust for variations in lighting conditions and shadows, defining lower and upper bounds of the HSVs based on a type of material associated with the potential leak, analyzing, based on the defined lower and upper bounds, each video frame associated with the video stream data to overlay shapes on each of the video frames associated with the video stream data, and applying parameter constraints to determine whether an area of a video frame associated with the overlayed shapes represents the potential leak.

19. The non-transitory computer readable medium according to claim 16, wherein the machine readable instructions, when executed, further cause the at least one hardware processor to: determine, by a compliance evaluator that is executed by the at least one hardware processor, whether the mission request is compliant with at least one of regulations and safety parameters; and in response to a determination that the mission request is compliant with the at least one of regulations and safety parameters, forward the mission request to the mission controller.

20. The non-transitory computer readable medium according to claim 16, wherein the machine readable instructions, when executed, further cause the at least one hardware processor to: determine, by a compliance evaluator that is executed by the at least one hardware processor, whether the assigned UV and a UV operation crew associated with the mission request is compliant with at least one of regulations and safety parameters; and in response to a determination that the assigned UV and the UV operation crew associated with the mission request is compliant with the at least one of regulations and safety parameters, forward the mission request to the mission controller.
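Claims 2 and 17 describe fusing telemetry with video by stamping each frame with a time and a location derived from the stream's frames-per-second rate. A minimal sketch of that fusion follows; the function name, the telemetry record shape, and the nearest-sample lookup are illustrative assumptions, since the claims do not fix how the telemetry sample for a frame is chosen.

```python
from datetime import datetime, timedelta

def tag_frames(frame_count, fps, telemetry):
    """Attach a time/location meta tag to each frame index.

    telemetry: list of (timestamp, lat, lon) samples, assumed sorted.
    Each frame's time is derived from the frames-per-second rate;
    its location is taken from the telemetry sample nearest in time
    (a simple nearest-neighbour choice, not specified by the claims).
    """
    tags = []
    for i in range(frame_count):
        t = telemetry[0][0] + timedelta(seconds=i / fps)
        # pick the telemetry sample closest in time to this frame
        ts, lat, lon = min(telemetry,
                           key=lambda s: abs((s[0] - t).total_seconds()))
        tags.append({"frame": i, "time": t, "lat": lat, "lon": lon})
    return tags
```

Downstream analysis can then report a detected event with the geographic position and timestamp of the frame it appeared in, which is what makes the meta tag useful for dispatching a second movement plan.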
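Claims 3 and 18 outline the leak-detection pipeline: convert RGB video to HSV (to reduce sensitivity to lighting and shadows), threshold against material-specific HSV bounds, and apply a parameter constraint such as minimum area. The sketch below is a deliberately simplified, stdlib-only reading of those steps; the bounds, the pixel-count area constraint, and the function name are assumptions (a production system would more likely use an image library and contour overlays, per the claims' "overlayed shapes").

```python
import colorsys

def detect_potential_leak(frame, lower, upper, min_area):
    """frame: rows of (r, g, b) tuples in 0..255.
    lower/upper: (h, s, v) bounds in 0..1, chosen per leaked material
    (hypothetical values; the claims only say bounds are material-based).
    Returns True when at least min_area pixels fall inside the bounds,
    i.e. the 'parameter constraint' on the flagged area is satisfied."""
    hits = 0
    for row in frame:
        for r, g, b in row:
            # RGB -> HSV, the colour space the claims use to tolerate
            # lighting variation and shadows
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            if (lower[0] <= h <= upper[0] and
                    lower[1] <= s <= upper[1] and
                    lower[2] <= v <= upper[2]):
                hits += 1
    return hits >= min_area
```

For example, with bounds admitting only dark, low-value pixels, a frame whose dark region exceeds `min_area` pixels is flagged as a potential leak, while smaller dark patches are rejected by the area constraint.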
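Claim 5 names a cascade-of-rejecters model: a concatenation of classifiers where an event is flagged only if the features pass every stage. The value of the cascade is that any stage can reject early, so cheap tests filter out most frames before expensive ones run. A minimal sketch, with the stage predicates and feature names being hypothetical examples rather than the patent's actual classifiers:

```python
def passes_cascade(features, stages):
    """Run a feature dict through a concatenation of classifiers.

    Each stage is a predicate over the features; all() short-circuits,
    so the first rejecting stage stops evaluation. Only features that
    pass every stage are flagged as a potential leak.
    """
    return all(stage(features) for stage in stages)

# Hypothetical stages, ordered cheapest first:
stages = [
    lambda f: f["dark_pixel_ratio"] > 0.05,  # any candidate pixels at all?
    lambda f: f["blob_area"] > 40,           # largest region big enough?
    lambda f: f["texture_score"] < 0.3,      # smooth, liquid-like surface?
]
```

Ordering stages from cheap to expensive is the usual design choice for such cascades, since most inputs are rejected in the first stage or two.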
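Claim 7 accounts for the proportional change in a tracked person's apparent dimension to tell movement toward the pipeline from movement away. One simple way to read that, sketched below under an assumed geometry where approaching the pipeline means approaching the camera: compare a bounding-box dimension across frames and classify by its growth ratio. The function, the tolerance, and the camera geometry are all assumptions, not details from the claims.

```python
def approach_direction(box_heights, tol=0.05):
    """Classify movement from a sequence of bounding-box heights.

    A person moving toward the camera appears proportionally larger
    frame over frame; a shrinking box suggests movement away. tol is
    a hypothetical dead-band so jitter is not misread as motion.
    Returns 'toward', 'away', or 'static'.
    """
    ratio = box_heights[-1] / box_heights[0]
    if ratio > 1 + tol:
        return "toward"
    if ratio < 1 - tol:
        return "away"
    return "static"
```

Claim 8's companion rule would then discount a candidate track whose size jumps too much between two positions, since so large a dimensional difference makes that transition improbable within the time window.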
Patents cited by this patent (7)
Yavnai, Arie (IL), "Autonomous command and control unit for mobile platform."
Yee, David Moon; Bickley, Robert Henry; Brenner, Charles Herbert; Zucarelli, Philip John; Keller, Theodore Wolley; Moyer, Christopher Kent, "GPS based search and rescue system."
Gorr, Russell E.; Hancock, Thomas R.; Judd, J. Stephen; Lin, Long-Ji; Novak, Carol L.; Rickard, Scott T., Jr., "Method and apparatus for automatically tracking the location of vehicles."