Systems and methods for controlling vehicle position and orientation
Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition): G06T-007/00; G05D-001/02; G06T-007/73; G06T-007/10; G06T-007/194
Application number: US-0788239 (2015-06-30)
Registration number: US-9734583 (2017-08-15)
Inventors / Address: Walker, Collin; Walker, Gary
Applicant / Address: Walker, Collin
Agent / Address: TraskBritt
Citation information: cited by 0 patents; cites 15 patents
Abstract
Systems and related methods for controlling vehicle position and movement are disclosed. A system includes one or more computers configured to receive data corresponding to video image frames of at least a portion of a course, and analyze the video image frame to estimate at least one of position and orientation of the vehicle within the course. A method includes identifying groups of pixels from the video image frames that are determined to correspond to non-background objects in the course. The method also includes correlating at least a portion of the identified groups of pixels with known objects in the course, and analyzing positions of the groups of pixels within the video image frames and known positions of at least one of an image capture device and known objects relative to the course to estimate at least one of the position and orientation of the vehicle within the course.
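The abstract's first step, identifying groups of pixels that correspond to non-background objects, can be illustrated with a minimal pure-Python sketch. This is not the patent's implementation: a per-pixel median over recent frames stands in for the dynamic-background model (the claims mention ViBe), and a BFS connected-components pass stands in for the pixel-group identification. All function names and the threshold value are illustrative assumptions.

```python
from statistics import median
from collections import deque

def estimate_background(frames):
    """Per-pixel median over a window of grayscale frames (nested lists):
    a crude stand-in for the patent's dynamic-background model."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[median(f[y][x] for f in frames) for x in range(w)] for y in range(h)]

def foreground_mask(frame, background, threshold=30):
    """Mark pixels that differ noticeably from the background model."""
    return [[abs(p - b) > threshold for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def pixel_groups(mask):
    """Group foreground pixels into 4-connected blobs; return each blob's
    centroid and size, analogous to the 'identified groups of pixels'."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                queue, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                mean_y = sum(p[0] for p in pixels) / len(pixels)
                mean_x = sum(p[1] for p in pixels) / len(pixels)
                blobs.append({"centroid": (mean_y, mean_x), "size": len(pixels)})
    return blobs
```

The blob centroids produced here are what a downstream step would attempt to correlate with known course objects such as buoys.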
Representative Claims
1. A method of controlling water vehicle position and movement, comprising:
  capturing, with one or more cameras secured to the water vehicle, video frame data corresponding to video image frames taken of at least a portion of a water course;
  identifying, with a computer, groups of pixels from the video frame data, the identified groups of pixels determined to correspond to objects in the water course;
  correlating at least a pair of the identified groups of pixels with a known pair of buoys on each side of the water course;
  estimating at least one of a position or an orientation of the water vehicle within the water course by analyzing positions of the groups of pixels within the video image frames and known spatial positions of the one or more cameras that captured the video frame data and the known pair of buoys in the water course;
  estimating at least one of steering parameters and acceleration parameters that are determined to navigate the water vehicle on a desired path through the water course; and
  transmitting, with the computer, navigation data including the at least one of the steering parameters and the acceleration parameters to at least one of:
    a user interface configured to provide the at least one of the steering parameters and the acceleration parameters to a driver of the water vehicle; or
    an automatic operating device configured to receive the navigation data and automatically adjust at least one of steering and acceleration of the water vehicle according to the at least one of the steering parameters and the acceleration parameters.
2. The method of claim 1, wherein estimating at least one of a position or an orientation of a water vehicle further comprises analyzing sensor data from one or more sensors to determine the at least one of the position or the orientation of the water vehicle.
3.
The method of claim 1, wherein correlating at least a pair of the identified groups of pixels with a known pair of buoys in the water course comprises:
  assigning a score to a plurality of pairs of the group of pixels of the identified groups of pixels for its fitness for correlation with each known pair of buoys; and
  correlating a different pair of the identified group of pixels with each known pair of buoys having a best score for the correlated known pair of buoys assigned thereto.
4. The method of claim 1, wherein correlating at least a pair of the identified groups of pixels with a known pair of buoys in the water course comprises assigning a different pair of groups of pixels to each of a plurality of known pairs of buoys in the water course.
5. The method of claim 1, wherein correlating at least a pair of the identified groups of pixels with a known pair of buoys in the water course comprises eliminating at least one identified group of pixels from consideration for correlation with the known pair of buoys if the at least one identified group of pixels has properties that are inconsistent with properties of the known pair of buoys.
6. The method of claim 3, wherein assigning the score to the plurality of pairs of the group of pixels for its fitness for correlation with each known object of at least a portion of the known objects is based, at least in part, on a slope of a line passing through each group of pixels for a given pair to be evaluated for correlation.
7. The method of claim 3, wherein assigning the score to the plurality of pairs of the group of pixels is based, at least in part, on determining a number of pixels within each group of pixels for a given pair to be evaluated for correlation.
8.
The method of claim 3, wherein assigning the score to the plurality of pairs of the group of pixels is based, at least in part, on determining that a given pair to be evaluated for correlation is located relatively close to its location in a previous video image frame of the video frame data.
9. The method of claim 3, wherein assigning the score to the plurality of pairs of the group of pixels is based, at least in part, on determining that a given pair to be evaluated for correlation has a pixel width between each of the groups that is wider and located toward a bottom portion of the video image frame in comparison with another pair that has a pixel width between each of the groups of pixels that is narrower and located toward an upper portion of the video image frame.
10. The method of claim 1, wherein identifying groups of pixels determined to correspond to objects in a water course comprises using a background subtraction process to differentiate the groups of pixels determined to correspond to the non-background objects in the water course from background pixels of a dynamic background.
11. The method of claim 10, wherein using a background subtraction process comprises using a visual background extractor (ViBe) process.
12.
A system, comprising:
  one or more image capture devices configured to capture video image frame data of a course that a vehicle is to navigate;
  one or more computers configured to receive the video image frame data from the one or more image capture devices, and execute computer-readable instructions including:
    a segmentation software module configured to identify groups of pixels from the video image frame data that correspond to determined objects in the course that are distinguished from dynamically changing background data;
    a model fitting software module configured to group together the identified groups of pixels into one or more pairs, and correlate pairs of known objects on each side of the course with the one or more pairs of the groups of pixels identified by the segmentation software module; and
    a pose estimation software module configured to estimate at least one of a position or orientation of the vehicle by analyzing image locations of the object pairs of groups of pixels within the video image frame data and known locations of at least one of:
      the one or more image capture devices; or
      the pair of known objects correlated with the groups of pixels by the model fitting software module; and
  one or more interface devices operably coupled to the one or more computers and configured to at least partially automate navigation of the vehicle through the course.
13. The system of claim 12, wherein the one or more interface devices configured to at least partially automate navigation of the vehicle through the course include a user interface configured to provide perceptible feedback to a driver of the vehicle, the perceptible feedback configured to provide information instructing the driver of the vehicle to perform at least one manual control of the vehicle selected from the group consisting of steering the vehicle, accelerating the vehicle, and decelerating the vehicle.
14.
The system of claim 13, wherein the user interface is configured to provide at least one of visual feedback, audible feedback, and haptic feedback to the driver of the vehicle.
15. The system of claim 12, wherein the one or more interface devices configured to at least partially automate navigation of the vehicle through the course include one or more automatic operating devices operably coupled to the one or more computers, and wherein the one or more computers include a control module configured to control the one or more automatic operating devices.
16. The system of claim 15, wherein the one or more automatic operating devices include a steering element of the vehicle operably coupled to an electrically controlled motor configured to adjust the steering element of the vehicle responsive to control signals received from the control module.
17. A system comprising:
  a water vehicle comprising:
    an image capture device secured to the water vehicle, the image capture device configured to capture video image frames of at least a portion of a water course;
    a computer operably coupled to the image capture device, the computer comprising at least one processing element operably coupled to at least one data storage device including computer-readable instructions stored thereon, the at least one processing element configured to execute the computer-readable instructions, the computer-readable instructions configured to instruct the at least one processing element to:
      identify groups of pixels from the video image frames that correspond to determined non-background objects in the water course;
      correlate pairs of the identified groups of pixels with known buoy pairs on each side of the water course according to one or more scoring criteria for determining a likelihood of a match between the identified groups of pixels that are grouped together and the known buoy pairs; and
      estimate at least one of position or orientation of the water vehicle within the water course by comparing
locations, within the video image frames, of the pairs of the identified groups of pixels that correlate with the known buoy pairs in the water course to known locations of the known buoy pairs in the water course; and
  a user interface configured to provide human-perceptible feedback to a driver of the water vehicle to indicate corrections to be made to at least one of a position, a direction, and a speed of the water vehicle.
18. A method of controlling water vehicle position and movement, comprising:
  capturing, with one or more cameras from a stationary location remote to the water vehicle, video frame data corresponding to video image frames taken of at least a portion of a water course;
  identifying, with a computer, groups of pixels from a static background of the video frame data;
  determining navigation data including a position of the water vehicle within the water course by analyzing positions of the groups of pixels associated with the moving water vehicle within the video image frames and known additional pairs of groups of pixels identified by the computer that are correlated with known buoy pairs on each side of the water course;
  transmitting, with the computer, at least a portion of the navigation data from the computer located at the stationary location to another computer located on-board the water vehicle; and
  automatically adjusting at least one of steering or acceleration of the water vehicle responsive to the navigation data received from the computer located at the remote location and orientation data determined by a device located on-board the water vehicle.
19. The method of claim 18, wherein identifying groups of pixels corresponding to an object includes identifying lights mounted to the water vehicle at known locations for the computer to use while determining the navigation data.
20.
The method of claim 19, wherein identifying the lights includes distinguishing the lights from other objects corresponding to other groups of pixels within the video frame data by adjusting at least one parameter of the lights including at least one of:
  controlling a color of the lights;
  controlling a brightness of the lights;
  controlling a flashing of the lights according to a predetermined pattern;
  controlling a flashing of the lights by turning the lights on and off for a predetermined time; or
  controlling a flashing of the lights according to a pseudo random number generator having a seed known by both a controller controlling the lights and the computer.
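Claims 6 through 9 describe scoring criteria for matching candidate blob pairs to known buoy pairs: the slope of the line through the pair (buoys float level, so near-horizontal is better), the pixel count of each blob, proximity to the pair's location in the previous frame, and the perspective cue that nearer pairs appear wider and lower in the image. A minimal sketch combining these four terms follows; the weighting, the exact form of each term, and the neutral value used when no frame history exists are all illustrative assumptions, not taken from the patent.

```python
import math

def pair_score(a, b, frame_height, prev_midpoint=None):
    """Heuristic fitness score for matching a pair of pixel blobs to a
    known buoy pair, loosely following claims 6-9.

    a, b: dicts with 'centroid' (row, col) and 'size' (pixel count).
    Higher scores indicate a more plausible buoy pair.
    """
    (ay, ax), (by, bx) = a["centroid"], b["centroid"]

    # Claim 6: slope of the line through the two blobs; a near-horizontal
    # line (buoys at the same water level) scores higher.
    slope = abs(ay - by) / (abs(ax - bx) + 1e-9)
    slope_term = 1.0 / (1.0 + slope)

    # Claim 7: pixel counts; mismatched or tiny blobs are likely noise.
    size_term = min(a["size"], b["size"]) / (max(a["size"], b["size"]) + 1e-9)

    # Claim 8: proximity to the pair's midpoint in the previous frame.
    mid = ((ay + by) / 2, (ax + bx) / 2)
    if prev_midpoint is not None:
        dist = math.hypot(mid[0] - prev_midpoint[0], mid[1] - prev_midpoint[1])
        temporal_term = 1.0 / (1.0 + dist)
    else:
        temporal_term = 0.5  # neutral score when no history exists

    # Claim 9: nearer pairs appear wider and lower in the frame
    # (image rows grow downward), so reward width combined with lowness.
    width = abs(ax - bx)
    lowness = mid[0] / frame_height
    perspective_term = (width / 100.0) * lowness

    return slope_term + size_term + temporal_term + perspective_term
```

In a full matcher, every candidate pair would be scored against every known buoy pair and the best-scoring assignments kept, as claims 3 and 4 describe.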
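Claim 20's last option, flashing the vehicle's lights according to a pseudo-random number generator whose seed is shared by the light controller and the vision computer, can be sketched as follows. The frame-synchronous on/off model and the 50% duty probability are illustrative assumptions; the patent does not specify them.

```python
import random

def flash_schedule(seed, num_frames):
    """On/off state of the identification lights for each frame, derived
    deterministically from a PRNG seed shared between the light controller
    on the vehicle and the vision computer."""
    rng = random.Random(seed)
    return [rng.random() < 0.5 for _ in range(num_frames)]

def matches_schedule(observed, seed):
    """The computer regenerates the schedule from the shared seed and checks
    whether a candidate blob's observed on/off pattern matches it, thereby
    distinguishing the vehicle's lights from other bright objects."""
    return observed == flash_schedule(seed, len(observed))
```

Because both sides derive the same sequence from the seed, no pattern needs to be transmitted, yet an arbitrary reflection or buoy light is very unlikely to reproduce it.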
Patents cited by this patent (15)
Barker, Christopher; Kunz, Christian; Combs, John A.; Park, John C. S. (New York, NY), "Apparatus for a video marine navigation plotter with electronic charting and methods for use therein."