IPC Classification Information

Country / Type | United States (US) Patent, Granted
International Patent Classification (IPC 7th ed.) |
Application Number | US-0688149 (2007-03-19)
Registration Number | US-8335345 (2012-12-18)
Inventors / Address |
- White, Marvin S.
- Alt, Alina
Applicant / Address |
Agent / Address | Vierra Magen Marcus & DeNiro LLP
Citation Information | Cited by: 15 / Patents cited: 87
Abstract
The path and/or position of an object is tracked using two or more cameras which run asynchronously, so there is no need to provide a common timing signal to each camera. Captured images are analyzed to detect a position of the object in the image. Equations of motion for the object are then solved based on the detected positions and a transformation which relates the detected positions to a desired coordinate system in which the path is to be described. The position of an object can also be determined from a position which meets a distance metric relative to lines of position from three or more images. The images can be enhanced to depict the path and/or position of the object as a graphical element. Further, statistics such as maximum object speed and distance traveled can be obtained. Applications include tracking the position of a game object at a sports event.
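The linear solve the abstract describes can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes simplified affine (2x4) camera matrices so that pixel coordinates are exactly linear in the nine motion coefficients (a real pinhole camera adds a perspective division, which the patent handles separately), and all camera matrices, time points and motion values here are hypothetical.

```python
import numpy as np

# Hypothetical ground-truth motion coefficients:
# w(t) = p0 + v0*t + 0.5*a*t^2 in world coordinates.
p0 = np.array([0.0, 1.0, 2.0])
v0 = np.array([3.0, 0.5, 4.0])
a  = np.array([0.0, 0.0, -9.8])
coeffs_true = np.concatenate([p0, v0, a])  # the coefficient matrix A, flattened

def world_pos(t):
    return p0 + v0 * t + 0.5 * a * t**2

# Two hypothetical affine cameras, each a 2x4 matrix mapping homogeneous
# world coordinates to pixel coordinates (sx, sy).
P1 = np.array([[800.0,   0.0, 10.0, 320.0],
               [  0.0, 790.0, 12.0, 240.0]])
P2 = np.array([[  5.0, 810.0,  0.0, 300.0],
               [770.0,   0.0, 15.0, 250.0]])

# Asynchronous capture: the two cameras sample at unrelated time points.
times_cam1 = np.array([0.00, 0.31, 0.62, 0.95])
times_cam2 = np.array([0.11, 0.47, 0.80])

rows, rhs = [], []
for P, times in ((P1, times_cam1), (P2, times_cam2)):
    for t in times:
        s = P[:, :3] @ world_pos(t) + P[:, 3]  # observed pixel position (matrix S)
        # B maps the 9 motion coefficients to the world position at time t.
        B = np.hstack([np.eye(3), t * np.eye(3), 0.5 * t**2 * np.eye(3)])
        rows.append(P[:, :3] @ B)              # the matrix U for this image
        rhs.append(s - P[:, 3])

U = np.vstack(rows)                            # all U matrices stacked
S = np.concatenate(rhs)                        # all pixel observations stacked
coeffs, *_ = np.linalg.lstsq(U, S, rcond=None) # SVD-based least squares

print(np.allclose(coeffs, coeffs_true, atol=1e-6))  # True
```

Because every image simply contributes rows to the same stacked system, the cameras never need a shared shutter clock; only the time stamp of each image matters, which is the point of the asynchronous formulation.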
Representative Claims
1. A method for determining a path of a moving object, comprising: receiving images of the moving object from at least first and second cameras at different time points during a time interval, as the moving object travels in a path during the time interval, the at least first and second cameras capture the images asynchronously, the first camera has a pixel space which is related to a world coordinate system by an associated transformation matrix and the second camera has a pixel space which is related to the world coordinate system by an associated transformation matrix; using one or more processors to provide a matrix S, for each image captured by the first camera, which indicates a position of the moving object in the pixel space of the first camera, and for each image captured by the second camera, which indicates a position of the moving object in the pixel space of the second camera; using the one or more processors to solve for coefficients of equations of motion of the moving object to provide solved coefficients, based on the positions, the equations of motion with the solved coefficients describe the path of the moving object during the time interval, the coefficients of the equations of motion include velocity and acceleration coefficients and are in a matrix A, and the solving includes providing a matrix U for each image which relates the matrix A to the matrix S of the image by an equation U×Aᵀ = Sᵀ, where Aᵀ is a transpose of the matrix A and Sᵀ is a transpose of the matrix S of the image, and solving for the matrix A from the equation U×Aᵀ = Sᵀ; and enhancing a video signal based on the equations of motion to depict the path of the moving object.

2. The method of claim 1, wherein: each position represents a point on a line of position from one of the at least first and second cameras to the moving object.

3. The method of claim 2, wherein: the solving comprises satisfying a least square error criteria for errors between: (a) the lines of position and (b) positions of the moving object which are based on the equations of motion.

4. The method of claim 1, wherein: the equations of motion with the solved coefficients describe the path of the moving object in three orthogonal directions in the world coordinate system at any time t during the time interval as a function of the time t.

5. The method of claim 4, wherein: the world coordinate system has orthogonal x-, y- and z-directions; and the matrix A includes coefficients, including: (a) x0, y0 and z0, which are positions in the x-, y- and z-directions, respectively, for a first of the images, (b) vx0, vy0 and vz0, which are velocities in the x-, y- and z-directions, respectively, for the first of the images, and (c) ax, ay and az, which are accelerations in the x-, y- and z-directions, respectively.

6. The method of claim 5, wherein: the equations of motion include: wx(t) = x0 + vx0·t + (½)·ax·t², wy(t) = y0 + vy0·t + (½)·ay·t², and wz(t) = z0 + vz0·t + (½)·(az + g)·t², where g denotes gravitational acceleration, and wx(t), wy(t) and wz(t) denote positions of the moving object in the x-, y- and z-directions, respectively, at the time t.

7. The method of claim 4, wherein: (a) for each image captured by the first camera: the matrix S of the image includes horizontal and vertical pixel coordinates, sx and sy, respectively, of the position of the moving object in the pixel space of the first camera; and the matrix U for the image has elements which include sx and sy in the pixel space of the first camera, the time t, and elements of the associated transformation matrix of the first camera; and (b) for each image captured by the second camera: the matrix S of the image includes horizontal and vertical pixel coordinates, sx and sy, respectively, of the position of the moving object in the pixel space of the second camera; and the matrix U for the image has elements which include sx and sy in the pixel space of the second camera, the time t, and elements of the associated transformation matrix of the second camera.

8. The method of claim 1, wherein: the coefficients describe a position, velocity and acceleration of the moving object along the path of the moving object at any time during the time interval.

9. The method of claim 1, further comprising: identifying the different time points at which the images are received using a common clock signal which is independent of the at least first and second cameras.

10. The method of claim 1, wherein: the equations of motion are linear equations and the solving uses Singular Value Decomposition.

11. The method of claim 1, wherein: the equations of motion include at least one non-linear equation, and the solving comprises performing a computation which starts with a linear estimate and is completed using a Levenberg-Marquardt method with the linear estimate as a starting point.

12. The method of claim 1, wherein: the enhancing comprises using the equations of motion to determine locations of the moving object along the path at time points which are between successive time points of the different time points.

13. The method of claim 1, wherein: the matrix A is solved simultaneously for all of the images.

14. The method of claim 1, wherein: a number N of the images are received; N of the matrices S are provided, one for each image, including matrices (sx0 sy0), (sx1 sy1), …, (sxN−1 syN−1), where sx0, sx1, …, sxN−1 and sy0, sy1, …, syN−1 are horizontal and vertical pixel coordinates, respectively, of the positions of the moving object in the pixel spaces; N of the matrices U are provided, one for each image, including matrices U0, U1, …, UN−1; and one of the matrices A is provided.

15. The method of claim 14, wherein the solving includes solving an equation: [U0; U1; …; UN−1] × Aᵀ = [(sx0 sy0)ᵀ; (sx1 sy1)ᵀ; …; (sxN−1 syN−1)ᵀ], where the semicolons denote vertical stacking.

16. A system for determining a path of a moving object used at an event, comprising: at least first and second cameras capturing images of the moving object at the event at different time points during a time interval to provide captured images, as the moving object travels in a path during the time interval, the at least first and second cameras capture the images asynchronously, the first camera has a pixel space which is related to a world coordinate system by an associated transformation matrix and the second camera has a pixel space which is related to the world coordinate system by an associated transformation matrix; and at least one processor which receives the captured images from the at least first and second cameras, the at least one processor: a) provides a matrix S, for each image captured by the first camera, which indicates a position of the moving object in the pixel space of the first camera, and for each image captured by the second camera, which indicates a position of the moving object in the pixel space of the second camera, and b) solves for coefficients of equations of motion of the moving object to provide solved coefficients, based on the positions, the equations of motion with the solved coefficients describe the path of the moving object during the time interval, the coefficients of the equations of motion include velocity and acceleration coefficients and are in a matrix A, and the at least one processor solves for the coefficients of equations of motion by providing a matrix U for each image which relates the matrix A to the matrix S of the image by an equation U×Aᵀ = Sᵀ, where Aᵀ is a transpose of the matrix A and Sᵀ is a transpose of the matrix S of the image, and solves for the matrix A from the equation U×Aᵀ = Sᵀ.

17. The system of claim 16, wherein: each position represents a point on a line of position from one of the at least first and second cameras to the moving object.

18. The system of claim 17, wherein: the at least one processor solves for the coefficients in accordance with a least square error criteria for errors between: (a) the lines of position and (b) positions of the moving object which are based on the equations of motion.

19. The system of claim 16, wherein: the equations of motion with the solved coefficients describe the path of the moving object in three orthogonal directions in the world coordinate system at any time t during the time interval as a function of the time t.

20. The system of claim 19, wherein: (a) for each image captured by the first camera: the matrix S of the image includes horizontal and vertical pixel coordinates, sx and sy, respectively, of the position of the moving object in the pixel space of the first camera; and the matrix U for the image has elements which include sx and sy in the pixel space of the first camera, the time t, and elements of the associated transformation matrix of the first camera; and (b) for each image captured by the second camera: the matrix S of the image includes horizontal and vertical pixel coordinates, sx and sy, respectively, of the position of the moving object in the pixel space of the second camera; and the matrix U for the image has elements which include sx and sy in the pixel space of the second camera, the time t, and elements of the associated transformation matrix of the second camera.

21. The system of claim 16, wherein: the coefficients describe a position, velocity and acceleration of the moving object along the path of the moving object at any time during the time interval.

22. The system of claim 16, wherein: the at least one processor identifies the different time points at which the captured images are received from the at least first and second cameras using a common clock signal which is independent of the at least first and second cameras.

23. The system of claim 16, wherein: the captured images are from video of the moving object, and the at least one processor enhances the video to depict at least a portion of the path of the moving object.

24. The system of claim 16, wherein: the at least one processor enhances a video signal based on the equations of motion to depict the path of the moving object, the enhancing comprises displaying an on-screen graphic which depicts the path.

25. The system of claim 16, wherein: the at least one processor uses the equations of motion to determine locations of the moving object along the path at time points which are between successive time points of the different time points.

26. The system of claim 16, wherein: a number N of the captured images are received by the at least one processor; N of the matrices S are provided, one for each captured image, including matrices (sx0 sy0), (sx1 sy1), …, (sxN−1 syN−1), where sx0, sx1, …, sxN−1 and sy0, sy1, …, syN−1 are horizontal and vertical pixel coordinates, respectively, of the positions of the moving object in the pixel spaces; N of the matrices U are provided, one for each captured image, including matrices U0, U1, …, UN−1; and one of the matrices A is provided.

27. The system of claim 26, wherein the at least one processor solves for the matrix A by solving an equation: [U0; U1; …; UN−1] × Aᵀ = [(sx0 sy0)ᵀ; (sx1 sy1)ᵀ; …; (sxN−1 syN−1)ᵀ], where the semicolons denote vertical stacking.

28. At least one processor readable storage device having processor readable code embodied thereon for programming at least one processor to perform a method, the method comprising: receiving images of a moving object from at least first and second cameras at different time points during a time interval, as the moving object travels in a path during the time interval, the at least first and second cameras capture the images asynchronously, the first camera has a pixel space which is related to a world coordinate system by an associated transformation matrix and the second camera has a pixel space which is related to the world coordinate system by an associated transformation matrix; providing a matrix S, for each image captured by the first camera, which indicates a position of the moving object in the pixel space of the first camera, and for each image captured by the second camera, which indicates a position of the moving object in the pixel space of the second camera; and solving for coefficients of equations of motion of the moving object to provide solved coefficients, based on the positions, the equations of motion with the solved coefficients describe the path of the moving object during the time interval, the coefficients of the equations of motion include velocity and acceleration coefficients and are in a matrix A, and the solving includes providing a matrix U for each image which relates the matrix A to the matrix S of the image by an equation U×Aᵀ = Sᵀ, where Aᵀ is a transpose of the matrix A and Sᵀ is a transpose of the matrix S of the image, and solving for the matrix A from the equation U×Aᵀ = Sᵀ.

29. The at least one processor readable storage device of claim 28, wherein: each position represents a point on a line of position from one of the first and second cameras to the moving object.

30. The at least one processor readable storage device of claim 29, wherein: the solving comprises satisfying a least square error criteria for errors between: (a) the lines of position and (b) positions of the moving object which are based on the equations of motion.

31. The at least one processor readable storage device of claim 28, wherein: the equations of motion with the solved coefficients describe the path of the moving object in three orthogonal directions in the world coordinate system at any time t during the time interval as a function of the time t.

32. The at least one processor readable storage device of claim 31, wherein: (a) for each image captured by the first camera: the matrix S of the image includes horizontal and vertical pixel coordinates, sx and sy, respectively, of the position of the moving object in the pixel space of the first camera; and the matrix U for the image has elements which include sx and sy in the pixel space of the first camera, the time t, and elements of the associated transformation matrix of the first camera; and (b) for each image captured by the second camera: the matrix S of the image includes horizontal and vertical pixel coordinates, sx and sy, respectively, of the position of the moving object in the pixel space of the second camera; and the matrix U for the image has elements which include sx and sy in the pixel space of the second camera, the time t, and elements of the associated transformation matrix of the second camera.

33. The at least one processor readable storage device of claim 28, wherein: the coefficients describe a position, velocity and acceleration of the moving object along the path of the moving object at any time during the time interval.

34. The at least one processor readable storage device of claim 28, wherein: the images are from video of the moving object, and the method performed further comprises enhancing the video to depict at least a portion of the path of the moving object.

35. The at least one processor readable storage device of claim 27, wherein: the method performed further comprises providing statistics regarding the path of the moving object.

36. The at least one processor readable storage device of claim 28, wherein the method performed further comprises: enhancing a video signal based on the equations of motion to depict the path of the moving object, the enhancing comprises determining locations of the moving object along the path at time points which are between successive time points of the different time points.

37. The at least one processor readable storage device of claim 28, wherein: a number N of the images are received; N of the matrices S are provided, one for each image, including matrices (sx0 sy0), (sx1 sy1), …, (sxN−1 syN−1), where sx0, sx1, …, sxN−1 and sy0, sy1, …, syN−1 are horizontal and vertical pixel coordinates, respectively, of the positions of the moving object in the pixel spaces; N of the matrices U are provided, one for each image, including matrices U0, U1, …, UN−1; and one of the matrices A is provided.

38. The at least one processor readable storage device of claim 37, wherein the solving includes solving an equation: [U0; U1; …; UN−1] × Aᵀ = [(sx0 sy0)ᵀ; (sx1 sy1)ᵀ; …; (sxN−1 syN−1)ᵀ], where the semicolons denote vertical stacking.
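Claims 10 and 11 distinguish a purely linear solve (via Singular Value Decomposition) from a non-linear one completed with a Levenberg-Marquardt refinement seeded by a linear estimate. The sketch below illustrates the non-linear case under stated assumptions: the 3x4 pinhole camera matrices, time points and motion values are hypothetical, SciPy's Levenberg-Marquardt implementation stands in for whatever solver the patent contemplates, and a perturbed ground truth stands in for the linear estimate.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical true motion coefficients: w(t) = p0 + v0*t + 0.5*a*t^2.
coeffs_true = np.array([0.0, 1.0, 20.0,   # p0
                        3.0, 0.5, 4.0,    # v0
                        0.0, 0.0, -9.8])  # a

def world_pos(c, t):
    p0, v0, a = c[:3], c[3:6], c[6:9]
    return p0 + v0 * t + 0.5 * a * t**2

# Two hypothetical 3x4 pinhole projection matrices; the perspective
# division makes pixel coordinates non-linear in the coefficients.
M1 = np.array([[900.0,   0.0, 320.0, 100.0],
               [  0.0, 900.0, 240.0, 200.0],
               [  0.0,   0.0,   1.0,  30.0]])
M2 = np.array([[  0.0, 880.0, 300.0, 150.0],
               [880.0,   0.0, 260.0, 120.0],
               [  0.0,   0.1,   1.0,  25.0]])

def project(M, w):
    q = M @ np.append(w, 1.0)
    return q[:2] / q[2]                     # perspective division

# Asynchronous observations: each camera samples at its own time points.
obs = [(M1, t) for t in (0.0, 0.3, 0.6, 0.9)] + \
      [(M2, t) for t in (0.1, 0.5, 0.8)]
pixels = [project(M, world_pos(coeffs_true, t)) for M, t in obs]

def residuals(c):
    # Reprojection error of the candidate coefficients against all images.
    return np.concatenate([project(M, world_pos(c, t)) - s
                           for (M, t), s in zip(obs, pixels)])

# A linear estimate would normally seed the refinement (claim 11); a
# perturbed copy of the truth stands in for it here.
c0 = coeffs_true + 0.5
fit = least_squares(residuals, c0, method='lm')  # Levenberg-Marquardt
print(np.allclose(fit.x, coeffs_true, atol=1e-3))
```

The design mirrors the claim structure: the linear stage gives a cheap global starting point, and the Levenberg-Marquardt stage absorbs the non-linearity introduced by the perspective division.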