IPC Classification Information

Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): (not listed)
Application No.: US-0986661 (2011-01-07)
Registration No.: US-8837782 (2014-09-16)
Inventors / Address:
- Rosenwinkel, Alan M.
- Mercurio, Jonathan
- Bucha, Kellie
Applicant / Address:
- Lockheed Martin Corporation
Agent / Address: (not listed)
Citation Information: cited by 7 patents; cites 16 patents
Abstract
Data or electrooptic sensor (EOS) images are made of a star field and at least one, and possibly multiple, Earth satellites associated therewith. Calculations performed on the imaged locations of a satellite and two stars of a star field provide all the information needed to identify the observer's position. When the ephemerides of the satellite(s) are less accurately known, calculations performed on the imaged locations of at least two satellites and four stars of a star field provide all the information needed to identify the observer's position, because the along-track and cross-track ephemerides errors are different. Thus, the cross-track information of multiple satellites is preferentially used to determine the geolocation.
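The calculation begins with the measurement the claims below spell out: a focal-plane pixel location of a star or satellite is converted to a unit line-of-sight vector in the camera frame. A minimal Python sketch of that mapping (NumPy assumed; the function name and axis convention are illustrative, not from the patent):

```python
import numpy as np

def pixel_to_los(x_pix, y_pix, rad_per_pix_x, rad_per_pix_y):
    """Unit line-of-sight vector in the camera frame for an object imaged
    at focal-plane pixel (x_pix, y_pix); the scales are radians per pixel.
    Mirrors the S_CAM / U_CAM expressions in the representative claims."""
    ax = x_pix * rad_per_pix_x  # angular offset along the focal-plane x-axis
    ay = y_pix * rad_per_pix_y  # angular offset along the focal-plane y-axis
    return np.array([np.sin(ax) * np.cos(ay),
                     np.cos(ax) * np.cos(ay),
                     np.sin(ay)])
```

Because the second component is cos·cos, the camera boresight (pixel 0, 0) maps to the frame's +y axis, and the returned vector always has unit length, so only direction information is carried.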
Representative Claims
1. A method for geoposition determination of longitude and latitude of a camera on a movable platform in the presence of at least one Earth satellite and first and second stars, said method comprising the steps of: obtaining, using the camera, a first image including the at least one earth satellite and the first and second stars; determining x and y positions of the at least one earth satellite and the first and second stars in the first image; and in a computer process, determining, from said determined x and y positions of said at least one earth satellite and said two stars in said first image, a first set of possible locations of the camera in three-dimensional space, wherein said set of possible locations of the camera in three-dimensional space is determined by creating a curve mapped on a surface of the Earth, based on vectors measured in the first image from the camera to the first star, from the camera to the second star and from the camera to the at least one earth satellite.

2.
A method for geoposition determination of longitude and latitude in the presence of at least one Earth satellite and two stars, said method comprising the steps of: observing x and y positions of the at least one earth satellite and first and second stars in a first camera image of a camera; and in a computer process, determining from said x and y positions of said at least one earth satellite and said two stars in said first camera image a first set of possible locations of the camera in three-dimensional space, wherein said step of determining includes the steps of:

calculating a vector S1_CAM from the camera to the first star in a camera coordinate frame
  S1_CAM = [sin(x1·Px)·cos(y1·Py); cos(x1·Px)·cos(y1·Py); sin(y1·Py)]
where: x1 is the x-axis location, in pixels, in a camera focal plane, of the first star; y1 is the y-axis location, in pixels, in the camera focal plane, of the first star; Px is the radians/pixel along the x-axis of the camera; Py is the radians/pixel along the y-axis of the camera;

calculating a vector S2_CAM from the camera to the second star in the camera coordinate frame
  S2_CAM = [sin(x2·Px)·cos(y2·Py); cos(x2·Px)·cos(y2·Py); sin(y2·Py)]
where: x2 is the x-axis location, in pixels, in the camera focal plane, of the second star; y2 is the y-axis location, in pixels, in the camera focal plane, of the second star;

calculating a vector U_CAM from the camera to the at least one earth satellite in the camera coordinate frame
  U_CAM = [sin(xU·Px)·cos(yU·Py); cos(xU·Px)·cos(yU·Py); sin(yU·Py)]
where: xU is the x-axis location, in pixels, in the camera focal plane, of the satellite; yU is the y-axis location, in pixels, in the camera focal plane, of the satellite;

calculating a 3×3 matrix M_ECEF representing an orthogonal vector triad of the first and second stars in an earth-centered, earth-fixed (ECEF) coordinate frame
  M_ECEF = [S1_ECEF | S1_ECEF × S2_ECEF | S1_ECEF × (S1_ECEF × S2_ECEF)]
where: S1_ECEF and S2_ECEF are vectors extending from the center of the Earth to the first and second stars, respectively;

calculating a 3×3 matrix M_CAM representing the orthogonal vector triad of the first and second stars in the camera coordinate frame
  M_CAM = [S1_CAM | S1_CAM × S2_CAM | S1_CAM × (S1_CAM × S2_CAM)]
where: S1_CAM and S2_CAM are vectors from the camera to the first and second stars in the camera coordinate frame, wherein the cross product × in the above equations is a normalized cross product, defined as the unit vector in the direction of the ordinary cross product:
  (U × V)_i ≡ U_j V_k ε_ijk / sqrt(U_m V_n ε_lmn · U_p V_q ε_lpq)
where: U and V are arbitrary vectors, ε is the Levi-Civita symbol, and i, j, k, l, m, n, p, q represent Cartesian coordinates per the Einstein summation convention;

calculating a 3×3 rotation matrix R_CAM→ECEF from the camera coordinate frame to the ECEF coordinate frame
  R_CAM→ECEF = M_ECEF (M_CAM)^T;

calculating a vector U_fromship_ECEF from the camera to the at least one earth satellite in the ECEF coordinate frame by
  U_fromship_ECEF = R_CAM→ECEF U_CAM
where: U_CAM is a 3×1 vector from the camera to the at least one earth satellite in the camera coordinate frame;

calculating a vector X_ECEF from the center of the Earth to the camera, in the ECEF coordinate frame, at the time the first camera image is or was taken, by
  X_ECEF = U_ECEF − C U_fromship_ECEF
where: C is an unknown scalar; and U_ECEF is a vector to the at least one earth satellite from the Earth center, which is known from satellite ephemerides;

solving X_ECEF = U_ECEF − C U_fromship_ECEF for a scalar value γ such that Altitude(X_ECEF(C = γ)) = 0, and substituting C = γ into X_ECEF = U_ECEF − C U_fromship_ECEF to give X_ECEF = U_ECEF − γ U_fromship_ECEF
where: γ is the value of C that makes the expression Altitude(X_ECEF(C = γ)) = 0 true.

3.
A method for geoposition determination of an imager on a movable platform in the presence of at least one Earth satellite and at least first and second stars, said method comprising the steps of: observing said at least one Earth satellite and said first and second stars with the imager to produce a plurality of images, each of said images containing an image of at least said at least one Earth satellite or one of said first and second stars, and some of said images containing both the at least one satellite and said first and second stars; in a computer process, determining the imager's motion in an interval between the observing of said at least one Earth satellite, the observing of said first star, and the observing of said second star; from x and y coordinates of said at least one satellite and said first and second stars in said images, determining in a computer process an azimuth and an elevation of said at least one Earth satellite and an azimuth and an elevation of each of said first and second stars at the imager's location; in a computer process, converting said azimuths and said elevations of said at least one Earth satellite and said first and second stars to a common or local coordinate system fixed in time, to produce a synthetic sky image of the at least one Earth satellite and the first and second stars, wherein said synthetic sky image is constructed from multiple images based on known angular relationships and exposure times of the multiple images; and in a computer process, determining, from said synthetic sky image, the location of the imager in three-dimensional space.

4. A method according to claim 3, further comprising the step of converting units of said location of the imager to latitude and longitude.

5.
A method for geoposition determination of longitude and latitude of an imager on a movable platform in the presence of at least two Earth satellites, said method comprising the steps of: observing x and y positions of a first one of the at least two Earth satellites and a first set of two stars in a first image taken with the imager; observing x and y positions of a second one of the at least two satellites and a second set of two stars in a second image taken with the imager; in a computer process, determining, from said first image, a first set of possible locations of the imager in three-dimensional space by mapping a curve on a surface of the Earth based on the position of the first one of the at least two Earth satellites and said first set of two stars; in a computer process, determining, from said second image, a second set of possible locations of the imager in three-dimensional space by mapping a curve on a surface of the Earth based on the position of the second one of the at least two Earth satellites and said second set of two stars; and in a computer process, locating an intersection of said first and second sets of possible locations, based on an intersection of the curves mapped to the surface of the Earth, said intersection being the location of the imager.

6. A method according to claim 5, further comprising, before said step of locating, the steps of: in a computer process, determining said imager's motion in an interval between obtaining said first and second images; and in a computer process, compensating said second set of possible locations for said motion.

7. A method according to claim 5, wherein said step of observing the x and y positions of said second one of the at least two Earth satellites and said second set of two stars in said second image includes the step of observing the x and y positions of two stars which are different from the stars of said first image.

8.
A method for geoposition determination of an imager on a movable platform in the presence of at least first and second Earth satellites and at least first and second stars, said method comprising the steps of: observing said at least first and second Earth satellites and said first and second stars with the imager, to produce a plurality of images, each of said images containing an image of at least one of said first and second Earth satellites or one of said first and second stars, and some of said images containing both of the first and second Earth satellites and said first and second stars; from the x and y coordinates of said first and second Earth satellites and said first and second stars in said images, determining in a computer process an azimuth and an elevation of said at least first and second Earth satellites and said first and second stars at the imager's location, and converting said azimuth and elevation of said first and second Earth satellites and said first and second stars to a common or local coordinate system fixed in time, to produce first and second synthetic sky images, each including at least one of said first or second Earth satellites and both of the first and second stars, wherein each of said first and second synthetic sky images is constructed from multiple images based on known angular relationships and exposure times of the multiple images; in a computer process, determining, from said first and second synthetic sky images, first and second sets of possible locations of the imager in three-dimensional space; and deeming the intersection of said first and second sets of possible locations to be the location of the imager.

9. A method according to claim 8, further comprising the step of converting the units of said location of the observer to latitude and longitude.

10.
A method for geoposition determination of an imager on a movable platform in the presence of at least two Earth satellites, said method comprising the steps of: observing x and y positions of a first one of said two Earth satellites and a first star in a first image with the imager; in a computer process, converting said x and y positions of said first one of said two Earth satellites and said first star in said first image to an azimuth and an elevation at the imager's location; observing x and y positions of a second star in a second image; in a computer process, converting said x and y positions of said second star in said second image to an azimuth and an elevation at the imager's location; in a computer process, determining the imager's motion in a first interval between the observing of said first one of said two Earth satellites and said first star and the observing of the second star; in a computer process, generating a first synthetic sky image from said motion of said imager during said first interval and from said azimuth and said elevation of said first one of said two Earth satellites, said first star and said second star; and observing x and y positions of a second one of said at least two Earth satellites and a third star in a third image; in a computer process, converting said x and y positions of said second one of said at least two satellites and said third star in said third image to an azimuth and an elevation at the imager's location; observing x and y positions of a fourth star in a fourth image; in a computer process, converting said x and y positions of said fourth star in said fourth image to an azimuth and an elevation at the imager's location; in a computer process, determining the imager's motion in a second interval between the observing of said second one of said at least two Earth satellites and said third star and the observing of said fourth star; in a computer process, generating a second synthetic sky image from said motion of said imager during said second interval and from said azimuth and said elevation of said second Earth satellite, said third star and said fourth star; in a computer process, determining, from said first synthetic sky image, a first set of possible locations of the imager in three-dimensional space; in a computer process, determining, from said second synthetic sky image, a second set of possible locations of the imager in three-dimensional space; and in a computer process, locating an intersection of said first and second sets of possible locations, said intersection being the location of the imager.

11. A system for geoposition determination of longitude and latitude of an observer on a movable platform in the presence of at least one Earth satellite and first and second stars, said system comprising: the observer observing x and y positions of said at least one Earth satellite and said first and second stars in a focal plane image; and a processor arrangement for determining, from said x and y positions of said at least one Earth satellite and said first and second stars in said focal plane image, a first set of possible locations of said observer in three-dimensional space, wherein said set of possible locations of the observer in three-dimensional space is determined by creating a curve mapped on a surface of the Earth, based on vectors measured in the first image from the camera to the first star, from the camera to the second star and from the camera to the at least one earth satellite.

12.
A system for geoposition determination of longitude and latitude in the presence of at least one Earth satellite and first and second stars, said system comprising: an observer for observing x and y positions of said at least one Earth satellite and said first and second stars in a focal plane image; and a processor arrangement for determining, from said x and y positions of said at least one Earth satellite and said first and second stars in said image, a first set of possible locations of said observer in three-dimensional space, wherein said observer comprises a camera and said processor arrangement includes:

a first processor portion for calculating the vector S1_CAM extending from said camera to the first star in a camera coordinate frame
  S1_CAM = [sin(x1·Px)·cos(y1·Py); cos(x1·Px)·cos(y1·Py); sin(y1·Py)]
where: x1 is the x-axis location, in pixels, in a camera focal plane, of the first star; y1 is the y-axis location, in pixels, in the camera focal plane, of the first star; Px is the radians/pixel along the x-axis of the camera focal plane; Py is the radians/pixel along the y-axis of the camera focal plane;

a second processor portion for calculating the vector S2_CAM from said camera to the second star in said camera coordinate frame
  S2_CAM = [sin(x2·Px)·cos(y2·Py); cos(x2·Px)·cos(y2·Py); sin(y2·Py)]
where: x2 is the x-axis location, in pixels, in the camera focal plane, of the second star; y2 is the y-axis location, in pixels, in the camera focal plane, of the second star; Px is the radians/pixel along the x-axis of the camera focal plane; Py is the radians/pixel along the y-axis of the camera focal plane;

a third processor portion for calculating the vector U_CAM extending from said camera to said satellite in said camera coordinate frame
  U_CAM = [sin(xU·Px)·cos(yU·Py); cos(xU·Px)·cos(yU·Py); sin(yU·Py)]
where: xU is the x-axis location, in pixels, in the camera focal plane, of the satellite; yU is the y-axis location, in pixels, in the camera focal plane, of the satellite;

a fourth processor portion for calculating a 3×3 matrix M_ECEF representing the orthogonal vector triad of said first and second stars in an Earth-centered, Earth-fixed (ECEF) coordinate frame
  M_ECEF = [S1_ECEF | S1_ECEF × S2_ECEF | S1_ECEF × (S1_ECEF × S2_ECEF)]
where: S1_ECEF and S2_ECEF are vectors extending from the center of the Earth to the first and second stars, respectively;

a fifth processor portion for calculating a 3×3 matrix M_CAM representing the orthogonal vector triad of the first and second stars in the camera coordinate frame
  M_CAM = [S1_CAM | S1_CAM × S2_CAM | S1_CAM × (S1_CAM × S2_CAM)]
where: S1_CAM and S2_CAM are vectors from the ship to the first and second stars in said camera coordinate frame, wherein the cross product × is a normalized cross product, defined as the unit vector in the direction of the ordinary cross product:
  (U × V)_i ≡ U_j V_k ε_ijk / sqrt(U_m V_n ε_lmn · U_p V_q ε_lpq)
where: U and V are arbitrary vectors, ε is the Levi-Civita symbol, and i, j, k, l, m, n, p, q represent Cartesian coordinates per the Einstein summation convention;

a sixth processor portion for calculating a 3×3 rotation matrix R_CAM→ECEF from said camera coordinate frame to said ECEF coordinate frame
  R_CAM→ECEF = M_ECEF (M_CAM)^T;

a seventh processor portion for calculating a vector U_fromship_ECEF from said observer to said at least one Earth satellite in said ECEF coordinate frame by
  U_fromship_ECEF = R_CAM→ECEF U_CAM
where: U_CAM is a 3×1 vector from said observer to said at least one Earth satellite in said camera coordinate frame;

an eighth processor portion for calculating a vector X_ECEF from the center of the Earth to said observer, in said ECEF coordinate frame, at the time the first camera image is or was taken, by
  X_ECEF = U_ECEF − C U_fromship_ECEF
where: C is an unknown scalar; and U_ECEF is a vector to said at least one Earth satellite from the Earth center, which is known from said satellite ephemerides;

a ninth processor portion for solving X_ECEF = U_ECEF − C U_fromship_ECEF for a scalar value γ such that Altitude(X_ECEF(C = γ)) = 0, and for substituting C = γ into X_ECEF = U_ECEF − C U_fromship_ECEF to give X_ECEF = U_ECEF − γ U_fromship_ECEF
where: γ is the value of C that makes the expression Altitude(X_ECEF(C = γ)) = 0 true.

13. A system for geoposition determination of an observer on a movable platform in the presence of at least one Earth satellite and at least first and second stars, said system comprising: the observer observing said at least one Earth satellite and said first and second stars, to produce a plurality of images in a camera focal plane, each of said images containing the image of said at least one Earth satellite or one of said first and second stars, and some of said images containing images of both said at least one Earth satellite and said first and second stars; a first processing portion for determining the motion of said observer in an interval between the time of generation of an image including said at least one Earth satellite, generation of a first image including said first star, and the generation of a second image including said second star; a second processing portion for determining, from the x and y coordinates of said at least one Earth satellite and said first and second stars in said first and second images, the azimuth and elevation of said at least one Earth satellite and said first and second stars at the observer's location, and converting said azimuth and elevation of said at least one Earth satellite and said first and second stars to a common or local coordinate system fixed in time, to thereby produce a synthetic sky image, which synthetic sky image includes both of the first and second stars and the at least one Earth satellite, wherein said synthetic sky image is constructed from multiple images based on known angular relationships and exposure times of the multiple images; and a third processing portion for determining, from said synthetic sky image, the location of the observer in three-dimensional space.

14.
A system according to claim 13, further comprising: a fourth processing portion for converting the units of said location of the observer to latitude and longitude.

15. A system for determination of longitude and latitude of an observer on a movable platform in the presence of at least two Earth satellites, said system comprising: the observer observing x and y positions of a first one of said at least two Earth satellites and of first and second stars in a first image, and observing x and y positions of a second one of said at least two Earth satellites and third and fourth stars in a second image; a first processing portion for determining, from said first image, a first set of possible locations of the observer in three-dimensional space by mapping a curve on a surface of the Earth based on the position of the first one of the at least two Earth satellites and said first and second stars; a second processing portion for determining, from said second image, a second set of possible locations of the observer in three-dimensional space by mapping a curve on a surface of the Earth based on the position of the second one of the at least two Earth satellites and said third and fourth stars; and a third processing portion for locating an intersection of said first and second sets of possible locations, based on an intersection of the curves mapped to the surface of the Earth, said intersection being the location of the observer.

16. A system according to claim 15, further comprising: a fourth processing portion for determining said observer's motion in an interval between obtaining said first and second images; and a fifth processing portion for compensating said second set of possible locations for said motion.

17.
A system for geoposition determination of an observer on a movable platform in the presence of at least first and second Earth satellites and at least first and second stars, said system comprising: the observer observing said at least first and second Earth satellites and said at least first and second stars with an imager, to produce a plurality of images, each of said images containing an image of at least one of said at least first and second Earth satellites or one of said at least first and second stars, and some of said images containing images of both of said at least first and second Earth satellites and said first and second stars; a first processing portion for determining an azimuth and an elevation of said at least first and second Earth satellites and an azimuth and an elevation of each of said first and second stars at a location of said observer, and for converting said azimuths and said elevations of said at least first and second Earth satellites and said azimuths and said elevations of said first and second stars to a common or local coordinate system fixed in time, to produce first and second synthetic sky images, each of said first and second synthetic sky images including at least two of the first and second stars and one of the at least first and second Earth satellites, wherein each of said first and second synthetic sky images is constructed from multiple images based on known angular relationships and exposure times of the multiple images; a second processing portion for determining, from said first and second synthetic sky images, first and second sets of possible locations of said observer in three-dimensional space; and a third processing portion for locating an intersection of said first and second sets of possible locations, said intersection being the location of said observer.

18.
A system according to claim 17, further comprising a fourth processing portion for converting units of said location of said observer to latitude and longitude.

19. A system for geoposition determination of an observer on a movable platform in the presence of at least two Earth satellites, said system comprising: the observer in a first location for observing x and y positions of a first one of said at least two Earth satellites and a first star in a first image; a first processor portion for converting said x and y positions of said first one of said at least two Earth satellites and said first star in said first image to a first azimuth and a first elevation at said first location of said observer; said observer for observing at a second location x and y positions of a second star in a second image; a second processor portion for converting said x and y positions of said second star in said second image to a second azimuth and a second elevation at said second location of said observer; a third processor portion for determining the observer's motion in a first interval between the observing of said first one of said at least two satellites and said first star and the observing of the second star; a fourth processor portion for generating a first synthetic sky image from said motion of said observer during said first interval and from said first azimuth and said first elevation of said first one of said at least two Earth satellites, said first star and said second star; said observer for observing at a third location x and y positions of a second one of said at least two Earth satellites and a third star in a third image; a fifth processor portion for converting said x and y positions of said second one of said at least two satellites and said third star in said third image to a third azimuth and a third elevation at said third location; a sixth processor portion for determining the observer's motion in a second interval between the observing of said second star and the observing of said second one of said at least two Earth satellites and said third star; said observer for observing at a fourth location x and y positions of a fourth star in a fourth image; a seventh processor portion for converting said x and y positions of said fourth star in said fourth image to a fourth azimuth and a fourth elevation at said fourth location; an eighth processor portion for determining the observer's motion in a third interval between the observing of said second one of said at least two Earth satellites and said third star and the observing of the fourth star; a ninth processor portion for generating a second synthetic sky image from said motion of said observer during said second interval and from said third azimuth and said third elevation of said second one of said at least two Earth satellites, said third star and said fourth star; a tenth processor portion for determining, from said first synthetic sky image, a first set of possible locations of the observer at said first location in three-dimensional space; an eleventh processor portion for determining, from said second synthetic sky image, a second set of possible locations of the observer at said first location in three-dimensional space; and a twelfth processor portion for locating an intersection of said first and second sets of possible locations, said intersection being said first location of the observer.

20. The system of claim 11, wherein the observer comprises an optical device fixed to an inertially stabilized trainable mount.

21. The method of claim 1, wherein said vector measurements are determined based on pixel coordinates of said first image.

22.
The method of claim 3, wherein said determining the location of the imager from said synthetic sky image further comprises: determining vectors measured in said synthetic sky image, from the imager to the first star, from the imager to the second star, and from the imager to the at least one earth satellite, wherein said vector measurements are based on pixel coordinates of the synthetic sky image.
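The geometric core of representative claims 2 and 12 can be sketched end to end: build orthonormal triads from the two star directions in both frames, form the camera-to-ECEF rotation, rotate the satellite line of sight into ECEF, and solve for the scalar C that places the observer at zero altitude. A hedged Python sketch (NumPy assumed; a spherical Earth stands in for the patent's geodetic Altitude(·) = 0 condition, and all names are illustrative, not from the patent):

```python
import numpy as np

EARTH_RADIUS_M = 6371e3  # spherical-Earth stand-in for the Altitude(X) = 0 condition

def unit_cross(u, v):
    """Normalized cross product: unit vector along u x v (the claims' normalized 'x')."""
    c = np.cross(u, v)
    return c / np.linalg.norm(c)

def triad(s1, s2):
    """3x3 orthonormal triad [s1 | s1 x s2 | s1 x (s1 x s2)] from two unit star vectors."""
    a = unit_cross(s1, s2)
    return np.column_stack([s1, a, unit_cross(s1, a)])

def observer_ecef(s1_cam, s2_cam, u_cam, s1_ecef, s2_ecef, sat_ecef):
    """Observer position X = U - C*u in ECEF, with C chosen so |X| = Earth radius."""
    # Rotation from camera frame to ECEF: R = M_ECEF (M_CAM)^T.
    r_cam_to_ecef = triad(s1_ecef, s2_ecef) @ triad(s1_cam, s2_cam).T
    u = r_cam_to_ecef @ u_cam  # satellite line of sight expressed in ECEF
    # |U - C u|^2 = R^2  ->  C^2 - 2 (U.u) C + (|U|^2 - R^2) = 0
    b = np.dot(sat_ecef, u)
    gamma = b - np.sqrt(b * b - (np.dot(sat_ecef, sat_ecef) - EARTH_RADIUS_M**2))
    return sat_ecef - gamma * u  # near-side intersection with the Earth's surface
```

With the altitude condition reduced to a sphere, the quadratic in C has two roots; the smaller one is taken because the observer sits on the near side of the Earth along the line of sight from the satellite. A production implementation would replace the sphere with the WGS 84 geodetic altitude the patent's Altitude(·) implies.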