Methods and apparatus track a moving object with a first camera, causing a second camera to also track the object. Geospatial coordinates of a camera, including its elevation or altitude, are determined on the camera. A pose of the camera, including a pitch angle and an azimuth angle, is also determined: the azimuth is determined by a digital compass, and the pitch is determined by accelerometers and/or gyroscopes. The cameras are communicatively connected, allowing an image recorded by one camera to be displayed on a second camera. At least one camera is on a movable platform with actuators, and the actuators are controlled to minimize a difference between a first image of the object and a second image of the object. The cameras are part of a wearable headframe.
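The central computation in this scheme is turning the geospatial coordinates of the two devices, including altitude, into the azimuth and pitch that point one camera at the other. The Python sketch below is a minimal illustration of that step, not the patent's implementation; the flat-Earth approximation and the function and variable names (`required_pose`, `cam_lat`, `obj_alt`, and so on) are assumptions made for the example.

```python
# Minimal sketch: compute the azimuth and pitch a camera would need in order to
# place a remote object in its field of view, given latitude, longitude and
# altitude for both the camera and the object. Illustrative only.

import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres


def required_pose(cam_lat, cam_lon, cam_alt, obj_lat, obj_lon, obj_alt):
    """Return (azimuth_deg, pitch_deg) from the camera toward the object.

    Azimuth is measured clockwise from true north; pitch is positive upward.
    A local flat-Earth approximation is used, which is adequate at short range.
    """
    lat1, lon1 = math.radians(cam_lat), math.radians(cam_lon)
    lat2, lon2 = math.radians(obj_lat), math.radians(obj_lon)

    # Local north/east offsets of the object relative to the camera, in metres.
    d_north = (lat2 - lat1) * EARTH_RADIUS_M
    d_east = (lon2 - lon1) * EARTH_RADIUS_M * math.cos((lat1 + lat2) / 2.0)

    # Bearing clockwise from north, folded into [0, 360).
    azimuth = math.degrees(math.atan2(d_east, d_north)) % 360.0

    # Elevation angle from the altitude difference and the ground distance.
    horizontal = math.hypot(d_north, d_east)
    pitch = math.degrees(math.atan2(obj_alt - cam_alt, horizontal))
    return azimuth, pitch
```

The digital compass and the accelerometers/gyroscopes provide the camera's current azimuth and pitch; the difference between that current pose and the values returned by a computation like this is what the display indicators, or the actuators of a movable platform, would be driven to reduce.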
Representative Claims
1. An imaging system to assist a user to locate an object that is mobile and located in a plurality of objects, comprising:
a mobile first computing device including a housing which holds a processor, a camera, a display to display an image generated by the camera, positioning circuitry to generate positional data including geospatial coordinates of the mobile first computing device including an altitude, a digital compass to generate a pointing direction of the camera, circuitry to determine a pitch position of the camera and a communication circuit, the mobile first computing device is enabled to transmit a message to a second computing device with a request for positional data including an altitude of the second computing device;
the second computing device is in a housing attached to the object that is mobile and located in a plurality of objects, the second computing device, which is portable and mobile, includes in the housing a processor, positioning circuitry to generate positional data including geospatial coordinates of the second computing device including an altitude, and a communication circuit;
the second computing device is configured to receive the message with the request from the mobile first computing device to provide positional data related to the location of the object;
the second computing device is configured to generate an alert that the request from the communication circuit of the mobile first computing device has been received, and to transmit a message with the positional data related to the location of the object to the mobile first computing device; and
the processor in the mobile first computing device is configured to apply positional data of the mobile first computing device and the received positional data related to the location of the object to determine a pose of the camera that places the object in the field of vision of the camera and to generate one or more indicators on the display to indicate a change in pitch and a change in direction of the camera from a current pose required to place the camera in the pose that places the object in the field of vision of the camera and to generate an indicator on the display that indicates that the object is in the field of vision and where the object is relative to the plurality of objects.

2. The imaging system of claim 1, wherein the housing of the second computing device is a wearable headframe which includes a camera and a display.

3. The imaging system of claim 1, wherein the camera of the mobile first computing device is attached to an actuator controlled platform that places the camera of the first computing device in the pose that places the object in the field of vision of the camera of the mobile first computing device.

4. The imaging system of claim 1, further comprising:
the camera of the mobile first computing device is enabled to point at a second object not being the object and the mobile first computing device is configured to display on the display of the mobile first computing device an image generated by a camera of the second computing device, and
the mobile first computer device is configured to provide instructions to the second computing device to place the camera of the second computer device in a pose that generates an image of the second object generated by the camera of the second computer device on a predetermined part of the display of the first computing device.
5. The imaging system of claim 4, further comprising: the processor of the mobile first computing device configured to determine geospatial coordinates of the second object.

6. The imaging system of claim 4, further comprising: the processor of the mobile first computing device configured to determine geospatial coordinates of the second object after it has moved to a next position.

7. The imaging system of claim 4, further comprising a remote control with a separate housing communicatively connected to the mobile first computing device that is configured to initiate the instructions to the second computing device.

8. An imaging system, comprising:
a first wearable headframe that is mobile and configured to be moved, including a processor with a memory, a camera, a display configured to display an image generated by the camera, positioning circuitry to generate geospatial coordinates of the first wearable headframe including an altitude, a digital compass to generate a pointing direction of the camera, circuitry to determine a pitch position of the camera and a communication circuit;
a second wearable headframe that is mobile and configured to be moved, including a processor with a memory, a camera, a display configured to display an image generated by the camera, positioning circuitry to generate geospatial coordinates of the second wearable headframe including an altitude, a digital compass to generate a pointing direction of the camera, circuitry to determine a pitch position of the camera and a communication circuit, the communication circuits enabled to exchange data with each other;
the processor with the memory in the first wearable headframe is configured to retrieve instructions from the memory to generate a request for location data of the second wearable headframe and to transmit the request by the communication circuit of the first wearable headframe to the communication circuit of the second headframe; and
the processor with the memory in the second wearable headframe configured to process and to authorize the request for location data of the second wearable headframe and to generate and transmit data including geospatial data to the first wearable headframe;
the processor in the first wearable headframe is configured to determine a pose of the camera of the first wearable headframe including a pitch position that places the second wearable headframe in a field of vision of the camera of the first wearable headframe, and
the processor in the first wearable headframe is configured to generate one or more indicators on the display of the first wearable headframe to indicate a change in pitch and a change in direction of the camera of the first wearable headframe from a current pose required to place the camera of the first wearable headframe in the pose that places the second wearable headframe in the field of vision of the camera of the first wearable headframe and to generate an indicator on the display of the first wearable headframe that indicates that the second wearable headframe is in the field of vision and to identify a location of the second wearable headframe on the display of the first wearable headframe.

9. The system of claim 8, wherein a difference between an actual pose and the pose of the camera of the first wearable headframe that places the second wearable headframe in the field of vision of the camera of the first wearable headframe is determined and an instruction is displayed on the display of the first wearable headframe to minimize the difference.
10. The system of claim 8, further comprising:
the first wearable headframe is configured to be moved towards an object not being the first and second wearable headframe, the camera of the first wearable headframe is configured to obtain an image of the object, the display of the first wearable headframe is configured to display the image of the object and the display of the first wearable headframe is also configured to display an image obtained by the camera of the second wearable headframe;
the second wearable headframe is configured to be moved towards the object, the camera of the second wearable headframe is configured to obtain an image that is displayed on the display of the second wearable headframe, the display of the second wearable headframe is configured to display the image of the object obtained by the camera of the first wearable headframe; and
the processor of the first wearable headframe is configured to generate a confirmation message for the processor of the second wearable headframe when the object in the image obtained by the camera of the first wearable headframe and displayed by the display of the first wearable headframe is determined to coincide with the image obtained by the camera of the second wearable headframe which contains an image of the object and displayed by the display of the first wearable headframe.

11. The system of claim 10, wherein:
geospatial coordinates including a pose of the camera of the first wearable camera are determined;
geospatial coordinates including a pose of the camera of the second wearable camera are determined; and
geospatial coordinates of the object are determined.

12. The system of claim 10, wherein: the camera of the first wearable headframe tracks the object while it is moving and the processor of the first wearable headframe provides instructions to the processor of the second wearable headframe enabling it to track the moving object.

13. The imaging system of claim 1, wherein the object is obscured by another object.

14. The imaging system of claim 1, wherein the first computing device is attached to a vehicle selected from the group consisting of a car and an aircraft.
15. A method for directing a mobile camera in a housing to a mobile device with a housing on a remote object with a location including an altitude, comprising:
generating by a processor in the housing of the mobile camera a location request including a request for the altitude from a processor in the mobile device with the housing on the remote object;
transmitting by a transmitter in the mobile camera of the location request to a receiver in the mobile device on the remote object;
authorizing of the location request by the processor in the mobile device on the remote object;
generating by circuitry in the mobile device on the remote object of location data of the remote object, including altitude data;
transmitting by a transmitter in the mobile device on the remote object of the location data of the remote object including altitude data to a receiver in the housing of the mobile camera;
determining by the processor in the housing of the mobile camera of a pose of the mobile camera, including a pitch position, that places the remote object in a field of view of the mobile camera, the pose is calculated based on the received location data of the remote object, on location data generated by location circuitry in the housing of the mobile camera, and on directional data of the mobile camera generated by circuitry in the housing of the mobile camera, wherein a location of the mobile camera is variable; and
generating by the processor in the housing of the mobile camera one or more indicators on a display in the mobile camera to indicate a change in pitch and a change in direction of the mobile camera from a current pose required to place the mobile camera in the pose that places the remote object in the field of view of the mobile camera;
generating by the processor in the housing of the mobile camera an indicator on the display of the mobile camera that indicates that the object is in the field of view of the mobile camera; and
marking by the processor in the housing of the mobile camera a location of the object on the display of the mobile camera.

16. The method of claim 15, further comprising: displaying on the display in the mobile camera a mark that identifies a presence of the remote object wherein the remote object is hidden from the mobile camera by one or more other objects.

17. The method of claim 15, wherein the remote object is a person located in a group of other people.

18. The method of claim 15, wherein the mobile camera is part of a wearable headframe.

19. The imaging system of claim 1, wherein the object is a person located in an audience of people.
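Claims 1, 8 and 15 each recite a guidance step: comparing the camera's current pose, obtained from its digital compass and pitch circuitry, with the pose that places the remote object in the field of view, and indicating the required change in pitch and direction on the display. The following Python sketch illustrates how that comparison could be turned into display cues; the default field-of-view values, the thresholds, the cue strings and the function name `guidance_indicators` are assumptions made for the example, not details taken from the patent.

```python
# Illustrative sketch: derive on-screen guidance cues from the difference
# between the camera's current pose and the pose that places the object in
# its field of view.

def guidance_indicators(current_az, current_pitch,
                        target_az, target_pitch,
                        h_fov_deg=60.0, v_fov_deg=40.0):
    """Return (cues, in_view): a list of display cues and an in-view flag."""
    # Signed azimuth error folded into [-180, 180); positive means the target
    # lies clockwise of the current pointing direction, i.e. "turn right".
    az_error = (target_az - current_az + 180.0) % 360.0 - 180.0
    pitch_error = target_pitch - current_pitch

    in_view = (abs(az_error) <= h_fov_deg / 2.0
               and abs(pitch_error) <= v_fov_deg / 2.0)
    cues = []
    if in_view:
        cues.append("object in field of view")
    else:
        if abs(az_error) > h_fov_deg / 2.0:
            cues.append("turn right" if az_error > 0 else "turn left")
        if abs(pitch_error) > v_fov_deg / 2.0:
            cues.append("tilt up" if pitch_error > 0 else "tilt down")
    return cues, in_view


# Example: camera pointing due north and level, object 20 degrees to the east
# and 5 degrees below the horizon, within a 60 x 40 degree field of view:
# guidance_indicators(0.0, 0.0, 20.0, -5.0) -> (["object in field of view"], True)
```

On the actuator-controlled platform of claim 3, the same two error terms could be fed to the actuators and driven toward zero instead of being shown as cues on the display.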
Patents Cited by This Patent (78)
Pace, Vincent; Campbell, Patrick, 3D camera with foreground object distance sensing.
Lee, Seongho; Kim, Jaechul; Chang, Yoonseop; Kim, Kyungok; Park, Jonghyun, Apparatus for calculating 3D spatial coordinates of digital images and method thereof.
Murata, Haruhiko (JP); Mori, Yukio (JP); Yamashita, Shuugo (JP); Maenaka, Akihiro (JP); Okada, Seiji (JP); Ihara, Kanji (JP), Device and method for converting two-dimensional video into three-dimensional video.
Turley, Richard; Dalton, Dan L.; Bloom, Daniel M.; Hofer, Gregory V.; Miller, Casey L.; Woods, Scott A., Digital imaging device shutter calibration method and apparatus using multiple exposures of a single field.
Metcalf, Darrell J., Large-audience, positionable imaging and display system for exhibiting panoramic imagery, and multimedia content featuring a circularity of action.
Lachinski, Theodore M. (Andover, MN); Ptacek, Louis S. (Mound, MN); Blais, Paul M. (St. Paul, MN); Boggs, Stephen (Fridley, MN); Longfellow, John W. (St. Paul, MN); Setterholm, Jeffrey M. (Lakeville, MN), Method and apparatus for collecting and processing visual and spatial position information from a moving platform.
Wong, Terrence L.; Lewis, Gregory S.; Lee, David C.; Ling, Yi; Pope, Stephen P.; Spielman, Steven R.; Zingman, Jonathan A., Method and format for reading and writing in a multilevel optical data systems.
Wehrenberg, Paul J.; Leiba, Aaron; Williams, Richard C.; Falkenburg, David R.; Gerbarg, Louis G.; Chang, Ray L., Methods and apparatuses for operating a portable device based on an accelerometer.
Takeuchi, Kohji; Shimizu, Akihiko, Multi-level information recording apparatus, multi-level information recording method, multi-level information recording medium and multi-level information recording-reproducing apparatus.
Mori, Hiromi, Non-transitory storage medium storing image-forming-data transmitting program, mobile terminal, and control method therefor transmitting movement and orientation information of mobile terminal.
Mollie, Gilles; de Schuyteneer, Vincent, Process and device for managing the memory space of a hard disk, in particular for a receiver of satellite digital television signals.
Hasegawa, Akira (JP); Shimizu, Yoshinori (JP); Mizuno, Masayoshi (JP); Ishida, Takayuki (JP), Video data decoding apparatus and method and video signal reproduction apparatus and method.