Systems, methods and media for generating a panoramic view
IPC Classification Information
Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.):
H04N-005/232
H04N-007/18
H04N-005/225
Application No.: US-0801649 (2013-03-13)
Registration No.: US-9479697 (2016-10-25)
Inventors / Address:
Aguilar, Francisco
Alvarado-Moya, Pablo
Fridberg, Mikhail
Applicant / Address: BOUNCE IMAGING, INC.
Agent / Address: DLA Piper LLP US
Citation information: cited by 34 patents; cites 16 patents.
Abstract
A method is provided for a surveillance system including a remote sensor apparatus and a portable receiver. The method includes receiving, at the portable receiver, image data collected simultaneously from a plurality of image sensors installed in the remote sensor apparatus. The method also includes processing, at the portable receiver, the received image data by, for each of the first plurality of captured images, detecting image features and performing feature matches by matching control points across neighboring images based on a control-point assumption that the control points are relevant only in areas of field of view overlap. The areas of field of view are determined by the positions of the plurality of image sensors and the field of view of the wide-angle lens fitted to each of the plurality of image sensors. The method further includes generating a panoramic view by blending the processed first plurality of captured images.
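The abstract's key optimization is restricting control-point matching to regions where neighboring fields of view overlap, which are known in advance from the fixed sensor geometry. A minimal sketch of that idea, assuming a simplified model in which each camera's coverage is an azimuth interval (the function and point-dictionary format here are illustrative, not from the patent):

```python
def overlap_interval(center_a, center_b, fov_deg):
    """Azimuth interval (degrees) where two cameras' fields of view overlap.

    center_a / center_b: azimuths of the two optical axes;
    fov_deg: angular field of view of each wide-angle lens.
    Returns (lo, hi), or None if the views do not overlap.
    """
    half = fov_deg / 2.0
    lo = max(center_a - half, center_b - half)
    hi = min(center_a + half, center_b + half)
    return (lo, hi) if lo < hi else None

def points_in_overlap(points, interval):
    """Keep only candidate control points whose azimuth lies in the overlap."""
    lo, hi = interval
    return [p for p in points if lo <= p["azimuth"] <= hi]

# Two neighboring cameras 90 degrees apart, each with a 120-degree lens:
iv = overlap_interval(0.0, 90.0, 120.0)            # (30.0, 60.0)
pts = [{"azimuth": 10.0}, {"azimuth": 45.0}, {"azimuth": 80.0}]
candidates = points_in_overlap(pts, iv)            # only the 45-degree point
```

Because the sensors sit at fixed positions in the housing, these intervals can be computed once at calibration time, so feature detection and matching never touch the non-overlapping bulk of each image.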
Representative Claims
1. In a surveillance system including a remote sensor apparatus for collecting data and a portable receiver for receiving data collected by at least one sensor installed in the remote sensor apparatus, wherein the portable receiver is capable of running an application program for processing the received data and displaying the processed data on a display screen of the portable receiver, wherein the remote sensor apparatus has a spherical housing for containing a processing unit, a plurality of image sensors coupled to the processor, an inertial measurement unit, and a wireless transceiver, and wherein each of the plurality of image sensors is fitted with a wide-angle fisheye lens, a method for generating a panoramic view of a scene external to the housing of the remote sensor apparatus, the method comprising:

receiving, at the portable receiver, first image data collected simultaneously from the plurality of image sensors installed in the remote sensor apparatus, wherein the received first image data includes a first plurality of images captured by the plurality of image sensors and associated inertial measurement data, and wherein the plurality of image sensors are located at fixed positions relative to one another and a physical center of the housing;

computing, at the portable receiver, a spherical projection for each of the first plurality of captured images with respect to a reference sphere having a virtual center point, wherein the computation of the spherical projection incorporates relative position of the image sensor and respective lens and lens field of view and distortion parameters and includes translating mathematically a center of each spherical projection to the virtual center point;

processing, at the portable receiver, the first received image data by, for each of the first plurality of captured images, calculating overlap areas from the computed spherical projections, detecting image features only in areas of field of view overlap and performing
feature matches by generating and matching control points across neighboring images only in the areas of field of view overlap; and

generating a panoramic view by blending the processed first plurality of captured images and using the inertial measurement data associated with each of the first plurality of images to relate a vertical and horizontal orientation of the panoramic view to a coordinate system defined by the virtual center point.

2. The method of claim 1, further comprising displaying the panoramic view on the display screen of the portable receiver.

3. The method of claim 1, wherein processing the first received image data further includes, for each of the first plurality of captured images, correcting initial distortions.

4. The method of claim 1, wherein processing the first received image data further includes, for each of the first plurality of captured images, warping the captured image to compensate for effects of capturing an image using a wide-angle fisheye lens.

5. The method of claim 1, wherein processing the first received image data further includes, for each of the first plurality of captured images, estimating and compensating an exposure of the captured image.

6.
The method of claim 1, further comprising:

receiving, at the portable receiver, second image data collected simultaneously from the plurality of image sensors installed in the remote sensor apparatus subsequent to receiving the first image data, wherein the remote sensor apparatus is in motion and wherein the received second image data includes a second plurality of images captured by the plurality of image sensors;

computing, at the portable receiver, a spherical projection for each of the second plurality of captured images with respect to the reference sphere having the virtual center point, wherein the computation of the spherical projection incorporates relative position of the image sensor and respective lens and lens field of view and distortion parameters and includes translating mathematically a center of each spherical projection to the virtual center point;

processing, at the portable receiver, the second received image data by, for each of the second plurality of captured images, calculating overlap areas from the computed spherical projections, detecting image features and performing feature matches by generating and matching control points across neighboring images only in areas of field of view overlap; and

generating a three-dimensional representation of a space where the remote sensor apparatus is deployed using the first received image data and the second subsequently received image data.

7. The method of claim 1, further comprising displaying the panoramic view in an immersive or virtual reality view by projecting the panoramic view from the virtual center point as a perspective of a viewer and orienting the immersive or virtual reality view relative to the viewer using the inertial measurement data.

8.
The method of claim 1, wherein the portable receiver comprises a gyroscope, an accelerometer, and a display to display the generated panoramic view using the application program, wherein the application program includes a deep inspection mode operationally coupled with the gyroscope and accelerometer to navigate the panoramic view, and wherein the method further comprises navigating, using the deep inspection mode, the panoramic view in response to a tilt of the portable receiver by rotating the panoramic view in the direction of the tilt to provide a perspective in the tilted direction as if viewed at the virtual center point.

9. The method of claim 1, further comprising:

receiving, at the portable receiver, second image data collected simultaneously from a second plurality of image sensors installed in a second remote sensor apparatus comprising a spherical housing containing a processing unit, a plurality of image sensors coupled to the processor, an inertial measurement unit, and a wireless transceiver, wherein the plurality of image sensors are each fitted with a wide-angle fisheye lens and are located at fixed positions relative to one another and a physical center of the housing, and wherein the received second image data includes a second plurality of images and associated inertial measurement data;

computing, at the portable receiver, a spherical projection for each of the second plurality of captured images with respect to the reference sphere having the virtual center point, wherein the computation of the spherical projection incorporates relative position of the image sensor and respective lens and lens field of view and distortion parameters and includes translating mathematically a center of each spherical projection to the virtual center point;

processing, at the portable receiver, the second received image data by, for each of the second plurality of captured images, calculating overlap areas from the computed spherical projections, detecting image
features and performing feature matches by generating and matching control points across neighboring images only in areas of field of view overlap; and

using relative position of the first remote sensor apparatus and the second remote sensor apparatus for mapping or three-dimensional imaging.

10. The method of claim 9, wherein the second remote sensor apparatus transmits the second image data to the first remote sensor apparatus over a wireless network and the first remote sensor apparatus transmits the first image data and the second image data received at the portable receiver.

11. The method of claim 1, wherein the remote sensor apparatus comprises infrared or near-infrared LEDs disposed along the outer shell to provide light during collection of the first image data by the plurality of image sensors installed in the remote sensor apparatus.

12. The method of claim 11, wherein the infrared or near-infrared LEDs comprise high-intensity near-infrared LEDs operable to provide light outside the human visible spectrum during collection of the first image data.

13. The method of claim 12, wherein each wide-angle fisheye lens is ringed with eight near-infrared LEDs having light brightest at wavelengths around 850 nm.

14. The method of claim 11, wherein the remote sensor apparatus comprises a plurality of light sensors to provide information about ambient lighting to modify shutter exposures and LED flash intensity.

15. The method of claim 1, wherein the remote sensor apparatus uses the image sensors milliseconds before capturing the first images to calibrate lighting and exposure duration.

16.
The method of claim 1, wherein the remote sensor apparatus comprises an environmental sensor unit to detect environmental data comprising one or more atmospheric gases, hazardous gases, smoke, vibrations, radiations, CBRN (chem/bio/nuclear/radiological), temperatures, or combinations thereof, and wherein the first image data comprises environmental data associated with the first plurality of captured images.

17. The method of claim 16, wherein the method further comprises displaying the panoramic view on the display of the portable receiver and overlaying the environmental data over the panoramic view in areas of the view associated with the environmental data.

18. The method of claim 1, wherein the remote sensor apparatus comprises a diversionary device comprising one or more of a visible-spectrum high-intensity LED, a speaker, or a combination thereof, and wherein the method further comprises triggering, at the portable receiver, the diversionary device to at least one of emit light from the high-intensity LED, emit a beep or siren from the speaker, or a combination thereof.

19.
In a surveillance system including a remote sensor apparatus for collecting data and a portable receiver for receiving data collected by at least one sensor installed in the remote sensor apparatus, wherein the portable receiver is capable of running an application program for processing the received data and displaying the processed data on a display screen of the portable receiver, wherein the remote sensor apparatus has a spherical housing for containing a processing unit, a printed circuit board mounted to the housing at cushioned points, a wireless transceiver, a plurality of image sensors coupled to the processor, an inertial measurement unit, and an environmental sensor unit comprising one or more environmental sensors operable to detect environmental data comprising one or more atmospheric gases, hazardous gases, smoke, vibrations, radiations, CBRN (chem/bio/nuclear/radiological), temperatures, or combinations thereof, and wherein each of the plurality of image sensors is fitted with a wide-angle fisheye lens, a method for generating a panoramic view of a scene external to the housing of the remote sensor apparatus, the method comprising:

receiving, at the portable receiver, first image data collected simultaneously from the plurality of image sensors installed in the remote sensor apparatus, wherein the received first image data includes a first plurality of images captured by the plurality of image sensors, associated inertial measurement data and environmental data, and wherein the plurality of image sensors are located at fixed positions relative to one another and a physical center of the housing;

computing, at the portable receiver, a spherical projection for each of the first plurality of captured images with respect to a reference sphere having a virtual center point, wherein the computation of the spherical projection incorporates relative position of the image sensor and respective lens and lens field of view and distortion parameters and includes
translating mathematically a center of each spherical projection to the virtual center point;

processing, at the portable receiver, the first received image data by, for each of the first plurality of captured images, calculating overlap areas from the computed spherical projections, detecting image features only in areas of field of view overlap and performing feature matches by generating and matching control points across neighboring images only in the areas of field of view overlap;

generating a panoramic view by blending the processed first plurality of captured images and using the inertial measurement data associated with each of the first plurality of images to relate a vertical and horizontal orientation of the panoramic view to a coordinate system defined by the virtual center point; and

displaying the panoramic view in an immersive or virtual reality view by projecting the panoramic view from the virtual center point as a perspective of a viewer and orienting the immersive or virtual reality view relative to the viewer using the inertial measurement data,

wherein the remote sensor apparatus further comprises near-infrared LEDs operable to provide light outside the human visible spectrum during collection of the first image data.
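The spherical-projection step in the claims maps each fisheye pixel to a direction on a shared reference sphere centered on the virtual center point. A minimal sketch under assumed simplifications (an equidistant fisheye model r = f·θ with no distortion terms, and the common far-field approximation in which translating each projection center to the virtual center reduces to a per-camera rotation; the patent's actual projection also incorporates lens distortion parameters):

```python
import numpy as np

def fisheye_pixel_to_ray(u, v, f):
    """Equidistant fisheye model (r = f * theta): pixel offset -> unit ray
    in the camera frame. (u, v) are pixel coordinates relative to the
    principal point; f is the focal length in pixels.
    """
    r = np.hypot(u, v)
    theta = r / f                      # angle from the optical axis
    phi = np.arctan2(v, u)             # angle around the optical axis
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def ray_to_reference_sphere(ray, cam_rotation):
    """Rotate a camera-frame ray into the shared reference-sphere frame.

    For a small sensor ball, each lens's offset from the housing center is
    negligible relative to scene distance, so 'translating' each projection
    center to the virtual center point reduces to this rotation alone.
    """
    return cam_rotation @ ray

# A pixel on the optical axis maps to the camera's forward direction:
R = np.eye(3)                          # camera aligned with the reference frame
p = ray_to_reference_sphere(fisheye_pixel_to_ray(0.0, 0.0, 300.0), R)
```

Once every image is expressed as rays on the same reference sphere, the overlap areas between neighboring cameras follow directly from where their ray sets intersect, which is what confines control-point matching in the claims.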
Patents cited by this patent (16)
Foote, Jonathan; Ahmad, Subutai; Boreczky, John, Automatic video system using multiple cameras.
Ramsay, Sean Geoffrey; Mills, Daniel Chantal; Horvath, Dylan Stephen; Bodaly, Scott Andrew Robinson, Systems and methods for generating spherical images.
MacMillan, Timothy; Newman, David A.; Adsumilli, Balineedu Chowdary; Campbell, Scott Patrick, Generation of video from spherical content using edit maps.
Adsumilli, Balineedu Chowdary; Lustig, Ryan; Newman, David A., Scene and activity identification in video summary generation based on motion detected in a video.
Garcia, Vincent; Medioni, Tom; Rouif, Matthieu; Lema, Gabriel; Santoni, Francescu, Systems and methods to detect and correlate user responses to media content.