System and method for blending images into a single image
IPC Classification
Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC 7th ed.): G06K-009/36; G09G-005/00
Application No.: UP-0814302 (2004-04-01)
Registration No.: US-7813589 (2010-11-01)
Inventors / Address: Silverstein, D. Amnon; Deng, Yining
Applicant / Address: Hewlett-Packard Development Company, L.P.
Citation info: cited by 6 patents; cites 51 patents
Abstract
A method and system for blending images into a single image. Initially two images of a view are selected, wherein the images have overlapping content of the view. The images can differ from each other in such characteristics as time, camera location, camera settings, and lighting. The images are divided into strips along a common plane in a region of each image where the images overlap. A strip from each image is selected where the images are a close match. Pixel by pixel difference values between the two strips are calculated, and a cut line is determined where the differences between the two strips are minimized. Each image is cut along the corresponding cut line, and the cut images are blended together to form a single image of the view. The blended single image can be further processed by warping the image along the cut line to provide for a smoother fit between the two images.
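The abstract's strip-matching step — choosing the pair of overlapping strips where the two images agree most closely — can be sketched as follows. This is an illustrative reading of the patent, not its implementation; the images are assumed to be pre-aligned, equal-sized NumPy arrays, and the function name and `strip_width` are arbitrary choices.

```python
import numpy as np

def select_best_strips(img_a, img_b, strip_width=16):
    """Divide the overlap of two equal-sized images into vertical strips
    and return the x-offset (and score) of the strip pair with the
    smallest mean squared difference."""
    h, w, _ = img_a.shape
    best_x, best_msd = 0, np.inf
    for x in range(0, w - strip_width + 1, strip_width):
        a = img_a[:, x:x + strip_width].astype(np.float64)
        b = img_b[:, x:x + strip_width].astype(np.float64)
        msd = np.mean((a - b) ** 2)  # mean squared difference over the strip
        if msd < best_msd:
            best_x, best_msd = x, msd
    return best_x, best_msd
```

A strip pair with a low score marks a region where the two exposures already agree, so a cut placed there is least likely to be visible.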
Representative Claims
What is claimed is:

1. A method for blending images into a composite image, comprising: selecting two images having overlapping content; dividing the two images into strips; selecting a strip in each of the two images where the two images overlap each other; determining differences between the overlapping strips; determining a minimized line through the overlapping strips where the differences between the overlapping strips are minimized; and blending the two images together along the minimized line to create a composite image.

2. The method according to claim 1, wherein the selected images belong to a set of images comprising a scene.

3. The method according to claim 1, wherein the selected images are divided along a common plane.

4. The method according to claim 1, wherein the selected images are divided into strips along one of a vertical plane or a horizontal plane.

5. The method according to claim 1, wherein the two overlapping strips are selected according to a mean squared difference algorithm such that the sum of the mean squared difference values between the two selected strips is minimized.

6.
The method according to claim 1, including: calculating a squared color difference value for each pixel pair between the overlapping strips; converting the squared color difference values into a gray scale image of the overlapping strips, wherein the brightest pixels in the gray scale image correspond to the pixels of greatest difference between the two overlapping strips; sorting the gray scale pixels from largest to smallest difference value; for each sorted gray scale pixel, mapping the gray scale pixel to one of two regions within the overlapping strip according to the adjacency of the gray scale pixel to the one of the two regions; determining a cut line between the two regions; cutting each selected image along the cut line within the overlapping strip of each selected image; and combining the two cut selected images along the cut line to form the composite image.

7. The method according to claim 6, wherein the cut line is determined between a first region and a second region to which the pixels have been mapped.

8. The method according to claim 6, wherein the cut line corresponds to the line of best match between the overlapping strips.

9. The method according to claim 6, wherein at least one of the cut images is warped along the cut line to improve the fit between the two cut images along the cut line.

10. The method according to claim 1, wherein the blending of images is performed iteratively, with the blended composite image being utilized as one of the selected two images to be blended.

11. The method according to claim 10, wherein the method of blending is performed iteratively until all images comprising the scene have been blended into a final image of the scene.

12. The method according to claim 1, wherein the selecting comprises selecting the strips of the two images which provide reduced error between the overlapping strips compared with non-selected strips of the two images.

13.
The method according to claim 1, wherein the determining differences comprises determining differences between image data content of the overlapping strips.

14. The method according to claim 13, wherein the determining differences between image data content comprises determining differences between the image data content of one pixel of one of the overlapping strips and one pixel of another of the overlapping strips, and wherein the one pixels of the one and the another of the overlapping strips both correspond to the same subject present in the two images.

15. The method according to claim 13, wherein the determining differences comprises determining differences between the image data content comprising color space content of the overlapping strips.

16. The method according to claim 1, wherein the selecting a strip in each of the two images comprises selecting the strips in the two images which comprise the same content of a scene present in the two images.

17. The method according to claim 1, wherein the selectings, dividing, determinings and blending comprise selectings, dividing, determinings and blending using processing circuitry.

18. The method according to claim 1, further comprising storing the composite image.

19. The method according to claim 18, further comprising displaying the composite image.

20. A method for blending two images into a composite image, comprising: dividing two images into strips along a common plane; selecting a strip in each image where the two images overlap, wherein the selecting comprises selecting the overlapping strips which have reduced error between the overlapping strips compared with non-selected overlapping strips of the two images; determining a minimized line through the selected overlapping strips where differences between the selected overlapping strips are minimized; blending the two images along the minimized line to create a composite image; and warping the composite image to minimize blurring along the minimized line.
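The comparison step described in claim 6 — squared color differences rendered as a gray scale image whose brightest pixels mark the worst mismatches — might be sketched like this. The function name and the normalization to 8-bit gray are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def difference_gray(strip_a, strip_b):
    """Per-pixel squared color difference between two overlapping strips,
    scaled to an 8-bit gray image where brighter pixels mark bigger
    mismatches between the strips."""
    d = (strip_a.astype(np.float64) - strip_b.astype(np.float64)) ** 2
    d = d.sum(axis=2)            # sum squared difference over color channels
    if d.max() > 0:
        d = d / d.max() * 255.0  # normalize so the worst mismatch is white
    return d.astype(np.uint8)
```

Dark runs in this image trace candidate cut paths, since darkness means the two strips are nearly identical there.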
21. The method according to claim 20, wherein the minimized line is determined by calculating mean squared difference values for pairs of pixels between the two selected overlapping strips.

22. The method according to claim 20, wherein at least one of the images is warped where the differences between the selected overlapping strips along the blending line exceed a predetermined threshold.

23. The method according to claim 20, wherein the dividing, selecting, determining, blending and warping comprise dividing, selecting, determining, blending and warping using processing circuitry.

24. The method according to claim 20, further comprising storing the composite image.

25. The method according to claim 24, further comprising displaying the composite image.

26. A computer-based system for blending images into a composite image, comprising: a computer configured to: divide two images having overlapping content into strips along a common plane, wherein each strip is a long and narrow piece of the image having one dimension which is greater than another dimension of the respective strip; select a strip of uniform width in each of the two images where the two images overlap each other; determine pixel difference values between the overlapping strips; determine a minimized line through the overlapping strips where a sum of the pixel difference values between the overlapping strips is minimized; and blend the two images together along the minimized line to create a composite image.

27. The system according to claim 26, wherein the two overlapping strips are selected according to a mean squared difference algorithm such that the sum of the mean squared difference values between the overlapping strips is minimized.

28.
The system according to claim 26, wherein the computer is configured to: calculate a squared color difference value for each pixel pair between the overlapping strips; convert the squared color difference values into a gray scale image of the overlapping strips, wherein the brightest pixels in the gray scale image correspond to the pixels of greatest difference between the two overlapping strips; sort the gray scale pixels from largest to smallest difference value; for each sorted gray scale pixel, map the gray scale pixel to one of two regions within the overlapping strip according to the adjacency of the sorted gray scale pixel to the one of the two regions; determine a cut line between the two regions; cut each image along the cut line of the overlapping strip of each image; and combine the two cut images along the cut line to form the composite image.

29. The system according to claim 28, wherein the cut line is determined by calculating mean squared difference values for pairs of pixels between the two selected image strips.

30. The system according to claim 28, wherein at least one of the images is warped where the differences between the selected strips along the cut line exceed a predetermined threshold.

31. A system for blending images into a composite image, comprising: means for dividing two images having overlapping content into strips along a common plane in at least one region of overlap, wherein each strip is a long and narrow piece of the image having one dimension which is greater than another dimension of the respective strip; means for calculating difference values between image data content of respective pixels of the two images in corresponding strips of uniform length in the at least one region of overlap; means for determining a cut line through the two images where the difference values are minimized; and means for blending the two images along the cut line to create a composite image.

32.
A system for blending images into a composite image, comprising: a first computing module dividing two images having overlapping content into strips along a common plane in at least one region of overlap; a second computing module calculating difference values between pixels of the two images in the at least one region of overlap, wherein the difference values individually correspond to a difference of image data content between a pair of corresponding pixels of the two images; a third computing module determining a cut line through the two images where the difference values are minimized; and a fourth computing module blending the two images along the cut line to create a composite image.

33. The system according to claim 32, including selecting two overlapping strips according to a mean squared difference algorithm such that the sum of the mean squared difference values between the two overlapping strips is minimized.

34. The system according to claim 32, including: a fifth computing module cutting the two images along the cut line; and a sixth computing module joining the cut images together to create the composite image.

35. The system according to claim 32, wherein the blending of images is performed iteratively, with the composite image being utilized as one of the two images to be blended.

36. The system according to claim 32, wherein the pairs of the pixels individually correspond to the same subject present in the two images.

37.
A non-transitory computer readable medium storing software for blending images into a composite image, wherein the software is provided for: selecting two images having overlapping content; dividing the two images into strips along a common plane where the two images overlap each other; selecting a strip in each of the two images; determining the differences between the overlapping strips; determining a minimized line through the overlapping strips where the differences between the overlapping strips are minimized; and blending the two images together along the minimized line to create a composite image.

38. The software according to claim 37, wherein the two overlapping strips are selected according to a mean squared difference algorithm such that the sum of the mean squared difference values between the overlapping strips is minimized.

39. The software according to claim 37, wherein the software is provided for: calculating a difference value for each pixel pair between the two overlapping strips; converting the calculated difference values into a gray scale image of the overlapping strips, wherein the brightest pixels in the gray scale image correspond to the pixels of greatest difference between the two overlapping strips; sorting the gray scale pixels from largest to smallest difference value; for each sorted gray scale pixel, mapping the gray scale pixel to a first region or a second region within the overlapping strip according to the adjacency of the gray scale pixel to the first region or the second region; determining a cut line within the overlapping strips between the first mapped region and the second mapped region; cutting each selected image along the cut line of the overlapping strip of each selected image; and combining the two cut selected images along the cut line to form the composite image.

40.
The software according to claim 37, wherein the selecting a strip in each of the two images comprises selecting the strips in the two images which comprise the same content of a scene present in the two images.
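Claim 26 asks for a line through the overlap along which the sum of pixel difference values is minimized. The patent finds it by mapping sorted difference pixels to adjacent regions; a common alternative that achieves the same minimization, shown here purely for illustration, is a dynamic-programming seam over the difference image (the technique popularized by seam carving and graph-cut texture synthesis).

```python
import numpy as np

def min_cut_line(diff):
    """Find a top-to-bottom, 8-connected cut line through a 2-D difference
    image whose summed difference is minimal, via dynamic programming.
    Returns one x-coordinate per row."""
    h, w = diff.shape
    cost = diff.astype(np.float64).copy()
    # Forward pass: each cell accumulates the cheapest path reaching it.
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(0, x - 1), min(w, x + 2)
            cost[y, x] += cost[y - 1, lo:hi].min()
    # Backward pass: trace the cheapest path up from the bottom row.
    line = [int(np.argmin(cost[-1]))]
    for y in range(h - 2, -1, -1):
        x = line[-1]
        lo, hi = max(0, x - 1), min(w, x + 2)
        line.append(lo + int(np.argmin(cost[y, lo:hi])))
    return line[::-1]
```

Cutting both images along this line and joining the halves yields a seam that passes through the pixels where the two exposures disagree least, which is exactly the property the claims require of the "minimized line."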
Patents cited by this patent (51)
Xiong, Yalin; Turkowski, Ken, Aligning rectilinear images in 3D through projective registration and calibration.
Szeliski, Richard; Shum, Heung-Yeung, Image mosaic construction system and apparatus with patch-based alignment, global block adjustment and pair-wise motion-based local warping.
Herman, Joshua Randy (deceased); Bergen, James Russell; Peleg, Shmuel (ILX); Paragano, Vincent; Dixon, Douglas F.; Burt, Peter J.; Sawhney, Harpreet; Gendel, Gary A.; Kumar, Rakesh; Brill, Michael H., Method and apparatus for mosaic image construction.
Hsu, Stephen Charles; Kumar, Rakesh; Sawhney, Harpreet Singh; Bergen, James R.; Dixon, Doug; Paragano, Vince; Gendel, Gary, Method and apparatus for performing local to global multiframe alignment to construct mosaic images.
Todd, J. Stephan; Noblett, David A.; Yang, Jun, Method and system for overlaying at least three microarray images to obtain a multicolor composite image.
Mayer, III, Theodore; Paul, Lawrence S.; Chaney, Todd, Projection system and method for using a single light source to generate multiple images to be edge blended for arrayed or tiled display.
Schmucker, Mark A.; Schmit, Joanna, Selection process for sequentially combining multiple sets of overlapping surface-profile interferometric data to produ.
Crandall, Greg J.; Eichhorn, Ole; Olson, Allen H.; Soenksen, Dirk G., System and method for data management in a linear-array-based microscope slide scanner.