Multi-exposure image fusion using propagated image filtering
Source
Advances in Intelligent Systems and Computing
ISSN
2194-5357
Date Issued
2017-01-01
Author(s)
Abstract
Image fusion is the process of combining multiple images of the same scene into a single high-quality image that contains more information than any of the input images. In this paper, we propose a new fusion approach in the spatial domain using the propagated image filter. The proposed approach calculates the weight map of every input image using the propagated image filter and gradient-domain postprocessing. The propagated image filter exploits a cumulative weight-construction approach for its filtering operation. We show that the proposed approach achieves state-of-the-art results on the problem of multi-exposure fusion for various indoor and outdoor natural static scenes with varying amounts of dynamic range.
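The weight-map-based fusion described in the abstract can be illustrated with a minimal sketch. Note this is not the paper's method: it substitutes a simple well-exposedness weight (high where pixel intensity is near mid-gray) for the propagated-image-filter weights and gradient-domain postprocessing, and the function names, `sigma` parameter, and toy exposure stack are all illustrative assumptions.

```python
import numpy as np

def exposure_weights(img, sigma=0.2):
    # Well-exposedness weight: peaks where intensity is near 0.5.
    # (Hypothetical surrogate for the paper's propagated-filter weight map.)
    return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(images, eps=1e-12):
    # Per-pixel weighted average of the stack using normalized weight maps.
    stack = np.stack(images)              # shape (N, H, W), values in [0, 1]
    weights = exposure_weights(stack)     # one weight map per input image
    weights = weights / (weights.sum(axis=0) + eps)  # normalize over stack
    return (weights * stack).sum(axis=0)

# Toy grayscale stack: under- and over-exposed versions of a gradient image.
base = np.linspace(0.0, 1.0, 64).reshape(1, 64).repeat(64, axis=0)
under = np.clip(base * 0.4, 0.0, 1.0)   # dark exposure
over = np.clip(base * 1.6, 0.0, 1.0)    # bright exposure
fused = fuse_exposures([under, over])
```

Each output pixel favors whichever input image exposed that region best; a real implementation would compute the weight maps with the propagated image filter and refine them in the gradient domain, per the paper.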
Subjects
Computational photography | HDR imaging | Multi-exposure fusion | Propagated image filter
