Multi-exposure image fusion using propagated image filtering

Source
Advances in Intelligent Systems and Computing
ISSN
2194-5357
Date Issued
2017-01-01
Author(s)
Patel, Diptiben
Sonane, Bhoomika
Raman, Shanmuganathan  
DOI
10.1007/978-981-10-2104-6_39
Volume
459 AISC
Abstract
Image fusion is the process of combining multiple images of the same scene into a single high-quality image that contains more information than any of the input images. In this paper, we propose a new fusion approach in the spatial domain using the propagated image filter. The proposed approach calculates the weight map of every input image using the propagated image filter and gradient-domain post-processing. The propagated image filter exploits a cumulative weight construction approach for the filtering operation. We show that the proposed approach achieves state-of-the-art results for the problem of multi-exposure fusion on various indoor and outdoor natural static scenes with varying amounts of dynamic range.
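
The abstract describes fusion as a per-pixel weighted combination of the input exposures, with weight maps refined by an edge-aware filter. The Python sketch below illustrates only that general weighted-fusion scheme: the contrast/well-exposedness quality measure and the Gaussian smoothing used to refine the weights are placeholder assumptions, not the paper's propagated image filter or its gradient-domain post-processing.

import numpy as np
from scipy.ndimage import gaussian_filter

def quality_weights(img):
    # Illustrative per-pixel quality measure: local contrast times
    # well-exposedness. The paper instead derives weights with the
    # propagated image filter and gradient-domain post-processing.
    gray = img.mean(axis=2)
    contrast = np.abs(gray - gaussian_filter(gray, 1.0)) + 1e-6
    well_exposed = np.exp(-0.5 * ((gray - 0.5) / 0.2) ** 2)
    return contrast * well_exposed

def fuse_exposures(images, sigma=2.0):
    # Weighted-average multi-exposure fusion in the spatial domain.
    # `images`: list of HxWx3 float arrays in [0, 1] at different exposures.
    # A Gaussian filter stands in here for the edge-aware refinement of the
    # weight maps; it is an assumption for illustration only.
    weights = [gaussian_filter(quality_weights(im), sigma) for im in images]
    total = np.sum(weights, axis=0) + 1e-12
    fused = np.zeros_like(images[0])
    for im, w in zip(images, weights):
        fused += im * (w / total)[..., None]
    return np.clip(fused, 0.0, 1.0)

# Example: fuse three synthetic exposures of a simple gradient scene.
if __name__ == "__main__":
    base = np.linspace(0, 1, 64).reshape(1, 64, 1).repeat(64, axis=0).repeat(3, axis=2)
    exposures = [np.clip(base * g, 0, 1) for g in (0.5, 1.0, 2.0)]
    result = fuse_exposures(exposures)
    print(result.shape, float(result.min()), float(result.max()))
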
URI
https://d8.irins.org/handle/IITG2025/22558
Subjects
Computational photography | HDR imaging | Multi-exposure fusion | Propagated image filter