Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/100921

| Title: | Synthesis Multi-Exposure Fusion Dataset: Generating Dynamic Training Sets from Static Datasets |
| Authors: | 蘇浚笙 CHUN-SHENG SU |
| Advisor: | 莊永裕 Yung-Yu Chuang |
| Keywords: | Multi-exposure image fusion, dataset, ghosting |
| Publication Year: | 2025 |
| Degree: | Master's |
| Abstract: | This work addresses the scarcity of dynamic multi-exposure fusion (MEF) datasets by proposing a method that transforms existing static datasets into dynamic versions suitable for training models capable of de-ghosting. We develop an algorithm that analyzes static multi-exposure image sequences and automatically identifies candidate regions for synthesizing dynamic components. We conduct extensive experiments, including comparisons across different algorithmic settings and training strategies, to determine the configuration that most consistently improves model performance. Furthermore, we perform cross-dataset experiments to demonstrate the generalizability of the proposed method across data sources. Experimental results show that models trained on the transformed static datasets achieve performance comparable to, or even better than, that of models trained on real dynamic datasets across multiple evaluation metrics, confirming the practicality and robustness of our approach. |
| URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/100921 |
| DOI: | 10.6342/NTU202504663 |
| Fulltext Rights: | Authorized (open access worldwide) |
| Embargo Lift Date: | 2025-11-27 |
| Appears in Collections: | Graduate Institute of Networking and Multimedia |
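
Only the abstract is available in this record; the full method is in the PDF below. As a rough illustration of the idea the abstract describes (turning a static multi-exposure stack into a dynamic training sample while the untouched static stack still supplies a ghost-free fusion target), here is a minimal sketch. The function name `synthesize_dynamic_exposure`, the patch-translation motion model, and all parameters are hypothetical stand-ins, not the thesis algorithm, which analyzes the stack to select suitable regions rather than sampling them at random.

```python
import numpy as np

def synthesize_dynamic_exposure(stack, patch_size=64, seed=0):
    """stack: list of HxWx3 float arrays of the same static scene at
    different exposures. Returns a copy of the stack in which one
    non-reference exposure has a patch pasted at a new location,
    mimicking an object that moved between shots. The original static
    stack remains available as a ghost-free fusion target."""
    rng = np.random.default_rng(seed)
    dynamic = [img.copy() for img in stack]
    k = int(rng.integers(1, len(stack)))   # keep exposure 0 as the reference
    h, w = stack[k].shape[:2]
    # Source corner of the simulated "moving object" patch.
    y, x = (int(rng.integers(0, d - patch_size + 1)) for d in (h, w))
    # Destination corner; resample until it actually differs from the source.
    ty, tx = y, x
    while (ty, tx) == (y, x):
        ty, tx = (int(rng.integers(0, d - patch_size + 1)) for d in (h, w))
    patch = stack[k][y:y + patch_size, x:x + patch_size]
    dynamic[k][ty:ty + patch_size, tx:tx + patch_size] = patch
    return dynamic

if __name__ == "__main__":
    static_stack = [np.random.rand(128, 128, 3) for _ in range(3)]  # stand-ins
    dynamic_stack = synthesize_dynamic_exposure(static_stack)
    changed = sum(not np.array_equal(a, b)
                  for a, b in zip(static_stack, dynamic_stack))
    print(f"exposures altered: {changed}")  # 1: only one exposure was modified
```
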
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-114-1.pdf | 13.24 MB | Adobe PDF |
