Please use this Handle URI to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/52186
Full metadata record
dc.contributor.advisor: 莊永裕
dc.contributor.author: Chih-Yang Chen (en)
dc.contributor.author: 陳智暘 (zh_TW)
dc.date.accessioned: 2021-06-15T16:09:12Z
dc.date.available: 2015-08-25
dc.date.copyright: 2015-08-25
dc.date.issued: 2015
dc.date.submitted: 2015-08-19
dc.identifier.citation:
[1] Berthold K. Horn and Brian G. Schunck. Determining optical flow. In 1981 Technical Symposium East, pages 319–331. International Society for Optics and Photonics, 1981.
[2] Gunnar Farnebäck. Two-frame motion estimation based on polynomial expansion. In Image Analysis, pages 363–370. Springer, 2003.
[3] Christopher Zach, Thomas Pock, and Horst Bischof. A duality based approach for realtime TV-L1 optical flow. In Pattern Recognition, pages 214–223. Springer, 2007.
[4] Michael Tao, Jiamin Bai, Pushmeet Kohli, and Sylvain Paris. SimpleFlow: A non-iterative, sublinear optical flow algorithm. In Computer Graphics Forum, volume 31, pages 345–353. Wiley Online Library, 2012.
[5] Daniel Glasner, Shai Bagon, and Michal Irani. Super-resolution from a single image. In Computer Vision, 2009 IEEE 12th International Conference on, pages 349–356. IEEE, 2009.
[6] William T. Freeman, Thouis R. Jones, and Egon C. Pasztor. Example-based super-resolution. Computer Graphics and Applications, IEEE, 22(2):56–65, 2002.
[7] Radu Timofte, Vincent De Smet, and Luc Van Gool. Anchored neighborhood regression for fast example-based super-resolution. In Computer Vision (ICCV), 2013 IEEE International Conference on, pages 1920–1927. IEEE, 2013.
[8] Chao Dong, Chen Change Loy, Kaiming He, and Xiaoou Tang. Learning a deep convolutional network for image super-resolution. In Computer Vision–ECCV 2014, pages 184–199. Springer, 2014.
[9] Michal Irani and Shmuel Peleg. Improving resolution by image registration. CVGIP: Graphical Models and Image Processing, 53(3):231–239, 1991.
[10] Sina Farsiu, M. Dirk Robinson, Michael Elad, and Peyman Milanfar. Fast and robust multiframe super resolution. Image Processing, IEEE Transactions on, 13(10):1327–1344, 2004.
[11] Dennis Mitzel, Thomas Pock, Thomas Schoenemann, and Daniel Cremers. Video super resolution using duality based TV-L1 optical flow. In Pattern Recognition, pages 432–441. Springer, 2009.
[12] Haichao Zhang and Lawrence Carin. Multi-shot imaging: Joint alignment, deblurring, and resolution-enhancement. In Computer Vision and Pattern Recognition (CVPR), 2014 IEEE Conference on, pages 2925–2932. IEEE, 2014.
[13] Jianchao Yang, John Wright, Thomas S. Huang, and Yi Ma. Image super-resolution via sparse representation. Image Processing, IEEE Transactions on, 19(11):2861–2873, 2010.
[14] Samuel Schulter, Christian Leistner, and Horst Bischof. Fast and accurate image upscaling with super-resolution forests. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 3791–3799, 2015.
[15] A. Ardeshir Goshtasby. 2-D and 3-D Image Registration: for Medical, Remote Sensing, and Industrial Applications. John Wiley & Sons, 2005.
[16] Matthew Brown, Richard Szeliski, and Simon Winder. Multi-image matching using multi-scale oriented patches. In Computer Vision and Pattern Recognition, 2005. CVPR 2005. IEEE Computer Society Conference on, volume 1, pages 510–517. IEEE, 2005.
[17] B. Srinivasa Reddy and Biswanath N. Chatterji. An FFT-based technique for translation, rotation, and scale-invariant image registration. IEEE Transactions on Image Processing, 5(8):1266–1271, 1996.
[18] Sung Cheol Park, Min Kyu Park, and Moon Gi Kang. Super-resolution image reconstruction: a technical overview. Signal Processing Magazine, IEEE, 20(3):21–36, 2003.
[19] Hanoch Ur and Daniel Gross. Improved resolution from subpixel shifted pictures. CVGIP: Graphical Models and Image Processing, 54(2):181–186, 1992.
[20] R. Y. Tsai and Thomas S. Huang. Multiframe image restoration and registration. Advances in Computer Vision and Image Processing, 1(2):317–339, 1984.
[21] Aggelos K. Katsaggelos. Digital Image Restoration. Springer Publishing Company, Incorporated, 2012.
[22] Antonio Marquina and Stanley J. Osher. Image super-resolution by TV-regularization and Bregman iteration. Journal of Scientific Computing, 37(3):367–382, 2008.
[23] WenYi Zhao and Harpreet S. Sawhney. Is super-resolution with optical flow feasible? In Computer Vision–ECCV 2002, pages 599–613. Springer, 2002.
[24] A. V. Kanaev and C. W. Miller. Multi-frame super-resolution algorithm for complex motion patterns. Optics Express, 21(17):19850–19866, 2013.
[25] Felix Heide, Markus Steinberger, Yun-Ta Tsai, Mushfiqur Rouf, Dawid Pająk, Dikpal Reddy, Orazio Gallo, Jing Liu, Wolfgang Heidrich, Karen Egiazarian, et al. FlexISP: A flexible camera image processing framework. ACM Transactions on Graphics (TOG), 33(6):231, 2014.
[26] Simon Baker, Daniel Scharstein, J. P. Lewis, Stefan Roth, Michael J. Black, and Richard Szeliski. A database and evaluation methodology for optical flow. International Journal of Computer Vision, 92(1):1–31, 2011.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/52186
dc.description.abstract: 從一連串觀測到的低解析度影像,合成一張高解析度影像的演算法,稱之為多張影像超解析度。而重建式多張影像超解析度演算法大致上可分為兩個步驟:低解析度影像之間的對齊與高解析度影像的重建。
在本篇論文中,基於不同張低解析度影像間光強度一致性的假設,嘗試多種光流法來對齊影像。並且基於來回光流的一致性假設,計算光流的可信度,將其帶入高解析度影像的重建中,以減少對齊誤差對重建結果的影響。然而在「分辨力測試卡」這樣的測試資料中,會因為相機在拍攝過於高頻的圖樣時所產生的錯誤成像,違背光強度一致性的假設,進而導致光流法對齊失敗。所以我們提出在使用光流法對齊前,將圖片先進行模糊處理,使得光流法不受此種錯誤成像的影響。
另外,由於現今照片的解析度越來越高,重建高解析度影像需要龐大的記憶體與時間。本篇論文提出將重建分成多個可平行處理的資料塊,以減少記憶體用量,並且在硬體方面嘗試使用多執行緒與圖形處理器加速。重建演算法方面則是提出使用最近鄰居重建法與線性重建法的結合,進而達到加速的效果。
透過本篇論文提出的方法,能使將光流法運用於重建式多張影像超解析度之方法更為可靠,並減少重建的時間與記憶體使用量。 (zh_TW)
dc.description.abstract: The task of synthesizing a high-resolution (HR) image from multiple observed low-resolution (LR) images is called multi-frame super-resolution (SR). Reconstruction-based SR essentially has two stages: registration of the LR images and reconstruction of the HR image.
In this thesis, we rely on the assumption of intensity consistency and evaluate several optical flow methods for registration. Based on a second assumption, forward-backward flow consistency, we compute a confidence for each flow field and incorporate it into the HR reconstruction to reduce the error caused by mis-registration. However, on test sets such as the resolution chart, patterns of excessively high spatial frequency produce imaging artifacts that violate the intensity-consistency assumption and cause optical flow registration to fail. We therefore propose blurring the images before computing the flow, which prevents the optical flow from being misled by these artifacts.
In addition, because image resolutions keep increasing, reconstructing the HR image requires an enormous amount of memory and time. This thesis proposes dividing the reconstruction into multiple data blocks that can be processed in parallel, reducing memory usage and time, and investigates multi-thread and GPU speed-ups. On the algorithmic side, we propose combining nearest-neighbor (NN) reconstruction with linear reconstruction to achieve further acceleration.
With the proposed methods, applying optical flow to multi-frame reconstruction-based SR becomes more robust, and the reconstruction time and peak memory usage are reduced. (en)
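As a rough illustration of the registration-side ideas in the abstract (pre-blurring the frames before flow estimation, and turning forward-backward flow consistency into a per-pixel confidence), the following Python sketch uses OpenCV's Farnebäck flow as a stand-in for whichever flow method is actually chosen. The function name, the Gaussian sigma, the Farnebäck parameters, and the Gaussian error-to-confidence mapping with scale tau are illustrative assumptions, not values taken from the thesis.

    import cv2
    import numpy as np

    def flow_confidence(img0, img1, pre_blur_sigma=1.0, tau=1.0):
        """Illustrative sketch, not the thesis implementation.

        img0, img1: single-channel uint8 frames.
        Returns a per-pixel confidence in (0, 1] from forward-backward flow
        consistency; pre-blurring is meant to suppress the aliasing caused by
        very high-frequency patterns such as a resolution chart.
        """
        # Blur before estimating flow (assumed, tunable sigma).
        g0 = cv2.GaussianBlur(img0, (0, 0), pre_blur_sigma)
        g1 = cv2.GaussianBlur(img1, (0, 0), pre_blur_sigma)

        # Dense Farnebäck flow, forward (0 -> 1) and backward (1 -> 0).
        fw = cv2.calcOpticalFlowFarneback(g0, g1, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        bw = cv2.calcOpticalFlowFarneback(g1, g0, None, 0.5, 3, 15, 3, 5, 1.2, 0)

        h, w = img0.shape[:2]
        xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                             np.arange(h, dtype=np.float32))
        # Sample the backward flow at the positions the forward flow points to.
        bw_at_fw = cv2.remap(bw, xs + fw[..., 0], ys + fw[..., 1],
                             cv2.INTER_LINEAR)

        # Round-trip error is small where the two flows agree.
        err = np.linalg.norm(fw + bw_at_fw, axis=2)
        # Map error to confidence (assumed Gaussian falloff with scale tau).
        return np.exp(-(err / tau) ** 2)

In a reconstruction step, such a map would typically down-weight a frame's contribution at pixels where its forward and backward flows disagree, so mis-registered regions contribute less to the HR estimate.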
dc.description.provenance: Made available in DSpace on 2021-06-15T16:09:12Z (GMT). No. of bitstreams: 1
ntu-104-R02922132-1.pdf: 9426512 bytes, checksum: 983dc2419dd43f629d30a2819eeb6863 (MD5)
Previous issue date: 2015 (en)
dc.description.tableofcontents:
1 Introduction 1
1.1 Background 1
1.2 Observation Model 2
1.3 Overview 2
2 Related Work 5
2.1 Single-Frame SR 5
2.2 Multi-Frame SR 6
2.2.1 Registration 6
2.2.2 Reconstruction 6
2.3 Optical Flow in Multi-Frame SR 7
3 Flow-Based Registrations 9
3.1 Horn and Schunck [1] 9
3.2 Farneback [2] 10
3.3 TVL1 Optical Flow [3] 10
3.4 Simple Flow [4] 11
3.5 Flow Comparison 11
3.6 SR Comparison 13
3.6.1 Reconstruction 13
3.6.2 Flow Confidence in Reconstruction 14
3.7 Violation of Intensity Consistency 15
3.8 Summary 15
4 Reconstruction Speed-Up 20
4.1 Divide the Data 21
4.2 Multi-thread 21
4.3 GPU 21
4.4 Hybrid Method: NN + L2 Reconstruction 22
5 Result 24
5.1 TVL1 Optical Flow Multi-Frame SR 24
5.2 Reconstruction Speed-Up 25
5.2.1 Hardware Parallel Result 25
5.2.2 NN + L2 Reconstruction Hybrid Method Result 26
6 Conclusion 30
Bibliography 32
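Chapter 4 above and the abstract describe splitting the HR reconstruction into independently processed data blocks to bound memory and allow multi-thread or GPU execution. The Python sketch below is only a hypothetical driver for that divide-and-parallelize idea: reconstruct_tile, the tile size, and the padding are placeholders for whatever per-tile solver (e.g. the NN + L2 hybrid) is actually used, and a thread pool stands in for the thesis's multi-thread/GPU back ends.

    from concurrent.futures import ThreadPoolExecutor
    import numpy as np

    def reconstruct_tiled(lr_frames, flows, confidences, scale,
                          reconstruct_tile, tile=256, pad=8):
        """Hypothetical sketch: build the HR image tile by tile so that peak
        memory stays bounded and tiles can run in parallel.
        reconstruct_tile(lr_frames, flows, confidences, window, scale) is a
        placeholder for the actual per-tile SR solver."""
        h, w = lr_frames[0].shape[:2]
        H, W = h * scale, w * scale
        out = np.zeros((H, W), dtype=np.float32)

        def job(y0, x0):
            y1, x1 = min(y0 + tile, H), min(x0 + tile, W)
            # Padded window so neighboring tiles overlap and seams are avoided.
            py0, px0 = max(y0 - pad, 0), max(x0 - pad, 0)
            py1, px1 = min(y1 + pad, H), min(x1 + pad, W)
            block = reconstruct_tile(lr_frames, flows, confidences,
                                     (py0, py1, px0, px1), scale)
            # Keep only the unpadded core of the reconstructed block.
            out[y0:y1, x0:x1] = block[y0 - py0:y1 - py0, x0 - px0:x1 - px0]

        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(job, y, x)
                       for y in range(0, H, tile)
                       for x in range(0, W, tile)]
            for f in futures:
                f.result()  # re-raise any per-tile exception
        return out

Because each tile only touches its own padded window of the flows and LR frames, the working set per worker stays small regardless of the full HR resolution, which is the point of dividing the data.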
dc.language.iso: en
dc.subject: 可靠方法 (zh_TW)
dc.subject: 超解析度 (zh_TW)
dc.subject: 加速 (zh_TW)
dc.subject: 平行 (zh_TW)
dc.subject: 光流法 (zh_TW)
dc.subject: 可信度 (zh_TW)
dc.subject: acceleration (en)
dc.subject: robust method (en)
dc.subject: confidence (en)
dc.subject: optical flow (en)
dc.subject: parallel (en)
dc.subject: super resolution (en)
dc.title: 將光流法應用於重建式多張影像超解析之可靠方法 (zh_TW)
dc.title: A Robust Reconstruction-Based Multi-Frame Super-Resolution Method using Optical Flow (en)
dc.type: Thesis
dc.date.schoolyear: 103-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 吳賦哲, 葉正聖
dc.subject.keyword: 超解析度, 加速, 平行, 光流法, 可信度, 可靠方法 (zh_TW)
dc.subject.keyword: super resolution, acceleration, parallel, optical flow, confidence, robust method (en)
dc.relation.page: 34
dc.rights.note: 有償授權
dc.date.accepted: 2015-08-19
dc.contributor.author-college: 電機資訊學院 (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (zh_TW)
Appears in collections: 資訊工程學系

Files in this item:
File | Size | Format
ntu-104-1.pdf (not authorized for public access) | 9.21 MB | Adobe PDF