DSpace JSPUI

DSpace preserves and enables easy and open access to all types of digital content including text, images, moving images, mpegs and data sets

NTU Theses and Dissertations Repository › College of Electrical Engineering and Computer Science › Graduate Institute of Electronics Engineering
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/21485
Title: 基於位移導向之立體影像視角合成應用於單一或多視角彩色深度相機
Displacement-oriented View Synthesis for Single/Multiple RGBD Cameras
Authors: Yu-Sheng Hsu
徐佑昇
Advisor: 陳良基 (Liang-Gee Chen)
Keyword: image processing, camera, image, view synthesis, multiview, depth, displacement, RGBD
Publication Year: 2019
Degree: Master
Abstract: With the rapid improvement of technology, RGB-D cameras have become more and more popular. Depth maps play an important role in many 3D applications and human-computer interaction systems. Producing a high-quality synthesized view poses several technical challenges, and generating a comfortable novel view has become the bottleneck for current research.
Multimedia applications such as 3DTV and virtual reality give viewers a 3D experience by presenting videos from different viewpoints to each eye. Building rich 3D maps of environments is also an important task for mobile robotics, with applications in navigation, manipulation, semantic mapping, and telepresence. However, frame-to-frame alignment of 3D point clouds and dense 3D reconstruction demand high bandwidth, memory, and computation because of costly iterative operations, so operating on the original point cloud is too expensive for real-time implementation. View synthesis, which is essentially a projection of 3D information onto 2D, offers an efficient alternative. The most popular view synthesis system, adopted by the ITU/MPEG standards, uses depth-image-based rendering (DIBR) techniques.
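The DIBR idea mentioned above can be sketched for a simple rectified setup. The snippet below is a minimal illustration, not the thesis implementation: it forward-warps a reference view by a horizontal disparity derived from depth, resolves collisions with a z-buffer (the nearer pixel wins), and reports the untouched target pixels, which correspond to the cracks and disocclusions the abstract discusses. The camera parameters (`fx`, `baseline`) are hypothetical.

```python
import numpy as np

def forward_warp(color, depth, fx, baseline):
    """Forward-warp a reference view to a horizontally shifted virtual view.

    For a rectified setup, DIBR reduces to a horizontal disparity
    d = fx * baseline / Z. Conflicts (two reference pixels landing on the
    same target pixel) are resolved by z-buffering: the nearer pixel wins.
    Target pixels no reference pixel reaches are holes (disocclusions/cracks).
    """
    h, w = depth.shape
    virt = np.zeros_like(color)
    zbuf = np.full((h, w), np.inf)          # depth of current winner per pixel
    disparity = fx * baseline / depth       # pixels to shift, per source pixel
    for y in range(h):
        for x in range(w):
            tx = int(round(x - disparity[y, x]))
            if 0 <= tx < w and depth[y, x] < zbuf[y, tx]:
                zbuf[y, tx] = depth[y, x]
                virt[y, tx] = color[y, x]
    return virt, np.isinf(zbuf)             # warped image and hole mask
```

With constant depth the warp degenerates to a uniform shift, and the hole mask marks the newly exposed border column, a toy version of a disocclusion.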
In this thesis, we tackle the artifacts, pinholes, and disocclusions of RGB-D multiview images that arise when synthesizing new views of a scene by changing its viewpoint. We first examine where ghost contours come from, why the disocclusion region that cannot be seen in the original view becomes exposed in the virtual view, and why pinholes and cracks appear in the derived frame for surfaces whose normals have rotated toward the viewer.
We transform 2D image information into 3D and establish the connection between the reference view and the novel view. We analyze 3D warping techniques: background erosion is proposed to remove wrongly warped boundaries, while forward warping writes a single derived pixel for each warped reference pixel. We then define a displacement vector as the "movement" of corresponding feature points between views, which guides backward warping. Furthermore, we point out that the disocclusion is the area spanned by the displacement difference between foreground and background edges, and we combine a conventional inpainting technique with our growing guidance to improve hole filling. Large holes are hard to fill with acceptable subjective quality; this can be relieved by using multiple views. Besides z-buffering, we also propose view weighting based on the distance between the reference and novel views, as well as a winner-take-all method. To generate free viewpoints, we exploit quaternion rotation for inter-view interpolation and analyze quality versus viewpoint. Finally, our proposed methods provide high-quality, comfortable virtual view synthesis.
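As a rough illustration of the last two ideas, distance-based view weighting and quaternion-based inter-view interpolation, the sketch below (not taken from the thesis; all names are illustrative) blends two reference views inversely to their distance from the novel viewpoint and interpolates camera orientation with quaternion slerp.

```python
import numpy as np

def view_weights(d0, d1):
    """Blend weights for two reference views, inversely proportional to
    their distance from the novel viewpoint (closer view weighs more)."""
    return d1 / (d0 + d1), d0 / (d0 + d1)

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions.

    Interpolates camera orientation between two reference views when
    generating a free viewpoint (t in [0, 1]). Quaternions are (w, x, y, z)
    NumPy arrays of unit norm.
    """
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:              # nearly parallel: fall back to lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * q0 + s1 * q1
```

Halfway between the identity orientation and a 90-degree rotation, slerp yields a 45-degree rotation, which is the behavior that makes quaternion interpolation suitable for smooth free-viewpoint trajectories.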
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/21485
DOI: 10.6342/NTU201902139
Fulltext Rights: Not authorized
Appears in Collections: Graduate Institute of Electronics Engineering

Files in This Item:
File: ntu-108-1.pdf (Restricted Access), 25.17 MB, Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
