Please use this identifier to cite or link to this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/56754
Title: A Prioritized Gauss-Seidel Method for Dense Correspondence Estimation and Motion Segmentation in Crowded Urban Areas with a Moving Depth Camera
Authors: Yi Chiang (江懿)
Advisor: Chieh-Chih Wang (王傑智)
Keyword: correspondence, segmentation, depth, camera, Gauss-Seidel
Publication Year: 2014
Degree: Master's
Abstract: Dense RGB-D video motion segmentation is an important preprocessing module in computer vision, image processing, and robotics. This thesis presents a motion segmentation algorithm built on an optimization framework that uses depth information only and does not depend on color. The framework segments locally rigid moving objects with coherent motion from the background and from one another, and estimates the rigid motion parameters of each object. The method also computes dense point correspondences while performing segmentation. An efficient numerical algorithm based on the Constrained Block Nonlinear Gauss-Seidel (CNLGS) method [1] and Prioritized Step Search [2] is proposed to solve the optimization problem: it classifies the variables, including point correspondences, into groups and determines the order in which they are optimized. The numerical algorithm is proven to converge to a theoretical bound. The proposed algorithm performs well with a moving camera in highly dynamic urban scenes filled with non-rigid moving objects.
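The full text is restricted, so the thesis's exact formulation is not reproduced here. As a rough illustration of the idea the abstract describes, the sketch below shows generic block nonlinear Gauss-Seidel with a prioritized block ordering: variables are partitioned into blocks, and each sweep minimizes the objective over one block at a time while the others stay fixed, visiting blocks in a priority order. The objective, block partition, and priority rule are hypothetical stand-ins, not the thesis's CNLGS implementation.

```python
# Minimal sketch of block nonlinear Gauss-Seidel with a prioritized block
# ordering. NOT the thesis's CNLGS code; objective, blocks, and priority
# rule are hypothetical stand-ins.
import numpy as np
from scipy.optimize import minimize

def prioritized_block_gauss_seidel(objective, blocks, x0, priority,
                                   n_outer=20, tol=1e-8):
    """Minimize objective(x) one variable block at a time.

    objective: f(x) -> float over the full variable vector
    blocks:    list of integer index arrays partitioning the variables
    priority:  (x, blocks) -> iterable of block indices, visited in order
    """
    x = x0.astype(float)
    prev = objective(x)
    for _ in range(n_outer):
        for b in priority(x, blocks):
            idx = blocks[b]

            def sub(xb):
                y = x.copy()
                y[idx] = xb
                return objective(y)

            # Solve the subproblem over this block; all other blocks fixed.
            x[idx] = minimize(sub, x[idx], method="Nelder-Mead").x
        cur = objective(x)
        if prev - cur < tol:  # each sweep is non-increasing; stop on stall
            return x
        prev = cur
    return x

# Toy usage: quadratic with targets t = [0, 1, 2, 3], two blocks,
# visiting the block with the larger residual norm first.
t = np.arange(4.0)
f = lambda x: float(np.sum((x - t) ** 2))
blocks = [np.array([0, 1]), np.array([2, 3])]
order = lambda x, bs: sorted(range(len(bs)),
                             key=lambda b: -np.linalg.norm(x[bs[b]] - t[bs[b]]))
print(prioritized_block_gauss_seidel(f, blocks, np.zeros(4), order))
# -> approximately [0. 1. 2. 3.]
```

Per the abstract, the thesis's blocks would correspond to groups such as motion parameters and point correspondences; convergence arguments for such sweeps typically rest on each block subproblem leaving the objective non-increasing.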
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/56754
Fulltext Rights: Paid authorization
Appears in Collections: Department of Computer Science and Information Engineering
Files in This Item:
File | Size | Format
---|---|---
ntu-103-1.pdf (Restricted Access) | 26.13 MB | Adobe PDF