Please use this identifier to cite or link to this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/65727
Title: | Human Pose Estimation Using Depth Map and Particle Swarm Optimization |
Authors: | Chih-Chun Yang (楊智鈞) |
Advisor: | Shyh-Kang Jeng (鄭士康) |
Keyword: | Human Pose Estimation, Particle Swarm Optimization, Depth Sensor, GPU Programming |
Publication Year: | 2012 |
Degree: | Master's |
Abstract: | In this thesis, we propose a human pose estimation algorithm and implement it on the CUDA platform. The proposed algorithm needs only a single-view depth image as input, unlike earlier works that require color images or multi-view images. It has the following features: first, a 32-degree-of-freedom body model composed of two elliptic cylinders and nine ellipsoids is adopted to formulate an optimization problem; second, a modified particle swarm optimization (PSO) scheme is applied to solve that problem; finally, the algorithm is highly parallel and is therefore well suited to implementation on the CUDA platform for real-time performance. We use a Microsoft Kinect as the depth sensor and an NVIDIA GTS450 as the computing device. Experimental results show that the proposed algorithm is robust enough to overcome self-occlusion, a common difficulty in this area, and that with the aid of the GPU it runs in real time (12-33 fps). |
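The thesis's modified, GPU-parallel PSO scheme is not reproduced here; as a point of reference only, the following is a minimal sketch of standard particle swarm optimization on a toy objective (the sphere function). All names, parameter values, and the objective itself are illustrative assumptions, not the author's method.

```python
import random

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimize `objective` over `dim` dimensions with standard PSO."""
    lo, hi = bounds
    # Random initial positions; zero initial velocities.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    # Each particle's personal best, and the swarm's global best.
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Position update, clamped to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    # Toy objective: sphere function, minimum 0 at the origin.
    best, best_val = pso(lambda x: sum(v * v for v in x), dim=5)
    print(best_val)
```

In the thesis the decision variables would instead be the 32 pose parameters of the body model and the objective a depth-map fitness term, with the per-particle evaluations mapped onto CUDA threads; the inner loop above is what makes the method embarrassingly parallel.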
URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/65727 |
Fulltext Rights: | Paid authorization (有償授權) |
Appears in Collections: | Graduate Institute of Communication Engineering |
Files in This Item:
File | Size | Format
---|---|---
ntu-101-1.pdf (Restricted Access) | 2.3 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.