Please use this identifier to cite or link to this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63508
Title: | 基於手部形狀特徵之物件推測 Object Inference via Hand Shape Cues |
Authors: | Li Wang 王立 |
Advisor: | Chieh-Chih Wang (王傑智) |
Keywords: | Object Recognition, Point Clouds, FPFH (Fast Point Feature Histograms), Hand Shape Recognition, Machine Learning |
Publication Year: | 2012 |
Degree: | Master's |
Abstract: | Recognizing the tools or objects that human subjects are using while performing a task is a key interest in the field of object recognition. For smaller objects such as stationery or other desk items, however, recognition is challenging because these objects can be mostly or fully occluded by the user's hand during manipulation. In this thesis, we show that when an object is occluded during a hand-object interaction and cannot be recognized directly, its category can instead be inferred from the shape of the occluding hand. The method uniformly samples points from the captured hand surface, computes a Fast Point Feature Histogram (FPFH) for each sampled point, and applies Support Vector Machine (SVM) based training and testing to determine which points are highly indicative of a particular object class; the per-point results are then combined through a scoring scheme into a final object hypothesis. Experiments on a seven-class, 3750-frame dataset showed a recognition accuracy of 93.61% with the proposed framework. |
URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63508 |
Fulltext Rights: | Paid authorization (有償授權) |
Appears in Collections: | Graduate Institute of Networking and Multimedia |
Files in This Item:
File | Size | Format
---|---|---
ntu-101-1.pdf (Restricted Access) | 13.95 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.