Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/6587
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 張璞曾 | |
dc.contributor.author | Wei-Chih Chen | en |
dc.contributor.author | 陳威誌 | zh_TW |
dc.date.accessioned | 2021-05-17T09:14:58Z | - |
dc.date.available | 2014-08-22 | |
dc.date.available | 2021-05-17T09:14:58Z | - |
dc.date.copyright | 2012-08-22 | |
dc.date.issued | 2012 | |
dc.date.submitted | 2012-08-13 | |
dc.identifier.citation | [1] Thomas J. Lord, Yu Li, Diana M. Keefe, Nikolay Stoykov, Derek Kamper, “Development of a haptic keypad for training finger individuation after stroke,” 2011 International Conference on Virtual Rehabilitation (ICVR), Zurich, Switzerland, June 2011, pp. 1-2.
[2] Sriram Sanka, Prashanth G. Reddy, Amber Alt, Ann Reinthal, Nigamanth Sridhar, “Utilization of a wrist-mounted accelerometer to count movement repetitions,” 2012 Fourth International Conference on Communication Systems and Networks (COMSNETS), Bangalore, India, January 2012, pp. 1-6.
[3] 蔡爭岳, “以類神經網路及3D資料建立靜態台灣手勢辨識系統,” Ph.D. dissertation, Dept. of Industrial Management, National Taiwan University of Science and Technology, 2007.
[4] sugizo, “身體就是感應器,微軟Kinect是怎麼做到的?” April 2012. [Online]. Available: http://www.techbang.com/posts/2936-get-to-know-how-it-works-kinect
[5] Jgospel, “微軟官方揭密Kinect工作原理,” June 2012. [Online]. Available: http://jgospel.net/news/tech/微軟官方揭秘kinect工作原理._gc28095.aspx?location=TW
[6] Heresy, “OpenNI / Kinect 相關文章目錄,” April 2012. [Online]. Available: http://kheresy.wordpress.com/index_of_openni_and_kinect/
[7] OpenNI, “OpenNI User Guide,” April 2011. [Online]. Available: http://www.openni.org/documentation
[8] PrimeSense, “PrimeSensor NITE Web Site,” April 2012. [Online]. Available: http://www.primesense.com/?p=515
[9] Wikipedia, “OpenCV,” April 2012. [Online]. Available: http://en.wikipedia.org/wiki/OpenCV
[10] Wikipedia, “OpenGL,” May 2012. [Online]. Available: http://en.wikipedia.org/wiki/OpenGL
[11] Wikipedia, “Kinect,” April 2012. [Online]. Available: http://en.wikipedia.org/wiki/Kinect
[12] 宏遠儀器, “Leica DISTO D3a BT,” June 2012. [Online]. Available: http://www.control-signal.com.tw/product.php
[13] J. Segen, S. Kumar, “Human-Computer Interaction Using Gesture Recognition and 3D Hand Tracking,” 1998 International Conference on Image Processing (ICIP), Chicago, USA, October 1998, vol. 3, pp. 188-192.
[14] Stephan Rogge, Philipp Amtsfeld, Christian Hentschel, Dieter Bestle, Marcus Meyer, “Using gestures to interactively modify turbine blades in a virtual environment,” 2012 IEEE International Conference on Emerging Signal Processing Applications (ESPA), Las Vegas, USA, January 2012, pp. 155-158.
[15] K. K. Biswas, Saurav Kumar Basu, “Gesture recognition using Microsoft Kinect,” 2011 5th International Conference on Automation, Robotics and Applications (ICARA), Wellington, New Zealand, December 2011, pp. 100-103.
[16] Jagdish L. Raheja, Ankit Chaudhary, Kunal Singal, “Tracking of fingertips and centres of palm using Kinect,” 2011 Third International Conference on Computational Intelligence, Modelling and Simulation (CIMSiM), Langkawi, Malaysia, September 2011, pp. 248-252.
[17] Nadia Hocine, Abdelkader Gouaïch, “Therapeutic games’ difficulty adaptation: an approach based on player’s ability and motivation,” 2011 16th International Conference on Computer Games (CGAMES), Kentucky, USA, July 2011, pp. 257-261.
[18] Gwyn N. Lewis, Claire Woods, Juliet A. Rosie, Kathryn M. McPherson, “Virtual reality games for rehabilitation: perspectives from the users and new directions,” 2011 International Conference on Virtual Rehabilitation (ICVR), Zurich, Switzerland, June 2011, pp. 1-2.
[19] Eduardo Souza Santos, Edgard A. Lamounier, Alexandre Cardoso, “Interaction in augmented reality environments using Kinect,” 2011 XIII Symposium on Virtual Reality (SVR), Uberlandia, Brazil, May 2011, pp. 112-121.
[20] 何正宇, 王志龍, 盧玉強, 孫淑芬, 張照宏, 蔡欣宜, “以Wii建構虛擬實境輔助慢性中風患者復健訓練之療效評估,” 台灣復健醫誌, vol. 38, no. 1, pp. 11-18, 2010.
[21] Satoshi Ito, Haruhisa Kawasaki, Yasuhiko Ishigure, Masatoshi Natsume, Tetsuya Mouri, Yutaka Nishimoto, “A design of fine motion assist equipment for disabled hand in robotic rehabilitation system,” Journal of The Franklin Institute, vol. 348, pp. 79-89, February 2011.
[22] Sergei V. Adamovich, Gerard G. Fluet, Abraham Mathai, Qinyin Qiu, Jefferey Lewis, Alma S. Merians, “Design of a complex virtual reality simulation to train finger motion for persons with hemiparesis: a proof of concept study,” Journal of NeuroEngineering and Rehabilitation, vol. 6, no. 28, July 2009.
[23] OpenGL, “OpenGL Web Site,” April 2012. [Online]. Available: http://www.opengl.org/
[24] Microsoft, “Microsoft Kinect Web Site,” April 2012. [Online]. Available: http://www.xbox.com/en-US/kinect
[25] OpenKinect, “OpenKinect Community Site,” April 2012. [Online]. Available: http://openkinect.org/
[26] 黃志偉, “高速公路肇事處理時間預測之研究-應用類神經網路分析,” Master’s thesis, Graduate Institute of Civil Engineering, National Central University, 2002.
[27] Valentino Frati, Domenico Prattichizzo, “Using Kinect for hand tracking and rendering in wearable haptics,” 2011 IEEE World Haptics Conference (WHC), Istanbul, Turkey, June 2011, pp. 1054-1061.
[28] Kourosh Khoshelham, Sander Oude Elberink, “Accuracy and resolution of Kinect depth data for indoor mapping applications,” Sensors, vol. 12, no. 2, pp. 1437-1454, February 2012. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/6587 | - |
dc.description.abstract | 本論文是建構在微軟公司推出的Kinect體感應器,利用此感應器Light Coding技術產生的深度資訊,來擷取真實空間中手指指尖的空間座標,並評估利用Kinect感應器在判斷手指活動上的可行性。在Kinect開發上,在此是利用OpenNI來進行感應器相關資訊的擷取;手指指尖偵測上,利用k-Curvature演算法來找出指尖位置;在Kinect空間座標系的驗證上,本論文主要在Z軸深度資訊、X軸與Y軸方向長度距離。手指指尖偵測部分則是在其偵測的穩定度分析,利用平均絕對值誤差率(Mean Absolute Percentage Error , MAPE)與均方誤差(Mean Squared Error, MSE)為評估工具。最後則是界定樣本假手手指彎曲量測的最大範圍。
在Kinect感應器的空間座標驗證上,本實驗的量測距離裡(50-130cm),深度距離(Z軸)誤差值會隨著距離的增加而成正比,但其深度平均誤差率均在1%以內。水平及垂直距離的驗證上,本實驗發現在某些特定的深度距離內(80-110cm),其誤差可以控制在可以接受的範圍內(5%)。手指指尖偵測演算法的穩定度分析上,除了中指指尖部分所量測之X座標以及Y座標的MAPE有超過10以外,其他均小於10;雖然中指指尖部分較不穩定但其MAPE也都小於50,所以都是合理的範圍內。在手指指尖偵測中,一般而言,每個手指的空間座標與平均值的平均誤差距離會隨著深度距離增加而提高。受限於光學上的限制,手指指尖偵測的模式在本實驗中所能偵測到的最大手指彎曲角度,約在30-45度左右。
根據以上的數據結果顯示,在本實驗中,要利用Kinect感應器來做為手指活動的擷取系統是可行的。雖然受限於硬體、以及光學原理,受試者手部位置及活動範圍必須被嚴格限制,但對於手部復健已有一定恢復程度的患者而言,藉由Kinect再加上適當的訓練模式,能使病患自行在居住地方進行密集且有趣的復健訓練,不需親自前往醫院,也可以大大提升手部功能。此外,對於臨床復健醫師而言,也可以藉由此系統間接得到患者復健及恢復程度,進而評估當下的復健模式或是修正接下來的訓練計劃。 | zh_TW |
dc.description.abstract | This thesis is built around the Microsoft Kinect sensor. We use the depth information generated by the Kinect's Light Coding technology to capture the spatial coordinates of fingertips in real space, and assess the feasibility of using the Kinect for finger motion capture. OpenNI is used to retrieve the sensor data, and the k-Curvature algorithm is used to locate fingertip positions. We validated the Kinect's spatial coordinate system along the Z-axis (depth) and along X- and Y-axis distances, and analyzed the stability of the fingertip detection algorithm, using the mean absolute percentage error (MAPE) and the mean squared error (MSE) as evaluation tools. Finally, we determined the maximum measurable bending angle of the fingers of a sample artificial hand.
In the verification of the Kinect's real-space coordinates, depth was measured from 50 cm to 130 cm. The depth (Z-axis) error grows in proportion to the distance, but the average error rate remains below 1%. For horizontal and vertical distances, the error can be kept within an acceptable range (under 5%) at certain depth distances (80-110 cm). In the stability analysis of the fingertip detection algorithm, the MAPE of the tri-axial detection of each fingertip is mostly below 10. In general, the average coordinate error increases with depth within the measurement range. The maximum detectable finger bending angle is about 30-45 degrees, a limit imposed by the optics. These results indicate that it is feasible to use the Kinect as a finger motion capture system. Because of hardware and optical constraints, the position of the subject's hand and the range of motion must be strictly limited. Nevertheless, for patients who have already recovered a certain degree of hand function, the Kinect combined with an appropriate training mode allows intensive and engaging rehabilitation training at home, improving hand function without trips to the hospital. In addition, clinicians can use the system to assess a patient's recovery and adjust the training program as needed. | en |
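As a rough illustration of the methods named in the abstract, the sketch below shows a k-Curvature-style fingertip test on a hand contour and the MAPE/MSE metrics. This is a minimal sketch under stated assumptions, not the thesis's implementation: the contour points, the choice of k, and the depth readings are all hypothetical.

```python
import math

def k_curvature_angle(contour, i, k):
    """Angle (degrees) at contour point i between the vectors pointing to
    the points k steps before and after it. A small angle marks a sharp
    peak on the hand contour, i.e. a fingertip candidate (the k-Curvature
    criterion); palm valleys and straight edges give large angles."""
    px, py = contour[i]
    ax, ay = contour[i - k]                  # k steps behind (wraps via negative index)
    bx, by = contour[(i + k) % len(contour)] # k steps ahead (wraps around)
    v1 = (ax - px, ay - py)
    v2 = (bx - px, by - py)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def mape(actual, measured):
    # Mean absolute percentage error, in percent
    return 100.0 * sum(abs((a - m) / a) for a, m in zip(actual, measured)) / len(actual)

def mse(actual, measured):
    # Mean squared error
    return sum((a - m) ** 2 for a, m in zip(actual, measured)) / len(actual)

# Hypothetical V-shaped contour with a tip at index 4
contour = [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0), (5, 1), (6, 2), (7, 3), (8, 4)]
print(round(k_curvature_angle(contour, 4, 3), 1))  # → 90.0

# Hypothetical depth readings (cm): ground truth vs. Kinect measurements
truth    = [50.0, 70.0, 90.0, 110.0, 130.0]
measured = [50.2, 70.4, 90.5, 110.9, 131.2]
print(round(mape(truth, measured), 3))  # → 0.654 (percent, i.e. below 1%)
print(round(mse(truth, measured), 3))   # → 0.54
```

In practice a fingertip would be accepted when the k-Curvature angle falls below a threshold; the thesis's actual threshold and contour extraction are not reproduced here.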
dc.description.provenance | Made available in DSpace on 2021-05-17T09:14:58Z (GMT). No. of bitstreams: 1 ntu-101-R99945037-1.pdf: 8915361 bytes, checksum: 8d22c12aa007cba40c275522d1d93890 (MD5) Previous issue date: 2012 | en |
dc.description.tableofcontents | Acknowledgements i
Abstract (Chinese) ii
Abstract (English) iv
Table of Contents vi
List of Figures viii
List of Tables xi
Chapter 1 Introduction 1
1.1 Preface 1
1.2 Literature Review 3
1.3 Research Motivation and Methods 7
1.4 Thesis Organization 8
Chapter 2 Principles and Methods 9
2.1 Introduction to the Microsoft Kinect Sensor 9
2.1.1 Kinect Hardware Description 9
2.1.2 Kinect Depth Information 13
2.1.3 Kinect Skeleton Tracking System 16
2.2 Open Natural Interaction (OpenNI) 22
2.2.1 OpenNI Architecture 22
2.2.2 OpenNI Nodes 24
2.2.3 OpenNI Capabilities 27
2.2.4 OpenNI Coordinate System 29
2.3 NITE / OpenCV (Open Source Computer Vision) 31
2.3.1 Introduction to NITE 31
2.3.2 OpenCV 33
Chapter 3 System Design and Experimental Methods 34
3.1 Overall System Architecture Analysis 34
3.2 User Information Tracking 36
3.3 Fingertip Detection Workflow 42
Chapter 4 Validation and Experimental Results 48
4.1 Reliability of the Kinect 3D Coordinate System 48
4.2 Fingertip Detection Results 59
4.2.1 Detection Results at Different Depth Ranges 59
4.2.2 Detection Stability Analysis 62
4.3 Maximum Finger Bending Angle in Fingertip Detection 70
Chapter 5 Discussion and Future Work 72
5.1 Discussion 72
5.2 Future Work 75
References 76 | |
dc.language.iso | zh-TW | |
dc.title | 應用Kinect感應器分析手指活動擷取系統之可行性 | zh_TW |
dc.title | A Feasibility Analysis of a Finger Motion Capture System using Kinect | en |
dc.type | Thesis | |
dc.date.schoolyear | 100-2 | |
dc.description.degree | Master's | |
dc.contributor.oralexamcommittee | 林耀仁,陸哲駒,林育德,盧並裕 | |
dc.subject.keyword | Kinect體感應器,OpenNI,OpenCV,Light Coding,深度影像,手指指尖偵測,k-Curvature,平均絕對值誤差率,均方誤差, | zh_TW |
dc.subject.keyword | Kinect sensor,Open Natural Interaction (OpenNI),Open Source Computer Vision (OpenCV),Light Coding,Fingertip detection,k-Curvature,Mean absolute percentage error (MAPE),Mean squared error (MSE), | en |
dc.relation.page | 80 | |
dc.rights.note | Access authorized (open access worldwide) | |
dc.date.accepted | 2012-08-14 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 生醫電子與資訊學研究所 | zh_TW |
Appears in Collections: | Graduate Institute of Biomedical Electronics and Bioinformatics |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-101-1.pdf | 8.71 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated by their copyright terms.