NTU Theses and Dissertations Repository (DSpace)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/66254

Full metadata record
dc.contributor.advisor: 吳家麟 (Ja-Ling Wu)
dc.contributor.author: Chi-Wen Chen (陳祺文)
dc.date.accessioned: 2021-06-17T00:27:27Z
dc.date.available: 2015-02-21
dc.date.copyright: 2012-02-21
dc.date.issued: 2012
dc.date.submitted: 2012-02-15
dc.identifier.citation: [1] Ryota Sakamoto, Yuki Yoshimura, Tokuhiro Suiua and Yoshihiko Nomura, “A Motion Instruction System Using Head Tracking Back Perspective”, in World Automation Congress (WAC), Sep. 2010.
[2] Liming Chen and Chris Nugent, “Ontology-based Activity Recognition in Intelligent Pervasive Environments”, in International Journal of Web Information Systems, 2009.
[3] Jamie Shotton, Andrew Fitzgibbon, Mat Cook, Toby Sharp, and Mark Finocchio, “Real-Time Human Pose Recognition in Parts from Single Depth Images”, in IEEE Computer Vision and Pattern Recognition, 2011.
[4] J. K. Aggarwal and M. S. Ryoo, “Human Activity Analysis: A Review”, in ACM Computing Surveys, vol. 43 issue 3, April 2011.
[5] Deva Ramanan, “Learning to Parse Images of Articulated Bodies”, in Advances in Neural Information Processing Systems, 2006.
[6] Vittorio Ferrari, Manuel Marin-Jimenez, and Andrew Zisserman, “Progressive Search Space Reduction for Human Pose Estimation”, in IEEE Computer Vision and Pattern Recognition, June 2008.
[7] Himanshu Prakash Jain and Anbumani Subramanian, “Real-time Upper-body Human Pose Estimation Using a Depth Camera”, HP Laboratories, HP Technical Reports, HPL-2010-190, 2010.
[8] Jamie Shotton, Andrew Fitzgibbon, Mat Cook, Toby Sharp, and Mark Finocchio, “Real-Time Human Pose Recognition in Parts from Single Depth Images”, in IEEE Computer Vision and Pattern Recognition, 2011.
[9] Michalis Raptis, Darko Kirovski, and Hugues Hoppe, “Real-Time Classification of Dance Gestures from Skeleton Animation”, in Eurographics/ACM SIGGRAPH Symposium on Computer Animation, 2011.
[10] Dimitrios Alexiadis, Philip Kelly, Tamy Boubekeur, and Maher Ben Moussa, “Evaluating a Dancer’s Performance using Kinect-based Skeleton Tracking”, in ACM Multimedia, 2011.
[11] Yaser Sheikh, Mumtaz Sheikh, and Mubarak Shah, “Exploring the Space of a Human Action”, in International Conference on Computer Vision, 2005.
[12] OpenNI http://www.openni.org/
[13] Copyright © Benjamin Cummings, an imprint of Addison Wesley Longman, Inc.
[14] Mei-Chen Yeh and Kwang-Ting Cheng, “A String Matching Approach for Visual Retrieval and Classification”, in Proceedings of the 1st ACM International Conference on Multimedia Information Retrieval, 2008.
[15] Meimen Qigong Culture Center http://www.mymeimen.org/
[16] PrimeSense http://www.primesense.com/
[17] Pingshuai Kongfu http://www.mymeimen.org/index.php?option=com_content&view=category&layout=blog&id=37&Itemid=56
[18] World Health Organization, Prevalence of insufficient physical activity.
http://www.who.int/gho/ncd/risk_factors/physical_activity_text/en/index.html
[19] Roberto Colombo, Fabrizio Pisano, Alessandra Mazzone, Carmen Delconte, Silvestro Micera, M. Chiara Carrozza, Paolo Dario and Giuseppe Minuco, “Design Strategies to Improve Patient Motivation During Robot-aided Rehabilitation”, in Journal of NeuroEngineering and Rehabilitation, 2007.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/66254
dc.description.abstract: Vision-based human action analysis is an important research area because of its wide range of applications. With the advent of commercial depth cameras such as the Microsoft Kinect, the difficulty of human posture estimation has dropped considerably, which has further accelerated progress in the field and led to a growing number of impressive applications. Most existing work, however, focuses on improving estimation accuracy and computation speed rather than on detailed motion evaluation, so a large gap remains between estimating human posture and understanding how well an action is performed. Such movement assessment is especially important for motion instruction and healthcare applications.
In this work, we propose a motion instruction system that uses the Microsoft Kinect to capture the user's color and depth information and takes the OpenNI skeleton tracking output as its main input. From the tracked joints we compute a body-centered object coordinate system and express the joint positions in spherical coordinates, producing a robust and discriminative pose descriptor that reduces the effects of anthropometric and viewpoint variations. We then apply exemplar-based action analysis: non-linear time-warping methods recognize the actions performed by the user and retrieve the corresponding exemplar segments from the database as learning targets. Because the warping is non-linear, corresponding frames can be matched even under local speed variations, which lets us define a similarity measure and quantify the subtle differences between executions. Finally, the system organizes this analysis into an effective, well-structured assessment that is presented to the user to help improve their action performance.
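As a rough illustration of the pose descriptor sketched in the abstract, the following Python snippet (an assumption-laden sketch, not the thesis's actual implementation) converts a set of tracked 3D joint positions into spherical coordinates expressed in a body-centered coordinate frame, which is what makes such a descriptor less sensitive to viewpoint and body-size differences. The joint names, the frame construction from the torso and shoulder joints, and the radius normalization are all illustrative choices, not taken from the thesis.

    import numpy as np

    def body_frame(joints):
        # Build an orthonormal, body-centered frame from torso joints
        # (illustrative: the thesis does not specify this construction).
        origin = np.asarray(joints["torso"], dtype=float)
        up = np.asarray(joints["neck"], dtype=float) - origin              # spine direction
        right = (np.asarray(joints["right_shoulder"], dtype=float)
                 - np.asarray(joints["left_shoulder"], dtype=float))       # shoulder line
        spine_len = np.linalg.norm(up)
        up /= spine_len
        right -= up * np.dot(right, up)                                    # orthogonalize against spine
        right /= np.linalg.norm(right)
        forward = np.cross(up, right)                                      # completes the basis
        return origin, np.stack([right, up, forward]), spine_len           # rows = basis vectors

    def spherical_descriptor(joints):
        # Describe every joint as (radius, inclination, azimuth) in the body frame,
        # normalizing the radius by spine length to reduce anthropometric effects.
        origin, rot, spine_len = body_frame(joints)
        feats = []
        for name, pos in sorted(joints.items()):
            local = rot @ (np.asarray(pos, dtype=float) - origin)          # camera to body frame
            r = np.linalg.norm(local)
            theta = np.arccos(local[1] / r) if r > 0 else 0.0              # angle from the 'up' axis
            phi = np.arctan2(local[2], local[0])                           # azimuth around 'up'
            feats.append((r / spine_len, theta, phi))
        return np.asarray(feats).ravel()

A per-frame call such as spherical_descriptor(skeleton_of_frame) would yield the feature vector that the action-level comparison in Chapter 4 operates on.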
dc.description.provenance: Made available in DSpace on 2021-06-17T00:27:27Z (GMT). No. of bitstreams: 1
ntu-101-R98922080-1.pdf: 2907839 bytes, checksum: 561aeadfb05af65956157e303571b911 (MD5)
Previous issue date: 2012
dc.description.tableofcontents:
Oral Defense Committee Certification i
Acknowledgements ii
Chinese Abstract iv
ABSTRACT v
CONTENTS vii
LIST OF FIGURES ix
LIST OF TABLES xiii
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Related Work 2
1.3 Goal of This Work 6
1.4 Challenges 7
1.5 Contributions 8
Chapter 2 System Framework 10
Chapter 3 OpenNI Skeleton Estimation 15
Chapter 4 Action Video Retrieval and Motion Assessment 18
4.1 Feature Construction 18
4.2 Pose Similarity 26
4.3 Action Similarity 27
4.3.1 Cross-Correlation 27
4.3.2 Dynamic Time Warping 29
4.3.3 Approximate String Matching 31
4.4 DTW vs. ASM 36
4.4.1 Difference Between DTW and ASM 36
4.4.2 Modified DTW 37
Chapter 5 Experiment Results 39
5.1 Representative Feature and Meaningful Joints 40
5.2 Non-Linear Time Warping 47
5.3 User Evaluation 54
Chapter 6 Conclusion and Future Work 58
6.1 Conclusion 58
6.2 Future Work 58
REFERENCE 60
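Chapter 4.3.2 above lists Dynamic Time Warping as one of the non-linear time-warping approaches the abstract refers to. As a hedged sketch of that general idea only (not the thesis's modified DTW of Section 4.4.2, whose details are not given here), the following Python code aligns two sequences of pose descriptors with textbook DTW and returns a length-normalized dissimilarity; pose_distance is an assumed helper, not a function named in the thesis.

    import numpy as np

    def pose_distance(a, b):
        # Assumed per-frame metric: Euclidean distance between descriptor vectors.
        return float(np.linalg.norm(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)))

    def dtw_dissimilarity(seq_a, seq_b):
        # Textbook DTW: cost[i, j] is the best alignment cost of the first i
        # frames of seq_a against the first j frames of seq_b.
        n, m = len(seq_a), len(seq_b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = pose_distance(seq_a[i - 1], seq_b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j - 1],   # both sequences advance
                                     cost[i - 1, j],       # only seq_a advances
                                     cost[i, j - 1])       # only seq_b advances
        return cost[n, m] / (n + m)                        # length-normalized score

Because the alignment path may advance one sequence without the other, locally faster or slower executions of the same motion can still be matched frame by frame, which is the property the assessment step relies on.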
dc.language.iso: en
dc.subject: Exemplar-based Human Action Analysis
dc.subject: Movement Assessment
dc.subject: Motion Instruction System
dc.subject: OpenNI skeleton tracking
dc.subject: Microsoft Kinect
dc.title: Exemplar-based Sequential Upper-body Movement Assessment with Kinect Sensor
dc.type: Thesis
dc.date.schoolyear: 100-1
dc.description.degree: Master
dc.contributor.oralexamcommittee: 林登彬 (Teng-Pin Lin), 許超雲 (Chau-Yun Hsu), 鄭文皇 (Wen-Huang Cheng)
dc.subject.keyword: Exemplar-based Human Action Analysis, Movement Assessment, Motion Instruction System, OpenNI skeleton tracking, Microsoft Kinect
dc.relation.page: 0
dc.rights.note: Paid authorization
dc.date.accepted: 2012-02-15
dc.contributor.author-college: College of Electrical Engineering and Computer Science
dc.contributor.author-dept: Graduate Institute of Computer Science and Information Engineering
Appears in collections: Department of Computer Science and Information Engineering

Files in this item:
ntu-101-1.pdf (2.84 MB, Adobe PDF; restricted, not publicly accessible)

