Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/7483
Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: 徐宏民
dc.contributor.author: Chih-Yu Lin (en)
dc.contributor.author: 林之宇 (zh_TW)
dc.date.accessioned: 2021-05-19T17:44:38Z
dc.date.available: 2023-08-15
dc.date.available: 2021-05-19T17:44:38Z
dc.date.copyright: 2018-08-15
dc.date.issued: 2018
dc.date.submitted: 2018-08-13
dc.identifier.citation:
[1] Ofir Levy and Lior Wolf. Live repetition counting. In International Conference on Computer Vision (ICCV), 2015.
[2] Ousman Azy and Narendra Ahuja. Segmentation of periodically moving objects. In 19th International Conference on Pattern Recognition (ICPR), 2008.
[3] Ping-Sing Tsai, Mubarak Shah, Katharine Keiter, and Takis Kasparis. Cyclic motion detection for motion based recognition. Pattern Recognition, 27(12):1591–1603, December 1994.
[4] Alexia Briassouli and Narendra Ahuja. Extraction and analysis of multiple periodic motions in video sequences. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(7):1244–1261, July 2007.
[5] A. Thangali and S. Sclaroff. Periodic motion detection and estimation via space-time sampling. In Seventh IEEE Workshops on Application of Computer Vision (WACV/MOTIONS '05), volume 2, pages 176–182, January 2005.
[6] R. Cutler and L. S. Davis. Robust real-time periodic motion detection, analysis, and applications. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(8):781–796, August 2000.
[7] Scott Satkin and Martial Hebert. Modeling the temporal extent of actions. In European Conference on Computer Vision (ECCV), September 2010.
[8] Zhe Cao, Tomas Simon, Shih-En Wei, and Yaser Sheikh. Realtime multi-person 2D pose estimation using part affinity fields. CoRR, abs/1611.08050, 2016.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/7483
dc.description.abstract: Although deep learning has achieved considerable success in computer vision, counting the repetitions of human actions remains a challenging problem; such counts are needed, for example, when patients perform rehabilitation exercises or when people do weight training. The difficulty lies in the subtle variations between individual repetitions, camera viewpoint changes while a person repeats an action, and the different processing required by different repetitive actions. To address this problem, we collected a new dataset and built a novel network architecture, the Human Action Repetition Counter, for counting arbitrary repetitive human actions. Our network counts repetitions using the frequency-domain information of the trajectories people trace over time while repeating an action, and experiments show that it outperforms previous methods at counting repetitions of arbitrary human actions. In addition, because videos of people performing repetitive actions are relatively hard to obtain, we also built a dataset that simulates the trajectories of repetitive human motion with generated waveforms, and pretrained our network on it to obtain more accurate counting. The Human Action Repetition Counter is not limited to counting human action repetitions: in our experiments, feeding the network the trajectory of a repetitively moving object also yields that object's repetition count. (zh_TW)
dc.description.abstract: Although deep neural networks have recently achieved great success in computer vision, determining the number of repetitions of an arbitrary periodic human action is still challenging. The difficulties lie in the varying frame length of repetitions, the temporal localization of the person, and the different features required by different motions. Moreover, demand for counting human action repetitions is rising in medical rehabilitation, sports, and other settings. To address this problem, we construct a human action dataset and propose a brand-new framework, the Human Action Repetition Counter (HARC), which handles arbitrary human actions with a single architecture. HARC learns to count repetitions of human actions in the time-frequency domain, a choice settled after a few pilot studies. Experiments show that HARC outperforms previous counting methods on benchmarks. Additionally, we design novel learning strategies that generate effective synthetic data to pretrain our network, which further boosts performance and yields more accurate results. We also demonstrate that HARC can count periodic object motions. Our dataset, YT_Human_Segments, will be made publicly available to benefit future research. (en)
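
The abstract describes counting repetitions from the frequency-domain content of a motion trajectory. As a rough illustration of that idea (not the learned HARC architecture itself), the sketch below estimates a repetition count from a single joint trajectory by locating the dominant frequency in its spectrum; all function and variable names here are illustrative assumptions.

    # Minimal sketch: repetition counting via the dominant spectral frequency.
    # Assumes a 1-D trajectory (e.g. a wrist y-coordinate per frame) and the
    # video frame rate; this is not the thesis implementation.
    import numpy as np

    def count_repetitions(trajectory: np.ndarray, fps: float) -> float:
        x = trajectory - trajectory.mean()             # remove the DC offset
        spectrum = np.abs(np.fft.rfft(x))              # magnitude spectrum
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)   # bin frequencies in Hz
        dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the zero-frequency bin
        duration = len(x) / fps                        # clip length in seconds
        return dominant * duration                     # cycles/sec * seconds = count

    if __name__ == "__main__":
        fps = 30.0
        t = np.arange(0, 10, 1.0 / fps)                # 10-second clip
        y = np.sin(2 * np.pi * 0.8 * t)                # 0.8 Hz motion -> 8 repetitions
        print(round(count_repetitions(y, fps)))        # prints 8

A learned model such as HARC presumably replaces this brittle argmax over the spectrum with a network trained on trajectory spectra, which can better cope with the noisy, drifting trajectories of real videos.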
dc.description.provenance: Made available in DSpace on 2021-05-19T17:44:38Z (GMT). No. of bitstreams: 1. ntu-107-R05922109-1.pdf: 1649675 bytes, checksum: ab2fb1d7e1e63e185ecef301ee0add85 (MD5). Previous issue date: 2018. (en)
dc.description.tableofcontents:
1 Introduction 1
1.1 Motivation 1
1.2 Related Work 2
1.3 Contribution 4
2 Dataset 5
2.1 YT_Human_Segments Dataset 6
2.2 Synthetic Data 6
3 Human Action Repetition Counter 8
3.1 Feature Extraction 8
3.2 Learning to Count Neural Networks (L2CNN) 10
3.3 Training 10
3.4 Detector: Smoothing Outputs 11
4 Experiment 13
4.1 Evaluation Method 13
4.2 Comparison with Benchmarks 14
4.3 Pretrained Model 15
4.4 Robustness 15
4.5 Action Repetition Counting on Objects 16
5 Conclusion 18
6 Reference 19
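
Both abstracts mention pretraining on generated waveforms that mimic repetition trajectories (Section 2.2, Synthetic Data, in the outline above). A minimal sketch of such a generator follows; the distributions, noise model, and names are assumptions made for illustration, not the dataset recipe from the thesis.

    # Sketch: synthesize a noisy periodic trajectory labeled with its cycle
    # count, suitable for pretraining a counting network. All parameter ranges
    # below are assumed, not taken from the thesis.
    import numpy as np

    def make_waveform(n_frames=300, fps=30.0, rng=None):
        """Return (trajectory, repetition_count) for one synthetic sample."""
        rng = rng or np.random.default_rng()
        freq = rng.uniform(0.5, 2.0)               # assumed repetition rate range, Hz
        phase = rng.uniform(0.0, 2.0 * np.pi)      # random starting point in the cycle
        amp = rng.uniform(0.5, 1.5)                # motion amplitude
        t = np.arange(n_frames) / fps
        y = amp * np.sin(2.0 * np.pi * freq * t + phase)
        y += 0.05 * rng.standard_normal(n_frames)  # pose-estimation jitter
        y += rng.uniform(-0.2, 0.2) * t            # slow drift from camera/body motion
        count = freq * n_frames / fps              # ground-truth number of cycles
        return y.astype(np.float32), count

    if __name__ == "__main__":
        traj, label = make_waveform()
        print(traj.shape, round(label, 2))         # e.g. (300,) 12.3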
dc.language.iso: zh-TW
dc.title: 以神經網路整合雙串流軌跡資料用於計算任意人類動作次數 (zh_TW)
dc.title: Aggregating Two-Stream Trajectory using Neural Network for Counting Arbitrary Human Action Repetition (en)
dc.type: Thesis
dc.date.schoolyear: 106-2
dc.description.degree: Master's
dc.contributor.oralexamcommittee: 葉梅珍, 陳文進
dc.subject.keyword: action repetition counting (zh_TW)
dc.subject.keyword: periodic motion, repetition counter (en)
dc.relation.page: 19
dc.identifier.doi: 10.6342/NTU201803126
dc.rights.note: Authorization granted (open access worldwide)
dc.date.accepted: 2018-08-13
dc.contributor.author-college: College of Electrical Engineering and Computer Science (zh_TW)
dc.contributor.author-dept: Graduate Institute of Computer Science and Information Engineering (zh_TW)
dc.date.embargo-lift: 2023-08-15
Appears in Collections: Department of Computer Science and Information Engineering (資訊工程學系)

Files in This Item:
File: ntu-107-1.pdf (1.61 MB, Adobe PDF)


