NTU Theses and Dissertations Repository
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/7483
Title: Aggregating Two-Stream Trajectory using Neural Network for Counting Arbitrary Human Action Repetition
Authors: Chih-Yu Lin (林之宇)
Advisor: 徐宏民
Keywords: action repetition counting, periodic motion, repetition counter
Publication Year: 2018
Degree: Master's
Abstract: Although deep learning has achieved considerable success in computer vision, recognizing how many times a human action is repeated remains a highly challenging problem; repetition counts are needed, for example, when patients perform rehabilitation exercises or when people do weight training. The difficulties lie in the subtle differences between individual repetitions of an action, the motion of the camera viewpoint while a person repeats an action on screen, and the different processing required for different repetitive actions. To address this problem, we collected a new dataset and built a novel network architecture, the Human Action Repetition Counter, for counting arbitrary repetitive human actions. Our network counts repetitions from the frequency-domain information of the trajectories a person traces over time while repeating an action, and experimental results show that it outperforms previous work at counting repetitions of arbitrary human actions. In addition, because videos of people performing repetitive actions are relatively hard to obtain, we also built a dataset that simulates the trajectories of repetitive human actions by generating waveforms, and used this dataset to pretrain our network for more accurate counting. The Human Action Repetition Counter is not limited to counting repetitions of human actions: in our experiments, feeding the network the trajectory of a repetitively moving object also yields that object's repetition count.
Although deep neural networks have achieved great success in computer vision recently, determining the number of repetitions of arbitrary periodic human actions remains challenging. The difficulties lie in the varying frame length of repetitions, the temporal localization of the people in the video, and the different features required for different motions. Moreover, the demand for human action repetition counting is rising in medical rehabilitation, sports, and other settings. To address this problem, we construct a human action dataset and propose a brand-new framework, the Human Action Repetition Counter (HARC), which works on arbitrary human actions with a single architecture. HARC learns to count repetitions of human actions in the time-frequency domain, a choice made after a few pilot studies. Experiments show that HARC outperforms previous counting methods on benchmarks. Additionally, we design novel learning strategies that generate effective synthetic data to pretrain our network, which further boosts performance and yields more accurate results. We also demonstrate that HARC is capable of counting periodic object motions. Our dataset, the YT_Human_Segments dataset, will be made publicly available, which will benefit future research.
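The frequency-domain idea described in the abstract can be illustrated with a minimal sketch: treat a tracked coordinate (e.g., a joint position) as a 1-D signal and read the repetition count off its dominant frequency. The code below is a hypothetical toy, not the HARC network from the thesis; the function name `count_repetitions` and the noisy-sine test signal are assumptions made purely for illustration, in the spirit of the synthetic waveform data the abstract mentions for pretraining.

import numpy as np

def count_repetitions(trajectory, fps):
    """Estimate how many repetitions a roughly periodic 1-D signal contains.

    trajectory : 1-D array of a tracked coordinate over time (e.g., wrist y).
    fps        : frames per second of the source video.
    """
    x = np.asarray(trajectory, dtype=float)
    x = x - x.mean()                          # remove the constant offset (DC)
    spectrum = np.abs(np.fft.rfft(x))         # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    k = np.argmax(spectrum[1:]) + 1           # dominant non-DC frequency bin
    duration = len(x) / fps                   # clip length in seconds
    return int(round(freqs[k] * duration))    # cycles = frequency * duration

if __name__ == "__main__":
    # Synthetic check: a noisy sine with a known number of cycles.
    fps, seconds, true_count = 30, 10, 7
    t = np.linspace(0, seconds, fps * seconds, endpoint=False)
    signal = np.sin(2 * np.pi * true_count / seconds * t) + 0.1 * np.random.randn(t.size)
    print(count_repetitions(signal, fps))     # expected: 7

A single dominant-frequency estimate like this only works for clean, stationary periodic signals; the varying repetition lengths and viewpoint motion discussed above are exactly what make the learned approach in the thesis necessary.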
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/7483
DOI: 10.6342/NTU201803126
Fulltext Rights: Authorized (publicly available worldwide)
Embargo Lift Date: 2023-08-15
Appears in Collections: Department of Computer Science and Information Engineering

Files in This Item:
File           Size      Format
ntu-107-1.pdf  1.61 MB   Adobe PDF

