Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88150

Full metadata record:
dc.contributor.advisor: 陳炳宇 (zh_TW)
dc.contributor.advisor: Bing-Yu Chen (en)
dc.contributor.author: 梁中瀚 (zh_TW)
dc.contributor.author: Chung-Han Liang (en)
dc.date.accessioned: 2023-08-08T16:31:21Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-08-08
dc.date.issued: 2023
dc.date.submitted: 2023-07-06
dc.identifier.citation:
[1] K. Ahuja, E. Ofek, M. Gonzalez-Franco, C. Holz, and A. D. Wilson. CoolMoves: User motion accentuation in virtual reality. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 5(2), jun 2021.
[2] R. Anderegg, L. Ciccone, and R. W. Sumner. PuppetPhone: Puppeteering virtual characters using a smartphone. In Proceedings of the 11th ACM SIGGRAPH Conference on Motion, Interaction and Games, MIG ’18, New York, NY, USA, 2018. Association for Computing Machinery.
[3] J. Chai and J. K. Hodgins. Performance animation from low-dimensional control signals. ACM Trans. Graph., 24(3):686–696, jul 2005.
[4] J. Chen, S. Izadi, and A. Fitzgibbon. KinÊtre: Animating the world with the human body. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, UIST ’12, page 435–444, New York, NY, USA, 2012. Association for Computing Machinery.
[5] Y.-T. Cheng, T. K. Shih, and C.-Y. Lin. Create a puppet play and interactive digital models with leap motion. In 2017 10th International Conference on Ubi-media Computing and Workshops (Ubi-Media), pages 1–6, 2017.
[6] M. Dontcheva, G. Yngve, and Z. Popović. Layered acting for character animation. ACM Trans. Graph., 22(3):409–416, jul 2003.
[7] M. Eitsuka and M. Hirakawa. Authoring animations of virtual objects in augmented reality-based 3d space. In 2013 Second IIAI International Conference on Advanced Applied Informatics, pages 256–261, 2013.
[8] A. Fender, J. Müller, and D. Lindlbauer. Creature teacher: A performance-based animation system for creating cyclic movements. In Proceedings of the 3rd ACM Symposium on Spatial User Interaction, SUI ’15, page 113–122, New York, NY, USA, 2015. Association for Computing Machinery.
[9] O. Glauser, W.-C. Ma, D. Panozzo, A. Jacobson, O. Hilliges, and O. Sorkine-Hornung. Rig animation with a tangible and modular input device. ACM Trans. Graph., 35(4), jul 2016.
[10] R. Held, A. Gupta, B. Curless, and M. Agrawala. 3d puppetry: A kinect-based interface for 3d animation. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, UIST ’12, page 423–434, New York, NY, USA, 2012. Association for Computing Machinery.
[11] N. Hiroki, N. Pantuwong, and M. Sugimoto. A puppet interface for the development of an intuitive computer animation system. In Proceedings of the 21st International Conference on Pattern Recognition (ICPR2012), pages 3136–3139, 2012.
[12] C.-W. Hung, R.-C. Chang, H.-S. Chen, C.-H. Liang, L. Chan, and B.-Y. Chen. Puppeteer: Manipulating human avatar actions with intuitive hand gestures and upper-body postures. In Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, UIST ’22 Adjunct, New York, NY, USA, 2022. Association for Computing Machinery.
[13] S. Ishigaki, T. White, V. B. Zordan, and C. K. Liu. Performance-based control interface for character animation. ACM Trans. Graph., 28(3), jul 2009.
[14] Y. Jiang, Z. Li, M. He, D. Lindlbauer, and Y. Yan. HandAvatar: Embodying Non-Humanoid Virtual Avatars through Hands. Association for Computing Machinery, New York, NY, USA, 2023.
[15] W.-C. Lam, F. Zou, and T. Komura. Motion editing with data glove. In Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, ACE ’04, page 337–342, New York, NY, USA, 2004. Association for Computing Machinery.
[16] F. Lamberti, G. Paravati, V. Gatteschi, A. Cannavò, and P. Montuschi. Virtual character animation based on affordable motion capture and reconfigurable tangible interfaces. IEEE Transactions on Visualization and Computer Graphics, 24(5):1742–1755, 2018.
[17] L. Leite and V. Orvalho. Shape your body: Control a virtual silhouette using body motion. In CHI ’12 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’12, page 1913–1918, New York, NY, USA, 2012. Association for Computing Machinery.
[18] L. Leite and V. Orvalho. Mani-pull-action: Hand-based digital puppetry. Proc. ACM Hum.-Comput. Interact., 1(EICS), jun 2017.
[19] H. Liang, J. Chang, I. K. Kazmi, J. J. Zhang, and P. Jiao. Hand gesture-based interactive puppetry system to assist storytelling for children. Vis. Comput., 33(4):517–531, apr 2017.
[20] H. Liu, X. Wei, J. Chai, I. Ha, and T. Rhee. Realtime human motion control with a small number of inertial sensors. In Symposium on Interactive 3D Graphics and Games, I3D ’11, page 133–140, New York, NY, USA, 2011. Association for Computing Machinery.
[21] N. Lockwood and K. Singh. Finger walking: Motion editing with contact-based hand performance. In Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’12, page 43–52, Goslar, DEU, 2012. Eurographics Association.
[22] C. Lugaresi, J. Tang, H. Nash, C. McClanahan, E. Uboweja, M. Hays, F. Zhang, C.-L. Chang, M. G. Yong, J. Lee, W.-T. Chang, W. Hua, M. Georg, and M. Grundmann. Mediapipe: A framework for building perception pipelines, 2019.
[23] Z. Luo, I.-M. Chen, S. H. Yeo, C.-C. Lin, and T.-Y. Li. Building hand motion-based character animation: The case of puppetry. In 2010 International Conference on Cyberworlds, pages 46–52, 2010.
[24] S. Oore, D. Terzopoulos, and G. Hinton. A desktop input device and interface for interactive 3d character animation. In Graphics Interface, volume 2, pages 133–140, 2002.
[25] M. Oshita. Multi-touch interface and motion control model for interactive character animation. In M. L. Gavrilova, C. J. K. Tan, X. Mao, and L. Hong, editors, Transactions on Computational Science XXIII: Special Issue on Cyberworlds, 2014.
[26] M. Oshita, Y. Senju, and S. Morishige. Character motion control interface with hand manipulation inspired by puppet mechanism. In Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, VRCAI ’13, page 131–138, New York, NY, USA, 2013. Association for Computing Machinery.
[27] M. Sakashita, T. Minagawa, A. Koike, I. Suzuki, K. Kawahara, and Y. Ochiai. You as a puppet: Evaluation of telepresence user interface for puppetry. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, UIST ’17, page 217–228, New York, NY, USA, 2017. Association for Computing Machinery.
[28] Y. Seol, C. O’Sullivan, and J. Lee. Creature features: Online motion puppetry for non-human characters. In Proceedings of the 12th ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’13, page 213–221, New York, NY, USA, 2013. Association for Computing Machinery.
[29] T. Shiratori and J. K. Hodgins. Accelerometer-based user interfaces for the control of a physically simulated character. In ACM SIGGRAPH Asia 2008 Papers, SIGGRAPH Asia ’08, New York, NY, USA, 2008. Association for Computing Machinery.
[30] P. Virtanen, R. Gommers, T. E. Oliphant, M. Haberland, T. Reddy, D. Cournapeau, E. Burovski, P. Peterson, W. Weckesser, J. Bright, S. J. van der Walt, M. Brett, J. Wilson, K. J. Millman, N. Mayorov, A. R. J. Nelson, E. Jones, R. Kern, E. Larson, C. J. Carey, İ. Polat, Y. Feng, E. W. Moore, J. VanderPlas, D. Laxalde, J. Perktold, R. Cimrman, I. Henriksen, E. A. Quintero, C. R. Harris, A. M. Archibald, A. H. Ribeiro, F. Pedregosa, P. van Mulbregt, and SciPy 1.0 Contributors. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nature Methods, 17:261–272, 2020.
[31] B.-X. Wang, Y.-W. Wang, Y.-K. Chen, C.-M. Tseng, M.-C. Hsu, C. A. Hsieh, H.-Y. Lee, and M. Y. Chen. Miniature haptics: Experiencing haptic feedback through hand-based and embodied avatars. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, CHI ’20, page 1–8, New York, NY, USA, 2020. Association for Computing Machinery.
[32] M. Wang, K. Lei, Z. Li, H. Mi, and Y. Xu. Twistblocks: Pluggable and twistable modular tui for armature interaction in 3d design. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction, TEI ’18, page 19–26, New York, NY, USA, 2018. Association for Computing Machinery.
[33] D.-L. Way, W.-K. Lau, and T. Y. Huang. Glove puppetry cloud theater through a virtual reality network. In ACM SIGGRAPH 2019 Posters, SIGGRAPH ’19, New York, NY, USA, 2019. Association for Computing Machinery.
[34] K. Yamane, Y. Ariki, and J. Hodgins. Animating non-humanoid characters with human motion data. In Proceedings of the 2010 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’10, page 169–178, Goslar, DEU, 2010. Eurographics Association.
[35] H. Ye, K. C. Kwan, W. Su, and H. Fu. Aranimator: In-situ character animation in mobile ar with user-defined motion gestures. ACM Trans. Graph., 39(4), aug 2020.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88150
dc.description.abstract: 利用表演達成虛擬操偶的模式已經被廣泛運用在多種不同領域上,包含但不限於遊戲、故事敘述、動畫編輯。人類靈巧的手可以做出非常豐富的動作,正因如此它適合作為操作虛擬角色的介面。我們採用「手指走路」這種直覺且自然的表演方式,用來作為操作虛擬人型角色的介面。首先,我們利用初步的使用者訪談,藉此得到大多數一般的使用者對於「手指走路」的認知範圍。並且,將這些收集到的「手指走路」動作分成不同動作類別。最後,從一個廣泛被使用的動畫資料庫當中,挑選出5個多數受訪者認同適合使用「手指走路」表達的虛擬角色動畫為範例動畫。我們提出的虛擬角色操作方法,會透過轉換「手指走路」過程中的旋轉角度變化到虛擬角色的雙腿達成動作轉移。接著,從範例動畫當中尋找相似的腿部動作用於全身的姿態重建。我們也實作一個互動故事敘述應用程式,藉此展示我們提出的方法有能力產生富有回饋感與可靠的虛擬人型角色動作。 (zh_TW)
dc.description.abstract: Performance-based digital puppetry has gained widespread popularity in various fields, including gaming, storytelling, animation editing, etc. With their dexterity and ability to perform various movements, human hands are well-suited for manipulating digital avatars. In this study, we adopted the finger-walking technique, a natural and intuitive method of performance, as an interface for controlling human avatars. We first conducted a preliminary study to explore the range of finger-walking movements preferred by casual users and identified several general types of finger-walking performances. Based on the study results, we selected five common example animations from a database that are suitable for finger-walking performance. To manipulate the human avatars, we developed a method that maps finger-walking motions to leg motions using rotation mapping and matches similar leg motions in the example animations to generate expressive full-body motions. We also implemented a prototype interactive storytelling application to demonstrate the effectiveness of our system in generating responsive and reliable human avatar motions. (en)
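The abstract describes a two-step pipeline: retarget finger-walking rotations onto the avatar's legs, then match the resulting leg pose against example animations to reconstruct a full-body pose. A minimal sketch of that idea, not the thesis implementation — all function names, angle values, and the tiny frame database below are invented for illustration:

```python
import math

def retarget_finger_to_leg(finger_angles, scale=1.0):
    """Map finger joint angles (radians) onto leg joint angles.

    A plain scaled copy stands in for the thesis's rotation mapping.
    """
    return [a * scale for a in finger_angles]

def nearest_full_body_frame(leg_pose, example_frames):
    """Pick the example frame whose leg sub-pose is closest (L2 distance)."""
    def dist(frame):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(frame["legs"], leg_pose)))
    return min(example_frames, key=dist)

# Hypothetical example-animation database: each frame stores its leg
# joint angles plus a label standing in for the full-body pose data.
frames = [
    {"legs": [0.0, 0.0],  "pose": "stand"},
    {"legs": [0.6, -0.4], "pose": "walk_step"},
    {"legs": [1.2, -0.9], "pose": "run_step"},
]

legs = retarget_finger_to_leg([0.55, -0.35])          # finger angles from tracking
print(nearest_full_body_frame(legs, frames)["pose"])  # → walk_step
```

In the actual system the matching would run per frame against the five selected example animations, with the full skeleton pose taken from the matched frame.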
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-08T16:31:21Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2023-08-08T16:31:21Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
Verification Letter from the Oral Examination Committee i
Acknowledgements iii
摘要 v
Abstract vii
Contents ix
List of Figures xi
List of Tables xiii
Denotation xv
Chapter 1 Introduction 1
Chapter 2 Related work 7
2.1 Body performance interface . . . . . . . . . . . . . . . . . . . . . . 7
2.2 Hand-held devices interface . . . . . . . . . . . . . . . . . . . . . . 10
2.3 Hand-based performance interface . . . . . . . . . . . . . . . . . . . 12
Chapter 3 Preliminary study 17
3.1 Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.2 Result and Observations . . . . . . . . . . . . . . . . . . . . . . . . 18
Chapter 4 Materials and method 23
4.1 Data and preprocessing in training stage . . . . . . . . . . . . . . . . 24
4.2 Hand motion retargeting to lower body motion . . . . . . . . . . . . 30
4.3 Full body pose reconstruction . . . . . . . . . . . . . . . . . . . . . 33
Chapter 5 Result 37
5.1 Storytelling application . . . . . . . . . . . . . . . . . . . . . . . . . 39
5.2 Limitation and Future Work . . . . . . . . . . . . . . . . . . . . . . 40
Chapter 6 Conclusion 45
References 47
dc.language.iso: en
dc.subject: 動作遷移 (zh_TW)
dc.subject: 手指走路 (zh_TW)
dc.subject: 基於表演的輸入介面 (zh_TW)
dc.subject: finger-walking (en)
dc.subject: motion retargeting (en)
dc.subject: performance-based input (en)
dc.title: 手指人偶: 基於手指走路的人形虛擬化身操作 (zh_TW)
dc.title: FingerPuppet: Finger-Walking Performance-based Puppetry for Human Avatar (en)
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 陳彥仰;梁容輝;詹力韋;蔡欣叡 (zh_TW)
dc.contributor.oralexamcommittee: Mike Y. Chen;Rung-Huei Liang;Liwei Chan;Hsin-Ruey Tsai (en)
dc.subject.keyword: 手指走路,動作遷移,基於表演的輸入介面 (zh_TW)
dc.subject.keyword: finger-walking,motion retargeting,performance-based input (en)
dc.relation.page: 52
dc.identifier.doi: 10.6342/NTU202301170
dc.rights.note: 同意授權(限校園內公開)
dc.date.accepted: 2023-07-10
dc.contributor.author-college: 電機資訊學院
dc.contributor.author-dept: 資訊工程學系
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
File: ntu-111-2.pdf
Size: 2.11 MB
Format: Adobe PDF
Access: restricted to NTU campus IP addresses (use the VPN service for off-campus access)


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.
