Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88721

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 洪一平 | zh_TW |
| dc.contributor.advisor | Yi-Ping Hung | en |
| dc.contributor.author | 陳泂杋 | zh_TW |
| dc.contributor.author | Chiung-Fan Chen | en |
| dc.date.accessioned | 2023-08-15T17:30:37Z | - |
| dc.date.available | 2023-11-09 | - |
| dc.date.copyright | 2023-08-15 | - |
| dc.date.issued | 2023 | - |
| dc.date.submitted | 2023-08-05 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88721 | - |
| dc.description.abstract | 太極拳是一項包含著連續肢體動作的中國傳統武術。傳統上,太極拳的學習要求指導者與學習者在同一個空間內。但是,在遠距離教學的需求下,現有的替代方案如視訊教學與太極拳自學系統都有著各自的缺點。在本研究中,我們提出了一套結合擴增實境與WIMP (視窗、圖標、選單、指標) 的遠距離協作系統。此系統包含提供沉浸式環境的自學子系統與作為溝通工具的指導子系統。在指導子系統中,我們設計了位於虛擬鏡子上的2D視覺引導標示與位於學習者肢體上的3D視覺引導標示作為肢體引導的媒介。在兩項使用者研究中,我們分別比較了指導者與學習者對於2D與3D視覺引導標示的想法。使用者對兩種視覺引導標示都給予了高評價,但使用者認為3D視覺引導標示比起2D版本,提供學習者更多的指導資訊,並提供指導者更多的引導功能。 | zh_TW |
| dc.description.abstract | Tai-Chi Chuan (TCC) is a traditional Chinese martial art consisting of continuous body movement sequences, and learning it traditionally requires the instructor and the learner to be collocated. However, when the instructor and the learner are not in the same place, existing alternatives such as video conferencing and TCC self-learning systems have their own disadvantages. In this study, we propose a remote collaboration system combining Augmented Reality (AR) and the Windows-Icons-Menus-Pointer (WIMP) paradigm for TCC learning. It consists of a self-learning subsystem that provides an immersive environment and a guidance subsystem that serves as a communication tool. For pose guidance, we design 2D annotations shown on the augmented mirror in the virtual environment and 3D on-body annotations superimposed on the learner's joints. Two user studies were conducted to compare the two types of annotations from the perspectives of the instructor and the learner. The results show that both the 2D and 3D annotations received high ratings, while the 3D annotations performed better by providing more instructional information for the learner and more annotation features for the instructor. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T17:30:37Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2023-08-15T17:30:37Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | 謝辭 i 摘要 ii Abstract iii List of Figures vii List of Tables viii 1 Introduction 1 2 Related Work 3 2.1 Tai-Chi Chuan Learning System 3 2.2 Remote Collaboration Combining AR and WIMP 4 2.3 Visual Guidance for Body Movement 5 3 System Design 7 3.1 Overview 7 3.2 Self-Learning Subsystem 8 3.2.1 Virtual Coach 8 3.2.2 Augmented Mirror 8 3.2.3 User Interface of Learner 8 3.3 Guidance Subsystem 11 3.3.1 Receiving Body Data 11 3.3.2 Providing Annotations 11 3.3.3 Receiving Annotations 11 3.3.4 Adjusting Body Posture 12 4 Implementation 14 4.1 Sensor Side 15 4.2 Server Side 15 4.3 Instructor Side 15 4.4 Learner Side 17 5 User Study 19 5.1 User Study 1 19 5.1.1 Participant 19 5.1.2 Experimental Condition 20 5.1.3 Scale 20 5.1.4 Procedure 20 5.2 User Study 2 21 5.2.1 Participant 21 5.2.2 Experimental Condition 21 5.2.3 Scale 21 5.2.4 Procedure 21 6 Result and Discussion 22 6.1 User Study 1 22 6.2 User Study 2 24 7 Conclusion and Future Work 26 References 28 A Questionnaire 34 A.1 System Usability Scale (SUS) 34 A.2 NASA Task Load Index (NASA-TLX) 35 A.3 Others 35 | - |
| dc.language.iso | en | - |
| dc.subject | 混合實境 | zh_TW |
| dc.subject | 肢體引導 | zh_TW |
| dc.subject | 太極拳 | zh_TW |
| dc.subject | 擴增實境 | zh_TW |
| dc.subject | 遠距協作 | zh_TW |
| dc.subject | remote collaboration | en |
| dc.subject | Tai-Chi Chuan | en |
| dc.subject | mixed reality | en |
| dc.subject | augmented reality | en |
| dc.subject | pose guidance | en |
| dc.title | 應用肢體引導於擴增實境遠距協作-以太極拳學習為例 | zh_TW |
| dc.title | On-Body Pose Guidance in AR Remote Collaboration: Tai-Chi Chuan Learning as an Example | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 111-2 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | 鄭文皇;賴尚宏;王元凱 | zh_TW |
| dc.contributor.oralexamcommittee | Wen-Huang Cheng;Shang-Hong Lai;Yuan-Kai Wang | en |
| dc.subject.keyword | 擴增實境,遠距協作,太極拳,混合實境,肢體引導 | zh_TW |
| dc.subject.keyword | augmented reality, remote collaboration, Tai-Chi Chuan, mixed reality, pose guidance | en |
| dc.relation.page | 35 | - |
| dc.identifier.doi | 10.6342/NTU202302869 | - |
| dc.rights.note | 同意授權(全球公開) | - |
| dc.date.accepted | 2023-08-08 | - |
| dc.contributor.author-college | 電機資訊學院 | - |
| dc.contributor.author-dept | 資訊網路與多媒體研究所 | - |
Appears in Collections: 資訊網路與多媒體研究所
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-111-2.pdf | 5.02 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
