Please use this Handle URI to cite this document:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91620

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 洪一平 | zh_TW |
| dc.contributor.advisor | Yi-Ping Hung | en |
| dc.contributor.author | 陳煬升 | zh_TW |
| dc.contributor.author | Yang-Sheng Chen | en |
| dc.date.accessioned | 2024-02-20T16:14:31Z | - |
| dc.date.available | 2024-02-21 | - |
| dc.date.copyright | 2024-02-20 | - |
| dc.date.issued | 2024 | - |
| dc.date.submitted | 2024-01-30 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91620 | - |
| dc.description.abstract | 透過虛擬實境技術,我們可以模擬沉浸式的虛擬世界,並將此技術應用至教育領域。雖然目前虛擬實境頭盔的開發廠商提供虛擬實境的開發指引,但仍然難以套用至各種領域中,例如博物館或美術館用來傳播文化遺產。這類型的虛擬實境應用需要跨領域團隊合作,結合歷史學家、博物館研究員以及程式開發團隊共同協力完成虛擬實境體驗,並且在開發階段需考量互動開發、財務預算、硬體限制以及體驗內容的效能處理,使得製作這類型的虛擬實境體驗別具挑戰性。本篇論文根據文化遺產類型於虛擬實境體驗的呈現方式進行分類,提出對應不同類型文化遺產呈現的設計考量框架。本篇論文亦以過去製作並已發表的虛擬實境博物館體驗為最佳實踐範例,說明這些專案的設計考量以及互動實作方式,以製作高沉浸的虛擬實境應用。此外,我們也套用多模式互動方法,讓使用者可以獲得多重感官互動體驗,以強化虛擬實境體驗。在各實例中,本篇論文亦進行使用者研究以及場域應用評估,以量測系統易用性、沉浸程度以及主觀的學習成效。本篇論文總結了博物館虛擬實境應用開發之跨領域合作所需的設計考量,以及數位文化遺產於虛擬環境中的互動方法。 | zh_TW |
| dc.description.abstract | With virtual reality technology, we can simulate immersive virtual worlds, making it a valuable tool in education. Despite existing best practices outlined by head-mounted display providers, applying VR in diverse fields, such as disseminating cultural heritage in museums, presents unique challenges. This application necessitates collaboration among historians, museum researchers, and developers to effectively convey cultural heritage contexts, considering factors like human aspects, financial constraints, hardware limitations, and performance issues. This thesis categorizes cultural heritage in VR experience design, providing a framework for design considerations in various heritage presentations. We present our best practices, offering insights into design considerations and interaction implementation to create high-immersion VR museum experiences. Additionally, our projects leverage multimodal interactions to enable multiple sensations in the VR museum setting. Through user studies and field deployment assessments, we review system usability, immersion, and subjective learning aspects. In conclusion, our results not only enhance cross-field cooperation in developing VR museum experiences but also contribute interaction insights for translating cultural heritage into virtual environments. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-02-20T16:14:31Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2024-02-20T16:14:31Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Verification Letter from the Oral Examination Committee
Acknowledgments
摘要
Abstract
Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Background and Motivation
1.2 Problem and Approach
Chapter 2 Related Work
2.1 Immersive VR Museum Experiences
2.2 Approaching to create the VR Museum Experiences
2.2.1 Cinematic Virtual Reality (CVR)
2.2.2 Interactive Virtual Reality (IVR)
2.2.3 Hybrid Approaches
2.3 Experiencing Cultural Heritage in VR
2.3.1 Movable Heritage VR Experience
2.3.2 Immovable Heritage VR experience
2.4 Interaction Concept in Cultural Heritage Experiences
2.4.1 Cultural Heritage Education and VR
2.4.2 Locomotion Techniques
2.4.3 Gaze-based interaction in VR applications
2.4.4 Guidance in VR
2.5 Summary
2.5.1 Visual Design of VR Experience
2.5.2 Moving Techniques
2.5.3 Guidance Strategy
2.5.4 Design Space of VR Cultural Heritage
Chapter 3 Applying the Exploration-Based Interaction to Immovable Heritage VR Experience
3.1 Introduction
3.2 Design Consideration
3.2.1 Stories of Murals
3.2.2 Unreachable And Obstructed Murals Throughout the Cave
3.2.3 Damaged Murals and Lost Statues
3.3 Interaction Design
3.3.1 Spatial Context and Augmented Information
3.3.2 Teleport Technique for Moving Around in the Cave
3.3.3 Digital Restoration of Damaged Murals and Lost Statues
3.4 System Overview
3.5 User Study
3.6 Result
3.7 Discussion
3.7.1 Regions of Interest
3.7.2 Forgot or Unseen
3.7.3 Acquiring Information from Spatial Context
3.7.4 Audio Guidance in the Virtual Tour
3.8 Conclusion
Chapter 4 Applying the Exploration-Based Interaction to Movable Heritage VR Experience
4.1 Introduction
4.2 Design Consideration
4.2.1 Stories of Heritage
4.2.2 Reconstruction of Heritage Background
4.3 Interaction Design
4.3.1 Interaction of Excavation
4.3.2 Interaction with First Dog
4.4 Multimodal Tactile Feedback
4.4.1 Wearable Devices
4.4.2 Environment-Based Feedback
4.4.3 Physical Scenery Design
4.5 System Overview
4.6 Field Deploy Study
4.7 Result and Discussion
4.7.1 Discussion
4.8 Conclusion
Chapter 5 Applying the Storytelling-Based Interaction to Movable Heritage VR Experience
5.1 Introduction
5.2 Design Consideration
5.2.1 Collaboration Process with Museum
5.2.2 Exploration of visible and invisible objects
5.2.3 Style Transfer Application
5.3 Interaction Design
5.3.1 Water Rendering for Chinese Watercolor Painting in VR
5.3.2 Post-processing for Chinese Watercolor Painting in VR
5.3.3 Gaze-based Interaction for Storytelling
5.3.4 Style Transfer Technique of Heritage Painting Interpretation
5.4 System Overview
5.5 User Study
5.6 Result
5.7 Discussion
5.7.1 Understanding of Virtual Painting Environment
5.7.2 Area of Interesting
5.8 Conclusion
Chapter 6 Conclusion and Future Work
6.1 Summary of This Thesis
6.2 Future Direction
References | - |
| dc.language.iso | en | - |
| dc.subject | 虛擬實境 | zh_TW |
| dc.subject | 使用者經驗 | zh_TW |
| dc.subject | 沉浸式互動 | zh_TW |
| dc.subject | 多模組回饋技術 | zh_TW |
| dc.subject | 文化遺產 | zh_TW |
| dc.subject | Virtual Reality | en |
| dc.subject | Cultural Heritage | en |
| dc.subject | Multimodal Feedback Techniques | en |
| dc.subject | Immersive Interaction | en |
| dc.subject | User Experience | en |
| dc.title | 用於強化沉浸式體驗之多模式回饋技術於虛擬實境博物館體驗 | zh_TW |
| dc.title | Multimodal Interaction for Enhancing Immersion in VR Museum Experience | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 112-1 | - |
| dc.description.degree | 博士 | - |
| dc.contributor.oralexamcommittee | 歐陽明;莊永裕;陳彥仰;李明穗;胡敏君;孫士韋 | zh_TW |
| dc.contributor.oralexamcommittee | Ming Ouhyoung;Yung-Yu Chuang;Mike Y. Chen;Ming-Sui Lee;Min-Chun Hu;Shih-Wei Sun | en |
| dc.subject.keyword | 虛擬實境,使用者經驗,沉浸式互動,多模組回饋技術,文化遺產 | zh_TW |
| dc.subject.keyword | Virtual Reality,User Experience,Immersive Interaction,Multimodal Feedback Techniques,Cultural Heritage | en |
| dc.relation.page | 125 | - |
| dc.identifier.doi | 10.6342/NTU202400320 | - |
| dc.rights.note | 同意授權(限校園內公開) | - |
| dc.date.accepted | 2024-02-01 | - |
| dc.contributor.author-college | 電機資訊學院 | - |
| dc.contributor.author-dept | 資訊網路與多媒體研究所 | - |
| dc.date.embargo-lift | 2029-01-29 | - |
Appears in Collections: 資訊網路與多媒體研究所
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-112-1.pdf (restricted access) | 59.24 MB | Adobe PDF |
