Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/66923

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 歐陽明 | |
| dc.contributor.author | Yu-Kai Chiu | en |
| dc.contributor.author | 邱昱愷 | zh_TW |
| dc.date.accessioned | 2021-06-17T01:14:56Z | - |
| dc.date.available | 2017-08-24 | |
| dc.date.copyright | 2017-08-24 | |
| dc.date.issued | 2017 | |
| dc.date.submitted | 2017-08-15 | |
| dc.identifier.citation | [1] B. Aronov, S. Har-Peled, C. Knauer, Y. Wang, and C. Wenk. Fréchet distance for curves, revisited. In European Symposium on Algorithms, pages 52–63. Springer, 2006. [2] C. Buehler, M. Bosse, and L. McMillan. Non-metric image-based rendering for video stabilization. In Computer Vision and Pattern Recognition, 2001. CVPR 2001. Proceedings of the 2001 IEEE Computer Society Conference on, volume 2, pages II–II. IEEE, 2001. [3] P. Clini, E. Frontoni, R. Quattrini, and R. Pierdicca. Augmented reality experience: From high-resolution acquisition to real time augmented contents. Advances in Multimedia, 2014:18, 2014. [4] Coldplay. Behind the scene of music Video - Up and Up, 2016. [5] Coldplay. Music Video of - Up and Up, 2016. [6] A. Damala, P. Cubaud, A. Bationo, P. Houlier, and I. Marchal. Bridging the gap between the digital and the physical: design and evaluation of a mobile augmented reality guide for the museum visit. In Proceedings of the 3rd International Conference on Digital Interactive Media in Entertainment and Arts, pages 120–127. ACM, 2008. [7] B. Delaunay. Sur la sphere vide. [8] M. L. Gleicher and F. Liu. Re-cinematography: Improving the camerawork of casual video. ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM), 5(1):2, 2008. [9] M. Grundmann, V. Kwatra, and I. Essa. Auto-directed video stabilization with robust L1 optimal camera paths. In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on, pages 225–232. IEEE, 2011. [10] R. I. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, ISBN: 0521623049, 2000. [11] J. Huang, Z. Chen, D. Ceylan, and H. Jin. 6-dof vr videos with a single 360-camera. In Virtual Reality (VR), 2017 IEEE, pages 37–44. IEEE, 2017. [12] Y.-H. Huang, W.-L. Yang, Y.-L. Kao, Y.-K. Chiu, Y.-B. Huang, H.-Y. Chang, and M. Ouhyoung. A novel dexterous instrument tracking system for augmented reality cataract surgery training system. In SIGGRAPH ASIA 2016 VR Showcase, page 14. ACM, 2016. [13] Y.-H. Huang, T.-C. Yu, P.-H. Tsai, Y.-X. Wang, W.-L. Yang, and M. Ouhyoung. Scope+: A stereoscopic video see-through augmented reality microscope. In Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, pages 33–34. ACM, 2015. [14] T. Igarashi, T. Moscovich, and J. F. Hughes. As-rigid-as-possible shape manipulation. In ACM Transactions on Graphics (TOG), volume 24, pages 1134–1141. ACM, 2005. [15] C. Kenworthy. Master Shots: 100 Advanced Camera Techniques to Get an Expensive Look on Your Low Budget Movie. 2009. [16] J. Lee and S. Y. Shin. General construction of time-domain filters for orientation data. IEEE Transactions on Visualization and Computer Graphics, 8(2):119–128, 2002. [17] F. Liu, M. Gleicher, H. Jin, and A. Agarwala. Content-preserving warps for 3d video stabilization. ACM Transactions on Graphics (TOG), 28(3):44, 2009. [18] F. Liu, M. Gleicher, J. Wang, H. Jin, and A. Agarwala. Subspace video stabilization. ACM Transactions on Graphics (TOG), 30(1):4, 2011. [19] S. Liu, Y. Wang, L. Yuan, J. Bu, P. Tan, and J. Sun. Video stabilization with a depth camera. In Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on, pages 89–95. IEEE, 2012. [20] S. Liu, L. Yuan, P. Tan, and J. Sun. Bundled camera paths for video stabilization. ACM Transactions on Graphics (TOG), 32(4):78, 2013. [21] W. R. Mark, L. McMillan, and G. Bishop. Post-rendering 3D warping. In Proceedings of the 1997 Symposium on Interactive 3D Graphics, pages 7–ff. ACM, 1997. [22] C. Morimoto and R. Chellappa. Evaluation of image stabilization algorithms. In Acoustics, Speech and Signal Processing, 1998. Proceedings of the 1998 IEEE International Conference on, volume 5, pages 2789–2792. IEEE, 1998. [23] A. Myronenko and X. Song. Point set registration: Coherent point drift. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(12):2262–2275, 2010. [24] C. Nguyen, Y. Niu, and F. Liu. Direct manipulation video navigation in 3d. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 1169–1172. ACM, 2013. [25] S. Nicolau, A. Garcia, X. Pennec, L. Soler, and N. Ayache. An augmented reality system to guide radio-frequency tumour ablation. Computer Animation and Virtual Worlds, 16(1):1–10, 2005. [26] S. Nicolau, L. Soler, D. Mutter, and J. Marescaux. Augmented reality in laparoscopic surgical oncology. Surgical Oncology, 20(3):189–201, 2011. [27] R. Pierdicca, E. Frontoni, P. Zingaretti, M. Sturari, P. Clini, and R. Quattrini. Advanced interaction with paintings by augmented reality and high resolution visualization: a real case exhibition. In International Conference on Augmented and Virtual Reality, pages 38–50. Springer, 2015. [28] M. Rosenthal, A. State, J. Lee, G. Hirota, J. Ackerman, K. Keller, E. D. Pisano, M. Jiroutek, K. Muller, and H. Fuchs. Augmented reality guidance for needle biopsies: an initial randomized, controlled trial in phantoms. Medical Image Analysis, 6(3):313–320, 2002. [29] A. Savitzky and M. J. Golay. Smoothing and differentiation of data by simplified least squares procedures. Analytical Chemistry, 36(8):1627–1639, 1964. [30] K. Shoemake. Animating rotation with quaternion curves. In ACM SIGGRAPH Computer Graphics, volume 19, pages 245–254. ACM, 1985. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/66923 | - |
| dc.description.abstract | 近年來,電影特效的合成越來越普及,電影公司大量的使用綠幕影像的合成來創作。這類的技術需要非常專業的器材輔助,例如機械手臂、吊臂與滑軌等等。主要目的是為了讓兩段要合成的影片之運鏡能夠相同,如此才能順利結合而不產生變形或視差等等的破綻。是一件相當精細且困難的任務。然而,這類專業器材的價格相當昂貴,動則數十萬甚至百萬,對於學生或是一般獨立製片是相當昂貴的負擔。因此,我們使用擴增實境之導引來取代上述器材,讓使用者藉由手持相機及擴增實境之導引進行拍攝。同時也實作了軟體影像穩定之演算法來修正由手持相機所造成之晃動與位移。以低成本之系統設計達到相似的效果。 | zh_TW |
| dc.description.abstract | Creating videos by compositing multiple footages normally requires the support of a robotic arm, because the camera motion of each shot must be reproduced precisely, and shooting such footage with a hand-held camera is extremely difficult. However, the cost of a robotic arm is extremely high, so we introduce an augmented reality guiding system to replace it. In our system, augmented reality guides the user along the intended camera motion, and a stabilization and camera-motion-alignment algorithm corrects the residual shake of the hand-held camera. The system reduces the cost while maintaining good quality in the result. (See the illustrative path-smoothing sketch after this metadata record.) | en |
| dc.description.provenance | Made available in DSpace on 2021-06-17T01:14:56Z (GMT). No. of bitstreams: 1 ntu-106-R04922121-1.pdf: 13547926 bytes, checksum: 57579335ca2786e1b6db1b2748a21135 (MD5) Previous issue date: 2017 | en |
| dc.description.tableofcontents | 1 Introduction; 1.1 Background and Motivation; 1.2 Challenge; 2 Related Work; 2.1 Productional Compositing; 2.2 Video Stabilization; 2.2.1 Two-dimensional Stabilization; 2.2.2 Three-dimensional Stabilization; 2.2.3 Hybrid Stabilization; 2.2.4 Cinematography Camera Movement; 2.3 Augmented Reality Guidance; 3 System Overview; 3.1 System Pipeline; 3.2 Hardware; 3.2.1 HTC Vive; 3.2.2 DSLR Camera; 3.2.3 Camera Holder; 3.3 Augmented Reality Guiding System; 4 Implementation; 4.1 Augmented Reality Path Guidance; 4.1.1 Position Guidance; 4.1.2 Orientation Guidance; 4.1.3 Speed Guidance and Demonstration; 4.1.4 Scale and Starting Position of the Path; 4.1.5 Real-time Chroma Keys; 4.2 3D Motion Model Extraction; 4.2.1 Structure From Motion; 4.2.2 Vive Tracker; 4.3 3D Motion Data Smoothing; 4.3.1 Position Data; 4.3.2 Orientation Data; 4.4 Path Matching; 4.5 Pre-warping; 4.6 Content Warping; 5 Result; 6 Discussion and Limitation; 7 Conclusion; 8 Future Work; Bibliography | |
| dc.language.iso | en | |
| dc.subject | 綠幕合成 | zh_TW |
| dc.subject | 擴增實境 | zh_TW |
| dc.subject | 電腦視覺 | zh_TW |
| dc.subject | Augmented Reality | en |
| dc.subject | Computer Vision | en |
| dc.subject | Compositing | en |
| dc.title | 以擴增實境輔助電影拍攝 - 綠幕合成之應用 | zh_TW |
| dc.title | AR Filming : Augmented Reality Guide for Compositing Footage in Filmmaking | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 105-2 | |
| dc.description.degree | 碩士 | |
| dc.contributor.oralexamcommittee | 葉正聖,李明穗 | |
| dc.subject.keyword | 擴增實境, 電腦視覺, 綠幕合成 | zh_TW |
| dc.subject.keyword | Augmented Reality, Computer Vision, Compositing | en |
| dc.relation.page | 40 | |
| dc.identifier.doi | 10.6342/NTU201703242 | |
| dc.rights.note | 有償授權 | |
| dc.date.accepted | 2017-08-15 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
| Appears in Collections: | 資訊工程學系 | |
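
The abstract above mentions a stabilization and camera-motion-alignment step that corrects the shake of the hand-held footage before compositing; the thesis treats position and orientation data separately (Section 4.3 of the table of contents). The following Python sketch is an illustration only, not the thesis implementation: it shows one common way such smoothing can be done, with a moving-average filter over camera positions and an incremental slerp (spherical linear interpolation, in the spirit of Shoemake [30]) over unit quaternions. The function names (`smooth_positions`, `smooth_orientations`, `slerp`), the window size, and the blending weight are hypothetical choices made for the example.

```python
# Minimal sketch of hand-held camera path smoothing (illustrative assumptions:
# window size, slerp weight, and sample data are not from the thesis).
import numpy as np


def smooth_positions(positions, window=9):
    """Moving-average filter over an (N, 3) array of camera positions."""
    positions = np.asarray(positions, dtype=float)
    half = window // 2
    padded = np.pad(positions, ((half, half), (0, 0)), mode="edge")
    kernel = np.ones(window) / window
    # Filter each coordinate axis independently, then reassemble (N, 3).
    return np.stack(
        [np.convolve(padded[:, d], kernel, mode="valid") for d in range(3)],
        axis=1,
    )


def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:               # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:            # nearly parallel: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)


def smooth_orientations(quats, weight=0.2):
    """Low-pass filter a quaternion sequence by slerping the running smoothed
    orientation toward each raw sample (an exponential smoother on SO(3))."""
    smoothed = [np.asarray(quats[0], float)]
    for q in quats[1:]:
        smoothed.append(slerp(smoothed[-1], q, weight))
    return np.stack(smoothed)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Jittery positions along a straight dolly move (synthetic data).
    raw_pos = np.cumsum(np.tile([0.01, 0.0, 0.0], (120, 1)), axis=0)
    raw_pos += rng.normal(scale=0.005, size=raw_pos.shape)
    print(smooth_positions(raw_pos)[:3])
    # Jittery orientations near identity, stored as (w, x, y, z).
    raw_rot = np.tile([1.0, 0.0, 0.0, 0.0], (120, 1))
    raw_rot += rng.normal(scale=0.01, size=raw_rot.shape)
    raw_rot /= np.linalg.norm(raw_rot, axis=1, keepdims=True)
    print(smooth_orientations(raw_rot)[:3])
```

In the system described by the abstract, the smoothed trajectory would additionally be aligned with the pre-recorded camera path (path matching, Section 4.4) before the frames are warped; the sketch covers only the smoothing stage.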
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-106-1.pdf (restricted access) | 13.23 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
