Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69493
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳彥仰(Mike Y. Chen) | |
dc.contributor.author | Yu-An Chen | en |
dc.contributor.author | 陳俞安 | zh_TW |
dc.date.accessioned | 2021-06-17T03:17:16Z | - |
dc.date.available | 2018-07-19 | |
dc.date.copyright | 2018-07-19 | |
dc.date.issued | 2018 | |
dc.date.submitted | 2018-07-03 | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69493 | - |
dc.description.abstract | 空拍機配備能自由調整角度的相機在攝影領域越來越熱門。但若 要同時操作空拍機與相機時,需要控制五到六個維度,這需要大量 的訓練。我們提出了 ARPilot,讓使用者能夠使用透過行動裝置利用 擴增實境的技術直接對著縮小後的虛擬 3D 模型模擬空中拍攝。使用 者會將行動裝置當做鏡頭,直覺的找尋自己想像中的關鍵影格。我們 也探討了三種六維的虛擬實境攝影方式,分別為:AR keyframe、AR continuous 和 AR hybrid。我們不只比較了這三種新拍攝模式的差異和 使用者偏好,也比較了使用虛擬實境的技術來操控鏡頭與傳統觸控方 式來調控鏡頭的時間、準確度與偏好。結果顯示,AR hybrid 模式是最 受偏好且耗費心力最低的模式,而使用者也認為 AR continuous 模式能 幫助拍出較有創意的空拍成品。為了提供未來設計作為參考,我們也 對幾種不同的應用進行討論並回報觀點。 | zh_TW |
dc.description.abstract | Drones offer camera angles that are not possible with traditional cameras and are becoming increasingly popular for videography.
However, flying a drone while simultaneously controlling its camera requires manipulating 5-6 degrees of freedom (DOF), which demands significant training. We present ARPilot, a direct-manipulation interface that lets users plan an aerial video by physically moving their mobile devices around a miniature 3D model of the scene, shown via Augmented Reality (AR). The mobile device acts as the viewfinder, making it intuitive to explore and frame shots. We leveraged AR technology to explore three 6-DOF video-shooting interfaces on mobile devices: AR keyframe, AR continuous, and AR hybrid, and compared them against a traditional touch interface in a user study. The results show that AR hybrid is the most preferred by participants and requires the least effort among all the techniques, while user feedback suggests that AR continuous enables more creative shots. We discuss several distinct usage patterns and report insights for future design. | en
dc.description.provenance | Made available in DSpace on 2021-06-17T03:17:16Z (GMT). No. of bitstreams: 1 ntu-107-R05922105-1.pdf: 2736485 bytes, checksum: 7094b372a91344a3dda806dcebb39199 (MD5) Previous issue date: 2018 | en |
dc.description.tableofcontents | 口試委員會審定書 (Oral Examination Committee Approval)
摘要 (Chinese Abstract)
Abstract
1 Introduction
2 Related Work
2.1 Human-drone interaction
2.1.1 Mission planner
2.1.2 Trajectory planning
2.2 Tangible and Touch Camera Controls
2.2.1 3D object manipulation and data exploration
2.2.2 3D data exploration
2.2.3 Peephole navigation on mobile devices
3 ARPilot
3.1 AR Keyframe Interface
3.2 AR Hybrid Interface
3.3 AR Continuous Interface
3.4 Implementation
3.4.1 Model placement
3.4.2 Path routing
3.4.3 Safety tips
3.4.4 Video preview
3.4.5 Drone mission
4 User Study
4.1 Task Design
4.1.1 Forward
4.1.2 Pull back
4.1.3 Sideways sliding
4.1.4 Panorama
4.1.5 Orbiting
4.2 Participants
4.3 Apparatus and Implementation
4.4 Procedure
5 Results
5.1 Performance
5.2 Similarity
5.3 Effort
5.4 Preference
6 Discussion and Future Work
6.1 Physical Movement Effort
6.2 Creative Tasks
6.3 Further Next Step Considerations
7 Conclusion
8 Acknowledgements
Bibliography | |
dc.language.iso | en | |
dc.title | 擴增飛行員: 空拍規劃擴增實境行動裝置介面 | zh_TW |
dc.title | ARPilot: Designing and Investigating AR Shooting Interfaces on Mobile Devices for Drone Videography | en |
dc.type | Thesis | |
dc.date.schoolyear | 106-2 | |
dc.description.degree | 碩士 (Master's) | |
dc.contributor.oralexamcommittee | 余能豪(Neng-Hao Yu),黃大源(Da-Yuan Huang) | |
dc.subject.keyword | 互動科技,擴增實境,虛擬相機控制,行動裝置 | zh_TW |
dc.subject.keyword | Interaction techniques, augmented reality, virtual camera control, mobile device | en |
dc.relation.page | 30 | |
dc.identifier.doi | 10.6342/NTU201801154 | |
dc.rights.note | 有償授權 (licensed for a fee) | |
dc.date.accepted | 2018-07-03 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
Appears in Collections: | Department of Computer Science and Information Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-107-1.pdf Currently not authorized for public access | 2.67 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.