Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74942
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 連豊力 | |
dc.contributor.author | Che-Cheng Chang | en |
dc.contributor.author | 張哲誠 | zh_TW |
dc.date.accessioned | 2021-06-17T09:10:51Z | - |
dc.date.available | 2020-03-11 | |
dc.date.copyright | 2020-03-11 | |
dc.date.issued | 2019 | |
dc.date.submitted | 2019-08-13 | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74942 | - |
dc.description.abstract | 本篇論文為無人飛行器之動態物追蹤、跟隨與空中攝影提出一個完整的系統架構。在此架構中,無人飛行器倚賴單眼視覺,並將影像資訊回傳至遠端電腦,達成追蹤、定位與跟隨動態物的任務。本篇論文主要的貢獻在於,不需仰賴額外的動作捕捉系統,即可在任意環境同時定位無人飛行器並且定位地面上之動態物,以及實時規劃無人飛行器的追蹤軌跡,其考慮無人飛行器之動力學與攝影視角的優化。
其中有五個平行運作的程序來負責不同的功能。ORB-SLAM程序負責提供飛行器的自體定位與重建真實尺度的三維空間資訊。由於單眼視覺SLAM系統缺乏真實尺度資訊,因此本系統使用一個超聲波距離感測器,估測出單眼影像系統的真實尺度。動態物姿態估測程序負責在已重建的三維空間中,倚賴單眼影像與飛行器之姿態來定位動態物。在這個程序中,首先使用Kernelized Correlation Filter抓取待被追蹤的物體,並透過物體的特徵點擷取搭配特徵點投影之影像平面的幾何關係,求得被追蹤物在三維空間的幾何座標。軌跡規劃程序負責根據動態物之位姿,計算出飛行器應抵達的攝影位置與相對動態物移動方向的特定角度,並規劃平滑且符合無人機動態行為的軌跡。此程序規劃以多項式為基底的平滑軌跡,確保位置的四階微分函數具有連續性,並且隨時控制被追蹤物在影像中的行位置保持在影像平面中心。姿態控制器程序根據軌跡規劃的結果,控制飛行器以抵達該指定位置。最後,本論文展示了多項模擬以及實驗結果,以證明此系統的可行性以及性能。 | zh_TW
dc.description.abstract | In this thesis, an autonomous flight system for a quadrotor is proposed for moving-object tracking, following, and aerial photography. In this system, the quadrotor relies on monocular vision and transmits RGB images back to a remote computer to perform the tasks of tracking, locating, and following a moving target. Our key contributions are the localization of both the unmanned aerial vehicle and a moving target on the ground in arbitrary environments without relying on a motion-capture system, and a real-time planning algorithm that considers the dynamics of the unmanned aerial vehicle and the optimization of the photographic perspective.
Five parallel programs are responsible for different functions. The ORB-SLAM program provides localization of the quadrotor and reconstruction of real-scale 3D spatial information. Since a monocular SLAM system lacks real-scale information, the system uses an ultrasonic range sensor to estimate the true scale of the monocular SLAM system. The dynamic-target pose-estimation program locates the moving target in the reconstructed three-dimensional space using the onboard monocular image and the attitude of the aircraft. In this program, a Kernelized Correlation Filter is used to track the target on the image plane, and the position of the tracked target in three-dimensional coordinates is obtained from the geometry of the target's feature points projected onto the image plane. The trajectory planning program calculates setpoints, plans a trajectory that conforms to the dynamic behavior of the quadrotor, and optimizes visibility on the image plane. This program plans a smooth polynomial-based trajectory, ensuring continuity of the fourth derivative of position, while keeping the column position of the tracked target at the center of the image plane. The attitude-controller program controls the quadrotor to reach the designated position based on the results of the trajectory planning. Finally, analyses of both simulations and experiments are provided to demonstrate the feasibility and performance of the proposed system. | en
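The scale-estimation step described in the abstract (the table of contents lists maximum likelihood and 1-point RANSAC for this) can be illustrated with a minimal sketch. This is not the thesis's implementation: the function name `estimate_scale`, the inlier tolerance, and the iteration count are hypothetical. The sketch assumes paired altitude samples, metric from the ultrasonic sensor and unscaled from monocular SLAM.

```python
import random

def estimate_scale(slam_z, sonar_z, iters=100, tol=0.05):
    """Estimate the metric scale of a monocular SLAM map (illustrative).

    slam_z  : unscaled altitude samples from monocular SLAM
    sonar_z : metric altitude samples from the ultrasonic range sensor
    Uses 1-point RANSAC to reject outlier pairs, then refines the
    surviving inliers with a closed-form least-squares fit.
    """
    pairs = [(s, m) for s, m in zip(slam_z, sonar_z) if abs(s) > 1e-6]
    best_inliers = []
    for _ in range(iters):
        s, m = random.choice(pairs)
        scale = m / s  # single-sample scale hypothesis
        inliers = [(si, mi) for si, mi in pairs if abs(si * scale - mi) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # least-squares refinement: the scale minimizing sum((si*scale - mi)^2)
    num = sum(si * mi for si, mi in best_inliers)
    den = sum(si * si for si, _ in best_inliers)
    return num / den
```

A single sample already determines a scale hypothesis, which is why 1-point RANSAC suffices here; the closed-form refinement is the maximum-likelihood estimate under Gaussian noise on the metric measurements.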
dc.description.provenance | Made available in DSpace on 2021-06-17T09:10:51Z (GMT). No. of bitstreams: 1 ntu-108-R06921002-1.pdf: 13367326 bytes, checksum: 627212df7a622115733d3a5a63d54a39 (MD5) Previous issue date: 2019 | en |
dc.description.tableofcontents | 摘要 i
ABSTRACT iii
CONTENTS vi
LIST OF FIGURES viii
LIST OF TABLES xiv
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Problem Formulation 3
1.3 Contributions 5
1.4 Organization of the Thesis 6
Chapter 2 Background and Literature Survey 8
2.1 Quadrotor-Based Tracking System 8
2.2 Visual Servoing 12
2.3 Trajectory Representation and Planning 17
Chapter 3 Related Algorithms 20
3.1 Differential Flatness 20
3.2 Pinhole Camera Model 25
3.3 Oriented FAST and Rotated BRIEF SLAM (ORB-SLAM) 27
3.4 Kernelized Correlation Filters (KCF) Tracker 29
Chapter 4 System Overview 32
4.1 Preliminaries 32
4.2 Coordinate Frames 34
4.2.1 ORB-SLAM Coordinate System 34
4.2.2 3D Reconstructed Coordinate System 36
4.3 System Structure 39
4.4 Controller Design 43
4.4.1 Control for Scale Estimation Mode 44
4.4.2 Control for Target Following Mode 45
Chapter 5 Vision-Based Relative Pose Estimation 47
5.1 Scale Estimation for Monocular SLAM 48
5.1.1 Data Alignment 50
5.1.2 Maximum Likelihood 51
5.1.3 1-Point RANSAC for Scale Estimation 55
5.2 Target State Estimation 56
5.2.1 Feature Point Selection 59
5.2.2 Relative Pose Estimation 61
5.2.3 Homogeneous Transform for Target Position 63
Chapter 6 Online Trajectory Planning 66
6.1 Trajectory Representation 68
6.2 Following Trajectory Generation Strategy 69
6.3 Trajectory Optimization and Constraints 73
Chapter 7 Simulation and Experimental Results and Analysis 77
7.1 Experimental Setup 78
7.1.1 Hardware Platform 78
7.1.2 Software Platform 80
7.1.3 Experimental Scenes 81
7.2 Simulations 92
7.2.1 Static Environment Reconstruction with Static Target 92
7.2.2 Dynamic Target Tracking with a Static Quadrotor 111
7.2.3 Target Following and Filming in Static Environment 152
7.3 Experiments 196
7.3.1 Static Environment Reconstruction with Static Target 196
7.3.2 Target Following and Filming in Static Environment 208
7.4 Summary 248
7.4.1 Static Environment Reconstruction with Static Target 248
7.4.2 Dynamic Target Tracking with a Static Quadrotor 250
7.4.3 Target Following and Filming in Static Environment 252
Chapter 8 Conclusions and Future Works 256
8.1 Conclusions 256
8.2 Future Works 258
References 261 | |
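The abstract describes planning smooth polynomial trajectories whose fourth derivative of position (snap) stays continuous. As a minimal sketch of what segment-boundary continuity means, and not the thesis's planner, the hypothetical helpers below evaluate a polynomial segment and its derivatives at an endpoint; two consecutive segments join smoothly when these boundary values match.

```python
def poly_deriv(c):
    """Coefficients of the derivative of c[0] + c[1]*t + c[2]*t**2 + ..."""
    return [i * ci for i, ci in enumerate(c)][1:]

def poly_eval(c, t):
    """Evaluate the polynomial with coefficient list c at time t."""
    return sum(ci * t**i for i, ci in enumerate(c))

def boundary_values(c, t, order=4):
    """Position through snap (0th..4th derivative) of segment c at time t."""
    values = []
    for _ in range(order + 1):
        values.append(poly_eval(c, t))
        c = poly_deriv(c)
    return values
```

For two consecutive segments `c_a` (of duration `T`) and `c_b`, snap continuity at the joint means `boundary_values(c_a, T) == boundary_values(c_b, 0)`; a planner enforces these five equalities per joint (per axis) as linear constraints on the polynomial coefficients.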
dc.language.iso | en | |
dc.title | 基於單眼視覺之無人飛行器三維空間之動態物定位、跟隨與空中攝影 | zh_TW |
dc.title | Monocular Vision-Based Unmanned Aerial Vehicle Autonomous Flight for 3D Tracking, Following and Aerial Videography of Moving Target | en |
dc.type | Thesis | |
dc.date.schoolyear | 108-2 | |
dc.description.degree | Master's | |
dc.contributor.oralexamcommittee | 李後燦,黃正民 | |
dc.subject.keyword | 航空追蹤系統,單眼視覺同步定位與地圖建構,影像伺服,即時系統,路徑規劃,空中攝影, | zh_TW |
dc.subject.keyword | Aerial tracking systems,monocular SLAM system,visual servoing,real-time system,path planning,aerial photography, | en |
dc.relation.page | 270 | |
dc.identifier.doi | 10.6342/NTU201902841 | |
dc.rights.note | Authorized for a fee | |
dc.date.accepted | 2019-08-14 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 電機工程學研究所 | zh_TW |
Appears in Collections: | Department of Electrical Engineering
Files in This Item:
File | Size | Format |
---|---|---|---|
ntu-108-1.pdf Restricted Access (currently not authorized for public access) | 13.05 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.