Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94626

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 連豊力 | zh_TW |
| dc.contributor.advisor | Feng-Li Lian | en |
| dc.contributor.author | 許芳源 | zh_TW |
| dc.contributor.author | Fang-Yuan Hsu | en |
| dc.date.accessioned | 2024-08-16T17:10:25Z | - |
| dc.date.available | 2024-08-17 | - |
| dc.date.copyright | 2024-08-16 | - |
| dc.date.issued | 2024 | - |
| dc.date.submitted | 2024-08-12 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94626 | - |
| dc.description.abstract | 無人機可用於許多空中任務,例如探索、檢查、建圖、與環境互動與搜救。在這些無人機的應用中,其中一項是針對警隊。在市區中,追捕嫌疑人或目標是市區警隊的重要任務之一,此方式無需消耗過多人力。然而,長期的訓練過程是非常勞力密集的。同時,在任務過程中,一些畫面的目標估測失誤可能也會發生,這可能是由於障礙物或訊號傳輸故障造成的遮擋情形引起的。
因此,在這篇論文中,我們使用一四旋翼無人機隊解決針對單一目標之空中多群體自主追蹤問題,而此無人機隊會受到障礙物或無法預測之暫時訊號傳輸故障遮擋的影響,造成目標估測暫時失靈。此系統包含使用卡爾曼濾波器(Kalman Filter)融合機載圖像與慣性測量單元(IMU)測量數據進行無人機的全局定位,根據與無人機相對位置估測目標位置,透過基於軌跡之動作預測解決目標位置暫時的目標估測失效,提取唯一目標狀態的平均演算法,以及基於共識(Consensus)之多群體系統隊形追蹤控制演算法。此外也提供了實驗結果來展示系統的可用性。 | zh_TW |
| dc.description.abstract | UAVs (Unmanned Aerial Vehicles), or drones, can be used for many aerial tasks, such as exploration, inspection, mapping, interaction with the environment, and search and rescue. One of these applications is supporting police teams. Chasing suspects or targets is one of the most important tasks for urban police, and UAVs allow it to be carried out without much manpower. However, the long training process for such operations is labor-intensive. Moreover, during a task, image-based estimation of the target may fail in some frames, caused by occlusion from obstacles or by signal transmission failures.
As a result, in this thesis, we solve the autonomous aerial multi-agent tracking problem toward a single target using a team of quadrotors, under occlusion by obstacles or unpredictable temporary signal transmission failures that cause temporary loss of target estimation. The system includes global localization of the quadrotors by Kalman filter (KF) fusion of onboard images and IMU measurements, target position estimation from the positions relative to the quadrotors, trajectory-based target motion prediction to bridge temporary estimation failures of the target position, an averaging algorithm for extracting a unique target state, and a consensus-based formation tracking algorithm for controlling the multi-agent system. Furthermore, we provide experimental results to demonstrate the capability of the system. (Minimal illustrative sketches of these components follow the metadata table below.) | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-08-16T17:10:25Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2024-08-16T17:10:25Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Verification Letter from the Oral Examination Committee i
誌謝 (Acknowledgements) iii
摘要 (Chinese Abstract) v
Abstract vi
Contents ix
List of Figures xv
List of Tables xxi
Chapter 1 Introduction 1
1.1 Motivation for Using Multi-Agent System 1
1.2 Problem Formulation of Aerial Multi-Agent Tracking 4
1.3 Contributions of the Thesis 6
1.4 Organization of the Thesis 9
Chapter 2 Literature Survey 11
2.1 Accurate Localization with Vision Assistance 11
2.1.1 Localization with Visual Odometry (VO) with Vision Directly 12
2.1.2 Localization with Visual Inertial Odometry (VIO) Based on Fusion of Vision and IMU Information 12
2.1.3 Landmark-Based Localization, a Modified Version of VIO 13
2.2 Visual Servoing for Robots 15
2.2.1 Position-Based Visual Servoing (PBVS) Based on Position Feedback 15
2.2.2 Image-Based Visual Servoing (IBVS) Based on Image Features Feedback 15
2.2.3 Hybrid Approach, Combination of PBVS and IBVS 16
2.3 Formation Tracking for Multi-Agent Systems 16
Chapter 3 Related Works of the System 21
3.1 Camera Pinhole Model, a Coordinate Transformation between Camera Image and World Frame 21
3.2 Kalman Filter for State Estimation and Sensor Fusion 25
3.3 Mathematical Optimization Problems 26
3.3.1 Quadratic Programming (QP) 29
3.4 Consensus-Based Formation Tracking Control Strategy 30
Chapter 4 Methodologies 31
4.1 System Overview for Vision-Based Multi-Agent Trajectory Tracking System with Landmark-Based Localization 31
4.2 Proposed KF-Based Localization with Landmarks 34
4.2.1 Workflow of State Estimation According to Landmarks with Vision 34
4.2.2 KF-Based State Estimation According to Landmarks 36
4.3 Trajectory-Based Motion Prediction of the Target 37
4.3.1 Target State Estimation with Downward Vision 37
4.3.2 Trajectory Function Representation 38
4.3.3 Target Motion Prediction with Curve Fitting 39
4.4 Control Strategies of the Multi-Quadrotor System 40
4.4.1 Planar Consensus-Based Formation Tracking Control for the Multi-Agent System 40
4.4.1.1 Reference State Extraction 40
4.4.1.2 Planar Consensus-Based Formation Tracking Control 42
Chapter 5 Experiment Results and Validations of the System 45
5.1 Experiment Setup 45
5.2 Task Overview 45
5.3 Quadrotor Positions by Onboard Localization/RGB Camera Estimation within Time 48
5.3.1 Results of 0-1 ∼ 0-3, Non-Control Cases, without Obstacles 49
5.3.2 Results of 1-1 ∼ 1-3, Control Cases, without Obstacles 53
5.3.3 Results of 2-1 ∼ 2-3, Control Cases, with an Obstacle 56
5.3.4 Results of 3-1 ∼ 3-3, Control Cases, with Three Obstacles 62
5.4 Estimated/Predicted Target Positions within Time without/with Curve Fitting 66
5.4.1 Results of 0-1 ∼ 0-3, Non-Control Cases, without Obstacles 68
5.4.2 Results of 1-1 ∼ 1-3, Control Cases, without Obstacles 68
5.4.3 Results of 2-1 ∼ 2-3, Control Cases, with an Obstacle 72
5.4.4 Results of 3-1 ∼ 3-3, Control Cases, with Three Obstacles 77
5.5 Onboard Prediction/RGB Camera Estimation of the Target Position within Time 82
5.5.1 Results of 0-1 ∼ 0-3, Non-Control Cases, without Obstacles 84
5.5.2 Results of 1-1 ∼ 1-3, Control Cases, without Obstacles 87
5.5.3 Results of 2-1 ∼ 2-3, Control Cases, with an Obstacle 90
5.5.4 Results of 3-1 ∼ 3-3, Control Cases, with Three Obstacles 94
5.6 Actual Positions by RGB Camera within Time 101
5.6.1 Results of 0-1 ∼ 0-3, Non-Control Cases, without Obstacles 102
5.6.2 Results of 1-1 ∼ 1-3, Control Cases, without Obstacles 105
5.6.3 Results of 2-1 ∼ 2-3, Control Cases, with an Obstacle 109
5.6.4 Results of 3-1 ∼ 3-3, Control Cases, with Three Obstacles 114
5.7 Command Velocities within Time 118
5.7.1 Results of 0-1 ∼ 0-3, Non-Control Cases, without Obstacles 118
5.7.2 Results of 1-1 ∼ 1-3, Control Cases, without Obstacles 120
5.7.3 Results of 2-1 ∼ 2-3, Control Cases, with an Obstacle 120
5.7.4 Results of 3-1 ∼ 3-3, Control Cases, with Three Obstacles 124
Chapter 6 Conclusions and Future Works 127
6.1 Conclusions 127
6.2 Future Works 129
References 131
Appendix A Image Processing for Target and Landmark Detection with Calibrated Downward Vision 143
A.1 Downward Vision Processing 143
A.2 Undistortion of the Image in Camera Calibration 144
A.3 Target and Landmark Detection Using ArUco Markers 145
Appendix B Introduction of Nonlinear Kalman Filters 147
B.1 Extended Kalman Filter (EKF) Using Linearization 147
B.2 Unscented Kalman Filter (UKF) Using Points Fitting 148
B.3 Other Kinds of Kalman Filters 150
Appendix C Introduction of Other Mathematical Optimization Problems 153
C.1 Convex Optimization Problems 153
C.1.1 Linear Programming (LP) 153
C.1.2 Second-Order Cone Programming (SOCP) 153
C.1.3 Semidefinite Programming (SDP) 154
C.1.4 Conic Programming (CP) 154
C.2 Other Research Fields of Mathematical Optimization 154
Appendix D Other Formation Tracking Control Algorithms for Multi-Agent Systems 157
D.1 Leader-Follower 157
D.2 Potential Function 157
D.3 Virtual Structure 158
D.4 Behavior-Based 158
D.5 Intelligence 159
Appendix E Two-Step Error-Based Trajectories Planning of Quadrotors 161
E.1 Sequential Quadratic Programming (SQP) for Solving Mathematical Optimization Problem 161
E.2 Cost Function Based on Error 163
E.3 Two-Step Error-Based Planning 164
Appendix F Motion Capture System from the Third Perspective for Validation 167
F.1 Hardware Design on the Quadrotors, an ArUco Marker Based State Estimation 167
F.2 Feature Extraction Using RGB Image and Markers 168
F.3 Perspective-n-Point (PnP) Algorithm for Motion Capturing from 2D RGB Image 168
Appendix G Error Accumulation Validation 171
G.1 Experiment Setup 171
G.2 3D Motion and Position within Time 172
Appendix H Height and Yaw Control with PID Controllers 175
H.1 Introduction of PID Controllers 175
H.2 Lowpass Filter for Highly-Jittering Signals 176
H.2.1 Height Control with IMU Information 176
H.2.2 Yaw Control with IMU Information 177
H.3 Experiment Results for Height and Yaw Control with IMU Information 178
H.3.1 Experiment Setup 179
H.3.2 Comparison of Raw/Filtered Data of Height 179
H.3.3 Comparison of Raw/Filtered Data of Yaw 181
Appendix I KF-Based State Estimation 183
I.1 Simulation: Vision Unavailable Case 183
I.2 Experiment Results 185
I.2.1 Experiment: 1D Case, Linear Motion 187
I.2.2 Experiment: 2D Case, Square Motion 193
Appendix J Consensus-Based Formation Tracking Control toward a Stationary Ground Target 199
J.1 Experiment Setup 199
J.2 Planar Trajectories of the Agents 200
J.3 Comparison of Tracking Positions within Time 201
J.4 Command Velocities within Time 201
J.5 Comparison of Estimated/Predicted Target Position vs. Time without/with Curve Fitting 201
Appendix K LiDAR Odometry from the Pointcloud 205
K.1 Experiment Setup 205
K.2 Experiment Results 206
Appendix L Introduction of the Adjacency Matrix and Control Method 209
Appendix M Stability Analysis of the Consensus-Based Formation Controller 211 | - |
| dc.language.iso | en | - |
| dc.subject | 四旋翼無人機 | zh_TW |
| dc.subject | 多群組系統 | zh_TW |
| dc.subject | 卡爾曼濾波器 | zh_TW |
| dc.subject | 動作預測 | zh_TW |
| dc.subject | 軌跡追蹤 | zh_TW |
| dc.subject | 基於共識之隊形控制 | zh_TW |
| dc.subject | Kalman Filter (KF) | en |
| dc.subject | Quadrotor | en |
| dc.subject | Consensus-Based Formation Control | en |
| dc.subject | Trajectory Tracking | en |
| dc.subject | Motion Prediction | en |
| dc.subject | Multi-Agent System | en |
| dc.title | 於視覺遮蔽環境下針對地面目標之空中協同軌跡追蹤定位與監視系統 | zh_TW |
| dc.title | Cooperative Aerial Trajectory Tracking, Localization and Surveillance System toward a Ground Target with Vision Occlusion | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 112-2 | - |
| dc.description.degree | Master's | - |
| dc.contributor.oralexamcommittee | 李後燦;黃正民;江明理 | zh_TW |
| dc.contributor.oralexamcommittee | Hou-Tsan Lee;Cheng-Ming Huang;Ming-Li Chiang | en |
| dc.subject.keyword | 四旋翼無人機,多群組系統,卡爾曼濾波器,動作預測,軌跡追蹤,基於共識之隊形控制, | zh_TW |
| dc.subject.keyword | Quadrotor,Multi-Agent System,Kalman Filter (KF),Motion Prediction,Trajectory Tracking,Consensus-Based Formation Control, | en |
| dc.relation.page | 213 | - |
| dc.identifier.doi | 10.6342/NTU202404031 | - |
| dc.rights.note | Authorized (open access worldwide) | - |
| dc.date.accepted | 2024-08-13 | - |
| dc.contributor.author-college | College of Electrical Engineering and Computer Science | - |
| dc.contributor.author-dept | Department of Electrical Engineering | - |
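
The abstract above names Kalman filter (KF) fusion of onboard images and IMU measurements for global localization of the quadrotors. As a hedged illustration only, here is a minimal per-axis sketch of that kind of fusion under assumed models: a constant-velocity state driven by IMU acceleration and corrected by landmark-based vision position fixes. The class name, noise values, and models are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal per-axis KF sketch: IMU acceleration drives the prediction,
# landmark-based vision position fixes drive the correction.
# All models and noise values here are illustrative assumptions.
import numpy as np

class AxisKF:
    def __init__(self, dt, accel_var=0.5, vision_var=0.02):
        self.x = np.zeros(2)                           # state: [position, velocity]
        self.P = np.eye(2)                             # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity transition
        self.B = np.array([0.5 * dt**2, dt])           # IMU acceleration as control input
        self.Q = accel_var * np.outer(self.B, self.B)  # process noise from accel noise
        self.H = np.array([[1.0, 0.0]])                # vision measures position only
        self.R = np.array([[vision_var]])              # vision measurement noise

    def predict(self, accel):
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z_pos):
        # Standard KF correction; simply skipped while the landmark is occluded,
        # so the estimate coasts on the IMU-driven prediction.
        y = np.array([z_pos]) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

One such filter would run per world axis; when vision drops out, only `predict` runs, matching the abstract's point that occlusion temporarily removes image-based corrections.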
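
The abstract also names trajectory-based target motion prediction with curve fitting to bridge temporary estimation failures. Below is a minimal sketch of one common way to do this, assuming a sliding window of timestamped target position estimates and a low-order polynomial trajectory model; the window length, degree, and example track are hypothetical.

```python
# Curve-fitting prediction sketch: fit x(t) and y(t) polynomials to the recent
# history of target position estimates, then extrapolate while vision fails.
import numpy as np

def predict_target(times, xs, ys, t_query, degree=2):
    """Least-squares polynomial fit over a sliding window, evaluated at t_query."""
    cx = np.polyfit(times, xs, degree)
    cy = np.polyfit(times, ys, degree)
    return np.polyval(cx, t_query), np.polyval(cy, t_query)

# Hypothetical usage: estimates lost at t = 1.0 s; predict the target at t = 1.2 s.
t = np.linspace(0.0, 1.0, 11)
x, y = 0.5 * t**2, 1.0 * t      # made-up planar track
print(predict_target(t, x, y, t_query=1.2))
```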
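
Finally, the abstract names an averaging step that extracts a unique target state and a consensus-based formation tracking controller. The following is a minimal sketch of a textbook first-order consensus formation tracking law for planar single-integrator agents; the gains, adjacency matrix, and feedforward term are illustrative assumptions, not the controller analyzed in the thesis.

```python
# Consensus-based formation tracking sketch: each agent drives its
# offset-corrected position toward agreement with its neighbors and toward the
# shared target reference, with the predicted target velocity as feedforward.
import numpy as np

def target_reference(estimates):
    """Averaging step: extract a unique target state from per-agent estimates."""
    return np.mean(np.asarray(estimates), axis=0)

def formation_cmd(i, p, offsets, A, p_ref, v_ref, k=1.0, k0=1.5):
    """Velocity command for single-integrator agent i.

    p       : (N, 2) agent positions        offsets : (N, 2) formation offsets
    A       : (N, N) adjacency matrix       p_ref   : (2,) averaged target position
    v_ref   : (2,) predicted target velocity (feedforward)
    """
    e = p - offsets                          # offset-corrected positions
    u = v_ref - k0 * (e[i] - p_ref)          # track the moving target reference
    for j in range(p.shape[0]):
        u -= k * A[i, j] * (e[i] - e[j])     # consensus with neighbors
    return u
```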
Appears in Collections: Department of Electrical Engineering
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-112-2.pdf | 6.16 MB | Adobe PDF | View/Open |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
