Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/93211
Full metadata record
dc.contributor.advisor: 連豊力 [zh_TW]
dc.contributor.advisor: Feng-Li Lian [en]
dc.contributor.author: 王旅青 [zh_TW]
dc.contributor.author: Lu-Ching Wang [en]
dc.date.accessioned: 2024-07-23T16:18:52Z
dc.date.available: 2024-07-24
dc.date.copyright: 2024-07-23
dc.date.issued: 2024
dc.date.submitted: 2024-07-15
dc.identifier.citationA. Qiu, C. Young, A. L. Gunderman, M. Azizkhani, Y. Chen, and A.-P. Hu, “Tendon-Driven Soft Robotic Gripper with Integrated Ripeness Sensing for Blackberry Harvesting,” in 2023 IEEE International Conference on Robotics and Automation (ICRA), May 2023, pp. 11 831–11 837. DOI: 10.1109/ICRA48891.2023.10160893.
經濟部.“經濟統計數據分析統計.”(2023), [Online]. Available: https://dmz26.moea.gov.tw/GA/common/Common.aspx?code=J&no=2 (visited on06/17/2024).
M. Castillo and S. Simnitt. “USDA ERS - Farm Labor.” (Aug. 2023), [Online]. Available: https : / / www . ers . usda . gov / topics / farm - economy / farm - labor/#laborcostshare (visited on 06/17/2024).
鍾憶欣. “109 年農林漁牧業普查初步統計結果,” 中華民國統計資訊網. (Dec. 1, 2022), [Online]. Available: https://www.stat.gov.tw/News_Content.aspx?n=3703&s=226901 (visited on 06/17/2024).
詹和臻, 陳卓希, 邱海鳴, and 江俊緯. “青農夢攏是假?返鄉少年仔在田間的產銷困境–政大大學報.” (May 19, 2021), [Online]. Available: https ://reurl.cc/NQv5Rm (visited on 06/17/2024).
楊蘭軒. “黃仁勳:人形機器人未來3 年將有突破百年後無所不在| 產經,”中央社CNA. (Jun. 16, 2024), [Online]. Available: https://www.cna.com.tw/news/afe/202406160214.aspx (visited on 06/17/2024).
陳映璇. “黃仁勳的AI 新時代宣告:「地表最強AI 晶片」運算更快、耗電更少,” 環境資訊中心. (Jun. 17, 2024), [Online]. Available: https : / / e -info.org.tw/node/239276 (visited on 06/17/2024).
盧永山. “〈財經週報-科技趨勢〉人力成本飆升美車商重用機器人- 自由財經.” (Feb. 12, 2024), [Online]. Available: https://ec.ltn.com.tw/article/paper/1630508 (visited on 06/17/2024).
S. Crowe. “Agrobotics startup Root AI acquired by AppHarvest for $60M,” The Robot Report. (Apr. 8, 2021), [Online]. Available: https://www.therobotreport. com/root-ai-acquired-by-appharvest-for-60m/ (visited on 07/25/2023).
Panasonic. “Introducing AI-equipped Tomato Harvesting Robots to Farms May Help to Create Jobs,” Panasonic Newsroom Global. (May 23, 2018), [Online]. Available: https : / / news . panasonic . com / global / stories / 814 (visited on 07/25/2023).
I. Sa, C. Lehnert, A. English, C. McCool, F. Dayoub, B. Upcroft, and T. Perez, “Peduncle Detection of Sweet Pepper for Autonomous Crop Harvesting—Combined Color and 3-D Information,” IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 765–772, Apr. 2017, ISSN: 2377-3766. DOI: 10.1109/LRA.2017.2651952.
A. Silwal, J. R. Davidson, M. Karkee, C. Mo, Q. Zhang, and K. Lewis, “Design, integration, and field evaluation of a robotic apple harvester,” Journal of Field Robotics, vol. 34, no. 6, pp. 1140–1159, 2017, ISSN: 1556-4967. DOI: 10.1002/rob.21715.
A. L. Gunderman, J. A. Collins, A. L. Myers, R. T. Threlfall, and Y. Chen, “Tendon-Driven Soft Robotic Gripper for Blackberry Harvesting,” IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 2652–2659, Apr. 2022, ISSN: 2377-3766. DOI: 10.1109/LRA.2022.3143891.
J. Bakker, “Model Application For Energy Efficient Greenhouses In The Netherlands: Greenhouse Design, Operational Control And Decision Support Systems,” Acta Horticulturae, no. 718, pp. 191–202, Oct. 2006, ISSN: 0567-7572, 2406-6168. DOI: 10.17660/ActaHortic.2006.718.21.
行政院農委會農糧署. “【農業. 新南向. 科技研發】不再看天吃飯、迎戰氣候變遷, 臺灣精緻溫室設施勇闖新南向,” 天下雜誌. (Nov. 4, 2020), [Online]. Available: https://www.cw.com.tw/article/5102619 (visited on 01/30/2024).
吳秉容. “跨領域科技研發結晶臺大智慧溫室攜手朝向「機智的農民生活」,” 聯合新聞網. (Nov. 26, 2023), [Online]. Available: https://udn.com/news/story/123535/7596805 (visited on 06/17/2024).
鄭國強. “導入智慧管理不再看天吃飯翁章梁要讓嘉義青農從農業看到希望,”信傳媒. (Nov. 3, 2023), [Online]. Available: https://www.cmmedia.com.tw/home/articles/43224 (visited on 06/17/2024).
C. W. Bac, E. J. van Henten, J. Hemming, and Y. Edan, “Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead,” Journal of Field Robotics, vol. 31, no. 6, pp. 888–911, 2014, ISSN: 1556-4967. DOI: 10.1002/rob.21525.
行政院農業部. “Production Value of Agricultural Products in Taiwan,” Agriculture and Food Agency, Council of Agriculture, Executive Yuan, R.O.C(TAIWAN). May 2022), [Online]. Available: https://www.afa.gov.tw/eng/index.php? (visited on 07/25/2023).
張文熹. “彰化縣首屆洋香瓜評鑑「瓜中LV」粒粒脆甜可口又多汁,” Yahoo News. (May 29, 2024), [Online]. Available: https://ynews.page.link/5wJ3M (visited on 06/17/2024).
Universal-Robots. “6 Types of Industrial Robotic Arms and Their Applications,” Universal Robots Blog. (Nov. 2022), [Online]. Available: https://www.universalrobots.com/in/blog/types-of-robotic-arms/ (visited on 03/30/2024).
J. Jun, J. Kim, J. Seol, J. Kim, and H. I. Son, “Towards an Efficient Tomato Harvesting Robot: 3D Perception, Manipulation, and End-Effector,” IEEE Access, vol. 9, pp. 17 631–17 640, 2021, ISSN: 2169-3536. DOI: 10.1109/ACCESS.2021.3052240.
W. Lili, Z. Bo, F. Jinwei, H. Xiaoan, W. Shu, L. Yashuo, Q. Zhou, and W. Chongfeng, “Development of a tomato harvesting robot used in greenhouse,” International Journal of Agricultural and Biological Engineering, vol. 10, no. 4, pp. 140–149, 4 Jul. 31, 2017, ISSN: 1934-6352. DOI: 10.25165/ijabe.v10i4.3204.
H. Yaguchi, K. Nagahama, T. Hasegawa, and M. Inaba, “Development of an autonomous tomato harvesting robot with rotational plucking gripper,” in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 2016, pp. 652–657. DOI: 10.1109/IROS.2016.7759122.
Y. Xiong, P. J. From, and V. Isler, “Design and Evaluation of a Novel Cable-Driven Gripper with Perception Capabilities for Strawberry Picking Robots,” in 2018 IEEE International Conference on Robotics and Automation (ICRA), May 2018, pp. 7384–7391. DOI: 10.1109/ICRA.2018.8460705.
C. Hu, X. Liu, Z. Pan, and P. Li, “Automatic Detection of Single Ripe Tomato on Plant Combining Faster R-CNN and Intuitionistic Fuzzy Set,” IEEE Access, vol. 7, pp. 154 683–154 696, 2019, ISSN: 2169-3536. DOI: 10.1109/ACCESS.2019.2949343.
L. Zhang, J. Jia, G. Gui, X. Hao, W. Gao, and M. Wang, “Deep Learning Based Improved Classification System for Designing Tomato Harvesting Robot,” IEEE Access, vol. 6, pp. 67 940–67 950, 2018, ISSN: 2169-3536. DOI: 10.1109/ACCESS. 2018.2879324.
C. Song, K. Wang, C. Wang, Y. Tian, X. Wei, C. Li, Q. An, and J. Song, “TDPPLNet: A Lightweight Real-Time Tomato Detection and Picking Point Localization Model for Harvesting Robots,” IEEE Access, vol. 11, pp. 37 650–37 664, 2023, ISSN: 2169-3536. DOI: 10.1109/ACCESS.2023.3260222.
A. Tafuro, A. Adewumi, S. Parsa, G. E. Amir, and B. Debnath, “Strawberry picking point localization ripeness and weight estimation,” in 2022 International Conference on Robotics and Automation (ICRA), May 2022, pp. 2295–2302. DOI: 10.1109/ICRA46639.2022.9812303.
C. Lehnert, A. English, C. McCool, A. W. Tow, and T. Perez, “Autonomous Sweet Pepper Harvesting for Protected Cropping Systems,” IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 872–879, Apr. 2017, ISSN: 2377-3766. DOI: 10.1109/LRA.2017.2655622.
M. Campbell, A. Dechemi, and K. Karydis, “An Integrated Actuation-Perception Framework for Robotic Leaf Retrieval: Detection, Localization, and Cutting,” in 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 2022, pp. 9210–9216. DOI: 10.1109/IROS47612.2022.9981118.
K. Zhang, K. Lammers, P. Chu, N. Dickinson, Z. Li, and R. Lu, “Algorithm Design and Integration for a Robotic Apple Harvesting System,” in 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 2022, pp. 9217–9224. DOI: 10.1109/IROS47612.2022.9981417.
L. M. Dischinger, M. Cravetz, J. Dawes, C. Votzke, C. VanAtter, M. L. Johnston, C. M. Grimm, and J. R. Davidson, “Towards Intelligent Fruit Picking with In-hand Sensing,” in 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic: IEEE, Sep. 27, 2021, pp. 3285–3291, ISBN: 978-1-66541-714-3. DOI: 10.1109/IROS51168.2021.9636341.
B. Lai, Z. Li, W. Li, C. Yang, and Y. Pan, “Homography-Based Visual Servoing of Eye-in-Hand Robots With Exact Depth Estimation,” IEEE Transactions on Industrial Electronics, vol. 71, no. 4, pp. 3832–3841, Apr. 2024, ISSN: 1557-9948. DOI: 10.1109/TIE.2023.3277072.
S. Hutchinson, G. Hager, and P. Corke, “A tutorial on visual servo control,” IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651–670, Oct. 1996, ISSN: 2374-958X. DOI: 10.1109/70.538972.
H. Wang, “Adaptive visual tracking for robotic systems without image-space velocity measurement,” Automatica, vol. 55, pp. 294–301, May 1, 2015, ISSN: 0005-1098. DOI: 10.1016/j.automatica.2015.02.029.
C. P. Bechlioulis, S. Heshmati-alamdari, G. C. Karras, and K. J. Kyriakopoulos, “Robust Image-Based Visual Servoing With Prescribed Performance Under Field of View Constraints,” IEEE Transactions on Robotics, vol. 35, no. 4, pp. 1063–1070, Aug. 2019, ISSN: 1941-0468. DOI: 10.1109/TRO.2019.2914333.
F. Chaumette and S. Hutchinson, “Visual servo control. I. Basic approaches,” IEEE Robotics & Automation Magazine, vol. 13, no. 4, pp. 82–90, Dec. 2006, ISSN: 1070-9932, 1558-223X. DOI: 10.1109/MRA.2006.250573.
E. Malis, F. Chaumette, and S. Boudet, “2 1/2 D visual servoing,” IEEE Transactions on Robotics and Automation, vol. 15, no. 2, pp. 238–250, Apr. 1999, ISSN: 2374-958X. DOI: 10.1109/70.760345.
T. Li, J. Yu, Q. Qiu, and C. Zhao, “Hybrid Uncalibrated Visual Servoing Control of Harvesting Robots With RGB-D Cameras,” IEEE Transactions on Industrial Electronics, vol. 70, no. 3, pp. 2729–2738, Mar. 2023, ISSN: 1557-9948. DOI: 10. 1109/TIE.2022.3172778.
Wikipedia, Isotropy, in Wikipedia, May 22, 2024. [Online]. Available: https://en.wikipedia.org/w/index.php?title=Isotropy&oldid=1225070051#cite_ref-autogenerated1_2-0 (visited on 06/03/2024).
R. C. Gonzalez, Digital Image Processing, 4Th Edition, 4th edition. Uttar Pradesh: PEARSON INDIA, Jan. 1, 2019, 216 pp., ISBN: 978-93-5306-298-9.
L.-C. Wang, Y.-C. Chu, Y. Huang, and F.-L. Lian, “Enhancement on Target-Gripper Alignment: A Tomato Harvesting Robot with Dual-Camera Image-Based Visual Servoing,” in 2024 IEEE International Conference on Robotics and Automation (ICRA), May 2024.
Techman-Robotics. “Techman Robot | TM5 - 900,” Techman Robot. (2024), [Online]. Available: https : / / www . tm - robot . com / en / tm5 - 900/ (visited on 06/08/2024).
-
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/93211
dc.description.abstract [zh_TW]:
在本篇研究中,我們提出使用混合視覺伺服演算法於追蹤等向性物體的控制架構。等向性物體,舉例來說圓球,可將複雜的模型精簡化,以減輕控制的負擔。然而,過於簡單的模型將導致相機能取得的特徵點數量過少,使得只依靠視覺去追蹤目標有困難。傳統的方法有基於位置,和基於影像的視覺伺服,但兩者皆無法分別地有效處理所遇到的困境,尤其是系統存在一些不確定性時。因此,我們提出了混合視覺伺服的控制器,並試圖解決該問題。
首先機器人會針對物體的半徑大小做不同的處理。接著我們透過來自不同相機(非等空間),或是不同時間點(非等時間)的影像回授,藉由混合視覺伺服控制器的處理,只要大於三個自由度的機器手臂,便能完成視覺伺服的任務。此外,透過誤差累積補償演算法,系統所需迭代的數量可被減少,進而促使所需時間減少。利用李亞普諾夫系統分析,可確保提出的系統在一定系統增益範圍內,有漸進穩定的性質。
實驗中,我們使用兩種機器人型態進行測試:單相機配置與雙相機配置。視覺伺服的目標為採收不同大小的作物,分別為體積較大的哈密瓜與體積較小的番茄。實驗場所為溫室,針對不同的場景和參數配置進行實驗。
最終測得結果包含,對大體積物體有90%的半徑預測正確率,並且在採收大小物體時分別有80%和68.4%的成功率。結合理論計算與實際驗證,我們找到系統的增益,在數值等於0.2時有最佳解。平均來說,在最佳系統增益下需要三次迭代,便可使誤差收斂,同時耗時約21.2秒。然而透過誤差累積補償,可更進一步到只要一次迭代,即可到平衡點,耗時降低到只要6.26秒。這些成果展現了,我們的系統不僅成功解決問題,在兩種機器人型態和不同目標物上的實驗,均展示該控制演算法能適用於不同的場合。
dc.description.abstract [en]:
In this study, we propose a hybrid visual servoing method for tracking an isotropic object. Isotropic objects, such as spheres, are ideal models that simplify complex systems. However, they are difficult to track visually because they offer too few image features. The existing visual servoing techniques, position-based and image-based, cannot handle the task on their own, especially when model uncertainties are present.
As a result, we designed a hybrid visual servo controller to overcome this problem. The robot handles objects of different radii differently. By using image feedback from different cameras (spatial) or from different time steps (temporal), a manipulator with at least 3 degrees of freedom can accomplish the visual servoing task. Furthermore, cumulative error compensation reduces the number of iterations, which shortens the operating time. Through Lyapunov analysis, the system is proven to be asymptotically stable within a certain range of control gains.
During the experiments, two robot configurations are presented: a single-camera setup and a dual-camera setup. The objective is to harvest large and small isotropic objects, specifically melons and tomatoes. The experiments were conducted in a greenhouse under various scenes and parameter selections.
The results include a 90% accuracy rate in radius estimation for large objects and harvesting success rates of 80% and 68.4% for large and small objects, respectively. The optimal control gain, found both theoretically and experimentally, is 0.2. At this gain, the error converges within three iterations on average, at a time cost of 21.2 seconds. With cumulative error compensation, the system reaches the equilibrium in a single iteration, reducing the time to 6.26 seconds. These results show that the tasks were accomplished and that the control algorithm generalizes across both robot configurations and different targets.
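The control law and stability claim in the abstract follow the general pattern of classical image-based visual servoing (IBVS); the sketches below are offered only as orientation, assuming the standard point-feature formulation of Chaumette and Hutchinson (2006), which appears in the citation list above. The thesis's actual hybrid spatio-temporal switching and cumulative error compensation rules live in Chapter 4 of the full text and are not reproduced here. With image-feature error e = s - s*, estimated interaction matrix L̂, and control gain λ (reported optimal value 0.2), a standard Lyapunov argument reads:

```latex
% Standard IBVS stability sketch; the thesis's own proof (Section 4.3.7,
% "Proof of Error Convergence") may differ in detail.
e = s - s^{*}, \qquad \dot{e} = L\, v_c, \qquad v_c = -\lambda\, \widehat{L}^{+} e
\\
V = \tfrac{1}{2}\, e^{\top} e, \qquad
\dot{V} = e^{\top} \dot{e} = -\lambda\, e^{\top} L \widehat{L}^{+} e < 0
\quad \text{whenever } L \widehat{L}^{+} \succ 0
```

A minimal code sketch of one servo iteration under the same assumptions; `ibvs_step` and its `accumulated` integral-style term are hypothetical illustrations, not the thesis's implementation:

```python
import numpy as np

GAIN = 0.2  # optimal control gain reported in the abstract

def interaction_matrix(x, y, Z):
    """2x6 interaction matrix of one normalized image point (x, y) at depth Z
    (classical point-feature IBVS, Chaumette & Hutchinson, 2006)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_step(features, targets, depths, accumulated, gain=GAIN):
    """One servo iteration: camera twist v = -gain * L^+ (e + accumulated).
    `accumulated` is a hypothetical stand-in for the thesis's cumulative
    error compensation (a running sum of past image-space errors)."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features, dtype=float)
         - np.asarray(targets, dtype=float)).ravel()
    accumulated = accumulated + e  # fold the past image-space error back in
    v = -gain * np.linalg.pinv(L) @ (e + accumulated)
    return v, accumulated  # 6-DOF camera twist [vx, vy, vz, wx, wy, wz]

if __name__ == "__main__":
    # Single tracked centroid at normalized (0.05, -0.02), desired at the
    # principal point, 0.6 m deep; first iteration with a zero accumulator.
    v, acc = ibvs_step([(0.05, -0.02)], [(0.0, 0.0)], [0.6], np.zeros(2))
    print("camera twist:", v)
```

Dropping `accumulated` recovers the plain IBVS law; the abstract reports that adding the compensation cuts the average iteration count from three to one (21.2 s down to 6.26 s).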
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-07-23T16:18:52Z. No. of bitstreams: 0 [en]
dc.description.provenance: Made available in DSpace on 2024-07-23T16:18:52Z (GMT). No. of bitstreams: 0 [en]
dc.description.tableofcontents:
摘要
ABSTRACT
CONTENTS
LIST OF FIGURES
LIST OF TABLES
Denotation
Chapter 1 Introduction
  1.1 Background
  1.2 Motivation
  1.3 Problems
    1.3.1 Model Uncertainties
    1.3.2 Isotropic Object Tracking
    1.3.3 RGBD Camera Limitation
  1.4 Proposed Methods
  1.5 Chapter Organization
Chapter 2 Literature Survey
  2.1 Overall Review of the Challenges Faced
  2.2 Field Robots
    2.2.1 Commercial Field Robots
    2.2.2 Academic Field Robots
  2.3 Object Perception
  2.4 Sensor Type
  2.5 Visual Servoing
    2.5.1 Position-Based Visual Servoing
    2.5.2 Image-Based Visual Servoing
    2.5.3 Hybrid Visual Servoing
Chapter 3 System Overview
  3.1 Isotropic Objects
  3.2 System Model
    3.2.1 Manipulator Model
    3.2.2 Camera Pinhole Model
    3.2.3 Visual Servoing Model
    3.2.4 Autonomous Mobile Robot
  3.3 Problem Formulation
    3.3.1 Tracking Challenges for Isotropic Objects
    3.3.2 Model Uncertainties
    3.3.3 Control Objective
Chapter 4 Isotropic Object Tracking Control
  4.1 Robot's Movement
  4.2 Object Sensing
    4.2.1 Object Perception
    4.2.2 Target Selection
    4.2.3 Large Object
    4.2.4 Small Object
  4.3 Hybrid Visual Servoing
    4.3.1 Camera Frame Transition
    4.3.2 Event and Iteration
    4.3.3 Position-Based Visual Servoing
    4.3.4 Error
    4.3.5 Interactive Image-Based Visual Servoing
    4.3.6 Cumulative Error Compensation
    4.3.7 Proof of Error Convergence
    4.3.8 Singularity and Constraint
Chapter 5 Experiments
  5.1 Case 1: 6-DOF Robot with Large Object
    5.1.1 Observation Position Evaluation
    5.1.2 Image Perception
    5.1.3 Performance Evaluation
    5.1.4 Radius Estimation
    5.1.5 Failure Scenarios Evaluation
    5.1.6 Discussion
  5.2 Case 2: 4-DOF Robot with Small Object
    5.2.1 Image Perception
    5.2.2 Performance of the Optimal Gain
    5.2.3 Performances of Different Gains
    5.2.4 Evaluation of the Cumulative Error Compensation
    5.2.5 Performance Evaluation
    5.2.6 Failure Scenarios Evaluation
    5.2.7 Discussion
Chapter 6 Conclusion and Future Works
  6.1 Conclusion
    6.1.1 Overview
    6.1.2 Experiment Evaluation: Case 1
    6.1.3 Experiment Evaluation: Case 2
  6.2 Future Works
References
Appendix A IBVS System Analysis
  A.1 Continuous System
  A.2 Discrete System
  A.3 IBVS Control
Appendix B Tomato Harvesting Robot
  B.1 Hardware Design
  B.2 Low Level Control
    B.2.1 Inverse Kinematics
    B.2.2 Signals
    B.2.3 Hardware
  B.3 High Level Control
dc.language.iso: en
dc.subject: 農業 [zh_TW]
dc.subject: 移動式機器人 [zh_TW]
dc.subject: 混合視覺伺服 [zh_TW]
dc.subject: 機器手臂 [zh_TW]
dc.subject: 定位與追蹤 [zh_TW]
dc.subject: 等向性物體 [zh_TW]
dc.subject: 深度相機 [zh_TW]
dc.subject: Manipulator [en]
dc.subject: Isotropic Object [en]
dc.subject: Depth Camera [en]
dc.subject: Regulation and Tracking [en]
dc.subject: Hybrid Visual Servoing [en]
dc.subject: Agriculture [en]
dc.subject: Autonomous Mobile Robot [en]
dc.title: 非等時空影像回授之混合視覺伺服應用於機器手臂對等向性目標的追蹤 [zh_TW]
dc.title: Hybrid Visual Servo Manipulation for Isotropic Target Using Individual Spatio-Temporal Image Feedback [en]
dc.type: Thesis
dc.date.schoolyear: 112-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 林沛群;顏炳郎 [zh_TW]
dc.contributor.oralexamcommittee: Pei-Chun Lin;Ping-Lang Yen [en]
dc.subject.keyword: 混合視覺伺服,深度相機,等向性物體,定位與追蹤,機器手臂,移動式機器人,農業 [zh_TW]
dc.subject.keyword: Hybrid Visual Servoing,Depth Camera,Isotropic Object,Regulation and Tracking,Manipulator,Autonomous Mobile Robot,Agriculture [en]
dc.relation.page: 97
dc.identifier.doi: 10.6342/NTU202401503
dc.rights.note: 同意授權(全球公開)
dc.date.accepted: 2024-07-16
dc.contributor.author-college: 電機資訊學院
dc.contributor.author-dept: 電機工程學系
Appears in Collections: 電機工程學系

Files in This Item:
File: ntu-112-2.pdf
Size: 57.13 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
