NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96751

Full metadata record (DC field / value / language)
dc.contributor.advisor [zh_TW]: 莊嘉揚
dc.contributor.advisor [en]: Jia-Yang Juang
dc.contributor.author [zh_TW]: 詹哲維
dc.contributor.author [en]: Che-Wei Chan
dc.date.accessioned: 2025-02-21T16:23:27Z
dc.date.available: 2025-02-22
dc.date.copyright: 2025-02-21
dc.date.issued: 2024
dc.date.submitted: 2024-12-23
dc.identifier.citation:
[1] H. Wang, M. Totaro, and L. Beccai, "Toward Perceptive Soft Robots: Progress and Challenges," Advanced Science, vol. 5, no. 9, p. 1800541, 2018, doi: https://doi.org/10.1002/advs.201800541.
[2] J. Fang et al., "A Shift from Efficiency to Adaptability: Recent Progress in Biomimetic Interactive Soft Robotics in Wet Environments," Advanced Science, vol. 9, no. 8, p. 2104347, 2022, doi: https://doi.org/10.1002/advs.202104347.
[3] D. Rus and M. T. Tolley, "Design, Fabrication and Control of Soft Robots," Nature, vol. 521, no. 7553, pp. 467-475, 2015, doi: 10.1038/nature14543.
[4] J. Shintake, V. Cacucciolo, D. Floreano, and H. Shea, "Soft Robotic Grippers," Advanced Materials, vol. 30, no. 29, p. 1707035, 2018, doi: 10.1002/adma.201707035.
[5] J. Walker et al., "Soft Robotics: A Review of Recent Developments of Pneumatic Soft Actuators," Actuators, vol. 9, no. 1, p. 3, 2020, doi: 10.3390/act9010003.
[6] L. Zhou, L. Ren, Y. Chen, S. Niu, Z. Han, and L. Ren, "Bio‐Inspired Soft Grippers Based on Impactive Gripping," Advanced Science, vol. 8, no. 9, p. 2002017, 2021, doi: https://doi.org/10.1002/advs.202002017.
[7] H. Zhang, A. S. Kumar, J. Y. H. Fuh, and M. Y. Wang, "Design and Development of a Topology-Optimized Three-Dimensional Printed Soft Gripper," Soft Robotics, vol. 5, no. 5, pp. 650-661, 2018, doi: https://doi.org/10.1089/soro.2017.0058.
[8] N. Guo et al., "Autoencoding a Soft Touch to Learn Grasping from on‐Land to Underwater," Advanced Intelligent Systems, vol. 6, no. 1, p. 2300382, 2024, doi: 10.1002/aisy.202300382.
[9] R. Jain, S. Datta, S. Majumder, and A. Dutta, "Two IPMC Fingers Based Micro Gripper for Handling," International Journal of Advanced Robotic Systems, vol. 8, no. 1, p. 13, 2011, doi: https://doi.org/10.5772/1052.
[10] G.-H. Feng and S.-C. Yen, "Arch-Shaped Ionic Polymer–Metal Composite Actuator Integratable with Micromachined Functional Tools for Micromanipulation," IEEE Sensors Journal, vol. 16, no. 19, pp. 7109-7115, 2016, doi: 10.1109/JSEN.2016.2597861.
[11] R. Chattaraj, S. Khan, S. Bhattacharya, B. Bepari, D. Chatterjee, and S. Bhaumik, "Development of Two Jaw Compliant Gripper Based on Hyper-Redundant Approximation of IPMC Actuators," Sensors and Actuators A: Physical, vol. 251, pp. 207-218, 2016, doi: https://doi.org/10.1016/j.sna.2016.10.017.
[12] M. Shahinpoor and K. J. Kim, "Electrically Controllable Deformation Memory Effects in Ionic Polymers," in Smart Structures and Materials 2002: Electroactive Polymer Actuators and Devices (EAPAD), 2002, vol. 4695: SPIE, pp. 85-94, doi: https://doi.org/10.1117/12.475152.
[13] Z. Samadikhoshkho, K. Zareinia, and F. Janabi-Sharifi, "A Brief Review on Robotic Grippers Classifications," in 2019 IEEE Canadian Conference of Electrical and Computer Engineering (CCECE), 2019: IEEE, pp. 1-4, doi: 10.1109/CCECE.2019.8861780.
[14] J.-H. Lee, Y. S. Chung, and H. Rodrigue, "Long Shape Memory Alloy Tendon-Based Soft Robotic Actuators and Implementation as a Soft Gripper," Scientific Reports, vol. 9, no. 1, p. 11251, 2019, doi: https://doi.org/10.1038/s41598-019-47794-1.
[15] Y. Sun, F. Han, D. Zhang, and D. Ye, "An Underwater Soft Claw Based on Bionic Principle," in 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2021: IEEE, pp. 973-977, doi: 10.1109/ROBIO54168.2021.9739295.
[16] M. A. Robertson and J. Paik, "Low-Inertia Vacuum-Powered Soft Pneumatic Actuator Coil Characterization and Design Methodology," in 2018 IEEE International Conference on Soft Robotics (RoboSoft), 2018, doi: 10.1109/ROBOSOFT.2018.8405364.
[17] S. Han et al., "Snake Robot Gripper Module for Search and Rescue in Narrow Spaces," IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 1667-1673, 2022, doi: 10.1109/LRA.2022.3140812.
[18] Y. Yamanaka, S. Katagiri, H. Nabae, K. Suzumori, and G. Endo, "Development of a Food Handling Soft Robot Hand Considering a High-Speed Pick-and-Place Task," in 2020 IEEE/SICE International Symposium on System Integration (SII), 2020: IEEE, pp. 87-92, doi: 10.1109/SII46433.2020.9026282.
[19] X. Chen et al., "WebGripper: Bioinspired Cobweb Soft Gripper for Adaptable and Stable Grasping," IEEE Transactions on Robotics, vol. 39, no. 4, pp. 3059-3071, 2023, doi: 10.1109/TRO.2023.3262115.
[20] J. Gafford et al., "Shape Deposition Manufacturing of a Soft, Atraumatic, and Deployable Surgical Grasper," Journal of Mechanisms and Robotics, vol. 7, no. 2, p. 021006, 2015, doi: https://doi.org/10.1115/1.4029493.
[21] N. Sakagami, K. Takeuchi, and K. Koganezawa, "Numerical and Experimental Testing of Underwater Gripper with Adjustable Stiffness Joints," in 2020 IEEE/SICE International Symposium on System Integration (SII), 2020: IEEE, pp. 1118-1122, doi: 10.1109/SII46433.2020.9025809.
[22] A. Firouzeh and J. Paik, "Grasp Mode and Compliance Control of an Underactuated Origami Gripper Using Adjustable Stiffness Joints," IEEE/ASME Transactions on Mechatronics, vol. 22, no. 5, pp. 2165-2173, 2017, doi: 10.1109/tmech.2017.2732827.
[23] H. Li, J. Yao, C. Wei, P. Zhou, Y. Xu, and Y. Zhao, "An Untethered Soft Robotic Gripper with High Payload-to-Weight Ratio," Mechanism and Machine Theory, vol. 158, p. 104226, 2021, doi: https://doi.org/10.1016/j.mechmachtheory.2020.104226.
[24] Z. Wang, R. Kanegae, and S. Hirai, "Circular Shell Gripper for Handling Food Products," Soft Robotics, vol. 8, no. 5, pp. 542-554, 2021, doi: 10.1089/soro.2019.0140.
[25] S. Licht, E. Collins, G. Badlissi, and D. Rizzo, "A Partially Filled Jamming Gripper for Underwater Recovery of Objects Resting on Soft Surfaces," in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018: IEEE, pp. 6461-6468, doi: 10.1109/IROS.2018.8593361.
[26] Y. Hao et al., "A Multimodal, Enveloping Soft Gripper: Shape Conformation, Bioinspired Adhesion, and Expansion-Driven Suction," IEEE Transactions on Robotics, vol. 37, no. 2, pp. 350-362, 2020, doi: 10.1109/TRO.2020.3021427.
[27] S. D’Avella, P. Tripicchio, and C. A. Avizzano, "A Study on Picking Objects in Cluttered Environments: Exploiting Depth Features for a Custom Low-Cost Universal Jamming Gripper," Robotics and Computer-Integrated Manufacturing, vol. 63, p. 101888, 2020, doi: https://doi.org/10.1016/j.rcim.2019.101888.
[28] J. R. Amend, E. Brown, N. Rodenberg, H. M. Jaeger, and H. Lipson, "A Positive Pressure Universal Gripper Based on the Jamming of Granular Material," IEEE Transactions on Robotics, vol. 28, no. 2, pp. 341-350, 2012, doi: 10.1109/TRO.2011.2171093.
[29] E. Brown et al., "Universal Robotic Gripper Based on the Jamming of Granular Material," Proceedings of the National Academy of Sciences, vol. 107, no. 44, pp. 18809-18814, 2010, doi: https://doi.org/10.1073/pnas.1003250107.
[30] T. Zhu, H. Yang, and W. Zhang, "A Spherical Self-Adaptive Gripper with Shrinking of an Elastic Membrane," in 2016 International Conference on Advanced Robotics and Mechatronics (ICARM), 2016: IEEE, pp. 512-517, doi: 10.1109/ICARM.2016.7606973.
[31] G. D. Howard, J. Brett, J. O'Connor, J. Letchford, and G. W. Delaney, "One-Shot 3D-Printed Multimaterial Soft Robotic Jamming Grippers," Soft Robotics, vol. 9, no. 3, pp. 497-508, 2022, doi: https://doi.org/10.1089/soro.2020.015.
[32] S. G. Fitzgerald, G. W. Delaney, D. Howard, and F. Maire, "Evolving Soft Robotic Jamming Grippers," in Proceedings of the Genetic and Evolutionary Computation Conference, 2021, pp. 102-110, doi: https://doi.org/10.1145/3449639.345933.
[33] H. Wang, S. Terryn, Z. Wang, G. Van Assche, F. Iida, and B. Vanderborght, "Self‐Regulated Self‐Healing Robotic Gripper for Resilient and Adaptive Grasping," Advanced Intelligent Systems, vol. 5, no. 12, p. 2300223, 2023, doi: https://doi.org/10.1002/aisy.202300223.
[34] M. Meloni et al., "Engineering Origami: A Comprehensive Review of Recent Applications, Design Methods, and Tools," Advanced Science, vol. 8, no. 13, p. 2000636, 2021, doi: https://doi.org/10.1002/advs.202000636.
[35] S. Leanza, S. Wu, X. Sun, H. J. Qi, and R. R. Zhao, "Active Materials for Functional Origami," Advanced Materials, vol. 36, no. 9, p. 2302066, 2024, doi: https://doi.org/10.1002/adma.202302066.
[36] L. Xinquan, W. Yuzhe, X. Zhen, and S. M. Ocak, "A Vacuum-Powered Soft Mesh Gripper for Compliant and Effective Grasping," in IEEE International Conference on Soft Robotics (RoboSoft), 2023, pp. 1-7, doi: 10.1109/RoboSoft55895.2023.10122056.
[37] Y. X. Mak, A. Dijkshoorn, and M. Abayazid, "Design Methodology for a 3D Printable Multi-Degree of Freedom Soft Actuator Using Geometric Origami Patterns," Advanced Intelligent Systems, vol. 6, no. 6, Jun 2024, doi: 10.1002/aisy.202300666.
[38] H. Yasuda, K. Johnson, V. Arroyos, K. Yamaguchi, J. R. Raney, and J. Yang, "Leaf-Like Origami with Bistability for Self-Adaptive Grasping Motions," Soft Robotics, vol. 9, no. 5, pp. 938-947, 2022, doi: 10.1089/soro.2021.0008.
[39] W. Yan, S. Li, M. Deguchi, Z. Zheng, D. Rus, and A. Mehta, "Origami-Based Integration of Robots That Sense, Decide, and Respond," Nature Communication, vol. 14, no. 1, p. 1553, Apr 3 2023, doi: 10.1038/s41467-023-37158-9.
[40] S. Li, J. J. Stampfli, H. J. Xu, E. Malkin, E. V. Diaz, D. Rus, and R. J. Wood, "A Vacuum-Driven Origami “Magic-Ball” Soft Gripper," in 2019 International Conference on Robotics and Automation (ICRA), 2019, pp. 7401-7408, doi: 10.1109/ICRA.2019.8794068.
[41] 歐陽智文, "負壓驅動軟性關節之優化並應用於仿生手指與爬行機器人," 機械工程研究所, 國立台灣大學, 2022.
[42] 于紹尹, "負壓驅動薄壁挫屈軟性材料手指," 機械工程學系, 國立台灣大學, 2021.
[43] C. W. Ou Yang et al., "Enhancing the Versatility and Performance of Soft Robotic Grippers, Hands, and Crawling Robots through Three-Dimensional-Printed Multifunctional Buckling Joints," Soft Robotics, vol. 11, no. 5, pp. 741-754, Feb 22 2024, doi: 10.1089/soro.2023.0111.
[44] R. Mutlu, G. Alici, M. in het Panhuis, and G. M. Spinks, "3D Printed Flexure Hinges for Soft Monolithic Prosthetic Fingers," Soft Robotics, vol. 3, no. 3, pp. 120-133, 2016, doi: 10.1089/soro.2016.0026.
[45] J. Qu, Z. Yu, W. Tang, Y. Xu, B. Mao, and K. Zhou, "Advanced Technologies and Applications of Robotic Soft Grippers," Advanced Materials Technologies, vol. 9, no. 11, 2024, doi: 10.1002/admt.202301004.
[46] D. Tang et al., "Bistable Soft Jumper Capable of Fast Response and High Takeoff Velocity," Science Robotics, vol. 9, no. 93, p. eadm8484, 2024.
[47] OpenCV. "Hough Circle." https://docs.opencv.org/3.4/d4/d70/tutorial_hough_circle.html (accessed 2024).
[48] P. Cigliano, V. Lippiello, F. Ruggiero, and B. Siciliano, "Robotic Ball Catching with an Eye-in-Hand Single-Camera System," IEEE Transactions on Control Systems Technology, vol. 23, no. 5, pp. 1657-1671, 2015, doi: 10.1109/tcst.2014.2380175.
[49] OpenCV. "Changing Color-Space." https://docs.opencv.org/4.x/df/d9d/tutorial_py_colorspaces.html (accessed 2024).
[50] V. Lippiello and F. Ruggiero, "Monocular Eye-in-Hand Robotic Ball Catching with Parabolic Motion Estimation," IFAC Proceedings Volumes, vol. 45, no. 22, pp. 229-234, 2012, doi: 10.3182/20120905-3-HR-2030.00015.
[51] B. Bäuml, T. Wimböck, and G. Hirzinger, "Kinematically Optimal Catching a Flying Ball with a Hand-Arm-System," in 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010, pp. 2592-2599, doi: 10.1109/IROS.2010.5651175.
[52] J. Kober, M. Glisson, and M. Mistry, "Playing Catch and Juggling with a Humanoid Robot," in 2012 12th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2012), 2012: IEEE, pp. 875-881, doi: 10.1109/HUMANOIDS.2012.6651623.
[53] T. Gold, R. Römer, A. Völz, and K. Graichen, "Catching Objects with a Robot Arm Using Model Predictive Control," in 2022 American Control Conference (ACC), 2022: IEEE, pp. 1915-1920, doi: 10.23919/ACC53348.2022.9867380.
[54] S. Kim, A. Shukla, and A. Billard, "Catching Objects in Flight," IEEE Transactions on Robotics, vol. 30, no. 5, pp. 1049-1065, 2014, doi: 10.1109/tro.2014.2316022.
[55] S. Kim and A. Billard, "Estimating the Non-Linear Dynamics of Free-Flying Objects," Robotics and Autonomous Systems, vol. 60, no. 9, pp. 1108-1122, 2012, doi: 10.1016/j.robot.2012.05.022.
[56] K. H. Strobl and G. Hirzinger, "Optimal Hand-Eye Calibration," in 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006: IEEE, pp. 4647-4653, doi: 10.1109/IROS.2006.282250.
[57] R. Horaud and F. Dornaika, "Hand-Eye Calibration," The International Journal of Robotics Research, vol. 14, no. 3, pp. 195-210, 1995, doi: https://doi.org/10.1177/02783649950140030.
[58] J. Jiang, X. Luo, Q. Luo, L. Qiao, and M. Li, "An Overview of Hand-Eye Calibration," The International Journal of Advanced Manufacturing Technology, vol. 119, no. 1-2, pp. 77-97, 2021, doi: 10.1007/s00170-021-08233-6.
[59] Spider Maker. "Thermoplastic Polyester Elastomer." https://www.3dspidermaker.com/products/spiderflex-tpe-matte-finish-iron-gray (accessed 2024).
[60] Spider Maker. "Polylactic Acid." https://www.3dspidermaker.com/products/matte-pla-filament-morandi-colors-frosted-almond (accessed 2024).
[61] Arduino. "Arduino Uno." https://store.arduino.cc/products/arduino-uno-rev3 (accessed 2024).
[62] AIRTAC. "3V2 Series Solenoid Valve (3/2 Way)." https://global.airtac.com/pro_det.aspx?c_kind=4&c_kind2=19&c_kind3=40&c_kind4=46&c_kind5=371&id=261&3V2-Series-Solenoid-Valve-(3/2-way (accessed 2024).
[63] UNI-CROWN. "UN-40V Vacuum Pump." https://unicrown-tw.com/product/un-40v/ (accessed 2024).
[64] ROS. "Robot Operating System." https://www.ros.org/ (accessed 2024).
[65] S.-Y. Lo, C.-A. Cheng, and H.-P. Huang, "Virtual Impedance Control for Safe Human-Robot Interaction," Journal of Intelligent & Robotic Systems, vol. 82, no. 1, pp. 3-19, 2015, doi: 10.1007/s10846-015-0250-y.
[66] Techman Robot. "TM5-900." https://www.tm-robot.com/zh-hant/tm5-900/ (accessed 2024).
[67] Intel. "Intel® RealSense™ Depth Camera D435i." https://www.intelrealsense.com/depth-camera-d435i/ (accessed 2024).
[68] Stereolabs. "ZED 2." https://store.stereolabs.com/products/zed-2 (accessed 2024).
[69] J. Z. Kolter and A. Y. Ng, "Task-Space Trajectories Via Cubic Spline Optimization," in 2009 IEEE International Conference on Robotics and Automation, 2009: IEEE, pp. 1675-1682, doi: 10.1109/ROBOT.2009.5152554.
[70] K. Zhang, J.-X. Guo, and X.-S. Gao, "Cubic Spline Trajectory Generation with Axis Jerk and Tracking Error Constraints," International Journal of Precision Engineering and Manufacturing, vol. 14, pp. 1141-1146, 2013, doi: 10.1007/s12541-013-0155-2.
[71] V. Tadic et al., "Perspectives of Realsense and Zed Depth Sensors for Robotic Vision Applications," Machines, vol. 10, no. 3, p. 183, 2022, doi: https://doi.org/10.3390/machines10030183.
[72] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You Only Look Once: Unified, Real-Time Object Detection," arXiv, 2016, doi: https://doi.org/10.48550/arXiv.1506.02640.
[73] Ultralytics. "YOLOv5." https://docs.ultralytics.com/yolov5/ (accessed 2024).
[74] D. Deng, "DBSCAN Clustering Algorithm Based on Density," in 2020 7th International Forum on Electrical Engineering and Automation (IFEEA), 2020: IEEE, pp. 949-953, doi: 10.1109/IFEEA51475.2020.00199.
[75] Scikit-learn. "Scikit-Learn." https://scikit-learn.org/stable/ (accessed 2024).
[76] M. A. Hearst, S. T. Dumais, E. Osuna, J. Platt, and B. Scholkopf, "Support Vector Machines," IEEE Intelligent Systems and Their Applications, vol. 13, no. 4, pp. 18-28, 1998, doi: 10.1109/5254.708428.
[77] M. Yue, X. Wu, L. Guo, and J. Gao, "Quintic Polynomial-Based Obstacle Avoidance Trajectory Planning and Tracking Control Framework for Tractor-Trailer System," International Journal of Control, Automation and Systems, vol. 17, no. 10, pp. 2634-2646, 2019, doi: http://dx.doi.org/10.1007/s12555-018-0889-9.
[78] Coursera. "Coursera Robotics." https://www.coursera.org/learn/robotics1 (accessed 2024).
[79] P. J. Kyberd, M. Evans, and S. Te Winkel, "An Intelligent Anthropomorphic Hand, with Automatic Grasp," Robotica, vol. 16, no. 5, pp. 531-536, 1998, doi: https://doi.org/10.1017/S0263574798000691.
[80] Z. Xu, X. Baojie, and W. Guoxin, "Canny Edge Detection Based on OpenCV," in 2017 13th IEEE International Conference on Electronic Measurement & Instruments (ICEMI), 2017: IEEE, pp. 53-56, doi: 10.1109/ICEMI.2017.8265710.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96751
dc.description.abstract [zh_TW]: 軟性機器人利用軟材料的特性與結構設計,使其在與外界互動時,能憑藉材料的柔軟性和易變形特質,在無需複雜控制系統的情況下有效地抓握物品。軟性夾爪依形態可分為無指夾爪、雙指夾爪、多指夾爪以及多關節夾爪。其中,無指夾爪透過特殊的幾何形狀或材料柔性來實現抓取,無需傳統的「手指」設計。無指夾爪的幾何特性使其在簡化控制下仍能達到穩定的抓握效果,但由於依賴材料變形來完成抓取,反應速度通常較慢。
本研究結合了實驗室先前開發的負壓挫屈關節與近期無指夾爪的設計概念,利用負壓挫屈關節的快速響應特性來提升無指夾爪的反應速度,同時保持穩定的抓握性能。雖然軟性夾爪可利用材料的自適應性來抓取不規則或脆弱的物品,但其易變形的特性通常會降低抓取精度,使物品難以精確定位至工具中心點。為了解決此問題,本研究受到摺紙結構的啟發,在夾爪內設置主動面板,使夾爪在收合時,即便物體偏離預設的工具中心點,仍能透過主動面板將物體推回中心點,以達到精確的取放效果。
本研究將夾爪安裝於達明協作型機械手臂上,透過接球實驗與視覺取放任務兩項應用,驗證上述的快速反應與精確取放功能。在接球實驗中,研究結合了YOLO深度學習模型與支援向量回歸(SVR)技術進行飛行球體的偵測和軌跡預測。為了應對不同環境下的光線變化,研究運用資料增強技術提升YOLO模型的泛化能力,確保模型在多種光源條件下均能穩定檢測飛行中的球體。研究進一步比較了RealSense D435i與ZED深度攝影機在高速物體偵測中的性能差異,結果顯示RealSense D435i在準確度和穩定性方面優於ZED深度攝影機,使本研究得以更精確地追蹤球體的運動。實驗結果表明,夾爪在0.64秒的反應時間下達到55%的接球成功率。
在視覺取放過程中,夾爪展現了軟性材料的自適應性優勢,不僅能穩定抓握形狀不規則的物體,還能在一定的負重測試下維持結構穩定。為了進一步驗證夾爪的抓取精確度,研究測試了夾爪搭配達明機械手臂視覺取放系統的能力,成功將直徑4公分的高爾夫球放置在直徑3公分的瓶蓋上,顯示出夾爪在精確操作中的潛力。為了證明本研究設計的夾爪在具備快速反應能力的同時,亦能實現精確的取放操作,應在完成接球實驗後進一步執行視覺取放任務。然而,受限於協作型機器人內建的扭矩安全限制,接球實驗後無法順利銜接後續的視覺取放任務。未來將逐步調整達明協作型機械手臂的扭矩容許值,以驗證本研究開發的夾爪不僅具備快速反應能力,亦能克服軟性機器人在精確取放方面的普遍挑戰。
dc.description.abstract [en]: Soft robots leverage the properties of soft materials and structural design to interact effectively with external objects, using their flexibility and deformability to grasp items without the need for complex control systems. Soft grippers can be categorized by form into fingerless, two-finger, multi-finger, and multi-joint grippers. Among them, the fingerless gripper achieves grasping through special geometric shapes or material flexibility, bypassing the need for a traditional "finger" design. This geometry allows stable grasping with simplified control, though the reliance on material deformation typically results in slower response times.
This study combines a vacuum-driven buckling joint previously developed in our lab with a recent fingerless gripper design, enhancing response speed by leveraging the fast-acting characteristics of the buckling joint while maintaining stable grasping performance. While soft grippers adapt to irregular or delicate objects through material compliance, their deformability often reduces grasping precision, making it difficult to align objects with the tool center point. To address this, we incorporated an active panel inspired by origami structures into the gripper, allowing it to push objects toward the tool center point during closure and ensuring precise placement even when the object is initially offset.
The gripper is mounted on a TM collaborative robot arm to demonstrate its rapid response and precise pick-and-place capabilities through two applications: a ball-catching experiment and a visual pick-and-place task. In the catching experiment, the study integrates the YOLO deep learning model and Support Vector Regression (SVR) to detect a flying ball and predict its trajectory. To accommodate varying lighting conditions, data augmentation is used to improve the YOLO model's generalization, ensuring consistent detection of the ball under different light sources. The study also compares the RealSense D435i and ZED depth cameras for high-speed object detection, showing that the RealSense D435i outperforms the ZED in accuracy and stability and therefore allows more precise ball tracking. Experimental results indicate that the gripper achieves a 55% catching success rate with a response time of 0.64 seconds.
In the pick-and-place task, the gripper showcases the adaptive advantages of soft materials, grasping irregularly shaped objects stably and remaining structurally stable under load testing. To further validate its precision, the study tests the gripper with the TM robot's vision-based pick-and-place system, successfully placing a golf ball 4 cm in diameter onto a bottle cap 3 cm in diameter, highlighting its potential for precise operations.
To demonstrate that the gripper can achieve precise pick-and-place operations while retaining rapid response capability, the visual pick-and-place task should be performed immediately after the ball-catching experiment. However, the built-in torque safety limits of the collaborative robot prevented a seamless transition from catching to the subsequent pick-and-place task. In future work, the torque limits of the TM collaborative robot arm will be gradually adjusted to verify that the gripper not only responds quickly but also overcomes the challenges soft robots commonly face in precise pick-and-place operations.
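The abstract above notes that data augmentation was used so the YOLO detector stays robust to changing lighting. Purely as an illustration of that idea, and not the author's actual training pipeline, a brightness/saturation/hue jitter applied to labelled frames might look like the following OpenCV/NumPy sketch; the function name and jitter ranges are assumptions.

import cv2
import numpy as np

def jitter_lighting(image_bgr, rng=np.random.default_rng()):
    # Convert to HSV so brightness, saturation, and hue can be perturbed independently.
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 2] *= rng.uniform(0.6, 1.4)                       # brightness gain (assumed range)
    hsv[..., 1] *= rng.uniform(0.7, 1.3)                       # saturation gain (assumed range)
    hsv[..., 0] = (hsv[..., 0] + rng.uniform(-10, 10)) % 180   # hue shift; OpenCV hue spans 0-179
    hsv = np.clip(hsv, 0, 255).astype(np.uint8)
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

# Hypothetical usage: generate several augmented copies of one labelled frame.
# frame = cv2.imread("ball_frame.png")
# augmented = [jitter_lighting(frame) for _ in range(5)]

Photometric jitter of this kind leaves bounding-box labels unchanged, so each augmented copy can reuse the original YOLO annotations.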
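For the prediction step, the abstract states that Support Vector Regression is fitted to the tracked ball positions. A minimal scikit-learn sketch of that idea is given below, assuming timestamped 3D positions in the robot base frame are already available from the detector and depth camera; the polynomial kernel, hyperparameters, and fixed catch height are illustrative assumptions rather than the thesis settings.

import numpy as np
from sklearn.svm import SVR

def fit_trajectory(times, positions, C=100.0, epsilon=0.005):
    # Fit one SVR per axis: x(t), y(t), z(t).
    # times: shape (N,); positions: shape (N, 3), metres in the robot base frame.
    t = np.asarray(times).reshape(-1, 1)
    models = []
    for axis in range(3):
        m = SVR(kernel="poly", degree=2, C=C, epsilon=epsilon)
        m.fit(t, np.asarray(positions)[:, axis])
        models.append(m)
    return models

def predict_catch_point(models, t_now, catch_height=0.3, horizon=1.0, steps=200):
    # Roll the fitted models forward and return the first future sample whose
    # predicted height drops to the catch plane; None if it never does.
    ts = np.linspace(t_now, t_now + horizon, steps).reshape(-1, 1)
    xyz = np.column_stack([m.predict(ts) for m in models])
    below = np.nonzero(xyz[:, 2] <= catch_height)[0]
    if below.size == 0:
        return None
    i = below[0]
    return float(ts[i, 0]), xyz[i]   # (estimated arrival time, [x, y, z])

# Hypothetical usage with tracked samples from the detector and depth camera:
# models = fit_trajectory(t_samples, p_samples)
# eta, point = predict_catch_point(models, t_samples[-1])

A degree-2 polynomial kernel mirrors the roughly parabolic free flight of the ball; in practice the per-axis models would be refitted as each new detection arrives, and the predicted interception point and arrival time would drive the arm's trajectory planning and the gripper's closing command.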
dc.description.provenance [en]: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-02-21T16:23:27Z. No. of bitstreams: 0
dc.description.provenance [en]: Made available in DSpace on 2025-02-21T16:23:27Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
誌謝 I
摘要 III
Abstract V
目次 VIII
表次 XII
圖次 XIII
第1章 緒論 1
1.1 研究動機與目的 1
1.2 論文架構 3
第2章 文獻回顧 5
2.1 夾爪文獻回顧 5
2.1.1 雙指夾爪 5
2.1.2 多指夾爪 7
2.1.3 多關節夾爪 9
2.1.4 無指夾爪 10
2.1.5 實驗室先前夾爪相關研究回顧 13
2.1.6 不同型態夾爪性能統整與比較 16
2.2 機械手臂與夾爪接球文獻回顧 17
2.2.1 偵測 17
2.2.2 預測 20
2.2.3 實驗室先前接球實驗相關文獻回顧 23
第3章 實驗設備與夾爪製作材料 27
3.1 夾爪製作相關材料與設備 27
3.1.1 熔融沉積3D列印 27
3.1.2 TPEE 線材 29
3.1.3 PLA 線材 30
3.1.4 電器絕緣膠帶與醫療人工皮 31
3.2 夾爪動力源與控制元件 33
3.2.1 Arduino 控制板 33
3.2.2 電磁閥 34
3.2.3 負壓真空幫浦 35
3.2.4 繼電器 36
3.3 接球實驗相關設備 37
3.3.1 ROS (Robot Operating System) 37
3.3.2 達明機械手臂 38
3.3.3 RealSense D435i 40
3.3.4 ZED2 40
第4章 實驗方法設計 42
4.1 接球實驗硬體與軟體系統架設 42
4.2 夾爪設計與視覺取放 46
4.2.1 夾爪運動原理構想 46
4.2.2 機構設計 49
4.2.3 夾爪視覺取放 59
4.3 偵測球體 63
4.3.1 文獻回顧方法測試結果 63
4.3.2 YOLO 檢測 64
4.3.3 群聚分析 68
4.3.4 判斷球是否離手 71
4.3.5 偵測方法流程 73
4.4 預測球體軌跡 74
4.4.1 軌跡位置與速度分析 74
4.4.2 支援向量回歸 78
4.4.3 末端點執行器位置姿態與夾爪關閉時機點設計 85
4.5 軌跡規劃 91
第5章 結果與討論 93
5.1 夾爪性能結果與討論 93
5.1.1 夾取多樣性物品結果 93
5.1.2 負重實驗結果 95
5.1.3 夾爪反應時間測試結果 96
5.1.4 夾爪配合機械手臂視覺取放結果 98
5.2 視覺偵測球體結果與討論 101
5.2.1 YOLO 檢測訓練結果 101
5.2.2 深度攝影機ZED與Realsense在偵測快速移動球體比較結果 106
5.2.3 群聚分析結果 112
5.3 物件預測結果與討論 115
5.3.1 SVR預測結果 115
5.3.2 末端執行器落點位置設計結果 119
5.4 接球實驗與取放任務合併結果與討論 125
5.4.1 接球實驗結果 125
5.4.2 接球實驗與取放任務合併結果 129
第6章 結論與未來展望 130
6.1 結論 130
6.2 未來展望 132
參考文獻 134
dc.language.iso: zh_TW
dc.subject [zh_TW]: 協作型機器人
dc.subject [zh_TW]: 視覺取放
dc.subject [zh_TW]: 接球實驗
dc.subject [zh_TW]: 支援向量回歸
dc.subject [zh_TW]: YOLO 檢測
dc.subject [zh_TW]: 軟性機器人
dc.subject [en]: Soft robot
dc.subject [en]: Support vector regression
dc.subject [en]: Catch ball experiment
dc.subject [en]: Visual pick-and-place
dc.subject [en]: YOLO detection
dc.subject [en]: Collaborative robot arm
dc.title [zh_TW]: 具挫屈關節之無指軟性夾爪及其在協作型機械手臂系統的應用──以動態接球取放為例
dc.title [en]: A Soft Fingerless Gripper with Buckling Joints and Its Application in Collaborative Robotic Arm Systems: A Case Study on Dynamic Ball Catching and Placement
dc.type: Thesis
dc.date.schoolyear: 113-1
dc.description.degree: 碩士
dc.contributor.oralexamcommittee [zh_TW]: 林峻永;顏炳郎
dc.contributor.oralexamcommittee [en]: Chun-Yeon Lin;Ping-Lang Yen
dc.subject.keyword [zh_TW]: 軟性機器人,協作型機器人,YOLO 檢測,支援向量回歸,接球實驗,視覺取放
dc.subject.keyword [en]: Soft robot,Collaborative robot arm,YOLO detection,Support vector regression,Catch ball experiment,Visual pick-and-place
dc.relation.page: 140
dc.identifier.doi: 10.6342/NTU202404755
dc.rights.note: 未授權
dc.date.accepted: 2024-12-23
dc.contributor.author-college: 工學院
dc.contributor.author-dept: 機械工程學系
dc.date.embargo-lift: N/A
Appears in collections: 機械工程學系 (Department of Mechanical Engineering)

Files in this item:
ntu-113-1.pdf (8.47 MB, Adobe PDF), access not authorized (未授權公開取用)