Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/2708

Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: 林沛群
dc.contributor.author: Jin-Rong Tsai (en)
dc.contributor.author: 蔡謹容 (zh_TW)
dc.date.accessioned: 2021-05-13T06:48:45Z
dc.date.available: 2018-08-25
dc.date.available: 2021-05-13T06:48:45Z
dc.date.copyright: 2017-08-25
dc.date.issued: 2017
dc.date.submitted: 2017-08-21
dc.identifier.citation:
[1] S. C. Jacobsen, E. K. Iversen, D. Knutti, R. Johnson, and K. Biggers, 'Design of the Utah/M.I.T. Dextrous Hand,' in Robotics and Automation. Proceedings. 1986 IEEE International Conference on, 1986, pp. 1520-1532.
[2] T. Mouri, H. Kawasaki, K. Yoshikawa, J. Takai, and S. Ito, 'Anthropomorphic Robot Hand: Gifu Hand III,' presented at the ICCAS2002, The 2nd International Conference on Control, Automation and Systems, Korea.
[3] S. R. Company. Shadow Dexterous Hand. Available: http://www.shadowrobot.com/products/dexterous-hand/
[4] J. Butterfaß, M. Grebenstein, H. Liu, and G. Hirzinger, 'DLR-Hand II: Next Generation of a Dextrous Robot Hand,' in Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on, 2001, pp. 109-114.
[5] A. Bicchi, 'Hands for dexterous manipulation and robust grasping: a difficult road toward simplicity,' Robotics and Automation, IEEE Transactions on, vol. 16, pp. 652-662, 2000.
[6] A. M. Dollar and R. D. Howe, 'A robust compliant grasper via shape deposition manufacturing,' Mechatronics, IEEE/ASME Transactions on, vol. 11, pp. 154-161, 2006.
[7] L. U. Odhner, L. P. Jentoft, and M. R. Claffee, 'A compliant, underactuated hand for robust manipulation,' International Journal of Robotics Research, vol. 33, 2014.
[8] IEEE Spectrum, 'Willow Garage Introduces Velo 2G Adaptive Gripper.'
[9] K. Suzumori, S. Iikura, and H. Tanaka, 'Development of flexible microactuator and its applications to robotic mechanisms,' in Robotics and Automation, 1991. Proceedings., 1991 IEEE International Conference on, 1991, pp. 1622-1627.
[10] Barrett Technology. BarrettHand. Available: http://www.barrett.com/robot/products-hand.htm
[11] SCHUNK. SDH servo-electric 3-Finger Gripping Hand. Available: http://mobile.schunk-microsite.com/en/produkte/produkte/sdh-servo-electric-3-finger-gripping-hand.html
[12] ROBOTIQ. 3-Finger Adaptive Robot Gripper. Available: http://robotiq.com/en/products/industrial-robot-hand
[13] FESTO. Fin Gripper. Available: http://www.electroquip.co.uk/news/pneumatics-the-best-3-grippers-available-on-the-market
[14] V. Lippiello, F. Ruggiero, B. Siciliano, and L. Villani, 'Visual Grasp Planning for Unknown Objects Using a Multifingered Robotic Hand,' presented at the IEEE/ASME Trans. Mechatronics, 2013.
[15] K. Hsiao, S. Chitta, M. Ciocarlie, and E. G. Jones, 'Contact-Reactive Grasping of Objects with Partial Shape Information,' presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2010.
[16] K. Hübner and D. Kragic, 'Selection of robot pre-grasps using box-based shape approximation,' presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2008.
[17] S.-K. Kim and M. Likhachev, 'Planning for Grasp Selection of Partially Occluded Objects,' presented at the IEEE ICRA, 2016.
[18] K. Huebner, S. Ruthotto, and D. Kragic, 'Minimum volume bounding box decomposition for shape approximation in robot grasping,' presented at the IEEE ICRA, 2008.
[19] A. T. Miller, S. Knoop, H. I. Christensen, and P. K. Allen, 'Automatic Grasp Planning using Shape Primitives,' presented at the IEEE ICRA, Taipei, Taiwan, 2003.
[20] C. Eppner and O. Brock, 'Grasping Unknown Objects by Exploiting Shape Adaptability and Environmental Constraints,' presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013.
[21] K. Koyama, Y. Suzuki, A. Ming, and M. Shimojo, 'Integrated Control of a Multi-fingered Hand and Arm using Proximity Sensors on the Fingertips,' presented at the IEEE ICRA, 2016.
[22] M. Santello, M. Flanders, and J. F. Soechting, 'Postural hand synergies for tool use,' J. Neurosci., vol. 18, pp. 10105-10115, Dec. 1998.
[23] K. Shimonomura, H. Nakashima, and K. Nozu, 'Robotic grasp control with high-resolution combined tactile and proximity sensing,' presented at the IEEE ICRA, 2016.
[24] S. Ye, K. Suzuki, Y. Suzuki, M. Ishikawa, and M. Shimojo, 'Robust Robotic Grasping Using IR Net-Structure Proximity Sensor to Handle Objects with Unknown Position and Attitude,' presented at the IEEE ICRA, 2013.
[25] K. Hsiao, P. Nangeroni, M. Huber, A. Saxena, and A. Y. Ng, 'Reactive Grasping Using Optical Proximity Sensors,' in IEEE ICRA, 2009, pp. 2098-2105.
[26] K. Koyama, H. Hasegawa, Y. Suzuki, A. Ming, and M. Shimojo, 'Pre-shaping for Various Objects by the Robot Hand Equipped with Resistor Network Structure Proximity Sensors,' presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013.
[27] M. Grammatikopoulou, E. Psomopoulou, L. Droukas, and Z. Doulgeri, 'A controller for stable grasping and desired finger shaping without contact sensing,' presented at the IEEE ICRA, 2014.
[28] J. M. Romano, K. Hsiao, G. Niemeyer, S. Chitta, and K. J. Kuchenbecker, 'Human-Inspired Robotic Grasp Control With Tactile Sensing,' in IEEE Transactions on Robotics, 2011, pp. 1067 - 1079.
[29] Z. Pezzementi, E. Plaku, C. Reyda, and G. D. Hager, 'Tactile-Object Recognition From Appearance Information,' presented at the IEEE Transactions on Robotics, 2011.
[30] Y. Cheng, C. Su, Y. Jia, and N. Xi, 'Data Correlation Approach for Slippage Detection in Robotic Manipulations Using Tactile Sensor Array,' presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2015.
[31] M. Stachowsky, T. Hummel, M. Moussa, and H. A. Abdullah, 'A Slip Detection and Correction Strategy for Precision Robot Grasping,' IEEE/ASME Transactions on Mechatronics, vol. 21, pp. 2214-2226, 2016.
[32] 林昱辰, 'Development of a Gripper with Variable and Self-Adaptive Shape and Multiple Sensors, and Its Application to Automated Pick-and-Place Tasks for Objects of Various Shapes and Sizes,' Master's thesis, Department of Mechanical Engineering, National Taiwan University, Taipei, 2014.
[33] Futaba. Futaba BLS172SV. Available: http://www.futabarc.com/servos/sbus.html
[34] KHK Gears. SI Internal Gears. Available: https://www.khkgears.co.jp/khkweb/search/sunpou.do;jsessionid=0D4AB1E5C8BF53D8EB2CD05BDFE2160B?indexCode=18&referrer=series&sic=1&lang=zh_TW
[35] KHK Gears. SW Worms. Available: https://www.khkgears.co.jp/khkweb/search/sunpou.do?indexCode=75&referrer=series&sic=1&lang=zh_TW
[36] KHK Gears. BG Worm Wheels. Available: https://www.khkgears.co.jp/khkweb/search/sunpou.do?indexCode=79&referrer=series&sic=1&lang=zh_TW
[37] C. Li, P.-M. Wu, S. Lee, A. Gorton, M. J. Schulz, and C. H. Ahn, 'Flexible Dome and Bump Shape Piezoelectric Tactile Sensors Using PVDF-TrFE Copolymer,' Journal of Microelectromechanical Systems, vol. 17, pp. 334-341, 2008.
[38] H.-K. Lee, S.-I. Chang, and E. Yoon, 'A Flexible Polymer Tactile Sensor: Fabrication and Modular Expandability for Large Area Deployment,' Journal of Microelectromechanical Systems, vol. 15, pp. 1681-1686, 2006.
[39] 游崴舜, 'Motion Control of a Tilting Two-Wheeled Robot and a General-Purpose Internal Mechatronic System Architecture for Robots,' Master's thesis, Department of Mechanical Engineering, National Taiwan University, Taipei, 2012.
[40] C. Wu. VisualSFM : A Visual Structure from Motion System. Available: http://ccwu.me/vsfm/index.html
[41] C. Wu. SiftGPU: A GPU Implementation of Scale Invariant Feature Transform (SIFT). Available: http://www.cs.unc.edu/~ccwu/siftgpu/
[42] D. G. Lowe, 'Distinctive Image Features from Scale-Invariant Keypoints,' International Journal of Computer Vision, vol. 60, pp. 91-110, 2004.
[43] T. Lindeberg, 'Detecting salient blob-like image structures and their scales with a scale-space primal sketch: A method for focus-of-attention,' International Journal of Computer Vision, vol. 11, pp. 283-318, 1993.
[44] OpenCV. Camera Calibration and 3D Reconstruction. Available: http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html
[45] R. Balasubramanian, L. Xu, P. D. Brook, J. R. Smith, and Y. Matsuoka, 'Human-Guided Grasp Measures Improve Grasp Robustness on Physical Robot,' in IEEE ICRA, 2010.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/2708
dc.description.abstract: The goal of this study is to combine visual localization, an infrared proximity sensor array, a pressure sensor array, and a self-designed passively compliant gripper to achieve stable grasping of a target object placed among multiple obstacles.
Robotic arms and grippers are already widely used for standardized control tasks in industry and on automated production lines. In everyday environments, however, grasping a wide variety of objects requires more complex, adaptive control that integrates multiple sensors. The difficulty is that a mechanical gripper, unlike human skin, is not both soft and elastic. Skin conforms well to an object and allows the grasp posture to be adjusted even after contact, which tolerates errors in visual estimation and makes objects of many shapes easy to grasp. A rigid gripper, by contrast, conforms poorly, tolerates only small positioning errors, and must settle on its grasp posture before touching the object, with no adjustment during contact.
To give the gripper more conformability, this study designs a passively enveloping gripper. It exploits the properties of a planetary gear train to achieve one-input, two-output control, and because it is mechanism-driven, the full posture of the gripper can be read directly from a single rotary potentiometer. Beyond the gripper design, an infrared proximity array and a pressure array are mounted on the gripper to further improve envelopment and adaptability. The infrared array compensates for the gripper's poor tolerance of positioning error and limited shape adaptation: before grasping, it corrects positioning errors and lets the gripper conform more closely to the object's shape. The pressure array, a sensor adopted on many grippers, assesses grasp quality during grasping: whether the object is slipping (so the grasp force can be adjusted for different objects), whether the center of force is well placed, and even how soft or hard the object is, inferred from the relationship between force and the loaded area of the array. Because it provides such varied information, it is widely installed on robotic grippers.
For the pressure array to perform well, it is usually covered with a soft pad, which improves conformance to curved surfaces, provides cushioning, distributes the load more evenly, and supplies the friction needed during grasping. A thicker pad distributes load more uniformly, but it also absorbs more force and thus lowers the resolution of the pressure array. The laboratory's pressure array originally used a 1 mm silicone pad, but experiments showed that it still could not reflect the shape of the contact area well, so a custom pad with raised bumps was fabricated, with PDMS chosen as the material.
For visual localization, the OpenCV, CUDA, and SiftGPU libraries are combined into a system that builds object models and then localizes the six-axis pose of a modeled object by feature-point matching. Localization works in cluttered backgrounds and with partially occluded objects, with an error within ±4 mm. With the object model, the robot can be said to know what the object looks like, but there is still a large gap between knowing its appearance and deciding how to grasp it.
Many studies have shown that when humans grasp various objects, they use only a few principal postures combined with fine adjustments to handle all kinds of geometries. Therefore, instead of analyzing the full model to decide the grasping posture and position as other studies do, this study simplifies the model to one of a few specific primitive shapes, uses that shape to select the principal grasping posture, and then relies on the infrared array and the gripper's passive envelopment to make fine adjustments before grasping. Simplifying the model not only saves the considerable time of designing grasp postures for each object; building on this assumption, the study also proposes a low-computation grasping posture algorithm that compresses the posture computation into a two-dimensional space, greatly reducing computation time. (zh_TW)
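The slip-detection role of the pressure array described in the abstract can be illustrated with a minimal sketch in the spirit of the data-correlation approach of [30]: compare consecutive pressure-array frames, and treat a drop in their correlation as evidence that the contact pattern is shifting. This is an illustrative simplification, not the thesis's actual detector; the function names and the threshold are hypothetical.

```python
import numpy as np

def slip_score(prev_frame, curr_frame):
    """Normalized cross-correlation between two pressure-array frames.
    A score near 1 means the contact pattern is unchanged; a low score
    suggests the pattern is shifting, i.e., the object may be slipping."""
    p = prev_frame.ravel().astype(float)
    c = curr_frame.ravel().astype(float)
    p -= p.mean()
    c -= c.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(c)
    if denom == 0.0:
        return 1.0  # flat frames (no contact): treat as "no slip"
    return float(np.dot(p, c) / denom)

def detect_slip(frames, threshold=0.8):
    """Return indices of frames whose correlation with the previous
    frame falls below the (hypothetical) threshold."""
    return [i for i in range(1, len(frames))
            if slip_score(frames[i - 1], frames[i]) < threshold]
```

In practice such a detector would run on the streamed taxel frames, and a slip flag would trigger an increase in grasp force, as in the force-adjustment experiments of Chapter 4.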
dc.description.abstract: For a robot to grasp a randomly placed irregular object is a challenging task. The process includes several steps. First, the robot needs to be capable of identifying the shape, position, and posture of the object. Second, the robot needs to determine a grasping posture. Next, even when the robot can achieve the previous two tasks, the success of grasping relies on adequate contact between the gripper and the object. This issue is especially crucial when the dimensions or positioning of the object have uncertainties or when the object is fragile or soft, because a mechanical gripper, unlike the human hand, has little compliance. In short, the grasping task requires delicate coordination between the hand, eye, and arm.
This study reports on a novel low-computation object grasping method that classifies complex objects into primitive shapes and then selects the grasping posture from predefined postures associated with the approximated primitive shapes. In this approach, the object is not precisely modeled, and the grasping posture is selected from a small number of candidates without a massive search; thus, the grasping posture for the object can be derived quickly. Because the object and the primitive shape differ geometrically, the gripper is compliant and equipped with infrared proximity sensors on the fingers to compensate for the geometrical uncertainties and provide adequate contact between the object and the gripper. Furthermore, the gripper is also equipped with a pressure array for detecting slip and adjusting the grasping force. The methodology is experimentally evaluated with several types of objects in different postures. (en)
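The abstract's visual localization step recovers a six-axis object pose by matching model feature points to the scene (the thesis uses SIFT with VisualSFM, SiftGPU, and OpenCV). As a minimal, self-contained sketch of the underlying pose-recovery computation, the following assumes matched 3D-3D point correspondences are already available and solves for the rigid transform with the Kabsch (SVD) algorithm; the actual pipeline works from 2D-3D matches through OpenCV's calibration routines, so this is an analogy, not the thesis's implementation.

```python
import numpy as np

def rigid_pose(model_pts, scene_pts):
    """Least-squares rotation R and translation t mapping model points
    onto scene points (Kabsch algorithm): scene ~= R @ model + t.
    Both inputs are (N, 3) arrays of matched feature points, N >= 3
    and non-collinear."""
    m_c = model_pts.mean(axis=0)
    s_c = scene_pts.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (model_pts - m_c).T @ (scene_pts - s_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = s_c - R @ m_c
    return R, t
```

With noisy matches, the same solver is typically wrapped in RANSAC so that SIFT mismatches do not corrupt the estimated pose.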
dc.description.provenance: Made available in DSpace on 2021-05-13T06:48:45Z (GMT). No. of bitstreams: 1. ntu-106-R04522817-1.pdf: 11703778 bytes, checksum: fa2fb3916f797cf868b3c979e10d77a7 (MD5). Previous issue date: 2017 (en)
dc.description.tableofcontents:
Thesis Committee Certification I
Acknowledgments II
Abstract (Chinese) III
Abstract V
Table of Contents VI
List of Figures IX
List of Tables XVII
Chapter 1 Introduction 1
1.1 Preface 1
1.2 Motivation 1
1.3 Literature Review 2
1.4 Contributions 7
1.5 Thesis Organization 8
Chapter 2 Experimental Platform Design 9
2.1 Gripper Design 9
2.1.1 Passive Compliance Design 10
2.1.2 Mechanism Design 14
2.1.3 Gripper Design Results 19
2.2 Sensing System Design 22
2.2.1 Proximity Sensor Design 22
2.2.2 Pressure Array Soft Pad Design 31
2.3 Mechatronic System Architecture 42
2.3.1 Arm Control Architecture 43
2.3.2 Visual Localization Architecture 44
2.3.3 Gripper Control Architecture 44
2.4 Arm Inverse Kinematics Derivation 48
2.5 Chapter Summary 51
Chapter 3 Visual Localization and Grasping Posture Computation 52
3.1 Visual Localization 52
3.1.1 Introduction to the Scale-Invariant Feature Transform (SIFT) Algorithm 53
3.1.2 Introduction to VisualSFM and SiftGPU 58
3.1.3 Modeling and Localization Procedure 59
3.2 Model Simplification 62
3.2.1 Model Classification 63
3.2.2 Model Simplification Procedure 64
3.3 Grasping Posture Computation 72
3.3.1 Basic Assumptions and Concepts of the Algorithm 72
3.3.2 Computing the Optimal Grasping Posture 76
3.3.3 Defining the Reference Vector and Reference Coordinate Frame 81
3.3.4 Defining Center-of-Mass and Edge Grasping Postures 83
3.3.5 Target Ideal Grasping Posture 92
3.3.6 Computing Feasible Grasping Postures 102
3.3.7 Acquiring Obstacle Information 103
3.3.8 Grasping Posture Computation Flow in This Study 107
3.4 Grasping Software Overview 113
3.4.1 Capturing Modeling Photos 114
3.4.2 Building the Model 114
3.4.3 Calibrating the Model 115
3.4.4 Detecting the Model 115
3.4.5 Localizing the Model 116
3.5 Chapter Summary 119
Chapter 4 Grasping Applications Combining Multiple Sensors 120
4.1 Gripper Envelopment Experiment Results 120
4.2 Object Localization Experiment Results 121
4.3 Model Simplification Experiment Results 122
4.4 Grasping Posture Computation Experiment Results 126
4.4.1 Single-Object Grasping Posture Computation Results 126
4.4.2 Multi-Object Grasping Posture Computation Results 143
4.5 Slip Detection and Grasping Force Adjustment Experiment Results 148
4.5.1 Grasping Force and Slip Correlation Experiment 150
4.5.2 Slip Detection Experiment 151
4.5.3 Grasp Adjustment Experiment 156
4.6 Proximity Sensor Experiment Results 158
4.7 Chapter Summary 159
Chapter 5 Conclusions and Future Work 160
5.1 Conclusions 160
5.2 Future Work 160
References 161
dc.language.iso: zh-TW
dc.title: 以基礎幾何模型搭配具主被動自由度和掌內壓力與近接感測之夾爪達到快速低計算成本之多樣化物體夾取 (zh_TW)
dc.title: A Fast and Low-Computation Object Grasping Method by Using Primitive-Shape Models and a Compliant Gripper with In-Hand Proximity and Pressure Sensing (en)
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: Master's
dc.contributor.oralexamcommittee: 黃光裕, 顏炳郎, 連豊力
dc.subject.keyword: general-purpose gripper, visual localization, low-computation grasping posture algorithm, pressure array, proximity sensor (zh_TW)
dc.subject.keyword: passive gripper, proximity sensor, pressure array, low-computation grasping method (en)
dc.relation.page: 164
dc.identifier.doi: 10.6342/NTU201704032
dc.rights.note: Authorized for open access (worldwide)
dc.date.accepted: 2017-08-21
dc.contributor.author-college: College of Engineering (zh_TW)
dc.contributor.author-dept: Graduate Institute of Mechanical Engineering (zh_TW)
Appears in Collections: Department of Mechanical Engineering

Files in This Item:
File: ntu-106-1.pdf | Size: 11.43 MB | Format: Adobe PDF


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
