Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74298

Full metadata record (each line: DC field: value [language tag, where present])
dc.contributor.advisor: 詹魁元 (Kuei-Yuan Chan)
dc.contributor.author: Po-Yu Chen [en]
dc.contributor.author: 陳柏宇 [zh_TW]
dc.date.accessioned: 2021-06-17T08:28:36Z
dc.date.available: 2019-08-16
dc.date.copyright: 2019-08-16
dc.date.issued: 2019
dc.date.submitted: 2019-08-12
dc.identifier.citation:
[1] DuPont de Nemours, Inc., 2019. [Online]. Available: https://www.dupont.com/
[2] M. Hägele, K. Nilsson, J. N. Pires, and R. Bischoff, Industrial Robotics. Cham: Springer International Publishing, 2016, pp. 1385–1422.
[3] R. Y. Tsai and R. K. Lenz, “A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration,” IEEE Transactions on Robotics and Automation, vol. 5, no. 3, pp. 345–358, 1989.
[4] I. Ali, O. Suominen, A. Gotchev, and E. Ruiz Morales, “Methods for simultaneous robot-world-hand–eye calibration: A comparative study,” Sensors, vol. 19, no. 12, p. 2837, Jun 2019.
[5] ZIS Industrietechnik GmbH, “Robot calibration unit,” 2019. [Online]. Available: https://www.zis-cutting.de/en/machine-components/robot-calibration-unit/
[6] MathWorks, “What is camera calibration?” 2019. [Online]. Available: https://www.mathworks.com/help/vision/ug/camera-calibration.html
[7] Q. Wang, W.-C. Cheng, N. Suresh, and H. Hua, “Development of the local magnification method for quantitative evaluation of endoscope geometric distortion,” Journal of Biomedical Optics, vol. 21, no. 5, pp. 1–13, 2016.
[8] Waina, “Vpk.92 railroad, carroll hard,” 2019. [Online]. Available: http://digitalimagemakerworld.com/Railroad-Backgrounds/image-148991
[9] L. Nicoll, “Understanding depth of field: it’s not all about aperture,” 2015. [Online]. Available: https://fstoppers.com/education/understanding-depth-field-its-not-all-about-aperture-87014
[10] Andeggs, “Spherical coordinate system,” 2009. [Online]. Available: https://en.wikipedia.org/wiki/Spherical_coordinate_system
[11] B. Garrett, “Establishing a tool frame using the three point method,” 2017. [Online]. Available: https://www.youtube.com/watch?v=7ppEcQo7qbE
[12] Dietz Sensortechnik, “TCP laser center units,” 2019. [Online]. Available: https://en.dietz-sensortechnik.de/products/tcp-laser-center-units.html
[13] Z. Gan and Q. Tang, “Calibration of a Robot Visual System,” Advanced Topics in Science and Technology in China, pp. 93–141, 2011.
[14] B. Blesiya, “B Blesiya test indicator dial ruby probe head accessory indicator measuring tools, 1.6 mm,” 2019. [Online]. Available: https://www.amazon.com/Blesiya-Indicator-Probe-Accessory-Measuring/dp/B07MXG22KC
[15] OpenCV, “Detection of aruco markers,” 2019. [Online]. Available: https://docs.opencv.org/4.1.0/d5/dae/tutorial_aruco_detection.html
[16] ——, “Real time pose estimation of a textured object,” 2019. [Online]. Available: https://docs.opencv.org/4.1.0/dc/d2c/tutorial_real_time_pose.html
[17] ABB Group, 2019. [Online]. Available: https://new.abb.com/
[18] CENIT AG, “Offline programming system for 3D contours with robots,” 2019. [Online]. Available: http://www.cenit.com/en_EN/plm/digital-factory/software/fastcurve.html
[19] Robotics and Automation News, “US auto industry buys half of all industrial robots, says IFR,” 2016. [Online]. Available: http://roboticsandautomationnews.com/2016/03/23/us-auto-industry-buys-half-of-all-industrial-robots-says-ifr/3730/
[20] ArtiMinds Robotics GmbH, “Denso highspeed THT with ArtiMinds RPS,” 2017. [Online]. Available: https://www.artiminds.com/applications/electronics-assembly/
[21] JUKI Corporation, “The best flexible placement system for high-density placements,” 2019. [Online]. Available: https://www.juki.co.jp/smt_e/introduce/products/ke2060.html
[22] W. Swaim, “SMT line improvements for high mix, low volume electronics manufacturing,” 2011.
[23] F. Pfeiffer, “A feedforward decoupling concept for the control of elastic robots,” Journal of Robotic Systems, vol. 6, no. 4, pp. 407–416, 1989.
[24] W. H. Sunada and S. Dubowsky, “On the dynamic analysis and behavior of industrial robotic manipulators with elastic members,” ASME Journal of Mechanisms, Transmissions, and Automation in Design, vol. 105, no. 1, pp. 42–51, March 1983.
[25] T. Yoshikawa and K. Matsudera, Experimental study on modeling and control of flexible manipulators using virtual joint model. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997, pp. 423–435.
[26] C. Dumas, S. Caro, S. Garnier, and B. Furet, “Joint stiffness identification of six-revolute industrial serial robots,” Robotics and Computer-Integrated Manufacturing, vol. 27, no. 4, pp. 881–888, 2011 (conference papers of Flexible Automation and Intelligent Manufacturing).
[27] G. Alici and B. Shirinzadeh, “Enhanced stiffness modeling, identification and characterization for robot manipulators,” IEEE Transactions on Robotics, vol. 21, no. 4, pp. 554–564, Aug 2005.
[28] Z. Roth, B. Mooring, and B. Ravani, “An overview of robot calibration,” IEEE Journal on Robotics and Automation, vol. 3, no. 5, pp. 377–385, October 1987.
[29] A. De Luca and W. Book, Robots with Flexible Elements. Cham: Springer International Publishing, 2016, pp. 243–282.
[30] X. Zhang and X. Zhang, “Minimizing the influence of revolute joint clearance using the planar redundantly actuated mechanism,” Robotics and Computer-Integrated Manufacturing, vol. 46, pp. 104–113, 2017.
[31] K.-L. Li, Y.-K. Tsai, and K.-Y. Chan, “Identifying joint clearance via robot manipulation,” Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, art. 0954406217721940, 2017.
[32] P. Flores and H. M. Lankarani, “Dynamic response of multibody systems with multiple clearance joints,” Journal of Computational and Nonlinear Dynamics, vol. 7, no. 3, p. 031003, 2012.
[33] S. Mukras, N. Kim, N. Mauntler, T. Schmitz, and W. Sawyer, “Analysis of planar multibody systems with revolute joint wear,” Wear, vol. 268, no. 5, pp. 643–652, 2010.
[34] F. Ghorbel, P. Gandhi, and F. Alpeter, “On the kinematic error in harmonic drive gears,” ASME Journal of Mechanical Design, vol. 123, no. 1, pp. 90–97, 2001.
[35] C. Zou, T. Tao, G. Jiang, X. Mei, and J. Wu, “A harmonic drive model considering geometry and internal interaction,” Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, vol. 231, no. 4, pp. 728–743, 2017.
[36] T. Tuttle, “Understanding and modeling the behavior of a harmonic drive gear transmission,” Massachusetts Institute of Technology, Artificial Intelligence Laboratory, Tech. Rep., 1992.
[37] C. W. Kennedy and J. P. Desai, “Modeling and control of the Mitsubishi PA-10 robot arm harmonic drive system,” IEEE/ASME Transactions on Mechatronics, vol. 10, no. 3, pp. 263–274, June 2005.
[38] Z. Shi, Y. Li, and G. Liu, “Adaptive torque estimation of robot joint with harmonic drive transmission,” Mechanical Systems and Signal Processing, vol. 96, pp. 1–15, 2017.
[39] Y. T. Oh, “Influence of the joint angular characteristics on the accuracy of industrial robots,” Industrial Robot: the international journal of robotics research and application, vol. 38, no. 4, pp. 406–418, 2011.
[40] Y. Tsai, “Uncertainty estimation and performance optimization for vertical articulated and parallel robot manipulators,” Master’s thesis, National Taiwan University, Jan 2018.
[41] A. Huang and P. Liu, “Regressor-free adaptive control of flexible joint robot manipulators with reduced number of estimators,” in 2017 29th Chinese Control And Decision Conference (CCDC), May 2017, pp. 4038–4042.
[42] Z. Jiang, T. Ishida, and M. Sunawada, “Neural network aided dynamic parameter identification of robot manipulators,” in 2006 IEEE International Conference on Systems, Man and Cybernetics, vol. 4, Oct 2006, pp. 3298–3303.
[43] J. Wu, J. Wang, and Z. You, “An overview of dynamic parameter identification of robots,” Robotics and Computer-Integrated Manufacturing, vol. 26, no. 5, pp. 414–419, 2010.
[44] J. Motta, G. de Carvalho, and R. McMaster, “Robot calibration using a 3D vision-based measurement system with a single camera,” Robotics and Computer-Integrated Manufacturing, vol. 17, no. 6, pp. 487–497, 2001.
[45] Nikon, “Robot accuracy: Considerations to make when using robots in high accuracy applications and programming robots off-line,” Nikon Metrology, Tech. Rep., 2015.
[46] A. Nubiola and I. Bonev, “Absolute calibration of an ABB IRB 1600 robot using a laser tracker,” Robotics and Computer-Integrated Manufacturing, vol. 29, no. 1, pp. 236–245, 2013.
[47] K.-L. Li, W.-T. Yang, K.-Y. Chan, and P.-C. Lin, “An optimization technique for identifying robot manipulator parameters under uncertainty,” SpringerPlus, vol. 5, no. 1, p. 1771, Oct 2016.
[48] R. C. Luo, H. Wang, and M. H. Kuo, “Low cost solution for calibration in absolute accuracy of an industrial robot for iCPS applications,” in Proceedings of the 2018 IEEE Industrial Cyber-Physical Systems (ICPS), 2018, pp. 428–433.
[49] J. C. Chou and M. Kamel, “Finding the position and orientation of a sensor on a robot manipulator using quaternions,” International Journal of Robotics Research, vol. 10, no. 3, pp. 240–254, 1991.
[50] J. C. K. Chou and M. Kamel, “Quaternions approach to solve the kinematic equation of rotation, A_a A_x = A_x A_b, of a sensor-mounted robotic manipulator,” in Proceedings of the 1988 IEEE International Conference on Robotics and Automation, April 1988, vol. 2, pp. 656–662.
[51] R. Horaud and F. Dornaika, “Hand-eye calibration,” The International Journal of Robotics Research, vol. 14, no. 3, pp. 195–210, 1995.
[52] F. Dornaika and R. Horaud, “Simultaneous robot-world and hand-eye calibration,” IEEE Transactions on Robotics and Automation, vol. 14, no. 4, pp. 617–622, Aug 1998.
[53] K. Daniilidis, “Hand-eye calibration using dual quaternions,” The International Journal of Robotics Research, vol. 18, no. 3, pp. 286–298, 1999.
[54] J. Schmidt, F. Vogt, and H. Niemann, “Vector quantization based data selection for hand-eye calibration,” in Vision, Modeling, and Visualization, 2004, pp. 21–28.
[55] I. Fassi and G. Legnani, “Hand to sensor calibration: A geometrical interpretation of the matrix equation AX = XB,” Journal of Robotic Systems, vol. 22, no. 9, pp. 497–506, 2005.
[56] M. Ikits, “Coregistration of pose measurement devices using nonlinear least squares parameter estimation,” The University of Utah, Tech. Rep., 2001.
[57] J. Heller, M. Havlena, and T. Pajdla, “A branch-and-bound algorithm for globally optimal hand-eye calibration,” in 2012 IEEE Conference on Computer Vision and Pattern Recognition, June 2012, pp. 1608–1615.
[58] J. Heller and T. Pajdla, “World-base calibration by global polynomial optimization,” in Proceedings of the 2014 International Conference on 3D Vision (3DV), 2014, pp. 593–600.
[59] M. A. Fischler and R. C. Bolles, “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography,” Communications of the ACM, vol. 24, no. 6, pp. 381–395, 1981.
[60] X. S. Gao, X. R. Hou, J. Tang, and H. F. Cheng, “Complete solution classification for the perspective-three-point problem,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 8, pp. 930–943, 2003.
[61] V. Lepetit, F. Moreno-Noguer, and P. Fua, “EPnP: An accurate O(n) solution to the PnP problem,” International Journal of Computer Vision, vol. 81, no. 2, pp. 155–166, 2009.
[62] R. Y. Tsai, “A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses,” IEEE Journal on Robotics and Automation, vol. 3, no. 4, pp. 323–344, 1987.
[63] Z. Zhang, “A flexible new technique for camera calibration,” Microsoft Research, Tech. Rep. MSR-TR-98-71, 1998.
[64] ——, “A Flexible New Technique for Camera Calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[65] X. Meng and Z. Hu, “A new easy camera calibration technique based on circular points,” Pattern Recognition, vol. 36, no. 5, pp. 1155–1164, 2003.
[66] K. H. Strobl and G. Hirzinger, “More accurate pinhole camera calibration with imperfect planar target,” Proceedings of the IEEE International Conference on Computer Vision, pp. 1068–1075, 2011.
[67] M. Shortis, “Calibration techniques for accurate measurements by underwater camera systems,” Sensors, vol. 15, no. 12, pp. 30810–30827, 2015.
[68] L. Huang, F. Da, and S. Gai, “Research on multi-camera calibration and point cloud correction method based on three-dimensional calibration object,” Optics and Lasers in Engineering, vol. 115, pp. 32–41, 2019.
[69] B. Caprile and V. Torre, “Using vanishing points for camera calibration,” International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
[70] O. D. Faugeras, Q. T. Luong, and S. J. Maybank, “Camera self-calibration: Theory and experiments,” in Computer Vision — ECCV’92, G. Sandini, Ed. Berlin, Heidelberg: Springer Berlin Heidelberg, 1992, pp. 321–334.
[71] L. Grammatikopoulos, G. Karras, and E. Petsa, “An automatic approach for camera calibration from vanishing points,” ISPRS Journal of Photogrammetry and Remote Sensing, vol. 62, pp. 64–76, 2007.
[72] S. Huang, X. Ying, J. Rong, Z. Shang, and H. Zha, “Camera calibration from periodic motion of a pedestrian,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 3025–3033.
[73] X. You and Y. Zheng, “An accurate and practical calibration method for roadside camera using two vanishing points,” Neurocomputing, vol. 204, pp. 222–230, 2016.
[74] H. Chang and F. Tsai, “Vanishing point extraction and refinement for robust camera calibration,” Sensors (Switzerland), vol. 18, no. 1, pp. 1–19, 2018.
[75] R. Pflugfelder and H. Bischof, “Online auto-calibration in man-made worlds,” in Digital Image Computing: Techniques and Applications (DICTA’05), Dec 2005, pp. 75–75.
[76] S. Nedevschi, C. Vancea, T. Marita, and T. Graf, “Online extrinsic parameters calibration for stereovision systems used in far-range detection vehicle applications,” IEEE Transactions on Intelligent Transportation Systems, vol. 8, no. 4, pp. 651–660, Dec 2007.
[77] A. Basu, “Active calibration: alternative strategy and analysis,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. IEEE Comput. Soc. Press, 1993, pp. 495–500.
[78] Q. Ji and S. Dai, “Self-calibration of a rotating camera with a translational offset,” IEEE Transactions on Robotics and Automation, vol. 20, no. 1, pp. 1–14, Feb 2004.
[79] L. Wang, S. B. Kang, H.-Y. Shum, and G. Xu, “Error analysis of pure rotation-based self-calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 2, pp. 275–280, Feb 2004.
[80] N. Anjum and A. Cavallaro, “Single camera calibration for trajectory-based behavior analysis,” in 2007 IEEE Conference on Advanced Video and Signal Based Surveillance, Sep 2007, pp. 147–152.
[81] K. Genovese, Y. Chi, and B. Pan, “Stereo-camera calibration for large-scale DIC measurements with active phase targets and planar mirrors,” Optics Express, vol. 27, no. 6, p. 9040, Mar 2019.
[82] A. Buschhaus, M. Wagner, and J. Franke, “Inline calibration method for robot supported process tasks with high accuracy requirements,” in IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 2017, pp. 682–687.
[83] A. Joubair, M. Slamani, and I. A. Bonev, “Kinematic calibration of a five-bar planar parallel robot using all working modes,” Robotics and Computer-Integrated Manufacturing, vol. 29, no. 4, pp. 15–25, 2013.
[84] M. Morozov, J. Riise, R. Summan, S. G. Pierce, C. Mineo, C. N. MacLeod, and R. H. Brown, “Assessing the accuracy of industrial robots through metrology for the enhancement of automated non-destructive testing,” in IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2017, pp. 335–340.
[85] Y. Cai, H. Gu, C. Li, and H. Liu, “Easy industrial robot cell coordinates calibration with touch panel,” Robotics and Computer-Integrated Manufacturing, vol. 50, pp. 276–285, 2018.
[86] F. Weichert, D. Bachmann, B. Rudak, and D. Fisseler, “Analysis of the accuracy and robustness of the Leap Motion Controller,” Sensors (Switzerland), vol. 13, no. 5, pp. 6380–6393, 2013.
[87] Z. Gordić and C. Ongaro, “Calibration of robot tool centre point using camera-based system,” Serbian Journal of Electrical Engineering, vol. 13, no. 1, pp. 9–20, 2016.
[88] X. Yu, T. Baker, Y. Zhao, and M. Tomizuka, “Robot Tool Calibration in Precise Glass Handling,” p. V002T16A001, 2017.
[89] G. B. de Sousa, A. Olabi, J. Palos, and O. Gibaru, “3D metrology using a collaborative robot with a laser triangulation sensor,” Procedia Manufacturing, vol. 11, pp. 132–140, 2017.
[90] G. Li and M. Vossiek, “A multilateral synthetic aperture wireless positioning approach to precise 3D localization of a robot tool center point,” in 2011 IEEE Topical Conference on Wireless Sensors and Sensor Networks (WiSNet), 2011, pp. 37–40.
[91] Y. Sun, D. J. Giblin, and K. Kazerounian, “Accurate robotic belt grinding of workpieces with complex geometries using relative calibration techniques,” Robotics and Computer-Integrated Manufacturing, vol. 25, no. 1, pp. 204–210, 2009.
[92] J. Xu, Z. Hou, Z. Liu, and H. Qiao, “Compare contact model-based control and contact model-free learning: A survey of robotic peg-in-hole assembly strategies,” CoRR, vol. abs/1904.05240, 2019.
[93] K. Zhang, J. Xu, H. Chen, J. Zhao, and K. Chen, “Jamming analysis and force control for flexible dual peg-in-hole assembly,” IEEE Transactions on Industrial Electronics, vol. 66, no. 3, pp. 1930–1939, 2019.
[94] J. Xu, Z. Hou, W. Wang, B. Xu, K. Zhang, and K. Chen, “Feedback deep deterministic policy gradient with fuzzy reward for robotic multiple peg-in-hole assembly tasks,” IEEE Transactions on Industrial Informatics, vol. 15, no. 3, pp. 1658–1667, 2019.
[95] Y. Fei and X. Zhao, “An assembly process modeling and analysis for robotic multiple peg-in-hole,” Journal of Intelligent and Robotic Systems, vol. 36, pp. 175–189, 2003.
[96] Y. L. Kim, H. C. Song, and J.-B. Song, “Hole detection algorithm for chamferless square peg-in-hole based on shape recognition using F/T sensor,” International Journal of Precision Engineering and Manufacturing, vol. 15, no. 3, pp. 425–432, 2014.
[97] H. C. Song, Y. L. Kim, and J. B. Song, “Automated guidance of peg-in-hole assembly tasks for complex-shaped parts,” in IEEE International Conference on Intelligent Robots and Systems (IROS), 2014, pp. 4517–4522.
[98] J. Y. Kim, D. Kang, and H. S. Cho, “A flexible-parts assembly algorithm based on a visual sensing system,” in Proceedings of the 4th IEEE International Symposium on Assembly and Task Planning, Fukuoka, 2001, pp. 417–422.
[99] J. Y. Kim and H. S. Cho, “Neural net-based assembly algorithm for flexible parts assembly,” Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 29, no. 2, pp. 133–160, 2000.
[100] T. Fukukawa, J. Takahashi, and T. Fukuda, “Assembly algorithm for plastic ring with characteristic finger shape,” 2012 IEEE/SICE International Symposium on System Integration, SII 2012, pp. 470–475, 2012.
[101] R. K. Jain, S. Majumder, and A. Dutta, “SCARA based peg-in-hole assembly using compliant IPMC micro gripper,” Robotics and Autonomous Systems, vol. 61, no. 3, pp. 297–311, 2013.
[102] M. P. Polverini, A. M. Zanchettin, S. Castello, and P. Rocco, “Sensorless and constraint based peg-in-hole task execution with a dual-arm robot,” in Proceedings of the IEEE International Conference on Robotics and Automation, June 2016, pp. 415–420.
[103] H. Park, J. Park, D.-H. Lee, J.-H. Park, M.-H. Baeg, and J.-H. Bae, “Compliance-based robotic peg-in-hole assembly strategy without force feedback,” IEEE Transactions on Industrial Electronics, vol. 64, no. 8, pp. 6299–6309, 2017.
[104] A. Wan, J. Xu, H. Chen, S. Zhang, and K. Chen, “Optimal path planning and control of assembly robots for hard-to-measure easy-to-deform components,” IEEE/ASME Transactions on Mechatronics, 2017.
[105] W. C. Chang and C. H. Wu, “Automated USB peg-in-hole assembly employing visual servoing,” in 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), 2017, pp. 352–355.
[106] R. J. Chang, C. Y. Lin, and P. S. Lin, “Visual-based automation of peg-in-hole microassembly process,” Journal of Manufacturing Science and Engineering, vol. 133, no. 4, p. 041015, 2011.
[107] Y. M. Zhao, Y. Lin, F. Xi, S. Guo, and P. Ouyang, “Switch-based sliding mode control for position-based visual servoing of robotic riveting system,” Journal of Manufacturing Science and Engineering, vol. 139, no. 4, p. 041010, 2016.
[108] H. Liu, W. Zhu, H. Dong, and Y. Ke, “An adaptive ball-head positioning visual servoing method for aircraft digital assembly,” Assembly Automation, vol. 39, no. 2, pp. 287–296, 2019.
[109] P. Nagarajan, S. Saravana Perumaal, and B. Yogameena, “Vision based pose estimation of multiple peg-in-hole for robotic assembly,” in Computer Vision, Graphics, and Image Processing, S. Mukherjee, S. Mukherjee, D. P. Mukherjee, J. Sivaswamy, S. Awate, S. Setlur, A. M. Namboodiri, and S. Chaudhury, Eds. Cham: Springer International Publishing, 2017, pp. 50–62.
[110] A. Zeng, K. T. Yu, S. Song, D. Suo, E. Walker, A. Rodriguez, and J. Xiao, “Multi-view self-supervised deep learning for 6D pose estimation in the Amazon Picking Challenge,” in Proceedings of the IEEE International Conference on Robotics and Automation, 2017, pp. 1386–1393.
[111] Y. Wang, W. Chao, D. Garg, B. Hariharan, M. Campbell, and K. Q. Weinberger, “Pseudo-lidar from visual depth estimation: Bridging the gap in 3D object detection for autonomous driving,” CoRR, vol. abs/1812.07179, 2018.
[112] Z. Liu, Y. Xie, J. Xu, and K. Chen, “Laser tracker based robotic assembly system for large scale peg-hole parts,” in 4th Annual IEEE International Conference on Cyber Technology in Automation, Control and Intelligent Systems (IEEE-CYBER), 2014, pp. 574–578.
[113] Z. Qin, P. Wang, J. Sun, J. Lu, and H. Qiao, “Precise robotic assembly for large-scale objects based on automatic guidance and alignment,” IEEE Transactions on Instrumentation and Measurement, vol. 65, no. 6, pp. 1398–1411, 2016.
[114] H. Bruyninckx, S. Dutre, and J. De Schutter, “Peg-on-hole: a model based solution to peg and hole alignment,” in Proceedings of 1995 IEEE International Conference on Robotics and Automation, vol. 2, 1995, pp. 1919–1924.
[115] S. R. Chhatpar and M. S. Branicky, “Search strategies for peg-in-hole assemblies with position uncertainty,” in Proceedings 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems: Expanding the Societal Role of Robotics in the Next Millennium, vol. 3, Oct 2001, pp. 1465–1470.
[116] D. E. Whitney, “Historical perspective and state of the art in robot force control,” The International Journal of Robotics Research, vol. 6, no. 1, pp. 3–14, Mar 1987.
[117] W. Newman, Y. Zhao, and Y.-H. Pao, “Interpretation of force and moment signals for compliant peg-in-hole assembly,” in Proceedings 2001 IEEE International Conference on Robotics and Automation (ICRA), vol. 1, 2001, pp. 571–576.
[118] I.-W. Kim, D.-J. Lim, and K.-I. Kim, “Active peg-in-hole of chamferless parts using force/moment sensor,” in Proceedings 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, Oct 1999, pp. 948–953.
[119] K. Van Wyk, M. Culleton, J. Falco, and K. Kelly, “Comparative peg-in-hole testing of a force-based manipulation controlled robotic hand,” IEEE Transactions on Robotics, vol. 34, no. 2, pp. 542–549, 2018.
[120] D. E. Whitney, “Quasi-static assembly of compliantly supported rigid parts,” Journal of Dynamic Systems, Measurement, and Control, vol. 104, no. 1, pp. 65–77, Mar 1982.
[121] R. H. Sturges, Jr. and S. Laowattana, “Design of an orthogonal compliance for polygonal peg insertion,” Journal of Mechanical Design, vol. 118, no. 1, pp. 106–114, Mar 1996.
[122] S. Lee, “Development of a New Variable Remote Center Compliance (VRCC) With Modified Elastomer Shear Pad (ESP) for Robot Assembly,” IEEE Transactions on Automation Science and Engineering, vol. 2, no. 2, pp. 193–197, 2005.
[123] C.-C. Cheng and G.-S. Chen, “A multiple RCC device for polygonal peg insertion,” JSME International Journal Series C: Mechanical Systems, Machine Elements and Manufacturing, vol. 45, no. 1, pp. 306–315, 2002.
[124] A. De Luca and R. Mattone, “Sensorless robot collision detection and hybrid force/motion control,” in Proceedings of the 2005 IEEE International Conference on Robotics and Automation, 2005, pp. 999–1004.
[125] H. Lee and J. Park, “An active sensing strategy for contact location without tactile sensors using robot geometry and kinematics,” Autonomous Robots, vol. 36, no. 1, pp. 109–121, Jan 2014.
[126] ONDU Pinhole, “135 pocket pinhole camera,” 2019. [Online]. Available: https://ondupinhole.com/products/135-pocket-pinhole-camera
[127] B. Mellish, “How a pinhole camera works,” 2008. [Online]. Available: http://commons.wikimedia.org/wiki/Image:Pinhole-camera.png
[128] Lord Rayleigh, “Some applications of photography,” Nature, vol. 44, no. 1133, pp. 249–254, 1891.
[129] G. Q. Wei and S. D. Ma, “Implicit and explicit camera calibration: Theory and experiments,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 5, pp. 469–480, May 1994.
[130] D. C. Brown, “Close-range camera calibration,” Photogrammetric Engineering, vol. 37, no. 8, pp. 855–866, 1971.
[131] J. Weng, P. Cohen, and M. Herniou, “Camera calibration with distortion models and accuracy evaluation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 10, pp. 965–980, Oct 1992.
[132] J. Wang, F. Shi, J. Zhang, and Y. Liu, “A new calibration model of camera lens distortion,” Pattern Recognition, vol. 41, no. 2, pp. 607–615, 2008.
[133] Fabcon, Inc., “Lincoln robotic welder,” 2019. [Online]. Available: http://www.fabcon.com/about/capabilities/lincoln-robotic-welder/
[134] OnRobot, “OnRobot launches customizable grippers for collaborative robots,” 2017. [Online]. Available: https://www.manufacturing.net/product-announcement/2017/08/robot-launches-customizable-grippers-collaborative-robots
[135] J. Schmalz GmbH, “Tool center point VEE-TCP,” 2019. [Online]. Available: https://www.schmalz.com/en/vacuum-technology-for-automation/vacuum-components/area-gripping-systems-and-end-effectors/vacuum-end-effectors-vee/tool-center-point-vee-tcp
[136] ABB Group, “BullsEye: ‘tool centre point’ calibration,” 2019. [Online]. Available: https://new.abb.com/products/robotics/application-equipment-and-accessories/arc-welding-equipment/process-support-tools/bullseye
[137] LEONI, “advintec TCP: calculation and calibration of robotic tools and fixtures in up to 6 dimensions,” 2019. [Online]. Available: https://www.leoni-factory-automation.com/en/products-and-services/calibration-of-robotic-tools-fixtures/
[138] F. Romero Ramirez, R. Muñoz-Salinas, and R. Medina-Carnicer, “Speeded up detection of squared fiducial markers,” Image and Vision Computing, vol. 76, Jun 2018.
[139] S. Garrido-Jurado, R. Muñoz-Salinas, F. Madrid-Cuevas, and R. Medina-Carnicer, “Generation of fiducial marker dictionaries using mixed integer linear programming,” Pattern Recognition, vol. 51, Oct 2015.
[140] T. Sueishi and M. Ishikawa, “Circle grid fractal pattern for calibration at different camera zoom levels,” in SIGGRAPH Asia 2015 Posters, ser. SA ’15. New York, NY, USA: ACM, 2015, pp. 21:1–21:1.
[141] S. Garrido-Jurado, R. Muñoz-Salinas, F. Madrid-Cuevas, and M. Marín-Jiménez, “Automatic generation and detection of highly reliable fiducial markers under occlusion,” Pattern Recognition, vol. 47, no. 6, pp. 2280–2292, 2014.
[142] B. Greenway, “Robot accuracy,” Industrial Robot: An International Journal, vol. 27, no. 4, pp. 257–265, 2000.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74298
dc.description.abstract: For robot manipulators performing high-precision peg-in-hole assembly tasks, the transformations between the system's coordinate frames (e.g., robot base, flange, camera, tool, and workpiece) strongly affect accuracy. Geometric and dimensional uncertainties such as component assembly errors and machining errors make these transformation matrices imprecise, so the ideal transformations from the mechanism design cannot be relied upon. This thesis therefore proposes an offline calibration method for vertical articulated robot manipulators in high-precision peg-in-hole tasks. The method is accurate, flexible, and robust, and calibrates the coordinate transformations of the manipulator system offline, which suits production lines that cannot be stopped for calibration: the camera and tool can be calibrated off-site first, after which the system is moved to the line to perform insertion tasks. The method is image-based but does not rely on real-time visual feedback; instead, a calibration board is photographed from multiple fixed poses, and camera calibration and hand-eye calibration are performed by relating the detected image-scale features of the board to their physical dimensions. Accuracy is evaluated with an actual peg-in-hole assembly task. The workflow is demonstrated and discussed on both a virtual and a real system. In the virtual system, an ideal environment in which all information is known verifies the feasibility and accuracy of the algorithm; the peg-in-hole task achieves a position accuracy of 0.0706 mm and an orientation accuracy of 0.1373 degrees. The real system involves environmental uncertainties, but the method is robust: even with environmental noise and assembly errors, an absolute accuracy of ±0.25 mm is achieved, more than twice the accuracy of the uncalibrated system. [zh_TW]
dc.description.abstract: The uncertainties of a robot manipulator include the inaccurate transformations between coordinate frames (e.g., robot base, flange, camera, tool, and workpiece) and the inherent variations within each link and mechanical component. In this thesis, we propose an offline calibration method for vertical articulated manipulators in high-precision peg-in-hole assembly tasks. The method detects a fiducial pattern and marker in images captured by a well-calibrated camera and uses them to determine the transformation matrices between the coordinate frames. Accuracy is evaluated on an actual peg-in-hole assembly task, demonstrated in both a virtual and a real environment. The virtual system, an ideal environment in which all information is known, verifies the feasibility and accuracy of the algorithm: simulation results show that peg-in-hole assembly can reach 0.0706 mm position error and 0.1373 degree orientation error for hole locations that are not known a priori. Even with environmental noise and unit uncertainty in the real system, an absolute accuracy level of ±0.25 mm is still reached, more than twice the accuracy of the uncalibrated system. [en]
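For orientation, the hand-eye calibration step the abstracts refer to is conventionally posed as solving A_i X = X B_i, where A_i is the relative motion of the robot flange between two imaging poses, B_i is the corresponding relative motion of the calibration target as seen by the camera, and X is the unknown camera-to-flange transform. The sketch below is an illustrative reconstruction of that step using OpenCV (whose calibration and ArUco modules the thesis cites), not the thesis's own code from Appendix A; the chessboard target, board dimensions, and the flange_poses input are assumptions made for this example.

```python
# Illustrative sketch (not the thesis implementation): intrinsic camera
# calibration followed by hand-eye calibration, assuming OpenCV >= 4.1,
# a fixed chessboard target, and robot flange poses logged per image.
import cv2
import numpy as np

def calibrate_camera(images, board_size=(9, 6), square_mm=10.0):
    # Zhang-style intrinsic calibration from board images taken at
    # multiple robot poses while the board stays fixed.
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
    objp *= square_mm  # corner grid in the board frame (Z = 0 plane)

    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    # K: intrinsics; dist: lens distortion; rvecs/tvecs: board pose per view.
    _, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, size, None, None)
    return K, dist, rvecs, tvecs

def calibrate_hand_eye(flange_poses, rvecs, tvecs):
    # Solve A X = X B for the camera-to-flange transform X.
    # flange_poses: 4x4 base->flange matrices from the robot controller,
    # one per image used in calibrate_camera.
    R_g2b = [T[:3, :3] for T in flange_poses]
    t_g2b = [T[:3, 3] for T in flange_poses]
    R_t2c = [cv2.Rodrigues(r)[0] for r in rvecs]  # board pose in camera frame
    R_c2g, t_c2g = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, list(tvecs))
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_c2g, t_c2g.ravel()
    return X  # camera pose expressed in the flange frame
```

Once X is known, a hole pose detected in the camera frame can be chained through the base-to-flange and flange-to-camera transforms into the robot base frame to command the insertion motion.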
dc.description.provenance: Made available in DSpace on 2021-06-17T08:28:36Z (GMT). No. of bitstreams: 1. ntu-108-R06522615-1.pdf: 98329287 bytes, checksum: d620f3d18ccd3e4451a3996813e63678 (MD5). Previous issue date: 2019. [en]
dc.description.tableofcontents:
Thesis Committee Certification i
Acknowledgements ii
Abstract (Chinese) iv
Abstract (English) v
Table of Contents vi
List of Figures ix
List of Tables xii
Chapter 1 Introduction 1
1.1 Preface 1
1.2 Roles and applications of robot manipulators in factories 1
1.3 Articulated manipulators in peg-in-hole assembly tasks 4
1.4 Research motivation and objectives 5
1.5 Thesis organization 7
Chapter 2 Literature Review 9
2.1 Accuracy improvement methods for industrial robot manipulators 9
2.1.1 Parameter identification of manipulator uncertainty models 9
2.1.2 Experimental calibration of manipulators 11
2.1.3 Summary 11
2.2 Camera-based calibration methods for manipulators 12
2.2.1 Hand-eye calibration 12
2.2.2 Camera calibration 14
2.2.3 Summary 16
2.3 Tool calibration methods for manipulators 16
2.3.1 Summary 18
2.4 Literature on manipulators in peg-in-hole assembly tasks 18
2.4.1 Vision sensors 18
2.4.2 Force/torque sensors 19
2.4.3 End-effector mechanisms 19
2.4.4 Sensorless approaches 19
2.4.5 Summary 20
Chapter 3 Model Construction 21
3.1 Industrial camera model 21
3.1.1 Pinhole camera model 21
3.1.2 Lens optical distortion model 24
3.1.3 Homogeneous coordinate system 26
3.1.4 Homography matrix 27
3.2 Camera calibration methods 31
3.2.1 Zhang's camera calibration method 31
3.2.2 Principles of zoom and focus 34
3.3 Manipulator calibration methods 40
3.3.1 Coordinate systems 40
3.3.2 Hand-eye calibration method 42
3.3.3 Tool center point calibration method 43
3.3.4 Workpiece coordinate calibration method 47
Chapter 4 Methodology 49
4.1 Environment setup 52
4.2 Offline Calibration 56
4.2.1 Robot Calibration 56
4.2.2 Centering 57
4.2.3 Focusing 58
4.2.4 Computing photographing poses (Auto Pose) 59
4.2.5 Camera Calibration 59
4.2.6 Hand/Eye Calibration 62
4.2.7 Tool Center Point Calibration 64
4.3 Online Operation 64
4.3.1 Hole Searching 64
4.3.2 Peg-in-Hole 69
4.4 Accuracy verification 70
Chapter 5 Engineering Case Study 71
5.1 Hardware specifications and control environment 71
5.2 Test conditions 72
5.3 Automated calibration procedure 74
5.3.1 Offline Calibration 74
5.3.2 Online Operation 80
5.3.3 Peg-in-Hole 83
5.3.4 Accuracy verification 83
5.4 Error sources 90
5.4.1 Errors from the manipulator body 90
5.4.2 Errors from vision images 91
5.4.3 Errors from the machining tool 91
5.4.4 Errors from numerical computation 92
Chapter 6 Conclusions and Future Work 93
6.1 Conclusions 93
6.2 Recommendations and future research directions 94
Appendix A Source Code 96
References 97
dc.language.iso: zh-TW
dc.subject: 機械手臂 (robot manipulator) [zh_TW]
dc.subject: 手眼校正 (hand-eye calibration) [zh_TW]
dc.subject: 相機校正 (camera calibration) [zh_TW]
dc.subject: 離線校正 (offline calibration) [zh_TW]
dc.subject: 插件 (peg-in-hole) [zh_TW]
dc.subject: 組裝 (assembly) [zh_TW]
dc.subject: 最佳化 (optimization) [zh_TW]
dc.subject: Assembly [en]
dc.subject: Optimization [en]
dc.subject: Robot manipulator [en]
dc.subject: Hand/Eye Calibration [en]
dc.subject: Camera Calibration [en]
dc.subject: Offline Calibration [en]
dc.subject: Peg-in-Hole [en]
dc.title: 垂直多關節機械手臂應用於高精密插件組裝任務之離線校正方法 [zh_TW]
dc.title: Offline Calibration for Vertical Articulated Robot Manipulator in High Precision Peg-in-Hole Assembly [en]
dc.type: Thesis
dc.date.schoolyear: 107-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 李志中 (Jyh-Jone Lee); 徐冠倫 (Kuan-Lun Hsu)
dc.subject.keyword: 機械手臂, 手眼校正, 相機校正, 離線校正, 插件, 組裝, 最佳化 [zh_TW]
dc.subject.keyword: Robot manipulator, Hand/Eye Calibration, Camera Calibration, Offline Calibration, Peg-in-Hole, Assembly, Optimization [en]
dc.relation.page: 110
dc.identifier.doi: 10.6342/NTU201902922
dc.rights.note: 有償授權 (authorized with compensation)
dc.date.accepted: 2019-08-13
dc.contributor.author-college: 工學院 (College of Engineering) [zh_TW]
dc.contributor.author-dept: 機械工程學研究所 (Graduate Institute of Mechanical Engineering) [zh_TW]
Appears in Collections: 機械工程學系 (Department of Mechanical Engineering)

Files in This Item:
File: ntu-108-1.pdf (restricted, not authorized for public access)
Size: 96.02 MB
Format: Adobe PDF


Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
