Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/92026
Title: | High-accuracy detection and monitoring on the position and orientation of the robotic arm end-effector using optical measurement |
Author: | Sheng-Hao Huang (黃聖皓) |
Advisor: | Liang-Chia Chen (陳亮嘉) |
Keywords: | Digital structured light projection profilometry, Point cloud alignment, Variations in the position and orientation, Regional surface area descriptor, Iterative closest point |
Year of Publication: | 2021 |
Degree: | Master's |
Abstract: | This thesis aims to develop an innovative detection system for the position and orientation of a robotic arm end-effector. Digital structured light projection profilometry is used to reconstruct the 3D profile of the object, the end-effector features of a wafer tri-axis robotic arm serve as the measurement target, and point cloud alignment techniques are used to obtain the relative variations in the position and orientation of the object. The method overcomes two disadvantages of traditional 2D image processing, namely its inability to obtain the object's depth information and its need for an additional calibration target as a reference, and thereby realizes a target-free algorithm for detecting 3D position and orientation variations. Digital structured light projection profilometry enables rapid reconstruction of the 3D surface profile and establishes a highly recognizable object feature point cloud as the data basis for detecting variations in relative position and orientation. Using the proposed method, the variations in the position and orientation of the robotic arm end-effector, including X, Y, Z, roll, pitch, and yaw, can be obtained, which overcomes the limitation that traditional 2D image inspection can only recover translations in the X and Y directions.

Starting from the object feature point cloud, the proposed method applies a regional surface area descriptor to coarsely align the objects in 3D space, and then uses an optimized iterative closest point (ICP) algorithm to quickly and accurately fit the model and object point clouds, so that the variations in the position and orientation of the object can be detected. Because no additional target is needed to establish the measurement reference, the method achieves target-free measurement. To verify the feasibility and measurement accuracy of the proposed system, a standard-sphere calibration target was measured with the developed 3D optical probe. The measurement results show that the probe achieves an accuracy of 60 μm and a depth resolution of 100 μm over the full measurement depth of 100 mm. Applied to automated in-line detection of the position and orientation of the robotic arm end-effector, the experimental results show a positioning accuracy of 100 μm in the X, Y, and Z directions over the entire measurement range. |
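The fine-alignment step described in the abstract is a point-to-point ICP between the model and object point clouds. The following is a minimal sketch of that idea, not the thesis's actual optimized implementation: it assumes the coarse descriptor step has already provided a rough initial alignment, and all function names are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B via SVD (Kabsch)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # reject a reflection, keep a proper rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(source, target, max_iter=50, tol=1e-8):
    """Point-to-point ICP: returns the 4x4 homogeneous transform aligning source to target."""
    src = source.copy()
    tree = cKDTree(target)
    T = np.eye(4)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)                 # nearest-neighbour correspondences
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t                         # apply the incremental transform
        step = np.eye(4)
        step[:3, :3] = R
        step[:3, 3] = t
        T = step @ T                                # accumulate into the total transform
        err = dist.mean()
        if abs(prev_err - err) < tol:               # converged
            break
        prev_err = err
    return T
```

A production version would add the accelerations the abstract alludes to (correspondence rejection, point-to-plane error, subsampling); this sketch only shows the basic alternation between matching and rigid fitting.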
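Once the alignment transform is known, the six pose-variation parameters (X, Y, Z, roll, pitch, yaw) reported in the abstract can be read off the 4×4 rigid-body transform. A minimal sketch, assuming a ZYX roll-pitch-yaw Euler convention (the thesis may use a different convention):

```python
import numpy as np

def pose_from_transform(T):
    """Extract X, Y, Z translations and roll, pitch, yaw angles (radians)
    from a 4x4 homogeneous transform, assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    R, t = T[:3, :3], T[:3, 3]
    pitch = np.arcsin(-R[2, 0])               # third row starts with -sin(pitch)
    roll = np.arctan2(R[2, 1], R[2, 2])       # sin(roll)cos(pitch) / cos(roll)cos(pitch)
    yaw = np.arctan2(R[1, 0], R[0, 0])        # sin(yaw)cos(pitch) / cos(yaw)cos(pitch)
    return t[0], t[1], t[2], roll, pitch, yaw
```

The arctan2 form keeps the correct quadrant for roll and yaw; the decomposition degenerates only at pitch = ±90°, far outside the small end-effector variations measured here.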
URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/92026 |
DOI: | 10.6342/NTU202102889 |
Full-Text Authorization: | Authorized (campus access only) |
Appears in Collections: | Department of Mechanical Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-110-2.pdf (currently not authorized for public access) | 41.63 MB | Adobe PDF | View/Open |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.