Please use this identifier to cite or link to this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/90756
Title: | Development of Galvo-scanning Multi-line Fringe Projection Profilometry |
Authors: | Bo-Han Huang (黃柏翰) |
Advisor: | Liang-Chia Chen (陳亮嘉) |
Keyword: | 3D shape measurement, fringe projection profilometry (FPP), space mapping function, optical galvanometer scanning, peak detection algorithm |
Publication Year : | 2022 |
Degree: | Master |
Abstract: | This study develops a precision multi-line fringe projection profilometry (FPP) system built around an optical galvanometer. Thanks to the galvanometer's fast and precise rotation control, the system can scan structured light across the measured part very rapidly and reconstruct its three-dimensional shape. The method delivers both high precision and high accuracy: because the scan itself is precisely controlled, it avoids the weakness of conventional line-scanning systems, whose accuracy is easily degraded by positioning or speed deviations of the object's motion, and so effectively reduces measurement uncertainty.

To overcome the speckle effect caused by using a laser as the line-structured-light source, the projection module instead uses an LED light source with a condenser lens and a cylindrical lens array to generate a multi-line structured-light pattern, which an f-θ lens set projects into the measurement volume to form a structured-light triangulation scanning architecture. The resulting system has low power consumption and a long service life, making it well suited to long-term optical inspection processes in industry.

A robust gray-level peak detection algorithm extracts the center peak of each structured-light line in the image with sub-pixel accuracy, overcoming the depth resolution limit imposed by the image pixels. A space mapping function then converts the detected pixel points into spatial coordinates for every scanned point in the measurement field, and the results are output as a point cloud. The mapping fits a polynomial relation between spatial coordinates and pixel information; once calibrated, a full-field point cloud can be obtained in a short time, meeting the requirements of fast measurement.

The proposed optical measurement system has been verified in multiple measurement experiments: its repeatability is below 15 micrometers while maintaining accuracy below 1 micrometer. Tests on real additive-manufactured metal parts show that the system can reconstruct metal components with non-uniform surface reflectance as well as free-form surfaces, satisfying general precision-industry 3D measurement needs while avoiding the spray coating usually required when measuring glossy surfaces, which further reduces measurement uncertainty. |
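A common way to realize the sub-pixel peak detection the abstract describes is three-point parabolic interpolation around the brightest pixel of each line's intensity profile. The sketch below is an illustrative implementation under that assumption, not the thesis's exact algorithm; the function name is hypothetical.

```python
import numpy as np

def subpixel_peak(profile):
    """Estimate the sub-pixel peak position of a 1-D intensity profile
    by fitting a parabola through the maximum pixel and its two neighbors.
    Illustrative sketch only; not the thesis's exact peak detector."""
    profile = np.asarray(profile, dtype=float)
    i = int(np.argmax(profile))
    if i == 0 or i == len(profile) - 1:
        return float(i)  # peak at the border: no neighbors to interpolate
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:
        return float(i)  # flat top: fall back to the integer position
    # Vertex of the parabola through the three samples
    return i + 0.5 * (y0 - y2) / denom
```

Applied column by column across a camera frame, such an estimator localizes each projected line's center to a fraction of a pixel, which is what lets the depth resolution exceed the raw pixel grid.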
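The space mapping step fits a polynomial between pixel information and spatial coordinates during calibration, then applies it to every detected point. A minimal sketch of that idea, assuming a bivariate polynomial in the pixel coordinates (u, v) fitted by least squares to one spatial coordinate (function names and degree are illustrative assumptions, not the thesis's exact model):

```python
import numpy as np

def poly_terms(u, v, degree=2):
    """Bivariate polynomial basis u**i * v**j with i + j <= degree."""
    return np.stack([u**i * v**j
                     for i in range(degree + 1)
                     for j in range(degree + 1 - i)], axis=1)

def fit_space_mapping(pixels, coords, degree=2):
    """Least-squares fit of one spatial coordinate (e.g. depth z) as a
    polynomial in pixel coordinates (u, v), from calibration data."""
    A = poly_terms(pixels[:, 0], pixels[:, 1], degree)
    c, *_ = np.linalg.lstsq(A, coords, rcond=None)
    return c

def apply_space_mapping(c, pixels, degree=2):
    """Map detected pixel points to the calibrated spatial coordinate."""
    return poly_terms(pixels[:, 0], pixels[:, 1], degree) @ c
```

Once the coefficients are calibrated, converting a whole frame of detected peaks into point-cloud coordinates is a single matrix product, which matches the abstract's claim that full-field results are available in a short time after calibration.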
URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/90756 |
DOI: | 10.6342/NTU202210100 |
Fulltext Rights: | Authorized (restricted to campus access) |
Appears in Collections: | Department of Mechanical Engineering |
Files in This Item:
File | Size | Format
---|---|---
ntu-111-2.pdf (Restricted Access) | 7.61 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.