Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/90514
Title: 基於BIM模型合成資料與領域自適應提升深度學習模型辨識現地鋼筋影像能力
Improving Deep Learning-based On-site Rebar Recognition Ability via Synthetic BIM-based Data and Domain Adaptation
Authors: 黃琮煒
Tsung-Wei Huang
Advisor: 陳俊杉
Chuin-Shan Chen
Keyword: 現地鋼筋辨識,合成資料,建築資訊模型,領域自適應,深度學習,電腦視覺,Mask R-CNN
on-site rebar recognition, synthetic data, building information modeling, domain adaptation, deep learning, computer vision, Mask R-CNN
Publication Year: 2023
Degree: Master
Abstract: 鋼筋混凝土一直是被廣泛使用的建築結構,為保證其結構強度往往需要耗費大量時間與人力進行檢核,近年來有些研究提出使用深度學習與電腦視覺的方法企圖提升鋼筋檢核的效率,然而這些方法大多僅示範在實驗室中搭建的鋼筋籠,亦或是因為不足的鋼筋資料導致模型只能在相似場域上有好表現,本研究提出一套使用合成資料與領域自適應的方法,解決資料不足以及模型表現不理想的情況。
首先透過BIM模型以及自動化腳本,將建構BIM模型與生成合成資料的流程自動化,取得大量與多樣具有實例等級標註的鋼筋合成資料。接著透過Mask R-CNN學習鋼筋的特徵,進行鋼筋影像的辨識,模型學習過程中會使用領域自適應技術消弭合成資料與真實資料間的差距,提升模型的辨識能力。另外因為過去研究並沒有公開的資料集與模型,因此本研究整理與標註現地鋼筋資料,做為本研究數值比較的對象。
本研究最終得到25287張標註的鋼筋合成資料,解決資料不足的問題。在現地鋼筋的測試資料集上,本研究基於合成資料與領域自適應得到的模型比使用現地鋼筋資料做為訓練的模型,在AP50指標上提升超過三倍。此外本研究亦從收集各樣現地鋼筋資料影像供模型預測,本研究提出的模型也都有較好的預測結果,故本研究提出的模型確實有提升現地鋼筋影像辨識能力。
Reinforced concrete is widely used as a structural material in buildings. To ensure that a structure has sufficient strength, rebar inspection is required; however, traditional rebar inspection takes a great deal of time and labor. Recently, some studies have proposed deep learning and computer vision techniques to improve the efficiency of rebar inspection, but most of these methods have only been demonstrated on rebar cages built in laboratory settings, or have been limited by insufficient rebar data, so the resulting models perform well only in similar scenes. In this study, we address the challenges of limited data and suboptimal model performance by proposing a method that combines synthetic data generation with domain adaptation.
A large and varied set of synthetic data with instance-level annotations was generated by automating both BIM model construction and image synthesis with scripts. A Mask R-CNN model then learns rebar features from these data to recognize rebars in images. We incorporate domain adaptation into the training process to reduce the domain shift between the synthetic and real data and thereby boost the model's performance. Because recent related studies have not released their datasets or models, we collected and annotated an on-site rebar dataset ourselves and use it as the baseline for numerical comparison in this research.
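The abstract does not spell out the specific domain-adaptation scheme, so the Python sketch below only illustrates one common way to pair supervised Mask R-CNN training on annotated synthetic images with unlabeled real images: a DANN-style gradient-reversal domain classifier attached to the backbone features. The train_step helper, the DomainClassifier head, and the choice of FPN level are illustrative assumptions, not the author's implementation.

import torch
import torch.nn as nn
from torch.autograd import Function
from torchvision.models.detection import maskrcnn_resnet50_fpn

class GradientReversal(Function):
    # Identity in the forward pass; flips and scales the gradient in backward,
    # so the backbone learns features that fool the domain classifier.
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class DomainClassifier(nn.Module):
    # Predicts whether pooled backbone features come from synthetic (0) or real (1) images.
    def __init__(self, in_channels=256):            # FPN feature maps have 256 channels
        super().__init__()
        self.net = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(in_channels, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, feats, lam=1.0):
        return self.net(GradientReversal.apply(feats, lam))

model = maskrcnn_resnet50_fpn(weights=None, num_classes=2)   # background + rebar
domain_head = DomainClassifier()
bce = nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(
    list(model.parameters()) + list(domain_head.parameters()),
    lr=0.005, momentum=0.9, weight_decay=1e-4,
)

def train_step(syn_images, syn_targets, real_images, lam=0.1):
    # syn_images / real_images: lists of 3xHxW tensors (same size within a batch);
    # syn_targets: per-image dicts with "boxes", "labels", and "masks".
    model.train()
    loss_dict = model(syn_images, syn_targets)        # supervised Mask R-CNN losses
    det_loss = sum(loss_dict.values())

    # Adversarial domain loss on one FPN level ("0" = finest); re-running the
    # backbone here is a simplification for readability.
    syn_feats = model.backbone(torch.stack(syn_images))["0"]
    real_feats = model.backbone(torch.stack(real_images))["0"]
    logits = torch.cat([domain_head(syn_feats, lam), domain_head(real_feats, lam)])
    labels = torch.cat([torch.zeros(len(syn_images), 1), torch.ones(len(real_images), 1)])
    dom_loss = bce(logits, labels)

    optimizer.zero_grad()
    (det_loss + dom_loss).backward()
    optimizer.step()
    return det_loss.item(), dom_loss.item()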
In the end, we produced a synthetic rebar dataset of 25,287 annotated images, resolving the issue of insufficient data. On the on-site test set, the model trained with the synthetic data and domain adaptation achieves more than a threefold improvement in AP50 over the baseline model trained on on-site rebar data. The model also gives better predictions on the diverse rebar images we collected from the internet. Hence, the proposed model indeed enhances on-site rebar recognition.
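The AP50 comparison described above could be computed with a standard COCO-style evaluation. The minimal pycocotools sketch below assumes the on-site test annotations and the model's instance-segmentation results have been exported to COCO JSON; the file names are placeholders, not files released with the thesis.

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Placeholder paths: ground-truth annotations for the on-site test set and
# the model's segmentation results, both in COCO JSON format.
coco_gt = COCO("onsite_rebar_test.json")
coco_dt = coco_gt.loadRes("model_predictions.json")

evaluator = COCOeval(coco_gt, coco_dt, iouType="segm")   # evaluate instance masks
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()

# COCOeval.stats[1] is AP averaged over categories at IoU = 0.50, i.e. AP50.
print(f"AP50 on the on-site test set: {evaluator.stats[1]:.3f}")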
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/90514
DOI: 10.6342/NTU202301886
Fulltext Rights: Authorized (open access worldwide)
Embargo lift date: 2028-08-01
Appears in Collections: Department of Civil Engineering

Files in This Item:
File: ntu-111-2.pdf (available after 2028-08-01)
Size: 191.9 MB
Format: Adobe PDF


