NTU Theses and Dissertations Repository › 工學院 › 土木工程學系
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88604
Full metadata record

DC Field | Value | Language
dc.contributor.advisor | 謝尚賢 | zh_TW
dc.contributor.advisor | Shang-Hsien HSIEH | en
dc.contributor.author | 安里達 | zh_TW
dc.contributor.author | Aritra Pal | en
dc.date.accessioned | 2023-08-15T17:01:35Z | -
dc.date.available | 2023-11-09 | -
dc.date.copyright | 2023-08-15 | -
dc.date.issued | 2023 | -
dc.date.submitted | 2023-08-01 | -
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88604 | -
dc.description.abstract | 營建工程的進度管控是工程成功交付的重要關鍵,而目前影像資料已是瞭解工程進度的重要資訊來源之一。在過去幾十年中,許多研究開發應用電腦視覺於建築施工的自動進度監測方法,這些方法在監測個別元件(如柱子、樑、牆壁)方面非常有效,但在監測由元件組成之排程工項工程進度(如一樓模板工程、鋼筋綁紮、混凝土澆置)仍有困難。現有方法通常難以推測介於已建或未建之間的未完成進度之工程狀態,因此限制了此技術在分析細節進度資訊方面的應用。本研究旨在透過兩種新的方法解決上述研究問題。第一種方法稱為排程工項進度監測系統(ALPMS),旨在監測正在施工的元件之活動級別進度,主要以施工現場影像和四維建築資訊模型(BIM)作為輸入,產出數位孿生資訊系統。該系統從影像中生成實境現場的點雲,將其與原排程中的BIM進行比較,並應用基於深度學習的語義分割進行進度推理,因此可估計每個工項的完成百分比,並為更新專案進度提供有價值的資訊。數位孿生資訊系統同時可將語義資訊整合到實際建造的點雲和BIM中,實現進度狀態的三維可視化。第二種方法則探討在缺乏更新的四維BIM的情況下,自動比對專案進度與實際模型。該比對方法首先使用三維BIM或控制點將實際模型對應到世界坐標系統,然後應用點雲分割來檢測與特定位置、建築元件和工項相關的進度,並使用基於自然語言處理(NLP)的技術從每個工項中提取相關位置、元件和任務的資訊。從實際模型和工項中提取的資料通過基於語意距離的匹配技術進行比對,再將進度資訊與相應的進度活動進行對應。以上提出的方法已在台灣的工程專案上應用及評估,並展示了其有效性和適用性:ALPMS成功分析工項等級的進度狀態,平均絕對誤差少於6%,而在缺乏四維BIM的情況下仍可準確更新進度資料。這些方法透過提供元件和工項等級的進度分析,為工程進度監測領域做出進一步的貢獻,也使人們更容易理解專案狀態,以實現高效的進度管理與明智的決策,進而促進專案成功交付。 | zh_TW
dc.description.abstract | Monitoring the progress of construction projects is crucial for ensuring successful project delivery. Visual data, such as images and videos, has emerged as a valuable source of information for understanding the status of construction operations. Over the past few decades, several vision-based methods have been developed for automated progress monitoring in building construction. These methods have been effective in monitoring individual elements (e.g., columns, beams, walls) but face challenges in monitoring progress at the schedule activity level (e.g., formwork, reinforcement, concrete). Existing methods often struggle to report progress status beyond a binary form of built or not-built, limiting their usefulness in capturing nuanced progress information. This research addresses these challenges through two novel methodologies. The first methodology, called the Activity Level Progress Monitoring System (ALPMS), aims to monitor progress at the activity level of under-construction building elements. It takes construction site images and a four-dimensional building information model (BIM) as input and creates a Digital Twin (DT) information system. The system generates as-built point clouds from the images, compares them with the as-planned BIM, and applies deep-learning-based semantic segmentation for progress reasoning. This enables the estimation of activity-wise completion percentages, providing valuable information for updating project schedules. The DT information system also integrates rich semantic information into the as-built point cloud and BIM, enabling three-dimensional visualization of progress status. The second methodology focuses on automatically aligning project schedules with reality models, even in the absence of an updated 4D BIM. The alignment method starts by registering reality models to the world coordinate system using a 3D BIM or control points. Point cloud segmentation is then applied to detect progress associated with specific locations, building elements, and tasks. Information about locations, elements, and tasks is extracted from each schedule activity using natural language processing (NLP)-based techniques. The information extracted from the reality models and the schedule activities is matched through a distance-based matching technique, mapping the progress information to the corresponding schedule activities. The proposed methodologies have been applied and evaluated on construction projects in Taiwan, demonstrating their effectiveness and applicability. The ALPMS reports activity-level progress status with less than 6% mean absolute error, and the automatic alignment method shows promise in accurately updating progress information without relying on an updated 4D BIM. These methodologies contribute to the field of construction progress monitoring by providing accurate and detailed insights into progress at both the element and activity levels. They enable a better understanding of project status, efficient schedule management, and informed decision-making, ultimately facilitating successful project delivery. | en
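The abstract's activity-wise completion-percentage step (semantic segmentation of a site view, followed by area reasoning against the as-planned element) can be sketched minimally. The label scheme, the toy mask, and the area-ratio formula below are illustrative assumptions, not the thesis's exact procedure:

```python
# Minimal sketch of activity-wise completion estimation from a semantic
# segmentation mask. Labels, mask, and formula are illustrative assumptions.

from collections import Counter

# Hypothetical per-pixel labels of an orthographic view of one wall element:
# 0 = not yet built (background), 1 = formwork, 2 = rebar, 3 = concrete.
mask = [
    [1, 1, 2, 2, 3, 3],
    [1, 1, 2, 2, 3, 3],
    [1, 1, 2, 2, 0, 0],
    [1, 1, 0, 0, 0, 0],
]

def completion_percentages(mask, element_classes=(1, 2, 3)):
    """Estimate per-activity completion as the share of the element's
    as-planned footprint covered by each activity's class."""
    counts = Counter(label for row in mask for label in row)
    # Here the as-planned footprint is taken as the whole view; in practice
    # it would come from projecting the BIM element into the same view.
    footprint = sum(counts[c] for c in element_classes) + counts[0]
    return {c: 100.0 * counts[c] / footprint for c in element_classes}

print(completion_percentages(mask))
```

With this toy mask, formwork (class 1) covers 8 of 24 footprint pixels, so its estimated completion is about 33%, with rebar and concrete correspondingly lower.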
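The second methodology's distance-based matching of extracted location-element-material (L-E-M) information can likewise be sketched. The activity IDs, the L-E-M strings, and the use of `difflib` character similarity are hypothetical stand-ins; the thesis extracts this information with NLP-based techniques and matches via semantic distances:

```python
# Minimal sketch of mapping a progress record from a reality model to the
# closest schedule activity via L-E-M strings. All names and the similarity
# measure are illustrative assumptions.

from difflib import SequenceMatcher

# Hypothetical L-E-M strings extracted from schedule activities.
activities = {
    "A100": "3F / column / rebar",
    "A110": "3F / column / concrete",
    "A200": "4F / slab / formwork",
}

# Hypothetical L-E-M string derived from a segmented reality model.
progress_record = "3F / column / concrete"

def best_match(record, activities):
    """Return the activity whose L-E-M string is closest to the record
    (1 - similarity ratio acts as the distance being minimized)."""
    return max(
        activities,
        key=lambda a: SequenceMatcher(None, record, activities[a]).ratio(),
    )

print(best_match(progress_record, activities))  # prints "A110"
```

The matched activity then receives the completion percentage estimated for that location and element, which is how progress reaches the schedule without an updated 4D BIM.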
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T17:01:35Z. No. of bitstreams: 0 | en
dc.description.provenance | Made available in DSpace on 2023-08-15T17:01:35Z (GMT). No. of bitstreams: 0 | en
dc.description.tableofcontents | Table of Contents
Verification Letter from the Oral Examination Committee i
Acknowledgements iii
摘要 v
Abstract vii
Table of Contents xi
List of Figures xv
List of Tables xix
Abbreviations xxi
Chapter 1 Introduction 1
1.1 Motivation and background 1
1.2 Problem statement 4
1.2.1 Lack of methods for activity-level progress monitoring 4
1.2.2 Lack of methods for reporting partial completion of progress 5
1.2.3 Lack of methods for automatic schedule update 7
1.3 Objectives 9
1.4 Organization of the thesis 13
Chapter 2 Literature Review 15
2.1 Vision-based construction progress monitoring 15
2.1.1 Element-level progress monitoring 15
2.1.2 Activity-level progress monitoring 18
2.1.3 Progress monitoring of partially completed elements and activities 19
2.2 Deep-learning-based data analytics in construction management 21
2.2.1 Knowledge extraction from visual data 22
2.2.2 Information extraction from construction schedules 22
2.3 Summary 24
Chapter 3 Progress estimation and visualization 27
3.1 Methodology: Overview of the activity-level progress monitoring system (ALPMS) 27
3.1.1 Input 27
3.1.2 3D reconstruction, camera pose estimation, and BIM+point cloud registration 30
3.1.3 Detection of elements under construction 31
3.1.4 Orthographic view synthesis 32
3.1.4.1 Projective transformation of a selected camera view 33
3.1.4.2 Novel orthographic view synthesis using neural radiance fields (NeRF) 36
3.1.5 Dataset preparation for semantic segmentation 41
3.1.6 Progress status detection 42
3.1.7 Completion percentage estimation 44
3.1.8 Progress visualization 46
3.2 Experiments 47
3.2.1 Input data 47
3.2.2 3D reconstruction, registration, occupancy detection 48
3.2.3 Orthographic view synthesis 51
3.2.3.1 Projective transformation of a selected camera view 51
3.2.3.2 NeRF-based orthographic view synthesis 52
3.2.4 Deep-learning-based semantic segmentation 55
3.3 Results and discussions 57
3.3.1 Performance of the NeRF 57
3.3.1.1 Factors affecting synthetic image quality and training time 57
3.3.1.2 Effect of element-to-camera distance on orthographic view synthesis 59
3.3.2 Performance of the semantic segmentation 60
3.3.2.1 Hyperparameter tuning and model selection 60
3.3.2.2 Performance of the selected model on original images 62
3.3.2.3 Performance on synthetic data 62
3.3.3 Performance of the overall activity-level progress monitoring system 66
3.3.3.1 Percentage completion estimation 66
3.3.3.2 Progress visualization 70
3.3.4 Implementing ALPMS in the construction management process 72
3.4 Summary 75
Chapter 4 Progress update in the project schedule 77
4.1 Methodology 77
4.1.1 Progress estimation from reality models with L-E-M information 77
4.1.2 Location, building element, and material recognition from schedule activity 80
4.1.3 Matching activity and progress data using L-E-M strings 83
4.2 Experiments and results 86
4.2.1 Semantic segmentation of point clouds 87
4.2.2 Progress percentage estimation 89
4.2.3 Information extraction from schedule activities 90
4.2.4 Mapping schedule activities and estimated progress 93
4.2.5 Comparison of progress estimation workflows with and without BIM 94
4.3 Summary 97
Chapter 5 Conclusion and future research directions 99
5.1 Conclusion 99
5.2 Future research directions 102
5.2.1 Real-time monitoring of construction projects 102
5.2.2 Productivity assessment by linking progress and resource data 103
5.2.3 Predictive monitoring of construction projects 104
5.2.4 Quality control and quality assurance 105
References 107
dc.language.iso | en | -
dc.title | 基於影像及排程之營建數位孿生自動化進度監控 | zh_TW
dc.title | Automated activity-level progress monitoring from visual data and schedules through digital twin construction | en
dc.type | Thesis | -
dc.date.schoolyear | 111-2 | -
dc.description.degree | 博士 (Ph.D.) | -
dc.contributor.coadvisor | 林之謙 | zh_TW
dc.contributor.coadvisor | Jacob Je-Chian LIN | en
dc.contributor.oralexamcommittee | 陳俊杉;周建成;楊亦東;陳柏翰;康仕仲 | zh_TW
dc.contributor.oralexamcommittee | Chuin-Shan CHEN;Chien-Cheng CHOU;I-Tung YANG;Po-Han CHEN;Shih-Chung KANG | en
dc.subject.keyword | 深度學習,計算機視覺,活動級進度監控,神經輻射場,自然語言處理,日程更新,數位孿生 | zh_TW
dc.subject.keyword | Deep learning,computer vision,activity-level progress monitoring,neural radiance field,natural language processing,schedule update,digital twin | en
dc.relation.page | 120 | -
dc.identifier.doi | 10.6342/NTU202302333 | -
dc.rights.note | 同意授權(全球公開) (Authorized for worldwide open access) | -
dc.date.accepted | 2023-08-04 | -
dc.contributor.author-college | 工學院 (College of Engineering) | -
dc.contributor.author-dept | 土木工程學系 (Department of Civil Engineering) | -
Appears in Collections: 土木工程學系

Files in This Item:
File | Size | Format
ntu-111-2.pdf | 26.35 MB | Adobe PDF

All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
