Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/100211

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 林之謙 | zh_TW |
| dc.contributor.advisor | Jacob J. Lin | en |
| dc.contributor.author | 黎泰和 | zh_TW |
| dc.contributor.author | Le Thai Hoa | en |
| dc.date.accessioned | 2025-09-24T16:52:04Z | - |
| dc.date.available | 2025-09-25 | - |
| dc.date.copyright | 2025-09-24 | - |
| dc.date.issued | 2025 | - |
| dc.date.submitted | 2025-08-12 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/100211 | - |
| dc.description.abstract | 營造業作為全球最具危險性的產業之一,工安意外與職業災害的發生率始終居高不下。在各種主動式的安全策略中,虛驚事故(near-miss)的偵測與分析因其能提供預警並避免意外發生而日益受到重視。然而,傳統的安全管理方法往往無法即時掌握現場情況並提供及時干預措施,例如透過人工監督及事件後的事故報告。近年來,受惠於數位科技的快速進展,透過實現即時感測與沉浸式的可視化技術為這些傳統方法的限制帶來了新契機。儘管如此,許多現有的解決方案在以工人為中心導向的監控方面仍然面臨挑戰,主要原因包括:視線盲點及攝影機覆蓋程度導致觀察不完整,抑或是因為標示未能正確辨識或視覺線索被忽略而導致資訊不充足。混合實境(Mixed Reality, MR)技術為此提供了可行的解決方法,透過沉浸式與空間精準的可視化來強化風險識別,並輔助即時決策與促進協同安全管理。
本研究提出一套支援多使用者的MR虛驚事故回應系統,該系統用於提升虛驚事故的危害偵測,並在施工現場提供及時警示。本研究所提出之系統整合 MR 與建築資訊模型(Building Information Modeling, BIM)資料、實境擷取資料(點雲)及固定式攝影機網絡,用於偵測接近中的施工設備並對 MR 使用者發出及時警示。警示訊息透過多模態回饋傳遞,包括視覺警示標誌、三維方向指示器與聲音警報。系統亦支援多用戶之間的溝通協作,並允許使用者間共享危險警示與協同應對之決策。本系統依據四階段框架進行開發:(1) 資料註冊、(2) 虛擬場景生成、(3) 物件偵測以及 (4) 基於MR的虛驚事故警示可視化。物件偵測採用YOLOv8模型執行,MR裝置則是使用Microsoft HoloLens 2。偵測結果透過Wi-Fi網路上的UDP連線傳送至HoloLens,虛驚事故事件則是直接在HoloLens根據位置的分析來進行識別。本研究透過兩組實驗場景進一步驗證,系統偵測的準確率達到91.3%,並展現平均180毫秒的即時反應效能。此外,多用戶通訊模組的延遲僅為103毫秒。研究亦透過九位建築從業人員參與的半結構式訪談回饋進一步證實該系統在強化情境感知、支援使用者應對及提升協同安全管理方面具備實務可行性與有效性。 | zh_TW |
| dc.description.abstract | The construction industry remains one of the most hazardous sectors worldwide, consistently experiencing high rates of occupational accidents. Among various proactive safety strategies, the detection and analysis of near-miss events have become increasingly important for providing early warnings and preventing future accidents. However, traditional safety management practices, such as manual supervision and retrospective incident reporting, often fall short in delivering real-time situational awareness and timely intervention. Recent advances in digital technologies, especially those enabling real-time sensing and immersive visualization, offer new opportunities to address these limitations. Nevertheless, many existing solutions struggle with worker-centered monitoring because of incomplete observation (blind spots and limited camera coverage) and insufficient information (e.g., unrecognized signage or overlooked visual cues). Mixed Reality (MR) technology presents a promising approach to overcome these challenges by enabling immersive, spatially accurate visualizations that enhance hazard recognition, support real-time decision-making, and promote collaborative safety management.
This study introduces a multi-user MR near-miss response system that enhances near-miss hazard detection and alert delivery in construction environments in real time. The proposed system integrates MR with Building Information Modeling (BIM) data, reality capture data (point clouds), and a fixed camera network to detect approaching construction equipment and issue timely warnings to MR users. Alerts are delivered through multimodal feedback, including visual warning signs, 3D directional indicators, and auditory alarms. The system also facilitates multi-user communication, allowing for shared hazard awareness and coordinated decision-making. The system was developed following a four-stage framework: (1) data registration, (2) virtual scene generation, (3) object detection, and (4) MR-based near-miss alert visualization. Object detection is performed using the YOLOv8 model, and the Microsoft HoloLens 2 serves as the MR device. Detection results are transmitted to the HoloLens via a UDP connection over a Wi-Fi network, and near-miss events are identified using a location-based analysis executed directly on the HoloLens. Validation through two experimental scenarios demonstrated a detection accuracy of 91.3% and data transfer with an average latency of 180 milliseconds, supporting real-time response; the multi-user communication module achieved a latency of approximately 103 milliseconds. Qualitative insights gathered from nine semi-structured interviews with construction practitioners further confirmed the system's practical effectiveness in enhancing situational awareness, supporting user response, and improving coordinated safety management in dynamic construction environments. | en |
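The abstract above outlines a detection-to-alert data path: YOLOv8 detections produced on a workstation, transferred over Wi-Fi to the HoloLens 2 via UDP, and a location-based near-miss check executed on the headset. The sketch below is a minimal, illustrative Python mock-up of that data path under stated assumptions, not the thesis implementation: the message schema, IP address, port, equipment label, and the `NEAR_MISS_RADIUS_M` threshold are all hypothetical values introduced here for demonstration.

```python
# Illustrative sketch only: message format, addresses, labels, and thresholds
# are assumptions for demonstration, not the thesis's actual code or API.
import json
import math
import socket
import time

HOLOLENS_IP = "192.168.1.50"     # hypothetical HoloLens address on the site Wi-Fi
HOLOLENS_PORT = 9050             # hypothetical UDP port the MR app listens on
NEAR_MISS_RADIUS_M = 5.0         # hypothetical proximity threshold in metres

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_detections(detections):
    """Serialize one frame of equipment positions (site coordinates, metres)
    as JSON and push it to the MR headset over UDP."""
    payload = {
        "timestamp": time.time(),
        "equipment": [
            {"label": d["label"], "x": d["x"], "y": d["y"], "z": d["z"]}
            for d in detections
        ],
    }
    sock.sendto(json.dumps(payload).encode("utf-8"), (HOLOLENS_IP, HOLOLENS_PORT))

def check_near_miss(user_pos, detections, radius=NEAR_MISS_RADIUS_M):
    """Location-based near-miss test: flag equipment whose horizontal distance
    to the MR user falls below the radius."""
    hits = []
    for d in detections:
        dist = math.hypot(d["x"] - user_pos[0], d["y"] - user_pos[1])
        if dist < radius:
            hits.append((d["label"], dist))
    return hits

if __name__ == "__main__":
    # One fabricated frame of detections, in site coordinates (metres).
    frame = [{"label": "excavator", "x": 12.4, "y": 3.1, "z": 0.0}]
    send_detections(frame)
    print(check_near_miss(user_pos=(10.0, 2.0), detections=frame))
```

In the system described by the abstract, the proximity analysis runs directly on the HoloLens 2 inside the MR application; the Python `check_near_miss` above only mirrors that logic for readability.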
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-09-24T16:52:04Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2025-09-24T16:52:04Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Contents
Verification Letter from the Oral Examination Committee
Acknowledgements
摘要
Abstract
Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Motivation and background
1.2 Problem statement
1.2.1 Spatial registration and tracking accuracy for effective warning system using AR/MR
1.2.2 Real-time information integration issues
1.2.3 User-centric challenges in AR/MR safety alert systems
1.3 Research objectives
1.4 Organization of the thesis
Chapter 2 Literature review
2.1 Current safety measures for near-miss
2.1.1 Traditional safety methods in construction
2.1.2 Sensor-based approaches for warning proximity hazards in construction
2.1.3 Computer vision applications for near-miss prevention in construction
2.2 Mixed Reality in construction safety management
2.2.1 Mixed Reality technology in construction
2.2.2 AR/MR applications in construction safety
2.3 Construction technologies with prospective AR/MR integration
2.4 Applications of multi-user MR systems in construction
2.5 Summary
Chapter 3 Research methodology
3.1 Overview of the multi-user MR near-miss response system for onsite equipment
3.2 Data registration
3.2.1 Static elements
3.2.2 Dynamic elements
3.3 Virtual scene generation
3.3.1 Align point clouds to BIM model
3.3.2 Simulate site scene
3.4 Object detection
3.4.1 Detect equipment
3.4.2 Calculate and transfer 3D coordinates
3.5 MR-based alert for near-miss
3.5.1 Locate the MR user
3.5.2 Check potential near-miss
3.5.3 Show warnings and directional guidance
3.5.4 Multiple MR users in the MR warning system
Chapter 4 Experiments
4.1 Preparation
4.1.1 Devices
4.1.2 MR-based alert system development
4.2 Case study
4.2.1 Experimental scenarios
4.2.2 Experiment participants
Chapter 5 Result and discussion
5.1 Detection result
5.2 Processing result
5.2.1 Near-miss detection and response speed
5.2.2 Voice signal latency
5.3 Qualitative results
5.3.1 MR users experience
5.3.2 Interview results
5.4 Comparative analysis
5.5 Industrial applicability
5.6 Existing limitations
Chapter 6 Conclusion and future work
6.1 Conclusion
6.2 Contributions
6.3 Limitations
6.4 Future works
6.4.1 Markerless localization
6.4.2 Multiple cameras for safety monitoring
6.4.3 Cross-platform for safety alert system
6.4.4 Automatic generation of safety guidance
References
Appendix A Interview Questions | - |
| dc.language.iso | en | - |
| dc.subject | 混合實境(MR) | zh_TW |
| dc.subject | 擴增實境(AR) | zh_TW |
| dc.subject | 虛驚事故 | zh_TW |
| dc.subject | 安全 | zh_TW |
| dc.subject | 多用戶 | zh_TW |
| dc.subject | 警報 | zh_TW |
| dc.subject | near-miss | en |
| dc.subject | Mixed Reality (MR) | en |
| dc.subject | alert | en |
| dc.subject | multi-user | en |
| dc.subject | safety | en |
| dc.subject | Augmented Reality (AR) | en |
| dc.title | 多使用者混合實境工程機具虛驚事故應變系統 | zh_TW |
| dc.title | Multi-user Mixed Reality Near-miss Response System for Onsite Equipment | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 113-2 | - |
| dc.description.degree | 博士 | - |
| dc.contributor.oralexamcommittee | 謝尚賢;詹瀅潔;林祐正;王琨淇 | zh_TW |
| dc.contributor.oralexamcommittee | Shang-Hsien Hsieh;Ying-Chieh Chan;Yu-Cheng Lin;Kun-Chi Wang | en |
| dc.subject.keyword | 混合實境(MR),擴增實境(AR),虛驚事故,安全,多用戶,警報, | zh_TW |
| dc.subject.keyword | Mixed Reality (MR),Augmented Reality (AR),near-miss,safety,multi-user,alert, | en |
| dc.relation.page | 109 | - |
| dc.identifier.doi | 10.6342/NTU202501747 | - |
| dc.rights.note | 同意授權(限校園內公開) | - |
| dc.date.accepted | 2025-08-14 | - |
| dc.contributor.author-college | 工學院 | - |
| dc.contributor.author-dept | 土木工程學系 | - |
| dc.date.embargo-lift | 2029-08-30 | - |
| Appears in Collections: | 土木工程學系 |
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-113-2.pdf (Restricted Access) | 9.29 MB | Adobe PDF | View/Open |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
