Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/47164
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 康仕仲 | |
dc.contributor.author | Jhih-Ren Juang | en |
dc.contributor.author | 莊智仁 | zh_TW |
dc.date.accessioned | 2021-06-15T05:49:29Z | - |
dc.date.available | 2012-09-18 | |
dc.date.copyright | 2012-02-08 | |
dc.date.issued | 2011 | |
dc.date.submitted | 2011-08-18 | |
dc.identifier.citation |
1. Ali, M. S., Babu, N. R. and Varghese, K. (2005). “Collision Free Path Planning of Cooperative Crane Manipulators Using Genetic Algorithm,” Journal of Computing in Civil Engineering, 19(2), 182–193.
2. Allison, R. S., Gillam, B. J. and Vecellio, E. (2009). “Binocular Depth Discrimination and Estimation beyond Interaction Space,” Journal of Vision, 9(1): 10, 1–14.
3. Amatucci, E., Bostelman, R., Dagalakis, N. and Tsai, T. (1997). “Summary of Modeling and Simulation for NIST RoboCrane Applications,” Proceedings of the 1997 Deneb International Simulation Conference and Technology Showcase, Detroit, Michigan, U.S., September 29–October 3.
4. Beavers, J. E., Moore, J. R., Rinehart, R. and Schriver, W. R. (2006). “Crane-Related Fatalities in the Construction Industry,” Journal of Construction Engineering and Management, 132(9), 901–910.
5. Belhumeur, P. N. (1993). “A Binocular Stereo Algorithm for Reconstructing Sloping, Creased, and Broken Surfaces in the Presence of Half-Occlusion,” Proceedings of the Fourth International Conference on Computer Vision, 431–438, Berlin, Germany, May 11–14.
6. Chen, S. Y. and Li, Y. F. (2003). “A 3D Vision System Using Unique Color Encoding,” Proceedings of the International Conference on Robotics, Intelligent Systems and Signal Processing, 411–416, Changsha, China, October.
7. Chi, H. L. (2007). “Physics-Based Simulation of Detailed Erection Activities of Construction Cranes,” Master's Thesis, National Taiwan University.
8. Chi, H. L. and Kang, S. C. (2010). “A Physics-Based Simulation Approach for Cooperative Erection Activities,” Automation in Construction, 19(6), 750–761.
9. Choi, C. W. and Harris, F. C. (1991). “A Model for Determining Optimum Crane Position,” Proceedings of the Institution of Civil Engineers, 627–634, Birmingham, U.K.
10. CM Labs. (2007). “Vortex Simulators,” Retrieved March 9, 2010, from http://www.vxsim.com/en/simulators/index.php.
11. Cochran, S. D. and Medioni, G. (1992). “3-D Surface Description from Binocular Stereo,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 981–994.
12. CraneAccidents.com. (2009). “Crane Accidents Statistics,” Retrieved July 13, 2011, from http://craneaccidents.com/stats.htm.
13. Cutting, J. E. (1986). Perception with an Eye for Motion, The MIT Press, U.S.A.
14. Furusaka, S. and Gray, C. (1984). “A Model for the Selection of the Optimum Crane for Construction Sites,” Construction Management and Economics, 2, 157–176.
15. GlobalSim. (2007). “GlobalSim,” Retrieved March 9, 2010, from http://www.globalsim.com.
16. Gokturk, S. B., Yalcin, H. and Bamji, C. (2004). “A Time-of-Flight Depth Sensor: System Description, Issues and Solutions,” Proceedings of the Conference on Computer Vision and Pattern Recognition Workshop, 3, 35–43, Washington, D.C., U.S., June 27–July 2.
17. Gray, C. and Little, J. (1985). “A Systematic Approach to the Selection of an Appropriate Crane for a Construction Site,” Construction Management and Economics, 3, 121–144.
18. Gruss, A., Carley, L. R. and Kanade, T. (1991). “Integrated Sensor and Rangefinding Analog Signal Processor,” IEEE Journal of Solid-State Circuits, 26, 184–191.
19. Hakkinen, K. (1978). “Crane Accidents and Prevention,” Journal of Occupational Accidents, 1, 353–361.
20. Hanna, A. S. and Lotfallah, W. B. (1999). “A Fuzzy Logic Approach to the Selection of Cranes,” Automation in Construction, 8, 597–608.
21. Huang, J. Y. and Gau, C. Y. (2003). “Modeling and Designing a Low-Cost High-Fidelity Mobile Crane Simulator,” International Journal of Human-Computer Studies, 58, 151–176.
22. Hung, W. H. and Kang, S. C. (2009). “Physics-Based Crane Model for the Simulation of Cooperative Erections,” Proceedings of the 9th International Conference on Construction Applications of Virtual Reality (CONVR), Sydney, Australia, November 5–6.
23. Hyun, K. and Gerhardt, L. A. (1994). “The Use of Laser Structured Light for 3D Surface Measurement and Inspection,” Proceedings of the Fourth International Conference on Computer Integrated Manufacturing and Automation Technology, 215–221, Troy, New York, U.S., October 10–12.
24. Kamat, V. R. and Martinez, J. C. (2005). “Dynamic 3D Visualization of Articulated Construction Equipment,” Journal of Computing in Civil Engineering, 19(4), 356–368.
25. Kang, S. C. and Miranda, E. (2006). “Planning and Visualization for Automated Robotic Crane Erection Processes in Construction,” Automation in Construction, 15(4), 398–414.
26. Kang, S. C. and Miranda, E. (2008). “Computational Methods for Coordinating Multiple Cranes,” Journal of Computing in Civil Engineering, 22(4), 252–263.
27. Kang, S. C., Chi, H. L. and Miranda, E. (2009). “Three-Dimensional Simulation and Visualization of Crane Assisted Construction Erection Processes,” Journal of Computing in Civil Engineering, 23(6), 363–371.
28. Lange, R. and Seitz, P. (2001). “Solid-State Time-of-Flight Range Camera,” IEEE Journal of Quantum Electronics, 37, 390–397.
29. Leung, W. T. and Tam, C. M. (1999). “Models for Assessing Hoisting Times of Tower Cranes,” Journal of Construction Engineering and Management, 125(6), 385–391.
30. Lin, K. and Haas, C. (1996). “Multiple Heavy Lifts Optimization,” Journal of Construction Engineering and Management, 122(4), 354–362.
31. Lipman, R. R. and Reed, K. A. (2000). “Using VRML in Construction Industry Applications,” Proceedings of the Web3D: VRML 2000 Symposium, Monterey, California, U.S., February 21–24.
32. Luursema, J. M., Verwey, W. B., Kommers, P. A. M. and Annema, J. H. (2008). “The Role of Stereopsis in Virtual Anatomical Learning,” Interacting with Computers, 20, 455–460.
33. Marr, D. and Poggio, T. (1979). “A Computational Theory of Human Stereo Vision,” Proceedings of the Royal Society of London B, 204, 301–328.
34. Mazyn, L. I. N., Lenoir, M., Montagne, G. and Savelsbergh, G. J. P. (2004). “The Contribution of Stereo Vision to One-Handed Catching,” Experimental Brain Research, 157, 383–390.
35. Mazyn, L. I. N., Lenoir, M., Montagne, G., Delaey, C. and Savelsbergh, G. J. P. (2007). “Stereo Vision Enhances the Learning of a Catching Skill,” Experimental Brain Research, 179, 723–726.
36. McKee, S. P. and Taylor, D. G. (2010). “The Precision of Binocular and Monocular Depth Judgments in Natural Settings,” Journal of Vision, 10(10), 1–13.
37. Microsoft. (2002). .NET Framework, Retrieved August 7, 2011, from http://www.microsoft.com/taiwan/netframework/default.mspx.
38. Microsoft. (2008). XNA, Retrieved July 28, 2011, from http://msdn.microsoft.com/en-us/xna/aa937791.
39. Microsoft. (2010). Kinect, Retrieved July 28, 2011, from http://www.xbox.com/en-US/Kinect.
40. Neitzel, R. L., Seixas, N. S. and Ren, K. K. (2001). “A Review of Crane Safety in the Construction Industry,” Applied Occupational and Environmental Hygiene, 16(12), 1106–1117.
41. NVIDIA. (2008). PhysX, Retrieved August 7, 2011, from http://www.nvidia.com/object/physx_new.html.
42. NVIDIA. (2010). 3D Vision, Retrieved July 28, 2011, from http://www.nvidia.com/object/3d-vision-main.html.
43. O’Connor, J. T., Dharwadkar, P. V., Varghese, K. and Gatton, T. M. (1994). “Graphical Visualization for Planning Heavy Lifts,” Proceedings of the First Congress on Computing in Civil Engineering, ASCE, Washington, D.C., U.S., June.
44. OpenGL Architecture Review Board, Woo, M., Neider, J., Davis, T. and Shreiner, D. (1999). OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 1.2, Third Edition, Addison-Wesley Professional, U.S.A.
45. OpenNI. (2010). OpenNI, Retrieved August 3, 2011, from http://www.openni.org/.
46. Palmisano, S., Gillam, B., Govan, D. G., Allison, R. S. and Harris, J. M. (2010). “Stereoscopic Perception of Real Depths at Large Distances,” Journal of Vision, 10(6): 19, 1–16.
47. PrimeSense. (2010). NITE Middleware, Retrieved August 3, 2011, from http://www.primesense.com/?p=515.
48. Reddy, R. and Varghese, K. (2002). “Automated Path Planning for Mobile Crane Lifts,” Computer-Aided Civil and Infrastructure Engineering, 17(6), 439–448.
49. Rodriguez-Ramos, W. E. and Francis, R. L. (1983). “Single Crane Location Optimization,” Journal of Construction Engineering and Management, 109(4), 387–397.
50. Sacks, R., Navon, R., Brodetskaia, I. and Shapira, A. (2005). “Feasibility of Automated Monitoring of Lifting Equipment in Support of Project Control,” Journal of Construction Engineering and Management, 131(5), 604–614.
51. Shapira, A. and Lyachin, B. (2009). “Identification and Analysis of Factors Affecting Safety on Construction Sites with Tower Cranes,” Journal of Construction Engineering and Management, 135(1), 24–33.
52. Shapira, A. and Simcha, M. (2009a). “AHP-Based Weighting of Factors Affecting Safety on Construction Sites with Tower Cranes,” Journal of Construction Engineering and Management, 135(4), 307–318.
53. Shapira, A. and Simcha, M. (2009b). “Measurement and Risk Scales of Crane-Related Safety Factors on Construction Sites,” Journal of Construction Engineering and Management, 135(10), 979–989.
54. Shepherd, G. W., Kahler, R. J. and Cross, J. (2000). “Crane Fatalities: A Taxonomic Analysis,” Safety Science, 36, 83–93.
55. Simlog. (2007). “Mobile Crane Personal Simulators,” Retrieved March 9, 2010, from http://www.simlog.com/personal-crane.html.
56. Sivakumar, P. L., Varghese, K. and Babu, N. R. (2003). “Automated Path Planning of Cooperative Crane Lifts Using Heuristic Search,” Journal of Computing in Civil Engineering, 17(3), 197–207.
57. Song, L. M. and Wang, D. N. (2006). “A Novel Grating Matching Method for 3D Reconstruction,” NDT & E International, 39, 282–288.
58. Stone, W., Reed, K., Chang, P., Pfeffer, L. and Jacoff, A. (1999). “NIST Research Toward Construction Site Integration and Automation,” Journal of Aerospace Engineering, 50–57.
59. Szeliski, R. (2000). “Scene Reconstruction from Multiple Cameras,” Proceedings of the International Conference on Image Processing, 1, 13–16, Vancouver, BC, Canada, September 9–13.
60. Tam, C. M., Leung, A. W. T. and Liu, D. K. (2002). “Nonlinear Models for Predicting Hoisting Times of Tower Cranes,” Journal of Computing in Civil Engineering, 16(1), 76–81.
61. Tam, C. M., Tong, K. L. and Chan, K. W. (2001). “Genetic Algorithm for Optimizing Supply Location around Tower Crane,” Journal of Construction Engineering and Management, 127(4), 315–321.
62. Tantisevi, K. and Akinci, B. (2006). “Automated Planning and Visualization of Mobile Crane Operation Based on Building and Schedule Information,” Proceedings of the Joint International Conference on Computing and Decision Making in Civil and Building Engineering (ICCCBE 2006), Montreal, Canada, June 14–16.
63. Valkenburg, R. J. and McIvor, A. M. (1998). “Accurate 3D Measurement Using a Structured Light System,” Image and Vision Computing, 16, 99–110.
64. Wheatstone, C. (1838). “Contributions to the Physiology of Vision. Part the First: On Some Remarkable and Hitherto Unobserved Phenomena of Binocular Vision,” Philosophical Transactions of the Royal Society, 128, 371–394.
65. Zhang, P., Harris, F. C., Olomolaiye, P. O. and Holt, G. D. (1999). “Location Optimization for a Group of Tower Cranes,” Journal of Construction Engineering and Management, 125(2), 115–122.
66. Zhang, P., Tam, C. M. and Shi, J. (2003). “Application of Fuzzy Logic to Simulation for Construction Operations,” Journal of Computing in Civil Engineering, 17(1), 38–45. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/47164 | - |
dc.description.abstract | In the construction industry, their versatility gives cranes a key role on job sites, so providing proper operator training is essential both for crane safety and for the quality of erection work. In recent years, researchers have developed crane simulators for operator training. Simulators lower training costs, including the rental of machinery and sites; they can also present realistic site environments and work tasks to build operator proficiency. However, the visual displays of current crane simulators cannot provide sufficient depth cues. If operators cannot obtain correct three-dimensional information corresponding to their two eyes and body position, the effectiveness of training is greatly reduced, especially when simulating complex spatial scenes; yet real job sites are usually complex, and realistic scenes are one of the main purposes of using a crane simulator. This research therefore adds two mechanisms to a crane simulation system: kinesthetic vision and stereoscopic vision. To implement kinesthetic vision, this research uses the Microsoft Kinect as a body-position sensor and derives two transformation matrices for the 3D rendering pipeline: one imports the eye position read from the Kinect into the pipeline; the other computes the effect of the viewing angle on the rendered image. Stereoscopic vision is realized with the NVIDIA 3D Vision™ toolkit, which provides the stereoscopic rendering algorithm and related hardware, together with 3D glasses and a 3D television. Combining kinesthetic and stereoscopic vision, this research developed a kinesthetic stereoscopic crane simulation system (SimCrane 3D+), built with the Microsoft XNA game development toolset, and constructed a virtual erection scenario as a case study. The case shows that SimCrane 3D+ can read the operator's body position and provide a continuous, dynamically updated view matching the operator's viewing angle, together with stereoscopic images for binocular vision. Through kinesthetic and stereoscopic vision, operators perceive better depth cues and, as the viewing angle changes, the corresponding field of view and occlusions. The results can be applied to virtual training for crane operation and to pre-construction virtual rehearsal of precision erection work. | zh_TW |
dc.description.abstract | Since cranes have a critical and versatile role on construction sites, it is extremely important that operators be provided with adequate training to enable them to perform erections safely and efficiently. Several researchers have developed crane simulators to facilitate operator training. The use of simulators reduces the costs associated with renting actual cranes, and enables the training of operators in a range of tasks and environments, allowing the use of virtual environments to develop the operators' skills. However, a critical drawback common to most existing simulators is the lack of depth perception. The effectiveness of training may be reduced if the 3D perspective, obtained through human eyes and body movements, is not simulated, especially for complex erection tasks, which are the main objectives of virtual training. Therefore, this research added two major components to the simulation system: kinesthetic vision and stereoscopic vision. To realize kinesthetic vision, we integrated the Microsoft Kinect as the motion sensor. We also derived two transformation matrices: one for the dynamic eye position captured by the motion sensor, and the other for compensating for the distortion induced by the inclined view angle. Stereoscopic vision was realized by integrating the NVIDIA 3D Vision™ package, which includes a 3D rendering pipeline, a pair of 3D glasses, and a 3D display. We also developed a crane simulator, called SimCrane 3D+, by integrating kinesthetic vision and stereoscopic vision into a game framework based on the Microsoft XNA toolset, and built a typical erection scenario in a complex simulated environment as an example. We found that SimCrane 3D+ can process continuous readings from the motion sensor and smoothly render stereoscopic views. With the addition of kinesthetic and stereoscopic vision, users now have better depth perception and excellent visibility during operation. The research results show that the system has great potential for training operators and rehearsing critical erections. | en |
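The abstract describes two transformation matrices: one that feeds the tracked eye position into the rendering pipeline, and one that compensates for the distortion of an inclined viewing angle. The thesis's own derivation is not reproduced in this record, but the standard head-tracked ("off-axis", asymmetric-frustum) projection it refers to can be sketched as follows. This is a minimal illustration, not the thesis's code: the function names, the screen-at-z=0 convention, and the default interpupillary distance are assumptions made here for the example.

```python
def off_axis_projection(eye, half_w, half_h, near, far):
    """Asymmetric (off-axis) perspective frustum for a tracked eye.

    The display screen is assumed to lie in the plane z = 0, centred at
    the origin, with the viewer on the +z side. `eye` is the tracked
    eye position (x, y, z) in the same units as the screen dimensions.
    Returns a 4x4 OpenGL-style projection matrix as nested lists.
    """
    ex, ey, ez = eye
    # Scale screen-edge offsets back to the near plane (similar
    # triangles: near / distance-to-screen).
    s = near / ez
    left, right = (-half_w - ex) * s, (half_w - ex) * s
    bottom, top = (-half_h - ey) * s, (half_h - ey) * s
    # Standard asymmetric frustum matrix (column 3 carries the skew
    # that recentres the off-axis view).
    return [
        [2 * near / (right - left), 0.0,
         (right + left) / (right - left), 0.0],
        [0.0, 2 * near / (top - bottom),
         (top + bottom) / (top - bottom), 0.0],
        [0.0, 0.0, -(far + near) / (far - near),
         -2.0 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]


def stereo_eyes(head, ipd=0.065):
    """Left/right eye positions for stereoscopic rendering: the tracked
    head position offset horizontally by half the interpupillary
    distance (0.065 m is a typical default, assumed here)."""
    hx, hy, hz = head
    half = ipd / 2.0
    return (hx - half, hy, hz), (hx + half, hy, hz)
```

Rendering one stereoscopic, head-tracked frame then amounts to calling `stereo_eyes` on each sensor reading and building one `off_axis_projection` per eye; a centred eye yields the usual symmetric frustum, while a sideways head movement skews the frustum so the on-screen image stays anchored to the physical screen.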
dc.description.provenance | Made available in DSpace on 2021-06-15T05:49:29Z (GMT). No. of bitstreams: 1 ntu-100-R98521606-1.pdf: 1702983 bytes, checksum: e13f716e7e67f27bda681037be56bed0 (MD5) Previous issue date: 2011 | en |
dc.description.tableofcontents |
Acknowledgements (誌謝)
Abstract (Chinese)
ABSTRACT
TABLE OF CONTENTS
LIST OF FIGURES
1. CHALLENGES IN CRANE USAGES AND SIMULATIONS
2. RELATED RESEARCH
2.1. Erection Simulation
Optimization for Crane Selection and Location
Erection Schedule Planning
3D Visualization and Simulation for the Erection Process
Real-Time Physics-Based Crane Simulation
2.2. Operator Training
The Importance of Operator Training
Crane Simulators for Operators
3. IMPORTANCE OF PRECISE 3D PERCEPTION IN CRANE OPERATIONS
4. KINESTHETIC VISION
4.1. Overview of Kinesthetic Vision
4.2. The Influence of the Perspective Nature of a Crane Operator
4.3. Calibrating the Viewing Frustum
4.4. Synchronizing Virtual Eye Positions with Actual Eyes
4.5. Synchronizing Projection Transformation with Actual Eyes
5. STEREOSCOPIC VISION
5.1. Stereoscopic Vision
5.2. Implementing Stereoscopic Vision for Crane Simulation
6. IMPLEMENTATION
6.1. System Overview
6.2. The Physical Setup
6.3. The Software Setup
6.4. System Demonstration
Stereoscopic and Kinesthetic Vision
Crane Simulation
Virtual Operation Environment
7. DISCUSSIONS
8. RESEARCH CONTRIBUTIONS
9. CONCLUSIONS
10. REFERENCES
LIST OF FIGURES
Figure 1. The kinesthetic vision from the virtual world and the actual world.
Figure 2. The changes in views due to eye positions: (a) top view of leftward eye position; (b) operator view of leftward eye position; (c) top view of rightward eye position; (d) operator view of rightward eye position.
Figure 3. The coordinates of the actual world and the virtual world: (a) actual eye coordinates; (b) virtual eye coordinates {E} in virtual world coordinates {O}.
Figure 4. Deformation of the rendering frustum.
Figure 5. Illustration of the modelview transformation.
Figure 6. The definition of the perspective projection parameters.
Figure 7. The cube observed through (a) the right eye only and (b) the left eye only (Wheatstone, 1838).
Figure 8. Wheatstone's stereoscope (Wheatstone, 1838).
Figure 9. The principle of stereoscopic projection.
Figure 10. The principle of stereoscopic projection for crane simulators.
Figure 11. The system architecture of SimCrane 3D+.
Figure 12. The hardware environment of SimCrane 3D+.
Figure 13. The rendering results of stereoscopic and kinesthetic vision: (a) the rendered scene while the trainee moves his head to the left; (b) the rendered scene while the trainee moves his head to the right.
Figure 14. The physics-based virtual crane in SimCrane 3D+: (a) the physics-based simulation of a luffing jib crane; (b) the physics model of the virtual crane.
Figure 15. The virtual environment in SimCrane 3D+: (a) the site overview; (b) the reverse sunlight effect. | |
dc.language.iso | en | |
dc.title | 體感立體視覺吊車模擬系統 (A Crane Simulation System with Kinesthetic and Stereoscopic Vision) | zh_TW |
dc.title | SimCrane 3D+: A Crane Simulator with Kinesthetic and Stereoscopic Vision | en |
dc.type | Thesis | |
dc.date.schoolyear | 99-2 | |
dc.description.degree | Master's | |
dc.contributor.oralexamcommittee | 謝尚賢,林楨中,廖源輔 | |
dc.subject.keyword | crane simulation, crane training, kinesthetic vision, stereoscopic vision, virtual reality | zh_TW |
dc.subject.keyword | crane simulator, crane training, kinesthetic vision, stereoscopic vision, virtual reality | en |
dc.relation.page | 54 | |
dc.rights.note | Paid-access authorization (有償授權) | |
dc.date.accepted | 2011-08-19 | |
dc.contributor.author-college | College of Engineering | zh_TW |
dc.contributor.author-dept | Graduate Institute of Civil Engineering | zh_TW |
Appears in Collections: | Department of Civil Engineering
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-100-1.pdf (currently not authorized for public access) | 1.66 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.