Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/77896
Full metadata record
DC Field / Value / Language
dc.contributor.advisor: 陳湘鳳
dc.contributor.author: Chih-Kai Yang [en]
dc.contributor.author: 楊智凱 [zh_TW]
dc.date.accessioned: 2021-07-11T14:36:55Z
dc.date.available: 2022-08-31
dc.date.copyright: 2017-08-31
dc.date.issued: 2017
dc.date.submitted: 2017-08-14
dc.identifier.citation: Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6(4), 355-385.
Bouguet, J.-Y. (2002). Camera calibration toolbox for matlab. Retrieved June 1 2016, from http://www.vision.caltech.edu/bouguetj/calib_doc/.
Bourke, P. (1994). Polygonising a scalar field. Cupertino: http://paulbourke.net Available from: http://paulbourke.net/geometry/polygonise [Accessed 1 April 2011].
Brooke, J. (1996). SUS-a quick and dirty usability scale. Usability evaluation in industry, 189(194), 4-7.
Brown, D. C. (1966). Decentering distortion of lenses. Photogrammetric Engineering and Remote Sensing, 444-462.
Colaço, A., Kirmani, A., Yang, H. S., Gong, N.-W., Schmandt, C., & Goyal, V. K. (2013). Mime: Compact, low power 3d gesture sensing for interaction with head mounted displays. Paper presented at the Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, St. Andrews, Scotland, United Kingdom, 227-236.
Collins, R. (2007). Coordinate system diagram. Retrieved from Computer Vision Course (http://www.cse.psu.edu/~rtc12/CSE486/).
De Pra, Y., Fontana, F., & Tao, L. (2014). Infrared vs. Ultrasonic finger detection on a virtual piano keyboard. Paper presented at the ICMC, Athens, Greece, 654-658.
Feiner, S., Macintyre, B., & Seligmann, D. (1993). Knowledge-based augmented reality. Communications of the ACM, 36(7), 53-62.
Fiorentino, M., Uva, A. E., Monno, G., & Radkowski, R. (2012). Augmented technical drawings: A novel technique for natural interactive visualization of computer-aided design models. Journal of Computing and Information Science in Engineering, 12(2), 024503.
Garon, M., Boulet, P.-O., Doironz, J.-P., Beaulieu, L., & Lalonde, J.-F. (2016). Real-time high resolution 3d data on the HoloLens. Paper presented at the 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, Mexico, 189-191.
Hakkarainen, M., Woodward, C., & Billinghurst, M. (2008). Augmented assembly using a mobile phone. Paper presented at the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, ISMAR 2008, Cambridge, UK, 167-168.
Hou, L., & Wang, X. (2013). A study on the benefits of augmented reality in retaining working memory in assembly tasks: A focus on differences in gender. Automation in Construction, 32, 38-45.
Jota, R., Ng, A., Dietz, P., & Wigdor, D. (2013). How fast is fast enough?: A study of the effects of latency in direct-touch pointing tasks. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 2291-2300.
Kanbara, M., Okuma, T., Takemura, H., & Yokoya, N. (2000). A stereoscopic video see-through augmented reality system based on real-time vision-based registration. Paper presented at the Proceedings of the IEEE Virtual Reality., New Brunswick, NJ, USA, 255-262.
Khattak, S., Cowan, B., Chepurna, I., & Hogue, A. (2014). A real-time reconstructed 3d environment augmented with virtual objects rendered with correct occlusion. Paper presented at the 2014 IEEE on Games Media Entertainment (GEM). Toronto, ON, Canada, 1-8.
Leal-Meléndrez, J. A., Altamirano-Robles, L., & Gonzalez, J. A. (2013). Occlusion handling in video-based augmented reality using the Kinect sensor for indoor registration. Paper presented at the Proceedings of 18th Iberoamerican Congress CIARP 2013 on Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications, Havana, Cuba, 447-454.
Lorensen, W. E., & Cline, H. E. (1987). Marching cubes: A high resolution 3d surface construction algorithm. Paper presented at the ACM SIGGRAPH Computer Graphics, Anaheim, 21(4), 163-169.
Lu, Y., & Smith, S. (2009). Gpu-based real-time occlusion in an immersive augmented reality environment. Journal of Computing and Information Science in Engineering, 9(2), 024501.
Meza, D., & Berndt, S. (2014). Usability/sentiment for the enterprise and ENTERPRISE. Paper presented at the Text Analytics World 2014 Conference, San Francisco, CA; United States.
Neugebauer, R., Klimant, P., & Wittstock, V. (2010). Virtual-reality-based simulation of NC programs for milling machines. Paper presented at the Proceedings of the 20th CIRP Design Conference on Global Product Development, Ecole Centrale de Nantes, Nantes, France, 697-703.
Ong, S., & Zhu, J. (2013). A novel maintenance system for equipment serviceability improvement. CIRP Annals-Manufacturing Technology, 62(1), 39-42.
Penelle, B., & Debeir, O. (2014). Multi-sensor data fusion for hand tracking using Kinect and Leap Motion. Paper presented at the Proceedings of the 2014 Virtual Reality International Conference, Laval, France, 22.
Qiu, S., Fan, X., Wu, D., He, Q., & Zhou, D. (2013). Virtual human modeling for interactive assembly and disassembly operation in virtual reality environment. International Journal of Advanced Manufacturing Technology, 69, 9-12.
Shim, J., Yang, Y., Kang, N., Seo, J., & Han, T.-D. (2016). Gesture-based interactive augmented reality content authoring system using HMD. Virtual Reality, 20(1), 57.
Sportillo, D., Avveduto, G., Tecchia, F., & Carrozzino, M. (2015). Training in VR: A preliminary study on learning assembly/disassembly sequences. Paper presented at the International Conference on Augmented and Virtual Reality, Lecce, Italy, 332-343.
Stark, R., Israel, J., & Wöhler, T. (2010). Towards hybrid modelling environments—merging desktop-CAD and virtual reality-technologies. CIRP Annals-Manufacturing Technology, 59(1), 179-182.
Tullis, T. S., & Stetson, J. N. (2004). A comparison of questionnaires for assessing website usability. Paper presented at the Usability Professional Association Conference, Minneapolis, Minnesota, USA, 1-12.
Ullrich, S., & Kuhlen, T. (2012). Haptic palpation for medical simulation in virtual environments. IEEE Transactions on Visualization and Computer Graphics, 18(4), 617-625.
Vyawahare, V. S., & Stone, R. T. (2012). Asymmetric interface and interactions for bimanual virtual assembly with haptics. Paper presented at the Proceedings of the ASME 2012 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference IDETC/CIE, Chicago, Illinois, USA, 2, 29-37.
Weichert, F., Bachmann, D., Rudak, B., & Fisseler, D. (2013). Analysis of the accuracy and robustness of the leap motion controller. Sensors, 13(5), 6380-6393.
Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent augmented reality training for motherboard assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172.
Zünd, F., Ryffel, M., Magnenat, S., Marra, A., Nitti, M., Kapadia, M., Noris, G., Mitchell, K., Gross, M., & Sumner, R. W. (2015). Augmented creativity: Bridging the real and virtual worlds to enhance creative play. Paper presented at the SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, Kobe, Japan, 21.
Zhang, J., Ong, S.-K., & Nee, A. Y. (2008). AR-assisted in situ machining simulation: Architecture and implementation. Paper presented at the Proceedings of The 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, Singapore, 26.
Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on pattern analysis and machine intelligence, 22(11), 1330-1334.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/77896
dc.description.abstract: With traditional machine tool technicians retiring and aging, many machining techniques that were handed down through experience are gradually being lost. In addition, the one-to-many teaching format and the safety restrictions of traditional machine tools usually lead to poor learning outcomes. Preserving these techniques and training the next generation of machinists is therefore an urgent and important task. This study uses Augmented Reality (AR) to simulate and preserve machining techniques by means of information technology, and achieves one-to-one training and the transfer of experience through an interaction mode close to operating a real machine tool.
AR is a technique that combines virtual images with the real environment through computer graphics and computation of the camera pose. This study builds an AR-based milling training system in which users operate a full-size virtual milling machine with their actual hands and body movements, using the same natural operating style as in the real environment. The study thereby aims to improve learning outcomes while reducing the potential danger and cost of using a milling machine.
An Intel® RealSense™ R200 camera captures the color and depth information of the indoor scene; a Microsoft Kinect v2 tracks the user's body motion to enhance interaction with the virtual objects; a Leap Motion controller tracks the user's hand poses; finally, an Oculus Rift head-mounted display combines the real scene with the virtual objects and presents the AR images to the user.
This study unifies the coordinate systems of the different cameras so that users can interact with the virtual objects naturally with their hands and body. It also provides a real-time dynamic occlusion-handling method that overwrites the depth buffer (Z-buffer) with calibrated depth information, so that every pixel correctly displays the occlusion relationship between real and virtual objects.
Finally, user test results show that the proposed AR training system is significantly more effective than a traditional instructional video, and that its interaction is intuitive, smooth, and fun. The results can serve as a reference for future AR applications in machine tool training and simulation. A demonstration video is available at https://youtu.be/k7TeRTYiD-8.
[zh_TW]
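The coordinate-system unification described in the abstract (bringing Leap Motion, Kinect v2, RealSense R200, and Oculus Rift measurements into one shared frame) rests on composing rigid transforms estimated from a calibration board. A minimal numpy sketch of that composition, with illustrative function names that are not taken from the thesis:

```python
import numpy as np

def rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def map_point(T_src_to_world, T_dst_to_world, p_src):
    """Express a point measured in the source sensor's frame in the destination
    sensor's frame by routing it through the shared world frame."""
    T = np.linalg.inv(T_dst_to_world) @ T_src_to_world
    return (T @ np.append(p_src, 1.0))[:3]
```

For example, with the source sensor offset 1 m along x from the world origin and the destination sensor at the origin, the source origin maps to (1, 0, 0) in the destination frame.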
dc.description.abstract: With the loss and aging of traditional machine tool technicians, many machining techniques are gradually disappearing. Furthermore, because of the uneven teaching resources and the safety issues in traditional machine tool training, the training effect is often poor. Preserving those techniques and training the next generation of machine tool technicians is an imperative and important task. In this study, Augmented Reality (AR) is used to simulate and preserve machining technologies. Through realistic and immersive interaction with a full-size virtual machine, one-to-one training and the handing down of traditional machining techniques can be achieved.
AR is a technology that combines virtual objects with real scenes captured by a camera, using computer graphics rendering. This study creates an AR-based milling machine training system in which users operate a virtual milling machine with their natural behaviors in a physical environment. The system is expected to enhance learning and to reduce both the accident risk and the cost of using a milling machine.
An Intel® RealSense™ R200 camera was used to capture the depth information of the indoor scene. A Microsoft Kinect v2 was used to track the user's body motion to enhance the interaction between the user and the virtual objects. A Leap Motion controller was used to capture the user's hand gestures, and an Oculus Rift head-mounted display was used to merge the real scene images with the virtual images.
A calibration board was used to correct the translation error between the coordinate systems of the different cameras, and a calibration method was developed to solve the dynamic occlusion problem in real time: the approach overwrites the Z-buffer of the graphics library with the calibrated depth information, and the depth of each pixel is compared so that partially occluded images are displayed correctly.
Finally, user test results show that the training effect of the developed AR system is better than that of a traditional instructional video, and that the system is intuitive, smooth, and interesting. The feasibility of the AR system was validated, and the results can serve as a reference for future AR-based training and simulation for machine tools. The AR demo can be found at https://youtu.be/k7TeRTYiD-8.
[en]
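The abstract describes solving dynamic occlusion by overwriting the Z-buffer with calibrated real-scene depth, so each pixel shows the correct front-to-back relationship between real and virtual objects. The thesis performs this inside the Unity3D rendering pipeline; the following numpy sketch (function name and array layout are illustrative assumptions) shows the same per-pixel depth comparison in CPU form:

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel occlusion test: a virtual pixel is shown only where the virtual
    surface lies closer to the camera than the real surface measured by the
    depth camera. Depths are in metres; np.inf marks pixels with no virtual content."""
    virtual_in_front = virtual_depth < real_depth
    out = real_rgb.copy()
    out[virtual_in_front] = virtual_rgb[virtual_in_front]
    return out
```

A pixel where the virtual milling machine is nearer than the measured real surface takes the rendered color; elsewhere the camera image (for example, the user's hand in front of the machine) remains visible.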
dc.description.provenance: Made available in DSpace on 2021-07-11T14:36:55Z (GMT). No. of bitstreams: 1
ntu-106-R04522627-1.pdf: 7026901 bytes, checksum: 9797ebd2b6cb597bea023efc1ec91d70 (MD5)
Previous issue date: 2017
[en]
dc.description.tableofcontents:
致謝 (Acknowledgements) i
中文摘要 (Chinese Abstract) ii
ABSTRACT iv
CONTENTS vi
LIST OF FIGURES x
LIST OF TABLES xv
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Objectives 1
Chapter 2 Literature Review 3
2.1 Augmented Reality 3
2.2 Interaction Interfaces 7
2.2.1 Indirect Interaction Interfaces 7
2.2.2 Direct Interaction Interfaces 12
2.3 Occlusion 19
2.4 Virtual and Augmented Reality Manufacturing 23
2.5 Comparison of the Prior Research and the Proposed Research 29
Chapter 3 Immersive AR Environment 32
3.1 Software and Hardware 32
3.1.1 Unity3D 32
3.1.2 Vuforia 33
3.1.3 Kinect v2 for Windows 34
3.1.4 Leap Motion 35
3.1.5 Oculus Rift and RGB-D Camera 36
3.2 Occlusion Handling 39
3.2.1 Z-buffer Representation 39
3.2.2 Method 40
3.2.3 Occlusion Results 43
3.3 System Architecture 44
Chapter 4 Unification of Coordinate Systems 47
4.1 Oculus Rift and Unity3D 49
4.1.1 Method 49
4.1.2 Experimental Results 50
4.2 Intel RealSense R200 camera and Oculus Rift 52
4.2.1 Method 52
4.2.2 Experimental Results 53
4.3 Real world and Unity3D 55
4.3.1 Method 55
4.3.2 Experimental Results 58
4.4 Color camera and depth stream 62
4.4.1 Method 62
4.4.2 Experimental Results 68
4.5 Color Camera and Leap Motion 70
4.5.1 Method 70
4.5.2 Experimental Results 73
4.6 Kinect v2 and Oculus Rift 76
4.6.1 Method 76
4.6.2 Experimental Results 77
Chapter 5 Milling Simulation and Training System 79
5.1 Machine Components and Operation Instructions 79
5.2 Milling Simulation 84
5.2.1 Method 84
5.2.2 Experimental Results 88
5.3 Interaction Module 91
5.3.1 Method 91
5.3.2 Experimental Results 93
Chapter 6 User Test 95
6.1 User Test Design 95
6.2 Real Milling Task 97
6.3 Video Training - Control Group 101
6.4 AR Training - Experiment Group 104
6.4.1 Interaction Experience 105
6.4.2 Virtual Milling Task 106
6.5 Objective Analysis – Performance Results 109
6.5.1 Correct Rate 110
6.5.2 Help Frequency 111
6.5.3 Time Cost 111
6.6 Subjective Analysis - Questionnaire Results 112
6.6.1 Results for Interactions 112
6.6.2 Results for Occlusion 114
6.6.3 Results for Instructions 114
6.6.4 Comparison of AR-Based Training and Video Training 115
6.6.5 Results of the System Usability Scale Analysis 116
Chapter 7 Conclusions and Future Work 118
7.1 Conclusions 118
7.2 Future Work 119
References 121
Appendix – Technical drawing 126
Appendix – Questionnaire 127
dc.language.iso: en
dc.subject: Kinect [zh_TW]
dc.subject: Leap Motion [zh_TW]
dc.subject: 擴增實境 [zh_TW]
dc.subject: RealSense™ [zh_TW]
dc.subject: Oculus Rift [zh_TW]
dc.subject: 遮蔽處理 [zh_TW]
dc.subject: 銑床訓練 [zh_TW]
dc.subject: Kinect [en]
dc.subject: Milling Machine Training [en]
dc.subject: Occlusion [en]
dc.subject: RealSense™ [en]
dc.subject: Oculus Rift [en]
dc.subject: Leap Motion [en]
dc.subject: Augmented Reality [en]
dc.title: 建立一個沉浸式擴增實境之銑削模擬與訓練系統 [zh_TW]
dc.title: Development of an Immersive Augmented Reality Training and Simulation System for Milling Operations [en]
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 蔡曜陽, 林清安
dc.subject.keyword: 擴增實境, 銑床訓練, 遮蔽處理, RealSense™, Oculus Rift, Leap Motion, Kinect [zh_TW]
dc.subject.keyword: Augmented Reality, Milling Machine Training, Occlusion, RealSense™, Oculus Rift, Leap Motion, Kinect [en]
dc.relation.page: 132
dc.identifier.doi: 10.6342/NTU201702740
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2017-08-15
dc.contributor.author-college: 工學院 (College of Engineering) [zh_TW]
dc.contributor.author-dept: 機械工程學研究所 (Graduate Institute of Mechanical Engineering) [zh_TW]
Appears in Collections: 機械工程學系 (Department of Mechanical Engineering)

Files in this item:
ntu-106-R04522627-1.pdf (6.86 MB, Adobe PDF; restricted access, not publicly available)

