Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/78088
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳湘鳳(Shana Smith) | |
dc.contributor.author | Chia-Wei Hsu | en |
dc.contributor.author | 許家維 | zh_TW |
dc.date.accessioned | 2021-07-11T14:41:51Z | - |
dc.date.available | 2021-11-02 | |
dc.date.copyright | 2016-11-02 | |
dc.date.issued | 2016 | |
dc.date.submitted | 2016-08-19 | |
dc.identifier.citation | Aleotti, J., & Caselli, S. (2011). Physics-based virtual reality for task learning and intelligent disassembly planning. Virtual Reality, 15(1), 41-54.
Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators and virtual environments, 6(4), 355-385. Berg, L. P., Behdad, S., Vance, J. M., & Thurston, D. (2015). Disassembly Sequence Evaluation: A User Study Leveraging Immersive Computing Technologies. Journal of Computing and Information Science in Engineering, 15(1), 011002-011002. Bouguet, J. Y. (2010, October 14th, 2015). Camera calibration toolbox for matlab. Retrieved from http://www.vision.caltech.edu/bouguetj/calib_doc/index.html Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability evaluation in industry, 189(194), 4-7. Cakmakci, O., Ha, Y., & Rolland, J. P. (2004). A Compact Optical See-Through Head-Worn Display with Occlusion Support. Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 16-25. Collins, R. (2007). Coordinate system diagram. Retrieved from Computer Vision Course. De Pra, Y., Spoto, F., Fontana, F., & Tao, L. (2014). Infrared vs. ultrasonic finger detection on a virtual piano keyboard: Ann Arbor, MI: Michigan Publishing, University of Michigan Library. Gonzalez-Badillo, G., Medellin-Castillo, H., Lim, T., Ritchie, J., & Garbaya, S. (2014). The development of a physics and constraint-based haptic virtual assembly system. Assembly Automation, 34(1), 41-55. Guna, J., Jakus, G., Pogacnik, M., Tomazic, S., & Sodnik, J. (2014). An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors (Basel), 14(2), 3702-3720. Holz, D., Ullrich, S., Wolter, M., Kuhlen, T., & Herder, J. (2008). Multi-contact grasp interaction for virtual environments. Journal of Virtual Reality and Broadcasting, 5(7), 1860-2037. Hou, L., & Wang, X. (2013). A study on the benefits of augmented reality in retaining working memory in assembly tasks: A focus on differences in gender. Automation in Construction, 32, 38-45. Jay, C., & Hubbold, R. (2005). 
Delayed visual and haptic feedback in a reciprocal tapping task. First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics Conference, Pisa, Italy, 655-656. Jiménez, S. (2014). Physical interaction in augmented environments. (Master), Gjøvik University College. Khattak, S., Cowan, B., Chepurna, I., & Hogue, A. (2014). A real-time reconstructed 3D environment augmented with virtual objects rendered with correct occlusion. Games Media Entertainment (GEM), 2014 IEEE, Toronto, ON, 1-8. Kim, M., & Lee, J. Y. (2016). Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality. Multimedia Tools and Applications. Leal-Meléndrez, J. A., Altamirano-Robles, L., & Gonzalez, J. A. (2013). Occlusion Handling in Video-Based Augmented Reality Using the Kinect Sensor for Indoor Registration. In J. Ruiz-Shulcloper & G. Sanniti di Baja (Eds.), Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications: 18th Iberoamerican Congress, CIARP 2013, Havana, Cuba, November 20-23, 2013, Proceedings, Part II, 447-454. Berlin, Heidelberg: Springer Berlin Heidelberg. Lepetit, V., & Berger, M. O. (2000). A semi-automatic method for resolving occlusion in augmented reality. Proceedings of the IEEE Computer Vision and Pattern Recognition, Hilton Head Island, SC, 225-230. Li, J. R., Khoo, L. P., & Tor, S. B. (2003). Desktop virtual reality for maintenance training: an object oriented prototype system (V-REALISM). Computers in Industry, 52(2), 109-125. Lin, C. H. (2013). An Augmented Reality Furniture Customization System. (Master), National Taiwan University. Mark, M. M. (2013). Unity view frustum diagram. Retrieved from http://blog.markmmiller.co.uk/2013/11/3d-tilt-virtual-reality-app-unity3d.html Mendívil, E. G., Solís, R. E. N., & Ríos, H. (2013). 
Innovative augmented reality system for automotive assembling processes and maintenance: An entrepreneurial case at Tec de Monterrey. 2013 15th International Conference on Transparent Optical Networks (ICTON), Cartagena, 1-4. Meza, D., & Berndt, S. (2014). Usability/Sentiment for the Enterprise and ENTERPRISE. Text Analytics World 2014 Conference, San Francisco, CA, United States. Noh, S. T., Yeo, H. S., & Woo, W. (2015). An HMD-based mixed reality system for avatar-mediated remote collaboration with bare-hand interaction. Proceedings of the 25th International Conference on Artificial Reality and Telexistence and 20th Eurographics Symposium on Virtual Environments, Kyoto, Japan, 61-68. Penelle, B., & Debeir, O. (2014). Multi-sensor data fusion for hand tracking using Kinect and Leap Motion. Proceedings of the 2014 Virtual Reality International Conference, Laval, France, 22. Qiu, S., Fan, X., Wu, D., He, Q., & Zhou, D. (2013). Virtual human modeling for interactive assembly and disassembly operation in virtual reality environment. The International Journal of Advanced Manufacturing Technology, 69(9-12), 2355-2372. Regenbrecht, H., Collins, J., & Hoermann, S. (2013). A Leap-supported, hybrid AR interface approach. Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, Australia, 281-284. Sanches, S. R., Tokunaga, D. M., Silva, V. F., Sementille, A. C., & Tori, R. (2012). Mutual occlusion between real and virtual elements in augmented reality based on fiducial markers. IEEE Workshop on Applications of Computer Vision (WACV), Breckenridge, CO, 49-54. Seth, A., Vance, J. M., & Oliver, J. H. (2011). Virtual reality for assembly methods prototyping: a review. Virtual Reality, 15(1), 5-20. Shim, J., Yang, Y., Kang, N., Seo, J., & Han, T.-D. (2016). Gesture-based interactive augmented reality content authoring system using HMD. Virtual Reality, 20(1), 57-69. 
Sportillo, D., Avveduto, G., Tecchia, F., & Carrozzino, M. (2015). Training in VR: A Preliminary Study on Learning Assembly/Disassembly Sequences. In T. L. De Paolis & A. Mongelli (Eds.), Augmented and Virtual Reality: Second International Conference, AVR 2015, Lecce, Italy, August 31 - September 3, 2015, Proceedings, 332-343. Cham: Springer International Publishing. Suarez, J., & Murphy, R. R. (2012). Hand gesture recognition with depth images: A review. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, 411-417. Tullis, T. S., & Stetson, J. N. (2004). A comparison of questionnaires for assessing website usability. Usability Professional Association Conference, Boston, MA, 1-12. Westerfield, G., Mitrovic, A., & Billinghurst, M. (2014). Intelligent Augmented Reality Training for Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172. Wren, C. (2016). Rainbow Jelly AR. Retrieved from https://developer.leapmotion.com/gallery/rainbow-jelly-ar Yang, R. D., Fan, X., Wu, D., & Yan, J. (2007). Virtual assembly technologies based on constraint and DOF analysis. Robotics and Computer-Integrated Manufacturing, 23(4), 447-456. Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on pattern analysis and machine intelligence, 22(11), 1330-1334. Zhou, Y., Ma, J. T., Hao, Q., Wang, H., & Liu, X. P. (2007). A Novel Optical See-Through Head-Mounted Display with Occlusion and Intensity Matching Support. Technologies for E-Learning and Digital Entertainment: Second International Conference, Edutainment 2007, Hong Kong, China, June 11-13, 2007. Proceedings, 56-62. Berlin, Heidelberg: Springer Berlin Heidelberg. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/78088 | - |
dc.description.abstract | Augmented reality (AR) is an increasingly discussed topic, with major research efforts focused on industrial applications in recent years. Recent developments in ideas, software, and hardware have opened up new opportunities, and AR applications for virtual assembly and disassembly have great potential. Compared to real-world tasks, virtual disassembly allows users to complete tasks at lower cost and without exposure to environmental hazards. Simulating realistic real-world interactions within a virtual world therefore becomes an essential issue to solve. This research develops an AR-based disassembly training system that provides users with a natural user interface, allowing them to manipulate objects with their bare hands, without additional worn or handheld devices. Multiple coordinate systems are unified, solving the problem of creating and updating the dynamic coordinate transforms among the Oculus Rift, the Leap Motion, the RGB camera, and the AR marker that represents the real world. To resolve occlusion issues and increase the immersion and perceived realism of the system, an innovative occlusion approach is proposed that minimizes computational load. A flexible physics-based disassembly system is implemented to handle part disassembly, and physical tools can be used within the system. User test results show that the system was robust and successful in improving user experience in virtual disassembly. | en |
dc.description.provenance | Made available in DSpace on 2021-07-11T14:41:51Z (GMT). No. of bitstreams: 1 ntu-105-R03522617-1.pdf: 4913553 bytes, checksum: b1bcbf751efad5d75951ed057c711f3b (MD5) Previous issue date: 2016 | en |
dc.description.tableofcontents | Acknowledgement i
ABSTRACT ii CONTENTS iii LIST OF FIGURES vii LIST OF TABLES xii Chapter 1 Introduction 1 1.1 Research Background 1 1.2 Research Motivation 2 1.3 Research Aim 3 Chapter 2 Literature Review 4 2.1 Augmented Reality 4 2.2 Disassembly Training 10 2.3 Interaction Interfaces 15 2.4 Constraint-based and Physics-based Modeling 21 2.5 Occlusion 24 2.6 Comparison of Prior Research and the Proposed Research 28 Chapter 3 Immersive AR and Occlusion 30 3.1 Software and Hardware 30 3.1.1 Unity3D 30 3.1.2 Vuforia AR 31 3.1.3 Concave Collider 31 3.1.4 Leap Motion 33 3.1.5 Oculus Rift and RGB Camera 33 3.2 Occlusion Handling 36 3.2.1 Hand Representation 36 3.2.2 Method 37 3.2.3 Occlusion Results 40 3.3 Software Architecture 41 Chapter 4 Unification of Coordinate Systems 43 4.1 RGB Camera and Unity3D 45 4.1.1 Method 45 4.1.2 Result of Camera Calibration 50 4.2 Leap Motion and Unity3D 52 4.2.1 Method 52 4.2.2 Experimental Results 53 4.3 Oculus Rift and Unity3D 54 4.3.1 Method 54 4.3.2 Experimental Results 54 4.4 Real world, Vuforia and Unity3D 55 4.4.1 Method 55 4.4.2 Experimental Results 57 4.5 RGB camera and Leap Motion 57 4.5.1 Method 57 4.5.2 Result of Stereo Calibration 62 Chapter 5 Physics-based Disassembly System 63 5.1 Grabbing and Releasing Mechanism 63 5.1.1 Method 63 5.1.2 Experimental Results 66 5.2 Object Transformation 68 5.2.1 Method 68 5.2.2 Experimental Results 69 5.3 Tools and Instructions 71 5.3.1 Tool Representation 71 5.3.2 Instructions 72 Chapter 6 User Testing 74 6.1 User Testing Design 74 6.2 Virtual Training 75 6.3 Virtual Vice Disassembly 76 6.3.1 Test Design 76 6.3.2 Results 78 6.4 Physical Vice Disassembly 79 6.4.1 Test Design 79 6.4.2 Results 80 6.5 Virtual Actuator Disassembly 81 6.5.1 Test Design 81 6.5.2 Results 83 6.6 System Performance Analysis 83 6.7 Subjective Analysis - Questionnaire Results 84 6.7.1 Virtual and Physical Objects Disassembly Comparison Results 84 6.7.2 Occlusion User Experience Results 86 6.7.3 Tools, Instructions, and Training User 
Experience Results 88 6.7.4 System Usability Scale Analysis Results 89 Chapter 7 Conclusion and Future Work 92 7.1 Conclusion 92 7.2 Future Work 93 References 95 Appendix – Questionnaire 100 | |
dc.language.iso | en | |
dc.title | 建立一個具有自然使用者介面的可攜式與沉浸式擴增實境之拆卸訓練系統 | zh_TW |
dc.title | Development of a Portable Immersive Augmented Reality Disassembly Training System with Natural User Interface | en |
dc.type | Thesis | |
dc.date.schoolyear | 104-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 郭財吉(Tsai-Chi Kuo),蘇偉(Wei-Jiun Su),歐陽明(Ming Ouhyoung) | |
dc.subject.keyword | 擴增實境,拆卸訓練,物理基礎的拆卸,自然使用者介面,遮蔽處理, | zh_TW |
dc.subject.keyword | Augmented Reality,Disassembly Training,Physics-based Disassembly,Natural User Interface,Occlusion, | en |
dc.relation.page | 104 | |
dc.identifier.doi | 10.6342/NTU201600717 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2016-08-20 | |
dc.contributor.author-college | 工學院 | zh_TW |
dc.contributor.author-dept | 機械工程學研究所 | zh_TW |
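The abstract above describes unifying the coordinate systems of the Oculus Rift, Leap Motion, RGB camera, and AR marker through calibrated transforms. A minimal sketch of how such a chain of rigid transforms can be composed with homogeneous matrices (all matrix values and frame names here are illustrative assumptions, not taken from the thesis):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration results: each matrix maps points from one
# device's frame into the next frame in the chain.
T_unity_from_marker = make_transform(np.eye(3), [0.0, 0.0, 0.5])   # AR marker -> Unity world
T_marker_from_camera = make_transform(np.eye(3), [0.1, 0.0, 0.0])  # RGB camera -> AR marker
T_camera_from_leap = make_transform(np.eye(3), [0.0, -0.05, 0.0])  # Leap Motion -> RGB camera

# Chaining the calibrated transforms maps a point measured by the
# Leap Motion into the shared Unity world frame.
T_unity_from_leap = T_unity_from_marker @ T_marker_from_camera @ T_camera_from_leap

fingertip_leap = np.array([0.0, 0.1, 0.2, 1.0])  # homogeneous coordinates
fingertip_unity = T_unity_from_leap @ fingertip_leap
# With identity rotations, translations simply add:
# fingertip_unity[:3] == [0.1, 0.05, 0.7]
print(fingertip_unity[:3])
```

In practice the rotations and translations would come from calibration procedures such as those covered in Chapter 4 (camera calibration, stereo calibration), and the marker transform would be updated every frame by the AR tracking library.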
Appears in Collections: | Department of Mechanical Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-105-R03522617-1.pdf (Restricted Access) | 4.8 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.