NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/65459
Full metadata record:
dc.contributor.advisor: 陳湘鳳
dc.contributor.author: Keng-Ho Chen (en)
dc.contributor.author: 陳鏗合 (zh_TW)
dc.date.accessioned: 2021-06-16T23:44:24Z
dc.date.available: 2016-08-01
dc.date.copyright: 2012-07-31
dc.date.issued: 2012
dc.date.submitted: 2012-07-24
dc.identifier.citation:
Azuma, R. T., "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, Vol. 6, No. 4, pp. 355-385, August 1997
Human Interface Technology Laboratory at the University of Washington: http://www.hitl.washington.edu
Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., Tachibana, K., "Virtual Object Manipulation on a Table-Top AR Environment," Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR 2000), pp. 111-119, 2000
Liverani, A., Amati, G., Caligiana, G., "A CAD-augmented Reality Integrated Environment for Assembly Sequence Check and Interactive Validation," Concurrent Engineering, Vol. 12, No. 1, pp. 67-77, March 2004
Livingston, M. A., Zanbaka, C., Swan, J. E. II, Smallman, H. S., "Objective Measures for the Effectiveness of Augmented Reality," IEEE Virtual Reality 2005, pp. 287-288, March 2005
Lee, T., Hollerer, T., "Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking," 11th IEEE International Symposium on Wearable Computers, pp. 83-90, October 2007
Lee, T., Hollerer, T., "Multithreaded Hybrid Feature Tracking for Markerless Augmented Reality," IEEE Transactions on Visualization and Computer Graphics, Vol. 15, No. 3, May 2009
Lin, L., Wang, Y., Liu, Y., Xiong, C., Zeng, K., "Marker-less registration based on template tracking for augmented reality," Multimedia Tools and Applications, Vol. 41, No. 2, pp. 235-252, 2009
Martin, A., Adan, A., "3D real-time positioning for autonomous navigation using a nine-point landmark," Pattern Recognition, Vol. 45, Issue 1, pp. 578-595, January 2012
Ong, S. K., Pang, Y., Nee, A. Y. C., "Augmented Reality Aided Assembly Design and Planning," CIRP Annals - Manufacturing Technology, Vol. 56, Issue 1, pp. 49-52, 2007
Ong, S. K., Yuan, M. L., Nee, A. Y. C., "Augmented reality applications in manufacturing: a survey," International Journal of Production Research, Vol. 46, No. 10, pp. 2707-2742, May 2008
Pang, Y., Nee, A. Y. C., Ong, S. K., Yuan, M., "Assembly feature design in an augmented reality environment," Assembly Automation, Vol. 26, Issue 1, pp. 34-43, ISSN 0144-5154, 2006
Philips: http://www.philips.com.tw
Raghavan, V., Molineros, J., Sharma, R., "Interactive Evaluation of Assembly Sequences Using Augmented Reality," IEEE Transactions on Robotics and Automation, Vol. 15, No. 3, June 1999
Schwald, B., Laval, B., "An Augmented Reality System for Training and Assistance to Maintenance in the Industrial Context," Journal of WSCG, Vol. 11, No. 1, ISSN 1213-6972, WSCG'2003
Song, J., Jia, Q., Sun, H., Gao, X., "Study on the Perception Mechanism and Method of Virtual and Real Objects in Augmented Reality Assembly Environment," 4th IEEE Conference on Industrial Electronics and Applications (ICIEA 2009), pp. 1452-1456, May 2009
Tsai, R. Y., "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987
Wikipedia: http://en.wikipedia.org
Zhang, Z., "A Flexible New Technique for Camera Calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 11, November 2000
Zhang, J., Ong, S. K., Nee, A. Y. C., "AR-Assisted in situ Machining Simulation: Architecture and Implementation," VRCAI '08: Proceedings of the 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, No. 26, 2008
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/65459
dc.description.abstract: Augmented reality (AR) is a technique that combines virtual objects with real objects in computer-generated image scenes. Markerless tracking methods use real objects as tracking targets to place virtual objects in the camera image. The goal of this research is to develop a vision-based markerless tracking method.
The method applies principal component analysis (PCA) and the squared prediction error (SPE) to derive thresholds from the color features of real objects, and uses these color features to track and place virtual objects in the camera image. Experimental results show that the method can detect color features, estimate the intrinsic and extrinsic camera parameters, and establish an augmented reality coordinate system in real-time video images.
In the experiments, real colored objects were placed in the camera scene. With four or more colored objects, the camera pose can still be computed and the augmented reality coordinate system established even when some of the objects are occluded. A constrained calculation is used to measure tracking accuracy: the positions of the colored objects in real space are computed from their image coordinates. Within a cube 30 cm on each side, the error of the method is about 0.3 cm. The results show that the method is flexible, fast, and accurate. (zh_TW)
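The PCA and SPE color-feature thresholding described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration rather than the thesis implementation: the use of HSV pixel samples, a single retained principal component, and a 95th-percentile SPE cut-off are all assumptions made for the example.

```python
# Minimal sketch of color-feature extraction with PCA and the squared
# prediction error (SPE); assumptions: HSV samples of the tracked object,
# one retained component, 95th-percentile threshold.
import numpy as np

def fit_color_model(samples, n_components=1):
    """Fit a PCA color model to (N, 3) HSV samples of the colored object."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # Principal axes of the color distribution (rows of vt).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axes = vt[:n_components]
    # SPE = squared norm of the residual not explained by the retained axes.
    residual = centered - centered @ axes.T @ axes
    spe = np.sum(residual ** 2, axis=1)
    threshold = np.percentile(spe, 95.0)  # assumed cut-off
    return mean, axes, threshold

def color_feature_mask(hsv_image, mean, axes, threshold):
    """Mark pixels whose SPE falls below the threshold as object pixels."""
    pixels = hsv_image.reshape(-1, 3).astype(np.float64) - mean
    residual = pixels - pixels @ axes.T @ axes
    spe = np.sum(residual ** 2, axis=1)
    return (spe < threshold).reshape(hsv_image.shape[:2])
```

Connected regions of the resulting mask would then give the image coordinates of each colored feature point used for tracking.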
dc.description.abstract: The goal of augmented reality (AR) research is to combine virtual objects with real objects in computer-generated scenes. Markerless tracking methods use real object features to place virtual objects in such scenes. The goal of this study is to create a vision-based markerless tracking method.
The method uses principal component analysis (PCA), a squared prediction error (SPE) calculation, and feature extraction thresholds to extract color features from real objects, and uses these color features to place virtual objects in computer-generated scenes. Experimental results show that the method can detect color features, estimate camera pose parameters, and create a 3D coordinate system in real-time video camera images.
Experiments were performed by adding real colored objects to computer-generated scenes. Four or more real colored objects were used so that occlusion can be handled for different camera poses. A constraint function that projects real-space feature points and compares them with image-space feature points was used to measure tracking precision. The results show that the method can place virtual objects with about 0.3 cm precision in a 30 cm x 30 cm x 30 cm scene, and that the method is flexible, fast, and precise. (en)
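The camera pose estimation from four or more colored feature points, with occluded points simply dropped, can also be illustrated with a short sketch. This example uses OpenCV's solvePnP with the EPnP solver, which is not necessarily the computation used in the thesis; the calibrated camera matrix and the known 3D positions of the colored objects are assumed inputs (camera calibration itself could follow Zhang's method cited above).

```python
# Minimal sketch of recovering the camera pose from >= 4 colored feature
# points; uses OpenCV's solvePnP (EPnP), which is an assumption and not
# necessarily the thesis's own pose computation.
import numpy as np
import cv2

def estimate_pose(object_points, image_points, camera_matrix, dist_coeffs):
    """object_points: (N, 3) known 3D positions of the colored objects.
    image_points: (N, 2) their detected pixel centroids, N >= 4.
    Occluded objects are simply omitted from both arrays, so tracking
    continues as long as at least four correspondences remain."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    return rotation, tvec              # extrinsic camera parameters
```

The recovered rotation and translation define the coordinate system in which virtual objects are rendered for each frame.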
dc.description.provenance: Made available in DSpace on 2021-06-16T23:44:24Z (GMT). No. of bitstreams: 1
ntu-101-R99522628-1.pdf: 5544771 bytes, checksum: dc87f2404965c42ab1c65689cd46cebf (MD5)
Previous issue date: 2012 (en)
dc.description.tableofcontents:
Oral Defense Committee Certification i
Acknowledgements ii
Abstract (Chinese) iii
Abstract (English) iv
Table of Contents v
List of Figures vii
List of Tables xii
Chapter 1 Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Content and Objectives 3
Chapter 2 Literature Review 5
2.1 Augmented Reality 5
2.2 Camera Calibration 7
2.3 Tracking 13
2.3.1 Marker 14
2.3.2 Markerless 21
Chapter 3 Method for Establishing the Augmented Reality Coordinate System 26
3.1 Camera Position and Orientation Estimation 27
3.2 Mathematical Model of Principal Component Analysis 34
3.3 Stability of the Coordinate Transformation 39
3.4 Variability of Feature Point Real-World Coordinates 41
Chapter 4 Experimental Design and Procedure 45
4.1 Experimental Equipment 45
4.2 Data Acquisition Method 47
4.3 Procedure for Camera Position and Orientation Estimation 49
4.4 Procedure for the PCA Mathematical Model 50
4.5 Procedure for Coordinate Transformation Stability 51
4.6 Procedure for Feature Point Real-World Coordinate Variability 53
Chapter 5 Experimental Results and Discussion 55
5.1 Results of Camera Position and Orientation Estimation 55
5.2 Results of the PCA Mathematical Model 58
5.2.1 HSV Analysis Results 58
5.2.2 RGB Analysis Results 66
5.3 Results of Coordinate Transformation Stability 91
5.4 Results of Feature Point Real-World Coordinate Variability 95
Chapter 6 Conclusions and Future Work 109
6.1 Conclusions 109
6.2 Future Work 110
References 111
dc.language.iso: zh-TW
dc.subject: precision (zh_TW)
dc.subject: markerless tracking (zh_TW)
dc.subject: augmented reality (zh_TW)
dc.subject: occlusion (zh_TW)
dc.subject: flexible (zh_TW)
dc.subject: markerless tracking (en)
dc.subject: precision (en)
dc.subject: flexible (en)
dc.subject: Augmented reality (en)
dc.subject: occlusion (en)
dc.title: 以主成份分析法建立靈活之擴增實境座標系統 (zh_TW)
dc.title: A Flexible Augmented Reality Coordinate System Based on Principal Component Analysis (en)
dc.type: Thesis
dc.date.schoolyear: 100-2
dc.description.degree: Master
dc.contributor.oralexamcommittee: 陳亮嘉, 林清安
dc.subject.keyword: augmented reality, markerless tracking, occlusion, precision, flexibility (zh_TW)
dc.subject.keyword: Augmented reality, markerless tracking, occlusion, precision, flexible (en)
dc.relation.page: 113
dc.rights.note: Authorized with compensation (有償授權)
dc.date.accepted: 2012-07-24
dc.contributor.author-college: College of Engineering (zh_TW)
dc.contributor.author-dept: Graduate Institute of Mechanical Engineering (zh_TW)
Appears in collections: Department of Mechanical Engineering

Files in this item:
File: ntu-101-1.pdf (restricted; not available for public access)
Size: 5.41 MB
Format: Adobe PDF


All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
