Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/56105
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林達德(Ta-Te Lin) | |
dc.contributor.author | Wei-Chen Zhao | en |
dc.contributor.author | 趙偉辰 | zh_TW |
dc.date.accessioned | 2021-06-16T05:15:36Z | - |
dc.date.available | 2014-08-22 | |
dc.date.copyright | 2014-08-22 | |
dc.date.issued | 2014 | |
dc.date.submitted | 2014-08-18 | |
dc.identifier.citation | 余世忠。2012。主從式影像監測系統之研製與生態監測應用。碩士論文。臺北:國立臺灣大學生物產業機電工程所。
臺北市立動物園。2012。動物園資料索引。臺北:臺北市立動物園。網址:http://newweb.zoo.gov.tw/。日期:2012-06-19。
Achanta, R., F. Estrada, P. Wils, and S. Susstrunk. 2008. Salient region detection and segmentation. In Computer Vision Systems, 66-75. Springer Berlin Heidelberg.
Arulampalam, M. S., S. Maskell, N. Gordon, and T. Clapp. 2002. A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Transactions on Signal Processing 50(2): 174-188.
Bagdanov, A. D., A. del Bimbo, and F. Pernici. 2005. Acquisition of high-resolution images through on-line saccade sequence planning. In Proceedings of the Third ACM International Workshop on Video Surveillance & Sensor Networks.
Bay, H., A. Ess, T. Tuytelaars, and L. Van Gool. 2008. Speeded-up robust features (SURF). Computer Vision and Image Understanding 110(3): 346-359.
Black, J., and T. Ellis. 2001. Multi camera image tracking. In International Workshop on Performance Evaluation of Tracking and Surveillance.
Booth, D. T., and S. E. Cox. 2008. Image-based monitoring to measure ecological change in rangeland. Frontiers in Ecology and the Environment 6(4): 185-190.
Bradski, G., and A. Kaehler. 2008. Learning OpenCV: Computer Vision with the OpenCV Library. Sebastopol: O'Reilly Media.
Brown, M., and D. G. Lowe. 2007. Automatic panoramic image stitching using invariant features. International Journal of Computer Vision 74(1): 59-73.
Challa, S. 2011. Fundamentals of Object Tracking. 1st ed. Cambridge University Press.
Chen, C. H., Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi. 2008. Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking. IEEE Transactions on Circuits and Systems for Video Technology 18(8): 1052-1063.
Cheng, M. M., G. X. Zhang, N. J. Mitra, X. Huang, and S. M. Hu. 2011. Global contrast based salient region detection. In IEEE Conference on Computer Vision and Pattern Recognition, 409-416.
Comaniciu, D., and P. Meer. 1999. Mean shift analysis and applications. In Proceedings of the Seventh IEEE International Conference on Computer Vision 2: 1197-1203.
Cucchiara, R., M. Piccardi, and P. Mello. 2000. Image analysis and rule-based reasoning for a traffic monitoring system. IEEE Transactions on Intelligent Transportation Systems 1(2): 119-130.
Diebold, F. 2007. Elements of Forecasting. 1st ed. Cengage Learning.
Doucet, A., N. de Freitas, and N. Gordon. 2001. An introduction to sequential Monte Carlo methods. In Sequential Monte Carlo Methods in Practice, 3-14. Springer New York.
Fischler, M. A., and R. C. Bolles. 1981. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM 24(6): 381-395.
Haritaoglu, I., D. Harwood, and L. S. Davis. 2000. W4: Real-time surveillance of people and their activities. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(8): 809-830.
KaewTraKulPong, P., and R. Bowden. 2002. An improved adaptive background mixture model for real-time tracking with shadow detection. In Video-Based Surveillance Systems, 135-144.
Kalman, R. E. 1960. A new approach to linear filtering and prediction problems. Journal of Basic Engineering 82(1): 35-45.
Levenberg, K. 1944. A method for the solution of certain non-linear problems in least squares. Quarterly of Applied Mathematics 2: 164-168.
Li, M. H. 2006. Introduction to Ecological Monitoring. 1st ed. Taipei: Ming-Wen.
Pei, J. 1998. An evaluation of using auto-trigger cameras to record activity patterns of wild animals. Taiwan Journal of Forest Science 13(4): 317-324.
Shek, C. T., C. S. Chan, and Y. F. Wan. 2007. Camera trap survey of Hong Kong terrestrial mammals in 2002-06. Hong Kong Biodiversity 15: 1-11.
Silveira, L., A. T. Jacomo, and J. A. F. Diniz-Filho. 2003. Camera trap, line transect census and track surveys: a comparative evaluation. Biological Conservation 114(3): 351-355.
Spellerberg, I. F. 2005. Monitoring Ecological Change. 2nd ed. Cambridge University Press.
Stauffer, C., and W. E. L. Grimson. 1999. Adaptive background mixture models for real-time tracking. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition 2: 252.
Stillman, S. T., R. Tanawongsuwan, and I. A. Essa. 1998. A system for tracking and recognizing multiple people with multiple cameras. Technical report, Georgia Institute of Technology.
Stratonovich, R. L. 1960. Conditional Markov processes. Theory of Probability & Its Applications 5(2): 156-178.
Thrun, S., et al. 2006. Stanley: The robot that won the DARPA Grand Challenge. Journal of Field Robotics 23(9): 661-692.
Valera, M., and S. Velastin. 2005. Intelligent distributed surveillance systems: a review. IEE Proceedings - Vision, Image and Signal Processing 152(2): 192-204.
Yao, Y., B. Abidi, and M. Abidi. 2006. Fusion of omnidirectional and PTZ cameras for accurate cooperative tracking. In IEEE International Conference on Video and Signal Based Surveillance, 46.
Yilmaz, A., O. Javed, and M. Shah. 2006. Object tracking: a survey. ACM Computing Surveys 38(4): 13.
Zhou, X., R. T. Collins, T. Kanade, and P. Metes. 2003. A master-slave system to acquire biometric imagery of humans at distance. In First ACM SIGMM International Workshop on Video Surveillance, 113-120. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/56105 | - |
dc.description.abstract | 影像監測系統已被廣泛應用於許多場合,像是在一般公共場所或大樓出入口常見的傳統攝影機。但由於受限的監測視野範圍很小,若要增加監測視野範圍,最直接的方法就是增加攝影機的數量。但此時取得之影像會有解析度過低以致於無法用來做為辨識、判別的問題,近年來有許多研究主題為應用主從式影像系統於行人監測、通道監控,利用其可以同時提供更廣泛視野範圍的影像和擷取高解析度影像的特點。其中主控端攝影機負責監測視野範圍內的影像以及偵測、追蹤移動目標物,並且控制受控端攝影機取得目標物的高解析度影像。進行環境監測時,由於觀察的目標物並非處於靜態,目標物的動態、運動狀態、軌跡就相對重要。因此,本研究結合多顆攝影機並簡化其架構,提出針對目標物的偵測與追蹤方法,偵測移動目標物並估測其運動狀態、有效率處理多重追蹤問題並解決互相遮蔽與重疊的問題,提高系統對影像中目標物追蹤與擷取影像的準確率,其中再針對受控端所擷取的高解析度影像進行微調。本研究系統已實際應用於臺灣大學校內的戶外環境與臺北市立動物園的生態環境,並追蹤、記錄觀測場景內目標物的高解析度連續影像以及目標物移動軌跡與分佈,以提供做為離線分析的依據,其中監測視角範圍可達到195度以涵蓋監測場景之全部視野。實驗結果顯示,整體目標物的追蹤成功率約為74% ~ 77%、取得其目標物影像的解析度放大約可達到35倍,錄製之記錄影片速度可達到10 fps。 | zh_TW |
dc.description.abstract | Visual surveillance systems have a wide range of applications and are an important research topic in computer vision. Two key challenges for such systems are expanding the field of view (FOV) and enhancing image resolution. Master-slave imaging systems, which provide a large FOV and high-resolution images simultaneously, have therefore been applied to pedestrian surveillance, access control, and crowd statistical analysis. A master-slave imaging system combines two cameras: the master camera has a large FOV and is responsible for monitoring and object tracking, while the slave camera, a pan-tilt-zoom (PTZ) camera, is guided by the master camera to rotate toward and zoom in on the targeted object to acquire high-resolution images. Because targeted objects are not always stationary, their motion must be estimated in order to track the correct object and acquire zoomed-in images. In reality, objects interact in complex ways, which can cause tracking failures. In this research, data from the multiple cameras of a master-slave imaging system were integrated to track multiple objects and predict their behavior; individual objects were identified and matched to handle occlusion. The multiple-object tracking results show a significant improvement in the robustness of the master-slave imaging system. The system was tested in an outdoor environment on the National Taiwan University campus and at the Taipei Zoo, where it tracked and recorded high-resolution image sequences and trajectories of targeted objects for later offline analysis. The master camera provides a large FOV of about 195 degrees. The object tracking success rate was about 74% to 77%, the resolution of the targeted object's image could be magnified about 35 times, and the video frame rate reached 10 fps. | en
dc.description.provenance | Made available in DSpace on 2021-06-16T05:15:36Z (GMT). No. of bitstreams: 1 ntu-103-R01631024-1.pdf: 8715545 bytes, checksum: 310fc30e35211898be39987f138ee55a (MD5) Previous issue date: 2014 | en |
dc.description.tableofcontents | Table of Contents
Acknowledgements; Chinese Abstract; Abstract; List of Figures; List of Tables
Chapter 1 Introduction: 1.1 Preface; 1.2 Research Objectives
Chapter 2 Literature Review: 2.1 Applications of Imaging Surveillance Systems (2.1.1 Pedestrian Monitoring; 2.1.2 Ecological Monitoring); 2.2 Multi-Camera Architectures; 2.3 Object Detection; 2.4 Object Tracking (2.4.1 Bayesian Filters; 2.4.2 Mean Shift Algorithm; 2.4.3 Kalman Filter; 2.4.4 Particle Filter)
Chapter 3 Materials and Methods: 3.1 System Architecture (3.1.1 Hardware Architecture; 3.1.2 Software Interface; 3.1.3 Software Architecture); 3.2 Master-Slave System (3.2.1 Geometric Coordinate Transformation; 3.2.2 Master-Camera Panoramic Imaging); 3.3 Object Detection and Tracking (3.3.1 Background Subtraction; 3.3.2 Object Template Matching; 3.3.3 Object Tracking Strategy; 3.3.4 Slave-Camera Object Tracking; 3.3.5 Kalman Filter; 3.3.6 Particle Filter); 3.4 Experimental Design
Chapter 4 Results and Discussion: 4.1 Master-Slave Imaging System (4.1.1 Master System; 4.1.2 Slave System); 4.2 Discussion of Tracking Methods (4.2.1 Interference in Object Detection; 4.2.2 Overlapping Objects; 4.2.3 Prediction Comparison for Different Trajectories; 4.2.4 Comparison of Slave-Camera Object Tracking); 4.3 Field Application and Animal Behavior Observation (4.3.1 Experimental Environment and Subjects; 4.3.2 Activity Frequency Histograms; 4.3.3 Cumulative Target Trajectory Maps; 4.3.4 Regional Activity Rate Analysis; 4.3.5 Object Tracking Results; 4.3.6 Image Capture Success Rate of Object Tracking; 4.3.7 Capture and Recording of High-Resolution Images)
Chapter 5 Conclusions and Suggestions: 5.1 Conclusions; 5.2 Suggestions
References | |
dc.language.iso | zh-TW | |
dc.title | 主從式影像監測系統之物件追蹤演算法研究 | zh_TW |
dc.title | A Study on Object Tracking Method for Master-Slave Imaging Surveillance System | en |
dc.type | Thesis | |
dc.date.schoolyear | 102-2 | |
dc.description.degree | Master (碩士) | |
dc.contributor.oralexamcommittee | 江昭皚(Joe-Air Jiang),艾群(Chyung Ay) | |
dc.subject.keyword | 主從式影像系統, 移動物偵測, 物件追蹤 | zh_TW |
dc.subject.keyword | Master-slave system, Moving object detection, Object tracking | en |
dc.relation.page | 96 | |
dc.rights.note | Authorized for a fee (有償授權) | |
dc.date.accepted | 2014-08-18 | |
dc.contributor.author-college | College of Bioresources and Agriculture | zh_TW |
dc.contributor.author-dept | Graduate Institute of Bio-Industrial Mechatronics Engineering | zh_TW |
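The abstract and chapter outline (§3.3.5) describe predicting each tracked object's motion with a Kalman filter so the slave PTZ camera can be steered toward where the target will be. As an illustrative sketch only, not the thesis's own implementation: the snippet below applies a constant-velocity Kalman filter to a single pixel coordinate (the same filter would be run independently for x and y); the class name `Kalman1D` and the noise settings `q` and `r` are assumptions made for this example.

```python
# Sketch of constant-velocity Kalman tracking for one image coordinate.
# State is [position, velocity]; only position is measured each frame.

class Kalman1D:
    """Constant-velocity Kalman filter for one coordinate axis."""

    def __init__(self, pos, q=1e-2, r=1.0):
        self.x = [pos, 0.0]                 # state estimate [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q = q                          # process noise added to the diagonal
        self.r = r                          # measurement noise variance

    def predict(self, dt=1.0):
        # State transition x <- F x with F = [[1, dt], [0, 1]]
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        # Covariance P <- F P F^T + Q
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        self.P = [
            [p00 + dt * (p01 + p10) + dt * dt * p11 + self.q, p01 + dt * p11],
            [p10 + dt * p11, p11 + self.q],
        ]
        return self.x[0]                    # predicted position for aiming the PTZ

    def update(self, z):
        # Measurement model H = [1, 0]: the detector reports position only.
        y = z - self.x[0]                   # innovation
        s = self.P[0][0] + self.r           # innovation variance
        k0 = self.P[0][0] / s               # Kalman gain (position)
        k1 = self.P[1][0] / s               # Kalman gain (velocity)
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        # Covariance P <- (I - K H) P
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        self.P = [
            [(1 - k0) * p00, (1 - k0) * p01],
            [p10 - k1 * p00, p11 - k1 * p01],
        ]


# Track a target moving right at roughly 2 px/frame with noisy detections.
kf = Kalman1D(pos=0.0)
for z in [2.1, 3.9, 6.2, 8.0, 9.8]:
    kf.predict()
    kf.update(z)
# After a few frames the velocity estimate approaches 2 px/frame,
# which is what lets the slave camera be pointed ahead of the target.
```

The same predict/update cycle generalizes directly to a 4-state filter over (x, y, vx, vy), which is the usual formulation for image-plane tracking.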
Appears in Collections: | Department of Biomechatronics Engineering |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-103-1.pdf (currently not authorized for public access) | 8.51 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.