Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/43495
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 傅立成(Li-Chen Fu) | |
dc.contributor.author | Cheng-Ming Huang | en |
dc.contributor.author | 黃正民 | zh_TW |
dc.date.accessioned | 2021-06-15T02:22:25Z | - |
dc.date.available | 2019-12-31 | |
dc.date.copyright | 2009-09-02 | |
dc.date.issued | 2009 | |
dc.date.submitted | 2009-08-19 | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/43495 | - |
dc.description.abstract | 本論文提出了基於貝式濾波器發展的影像追蹤以及影像伺服系統。為了達到即時處理且強健的影像追蹤效果,機率資料連結濾波器和粒子濾波器等機率的方法論將擴展延伸來解決影像感測器所產生的一些問題。我們著重於在相機平台機構晃動、相似物體成像干擾、複雜環境、遮蔽、光亮變化等雜訊量測所下進行的目標物資訊估測,同時嘗試去克服物體成像在二維影像平面上,因而喪失與相機間的深度資訊,造成彼此間互相遮蔽狀況下的多目標物追蹤問題。
所發展出來的影像追蹤演算法也同時應用在可動式相機平台的即時影像伺服控制上,使單一一台可水平垂直旋轉相機平台同時伺服控制觀測多個目標物。延續影像追蹤以及影像伺服的技術,我們也特別設計了無人駕駛飛行器的影像伺服自動導航控制器,以持續追蹤鎖定空中的其他飛行器或對地面上的物體進行盤旋偵照。也更進一步利用我們所設計的影像追蹤與伺服演算法,開發成多個分散式的影像子系統,以建構成一個大型的監控合作系統,可同時操控多台可動式相機平台追蹤多個移動目標物,有效率地達成以少量相機進行廣大範圍區域的視覺監控。 | zh_TW |
dc.description.abstract | This dissertation presents visual tracking and visual servoing systems based on Bayesian filtering. To achieve robust visual tracking in real time, we extend probabilistic methodologies such as the probabilistic data association filter and the particle filter to address problems introduced by the vision sensor. We focus on target state estimation from noisy measurements, where the noise may stem from mechanical shaking of the camera platform, optical interference from similar objects, cluttered backgrounds, occlusions, and lighting changes. We also address the target-overlapping problem, which arises from the loss of depth information when distinct targets are observed only on the 2-D image plane.
The proposed visual tracking algorithms are then applied to the real-time control of active camera platforms to realize visual servoing. Along this line of research, visual servoing systems are designed for three scenarios: an unmanned aerial vehicle (UAV) chasing another aerial vehicle, a UAV performing aerial reconnaissance, and a single moving camera simultaneously tracking multiple targets. Moreover, we use the designed visual tracking and visual servoing algorithms to develop several distributed vision systems, including a wide-area surveillance system that tracks multiple moving objects through effective cooperation of multiple distributed vision sub-systems, each with a pan-tilt camera. | en |
dc.description.provenance | Made available in DSpace on 2021-06-15T02:22:25Z (GMT). No. of bitstreams: 1 ntu-98-D91921002-1.pdf: 9336709 bytes, checksum: 9687a66f649c55cb6f7d147cdaeb10f5 (MD5) Previous issue date: 2009 | en |
dc.description.tableofcontents | 摘要
Abstract
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Motivation
1.2 Survey of Related Research
1.3 Contributions
1.4 Thesis Organization
Chapter 2 Preliminaries of Bayesian Filter
2.1 Basic Probabilistic Rules and General Bayesian Filter
2.2 Kalman Filter
2.3 Probabilistic Data Association Filter
2.4 Particle Filter
Chapter 3 Bayesian Filter based Visual Tracking
3.1 Single Target Visual Tracking with Visual Probabilistic Data Association Filter
3.1.1 Augmented Probabilistic Data Association Filter with Observational Density
3.1.2 Persistent Noise Removal
3.1.3 Experiments
3.2 Multi-Target Visual Tracking in the Target Feature Space
3.2.1 Measurement Generation through Circular Hough Transform
3.2.2 Joint Visual Probabilistic Data Association Filter
3.2.3 Experiments
3.3 Multi-Target Visual Tracking for Occlusion Handling
3.3.1 Target Depth Order Definition
3.3.2 Particle Filter for Occlusion Handling
3.3.3 Experiments
Chapter 4 Applications of Visual Tracking
4.1 Air Combat Simulator
4.1.1 Visual Servo Control for the Air Combat in the Virtual Reality
4.1.2 Experiments
4.2 Aerial Reconnaissance System
4.2.1 Image Based Fuzzy Logic Guidance Design
4.2.2 Simulations
4.3 Multi-Target Visual Surveillance with a Pan-Tilt Camera
4.3.1 Action Selection of Pan-Tilt Camera
4.3.2 Experiments
Chapter 5 Effective Visual Surveillance with Cooperation of Multiple Active Cameras
5.1 Cooperation Strategy of Multiple Active Cameras for Multi-Target Tracking
5.2 Positioning Strategy of Active Camera with Specific Task Assignment
5.3 Simulations
5.4 Experiments
Chapter 6 Conclusions
Bibliography | |
dc.language.iso | en | |
dc.title | 以貝氏濾波器及可動式相機進行之影像追蹤暨其應用 | zh_TW |
dc.title | Visual Tracking and Its Applications by Bayesian Filtering with Active Cameras | en |
dc.type | Thesis | |
dc.date.schoolyear | 97-2 | |
dc.description.degree | 博士 | |
dc.contributor.oralexamcommittee | 蔡文祥,宋開泰,孫永年,蔡清池,黃仲陵,陳世旺,范欽雄 | |
dc.subject.keyword | 影像追蹤,影像伺服,貝式濾波器,機率資料連結濾波器,粒子濾波器,多目標物影像追蹤,多相機合作系統,可動式相機 | zh_TW |
dc.subject.keyword | Visual tracking, visual servoing, Bayesian filter, probabilistic data association filter, particle filter, multi-target visual tracking, multi-camera cooperative system, active camera | en |
dc.relation.page | 170 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2009-08-19 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 電機工程學研究所 | zh_TW |
Appears in Collections: | 電機工程學系
Files in This Item:
File | Size | Format |
---|---|---|---|
ntu-98-1.pdf (currently not authorized for public access) | 9.12 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
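The abstract's core machinery, Bayesian filtering (here, a particle filter) for estimating a target's state from noisy 2-D image measurements, can be illustrated with a minimal sketch. This is a hypothetical bootstrap particle filter, not the dissertation's implementation: the random-walk motion model, the Gaussian measurement likelihood, and all parameter values are assumptions for illustration only.

```python
# Minimal bootstrap particle filter for 2-D target tracking: an illustrative
# sketch of the Bayesian filtering framework described in the abstract.
import numpy as np

def particle_filter_step(particles, weights, measurement,
                         motion_std=1.0, meas_std=2.0, rng=None):
    """One predict-update-resample cycle on (N, 2) position particles."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    # Predict: propagate each particle with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: reweight by the Gaussian likelihood of the noisy 2-D measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights = weights / weights.sum()
    # Resample when the effective sample size degenerates below n/2.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

rng = np.random.default_rng(0)
particles = rng.uniform(0.0, 100.0, size=(500, 2))  # uniform prior over image
weights = np.full(500, 1.0 / 500)
true_pos = np.array([40.0, 60.0])
for _ in range(20):                                  # repeated noisy observations
    z = true_pos + rng.normal(0.0, 2.0, 2)
    particles, weights = particle_filter_step(particles, weights, z, rng=rng)
estimate = np.average(particles, axis=0, weights=weights)
```

After a few cycles the particle cloud concentrates around the measured position, so the weighted mean `estimate` approaches the (unknown) true target location despite the measurement noise.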