Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/68851
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林沛群 | |
dc.contributor.author | Chi-Fai Wong | en |
dc.contributor.author | 王智暉 | zh_TW |
dc.date.accessioned | 2021-06-17T02:38:35Z | - |
dc.date.available | 2022-08-31 | |
dc.date.copyright | 2017-08-31 | |
dc.date.issued | 2017 | |
dc.date.submitted | 2017-08-16 | |
dc.identifier.citation | [1] E. Kim. (2016). Amazon's $775 million deal for robotics company Kiva is starting to look really smart [online]. Available: http://www.businessinsider.com/kiva-robots-save-money-for-amazon-2016-6
[2] S. Shead. (2017). Amazon now has 45,000 robots in its warehouses [online]. Available: http://uk.businessinsider.com/amazons-robot-army-has-grown-by-50-2017-1
[3] T. Spendlove. (2013). Amazon's Robotic Order Fulfillment [online]. Available: http://www.engineering.com/DesignerEdge/DesignerEdgeArticles/ArticleID/6880/Amazons-Robotic-Order-Fulfillment.aspx
[4] Swisslog. (2017). CarryPick Mobile System for Efficient Storage and Order Picking [online]. Available: http://www.swisslog.com/carrypick
[5] 華儲物流設備有限公司. (2014). 無人搬運車-A1型AGV [online]. Available: http://www.axis-group.com.tw/ASHE/product_inside.asp?id1=121&prodid=109
[6] C. Ackerman and L. Itti, 'Robot steering with spectral image information,' IEEE Transactions on Robotics, vol. 21, no. 2, pp. 247-251, 2005.
[7] T. B. Moeslund, A. Hilton, and V. Kruger, 'A survey of advances in vision-based human motion capture and analysis,' Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90-126, Nov-Dec 2006.
[8] X. P. Yun and E. R. Bachmann, 'Design, implementation, and experimental results of a quaternion-based Kalman filter for human body motion tracking,' IEEE Transactions on Robotics, vol. 22, no. 6, pp. 1216-1227, 2006.
[9] Y. Yoon, A. Kosaka, and A. C. Kak, 'A new Kalman-filter-based framework for fast and accurate visual tracking of rigid objects,' IEEE Transactions on Robotics, vol. 24, no. 5, pp. 1238-1251, 2008.
[10] E. Petrovic, A. Leu, D. Ristic-Durrant, and V. Nikolic, 'Stereo vision-based human tracking for robotic follower,' International Journal of Advanced Robotic Systems, vol. 10, May 2013, Art. no. 230.
[11] C.-H. Chao, B.-Y. Hsueh, M.-Y. Hsiao, S.-H. Tsai, and T.-H. S. Li, 'Fuzzy target tracking and obstacle avoidance of mobile robots with a stereo vision system,' International Journal of Fuzzy Systems, vol. 11, no. 3, pp. 183-191, 2009.
[12] M. Gupta, S. Kumar, L. Behera, and V. K. Subramanian, 'A novel vision-based tracking algorithm for a human-following mobile robot,' IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 47, no. 7, pp. 1415-1427, 2017.
[13] M. Chueh, Y. Yeung, K. P. C. Lei, and S. S. Joshi, 'Following controller for autonomous mobile robots using behavioral cues,' IEEE Transactions on Industrial Electronics, vol. 55, no. 8, pp. 3124-3132, 2008.
[14] M. F. R. Lee and K. H. E. Lee, 'Autonomous target tracking and following mobile robot,' Journal of the Chinese Institute of Engineers, vol. 36, no. 4, pp. 502-529, 2013.
[15] K. Morioka, J. H. Lee, and H. Hashimoto, 'Human-following mobile robot in a distributed intelligent sensor network,' IEEE Transactions on Industrial Electronics, vol. 51, no. 1, pp. 229-237, 2004.
[16] Y. H. Hu, W. Zhao, and L. Wang, 'Vision-based target tracking and collision avoidance for two autonomous robotic fish,' IEEE Transactions on Industrial Electronics, vol. 56, no. 5, pp. 1401-1410, May 2009.
[17] S. Saripalli, J. F. Montgomery, and G. S. Sukhatme, 'Visually guided landing of an unmanned aerial vehicle,' IEEE Transactions on Robotics and Automation, vol. 19, no. 3, pp. 371-380, 2003.
[18] E. Machida, M. Cao, T. Murao, and H. Hashimoto, 'Human motion tracking of mobile robot with Kinect 3D sensor,' SICE Annual Conference (SICE), 2012, pp. 2207-2211.
[19] E. Babaians, N. K. Korghond, A. Ahmadi, M. Karimi, and S. S. Ghidary, 'Skeleton and visual tracking fusion for human following task of service robots,' RSI International Conference on Robotics and Mechatronics (ICROM), 2015, pp. 761-766.
[20] B. Ilias, S. A. Shukor, S. Yaacob, A. Adom, and M. M. Razali, 'A nurse following robot with high speed Kinect sensor,' ARPN Journal of Engineering and Applied Sciences, vol. 9, no. 12, pp. 2454-2459, 2014.
[21] J. Sales, R. Marín, E. Cervera, S. Rodríguez, and J. Pérez, 'Multi-sensor person following in low-visibility scenarios,' Sensors, vol. 10, no. 12, pp. 10953-10966, 2010.
[22] W. Chung, H. Kim, Y. Yoo, C. B. Moon, and J. Park, 'The detection and following of human legs through inductive approaches for a mobile robot with a single laser range finder,' IEEE Transactions on Industrial Electronics, vol. 59, no. 8, pp. 3156-3166, 2012.
[23] E. J. Jung, J. H. Lee, B. J. Yi, J. Park, S. Yuta, and S. T. Noh, 'Development of a laser-range-finder-based human tracking and control algorithm for a marathoner service robot,' IEEE/ASME Transactions on Mechatronics, vol. 19, no. 6, pp. 1963-1976, 2014.
[24] S. Koo and D.-S. Kwon, 'Recognizing human intentional actions from the relative movements between human and robot,' IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2009, pp. 939-944.
[25] T. Linder, S. Breuers, B. Leibe, and K. O. Arras, 'On multi-modal people tracking from mobile platforms in very crowded and dynamic environments,' IEEE International Conference on Robotics and Automation (ICRA), 2016, pp. 5512-5519.
[26] R. C. Luo, N. W. Chang, S. C. Lin, and S. C. Wu, 'Human tracking and following using sensor fusion approach for mobile assistive companion robot,' 35th Annual Conference of IEEE Industrial Electronics, 2009, pp. 2235-2240.
[27] M. N. A. Bakar, R. Nagarajan, and A. R. M. Saad, 'Development of a doctor following mobile robot with mono-vision based marker detection,' IEEE Applied Power Electronics Colloquium (IAPEC), 2011, pp. 86-91.
[28] Y. Nagumo and A. Ohya, 'Human following behavior of an autonomous mobile robot using light-emitting device,' IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2001, pp. 225-230.
[29] T. Anezaki, K. Eimon, S. Tansuriyavong, and Y. Yagi, 'Development of a human-tracking robot using QR code recognition,' 17th Korea-Japan Joint Workshop on Frontiers of Computer Vision (FCV), 2011, pp. 1-6.
[30] Q. K. Dang and Y. S. Suh, 'Human-following robot using infrared camera,' 11th International Conference on Control, Automation and Systems, 2011, pp. 1054-1058.
[31] T.-H. Li, S.-J. Chang, and W. Ting, 'Fuzzy target tracking control of autonomous mobile robots by using infrared sensors,' IEEE Transactions on Fuzzy Systems, vol. 12, no. 4, pp. 491-501, 2004.
[32] J. Borenstein and Y. Koren, 'The vector field histogram - fast obstacle avoidance for mobile robots,' IEEE Transactions on Robotics and Automation, vol. 7, no. 3, pp. 278-288, 1991.
[33] I. Ulrich and J. Borenstein, 'VFH+: Reliable obstacle avoidance for fast mobile robots,' IEEE International Conference on Robotics and Automation, 1998, vol. 2, pp. 1572-1577.
[34] S. H. A. Mohammad, M. A. Jeffril, and N. Sariff, 'Mobile robot obstacle avoidance by using fuzzy logic technique,' IEEE 3rd International Conference on System Engineering and Technology, 2013, pp. 331-335.
[35] X. Li and B.-J. Choi, 'Design of obstacle avoidance system for mobile robot using fuzzy logic systems,' International Journal of Smart Home, vol. 7, no. 3, pp. 321-328, 2013.
[36] M. Faisal, R. Hedjar, M. Al Sulaiman, and K. Al-Mutib, 'Fuzzy logic navigation and obstacle avoidance by a mobile robot in an unknown dynamic environment,' International Journal of Advanced Robotic Systems, vol. 10, no. 1, p. 37, 2013.
[37] H. Bing-Qiang, C. Guang-Yi, and G. Min, 'Reinforcement learning neural network to the problem of autonomous mobile robot obstacle avoidance,' International Conference on Machine Learning and Cybernetics, 2005, vol. 1, pp. 85-89.
[38] V. Ganapathy, S. C. Yun, and J. Ng, 'Fuzzy and neural controllers for acute obstacle avoidance in mobile robot navigation,' IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2009, pp. 1236-1241.
[39] M. A. Jeffril and N. Sariff, 'The integration of fuzzy logic and artificial neural network methods for mobile robot obstacle avoidance in a static environment,' IEEE 3rd International Conference on System Engineering and Technology (ICSET), 2013, pp. 325-330.
[40] 溫詠仁, '輪型機器人智慧化跟隨與召喚之設計與實現,' Master's thesis, Department of Mechanical Engineering, National Taiwan University, Taipei, 2011.
[41] Vishay Semiconductors. IR Receiver Modules for Data Transmission TSDP341, TSDP343 [online]. Available: http://www.vishay.com/docs/82667/tsdp341.pdf
[42] 蔡佳宏, '距離感測器於輪型機器人之應用,' Master's thesis, Department of Mechanical Engineering, National Taiwan University, Taipei, 2011.
[43] (1998). Infrared Remote Control Transmitter PT2248 Datasheet [online]. Available: http://pdf.datasheetcatalog.com/datasheet/PrincetonTechnologyCorporation/mXsswsw.pdf
[44] (2001). PT2249 Datasheet [online]. Available: http://www.datasheetlib.com/datasheet/724639/pt2249_ptc-princeton-technology-corporation.html
[45] M. Rubenstein, C. Ahler, and R. Nagpal, 'Kilobot: A low cost scalable robot system for collective behaviors,' IEEE International Conference on Robotics and Automation, 2012, pp. 3293-3298.
[46] Wikipedia. (2017). Carrier-sense multiple access with collision avoidance [online]. Available: https://en.wikipedia.org/wiki/Carrier-sense_multiple_access_with_collision_avoidance
[47] Texas Instruments. IR Light-to-Voltage Optical Sensors TSL260, TSL261, TSL262 [online]. Available: http://www.ti.com/general/docs/lit/getliterature.tsp?genericPartNumber=tsl260&fileType=pdf
[48] Y. Koren and J. Borenstein, 'Potential field methods and their inherent limitations for mobile robot navigation,' IEEE International Conference on Robotics and Automation, 1991, pp. 1398-1404. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/68851 | - |
dc.description.abstract | 自動引導車(Automated Guided Vehicle, AGV)是現時常見於倉庫內協助搬運貨物的工具。它們收到程式指示後,會跟隨地上的磁性或光學導引帶行駛至指定的地方取貨/卸貨。不同於傳統軌道,導引帶容易安裝及拆卸,對工作環境影響甚少。因此,自動引導車也適合與工人在同一個工作環境下工作。
然而,自動引導車的活動範圍最終還是受導引帶所限。本研究著重於在自動引導車上增設一個跟隨特定目標的模組,讓自動引導車除了跟隨導引帶外,也具備跟隨工人的能力。本研究開發的新一代跟隨模組沿用低成本的紅外線收發射器作為跟隨人的基礎。經過重新設計後的跟隨裝置,在偵測特定目標的位置表現變得更穩定及準確。除了跟隨特定目標外,模組也具備避障功能。本研究重新設計了避障系統的感測器運用及演算法,讓模組可以在更多複雜的環境亦能同時進行避障及跟隨的任務。最後,本研究設計了自訂的紅外線信號傳輸協定,讓多個跟隨模組能夠被應用在多台自動引導車,並將紅外線信號的干擾減至最低。因此,多台自動引導車能夠跟隨著個別的特定目標。唯有達至這點,跟隨模組方可真正能被應用到真實工作環境內。而本研究亦可被視為將跟隨模組商業化至自動引導車上的基礎。 | zh_TW |
dc.description.abstract | Automatic Guided Vehicles (AGVs) are vehicles capable of transporting goods to designated positions. The most common AGVs follow guide tapes, either magnetic or colored, which act like invisible rails leading them wherever they are instructed. Unlike rails, guide tapes cause minimal disturbance to the working environment and are easy to install or remove, which is why AGVs commonly share a working environment with humans.
Yet the range of an AGV remains constrained by its guide tapes. This thesis studies the feasibility of enabling AGVs to follow humans instead of guide tapes. The human following module (the module) developed previously by our team can be deployed on a two-wheeled vehicle, allowing it to follow a specific target and avoid obstacles at the same time. The fundamental idea of that former design, using low-cost infrared sensors as the main components for human following, is developed further in this thesis. By redesigning the human tracking device, a part of the module, the bearing of the target is tracked with higher stability and accuracy. The sensor system and algorithm for obstacle avoidance are modified to be more adaptive to various complex environments. Customized infrared transmitters are bonded to the infrared receivers on individual AGVs, allowing multiple modules/AGVs to work cohesively in the same environment. Achieving stable human following and obstacle avoidance with low-cost sensors on multiple AGVs can be regarded as a solid foundation for the commercialization of human-following AGVs. | en
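The abstract describes two mechanisms: a pairing ("bonding") between each customized infrared transmitter and the receivers of one specific AGV, so that multiple AGVs ignore each other's targets, and bearing tracking of the target from a multi-chamber infrared receiver device. The full thesis text is not part of this record, so the following is only an illustrative sketch of those two ideas; `IRPacket`, `HumanTracker`, and the chamber bearing angles are hypothetical names and values, not taken from the thesis:

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass(frozen=True)
class IRPacket:
    """One decoded infrared burst (hypothetical packet layout)."""
    transmitter_id: int          # identity encoded by the operator's transmitter
    strengths: Tuple[float, ...]  # signal strength seen by each receiver chamber


class HumanTracker:
    """Tracks only the transmitter it is bonded to; other IDs are ignored."""

    # Illustrative bearings (degrees) of a four-chamber receiver, left to right.
    CHAMBER_BEARINGS = (-45.0, -15.0, 15.0, 45.0)

    def __init__(self, paired_id: int) -> None:
        self.paired_id = paired_id

    def bearing(self, packet: IRPacket) -> Optional[float]:
        # Bonding filter: drop packets from transmitters paired with other AGVs.
        if packet.transmitter_id != self.paired_id:
            return None
        total = sum(packet.strengths)
        if total == 0:
            return None  # target not in view of any chamber
        # Strength-weighted average of chamber bearings approximates the
        # target's bearing relative to the AGV's heading.
        return sum(b * s for b, s in
                   zip(self.CHAMBER_BEARINGS, packet.strengths)) / total
```

With two trackers paired to different IDs, each AGV reacts only to its own operator's transmitter, which is the behavior the abstract attributes to the customized transmission protocol.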
dc.description.provenance | Made available in DSpace on 2021-06-17T02:38:35Z (GMT). No. of bitstreams: 1 ntu-106-R04522837-1.pdf: 7281776 bytes, checksum: 386d02228cee20837f1b0f7b66667515 (MD5) Previous issue date: 2017 | en |
dc.description.tableofcontents | Acknowledgements I
Abstract II
摘要 IV
Table of Contents V
List of Figures VIII
List of Tables XII
Chapter 1 Introduction 1
1.1 Background 1
1.2 Motivation 4
1.3 Literature Survey 6
1.4 Contributions 12
1.5 Thesis Organization 13
Chapter 2 Overview of the Enhanced and Former Design 14
2.1 Methodology of Human Following 14
2.1.1 Introduction of Human Tracking Device 15
2.1.2 Introduction of Obstacles Detection Sensor System 18
2.2 Introduction of Sensors 20
2.2.1 Sensors in the Human Tracking Device 20
2.2.2 Sensors in the Obstacles Detection Sensor System 22
2.3 System Integration 23
2.4 Former Design of the Human Following Module 26
2.4.1 Brief of the Former Sensor Systems and Algorithms 26
2.4.2 Unresolved Problems and Expected Improvements 30
Chapter 3 Enhanced Design of Human Following Module 32
3.1 Design of the Infrared Transmitter 32
3.1.1 Overview 32
3.1.2 Customized Infrared Signal Transmission Protocol 34
3.2 Design of the Human Tracking Device and the Algorithm 42
3.2.1 First Version – Three-chamber Design 42
3.2.2 Final Version – Four-chamber Design 45
3.2.3 Human Tracking Algorithm 49
3.3 Design of the Obstacles Detection Sensor System and the Algorithm 54
3.3.1 Design of the Obstacles Detection Sensor System 54
3.3.2 Algorithm of Obstacles Avoidance 55
3.4 The Flow from Sensors Values to the Output of AGV’s Velocities 64
Chapter 4 Experiment Results 67
4.1 Experimental Setup 69
4.2 Target Bearing Detection Reliability 71
4.3 Human Following Behavior 75
4.3.1 Human Following in U-shaped and S-shaped Paths 75
4.3.2 AGV Going Around a Corner 85
4.4 Avoiding Dynamic Obstacles 89
4.4.1 Obstacles Approaching from the Front 90
4.4.2 Obstacles Approaching from the Side 94
4.5 Passing through a Narrow Corridor 98
4.6 Two AGVs Operating in the Same Environment 102
4.6.1 Two AGVs Facing Each Other 102
4.6.2 Two AGVs Crossing with Each Other 108
4.7 Real Case Scenarios 111
4.8 Experiment Summaries 115
Chapter 5 Conclusions and Future Works 117
5.1 Conclusions 117
5.2 Future Works 119
References 121
Appendix I – List of Symbols 127 | |
dc.language.iso | en | |
dc.title | 多台自動引導車目標跟隨及避障功能之設計與實現 | zh_TW |
dc.title | Design and Implementation of Human Following and Obstacle Avoidance on Multiple Automatic Guided Vehicles | en |
dc.type | Thesis | |
dc.date.schoolyear | 105-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 黃光裕,顏炳郎,連豊力 | |
dc.subject.keyword | 目標跟隨,避障,紅外線,超音波,紅外線信號傳輸, | zh_TW |
dc.subject.keyword | human tracking,obstacle avoidance,infrared,ultrasonic,infrared signal transmission | en |
dc.relation.page | 128 | |
dc.identifier.doi | 10.6342/NTU201703581 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2017-08-17 | |
dc.contributor.author-college | 工學院 | zh_TW |
dc.contributor.author-dept | 機械工程學研究所 | zh_TW |
Appears in Collections: | 機械工程學系
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-106-1.pdf (currently not authorized for public access) | 7.11 MB | Adobe PDF |
Items in this system are protected by copyright, with all rights reserved, unless otherwise indicated.