Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/72056
Full metadata record
DC Field | Value | Language
dc.contributor.advisor: 周瑞仁 (Jui Jen Chou)
dc.contributor.author: Szu-Yu Lin [en]
dc.contributor.author: 林思妤 [zh_TW]
dc.date.accessioned: 2021-06-17T06:21:08Z
dc.date.available: 2021-08-19
dc.date.copyright: 2018-08-19
dc.date.issued: 2018
dc.date.submitted: 2018-08-19
dc.identifier.citation:
1. 郭哲男. 2015. Application of ZMP to stable stair climbing of a claw-wheel robot. Master's thesis. Taipei: Graduate Institute of Bio-Industrial Mechatronics Engineering, National Taiwan University.
2. 黃峻逸. 2015. A claw-wheel transformable robot for reconnaissance in amphibious environments. Master's thesis. Taipei: Graduate Institute of Bio-Industrial Mechatronics Engineering, National Taiwan University.
3. 楊力行. 2013. Development of a vehicle with transformable wheel and claw structures. Master's thesis. Taipei: Graduate Institute of Bio-Industrial Mechatronics Engineering, National Taiwan University.
4. 潘立翰. 2016. Claw-mode stair-climbing performance and wheel-mode control strategies of the claw-wheel transformable robot. Master's thesis. Taipei: Graduate Institute of Bio-Industrial Mechatronics Engineering, National Taiwan University.
5. Abeywardena, D., Z. Wang, S. Kodagoda, and G. Dissanayake. 2013. Visual-inertial fusion for quadrotor micro air vehicles with improved scale observability. IEEE International Conference on Robotics and Automation, pp. 3133-3138. Karlsruhe, Germany.
6. Abeywardena, D., Z. Wang, G. Dissanayake, S. L. Waslander, and S. Kodagoda. 2014. Model-aided state estimation for quadrotor micro air vehicles amidst wind disturbances. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4813-4818. Chicago, Illinois, USA.
7. Alejo, D., F. Caballero, and L. Merino. 2017. RGBD-based robot localization in sewer networks. IEEE/RSJ International Conference on Intelligent Robots and Systems. Vancouver, Canada.
8. Amanatiadis, A., A. Gasteratos, S. Papadakis, and V. Kaburlasos. 2010. Image stabilization in active robot vision. In: Robot Vision, pp. 261-274.
9. Bloesch, M., M. Hutter, M. A. Hoepflinger, S. Leutenegger, C. Gehring, C. D. Remy, and R. Siegwart. 2012. State estimation for legged robots: Consistent fusion of leg kinematics and IMU. Robotics: Science and Systems Conference.
10. Buehler, M., U. Saranli, and D. E. Koditschek. 2002. Single actuator per leg robotic hexapod. United States Patent No. 6,481,513.
11. Canny, J. 1986. A computational approach to edge detection. IEEE Transactions on Pattern Analysis and Machine Intelligence. 8(6): 679-698.
12. Chen, S. C., K. J. Huang, W. H. Chen, S. Y. Shen, C. H. Li, and P. C. Lin. 2014. Quattroped: A leg-wheel transformable robot. IEEE/ASME Transactions on Mechatronics. 19(2): 730–742.
13. Chen, S. C., K. J. Huang, C. H. Li, and P. C. Lin. 2011. Trajectory planning for stair climbing in the leg–wheel hybrid mobile robot Quattroped. IEEE International Conference on Robotics and Automation, pp. 1229–1234. Shanghai, China.
14. Chen, W. H., H. S. Lin, and P. C. Lin. 2014. TurboQuad: A leg-wheel transformable robot using bio-inspired control. IEEE International Conference on Robotics and Automation, pp. 2090. Hong Kong, China.
15. Chen, W. H., H. S. Lin, Y. M. Lin, and P. C. Lin. 2017. TurboQuad: A novel leg-wheel transformable robot with smooth and fast behavioral transitions. IEEE Transactions on Robotics (in press).
16. Chou, J. J., and L. S. Yang. 2013. Innovative design of a claw-wheel transformable robot. IEEE International Conference on Robotics and Automation, pp. 1337-1342. Karlsruhe, Germany.
17. Duda, R. O., and P. E. Hart. 1972. Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM. 15(1): 11-15.
18. Eich, M., F. Grimminger, and F. Kirchner. 2008. A versatile stair-climbing robot for search and rescue applications. International Workshop on Safety, Security and Rescue Robotics, pp. 35-40. Sendai, Japan.
19. Endo, G., and S. Hirose. 1996. Study on roller-walker (Basic characteristics and its control). IEEE International Conference on Robotics and Automation, pp. 3265-3270. Minneapolis, Minnesota, USA.
20. Farnebäck, G. 2003. Two-frame motion estimation based on polynomial expansion. Scandinavian Conference on Image Analysis, pp. 363-370. Halmstad, Sweden.
21. Fischler, M. A., and R. C. Bolles. 1981. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM. 24(6): 381-395.
22. Grotzinger, J. P., J. Crisp, A. R. Vasavada, R. C. Anderson, C. J. Baker, R. Barry, D. F. Blake, P. Conrad, K. S. Edgett, and B. Ferdowski. 2012. Mars Science Laboratory mission and science investigation. Space Science Reviews. 170(1-4): 5-56.
23. Grundmann, M., V. Kwatra, and I. Essa. 2011. Auto-directed video stabilization with robust L1 optimal camera paths. IEEE Conference on Computer Vision and Pattern Recognition, pp. 225-232. Colorado Springs, Colorado, USA.
24. Harris, C., and M. Stephens. 1988. A combined corner and edge detector. Alvey Vision Conference, pp. 147-151. Manchester, UK.
25. Herbert, S. D., A. Drenner, and N. Papanikolopoulos. 2008. Loper: A quadruped-hybrid stair climbing robot. International Conference on Robotics and Automation, pp. 799-804. Pasadena, California, USA.
26. Hong, D., and D. Laney. 2005. Kinematic analysis of a novel rimless wheel with independently actuated spokes. ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, pp. 1-5. Long Beach, California, USA.
27. Huang, C. Y., C. N. Kuo, L. H. Pan, S. Y. Lin, and J. J. Chou. 2017. Claw-wheel: A transformable robot for search and investigation in amphibious environment. IEEE/RSJ International Conference on Intelligent Robots and Systems. Vancouver, Canada.
28. Kakogawa, A., Y. Komurasaki, and S. Ma. 2017. Anisotropic shadow-based operation assistant for a pipeline-inspection robot using a single illuminator and camera. IEEE/RSJ International Conference on Intelligent Robots and Systems. Vancouver, Canada.
29. Kim, Y. S., G. P. Jung, H. Kim, K. J. Cho, and C. N. Chu. 2014. Wheel Transformer: A wheel-leg hybrid robot with passive transformable wheels. IEEE Transactions on Robotics. 30(6): 1487-1498.
30. Lamża, A., and Z. Wróbel. 2012. New efficient method of digital video stabilization for in-car camera, International Conference on Multimedia Communications, Services and Security (MCSS), pp. 180-187. Krakow, Poland.
31. Lin, P. C., H. Komsuoglu, and D. E. Koditschek. 2006. Sensor data fusion for body state estimation in a hexapod robot with dynamical gaits. IEEE Transactions on Robotics. 22(5): 932-943.
32. Lindemann, R. A., and C. J. Voorhees. 2005. Mars Exploration Rover mobility assembly design, test and performance. IEEE International Conference on Systems, Man and Cybernetics. Waikoloa, Hawaii, USA.
33. Lindemann, R. A., D. B. Bickler, B. D. Harrington, G. M. Ortiz, and C. J. Voorhees. 2006. Mars exploration rover mobility development. IEEE Robotics & Automation Magazine. 13(2): 19-26.
34. Madgwick, S. O. H., A. J. L. Harrison, and R. Vaidyanathan. 2011. Estimation of IMU and MARG orientation using a gradient descent algorithm. IEEE International Conference on Rehabilitation Robotics. Zurich, Switzerland.
35. Moore, E. Z., D. Campbell, F. Grimminger, and M. Buehler. 2002. Reliable stair climbing in the simple hexapod ‘RHex’. IEEE International Conference on Robotics and Automation, pp. 2222-2227. Washington DC, USA.
36. Pan, L. H., C. N. Kuo, C. Y. Huang, and J. J. Chou. 2016. The Claw-wheel transformable hybrid robot with reliable stair climbing and high maneuverability. IEEE International Conference on Automation Science and Engineering, pp. 233-238. Fort Worth, Texas, USA.
37. Quinn, R. D., J. T. Offi, D. A. Kingsley, and R. E. Ritzmann. 2002. Improved mobility through abstracted biological principles. International Conference on Intelligent Robots and Systems, pp. 2652-2657. Lausanne, Switzerland.
38. Reinstein, M., and M. Hoffmann. 2011. Dead reckoning in a dynamic quadruped robot: Inertial navigation system aided by a legged odometer. IEEE International Conference on Robotics and Automation. Shanghai, China.
39. Saranli, U., M. Buehler, and D. E. Koditschek. 2001. RHex: A simple and highly mobile hexapod robot. International Journal of Robotics Research. 20(7): 616-631.
40. Shen, S. Y., C. H. Li, C. C. Cheng, J. C. Lu, S. F. Wang, and P. C. Lin. 2009. Design of a leg–wheel hybrid mobile platform. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4682-4687. St. Louis, Missouri.
41. Shi, J., and C. Tomasi. 1994. Good features to track. IEEE Conference on Computer Vision and Pattern Recognition, pp. 593-600. Seattle, Washington, USA.
42. Smyth, A., and M. Wu. 2007. Multi-rate Kalman filtering for the data fusion of displacement and acceleration response measurements in dynamic system monitoring. Elsevier Journal of Mechanical Systems and Signal Processing. 21: 706-723.
43. Vogel, A. R., K. N. Kaipa, G. M. Krummel, H. A. Bruck, and S. K. Gupta. 2014. Design of a compliance assisted quadrupedal amphibious robot. IEEE International Conference on Robotics and Automation, pp. 2378-2383. Hong Kong, China.
44. Wilcox, B., T. Litwin, J. Biesiadecki, J. Matthews, M. Heverly, J. Morrison, J. Townsend, N. Ahmad, A. Sirota, and B. Cooper. 2007. ATHLETE: A cargo handling and manipulation robot for the moon. Journal of Field Robotics. 24(5): 421-434.
45. Weiss, S., and R. Siegwart. 2011. Real-time metric state estimation for modular vision-inertial systems. IEEE International Conference on Robotics and Automation, pp. 4531-4537. Shanghai, China.
46. Weiss, S., M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart. 2012. Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. IEEE International Conference on Robotics and Automation, pp. 957-964. Saint Paul, Minnesota, USA.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/72056
dc.description.abstract: 本研究開發一用於攀爬越障載具「輪爪機器人」的影像穩定系統。「輪爪機器人」係為了勘查危險、人員難以進入之情境所設計,如危樓、災害現場、野外環境等。輪爪機器人具有以下幾項特色:「摺疊轉換機構」使得輪爪機器人可切換於兩種不同的運動模式之間,分別為適合快速移動於平坦地面的「輪式」運動模式,與利於攀爬、跨越崎嶇地形或階梯的「爪式」運動模式。除此之外,輪爪機器人的優勢尚有機身結構簡單、致動器數量少,使得系統和控制架構相對簡單。
輪爪機器人具有兩個主要的功能性。首先,為使得機器人能夠跨越、進入不同探勘現場與情境,機器人必須具備的第一個功能性就是「機動性」。輪爪機器人經本研究團隊歷經數年開發及改善,已成功使機器人具備可克服不同環境,如平地、崎嶇地形、階梯、水域環境等的機動性。當機器人進入這些探勘現場後,機器人需要具備的第二個功能性就是「影像功能」。我們透過影像裝置如攝影機來捕捉、回傳現場的畫面至後端的工作站與人員,輔助後端人員進行遠端遙控機器人的工作。
但是由於探勘現場的隨機地貌、障礙物、以及輪爪形狀所造成的影響,由架設在機器人上的影像裝置所拍攝的原始畫面會有相當程度的晃動,因此在傳回影像資訊給後端工作站與人員之前,需要先進行影像的穩定。
因此,本研究開發一結合姿態感測與畫面特徵點追蹤之即時影像穩定系統。影像穩定架構大致上可分為兩個階段,第一階段是針對「可預測」的機身起伏與角度,藉由感測器回授、計算機身姿態來進行影像的視角補償;第二階段則是針對「難以掌握」的隨機晃動如地形之零碎起伏、機身機械誤差等,我們透過畫面特徵點追蹤的方式進一步穩定回傳之畫面。
第一階段的影像穩定,係透過馬達編碼器、慣性量測單元(IMU)之感測器回授、搭配機身幾何規格、姿態角函數、運動特性與卡曼濾波演算法的計算,求得裝設於後機身中央的攝影機大致的位置與角度,藉此將能夠量測的視角變動進行補償。
第二階段,關於無法預測的隨機晃動,則是透過捕捉畫面中的特徵點,並且在連續的影像中追蹤這些特徵點的位置與相對移動量,以RANSAC演算法計算出畫面內容的「運動場」。計算出運動場後,我們透過在畫面中加入一個「裁切框」,並且透過順著運動場移動裁切框位置的方式,減少裁切框──也就是最終的輸出影像──中各特徵點、以及畫面內容的相對移動與晃動,藉此達到影像穩定的目的。
本研究已成功建立輪爪機器人的即時影像穩定系統,與機身機電系統的改良,加裝影像裝置、慣性量測單元與馬達控制模組以符合任務需求。從結論中得知,在多種實驗情境下,本研究開發之即時影像穩定系統可有效將機器人所拍攝到之影像震盪幅度降低。除此之外,此系統亦可應用於其他類似之具有感測器回授與影像裝置之移動載具系統。
zh_TW
dc.description.abstract: This research develops a computer-vision-based video stabilization system, aided by an IMU (inertial measurement unit), for the stair-climbing, highly maneuverable “Claw-Wheel” robot, which is intended for search and rescue in dangerous or disaster sites.
The Claw-Wheel Robot is designed for search missions in scenarios such as unsafe buildings, disaster sites, and wilderness. It features a “folding transformation mechanism” that enables it to transform between two motion modes intended for different scenarios: the “wheel mode,” designed for moving rapidly across flat ground, and the “claw mode,” which enables the robot to climb rough terrain and stairs. Moreover, the Claw-Wheel Robot is structurally simple and uses few actuators, which keeps its system and control architecture relatively simple.
The Claw-Wheel Robot has two major functions. The first is mobility, which enables the robot to enter disaster or dangerous sites. Over several years, our research team has developed this mobility to the point where the robot can maneuver across various natural and artificial terrains, including flat ground, rugged terrain, stairs, and amphibious environments. After reaching a site, the robot relies on its second function: capturing and relaying image information through a video device to aid operators in remote control of the robot.
However, because of the uncertainty of the terrain and the geometric configuration of the climbing claws, the raw footage captured by the onboard video device shakes considerably and requires stabilization before being relayed. We therefore combine state estimation and video stabilization techniques to provide stable imagery.
This research develops a real-time video stabilization system built on robot pose estimation and feature tracking. The stabilization framework is divided into two stages. The first stage removes the predictable perspective deviation between frames using sensor feedback. The second stage addresses arbitrary shaking and other unpredictable effects; during this stage we further stabilize the video by tracking “feature points” in the captured image sequence over time.
The first stage of video stabilization is achieved by estimating the pose of the camera, which is mounted at the center of the rear body. The algorithm combines sensor feedback from the IMU and motor encoders with the robot's geometric configuration, claw pose-angle functions, motion characteristics, and a Kalman filter. With these, the camera's position and orientation can be approximated, and the measurable perspective variation compensated.
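As a rough illustration of this first stage (a minimal sketch, not the thesis's implementation; the sampling period and noise parameters are invented), a one-dimensional Kalman filter can fuse the gyroscope rate with an accelerometer-derived angle to track a single camera attitude angle:

```python
# Illustrative 1-D Kalman filter: predict with the gyroscope rate,
# correct with the accelerometer-derived pitch angle.
def kalman_pitch(gyro_rates, accel_angles, dt=0.02, q=1e-4, r=1e-2):
    """Estimate camera pitch over time.

    gyro_rates: angular rates in rad/s; accel_angles: pitch measurements
    in rad derived from the gravity vector. q, r: process/measurement noise.
    """
    theta, p = accel_angles[0], 1.0   # initial state and covariance
    estimates = []
    for omega, z in zip(gyro_rates, accel_angles):
        # Predict: integrate the gyro rate, grow the covariance.
        theta += omega * dt
        p += q
        # Update: correct toward the accelerometer measurement.
        k = p / (p + r)               # Kalman gain
        theta += k * (z - theta)
        p *= (1.0 - k)
        estimates.append(theta)
    return estimates
```

The estimated angle would then parameterize the perspective correction applied to each frame.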
The unpredictable, arbitrary motion is handled in the second stage. We detect “feature points” and track them over consecutive frames. After computing the displacement of corresponding points between consecutive frames, we apply the RANSAC algorithm to obtain an accurate “motion field” for the image content. Finally, we apply an output “cropping window” that is moved along the motion field. Inside this cropping window, which becomes the output video, the relative positions of the feature points and frame content are kept constant, reducing shake and stabilizing the video sequence.
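A minimal sketch of the second stage's robust motion estimate (illustrative only; the thesis presumably estimates a richer motion model than pure translation, and all parameter values here are invented):

```python
# RANSAC over feature-point displacements: pick a robust global motion
# vector, then shift the crop window along it to cancel the shake.
import random

def ransac_translation(flows, iters=100, tol=2.0, seed=0):
    """flows: non-empty list of (dx, dy) feature displacements between frames."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        dx, dy = rng.choice(flows)                 # hypothesis from one sample
        inliers = [(fx, fy) for fx, fy in flows
                   if abs(fx - dx) + abs(fy - dy) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refit on the inliers: their mean displacement is the motion field.
    n = len(best_inliers)
    return (sum(f[0] for f in best_inliers) / n,
            sum(f[1] for f in best_inliers) / n)

def move_crop(crop_xy, motion):
    """Shift the cropping window along the motion field."""
    return (crop_xy[0] + motion[0], crop_xy[1] + motion[1])
```

Outlier displacements (e.g. from a moving object in the scene) are voted out, so the crop window follows the dominant content motion rather than the noise.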
In this research we have successfully built the real-time video stabilization framework and system for the Claw-Wheel robot, and we have upgraded the onboard mechatronics by installing a video device, an IMU, and motor control modules to meet mission requirements. In several experimental scenarios, the real-time video stabilization system significantly reduces shaking in the captured video. Moreover, the system can be adapted to similar mobile platforms with appropriate motion-sensor feedback and video devices.
en
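One plausible way to quantify the claimed shake reduction (a hypothetical metric sketched here; the actual evaluation criterion is defined in the thesis's Section 5.2) is the mean magnitude of the frame-to-frame global motion, compared before and after stabilization:

```python
# Hypothetical evaluation metric: mean frame-to-frame motion magnitude.
import math

def shake_score(motions):
    """motions: per-frame global (dx, dy) displacements of the content."""
    return sum(math.hypot(dx, dy) for dx, dy in motions) / len(motions)

def reduction_percent(raw, stabilized):
    """Percentage drop in shake from the raw to the stabilized sequence."""
    return 100.0 * (1.0 - shake_score(stabilized) / shake_score(raw))
```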
dc.description.provenance: Made available in DSpace on 2021-06-17T06:21:08Z (GMT). No. of bitstreams: 1
ntu-107-R05631009-1.pdf: 8238723 bytes, checksum: d08117129b3d3eaa7efd888c08c9389e (MD5)
Previous issue date: 2018
en
dc.description.tableofcontents:
Acknowledgements i
Abstract (Chinese) ii
Abstract (English) iv
Table of Contents vi
List of Figures ix
List of Tables xiii
List of Symbols xiv
Chapter 1 Introduction 1
Chapter 2 Literature Review 3
2.1 Wheel-Leg Hybrid Obstacle-Surmounting Robots 3
2.1.1 Legs Mounted on Wheels 4
2.1.2 Wheels Mounted at the Ends of Legs 6
2.1.3 Transformable Mechanisms 8
2.1.4 Previous Claw-Wheel Robots 10
2.2 Robot State Estimation 12
2.3 Video Stabilization 15
Chapter 3 Theoretical Model Derivation 21
3.1 Robot Design Concept 21
3.1.1 Claw Motion Mode 21
3.1.2 Wheel Motion Mode 22
3.1.3 Claw-Wheel Mechanism 22
3.2 Robot Geometric Model Analysis 24
3.2.1 Body Specifications 24
3.2.2 Derivation of the Stair-Climbing Geometric Model 25
3.2.3 Torque Requirement Analysis 28
3.3 Robot Motion State 29
3.3.1 Geometric Vector Model 29
3.3.2 Linear Motion Characteristics 31
3.3.3 Introduction to the Kalman Filter 37
3.3.4 Rotational Motion Characteristics and the Kalman Filter Algorithm 44
3.4 Video Stabilization System 48
3.4.1 Video Stabilization Pipeline 48
3.4.2 Camera Perspective and Imaging Concepts 50
3.4.3 Homography and Perspective Transformation 53
3.4.4 Feature Point Tracking and Frame Cropping 56
Chapter 4 Materials and Methods 73
4.1 Mechatronics System Architecture 73
4.1.1 Mechatronic Control System 74
4.1.2 Pose Sensing System 82
4.1.3 Vision Module 84
4.2 Real-Time Image Processing 86
4.2.1 Software System and Development Environment 86
4.2.2 Camera Calibration 86
4.2.3 User Interface 87
Chapter 5 Results and Discussion 89
5.1 Results of Each Processing Stage 89
5.1.1 Kalman Filtering and Perspective Transformation 89
5.1.2 Feature Point Tracking and Frame Cropping 91
5.2 Evaluation Criteria 91
5.3 Application Scenarios 92
5.3.1 Flat Ground 92
5.3.2 Concrete Slope 94
5.3.3 Rugged Terrain 95
5.3.4 Hillside 98
5.3.5 Stairs 100
Chapter 6 Conclusions 105
References 106
dc.language.iso: zh-TW
dc.subject: 越障 [zh_TW]
dc.subject: 探勘 [zh_TW]
dc.subject: 搜救 [zh_TW]
dc.subject: 影像穩定 [zh_TW]
dc.subject: 機器人 [zh_TW]
dc.subject: robot [en]
dc.subject: search [en]
dc.subject: rescue [en]
dc.subject: video stabilization [en]
dc.subject: obstacle surmounting [en]
dc.title: 輪爪機器人回傳影像之穩定 [zh_TW]
dc.title: Stabilization of Acquired Environmental Information for the Claw Wheel Robot [en]
dc.type: Thesis
dc.date.schoolyear: 106-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 顏炳郎 (Ping-Lang Yen), 黃緒哲 (Shiuh-Jer Huang)
dc.subject.keyword: 越障, 探勘, 搜救, 影像穩定, 機器人 [zh_TW]
dc.subject.keyword: obstacle surmounting, search, rescue, robot, video stabilization [en]
dc.relation.page: 109
dc.identifier.doi: 10.6342/NTU201803987
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2018-08-19
dc.contributor.author-college: 生物資源暨農學院 (College of Bio-Resources and Agriculture) [zh_TW]
dc.contributor.author-dept: 生物產業機電工程學研究所 (Graduate Institute of Bio-Industrial Mechatronics Engineering) [zh_TW]
Appears in Collections: Department of Biomechatronics Engineering

Files in This Item:
File | Size | Format
ntu-107-1.pdf (restricted access) | 8.05 MB | Adobe PDF