Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/42473
Full metadata record (DC field: value [language])
dc.contributor.advisor: 周瑞仁 [zh_TW]
dc.contributor.author: Chung-Cheng Huang [en]
dc.contributor.author: 黃忠政 [zh_TW]
dc.date.accessioned: 2021-06-15T01:14:23Z
dc.date.available: 2012-07-30
dc.date.copyright: 2009-07-30
dc.date.issued: 2009
dc.date.submitted: 2009-07-28
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/42473
dc.description.abstract: 本研究設計一個階層式手勢輸入介面,提供使用者利用內建有加速度計的遙控器,以有效而機動的方式來控制機器蛇。階層式手勢輸入介面主要包括手勢辨識及階層式手勢輸入。本研究主要使用加權互相關分析進行辨識,可以辨識控制機器蛇步態所需要的12種手勢和機器蛇形態所需的3種手勢,目前平均辨識率為94.89%,和十個阿拉伯數字0~9及14個英文字母,目前平均辨識率可達93.47%。本研究利用中樞模式產生器模型(CPG)產生連結型機器蛇的步態,除了可用12種手勢來控制已預設好的機器蛇步態,也可以使用這些阿拉伯數字及英文字母來修正機器蛇的行走速度及轉彎角度。其中英文字母代表機器蛇運動參數類別或階層,阿拉伯數字代表參數值。使用者可以用簡單、方便的方式控制機器蛇,只要依參數類別輸入適當的數值即可。本研究所提出的辨識方法除了可辨識上述12種控制機器蛇步態的手勢、3種控制機器蛇形態的手勢,阿拉伯數字與英文字母之外,亦可允許使用者自訂手勢,自訂的手勢只需要輸入30個手勢樣本即可加入訓練,建成手勢資料庫。本研究不只提供機器蛇與操作者一個有用的互動介面,也可應用在一般的消費性電子產品、工業用機器或機電設備的人機介面上。 [zh_TW]
dc.description.abstract: This study designs a hierarchical gesture input interface, based on the accelerometers in a Nintendo Wii Remote, to make the control of snake robots more mobile and efficient. The interface consists of two parts: gesture recognition and hierarchical input. Weighted cross-correlation analysis is used to recognize the user's gestures, which comprise twelve snake-motion gestures, three shape gestures, the Arabic numerals "0" through "9", and fourteen English letters. The average recognition rate reaches 94.89% for the motion and shape gestures and 93.47% for the numerals and letters. Under the hierarchical input scheme, users can directly invoke predefined snake-robot gaits, generated by a central pattern generator (CPG) model, with the twelve motion gestures and select robot shapes with the three shape gestures; they can also adjust the robot's speed and turning direction by entering letters and numerals, where a letter denotes a parameter category or hierarchical level and a numeral gives the parameter value. These gestures are sufficient for CPG-based control of the snake robot. The system also supports user-defined gestures: supplying 30 samples of a new gesture during the training phase is enough to add it to the gesture database. The resulting human-snake-robot interface can likewise be applied to other human-machine interaction devices, such as consumer electronics, industrial machines, and other mechatronic equipment. [en]
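As a rough illustration of the recognition idea described in the abstract, the sketch below scores a 3-axis accelerometer recording against stored gesture templates using a weighted sum of per-axis normalized cross-correlation peaks. It is a minimal sketch, not the thesis's implementation: the function names, the equal per-axis weights, and the use of one template per gesture class are assumptions made here purely for illustration.

```python
import numpy as np

def ncc_peak(a, b):
    """Peak of the normalized cross-correlation between two 1-D signals."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return np.correlate(a, b, mode="full").max() / min(len(a), len(b))

def gesture_score(sample, template, axis_weights=(1.0, 1.0, 1.0)):
    """Weighted sum of per-axis correlation peaks; `sample` and `template`
    are (N, 3) arrays whose columns are the x, y and z accelerometer axes."""
    total = sum(w * ncc_peak(sample[:, k], template[:, k])
                for k, w in enumerate(axis_weights))
    return total / sum(axis_weights)

def recognize(sample, templates):
    """Return the label (motion gesture, shape gesture, digit or letter)
    of the best-matching stored template."""
    return max(templates, key=lambda label: gesture_score(sample, templates[label]))
```

In practice each template could be, for instance, the average of the 30 training samples collected for that gesture, with all recordings resampled to a common length before correlation.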
dc.description.provenance: Made available in DSpace on 2021-06-15T01:14:23Z (GMT). No. of bitstreams: 1. ntu-98-R95631014-1.pdf: 656142 bytes, checksum: db6e9fe28d3f30fa91303a763879c53d (MD5). Previous issue date: 2009 [en]
dc.description.tableofcontents:
Oral Examination Committee Certification i
Acknowledgements ii
Abstract (in Chinese) iv
Abstract v
Table of Contents vii
Figures ix
Tables x
Chapter 1 Introduction 1
Chapter 2 Literature Review 3
Chapter 3 Materials and Methods 8
3.1 System overview 8
3.2 Accelerometer-based controllers 9
3.3 Gesture recognition algorithm 10
3.3.1 Training stage 11
3.3.1.1 Preprocessing 12
3.3.1.2 User defined gestures 13
3.3.2 Testing stage 13
3.3.2.1 Weighted cross-correlation analysis 14
3.3.2.2 Gesture recognition 16
3.4 Hierarchical input structure 17
3.5 Snake robots 23
3.6 Communication between personal computer and snake robots 24
Chapter 4 Results and Discussions 29
4.1 Implementation of gesture interface for remote control of snake robots 29
4.2 User-defined gestures 38
Chapter 5 Conclusions 40
References 41
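To make the gait generation and hierarchical input described in the abstract (and detailed in Sections 3.4 and 3.5 of the thesis) more concrete, here is a minimal, hypothetical sketch, not the CPG model or the parameter mapping actually used in the thesis. Joint angles follow a phase-lagged sine wave, a common simplification of CPG output for serpentine gaits, and a letter gesture selects which gait parameter the following digit gesture sets; the letter-to-parameter mapping and the scale factors are invented here purely for illustration.

```python
import numpy as np

def serpentine_joint_angles(t, n_joints=8, amplitude=0.5,
                            frequency=1.0, phase_lag=np.pi / 4, turn_bias=0.0):
    """Joint angles (rad) at time t: each joint follows a sine wave with a
    fixed phase lag to its neighbour; a constant bias steers the robot."""
    i = np.arange(n_joints)
    return amplitude * np.sin(2 * np.pi * frequency * t + i * phase_lag) + turn_bias

# Hypothetical hierarchical input: a letter gesture picks the parameter
# category, and the following digit gesture supplies its value.
LETTER_TO_PARAM = {"S": "frequency", "A": "amplitude", "T": "turn_bias"}
DIGIT_SCALE = {"frequency": 0.2, "amplitude": 0.1, "turn_bias": 0.05}

def apply_gesture_pair(gait_params, letter, digit):
    """Update the gait parameter selected by `letter` to the scaled `digit`."""
    name = LETTER_TO_PARAM[letter]
    gait_params[name] = digit * DIGIT_SCALE[name]
    return gait_params
```

For example, the gesture pair ("S", 3) would set the frequency to 0.6 under this made-up scale; the actual parameter hierarchy is the one defined in Section 3.4 of the thesis.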
dc.language.iso: en
dc.subject: 加速度計 [zh_TW]
dc.subject: 加權互相關分析 [zh_TW]
dc.subject: 機器蛇 [zh_TW]
dc.subject: 手勢介面 [zh_TW]
dc.subject: 中樞模式產生器模型 [zh_TW]
dc.subject: Accelerometers [en]
dc.subject: Weighted cross correlation approach [en]
dc.subject: Snake robots [en]
dc.subject: Gesture interface [en]
dc.subject: Central pattern generator (CPG) [en]
dc.title: 以加速度計感應手勢進行機器蛇之遙控 [zh_TW]
dc.title: Accelerometer-based Gesture Interface for Remote Control of Snake Robots [en]
dc.type: Thesis
dc.date.schoolyear: 97-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 黃緒哲, 莊勝雄, 葉仲基
dc.subject.keyword: 加速度計, 中樞模式產生器模型, 手勢介面, 機器蛇, 加權互相關分析 [zh_TW]
dc.subject.keyword: Accelerometers, Central pattern generator (CPG), Gesture interface, Snake robots, Weighted cross correlation approach [en]
dc.relation.page: 45
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2009-07-29
dc.contributor.author-college: 生物資源暨農學院 [zh_TW]
dc.contributor.author-dept: 生物產業機電工程學研究所 [zh_TW]
Appears in Collections: 生物機電工程學系

Files in this item:
ntu-98-1.pdf (640.76 kB, Adobe PDF): access restricted, not publicly available


Except where otherwise noted, items in this repository are protected by copyright, with all rights reserved.
