Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/42589
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 黃漢邦 | |
dc.contributor.author | Chia-Chen Lin | en |
dc.contributor.author | 林佳貞 | zh_TW |
dc.date.accessioned | 2021-06-15T01:17:04Z | - |
dc.date.available | 2011-07-29 | |
dc.date.copyright | 2009-07-29 | |
dc.date.issued | 2009 | |
dc.date.submitted | 2009-07-27 | |
dc.identifier.citation | [1] K. Berns and J. Hirth, “Control of facial expressions of the humanoid robot head ROMAN,” Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 3119-3124, October 2006.
[2] K. Berns and T. Braun, “Design concept of a human-like robot head,” Proceedings of the 2005 5th IEEE-RAS International Conference on Humanoid Robots, Tsukuba, pp. 32-37, December 2005.
[3] M. Beszedes and P. Culverhouse, “Comparison of Human and Automatic Facial Emotions and Emotion Intensity Levels Recognition,” Proceedings of the 5th International Symposium on Image and Signal Processing and Analysis, Istanbul, pp. 429-434, September 2007.
[4] C. Breazeal and B. Scassellati, “How to build robots that make friends and influence people,” Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, Kyongju, pp. 858-863, October 1999.
[5] K. C. Chang, “Design and Implementation for the Tracking System of the Robot Head with Emotional Expression,” Master Thesis, Department of Electrical Engineering, National Chi Nan University, 2008.
[6] C. Y. Cheng, “Design of Robot Head with Facial Expression,” Master Thesis, Department of Mechanical Engineering, National Taiwan University of Science and Technology, 2007.
[7] Y. Cheon and Olaworks, “A Natural Facial Expression Recognition Using Differential-AAM and k-NNS,” Tenth IEEE International Symposium on Multimedia, Berkeley, CA, pp. 220-227, December 2008.
[8] P. W. Chiu, “Facial Expression Analysis by Computer Vision Techniques,” Master Thesis, Department of Mechanical Engineering, National Taiwan University of Science and Technology, 2006.
[9] H. C. Choi and S. Y. Oh, “Realtime Facial Expression Recognition using Active Appearance Model and Multilayer Perceptron,” SICE-ICASE International Joint Conference 2006, Bexco, Busan, Korea, pp. 5924-5927, October 2006.
[10] T. F. Cootes, G. J. Edwards, and C. J. Taylor, “Active Appearance Models,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 6, pp. 681-685, June 2001.
[11] P. Ekman, Emotions Revealed: Understanding Faces and Feelings, 1st Edition, PsyGarden Publishing Company, Taipei, Taiwan, 2004.
[12] P. Ekman and W. V. Friesen, The Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press Inc., San Francisco, CA, 1978.
[13] P. Ekman and W. V. Friesen, Facial Action Coding System, Palo Alto, CA: Consulting Psychologists Press, 1978.
[14] T. Hashimoto, S. Hiramatsu, T. Tsuji, and H. Kobayashi, “Development of the Face Robot SAYA for Rich Facial Expressions,” SICE-ICASE International Joint Conference 2006, Bexco, Busan, Korea, pp. 5423-5428, October 2006.
[15] T. Hashimoto, S. Hiramatsu, and H. Kobayashi, “Dynamic Display of Facial Expressions on the Face Robot Made by Using a Life Mask,” 2008 8th IEEE-RAS International Conference on Humanoid Robots, Daejeon, Korea, pp. 521-526, December 2008.
[16] T. Hashimoto and H. Kobayashi, “Study on Natural Head Motion in Waiting State with Receptionist Robot SAYA that has Human-like Appearance,” RIISS '09: IEEE Workshop on Robotic Intelligence in Informationally Structured Space, Nashville, TN, USA, pp. 93-98, April 2009.
[17] M. Hashimoto, C. Yokogawa, and T. Sadoyama, “Development and Control of a Face Robot Imitating Human Muscular Structures,” Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 1855-1860, October 2006.
[18] J. Hirth, N. Schmitz, and K. Berns, “Emotional Architecture for the Humanoid Robot Head ROMAN,” 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, pp. 2150-2155, April 2007.
[19] C. H. Hsu, “Bimodal Emotion Recognition System Using Image and Speech Information,” Master Thesis, Department of Electrical and Control Engineering, National Chiao Tung University, 2006.
[20] K. Itoh, Y. Onishi, S. Takahashi, T. Aoki, K. Hayashi, and A. Takanishi, “Development of face robot to express various face shapes by moving the parts and outline,” Proceedings of the 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, Scottsdale, AZ, pp. 439-444, October 2008.
[21] P. Jaeckel, N. Campbell, and C. Melhuish, “Facial behaviour mapping - From video footage to a robot head,” Robotics and Autonomous Systems, Vol. 56, pp. 1042-1049, 2008.
[22] Y. M. Jao, “A Design of Emotional Expression Robot Head,” Master Thesis, Institute of Precision Mechatronic Engineering, Minghsin University of Science and Technology, 2009.
[23] C. Y. Lin, C. K. Tseng, H. Y. Gu, K. L. Chung, C. S. Fahn, K. J. Lu, and C. C. Chang, “An autonomous singing and news broadcasting face robot,” 2008 8th IEEE-RAS International Conference on Humanoid Robots, Daejeon, Korea, pp. 454-461, December 2008.
[24] C. Y. Lin, “Automatic Facial Expression Recognition System Based on Active Appearance Model,” Master Thesis, Graduate Institute of Automation Technology, National Taipei University of Technology, 2008.
[25] S. Y. Lo, “A Model based Method for Facial Expression Recognition and Simulation,” Master Thesis, Department of Mechanical Engineering, Chung Yuan Christian University, 2008.
[26] C. Martin, U. Werner, and H. Gross, “A real-time facial expression recognition system based on Active Appearance Models using gray images and edge images,” FG '08: 8th IEEE International Conference on Automatic Face & Gesture Recognition, Amsterdam, pp. 1-6, September 2008.
[27] A. Mehrabian, “Communication without words,” Psychology Today, Vol. 2, No. 4, pp. 53-56, 1968.
[28] H. Miwa, T. Okuchi, H. Takanobu, and A. Takanishi, “Development of a New Human-like Head Robot WE-4,” Proceedings of the 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems, Switzerland, pp. 2443-2448, October 2002.
[29] H. Miwa, K. Itoh, M. Matsumoto, M. Zecca, H. Takanobu, S. Roccella, M. Carrozza, P. Dario, and A. Takanishi, “Effective Emotional Expressions with Emotion Expression Humanoid Robot WE-4RII - Integration of Humanoid Robot Hand RCH-1,” Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, pp. 2203-2208, September 2004.
[30] K. L. Moore and A. F. Dalley, Clinically Oriented Anatomy, 5th Edition, Philadelphia: Lippincott Williams & Wilkins, 2006.
[31] J. Oh, D. Hanson, W. Kim, I. Han, J. Kim, and I. Park, “Design of Android type Humanoid Robot Albert HUBO,” Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 1428-1433, October 2006.
[32] S. Park, J. Shin, and D. Kim, “Facial expression analysis with facial expression deformation,” ICPR 2008: 19th International Conference on Pattern Recognition, Tampa, FL, pp. 1-4, December 2008.
[33] S. Park and D. Kim, “Spontaneous facial expression classification with facial motion vectors,” FG '08: 8th IEEE International Conference on Automatic Face & Gesture Recognition, Amsterdam, pp. 1-6, September 2008.
[34] F. S. Su, “Facial Expression Detection System,” Master Thesis, Department of Communications Engineering, National Chung Cheng University, 2004.
[35] F. Tang and B. Deng, “Facial Expression Recognition using AAM and Local Facial Features,” Third International Conference on Natural Computation, Haikou, pp. 632-635, August 2007.
[36] V. Vapnik, The Nature of Statistical Learning Theory, 2nd Edition, New York: Springer, 2000.
[37] P. Viola and M. Jones, “Rapid object detection using a boosted cascade of simple features,” Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 1, pp. I-511-I-518, 2001.
[38] Y. M. Wang, “The Design and Development of Robot Head with Emotional Expression,” Master Thesis, Department of Electrical Engineering, National Chi Nan University, 2008.
[39] W. Weiguo, M. Qingmei, and W. Yu, “Development of the humanoid head portrait robot system with flexible face and expression,” Proceedings of the 2004 IEEE International Conference on Robotics and Biomimetics, Shenyang, China, August 2004.
[40] M. W. Wu, “Automatic Facial Expression Analysis System,” Master Thesis, Department of Computer Science and Information Engineering, National Cheng Kung University, 2003.
[41] Y. Wu, H. Liu, and H. Zha, “Modeling Facial Expression Space for Recognition,” Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1968-1973, August 2005.
[42] Takanishi Laboratory: http://www.takanishi.mech.waseda.ac.jp/
[43] Artificial Intelligence Laboratory: http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html
[44] Hanson Robotics: http://www.hansonrobotics.com/index.html
[45] Kokoro Company: http://www.kokoro-dreams.co.jp/english/index.html
[46] Robot Watch: http://robot.watch.impress.co.jp/cda/news/2009/03/16/1665.html
[47] OpenCV: http://cell.fixstars.com/opencv/index.php/OpenCV_on_the_Cell
[48] AAM-API: http://www.imm.dtu.dk/~aam/
[49] LIBSVM: http://www.csie.ntu.edu.tw/~cjlin/libsvm/
[50] School of Computer Science at CMU: http://www.cs.cmu.edu/afs/cs/project/face/www/facs.htm | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/42589 | - |
dc.description.abstract | The goal of this thesis is to design a robot head with a human-like appearance that can display different facial expressions and recognize expressions through vision.
For expression display, fishing line and fishing net are used to simulate the linear and nonlinear muscle structures of the human face, and a tear mechanism is designed so that the robot head can show a crying expression and shed tears. For expression recognition, a small number of mouth and eyebrow parameters are used; the results show that a high recognition rate can be achieved with only a few parameters. Finally, expression display and expression recognition are combined, and several scenarios are designed to create interaction between humans and the robot. | zh_TW |
dc.description.abstract | Our research goal was to develop a face robot that has a human-like appearance and can display facial expressions. We also implemented vision-based facial expression recognition in order to enable human-robot interaction.
To display facial expressions, we used fishing line and fishing net to produce a muscular structure similar to the linear and nonlinear muscles of the human face. We also designed a mechanism for producing tears so that the face robot can cry like a human being. For facial expression recognition, we used a small set of mouth and eyebrow descriptors to identify facial expressions; the results showed that a few descriptors suffice to achieve a high recognition rate. Finally, we combined facial expression display with facial expression recognition and designed several interaction scenarios, producing a closer and more natural relationship between humans and robots. | en |
dc.description.provenance | Made available in DSpace on 2021-06-15T01:17:04Z (GMT). No. of bitstreams: 1 ntu-98-R97522804-1.pdf: 3358362 bytes, checksum: 6cac2c99572f34b6322261b0a4404fbc (MD5) Previous issue date: 2009 | en |
dc.description.tableofcontents | Acknowledgements I
Abstract (Chinese) II
Abstract III
List of Tables VII
List of Figures VIII
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Contributions 2
1.3 Thesis Organization 3
Chapter 2 Background Knowledge and Relevant Research 5
2.1 Human Facial Expression 5
2.1.1 Expression Muscles 5
2.1.2 Facial Action Coding System 6
2.2 Facial Expression Display by Face Robots 8
2.3 Facial Expression Recognition 19
2.4 Active Appearance Model 20
2.5 Support Vector Machines 21
2.5.1 Linearly Separable Case 23
2.5.2 Non-Linear Case 25
Chapter 3 Face Robot Design 26
3.1 Introduction 26
3.2 Structure of the Face Robot 28
3.2.1 Skin 28
3.2.2 Skeleton 29
3.2.3 Internal Frame 29
3.2.4 Motor 30
3.2.5 Serial Servo Controller 31
3.3 Muscle Structure and Control Points 31
3.4 Mechanism of the Line-driven System 33
3.4.1 Mechanism for Linear Muscles 33
3.4.2 Mechanism for Nonlinear Muscles 38
3.4.3 Line Installation 45
3.5 Mechanism Design 45
3.5.1 Neck 45
3.5.2 Eyeballs and Eyelids 46
3.5.3 Jaw 48
3.5.4 Tears 48
3.5.5 Blushing 49
Chapter 4 Facial Expression Recognition 50
4.1 Introduction 50
4.1.1 Face Detection 51
4.2 Feature Extraction 52
4.3 Descriptor Definition 53
4.3.1 Mouth 53
4.3.2 Eyebrows 57
4.4 Facial Expression Classification 59
4.4.1 Facial Expression Database 59
4.4.2 Training and Testing the Classifier 59
4.4.3 Results 60
4.5 Eyebrow Classification 64
4.5.1 Eyebrow Database 64
4.5.2 Results 65
4.6 Combination of the Results 67
Chapter 5 Implementation and Experiments 70
5.1 Displaying Facial Expressions on a Face Robot 70
5.1.1 Six Typical Facial Expressions 70
5.1.2 Special Facial Expressions: Crying 74
5.1.3 Special Facial Expressions: Blushing 74
5.1.4 Jaw 74
5.2 Facial Expression Interactions 75
5.2.1 Interaction 75
5.2.2 Mimicry 75
Chapter 6 Conclusions and Future Work 78
6.1 Conclusions 78
6.2 Future Work 79
References 80
Appendix I 84 | |
dc.language.iso | en | |
dc.title | Biomimetic Humanoid Robot Head Design and Facial Expression Recognition | zh_TW |
dc.title | A Face Robot Design and Facial Expression Recognition | en |
dc.type | Thesis | |
dc.date.schoolyear | 97-2 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 吳東權,林其禹 | |
dc.subject.keyword | robot, facial expression, facial expression recognition | zh_TW |
dc.subject.keyword | face robot, robot head, facial expression recognition, AAM, SVM | en |
dc.relation.page | 89 | |
dc.rights.note | Paid authorization | |
dc.date.accepted | 2009-07-28 | |
dc.contributor.author-college | College of Engineering | zh_TW |
dc.contributor.author-dept | Graduate Institute of Mechanical Engineering | zh_TW |
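The abstract states that a small set of mouth and eyebrow descriptors is enough to classify facial expressions, and the keywords and reference list name SVM/LIBSVM as the classifier. As a rough illustration only — the descriptor names, the toy training values, and the two expression classes below are all hypothetical, and a minimal hinge-loss linear SVM trained by sub-gradient descent stands in for LIBSVM — the idea can be sketched as:

```python
# Sketch of expression classification from a few geometric descriptors.
# NOTE: descriptor names and values are invented for illustration; the
# thesis derives its descriptors from AAM-fitted facial landmarks and
# classifies them with LIBSVM, not with this toy trainer.

def train_linear_svm(samples, labels, lr=0.1, lam=0.01, epochs=200):
    """Fit w, b on (+1/-1)-labelled descriptor vectors via hinge-loss SGD."""
    dim = len(samples[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:   # inside the margin: hinge gradient + regularizer
                w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:            # correctly classified: regularizer only
                w = [wi * (1 - lr * lam) for wi in w]
    return w, b

def predict(w, b, x):
    """Map the signed SVM score to one of two hypothetical expression labels."""
    return "happy" if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else "angry"

# Hypothetical descriptors: [mouth_width, mouth_height, eyebrow_slope]
train_x = [[0.90, 0.20, 0.10], [0.85, 0.25, 0.05], [0.80, 0.30, 0.15],    # happy
           [0.50, 0.10, -0.60], [0.55, 0.15, -0.55], [0.45, 0.05, -0.50]]  # angry
train_y = [+1, +1, +1, -1, -1, -1]

w, b = train_linear_svm(train_x, train_y)
print(predict(w, b, [0.88, 0.22, 0.08]))   # a sample near the "happy" cluster
print(predict(w, b, [0.50, 0.10, -0.58]))  # a sample near the "angry" cluster
```

In the actual system the descriptor vectors would come from AAM landmark fitting on detected faces, and the six-expression problem would use a multi-class SVM rather than this binary toy.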
Appears in Collections: | Department of Mechanical Engineering
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-98-1.pdf (currently not authorized for public access) | 3.28 MB | Adobe PDF |
All items in the system are protected by copyright, with all rights reserved, unless otherwise indicated.