Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/60876
Full metadata record (DC field [language]: value):
dc.contributor.advisor: 羅仁權
dc.contributor.author [en]: Po-Han Shih
dc.contributor.author [zh_TW]: 施博瀚
dc.date.accessioned: 2021-06-16T10:34:19Z
dc.date.available: 2015-08-20
dc.date.copyright: 2013-08-20
dc.date.issued: 2013
dc.date.submitted: 2013-08-14
dc.identifier.citation:
[1] F. Wang, C. Tang, Y. Ou, and Y. Xu, "A real-time human imitation system," World Congress on Intelligent Control and Automation (WCICA), 2012, pp. 3692-3697.
[2] V. V. Nguyen and J. H. Lee, "Full-body imitation of human motions with Kinect and heterogeneous kinematic structure of humanoid robot," IEEE/SICE International Symposium on System Integration (SII), 2012, pp. 93-98.
[3] D. Lee, C. Ott, Y. Nakamura, and G. Hirzinger, "Physical human robot interaction in imitation learning," IEEE International Conference on Robotics and Automation (ICRA), 2011, pp. 3439-3440.
[4] T. Petric and L. Zlajpah, "Smooth transition between tasks on a kinematic control level: application to self collision avoidance for two Kuka LWR robots," IEEE International Conference on Robotics and Biomimetics (ROBIO), 2011, pp. 162-167.
[5] A. Thobbi and W. Sheng, "Imitation learning of arm gestures in presence of missing data for humanoid robots," IEEE-RAS International Conference on Humanoid Robots (Humanoids), 2010, pp. 92-97.
[6] G. Pons-Moll, A. Baak, T. Helten, M. Müller, H. P. Seidel, and B. Rosenhahn, "Multisensor-fusion for 3D full-body human motion capture," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010, pp. 663-670.
[7] Y. Liu, C. Stoll, J. Gall, H. P. Seidel, and C. Theobalt, "Markerless motion capture of interacting characters using multi-view image segmentation," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011, pp. 1249-1256.
[8] J. B. Cole, D. B. Grimes, and R. P. N. Rao, "Learning full-body motions from monocular vision: dynamic imitation in a humanoid robot," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2007, pp. 240-246.
[9] C. Ott, D. Lee, and Y. Nakamura, "Motion capture based human motion recognition and imitation by direct marker control," IEEE-RAS International Conference on Humanoid Robots (Humanoids), 2008, pp. 399-405.
[10] T. Leyvand, C. Meekhof, Y.-C. Wei, J. Sun, and B. Guo, "Kinect identity: technology and experience," Computer, vol. 44, pp. 94-96, 2011.
[11] J. Smisek, M. Jancosek, and T. Pajdla, "3D with Kinect," IEEE International Conference on Computer Vision Workshops (ICCV Workshops), 2011, pp. 1154-1160.
[12] P. Azad, T. Asfour, and R. Dillmann, "Toward an unified representation for imitation of human motion on humanoids," IEEE International Conference on Robotics and Automation (ICRA), 2007, pp. 2558-2563.
[13] M. Do, P. Azad, T. Asfour, and R. Dillmann, "Imitation of human motion on a humanoid robot using non-linear optimization," IEEE-RAS International Conference on Humanoid Robots (Humanoids), 2008, pp. 545-552.
[14] A. P. Shon, J. J. Storz, and R. P. N. Rao, "Towards a real-time Bayesian imitation system for a humanoid robot," IEEE International Conference on Robotics and Automation (ICRA), 2007, pp. 2847-2852.
[15] T. Kroger, A. Tomiczek, and F. M. Wahl, "Towards on-line trajectory computation," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2006, pp. 736-741.
[16] T. Kroger and J. Padial, "Simple and robust visual servo control of robot arms using an on-line trajectory generator," IEEE International Conference on Robotics and Automation (ICRA), 2012, pp. 4862-4869.
[17] F. Flacco, T. Kroger, A. De Luca, and O. Khatib, "A depth space approach to human-robot collision avoidance," IEEE International Conference on Robotics and Automation (ICRA), 2012, pp. 338-345.
[18] N. Hogan, "Impedance control: an approach to manipulation, part I - theory, part II - implementation, part III - applications," Journal of Dynamic Systems, Measurement, and Control, 1985.
[19] N. Hogan and S. P. Buerger, "Impedance and interaction control," in Robotics and Automation Handbook, ch. 19, CRC Press, 2004.
[20] A. Albu-Schaffer and G. Hirzinger, "Cartesian impedance control techniques for torque controlled light-weight robots," IEEE International Conference on Robotics and Automation (ICRA), 2002.
[21] A. Albu-Schaffer, C. Ott, U. Frese, and G. Hirzinger, "Cartesian impedance control of redundant robots: recent results with the DLR light-weight arms," IEEE International Conference on Robotics and Automation (ICRA), 2003.
[22] C. Ott and Y. Nakamura, "Base force/torque sensing for position based Cartesian impedance control," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2009.
[23] B. Freedman, "Method and system for object reconstruction," US Patent Application US 2010/0118123 A1.
[24] K. Khoshelham, "Accuracy analysis of Kinect depth data," ISPRS Workshop Laser Scanning, vol. 38, pp. 133-138, 2011.
[25] M. R. Andersen, T. Jensen, P. Lisouski, A. K. Mortensen, M. K. Hansen, T. Gregersen, and P. Ahrendt, "Kinect depth sensor evaluation for computer vision applications," Technical Report ECE-TR-6, Department of Engineering, Aarhus University, Denmark, 2012.
[26] J. Shotton, A. Fitzgibbon, M. Cook, T. Sharp, M. Finocchio, R. Moore, A. Kipman, and A. Blake, "Real-time human pose recognition in parts from a single depth image," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011.
[27] Z. Li, W. W. Melek, and C. Clark, "Decentralized robust control of robot manipulators with harmonic drive transmission and application to modular and reconfigurable serial arms," Robotica, vol. 27, no. 2, pp. 291-302, 2009.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/60876
dc.description.abstract [zh_TW, translated]: With the advancement of technology, intelligent service robots will gradually enter our daily lives. To help people handle complex tasks, a robot must possess versatile and dexterous manipulation skills. For humanoid robots in particular, the ability to accomplish tasks with human-like actions is indispensable. To give a robot human-like movements, trajectory planning for the robot arms is the key; however, because multiple degrees of freedom must be handled at the same time, generating human-like trajectories is very complicated. The motivation and purpose of this thesis is therefore to develop a human motion imitation system that enables the robot to produce human-like motions.
Learning from demonstration is an intuitive and effective way for a humanoid robot to learn a variety of human motions: a motion capture system feeds human motion information directly to the robot, avoiding complex trajectory planning. In this thesis we use the Kinect, a vision sensor with depth measurement, to obtain human motion information and convert it into robot arm trajectories in real time. Taking the arm's capabilities and trajectory smoothness into account, we design an on-line trajectory generator with velocity and acceleration limits to obtain smooth trajectories.
For robot arm control, we adopt a Cartesian-space control architecture that computes the force required to move the arm in space, transforms this Cartesian force into joint-space torques by a vector projection method, and drives the arm with torque control. Virtual spring-damper elements implement several functions, including motion following control, virtual walls, and self-collision avoidance: the virtual wall concept restricts the arm's workspace, and self-collision avoidance prevents possible collisions. Since the arm's behavior is determined by the forces produced by the virtual spring-damper elements, we limit the magnitude and rate of change of these forces to keep the arm stable during operation. With all functions of the system well integrated, we successfully demonstrate a dual-arm robot imitating human motion in real time while ensuring stability and safety.
dc.description.abstract [en]: With the advancement of technology, intelligent service robots will gradually join human society and come into our daily lives. To help people deal with complex tasks, the robot must have versatile and flexible manipulation skills. Especially for humanoid robots, the ability to accomplish tasks with human-like actions is indispensable. To allow the robot to have human-like movements, trajectory planning of the robot arm is crucial; however, because multiple degrees of freedom (DOFs) must be handled simultaneously, generating a human-like motion trajectory is very complicated. Therefore, the motivation and purpose of this thesis is to develop a human motion imitation system that makes the robot generate human-like motions.
Learning by demonstration is an intuitive and efficient way to let a humanoid robot learn a variety of human motions. By using a motion capture system, we can provide human motion information to the robot directly, rather than performing complex trajectory planning. In this thesis, we use the Kinect, which is capable of obtaining 3D depth information, to capture human motion and instantly convert the data into robot arm trajectories. Taking into account the capability of the robot arm and the smoothness of the trajectory, we design an on-line trajectory generator that imposes velocity and acceleration limits to obtain a smooth trajectory.
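The abstract only names the technique; purely as an illustration of the idea in this paragraph, a minimal one-axis sketch of such a velocity- and acceleration-limited on-line generator could look like the following (the class name, limits, and control period are hypothetical, not taken from the thesis):

```python
import math

class OnlineTrajectoryGenerator:
    """One-axis on-line trajectory generator with velocity and
    acceleration limits (illustrative sketch only, not the thesis code)."""

    def __init__(self, v_max: float, a_max: float, dt: float):
        self.v_max = v_max  # velocity limit
        self.a_max = a_max  # acceleration limit
        self.dt = dt        # control period in seconds
        self.pos = 0.0      # commanded position
        self.vel = 0.0      # commanded velocity

    def step(self, target: float) -> float:
        """Advance one control cycle toward a (possibly moving) target."""
        error = target - self.pos
        # Largest speed from which the axis can still stop within |error|.
        v_stop = math.sqrt(2.0 * self.a_max * abs(error))
        v_des = math.copysign(min(v_stop, self.v_max), error)
        # Acceleration limit: velocity changes by at most a_max*dt per cycle.
        dv = min(self.a_max * self.dt, max(-self.a_max * self.dt, v_des - self.vel))
        self.vel += dv
        self.pos += self.vel * self.dt
        return self.pos
```

Stepping such a generator at the servo rate, while each Kinect frame merely updates target, keeps the commanded trajectory smooth even when the captured motion is noisy or jumps.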
In terms of robot arm control, we use a Cartesian-based control architecture to compute the Cartesian force required to move the robot arm in space. Through a vector projection method, the Cartesian force is transformed into joint torques, and torque control is then applied to manipulate the robot arm. We use virtual spring-damper elements to implement several functions, including motion following control, virtual walls, and self-collision avoidance. The virtual wall concept restricts the workspace of the robot arm, while self-collision avoidance prevents possible collisions. Since the behavior of the robot arm is determined by the force generated by the virtual spring-damper elements, we limit the magnitude and variation of the force to ensure that the arm remains stable. Because all functions of the system are well integrated, we successfully demonstrate that the dual-arm robot can imitate human motion in real time while guaranteeing stability and safety.
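Again only as a sketch of the control law this paragraph describes: a virtual spring-damper produces a Cartesian force F = K(x_d - x) - D*dx, the force magnitude is saturated for safety, and the force is mapped to joint torques. One standard realization of that mapping is the Jacobian-transpose rule tau = J^T F; whether this matches the thesis's vector projection method exactly is an assumption here, and the gains, force cap, and jacobian argument below are made-up placeholders:

```python
import numpy as np

def spring_damper_torques(x, x_des, dx, jacobian, k=300.0, d=30.0, f_max=40.0):
    """Cartesian virtual spring-damper mapped to joint torques.

    Illustrative sketch only; the gains, the force cap, and the use of a
    plain Jacobian-transpose mapping are assumptions, not the thesis code.

    x, x_des : (3,) current and desired end-effector positions
    dx       : (3,) current end-effector velocity
    jacobian : (3, n) positional Jacobian at the current joint configuration
    """
    # Virtual spring pulls the end-effector to the target; damper adds stability.
    force = k * (np.asarray(x_des) - np.asarray(x)) - d * np.asarray(dx)
    # Limit the force magnitude so the commanded behavior stays safe.
    norm = np.linalg.norm(force)
    if norm > f_max:
        force *= f_max / norm
    # Map the Cartesian force to joint torques (tau = J^T F).
    return jacobian.T @ force
```

A virtual wall or a self-collision constraint can reuse the same element: a repulsive spring-damper is attached between the offending point and the boundary (or the other arm), and its torque contribution is summed into the same command.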
dc.description.provenance [en]: Made available in DSpace on 2021-06-16T10:34:19Z (GMT). No. of bitstreams: 1. ntu-102-R99921011-1.pdf: 3069798 bytes, checksum: 3cfab60db27ddfbf65dac3281255c3d6 (MD5). Previous issue date: 2013.
dc.description.tableofcontents:
Acknowledgements I
Chinese Abstract II
ABSTRACT III
TABLE OF CONTENTS V
LIST OF FIGURES VI
LIST OF TABLES VIII
CHAPTER 1 INTRODUCTION 1
1.1 Motivation and Objectives 1
1.2 Literature Review 2
1.3 Thesis Organization 4
CHAPTER 2 EXPERIMENT HARDWARE SYSTEM 5
2.1 Anthropomorphic Robot Arm 5
2.1.1 Mechanism Design of Robot Arms 6
2.1.2 Kinematics Analysis 10
2.1.3 System Structure 14
2.2 Kinect Vision Sensor 19
CHAPTER 3 HUMAN MOTION IMITATION SYSTEM 25
3.1 Overview of Human Motion Imitation System 25
3.2 Motion Capture System 28
3.3 On-line Trajectory Generator 32
3.4 Motion Following Control 37
3.5 Workspace Boundary Constraint 40
3.6 Self-collision Avoidance 43
3.7 Force Limitation 50
CHAPTER 4 EXPERIMENTAL RESULTS 53
4.1 Experimental Setup 53
4.2 Results and Discussion 54
CHAPTER 5 CONCLUSIONS AND CONTRIBUTIONS 64
5.1 Conclusions and Contributions 64
5.2 Future Work 65
REFERENCES 66
VITA 70
dc.language.iso: en
dc.subject [zh_TW]: 人類動作模仿 (human motion imitation)
dc.subject [zh_TW]: 動作擷取 (motion capture)
dc.subject [zh_TW]: 線上軌跡產生器 (on-line trajectory generator)
dc.subject [zh_TW]: 自我防撞 (self-collision avoidance)
dc.subject [zh_TW]: 人型機器人 (humanoid robot)
dc.subject [en]: on-line trajectory generator
dc.subject [en]: human motion imitation
dc.subject [en]: motion capture
dc.subject [en]: self-collision avoidance
dc.subject [en]: humanoid robot
dc.title [zh_TW]: 俱力量及順應性控制之擬人型雙手臂機器人應用於人類動作模仿 (Anthropomorphic dual-arm robot with force and compliance control applied to human motion imitation)
dc.title [en]: Cartesian Force and Compliance Control of Anthropomorphic Dual Robot Arm for Motion Imitation of Human Being
dc.type: Thesis
dc.date.schoolyear: 101-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 鄭榮偉, 張帆人
dc.subject.keyword [zh_TW]: 人類動作模仿, 動作擷取, 線上軌跡產生器, 自我防撞, 人型機器人 (human motion imitation, motion capture, on-line trajectory generator, self-collision avoidance, humanoid robot)
dc.subject.keyword [en]: human motion imitation, motion capture, on-line trajectory generator, self-collision avoidance, humanoid robot
dc.relation.page: 70
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2013-08-14
dc.contributor.author-college [zh_TW]: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept [zh_TW]: 電機工程學研究所 (Graduate Institute of Electrical Engineering)
Appears in Collections: Department of Electrical Engineering (電機工程學系)

Files in This Item:

File: ntu-102-1.pdf (not authorized for public access)
Size: 3 MB
Format: Adobe PDF


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
