Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/16145
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 郭柏齡(Po-Ling Kuo) | |
dc.contributor.author | Chun-Fu Kuo | en |
dc.contributor.author | 郭峻輔 | zh_TW |
dc.date.accessioned | 2021-06-07T18:02:45Z | - |
dc.date.copyright | 2020-08-07 | |
dc.date.issued | 2020 | |
dc.date.submitted | 2020-08-03 | |
dc.identifier.citation | [1] K. Fuchs, 'Minimally invasive surgery,' Endoscopy, vol. 34, no. 2, pp. 154-159, 2002.
[2] S. Shetty et al., 'Construct and face validity of a virtual reality–based camera navigation curriculum,' Journal of Surgical Research, vol. 177, no. 2, pp. 191-195, 2012.
[3] S. Voros, G.-P. Haber, J.-F. Menudet, J.-A. Long, and P. Cinquin, 'ViKY robotic scope holder: Initial clinical experience and preliminary results using instrument tracking,' IEEE/ASME Transactions on Mechatronics, vol. 15, no. 6, pp. 879-886, 2010.
[4] P. F. Escobar, J. Knight, M. Kroh, S. Chalikonda, J. Kaouk, and R. Stein, 'Single-port hysterectomy with pelvic lymph node dissection in the porcine model: feasibility and validation of a novel robotic lightweight endoscope positioner,' Gynecological Surgery, vol. 9, no. 1, pp. 97-101, 2012.
[5] E. Abdi, M. Bouri, J. Olivier, and H. Bleuler, 'Foot-controlled endoscope positioner for laparoscopy: Development of the master and slave interfaces,' in 2016 4th International Conference on Robotics and Mechatronics (ICROM), 2016: IEEE, pp. 111-115.
[6] E. Abdi, M. Bouri, E. Burdet, and H. Bleuler, 'Development and comparison of foot interfaces for controlling a robotic arm in surgery,' in 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2018: IEEE, pp. 414-420.
[7] J. Westwood, 'Portable tool positioning robot for telesurgery,' Medicine Meets Virtual Reality 17: NextMed: Design For/the Well Being, vol. 142, p. 438, 2009.
[8] C. A. Nelson, X. Zhang, B. C. Shah, M. R. Goede, and D. Oleynikov, 'Multipurpose surgical robot as a laparoscope assistant,' Surgical Endoscopy, vol. 24, no. 7, pp. 1528-1532, 2010.
[9] X. Zhang, A. Lehman, C. A. Nelson, S. M. Farritor, and D. Oleynikov, 'Cooperative robotic assistant for laparoscopic surgery: CoBRASurge,' in 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2009: IEEE, pp. 5540-5545.
[10] A. Borji, D. Parks, and L. Itti, 'Complementary effects of gaze direction and early saliency in guiding fixations during free viewing,' Journal of Vision, vol. 14, no. 13, pp. 3-3, 2014.
[11] O. Palinko, F. Rea, G. Sandini, and A. Sciutti, 'Robot reading human gaze: Why eye tracking is better than head tracking for human-robot collaboration,' in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016: IEEE, pp. 5048-5054.
[12] K. Kume, N. Sakai, and T. Goto, 'Development of a novel endoscopic manipulation system: the Endoscopic Operation Robot ver. 3,' Endoscopy, vol. 47, no. 9, pp. 815-819, 2015.
[13] S. S. Kommu, P. Rimington, C. Anderson, and A. Rané, 'Initial experience with the EndoAssist camera-holding robot in laparoscopic urological surgery,' Journal of Robotic Surgery, vol. 1, no. 2, pp. 133-137, 2007.
[14] G. Wyeth, 'Demonstrating the safety and performance of a velocity sourced series elastic actuator,' in 2008 IEEE International Conference on Robotics and Automation, 2008: IEEE, pp. 3642-3647.
[15] D. P. Noonan, G. P. Mylonas, J. Shang, C. J. Payne, A. Darzi, and G.-Z. Yang, 'Gaze contingent control for an articulated mechatronic laparoscope,' in 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, 2010: IEEE, pp. 759-764.
[16] N. T. Clancy, G. P. Mylonas, G.-Z. Yang, and D. S. Elson, 'Gaze-contingent autofocus system for robotic-assisted minimally invasive surgery,' in 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2011: IEEE, pp. 5396-5399.
[17] K. Fujii, A. Salerno, K. Sriskandarajah, K.-W. Kwok, K. Shetty, and G.-Z. Yang, 'Gaze contingent Cartesian control of a robotic arm for laparoscopic surgery,' in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013: IEEE, pp. 3582-3589.
[18] R. Reilink, G. de Bruin, M. Franken, M. A. Mariani, S. Misra, and S. Stramigioli, 'Endoscopic camera control by head movements for thoracic surgery,' in 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, 2010: IEEE, pp. 510-515.
[19] A. A. Kogkas, A. Darzi, and G. P. Mylonas, 'Gaze-contingent perceptually enabled interactions in the operating theatre,' International Journal of Computer Assisted Radiology and Surgery, vol. 12, no. 7, pp. 1131-1140, 2017.
[20] R. C. Luo, J. W. Chen, and Y. W. Perng, 'Robotic endoscope system with compliance effect including adaptive impedance and velocity control for assistive laparoscopic surgery,' in 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, 2010: IEEE, pp. 100-105.
[21] R. C. Luo, J. Wang, C. K. Chang, and Y. W. Perng, 'Surgeon's third hand: An assistive robot endoscopic system with intuitive maneuverability for laparoscopic surgery,' in 5th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics, 2014: IEEE, pp. 138-143.
[22] R. C. Luo, J. Wang, J.-Y. Tsai, K.-M. Lee, and Y.-W. Perng, 'Robotic flexible laparoscope with position retrieving system for assistive minimally invasive surgery,' in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2015: IEEE, pp. 2024-2029.
[23] G. Gras and G.-Z. Yang, 'Intention recognition for gaze controlled robotic minimally invasive laser ablation,' in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016: IEEE, pp. 2431-2437.
[24] T. O. Vrielink, J. G.-B. Puyal, A. Kogkas, A. Darzi, and G. Mylonas, 'Intuitive gaze-control of a robotized flexible endoscope,' in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018: IEEE, pp. 1776-1782.
[25] D. Das, M. G. Rashed, Y. Kobayashi, and Y. Kuno, 'Recognizing gaze pattern for human robot interaction,' in Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, 2014, pp. 142-143.
[26] W. Chen, X. Cui, J. Zheng, J. Zhang, S. Chen, and Y. Yao, 'Gaze gestures and their applications in human-computer interaction with a head-mounted display,' arXiv preprint arXiv:1910.07428, 2019.
[27] F. Koochaki and L. Najafizadeh, 'Predicting intention through eye gaze patterns,' in 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), 2018: IEEE, pp. 1-4.
[28] H. He et al., 'Real-time eye-gaze based interaction for human intention prediction and emotion analysis,' in Proceedings of Computer Graphics International 2018, 2018, pp. 185-194.
[29] E. Cáceres, M. Carrasco, and S. Ríos, 'Evaluation of an eye-pointer interaction device for human-computer interaction,' Heliyon, vol. 4, no. 3, p. e00574, 2018.
[30] A. Shafti, P. Orlov, and A. A. Faisal, 'Gaze-based, context-aware robotic system for assisted reaching and grasping,' in 2019 International Conference on Robotics and Automation (ICRA), 2019: IEEE, pp. 863-869.
[31] C. Staub, S. Can, B. Jensen, A. Knoll, and S. Kohlbecher, 'Human-computer interfaces for interaction with surgical tools in robotic surgery,' in 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2012: IEEE, pp. 81-86.
[32] S. Li, J. Zhang, L. Xue, F. J. Kim, and X. Zhang, 'Attention-aware robotic laparoscope for human-robot cooperative surgery,' in 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2013: IEEE, pp. 792-797.
[33] Y. Cao, Q. Liu, Y. Kobayashi, K. Kawamura, S. Sugano, and M. G. Fujie, 'Development of an endoscopic manipulator control system with intention recognition based on pupil movement,' in 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2015: IEEE, pp. 1637-1642.
[34] K. Fujii, G. Gras, A. Salerno, and G.-Z. Yang, 'Gaze gesture based human robot interaction for laparoscopic surgery,' Medical Image Analysis, vol. 44, pp. 196-214, 2018.
[35] S. W. Prince et al., 'A robotic system for telementoring and training in laparoscopic surgery,' The International Journal of Medical Robotics and Computer Assisted Surgery, vol. 16, no. 2, p. e2040, 2020.
[36] A. M. Feit et al., 'Toward everyday gaze input: Accuracy and precision of eye tracking and implications for design,' in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2017, pp. 1118-1130.
[37] J. P. Hansen, A. S. Johansen, D. W. Hansen, K. Itoh, and S. Mashino, 'Command without a click: Dwell time typing by mouse and gaze selections,' in 10th International Conference on Human-Computer Interaction, 2003: IOS Press, pp. 121-128.
[38] S. P. Liversedge and J. M. Findlay, 'Saccadic eye movements and cognition,' Trends in Cognitive Sciences, vol. 4, no. 1, pp. 6-14, 2000.
[39] R. J. Jacob, 'What you look at is what you get: eye movement-based interaction techniques,' in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1990, pp. 11-18.
[40] A. M. Penkar, C. Lutteroth, and G. Weber, 'Designing for the eye: design parameters for dwell in gaze interaction,' in Proceedings of the 24th Australian Computer-Human Interaction Conference, 2012, pp. 479-488.
[41] J. P. Hansen, V. Rajanna, I. S. MacKenzie, and P. Bækgaard, 'A Fitts' law study of click and dwell interaction by gaze, head and mouse with a head-mounted display,' in Proceedings of the Workshop on Communication by Gaze Interaction, 2018, pp. 1-5.
[42] P. Majaranta, U.-K. Ahola, and O. Špakov, 'Fast gaze typing with an adjustable dwell time,' in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2009, pp. 357-360. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/16145 | - |
dc.description.abstract | 在執行腹腔鏡手術時,拿著腹腔鏡的助手會為外科醫生調整影像的視野,以使醫生有適當的視野。但是,如果可以節省人力並為外科醫生創造更多的工作空間,它將為他們提供更好的工作環境。借助將外科醫生的凝視點作為控制內視鏡機器人的方法,它可以解決助手操作腹腔鏡時所會產生的問題。 因此,本研究開發了在基於手術房之長距離的設置下,以凝視點作為系統的輸入來控制的腹腔鏡機器人。使用者界面的設計是基於人的行為和進行肝臟手術的外科醫生的建議。另外,系統中引入的速度控制器可以使外科醫生憑自己的注視點直觀地調整內視鏡的位置,在進行手術時滿足影像的需求。 | zh_TW |
dc.description.abstract | During laparoscopic surgery, the assistant who holds the endoscope adjusts the image view so that the surgeon has an appropriate field of view for conducting the operation. However, if this manpower could be saved and more workspace created for the surgeons, it would provide them with a better working environment. Using the surgeon's gaze points as commands for controlling the endoscopic robot can solve the problems that arise when an assistant operates the laparoscope. Therefore, this research develops a laparoscopic robot that takes gaze points as its control input under a long-distance operating-room setup. The design of the user interface is based on human factors and the advice of surgeons who perform liver surgery. Furthermore, the velocity controller introduced in the system allows surgeons to intuitively adjust the position of the endoscope with their own gaze points, satisfying their imaging needs during surgery. | en |
dc.description.provenance | Made available in DSpace on 2021-06-07T18:02:45Z (GMT). No. of bitstreams: 1 U0001-3007202016551900.pdf: 4768880 bytes, checksum: 341674aa1147c5e648847e482694ed08 (MD5) Previous issue date: 2020 | en |
dc.description.tableofcontents | 口試委員會審定書 i 誌謝 ii 中文摘要 iii ABSTRACT iv CONTENTS v LIST OF FIGURES viii LIST OF TABLES xiv Chapter 1 Introduction 1 1.1 Motivation and Problem Definition 1 1.2 Previous Work 3 1.3 Proposed Approach 5 1.4 Thesis Overview 6 Chapter 2 Previous Work 8 2.1 Laparoscopic Robot with Different Control Methods 8 2.1.1 Voice Control 9 2.1.2 Joystick Control 11 2.1.3 Head Motion Control 13 2.1.4 Gaze Control 15 2.2 Display Manipulation by Velocity Control 18 2.3 Intention by Gaze in Human-Robot Interaction 22 Chapter 3 Gaze Acquisition System with Operating Room Setup 29 3.1 Operating Room Setup 29 3.2 Gaze Acquisition System under Long Distance 30 3.3 Gaze Points Determination by Averaging 37 Chapter 4 Gaze Guided Laparoscopic Robot Design 39 4.1 LapaRobot with Endoscope 39 4.2 Architecture of Laparoscopic Robot System 48 4.3 Laparoscopic Robot Control 52 4.4 Human-Robot Collaboration 56 Chapter 5 Experimental Results 66 5.1 Experimental Setting 66 5.2 Gaze Acquisition System 68 5.2.1 Precision and Accuracy 68 5.2.2 Subject Variation 75 5.2.3 Design of the Size of Buttons 82 5.2.4 Evaluation of Dwell Time 83 5.2.5 Work Range of Gaze Acquisition System 85 5.3 Performance of Laparoscopic Robot Control 89 5.4 Evaluation of Human-Robot Collaboration 93 5.5 Discussions 101 Chapter 6 Conclusions and Future Work 103 REFERENCES 105 | |
dc.language.iso | en | |
dc.title | 視線導引腹腔鏡機器人控制系統設計及其人機互動 | zh_TW |
dc.title | Gaze Guided Laparoscopic Robot Control System Design and its Human-Robot Interaction | en |
dc.type | Thesis | |
dc.date.schoolyear | 108-2 | |
dc.description.degree | 碩士 (Master's) | |
dc.contributor.coadvisor | 陳永耀(Yung-Yaw Chen) | |
dc.contributor.oralexamcommittee | 陳政維(Cheng-Wei Chen),何明志(Ming-Chih Ho),顏家鈺(Jia-Yush Yen) | |
dc.subject.keyword | 凝視導引,腹腔鏡手術,速度控制,人-機器人互動,手術房配置 | zh_TW |
dc.subject.keyword | gaze guidance, laparoscopic surgery, velocity control, human-robot interaction, operating room setup | en |
dc.relation.page | 108 | |
dc.identifier.doi | 10.6342/NTU202002116 | |
dc.rights.note | 未授權 (not authorized for public access) | |
dc.date.accepted | 2020-08-04 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 生醫電子與資訊學研究所 | zh_TW |
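The abstract describes a velocity controller that turns the surgeon's gaze points into endoscope motion, and the table of contents lists gaze-point determination by averaging (Section 3.3). Since the thesis itself is not openly accessible, the sketch below is only a hypothetical illustration of that kind of mapping: the function names, the dead-zone fraction, and the linear speed ramp are assumptions, not the author's actual implementation.

```python
def smooth_gaze(samples):
    """Average a window of raw (x, y) gaze samples to suppress fixation jitter."""
    n = len(samples)
    return (sum(p[0] for p in samples) / n,
            sum(p[1] for p in samples) / n)


def gaze_to_velocity(gaze_xy, image_size=(640, 480), dead_zone=0.15, max_speed=1.0):
    """Map a gaze point on the endoscope image to a 2-D camera velocity command.

    Gaze within `dead_zone` (a fraction of the half-extent) of the image
    centre commands zero velocity, so the view holds still during normal
    fixation; outside it, speed ramps linearly to `max_speed` at the border.
    """
    vel = []
    for g, extent in zip(gaze_xy, image_size):
        half = extent / 2
        offset = (g - half) / half  # normalised offset in [-1, 1]
        if abs(offset) <= dead_zone:
            vel.append(0.0)  # inside the dead zone: hold position
        else:
            sign = 1.0 if offset > 0 else -1.0
            # Rescale so speed grows from 0 at the dead-zone edge to max_speed.
            vel.append(sign * max_speed * (abs(offset) - dead_zone) / (1 - dead_zone))
    return tuple(vel)
```

A dead zone of this kind is one common way to keep the camera stationary while the surgeon inspects the centre of the view, with averaging applied to the raw gaze stream before the velocity mapping.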
Appears in Collections: | 生醫電子與資訊學研究所
Files in This Item:
File | Size | Format | |
---|---|---|---|
U0001-3007202016551900.pdf (currently not authorized for public access) | 4.66 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.