NTU Theses and Dissertations Repository > College of Engineering > Department of Mechanical Engineering
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99246
Title: Interactive Surgical Robot Control Using Mixed Reality
Authors: Hsin-Yu Mai (麥心宇)
Advisor: Han-Pang Huang (黃漢邦)
Keywords: Mixed Reality, Surgical Robot, Human-Robot Interaction, Inverse Kinematics, HoloLens 2, Gesture Recognition, Robotic Arm, Path Planning, Unity, MRTK3
Publication Year: 2025
Degree: Master's
Abstract: With the growing demand for intelligent healthcare and precision surgery, mixed reality (MR) technologies show great potential for enhancing surgical training and improving the efficiency of robotic operation. This thesis proposes an MR-based interactive control framework for surgical robotics, designed for ophthalmic scenarios. Leveraging Microsoft HoloLens 2 and MRTK3, the system enables users to intuitively operate a mobile robotic platform and a six-degree-of-freedom robotic arm via natural hand gestures, integrating forward/inverse kinematics, trajectory planning, and real-time Unity–ROS communication.
Users can define 3D target points and designate movement paths within the MR environment using customized gestures. The system then calculates joint angles through inverse kinematics and transmits the commands to the physical robot via a communication module. By incorporating spatial mesh generation and QR code-based CAD model registration, precise alignment between virtual and physical components is achieved. Interactive gesture control is also implemented to facilitate real-time path designation and execution.
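The QR-code model registration described above reduces to a rigid-transform composition: the detected marker pose anchors the virtual CAD model in the headset's world frame. A minimal NumPy sketch, where the function names and the fixed marker-to-model offset are illustrative assumptions rather than the thesis's actual implementation:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def register_model(world_T_marker, marker_T_model):
    """Place the CAD model in the world frame by composing the detected marker
    pose with a calibrated, fixed marker-to-model offset (both assumed inputs)."""
    return world_T_marker @ marker_T_model
```

With an identity rotation and a marker at (1, 0, 0) offset by (0, 2, 0), the model lands at (1, 2, 0); the same composition handles arbitrary rotations because homogeneous transforms chain by matrix multiplication.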
The system supports both manual and autonomous control modes, with A* path planning and Damped Least Squares applied for navigation and end-effector positioning. Additional modules—including autonomous path planning, CAD model visualization, spatial understanding, and UI interactions—form a comprehensive and immersive surgical simulation and training platform.
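The damped-least-squares update for end-effector positioning is typically written dq = Jᵀ(JJᵀ + λ²I)⁻¹e, where J is the arm Jacobian, e the Cartesian error, and λ a damping factor that keeps the solve well-conditioned near singularities. A sketch of one iteration (names assumed; not the thesis's code):

```python
import numpy as np

def dls_step(jacobian, error, damping=0.1):
    """One damped-least-squares joint update: dq = J^T (J J^T + lambda^2 I)^-1 e."""
    J = np.asarray(jacobian, dtype=float)
    e = np.asarray(error, dtype=float)
    lam2I = (damping ** 2) * np.eye(J.shape[0])
    # Solve the damped normal equations instead of inverting J explicitly,
    # which stays numerically stable when J is near-singular.
    return J.T @ np.linalg.solve(J @ J.T + lam2I, e)
```

With λ = 0 this reduces to the pseudoinverse solution; increasing λ shrinks the step, trading tracking accuracy for robustness near singular configurations.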
Experimental results confirm that the proposed system enables stable and precise navigation and manipulation within MR environments. This research demonstrates a feasible foundation for the development of intelligent, collaborative, and remote-controlled surgical robotic systems in future clinical applications.
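For the autonomous navigation mode mentioned above, A* on an occupancy grid is the standard formulation; the sketch below assumes a 4-connected grid with a Manhattan heuristic, while the thesis's actual map representation and cost model may differ:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = blocked).
    Returns the path as a list of (row, col) cells, or None if unreachable."""
    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]   # (f, g, cell)
    came_from = {}
    g_cost = {start: 0}
    while open_heap:
        _, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > g_cost.get(cell, float("inf")):
            continue  # stale heap entry superseded by a cheaper one
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

Because the Manhattan heuristic never overestimates the remaining cost on a unit-cost 4-connected grid, the first time the goal is popped the returned path is optimal.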
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99246
DOI: 10.6342/NTU202503224
Fulltext Rights: Not authorized
Embargo Lift Date: N/A
Appears in Collections: Department of Mechanical Engineering

Files in This Item:
File: ntu-113-2.pdf (Restricted Access)
Size: 8.86 MB
Format: Adobe PDF

