Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/71562
Full metadata record (DC field: value [language])
dc.contributor.advisor: 陳中明 (Chung-Ming Chen)
dc.contributor.author: Fu-Sheng Hsu [en]
dc.contributor.author: 許富勝 [zh_TW]
dc.date.accessioned: 2021-06-17T06:03:19Z
dc.date.available: 2024-02-12
dc.date.copyright: 2019-02-12
dc.date.issued: 2019
dc.date.submitted: 2019-01-28
dc.identifier.citation:
[1] Wong, C.H. and F.C. Wei, Microsurgical free flap in head and neck reconstruction. Head & neck, 2010. 32(9): p. 1236-1245.
[2] Scheker, L., G. Lister, and T. Wolff, The lateral arm free flap in releasing severe contracture of the first web space. Journal of Hand Surgery, 1988. 13(2): p. 146-150.
[3] Liang, J., et al., Free tissue flaps in head and neck reconstruction: clinical application and analysis of 93 patients of a single institution. Brazilian journal of otorhinolaryngology, 2017.
[4] Nahabedian, M.Y., et al., Breast reconstruction with the free TRAM or DIEP flap: Patient selection, choice of flap, and outcome. Plastic and reconstructive surgery, 2002. 110(2): p. 466-475.
[5] Hirigoyen, M.B., M.L. Urken, and H. Weinberg, Free flap monitoring: a review of current practice. Microsurgery, 1995. 16(11): p. 723-726.
[6] Gusenoff, J.A., et al., Free tissue transfer: comparison of outcomes between university hospitals and community hospitals. Plastic and reconstructive surgery, 2006. 118(3): p. 671-675.
[7] Wolff, K.D., et al., Incidence and time of intraoperative vascular complications in head and neck microsurgery. Microsurgery: Official Journal of the International Microsurgical Society and the European Federation of Societies for Microsurgery, 2008. 28(3): p. 143-146.
[8] Chalian, A.A., et al., Internal jugular vein versus external jugular vein anastamosis: implications for successful free tissue transfer. Head & neck, 2001. 23(6): p. 475-478.
[9] Hidalgo, D.A., et al., A review of 716 consecutive free flaps for oncologic surgical defects: refinement in donor-site selection and technique. Plastic and reconstructive surgery, 1998. 102(3): p. 722-32; discussion 733-4.
[10] Kroll, S.S., et al., Timing of pedicle thrombosis and flap loss after free-tissue transfer. Plastic and reconstructive surgery, 1996. 98(7): p. 1230-1233.
[11] Novakovic, D., et al., Salvage of failed free flaps used in head and neck reconstruction. Head & neck oncology, 2009. 1(1): p. 33.
[12] Kubo, T., K. Yano, and K. Hosokawa, Management of flaps with compromised venous outflow in head and neck microsurgical reconstruction. Microsurgery, 2002. 22(8): p. 391-395.
[13] Hidalgo, D.A. and C.S. Jones, The role of emergent exploration in free-tissue transfer: a review of 150 consecutive cases. Plastic and reconstructive surgery, 1990. 86(3): p. 492-8; discussion 499-501.
[14] Schusterman, M.A., et al., A single center's experience with 308 free flaps for repair of head and neck cancer defects. Plastic and reconstructive surgery, 1994. 93(3): p. 472-8; discussion 479-80.
[15] Chen, K.-T., et al., Timing of presentation of the first signs of vascular compromise dictates the salvage outcome of free flap transfers. Plastic and reconstructive surgery, 2007. 120(1): p. 187-195.
[16] Macnamara, M., et al., Microvascular free flaps in head and neck surgery. The Journal of Laryngology & Otology, 1994. 108(11): p. 962-968.
[17] Miyasaka, M., et al., Salvage operations of free tissue transfer following internal jugular venous thrombosis: a review of 4 cases. Microsurgery: Official Journal of the International Microsurgical Society and the European Federation of Societies for Microsurgery, 2005. 25(3): p. 191-195.
[18] Brown, J., et al., Factors that influence the outcome of salvage in free tissue transfer. British Journal of Oral and Maxillofacial Surgery, 2003. 41(1): p. 16-20.
[19] Bui, D.T., et al., Free flap reexploration: indications, treatment, and outcomes in 1193 free flaps. Plastic and reconstructive surgery, 2007. 119(7): p. 2092-2100.
[20] Panchapakesan, V., et al., Role of thrombolysis in free-flap salvage. Journal of reconstructive microsurgery, 2003. 19(08): p. 523-530.
[21] Spiegel, J.H. and J.K. Polat, Microvascular flap reconstruction by otolaryngologists: prevalence, postoperative care, and monitoring techniques. The laryngoscope, 2007. 117(3): p. 485-490.
[22] Zötterman, J., et al., Monitoring of partial and full venous outflow obstruction in a porcine flap model using laser speckle contrast imaging. Journal of Plastic, Reconstructive & Aesthetic Surgery, 2016. 69(7): p. 936-943.
[23] Hosein, R.C., A. Cornejo, and H.T. Wang, Postoperative monitoring of free flap reconstruction: A comparison of external Doppler ultrasonography and the implantable Doppler probe. Plastic Surgery, 2016. 24(1): p. 11-19.
[24] Jones, A.P. and J.E. Janis, Essentials of plastic surgery: Q&A companion. 2015: CRC Press.
[25] Tenorio, X., et al., Early detection of flap failure using a new thermographic device. Journal of Surgical Research, 2009. 151(1): p. 15-21.
[26] Smit, J.M., et al., Advancements in free flap monitoring in the last decade: a critical review. Plastic and reconstructive surgery, 2010. 125(1): p. 177-185.
[27] Salgado, C.J., S.L. Moran, and S. Mardini, Flap monitoring and patient management. Plastic and reconstructive surgery, 2009. 124(6S): p. e295-e302.
[28] Perng, C.-K., et al., Detection of free flap pedicle thrombosis by infrared surface temperature imaging. Journal of Surgical Research, 2018. 229: p. 169-176.
[29] de Weerd, L., J.B. Mercer, and S. Weum, Dynamic infrared thermography. Clinics in plastic surgery, 2011. 38(2): p. 277-292.
[30] Lu, P.-J., An Infrared Approach for Monitoring of Flap Pedicle Thrombosis after Free Flap Surgery. 2014, National Taiwan University.
[31] Saxena, A., S.H. Chung, and A.Y. Ng, 3-d depth reconstruction from a single still image. International journal of computer vision, 2008. 76(1): p. 53-69.
[32] Liu, F., et al., Learning Depth from Single Monocular Images Using Deep Convolutional Neural Fields. IEEE Trans. Pattern Anal. Mach. Intell., 2016. 38(10): p. 2024-2039.
[33] Moeslund, T.B., A. Hilton, and V. Krüger, A survey of advances in vision-based human motion capture and analysis. Computer vision and image understanding, 2006. 104(2-3): p. 90-126.
[34] Choi, C. and H.I. Christensen, Robust 3D visual tracking using particle filtering on the special Euclidean group: A combined approach of keypoint and edge features. The International Journal of Robotics Research, 2012. 31(4): p. 498-519.
[35] Shi, J. and C. Tomasi, Good features to track. 1993, Cornell University.
[36] Ci, W. and Y. Huang, A robust method for ego-motion estimation in urban environment using stereo camera. Sensors, 2016. 16(10): p. 1704.
[37] He, L., et al., An automatic measurement method for absolute depth of objects in two monocular images based on sift feature. Applied Sciences, 2017. 7(6): p. 517.
[38] Scharstein, D. and R. Szeliski, A taxonomy and evaluation of dense two-frame stereo correspondence algorithms. International journal of computer vision, 2002. 47(1-3): p. 7-42.
[39] Hartley, R. and A. Zisserman, Multiple view geometry in computer vision. 2003: Cambridge university press.
[40] Horn, B.K. and B.G. Schunck, Determining optical flow. Artificial intelligence, 1981. 17(1-3): p. 185-203.
[41] Baker, S., et al., A database and evaluation methodology for optical flow. International Journal of Computer Vision, 2011. 92(1): p. 1-31.
[42] Furukawa, Y. and C. Hernández, Multi-view stereo: A tutorial. Foundations and Trends® in Computer Graphics and Vision, 2015. 9(1-2): p. 1-148.
[43] Abdel-Basset, M., et al., Feature and intensity based medical image registration using particle swarm optimization. Journal of medical systems, 2017. 41(12): p. 197.
[44] Viergever, M.A., et al., A survey of medical image registration–under review. 2016, Elsevier.
[45] Sarrut, D., Deformable registration for image-guided radiation therapy. Zeitschrift für medizinische Physik, 2006. 16(4): p. 285-297.
[46] Fan, L., et al. Evaluation and application of 3D lung warping and registration model using HRCT images. in Medical Imaging 2001: Physiology and Function from Multidimensional Images. 2001. International Society for Optics and Photonics.
[47] Yan, D., D. Jaffray, and J. Wong, A model to accumulate fractionated dose in a deforming organ. International Journal of Radiation Oncology* Biology* Physics, 1999. 44(3): p. 665-675.
[48] Schaly, B., et al., Tracking the dose distribution in radiation therapy by accounting for variable anatomy. Physics in Medicine & Biology, 2004. 49(5): p. 791.
[49] Rohr, K., Extraction of 3D anatomical point landmarks based on invariance principles. Pattern Recognition, 1999. 32(1): p. 3-15.
[50] Johnson, H.J. and G.E. Christensen, Consistent landmark and intensity-based image registration. IEEE transactions on medical imaging, 2002. 21(5): p. 450-461.
[51] Sarrut, D., et al., Simulation of 4D CT images from deformable registration between inhale and exhale breath-hold CT scans. International Journal of Radiation Oncology* Biology* Physics, 2005. 63: p. S509-S510.
[52] Sundaram, T.A. and J.C. Gee, Towards a model of lung biomechanics: pulmonary kinematics via registration of serial lung images. Medical image analysis, 2005. 9(6): p. 524-537.
[53] Brock, K., et al., Automated generation of a four‐dimensional model of the liver using warping and mutual information. Medical Physics, 2003. 30(6): p. 1128-1133.
[54] Coselmon, M.M., et al., Mutual information based CT registration of the lung at exhale and inhale breathing states using thin‐plate splines: Exhale/inhale lung registration with thin‐plate splines. Medical physics, 2004. 31(11): p. 2942-2948.
[55] Pluim, J.P., J.A. Maintz, and M.A. Viergever, Mutual-information-based registration of medical images: a survey. IEEE transactions on medical imaging, 2003. 22(8): p. 986-1004.
[56] Roche, A., G. Malandain, and N. Ayache, Unifying maximum likelihood approaches in medical image registration. International Journal of Imaging Systems and Technology, 2000. 11(1): p. 71-80.
[57] Hermosillo, G., C. Chefd'Hotel, and O. Faugeras, Variational methods for multimodal image matching. International Journal of Computer Vision, 2002. 50(3): p. 329-343.
[58] Kybic, J. and M. Unser, Fast parametric elastic image registration. 2003.
[59] Bookstein, F.L., Principal warps: Thin-plate splines and the decomposition of deformations. IEEE Transactions on pattern analysis and machine intelligence, 1989. 11(6): p. 567-585.
[60] De Nigris, D., D.L. Collins, and T. Arbel, Multi-modal image registration based on gradient orientations of minimal uncertainty. IEEE transactions on medical imaging, 2012. 31(12): p. 2343-2354.
[61] Lowe, D.G. Object recognition from local scale-invariant features. in Computer vision, 1999. The proceedings of the seventh IEEE international conference on. 1999. IEEE.
[62] Myronenko, A. and X. Song, Point set registration: Coherent point drift. IEEE transactions on pattern analysis and machine intelligence, 2010. 32(12): p. 2262-2275.
[63] Tayal, Y., R. Lamba, and S. Padhee, Automatic face detection using color based segmentation. International Journal of Scientific and Research Publications, 2012. 2(6): p. 1-7.
[64] Fischler, M.A. and R.C. Bolles, Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 1981. 24(6): p. 381-395.
[65] Cheng, Y., Mean shift, mode seeking, and clustering. IEEE transactions on pattern analysis and machine intelligence, 1995. 17(8): p. 790-799.
[66] Fukunaga, K. and L. Hostetler, The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Transactions on information theory, 1975. 21(1): p. 32-40.
[67] Moon, T.K., The expectation-maximization algorithm. IEEE Signal processing magazine, 1996. 13(6): p. 47-60.
[68] Kass, M., A. Witkin, and D. Terzopoulos, Snakes: Active contour models. International journal of computer vision, 1988. 1(4): p. 321-331.
[69] Xu, C. and J.L. Prince, Snakes, shapes, and gradient vector flow. IEEE Transactions on image processing, 1998. 7(3): p. 359-369.
[70] Xu, Y., et al. Error analysis of calibration parameters estimation for binocular stereo vision system. in Imaging Systems and Techniques (IST), 2013 IEEE International Conference on. 2013. IEEE.
[71] Ding, H.-M., Visible light image monitoring system for thrombosis of free flap after surgery: auto-registration and real-time tracking of free flap. 2018, National Taiwan University.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/71562
dc.description.abstract [zh_TW]: For patients who suffer extensive loss of surrounding tissue as a result of cancer surgery, free flap reconstruction can restore both the function and the appearance of the affected site while reducing postoperative morbidity at the donor site. Although the success rate of the surgery has reached 95%, circulatory compromise can still occur in the flap region after surgery and, in severe cases, may lead to flap necrosis; if abnormal flap perfusion is detected early and re-exploration surgery is performed, the flap salvage rate can exceed 90%. A monitoring method that is low-cost, real-time, non-invasive, and contactless would therefore ease the burden on medical staff and improve the overall quality of care.
To provide better postoperative monitoring and care for flap patients, our research team developed an automatic system for real-time monitoring of blood flow in the flap region, combining flap region tracking, registration, and analysis of thermal and visible-light images of the flap to achieve real-time postoperative monitoring. This thesis focuses on the automatic flap region tracking and registration algorithms of the system. To evaluate its clinical use and applicability, the study was conducted in cooperation with Dr. Cherng-Kang Perng of the Division of Plastic Surgery at Taipei Veterans General Hospital; animal experiments, enrollment of clinical flap surgery patients, and development of the tracking and registration algorithms proceeded in parallel during the study period. The animal experiments, human trials, and use of the related data were all approved by the institutional review board of Taipei Veterans General Hospital.
To automatically track the position of the patient's flap region and continuously capture and register images, and considering freedom of movement, safety, and ease of operation, the hardware used in this study was the TM5-900 robotic arm developed by Techman Robot together with its embedded visible-light camera, on which the tracking algorithm was developed. The tracking algorithm continuously follows flap movement with optical flow; when a large displacement of the flap region is detected, robotic arm control is combined with a stereo vision algorithm to determine the direction of movement of the flap feature points and their coordinates in space, and the arm moves to an appropriate imaging position. To register images across time points and transfer flap region information for subsequent vessel occlusion detection, this study also developed a longitudinal flap region registration algorithm: after semi-automatic delineation of the flap region and its boundary in the initial image, SIFT feature points are extracted from the flap region as reference information for deformation and registration, CPD computes the deformation model between the feature point sets of images at different time points, TPS warps the flap region to its position in the new image, and the registration is then refined by an optimization step.
The results show that the stereo-vision-guided robotic arm tracking algorithm can detect movement of the flap region in real time and, using the position computed by stereo vision, move to a new observation position while keeping the same observation distance. At the same time, the longitudinal flap region registration algorithm transfers the flap region information completely onto the images at subsequent time points. Applying the registration algorithm developed in this thesis to consecutive images of an experimental pig that moved substantially during imaging, and using the Dice Similarity Coefficient, Hausdorff Distance, and mean boundary distance to measure the difference between the registration results and the corresponding semi-automatically delineated regions over 150 consecutive images, the average Dice Similarity Coefficient was 96.1% ± 1%, the average Hausdorff Distance was 64.92 ± 13.04 pixels, and the average mean boundary distance was 20.56 ± 5.36 pixels; the registration results were not substantially affected by the pig's struggling or movement. For clinical patient case 1, the registration results had an average Dice Similarity Coefficient of 95.9% ± 1%, an average Hausdorff Distance of 32.87 ± 8.80 pixels, and an average mean boundary distance of 9.58 ± 2.65 pixels, indicating that the registration algorithm can register the flap region under different conditions, overcome the difficulties caused by patient movement or environmental changes, and achieve longitudinal registration across time points.
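As a rough, hypothetical illustration of the tracking step described in the abstract above (continuous optical-flow tracking of flap feature points, with two-view triangulation from the arm-mounted camera once a large displacement is detected), the following Python/OpenCV sketch shows the general idea. It is not the thesis implementation: the 3x4 projection matrices `P1` and `P2` of the camera at the two capture positions, the grayscale frames, and the initial feature points are assumed to come from camera and robot calibration, and their names are chosen here only for illustration.

```python
# Hypothetical sketch (not the thesis code): Lucas-Kanade tracking of flap
# feature points plus two-view triangulation of their 3D coordinates.
import cv2
import numpy as np

def track_flap_points(prev_gray, next_gray, prev_pts):
    """Track flap feature points between consecutive frames with pyramidal
    Lucas-Kanade optical flow. prev_pts is an (N, 1, 2) float32 array, e.g.
    from cv2.goodFeaturesToTrack; returns the successfully tracked pairs."""
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return prev_pts[ok].reshape(-1, 2), next_pts[ok].reshape(-1, 2)

def triangulate_points(P1, P2, pts1, pts2):
    """Recover 3D coordinates of matched points seen from two camera positions.
    P1 and P2 are the 3x4 projection matrices (intrinsics x extrinsics) of the
    arm-mounted camera at the two positions; pts1/pts2 are matched (N, 2)
    pixel coordinates of the same flap points in the two images."""
    X_h = cv2.triangulatePoints(P1, P2,
                                pts1.T.astype(float), pts2.T.astype(float))
    return (X_h[:3] / X_h[3]).T  # (N, 3) Euclidean points

def camera_to_flap_distance(X):
    """Distance from the first camera centre (the origin, assuming the first
    camera frame is the world frame) to the best-fit plane through the
    triangulated flap edge points."""
    centroid = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - centroid)  # plane normal = last right-singular vector
    normal = vt[-1]
    return float(abs(centroid @ normal))    # |n . c| with a unit normal
```

In the system described in the abstract, a distance estimated this way would then be used to command the TM5-900 to a pose that keeps the original observation distance; that robot-control part is omitted here.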
dc.description.abstract [en]: Microvascular free flap surgery has been a reliable and important reconstruction method for patients who suffer complicated, large-area tissue injuries from cancer surgery. Although the success rate of free flap surgery has reached 95%, flap pedicle thrombosis can still occur after surgery and cause irreversible damage to the free flap. Therefore, to minimize flap losses, careful real-time postoperative monitoring and analysis are generally recommended. Nevertheless, postoperative monitoring is a demanding task for nursing staff. To aid the monitoring of free flap thrombosis, reduce the demand on medical crews, and minimize flap losses, an effective free flap monitoring method that is low-cost, non-invasive, and contactless is highly desirable.
In order to provide better care and assist the real-time monitoring of patients after free flap surgery, we developed an automatic free flap region monitoring system, which consolidates the free flap tracking and registration algorithms with the analysis algorithms for free flap temperature and color. This study focuses on the automatic free flap tracking and registration system. To examine its applicability in clinical environments, this research cooperated with Dr. Cherng-Kang Perng and Taipei Veterans General Hospital, and the animal experimentation, clinical trials, and system development proceeded at the same time. All images used and experiments performed in this study were approved by the institutional review board of Taipei Veterans General Hospital.
For freedom of movement, safety, and simplicity of operation, the tracking system adopted the TM5-900 robotic arm developed by Techman Robot Inc. and the camera embedded on the arm. To accomplish the tracking objective, we combined stereovision, the optical flow method, and the robotic arm. Once movement of the free flap region is detected, the algorithm measures the three-dimensional coordinates of points on the flap edges and calculates the distance from the plane they form to the camera; the position the robotic arm should reach can then be derived. To register longitudinal images and transfer the free flap region information between them, the registration algorithm extracts feature points around the delineated free flap edges with the SIFT algorithm, which provides the feature information for the tracking and registration algorithms. After the next image is captured, the free flap region edges from the former image are transferred by the Coherent Point Drift and Thin Plate Spline algorithms to the image captured at the new position, followed by an optimization step.
The results show that the system can keep the moving free flap region in view. Meanwhile, when the images from the animal experimentation were registered with the proposed longitudinal registration algorithm, the average Dice Similarity Coefficient was 96.1% ± 1%, the average Hausdorff Distance was 64.92 ± 13.04 pixels, and the average edge distance between the results and the delineated contours was 20.56 ± 5.36 pixels. Applying the registration algorithm to the images acquired from the clinical trial, the average Dice Similarity Coefficient was 95.9% ± 1%, the average Hausdorff Distance was 32.87 ± 8.80 pixels, and the average edge distance between the results and the delineated contours was 9.58 ± 2.65 pixels. Based on this registration performance, the free flap region registration algorithm demonstrates its effectiveness under various situations and its capability to overcome the difficulties caused by patient movement and environmental variation.
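To make the registration pipeline described in the abstracts more concrete, here is a minimal, hypothetical Python sketch of the same general flow: SIFT feature points are matched between two time points, and a 2D thin-plate-spline mapping fitted on the matches transfers the delineated flap contour into the later image. It is only an illustration under assumed inputs (`img_t0`, `img_t1`, and an (N, 2) contour `contour_t0`); the Coherent Point Drift step and the final optimization used in the thesis are replaced here by a simple Lowe ratio test.

```python
# Hypothetical sketch (not the thesis code): SIFT matching followed by a
# thin-plate-spline warp of the delineated flap contour to a later time point.
import cv2
import numpy as np
from scipy.interpolate import RBFInterpolator  # supports kernel='thin_plate_spline'

def match_sift(img_t0, img_t1, ratio=0.75):
    """Detect SIFT keypoints in both grayscale images and keep the matches
    that pass Lowe's ratio test; returns matched (N, 2) point arrays."""
    sift = cv2.SIFT_create()
    kp0, des0 = sift.detectAndCompute(img_t0, None)
    kp1, des1 = sift.detectAndCompute(img_t1, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des0, des1, k=2)
    src, dst = [], []
    for pair in pairs:
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < ratio * n.distance:
            src.append(kp0[m.queryIdx].pt)
            dst.append(kp1[m.trainIdx].pt)
    return np.asarray(src, dtype=float), np.asarray(dst, dtype=float)

def warp_contour_tps(contour_t0, src_pts, dst_pts, smoothing=1.0):
    """Fit a 2D-to-2D thin-plate-spline on the matched feature points and
    apply it to the flap contour delineated in the earlier image."""
    tps = RBFInterpolator(src_pts, dst_pts,
                          kernel='thin_plate_spline', smoothing=smoothing)
    return tps(np.asarray(contour_t0, dtype=float))

# Usage under the assumptions above:
# src, dst = match_sift(img_t0, img_t1)
# contour_t1 = warp_contour_tps(contour_t0, src, dst)
```

A contour transferred this way could then be compared against a manually delineated contour with the Dice Similarity Coefficient, Hausdorff Distance, and mean boundary distance reported in the abstracts.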
dc.description.provenance [en]: Made available in DSpace on 2021-06-17T06:03:19Z (GMT). No. of bitstreams: 1
ntu-108-R05548018-1.pdf: 7406083 bytes, checksum: c23a75ab5f75c21fc484fb097fcfbfc7 (MD5)
Previous issue date: 2019
dc.description.tableofcontents:
Oral Examination Committee Certification i
Acknowledgements ii
Chinese Abstract iii
ABSTRACT v
Table of Contents vii
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
1.1 Background 1
1.1.1 Introduction to Free Flaps 1
1.1.2 Postoperative Observation of Flap Perfusion 3
1.2 Motivation and Objectives 6
Chapter 2 Literature Review 8
2.1 Stereo Vision 8
2.1.1 Camera Systems 8
2.1.2 Correspondence Search 9
2.2 Longitudinal Image Registration 12
Chapter 3 Materials and Methods 14
3.1 Materials 14
3.1.1 Animal Experiment Materials 14
3.1.2 Patient Data 15
3.1.3 Hardware 16
3.2 Method Workflow 17
3.3 Stereo Tracking Algorithm for the Flap Region 19
3.3.1 Robotic Arm and Stereo Vision 19
3.3.2 Feature Point Extraction 21
3.3.3 Stereo Correspondence Search and Matching 23
3.4 Longitudinal Flap Region Registration Algorithm 26
3.4.1 Face Detection 27
3.4.2 Removal of Mismatched Feature Points 29
3.4.3 Flap Region Template Matching 31
3.4.4 Non-rigid Registration of Feature Point Sets 34
3.4.5 Flap Region Deformation 35
Chapter 4 Results and Discussion 38
4.1 Results and Discussion of the Stereo Tracking Algorithm 38
4.2 Results and Discussion of the Longitudinal Registration Algorithm 41
Chapter 5 Conclusion and Future Work 51
REFERENCE 52
Appendix 1 56
Appendix 2 62
dc.language.iso: zh-TW
dc.subject [zh_TW]: 游離皮瓣 (free flap)
dc.subject [zh_TW]: 機械手臂 (robotic arm)
dc.subject [zh_TW]: 特徵點擷取 (feature point extraction)
dc.subject [zh_TW]: 物體追蹤 (object tracking)
dc.subject [zh_TW]: 多時間點影像對位 (longitudinal image registration)
dc.subject [zh_TW]: 立體視覺 (stereo vision)
dc.subject [en]: Longitudinal image registration
dc.subject [en]: Stereovision
dc.subject [en]: Robotic arm
dc.subject [en]: Object tracking
dc.subject [en]: Feature point extraction
dc.subject [en]: Free flap
dc.title [zh_TW]: 立體視覺導引之游離皮瓣機械手臂追蹤系統 (Stereo-vision-guided robotic arm tracking system for free flaps)
dc.title [en]: Stereovision Guided Postoperative Free-Flap Tracking System Using Robotic Arm with Embedded Camera
dc.type: Thesis
dc.date.schoolyear: 107-1
dc.description.degree: Master (碩士)
dc.contributor.oralexamcommittee: 林沛群 (Pei-Chun Lin), 李佳燕 (Chia-Yen Lee), 彭成康 (Cherng-Kang Perng)
dc.subject.keyword [zh_TW]: 游離皮瓣 (free flap), 立體視覺 (stereo vision), 機械手臂 (robotic arm), 物體追蹤 (object tracking), 特徵點擷取 (feature point extraction), 多時間點影像對位 (longitudinal image registration)
dc.subject.keyword [en]: Free flap, Stereovision, Robotic arm, Object tracking, Feature point extraction, Longitudinal image registration
dc.relation.page: 65
dc.identifier.doi: 10.6342/NTU201900170
dc.rights.note: 有償授權 (licensed for a fee)
dc.date.accepted: 2019-01-28
dc.contributor.author-college [zh_TW]: 工學院 (College of Engineering)
dc.contributor.author-dept [zh_TW]: 醫學工程學研究所 (Institute of Biomedical Engineering)
Appears in Collections: 醫學工程學研究所 (Institute of Biomedical Engineering)

Files in This Item:
File: ntu-108-1.pdf (restricted, not authorized for public access), 7.23 MB, Adobe PDF

