NTU Theses and Dissertations Repository › College of Electrical Engineering and Computer Science › Department of Electrical Engineering
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101731
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | 劉浩澧 | zh_TW
dc.contributor.advisor | Hao-Li Liu | en
dc.contributor.author | 黃梓豪 | zh_TW
dc.contributor.author | Zi-Hao Huang | en
dc.date.accessioned | 2026-03-04T16:07:23Z | -
dc.date.available | 2026-04-09 | -
dc.date.copyright | 2026-03-04 | -
dc.date.issued | 2025 | -
dc.date.submitted | 2026-02-24 | -
dc.identifier.citation | [1] J.-F. Saillant, R. Marlier, F. Navacchia, and F. Baqué, “Ultrasonic transducer for non-destructive testing of structures immersed in liquid sodium at 200 °C,” Sensors, vol. 19, no. 19, 2019.
[2] X. Liu, F. Qiu, L. Hou, and X. Wang, “Review of noninvasive or minimally invasive deep brain stimulation,” Frontiers in Behavioral Neuroscience, vol. 15, 01 2022.
[3] J. Lindovský, Z. Nichtova, N. Dragano, D. Pajuelo, J. Prochazka, H. Fuchs, S. Marschall, V. Gailus-Durner, R. Sedlacek, M. Angelis, J. Rozman, and N. Spielmann, “A review of standardized high-throughput cardiovascular phenotyping with a link to metabolism in mice,” Mammalian Genome, vol. 34, pp. 1–16, 06 2023.
[4] W. Lee, H. Kim, Y. J. Jung, Y. Chung, H. J. Kim, S. Lee, and J. H. Lee, “Transcranial focused ultrasound stimulation of motor cortical areas in freely-moving awake rats,” BMC Neuroscience, vol. 19, no. 1, p. 57, 2018.
[5] C.-F. Li, M.-Y. Yang, G.-W. Hong, and H.-L. Liu, “Design and implementation of an FPGA-based airborne ultrasound sensing and radiation phased array device,” in 2024 IEEE Ultrasonics, Ferroelectrics, and Frequency Control Joint Symposium (UFFC-JS), pp. 1–4, 2024.
[6] S. Donkov, E. Bouzbib, M. Aldea, J. Irisarri, S. Elizondo, I. Ezcurdia, and A. Marzo, “SparkTouch: Contactless Haptic Spatial Patterns on the Palm and Fingertip using Electric Sparks,” in World Haptics 2025, (Suwon, South Korea), IEEE, July 2025.
[7] A. Marzo, S. A. Seah, B. W. Drinkwater, D. R. Sahoo, B. Long, and S. Subramanian, “Holographic acoustic elements for manipulation of levitated objects,” Nature Communications, vol. 6, p. 8661, October 2015.
[8] A. Marzo, M. Caleap, and B. W. Drinkwater, “Acoustic virtual vortices with tunable orbital angular momentum for trapping of mie particles,” Phys. Rev. Lett., vol. 120, p. 044301, Jan 2018.
[9] Y. Ochiai, T. Hoshi, and I. Suzuki, “Holographic whisper: Rendering audible sound spots in three-dimensional space by focusing ultrasonic waves,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), (Denver, CO, USA), pp. 4314–4325, Association for Computing Machinery, 2017.
[10] R. Cheng, W. Heinzelman, M. Sturge-Apple, and Z. Ignjatovic, “A motion-tracking ultrasonic sensor array for behavioral monitoring,” IEEE Sensors Journal, vol. 12, no. 3, pp. 707–714, 2012.
[11] Y. Wang, Z. Hao, X. Dang, Z. Zhang, and M. Li, “Ultrasonicgs: A highly robust gesture and sign language recognition method based on ultrasonic signals,” Sensors, vol. 23, no. 4, 2023.
[12] L. W. Schmerr, Fundamentals of Ultrasonic Phased Arrays. Springer Science & Business Media, 2007.
[13] K. Nakahata and N. Kono, “3-d modelings of an ultrasonic phased array transducer and its radiation properties in solid,” in Ultrasonic Waves (A. A. dos Santos Júnior, ed.), ch. 3, Rijeka: IntechOpen, 2012. Open Access book chapter, licensed under CC BY 3.0.
[14] A. Marzo, T. Corkett, and B. W. Drinkwater, “Ultraino: An open phased-array system for narrowband airborne ultrasound transmission,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 65, no. 1, pp. 102–111, 2018.
[15] H. Bruus, “Acoustofluidics 1: Governing equations in microfluidics,” Lab Chip, vol. 11, pp. 3742–3751, 2011.
[16] E. Kicukov and A. Gursel, “Ultrasonic welding of dissimilar materials: A review,” Periodicals of Engineering and Natural Sciences (PEN), vol. 3, 06 2015.
[17] T. J. Mason and J. P. Lorimer, Synthesis, ch. 3, pp. 75–130. John Wiley & Sons, Ltd, 2002.
[18] Q. Huang and Z. Zeng, “A review on real-time 3d ultrasound imaging technology,” BioMed Research International, vol. 2017, pp. 1–20, 03 2017.
[19] J. E. Kennedy, “High-intensity focused ultrasound in the treatment of solid tumours,” Nature Reviews Cancer, vol. 5, no. 4, pp. 321–327, 2005.
[20] N. McDannold, N. Vykhodtseva, and K. Hynynen, “Temporary disruption of the blood-brain barrier by use of ultrasound and microbubbles: safety and efficacy evaluation in rhesus macaques,” Cancer Research, vol. 72, no. 14, pp. 3652–3663, 2012.
[21] O. Naor, S. Krupa, and S. Shoham, “Ultrasonic neuromodulation,” Journal of Neural Engineering, vol. 13, no. 3, p. 031003, 2016.
[22] I. Acquah, D. Adarkwah, I. Andorful, and Y. Ahmed, “Shea butter as a viable couplant for ultrasound imaging,” Journal of Biomedical Science and Engineering, vol. 12, pp. 31–39, 01 2019.
[23] R. Manwar, L. Saint-martin, and K. Avanaki, “Couplants in acoustic biosensing systems,” Chemosensors, vol. 10, p. 181, 05 2022.
[24] T. Arsiwala, S. Sprowls, K. Blethen, C. Adkins, P. Saralkar, R. Fladeland, W. Pentz, A. Gabriele, B. Kielkowski, R. Mehta, P. Wang, J. Carpenter, M. Ranjan, U. Najib, A. Rezai, and P. Lockman, “Ultrasound-mediated disruption of the blood tumor barrier for improved therapeutic delivery,” Neoplasia, vol. 23, no. 7, pp. 676–691, 2021.
[25] F. A. Duck, “Chapter 4 - acoustic properties of tissue at ultrasonic frequencies,” in Physical Properties of Tissues (F. A. Duck, ed.), pp. 73–135, London: Academic Press, 1990.
[26] T. P. Abello, “Absorption of ultrasonic waves by various gases,” Phys. Rev., vol. 31, pp. 1083–1091, Jun 1928.
[27] M.-C. Niérat, P. Laveneziana, B.-P. Dubé, P. Shirkovskiy, R.-K. Ing, and T. Similowski, “Physiological validation of an airborne ultrasound based surface motion camera for a contactless characterization of breathing pattern in humans,” Frontiers in Physiology, vol. 10, 2019.
[28] W. Moore, A. Makdani, W. Frier, and F. McGlone, “Virtual touch: Sensing and feeling with ultrasound,” bioRxiv, 2021.
[29] F. Ijaz, H. K. Yang, A. W. Ahmad, and C. Lee, “Indoor positioning: A review of indoor ultrasonic positioning systems,” in 2013 15th International Conference on Advanced Communications Technology (ICACT), pp. 1146–1150, 2013.
[30] M. Kaur and J. Pal, “Distance measurement of object by ultrasonic sensor hc-sr04,” International Journal for Scientific Research & Development (IJSRD), vol. 3, no. 5, pp. 503–505, 2015. Paper ID: IJSRDV3I50440, Published: Aug. 1, 2015.
[31] S. Gezici, Z. Tian, G. B. Giannakis, H. Kobayashi, A. F. Molisch, H. V. Poor, and Z. Sahinoglu, “Localization via ultra-wideband radios: a look at positioning aspects for future sensor networks,” IEEE Signal Processing Magazine, vol. 22, no. 4, pp. 70–84, 2005.
[32] S. Lan, R. Yu, G. Yu, and L. S. Davis, “Modeling local geometric structure of 3d point clouds using geo-cnn,” in 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 998–1008, 2019.
[33] L. Zhan, W. Li, and W. Min, “Fa-resnet: Feature affine residual network for largescale point cloud segmentation,” International Journal of Applied Earth Observation and Geoinformation, vol. 118, p. 103259, 2023.
[34] X. Lin, D. Wang, G. Zhou, C. Liu, and Q. Chen, “Transpose: 6d object pose estimation with geometry-aware transformer,” arXiv preprint arXiv:2310.16279, 2024. Preprint.
[35] F. Murtagh, “Multilayer perceptrons for classification and regression,” Neurocomputing, vol. 2, no. 5, pp. 183–197, 1991.
[36] R. Q. Charles, H. Su, M. Kaichun, and L. J. Guibas, “Pointnet: Deep learning on point sets for 3d classification and segmentation,” in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 77–85, 2017.
[37] L. Ge, Y. Cai, J. Weng, and J. Yuan, “Hand pointnet: 3d hand pose estimation using point sets,” in 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 8417–8426, 2018.
[38] G. K. Gupta and D. K. Sharma, “A review of overfitting solutions in smart depression detection models,” in 2022 9th International Conference on Computing for Sustainable Global Development (INDIACom), pp. 145–151, 2022.
[39] Otona no Kagaku Editorial Department, Otona no Kagaku Magazine: Mini Cleaning Robot. Gakken, 2013.
[40] International Telecommunication Union Radiocommunication Sector (ITU-R), “Recommendation ITU-R BT.601-7: Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios,” Tech. Rep. BT.601-7, International Telecommunication Union, Geneva, 2011. Series BT, Broadcasting Service (Television).
[41] R. C. Gonzalez and R. E. Woods, Digital Image Processing. Pearson, 4th ed., 2018.
[42] Z. Wang, A. Bovik, H. Sheikh, and E. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600–612, 2004.
[43] S. Suzuki and K. Abe, “Topological structural analysis of digitized binary images by border following,” Computer Vision, Graphics, and Image Processing, vol. 30, no. 1, pp. 32–46, 1985.
[44] M.-K. Hu, “Visual pattern recognition by moment invariants,” IRE Transactions on Information Theory, vol. 8, no. 2, pp. 179–187, 1962.
[45] J. Postel, “User datagram protocol.” https://datatracker.ietf.org/doc/html/rfc768, 1980. RFC 768.
[46] T. Zhang, N. Pan, Y. Wang, C. Liu, and S. Hu, “Transcranial focused ultrasound neuromodulation: A review of the excitatory and inhibitory effects on brain activity in human and animals,” Frontiers in Human Neuroscience, vol. 15, p. 749162, 2021.
[47] W. Lee and B. Garra, “How to interpret the ultrasound output display standard for higher acoustic output diagnostic ultrasound devices: Version 2,” Journal of ultrasound in medicine : official journal of the American Institute of Ultrasound in Medicine, vol. 23, pp. 723–6, 06 2004.
[48] Y. Wexler, Y. Benjamini, and I. Golani, “Vertical exploration and dimensional modularity in mice,” Royal Society Open Science, vol. 5, no. 5, p. 180069, 2018.
[49] C. S. Bresee, H. M. Belli, Y. Luo, and M. J. Z. Hartmann, “Comparative morphology of the whiskers and faces of mice (mus musculus) and rats (rattus norvegicus),” Journal of Experimental Biology, vol. 226, p. jeb245597, 10 2023.
[50] A. L. Paulson, L. Zhang, A. M. Prichard, and A. C. Singer, “40 hz sensory stimulation enhances ca3-ca1 coordination and prospective coding during navigation in a mouse model of alzheimer’s disease,” bioRxiv, 2024.
[51] H. Iaccarino, A. Singer, A. Martorell, A. Rudenko, F. Gao, T. Gillingham, H. Mathys, J. Seo, O. Kritskiy, F. Abdurrob, et al., “Gamma frequency entrainment attenuates amyloid load and modifies microglia,” Nature, vol. 540, no. 7632, pp. 230–235, 2016.
[52] Z. Li, J. Li, S. Wang, X. Wang, J. Chen, and L. Qin, “Laminar profile of auditory steady-state response in the auditory cortex of awake mice,” Frontiers in Systems Neuroscience, vol. 15, p. 636395, 2021.
[53] R. Y. Cho, C. P. Walker, N. R. Polizzotto, T. A. Wozny, C. Fissell, C.-M. A. Chen, and D. A. Lewis, “Development of sensory gamma oscillations and cross-frequency coupling from childhood to early adulthood,” Cerebral Cortex, vol. 25, pp. 1509–1518, 12 2013.
[54] C.-S. Gong, Z.-H. Huang, and H.-L. Liu, “Development of airborne ultrasound system for freely-moving object tracking and real-time sonication,” IEEE Sensors Letters, 2026. | -
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101731 | -
dc.description.abstract | 傳統超音波治療多仰賴麻醉或機械固定以限制動物活動,確保聲場聚焦之穩定性與治療準確度,然而此操作方式不僅降低實驗靈活性,亦對動物福祉造成負擔,並限制其於長期監測與重複性刺激等應用情境之可行性。為突破此限制,本研究提出一套整合超音波感測、機器學習預測與相位陣列控制之即時聚焦系統,實現對自由移動目標之非接觸式追蹤與能量聚焦。系統以超音波接收訊號為輸入特徵,透過 PointNet 模型即時預測目標頭部座標,進一步驅動 10×10 超音波相位陣列進行聚焦控制。整體架構可於無影像輔助情境下獨立運作,具備即時追蹤與對應聚焦能力。實驗結果顯示,系統可穩定產生對應位置之聚焦聲壓,量測結果與理論預期高度一致,展現良好之準確性與穩定性。另透過週期性訊號調變與脈波寬度調變,可實現 40 Hz 或更低頻率之刺激模式,進一步拓展其於神經調控、節律誘發等應用場域之潛力。 | zh_TW
dc.description.abstract | Conventional ultrasound therapies often rely on anesthesia or mechanical restraints to limit animal movement, ensuring stability of the acoustic focus and treatment accuracy. However, such procedures reduce experimental flexibility, impose burdens on animal welfare, and constrain the feasibility of long-term monitoring and repeated stimulation applications. In response to these limitations, this study develops a real-time focusing system that integrates ultrasonic sensing, machine learning–based prediction, and phased array control, enabling non-contact tracking and energy focusing on freely moving targets. The system utilizes received ultrasonic signals as input features and employs a PointNet-based model to predict the target’s head position in real time, which subsequently drives a 10×10 ultrasonic phased array for focusing control. The overall architecture can operate independently without visual assistance, supporting real-time tracking and corresponding focal control. Experimental results demonstrate that the system consistently generates focused acoustic pressure at the predicted target positions, with measured values closely matching theoretical expectations, indicating strong accuracy and stability. Furthermore, through periodic signal modulation and pulse width modulation, the system can deliver stimulation patterns at 40 Hz or lower, expanding its potential for applications in neural modulation, rhythm entrainment, and other behavioral research contexts. | en
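As a rough illustration of the focusing and modulation scheme the abstract describes, the sketch below computes per-element drive phases for a 10×10 planar array and a simple on/off burst envelope for low-frequency stimulation. Only the 10×10 array size and the 40 Hz stimulation rate come from the abstract; the 40 kHz carrier, 10 mm element pitch, and speed of sound are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np

C_AIR = 343.0     # speed of sound in air [m/s], assumed (~20 degC)
F_CARRIER = 40e3  # carrier frequency [Hz], assumed (typical airborne arrays)
PITCH = 0.010     # element pitch [m], assumed

def focus_phases(target, n=10):
    """Return an (n, n) array of drive phases [rad] focusing at `target`.

    Elements lie in the z=0 plane, centered on the origin. Each element i
    is phase-advanced by its path length d_i so all wavefronts arrive at
    `target` in phase: phase_i = (2*pi*f/c) * d_i, wrapped to [0, 2*pi).
    """
    idx = (np.arange(n) - (n - 1) / 2) * PITCH
    xs, ys = np.meshgrid(idx, idx)  # element x/y positions [m]
    d = np.sqrt((xs - target[0])**2 + (ys - target[1])**2 + target[2]**2)
    k = 2 * np.pi * F_CARRIER / C_AIR  # wavenumber [rad/m]
    return (k * d) % (2 * np.pi)

def burst_envelope(t, f_mod=40.0, duty=0.5):
    """On/off gating of the carrier at `f_mod` Hz (e.g. 40 Hz stimulation).

    Returns 1.0 while the burst is on and 0.0 while off; the duty cycle
    stands in for the pulse-width modulation mentioned in the abstract.
    """
    return ((t * f_mod) % 1.0 < duty).astype(float)

# Focus 15 cm above the array center (hypothetical target position).
phases = focus_phases(target=(0.0, 0.0, 0.15))
print(phases.shape)  # → (10, 10)
```

In a real controller these phases would be quantized to the FPGA's phase resolution and streamed to the driver channels each time the predicted head coordinate updates; this sketch only shows the geometry-to-phase step.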
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2026-03-04T16:07:23Z. No. of bitstreams: 0 | en
dc.description.provenance | Made available in DSpace on 2026-03-04T16:07:23Z (GMT). No. of bitstreams: 0 | en
dc.description.tableofcontents | Oral Examination Committee Certification i
Acknowledgements ii
Chinese Abstract iii
Abstract iv
Table of Contents v
List of Figures viii
List of Tables xii
Chapter 1 Introduction 1
1.1 Overview of Ultrasound Technology and Applications 1
1.2 Characteristics and Current Limitations of Medical Ultrasound 4
1.3 Potential and Advantages of Airborne Acoustic Radiation Force Delivery 6
1.4 Airborne Ultrasonic Phased Arrays 7
1.5 Potential of Non-Contact Focusing Combined with Automatic Sensing 10
1.6 Research Motivation and Contributions 13
Chapter 2 Methods and Theory 14
2.1 System Overview 14
2.2 Hardware and Logic Design of the Phased Array System 16
2.2.1 Data Transmission and Command Synchronization 16
2.2.2 Logic Control and Drive Signal Generation 17
2.2.3 Analog Driving and Phased Array Control 18
2.3 Ultrasound Focusing and Phase Calculation Methods 19
2.4 Circuit Architecture and Receiving Principles of the Ultrasonic Receiver Module 22
2.4.1 Module Circuit Architecture 22
2.4.2 Operating Principles of the Receiver Module 23
2.4.3 Storage of Received Signals within the Tracking Field 24
2.5 Machine Learning Model Architecture and Feature Processing 25
2.5.1 MLP Model Architecture 26
2.5.2 PointNet Model Architecture 27
2.5.3 Feature Processing of Received Data 31
2.6 Training Data Collection and Automated Labeling Strategy 35
2.6.1 Automatic Head Coordinate Labeling Method 35
2.6.2 Pairing of Training and Label Sets 44
2.6.3 Distribution Characteristics of Training Data within the Field 45
2.7 Phased Array Simulation Software and System Visualization 46
2.7.1 Image Capture and Coordinate Import 46
2.7.2 Generation of Simulated Acoustic Radiation Force Maps 48
2.7.3 Image Overlay 52
2.7.4 Transducer Phase Calculation 52
2.8 Scenario and Experimental Design 53
2.8.1 Experimental Objectives 53
2.8.2 Hardware Equipment 53
2.8.3 Experimental Planning 56
Chapter 3 Experimental Results and Discussion 60
3.1 Coordinate Prediction Accuracy and Performance Evaluation 60
3.1.1 Analysis and Comparison of Machine Learning Model Performance 60
3.1.1.1 Stationary State 61
3.1.1.2 Moving State 64
3.1.2 Model Comparison and Discussion 68
3.2 Measurement and Control Analysis of the Focused Ultrasound Field 73
3.2.1 Acoustic Field Measurement and Accuracy Analysis 73
3.2.2 Focused Energy Distribution Analysis 77
3.2.3 Sonication Field Coverage Improvement Analysis 79
3.2.4 Carrier Frequency Settings and Applications 84
3.2.5 Acoustic Intensity Analysis and Discussion 88
3.3 Real-Time Focusing Performance and Stability of the Integrated System 91
3.3.1 Real-Time Focusing Force Analysis under Dynamic Tracking 91
3.3.2 Visualization Results of Real-Time Focus Tracking 94
Chapter 4 Conclusions and Future Prospects 96
4.1 Conclusions 96
4.2 Future Prospects 98
References 100
Appendix A: Implementation Demonstration Video 107 | -
dc.language.iso | zh_TW | -
dc.subject | 空氣超音波 | -
dc.subject | 超音波相位陣列 | -
dc.subject | 機器學習座標預測 | -
dc.subject | 自由移動目標追蹤 | -
dc.subject | 即時聚焦控制 | -
dc.subject | 低頻調變 | -
dc.subject | Airborne ultrasound | -
dc.subject | Ultrasonic phased array | -
dc.subject | Machine learning-based localization | -
dc.subject | Freely-moving target tracking | -
dc.subject | Real-time focusing control | -
dc.subject | Low-frequency modulation | -
dc.title | 空氣傳遞超音波模組整合於自由移動物體追蹤與即時聲波照射系統 | zh_TW
dc.title | Integration of Airborne Ultrasound Module for Freely-Moving Object Tracking and Real-Time Sonication | en
dc.type | Thesis | -
dc.date.schoolyear | 114-2 | -
dc.description.degree | 碩士 | -
dc.contributor.oralexamcommittee | 沈哲州;李昇憲;謝宗勳;邱錫彥 | zh_TW
dc.contributor.oralexamcommittee | Che-Chou Shen;Sheng-Shian Li;Tsung-Hsun Hsieh;Shin-Yan Chiou | en
dc.subject.keyword | 空氣超音波,超音波相位陣列,機器學習座標預測,自由移動目標追蹤,即時聚焦控制,低頻調變 | zh_TW
dc.subject.keyword | Airborne ultrasound,Ultrasonic phased array,Machine learning-based localization,Freely-moving target tracking,Real-time focusing control,Low-frequency modulation | en
dc.relation.page | 107 | -
dc.identifier.doi | 10.6342/NTU202503295 | -
dc.rights.note | 同意授權(全球公開) | -
dc.date.accepted | 2026-02-25 | -
dc.contributor.author-college | 電機資訊學院 | -
dc.contributor.author-dept | 電機工程學系 | -
dc.date.embargo-lift | 2031-02-24 | -
Appears in collections: Department of Electrical Engineering

Files in this item:
File | Size | Format
ntu-114-2.pdf | 11.1 MB | Adobe PDF (publicly available online after 2031-02-24)