Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/20944
Full metadata record (DC field, value, language):
dc.contributor.advisor: 陳宏銘 (Homer H. Chen)
dc.contributor.author [en]: Chung-Hsien Tsai
dc.contributor.author [zh_TW]: 蔡宗憲
dc.date.accessioned: 2021-06-08T03:11:41Z
dc.date.issued: 2017
dc.date.submitted: 2017-03-24
dc.identifier.citation[1] C. G. Tsai and C. P. Chen, “Musical Tension over Time: Listeners’ Physiological Responses to the ‘Retransition’ in Classical Sonata Form,” J. of New Music Res., vol. 44, no. 3, pp. 271-286, 2015.
[2] B. Kaneshiro, D. T. Nguyen, J. P. Dmochowski, A. M. Norcia, and J. Berger, “Neurophysiological and Behavioral Measures of Musical Engagement,” in Int’l Conf. on Music Percept. and Cogn., 2016.
[3] Y. P. Lin, Y. H. Yang, and T. P. Jung, “Fusion of electroencephalogram dynamics and musical contents for estimating emotional responses in music listening,” Front. Neurosci., vol. 8, no. 94, 2014.
[4] Y. P. Lin, C. H. Wang and T. P. Jung, T. L. Wu, S. K. Jeng, J. R. Duann, and J. H. Chen, “EEG-based Emotion Recognition in Music Listening,” IEEE Trans. on Biomed. Eng., vol. 57, no. 7, pp. 1798–1806, 2010.
[5] S. Stober, T. Prätzlich, and M. Müller, “Brain Beats: Tempo Extraction from EEG Data,” in Proc. 17th Int. Soc. Music Inform. Retrieval Conf., pp. 276-282, 2016.
[6] S. Stober, D.J. Cameron, and J.A. Grahn, “Using convolutional neural networks to recognize rhythm stimuli from electroencephalography recordings,” in Adv. in Neural Inform. Process. Syst., pp. 1449–1457, 2014.
[7] B. Kaneshiro, J. Berger, M. P. Guimaraes, and P. Suppes, “An Exploration of Tonal Expectation Using Single-Trial EEG Classification,” Proc. of the 12th Int’l Conf. on Music Percept. and Cogn., 2012.
[8] R. S. Schaefer, J. Farquhar, Y. Blokland, M. Sadakata, and P. Desain, “Name that tune: Decoding music from the listening brain,” NeuroImage, vol. 56, no. 2, pp. 843–849, 2011.
[9] E. W. Large, “Resonating to musical rhythm: theory and experiment,” The psychology of time (Grondin S, ed.), pp. 189–232, 2008.
[10] S. Nozaradan, I. Peretz, M. Missal, and A. Mouraux, “Tagging the neuronal entrainment to beat and meter,” The J. of Neurosci., vol. 31, no. 28, pp. 10234–10240, 2011.
[11] B. Kaneshiro and J. P. Dmochowski. “Neuroimaging methods for music information retrieval: Current findings and future prospects,” in Proc. 16th Int. Soc. Music Inform. Retrieval Conf., pp. 538–544, 2015.
[12] S. Nozaradan, I. Peretz, and A. Mouraux, “Selective Neuronal Entrainment to the Beat and Meter Embedded in a Musical Rhythm,” The J. of Neurosci., vol. 32, no. 49: pp. 17572–17581, Dec. 2012.
[13] J. C. Brown, “Determination of the meter of musical scores by autocorrelation,” J. Acoust. Soc. Am., vol. 4, pp. 1953-1957, 1993.
[14] F. Gouyon and S. Dixon, “A review of automatic rhythm description systems,” Comput. Music J., vol. 29, pp. 34–54, 2005.
[15] S. Dixon, E. Pampalk, and G. Widmer., “Classification of dance music by periodicity patterns,” in 4th Int. Conf. on Music Inform. Retrieval, 2003.
[16] P. Toiviainen and T. Eerola, “Autocorrelation in meter induction: The role of accent structure,” J. Acoust. Soc. Am., vol. 119, 2006.
[17] F. Gouyon and P. Herrera, “Determination of the meter of musical audio signals: Seeking recurrences in beat segment descriptors,” in 114th AES Convention, 2003.
[18] A. Pikrakis, I. Antonopoulos, and S. Theodoridis, “Music Meter And Tempo Tracking From Raw Polyphonic Audio,” 5th Int. Conf. on Music Inform. Retrieval, 2004.
[19] M. Gainza, “Automatic musical meter detection,” IEEE Int. Conf. on Acoust., Speech, and Signal Process., pp. 329–332, 2009.
[20] M. Davies and M. D. Plumbley, 'Context-dependent beat tracking of musical audio,' IEEE Trans. on Audio, Speech and Language Process., vol. 15, no. 3, pp. 1009-1020, 2007.
[21] W. O. Tatum, B. A. Dworetzky, and D. L. Schomer, “Artifact and recording concepts in EEG,” J. of Clin. Neurophysiol., vol. 28, pp. 252-263, Jun. 2011.
[22] T. W. Lee, M. Girolami, and T. J. Sejnowski, “Independent Component Analysis Using an Extended Infomax Algorithm for Mixed Subgaussian and Supergaussian Sources,” Neural Comput., vol. 11, no. 2, pp. 417–441, 1999.
[23] S. Stober, A. Sternin, A.M. Owen, and J.A. Grahn, “Towards music imagery information retrieval: Introducing the OpenMIIR dataset of EEG recordings from music perception an imagination,” in Proc. 16th Int. Soc. Music Inform. Retrieval Conf., pp. 763–769, 2015.
[24] S.J. Luck, “An introduction to the event-related potential technique,” MIT press, 2014.
[25] S. Stober, A. Sternin, A. M. Owen, and J. A. Grahn, “Deep feature learning for EEG recordings,” arXiv preprint arXiv:1511.04306, 2015.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/20944
dc.description.abstract [zh_TW]: Using neurophysiological responses for music information retrieval is a recent trend; its feasibility rests on the fact that music is encoded by the brain during listening. In this thesis, we attempt to classify musical meter using electroencephalography (EEG). Previous experiments show that brainwave activity is influenced by the beat frequency and the musical meter: a neural entrainment effect appears, producing oscillations at the beat frequency and its subharmonics, and the meter can be determined from the integer ratio between the beat frequency and those subharmonics. With ordinary music as the stimulus, however, the more complex rhythm produces oscillations at frequencies beyond these, and EEG signals are contaminated by external noise and have an inherently low signal-to-noise ratio (SNR), making it difficult to extract meter information from a listener's EEG.
To overcome these difficulties, we first raise the SNR with several denoising techniques, including independent component analysis (ICA), spatial filtering, and trial averaging. We then detect peaks in the EEG magnitude spectrum and, using musicological background knowledge together with the integer ratios among peak frequencies, identify candidate beat frequencies; finally, we analyze the magnitudes of the beat frequency's subharmonics to determine the musical meter.
We evaluated the algorithm on EEG recordings from nine listeners, each of whom heard twelve music excerpts covering two meter types, with each excerpt repeated five times while EEG was recorded. Feeding these recordings to the algorithm yields a classification accuracy of 83.33%, showing that our method can effectively classify musical meter from listeners' EEG.
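As a rough illustration of the denoising step described above, the sketch below uses the open-source MNE-Python library to remove ocular artifacts with ICA and to average the repetitions of one excerpt. The file name, event codes, filter band, and component count are hypothetical assumptions, and the spatial-filtering step is omitted; the thesis's exact settings are not given here.

```python
# Hypothetical preprocessing sketch (MNE-Python): ICA artifact removal
# followed by trial averaging. File name, event codes, and parameter
# values are illustrative assumptions, not the thesis's actual settings.
import mne

# Load one listener's recording (hypothetical file name).
raw = mne.io.read_raw_fif("listener01_raw.fif", preload=True)
raw.filter(l_freq=0.5, h_freq=30.0)  # keep the band where entrainment appears

# ICA: decompose the EEG, flag ocular components against the EOG channel,
# and reconstruct the signal without them.
ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
eog_idx, _ = ica.find_bads_eog(raw)  # assumes an EOG channel was recorded
ica.exclude = eog_idx
ica.apply(raw)

# Trial averaging: one epoch per repetition of the same excerpt, averaged
# to suppress activity that is not phase-locked to the music.
events = mne.find_events(raw)  # assumes stimulus-onset triggers exist
epochs = mne.Epochs(raw, events, event_id={"excerpt_1": 1},
                    tmin=0.0, tmax=30.0, baseline=None, preload=True)
evoked = epochs.average()  # one averaged trial per excerpt
```

Averaging repetitions of the same excerpt reinforces activity phase-locked to the stimulus while averaging out unrelated background EEG, which is why it raises the SNR of the entrainment response.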
dc.description.abstract [en]: Solving a music information retrieval (MIR) problem by analyzing neural responses is feasible because music is encoded by the brain. In this thesis, musical meter is classified using EEG signals recorded from music listeners. Previous studies show that simple rhythmic stimuli can induce EEG signals to resonate at the beat frequency (F0) of the stimulus and at the subharmonics of F0, a phenomenon called the entrainment response. The musical meter can be determined from the frequency ratio between F0 and its subharmonics. However, a music stimulus induces more complicated EEG spectra than a simple rhythmic stimulus does, and EEG signals have a low signal-to-noise ratio (SNR); both factors make it difficult to classify musical meter from the EEG of music listeners.
To overcome these difficulties, we first improve the SNR of the EEG signals with several denoising techniques, including independent component analysis (ICA), trial averaging, and spatial filtering. We then detect the peaks of the EEG magnitude spectrum and analyze the ratios of the peak frequencies to select a candidate F0. Finally, we analyze the magnitudes of the subharmonics of F0 to determine the musical meter. In our evaluation, the test EEG signals were recorded from participants listening to music stimuli in two meter types and used as input to the proposed algorithm. The accuracy of musical meter classification reaches 83.33%, showing that an MIR problem can be solved from neural responses.
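A minimal sketch of the classification idea follows, assuming an averaged single-channel trial and a plausible beat-frequency search range. The peak-picking heuristic, the prominence threshold, and the two-way duple/triple decision are illustrative assumptions rather than the thesis's exact algorithm, which also uses spatial filtering and musicological constraints.

```python
# Illustrative meter classifier: detect spectral peaks, choose a beat-
# frequency candidate F0, then compare the subharmonic magnitudes at
# F0/2 (duple) and F0/3 (triple). Thresholds and the candidate-selection
# rule are assumptions, not the thesis's published method.
import numpy as np
from scipy.signal import find_peaks

def classify_meter(x, fs, f0_range=(1.0, 3.0)):
    """x: averaged single-channel EEG trial; fs: sampling rate in Hz."""
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

    def mag_at(f):
        # Magnitude at the FFT bin nearest to frequency f.
        return spectrum[np.argmin(np.abs(freqs - f))]

    # Peak detection on the magnitude spectrum.
    peaks, _ = find_peaks(spectrum, prominence=np.median(spectrum))
    candidates = [f for f in freqs[peaks] if f0_range[0] <= f <= f0_range[1]]
    if not candidates:
        return None  # no in-range spectral peak; classification undecided
    # Take the strongest in-range peak as the beat frequency F0.
    f0 = max(candidates, key=mag_at)

    # Duple meter should reinforce the F0/2 subharmonic, triple meter F0/3.
    return "duple" if mag_at(f0 / 2) >= mag_at(f0 / 3) else "triple"
```

For a trial sampled at, say, 512 Hz, classify_meter(evoked_data, 512) returns "duple" or "triple" according to which subharmonic dominates; the search range (1.0, 3.0) Hz corresponds to roughly 60 to 180 beats per minute.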
dc.description.provenance [en]: Made available in DSpace on 2021-06-08T03:11:41Z (GMT). No. of bitstreams: 1. ntu-106-R02942069-1.pdf: 641133 bytes, checksum: c81d05b228cee99d1bfe54f9b74002b5 (MD5). Previous issue date: 2017.
dc.description.tableofcontents:
誌謝 (Acknowledgements) i
中文摘要 (Chinese Abstract) ii
ABSTRACT iii
CONTENTS iv
LIST OF FIGURES vi
LIST OF TABLES vii
Chapter 1 Introduction 1
Chapter 2 Related Work 3
2.1 Audio-based Musical Meter Classification 3
2.2 Experiment of Neural Entrainment 3
Chapter 3 EEG Entrainment and Musical Meter 5
3.1 EEG Entrainment 5
3.2 EEG Features Relevant to Musical Meter Perception 5
Chapter 4 System Overview 7
4.1 Preprocessing 8
4.2 Beat Frequency Estimation 8
4.3 Musical Meter Classification 8
Chapter 5 Preprocessing 9
5.1 Independent Component Analysis 9
5.2 Trial Averaging Technique 9
5.3 Spatial Filtering 10
Chapter 6 Musical Meter Classification 11
6.1 Spectral Peak Detection 11
6.2 Beat Frequency Estimation 11
6.3 Musical Meter Classification 12
Chapter 7 System Evaluation and Discussion 15
7.1 Experimental Setup 15
7.2 Performance Evaluation 16
7.3 Number of Trials for Averaging 18
Chapter 8 Conclusion 21
REFERENCES 22
dc.language.iso: en
dc.subject [zh_TW]: 音樂節拍分類 (musical meter classification)
dc.subject [zh_TW]: 音樂資料檢索 (music information retrieval)
dc.subject [zh_TW]: 神經導引效應 (neural entrainment)
dc.subject [zh_TW]: 腦電圖信號分析 (EEG signal analysis)
dc.subject [en]: EEG signal analysis
dc.subject [en]: neural entrainment
dc.subject [en]: Musical meter classification
dc.subject [en]: music information retrieval
dc.title [zh_TW]: 以腦波訊號分類音樂節拍
dc.title [en]: Musical Meter Classification Using EEG Signals
dc.type: Thesis
dc.date.schoolyear: 105-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 張智星 (Jyh-Shing Roger Jang), 蔡銘峰 (Ming-Feng Tsai), 楊奕軒 (Yi-Hsuan Yang)
dc.subject.keyword [zh_TW]: 音樂節拍分類, 腦電圖信號分析, 神經導引效應, 音樂資料檢索
dc.subject.keyword [en]: Musical meter classification, EEG signal analysis, neural entrainment, music information retrieval
dc.relation.page: 24
dc.identifier.doi: 10.6342/NTU201700702
dc.rights.note: 未授權 (not authorized for public release)
dc.date.accepted: 2017-03-24
dc.contributor.author-college [zh_TW]: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept [zh_TW]: 電信工程學研究所 (Graduate Institute of Communication Engineering)
Appears in Collections: 電信工程學研究所 (Graduate Institute of Communication Engineering)

Files in This Item:
File: ntu-106-1.pdf (not authorized for public access), Size: 626.11 kB, Format: Adobe PDF