Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/82206

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 陳中平(Chung-Ping Chen) | |
| dc.contributor.author | Chi-Sheng Chen | en |
| dc.contributor.author | 陳麒升 | zh_TW |
| dc.date.accessioned | 2022-11-25T06:33:40Z | - |
| dc.date.copyright | 2021-11-10 | |
| dc.date.issued | 2021 | |
| dc.date.submitted | 2021-07-01 | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/82206 | - |
| dc.description.abstract | Major depressive disorder (MDD) is now regarded as a chronic, deteriorating psychiatric illness that carries the risk of comorbidity with other symptoms and even suicidal ideation. A certain proportion of MDD patients show no significant improvement after trials of several antidepressant medications, and such patients have been found to have some probability of responding to transcranial magnetic stimulation (TMS). The two most common TMS protocols today are repetitive TMS (rTMS) and intermittent theta-burst stimulation (iTBS); the ability to predict, before clinical treatment, each patient's antidepressant response to each parameter setting and to recommend personalized, precise, and effective treatment parameters will be an important technology in the future. This study uses clinical EEG data from 129 MDD patients to train several classical and quantum machine learning algorithms and to build a novel deep learning model that predicts treatment response for both TMS modalities, with particular attention to iTBS, for which almost no prior work exists, while making every effort to avoid model overfitting in choices ranging from model type to the deep learning model's loss function. In the results, the proposed attentional convolutional time-series deep learning model achieved the best performance among common deep learning models in predicting the efficacy of both TMS modalities. Among classical machine learning methods, both bagging and boosting ensemble models were used, and this study also proposes a data preprocessing algorithm that improves model sensitivity; their combined performance exceeds that of the support vector machine used in previous work. Finally, quantum machine learning models were trained on a real superconducting quantum computer, and the results show that they perform considerably better than their classical counterparts. | zh_TW |
| dc.description.provenance | Made available in DSpace on 2022-11-25T06:33:40Z (GMT). No. of bitstreams: 1 U0001-2906202117440700.pdf: 5917102 bytes, checksum: 0c5cff8a9b57793ad084924ff6ca8ed8 (MD5) Previous issue date: 2021 | en |
| dc.description.tableofcontents | Thesis Committee Approval Certificate (scanned copy) iii Acknowledgements iv Chinese Abstract 1 ABSTRACT 2 CONTENTS 4 LIST OF FIGURES 8 LIST OF TABLES 12 Chapter 1 Introduction 16 1.1 Thesis Motivation 16 1.2 Major Depressive Disorder (MDD) 17 1.3 Advanced Treatments for MDD Patients 18 1.4 Electroencephalogram 21 1.4.1 Prediction of TMS Response by EEG 21 1.4.2 Prediction of TMS Response by Artificial Intelligence with EEG Data 22 1.4.3 The Overfitting Problem on SVM 24 1.5 Thesis Organization 27 Chapter 2 Previous Research 28 2.1 Rostral Anterior Cingulate Cortex (rACC)-Engaging Cognitive Task (RECT) 28 2.2 Machine Learning on MDD EEG Data 30 2.3 Deep Learning on MDD EEG Data 31 2.4 Quantum Machine Learning on Time Series Data 31 2.5 Previous Work is Overfitting on the New Dataset 33 2.6 Bagging/Boosting is Less Overfitting than SVM 34 2.7 Thesis Aims and Hypothesis 35 Chapter 3 Methodology 37 3.1 EEG Data Acquisition 38 3.1.1 Psychiatric Evaluations 38 3.1.2 EEG Data Acquisition 39 3.1.3 MDD EEG Datasets 40 3.1.3.1 Randomized Controlled Trial Data (RCTD) (Previous Dataset) 42 3.1.3.2 Clinical OPD Dataset 43 3.2 EEG Data Preprocessing 45 3.2.1 EEG Signal Resampling 45 3.2.2 Band Pass Filter 46 3.2.3 Independent Component Analysis 46 3.3 ACTSNet: Attentional Convolution Time Series Neural Network 48 3.3.1 ACTSNet Architecture 50 3.3.2 Prototypical Learning 52 3.4 Multivariate Time Series Classification (MTSC) Methods 55 3.4.1 Bag-of-SFA-Symbols Ensemble (BOSS Ensemble) 55 3.5 Classical Machine Learning Methods 56 3.5.1 Data Preprocessing 57 3.5.2 Feature Extraction 59 3.5.2.1 Largest Lyapunov Exponent 60 3.5.2.2 Detrended Fluctuation Analysis 61 3.5.2.3 Katz Fractal Dimension (KFD) and Higuchi Fractal Dimension (HFD) 62 3.5.2.3.1 Katz Fractal Dimension 62 3.5.2.3.2 Higuchi Fractal Dimension 63 3.5.2.4 Approximate Entropy 64 3.5.2.5 Welch Periodogram 65 3.5.3 Feature Selection 65 3.5.4 Logistic Regression 66 3.5.5 Support Vector Machine (SVM) 67 3.5.6 Bagging and Boosting Classification Models 
69 3.5.6.1 XGBoost 70 3.5.6.2 CatBoost 73 3.5.6.3 Random Forest 76 3.5.7 The Booster Transformation 80 3.6 Quantum Machine Learning Methods 82 3.6.1 Data Preprocessing, Feature Extraction and Selection 83 3.6.2 Introduction to Quantum Computing 84 3.6.3 Quantum Support Vector Machine (QSVM) 86 Chapter 4 Experiment Results 93 4.1 Deep Learning Results 93 4.1.1 Deep Machine Learning Results 93 4.2 Multivariate Time Series Classification (MTSC) Results 98 4.2.1 MTSC Results 98 4.2.2 BOSS Ensemble vs. ACTSNet 99 4.3 The Statistically Significant Features for Machine Learning on OPD Dataset 100 4.3.1 OPD Dataset 100 4.3.2 Mixed Dataset 105 4.3.3 Dataset Comparison 113 4.4 Classical Machine Learning Results 115 4.4.1 Classical Machine Learning Results 115 4.4.1.1 Compare with Linear Method and SVM on the Clinical Trial Dataset 115 4.4.1.2 SVM and the Overfitting Problem 118 4.4.1.3 Compare with Previous Work on the Clinical Trial Dataset 119 4.4.1.4 Compare with Previous Work on the OPD Dataset 121 4.4.1.5 Compare with Previous Work on the Mixed Dataset 122 4.4.2 Booster Transformation Results 123 4.4.2.1 Compare Booster with Non-Booster Methods on Random Forest with rTMS Mixed Dataset 124 4.4.2.2 Compare Booster with Non-Booster Methods on XGBoost with rTMS Mixed Dataset 125 4.4.2.3 Compare Booster with Non-Booster Methods on CatBoost with rTMS Mixed Dataset 127 4.4.2.4 Compare Booster with Non-Booster Methods on Random Forest with iTBS Mixed Dataset 128 4.4.2.5 Compare Booster with Non-Booster Methods on XGBoost with iTBS Mixed Dataset 130 4.4.2.6 Compare Booster with Non-Booster Methods on CatBoost with iTBS Mixed Dataset 131 4.4.3 Booster-Classical Machine Learning Results 134 4.4.3.1 Compare with Previous Work on the OPD Dataset 134 4.4.3.2 Compare with Previous Work on the Mixed Dataset 135 4.4.4 Feature Importance and Digital Biomarkers 137 4.4.4.1 rTMS Data on Bagging and Boosting Machine Learning Algorithms 137 4.4.4.2 iTBS OPD Data on Bagging and Boosting 
Machine Learning Algorithms 140 4.5 Quantum Machine Learning Results 143 Chapter 5 Conclusion 145 Chapter 6 Future Work 147 Reference 149 " | |
| dc.language.iso | en | |
| dc.subject | 重度憂鬱症 | zh_TW |
| dc.subject | 量子計算 | zh_TW |
| dc.subject | 深度學習 | zh_TW |
| dc.subject | 機器學習 | zh_TW |
| dc.subject | 經顱磁刺激 | zh_TW |
| dc.subject | 腦電圖 | zh_TW |
| dc.subject | Deep learning | en |
| dc.subject | Quantum computing | en |
| dc.subject | Transcranial magnetic stimulation | en |
| dc.subject | Electroencephalography | en |
| dc.subject | Major depressive disorder | en |
| dc.subject | Machine learning | en |
| dc.title | 基於注意力機制之時間序列原型卷積神經網路與傳統及量子機器學習模型應用於重度憂鬱症腦波之經顱磁刺激抗憂鬱療效預測與分析 | zh_TW |
| dc.title | EEG Analysis for Prediction of Antidepressant Responses of Transcranial Magnetic Stimulation in Major Depressive Disorder Based on Attentional Convolution Time Series Prototypical Neural Network Model and Classical/Quantum Machine Learning Approaches | en |
| dc.date.schoolyear | 109-2 | |
| dc.description.degree | 碩士 | |
| dc.contributor.author-orcid | 0000-0003-0807-0217 | |
| dc.contributor.coadvisor | 李正達(Cheng-Ta Li) | |
| dc.contributor.oralexamcommittee | 黃聖傑(Hsin-Tsai Liu),楊智傑(Chih-Yang Tseng) | |
| dc.subject.keyword | 重度憂鬱症,腦電圖,經顱磁刺激,機器學習,深度學習,量子計算, | zh_TW |
| dc.subject.keyword | Major depressive disorder,Electroencephalography,Transcranial magnetic stimulation,Machine learning,Deep learning,Quantum computing, | en |
| dc.relation.page | 167 | |
| dc.identifier.doi | 10.6342/NTU202101201 | |
| dc.rights.note | 未授權 | |
| dc.date.accepted | 2021-07-01 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 生醫電子與資訊學研究所 | zh_TW |
| dc.date.embargo-lift | 2026-06-29 | - |
| Appears in Collections: | 生醫電子與資訊學研究所 (Graduate Institute of Biomedical Electronics and Bioinformatics) | |
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| U0001-2906202117440700.pdf (Restricted Access) | 5.78 MB | Adobe PDF | View/Open |
