Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99463

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 丘政民 | zh_TW |
| dc.contributor.advisor | Jeng-Min Chiou | en |
| dc.contributor.author | 塗銘鈞 | zh_TW |
| dc.contributor.author | Min-Jun Tu | en |
| dc.date.accessioned | 2025-09-10T16:21:59Z | - |
| dc.date.available | 2025-09-11 | - |
| dc.date.copyright | 2025-09-10 | - |
| dc.date.issued | 2025 | - |
| dc.date.submitted | 2025-07-29 | - |
| dc.identifier.citation | James MacQueen. Some methods for classification and analysis of multivariate observations. In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pages 281–298. University of California Press, 1967. James O. Ramsay and Bernard W. Silverman. Functional Data Analysis. Springer, New York, 2nd edition, 2005. Jane-Ling Wang, Jeng-Min Chiou, and Hans-Georg Müller. Functional data analysis. Annual Review of Statistics and Its Application, 3(1):257–295, 2016. Christophe Abraham, Pierre-André Cornillon, Eric Matzner-Løber, and Nicolas Molinari. Unsupervised curve clustering using B-splines. Scandinavian Journal of Statistics, 30(3):581–595, 2003. Jeng-Min Chiou and Pai-Ling Li. Functional clustering and identifying substructures of longitudinal data. Journal of the Royal Statistical Society Series B: Statistical Methodology, 69(4):679–699, 2007. Qingzhi Zhong and Xinyuan Song. Functional nonlinear principal component analysis. Computational Statistics & Data Analysis, 209:108169, 2025. Geoffrey E. Hinton and Ruslan R. Salakhutdinov. Reducing the dimensionality of data with neural networks. Science, 313(5786):504–507, 2006. Tsung-Yu Hsieh, Yiwei Sun, Suhang Wang, and Vasant Honavar. Functional autoencoders for functional data representation learning. In Proceedings of the 2021 SIAM International Conference on Data Mining (SDM), pages 666–674. SIAM, 2021. Sidi Wu, Cédric Beaulac, and Jiguo Cao. Functional autoencoder for smoothing and representation learning. Statistics and Computing, 34(6):203, 2024. Fabrice Rossi and Brieuc Conan-Guez. Functional multi-layer perceptron: a non-linear tool for functional data analysis. Neural Networks, 18(1):45–60, 2005. Aniruddha Rajendra Rao and Matthew Reimherr. Nonlinear functional modeling using neural networks. Journal of Computational and Graphical Statistics, 32(4):1248–1257, 2023. Junwen Yao, Jonas Mueller, and Jane-Ling Wang. Deep learning for functional data analysis with adaptive basis layers. In International Conference on Machine Learning, pages 11898–11908. PMLR, 2021. Diederik P. Kingma and Jimmy Ba. Adam: A method for stochastic optimization. In International Conference on Learning Representations, 2015. Lutz Prechelt. Early stopping - but when? In Neural Networks: Tricks of the Trade, pages 55–69. Springer, 2002. Maziar Moradi Fard, Thibaut Thonet, and Eric Gaussier. Deep k-means: Jointly clustering with k-means and learning representations. Pattern Recognition Letters, 138:185–192, 2020. Pasi Fränti and Sami Sieranoja. Clustering accuracy. Applied Computing and Intelligence, 2024. Lawrence Hubert and Phipps Arabie. Comparing partitions. Journal of Classification, 2:193–218, 1985. Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning. Springer, 2009. Bob Kemp, Aeilko H. Zwinderman, Bert Tuk, Hilbert A. C. Kamphuisen, and Josefien J. L. Oberye. Analysis of a sleep-dependent neuronal feedback loop: the slow-wave microcontinuity of the EEG. IEEE Transactions on Biomedical Engineering, 47(9):1185–1194, 2000. Xiang Zhang, Ziyuan Zhao, Theodoros Tsiligkaridis, and Marinka Zitnik. Self-supervised contrastive pre-training for time series via time-frequency consistency. Advances in Neural Information Processing Systems, 35:3988–4003, 2022. | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99463 | - |
| dc.description.abstract | 本研究提出傳統 K-中心函數分群法(KCFC)的非線性延伸,建立一種嶄新的函數型資料分群方法:K-中心函數型自編碼器分群法(KCFAEC)。本方法將 KCFC 中的函數主成分分析(FPCA)替換為非線性的函數型自編碼器(FAE),後者將傳統的自編碼器架構延伸至函數型資料上。與 KCFC 相同,本方法以重建誤差作為重新指派群集的準則,透過迭代更新各群集的模型與樣本的群集標籤,達到分群目的。在此架構下,每個群集可學習自身的非線性表徵空間,藉以捕捉群集間潛在的結構差異。模擬研究與真實資料實驗(包含音素及腦電波資料)顯示,本方法的分群表現優於或相當於現有傳統方法,尤其在群集間具有明顯非線性結構差異的情境中,更能凸顯 KCFAEC 的效能,證實了本方法在實務上的價值與應用潛力。 | zh_TW |
| dc.description.abstract | This study proposes a nonlinear extension of the traditional K-centres functional clustering (KCFC) method, introducing a novel clustering approach for functional data: the K-Centres Functional Autoencoder Clustering (KCFAEC). In the proposed method, functional principal component analysis (FPCA) used in KCFC is replaced with a nonlinear functional autoencoder (FAE), which extends the autoencoder network to functional data. Similar to KCFC, the reconstruction loss is used for cluster reassignment, and the clustering process alternates between updating cluster-specific models and cluster labels. This process enables each cluster to possess its own nonlinear representation space, capturing latent structural differences across clusters. Simulation studies and real data examples—including the Phoneme and EEG datasets—demonstrate that KCFAEC consistently outperforms or matches other traditional methods in terms of clustering performance, especially in scenarios where clusters exhibit distinct nonlinear substructures, confirming the effectiveness and practical value of the proposed method. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-09-10T16:21:59Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2025-09-10T16:21:59Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Acknowledgements i; Abstract (Chinese) ii; Abstract iii; Contents v; List of Figures vii; List of Tables x; Chapter 1 Introduction 1; Chapter 2 Preliminaries 5; 2.1 Overview of K-Centres Functional Clustering 5; 2.1.1 Basic Concepts 5; 2.1.2 The Clustering Procedure 7; 2.2 Overview of Functional Autoencoder 8; 2.2.1 Basic Concepts 8; 2.2.2 Mathematical Formulation 9; Chapter 3 Methodology 12; 3.1 K-Centres Functional Autoencoder Clustering 12; 3.2 Functional Autoencoder Training Procedure 16; Chapter 4 Simulations 21; 4.1 Experiment Settings 21; 4.1.1 Data Generations 21; 4.1.2 Comparison Methods and Hyperparameters 26; 4.1.3 Performance Metrics 28; 4.2 Experiment Results 29; 4.2.1 Case 1-1 Results 30; 4.2.2 Case 1-2 Results 31; 4.2.3 Case 2-1 Results 32; 4.2.4 Case 2-2 Results 33; 4.2.5 Case 3 Results 34; 4.2.6 Reconstruction Results 35; 4.2.7 Effects of Network Architecture 36; 4.2.8 Computation Time Comparison 37; Chapter 5 Real Data Examples 38; 5.1 Phoneme Dataset 38; 5.2 Sleeping EEG Dataset 40; Chapter 6 Conclusion 43; References 45; Appendix A — Supplementary Results 48 | - |
| dc.language.iso | en | - |
| dc.subject | 函數型資料 | zh_TW |
| dc.subject | K-中心分群法 | zh_TW |
| dc.subject | 自編碼器 | zh_TW |
| dc.subject | 非線性表徵 | zh_TW |
| dc.subject | K-centres clustering | en |
| dc.subject | Autoencoder | en |
| dc.subject | Functional data | en |
| dc.subject | Nonlinear representation | en |
| dc.title | 透過函數型自編碼器實現K-中心函數分群法的非線性擴展 | zh_TW |
| dc.title | A Nonlinear Extension of K-Centres Functional Clustering via Functional Autoencoder | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 113-2 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | 李百靈;林書勤 | zh_TW |
| dc.contributor.oralexamcommittee | Pai-Ling Li;Shu-Chin Lin | en |
| dc.subject.keyword | 函數型資料, K-中心分群法, 自編碼器, 非線性表徵 | zh_TW |
| dc.subject.keyword | Functional data, K-centres clustering, Autoencoder, Nonlinear representation | en |
| dc.relation.page | 49 | - |
| dc.identifier.doi | 10.6342/NTU202502580 | - |
| dc.rights.note | 未授權 (Not authorized) | - |
| dc.date.accepted | 2025-07-30 | - |
| dc.contributor.author-college | 理學院 | - |
| dc.contributor.author-dept | 統計與數據科學研究所 | - |
| dc.date.embargo-lift | N/A | - |
| Appears in Collections: | 統計與數據科學研究所 | |
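The abstract above describes an alternating procedure: fit one autoencoder per cluster, then reassign each curve to the cluster whose model reconstructs it with the smallest error. A minimal sketch of that idea, assuming densely observed curves stored as rows of a matrix and a tiny one-hidden-layer autoencoder; all function names, the architecture, and the hyperparameters here are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def train_autoencoder(X, d=2, epochs=200, lr=0.01, seed=0):
    """Train a tiny tanh autoencoder (p -> d -> p) by gradient descent."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    W1 = rng.normal(0, 0.1, (p, d))   # encoder weights
    W2 = rng.normal(0, 0.1, (d, p))   # decoder weights
    for _ in range(epochs):
        Z = np.tanh(X @ W1)           # latent representation
        Xhat = Z @ W2                 # reconstruction
        E = Xhat - X
        gW2 = Z.T @ E / n             # gradient of mean squared error w.r.t. W2
        gW1 = X.T @ ((E @ W2.T) * (1 - Z**2)) / n  # backprop through tanh
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def recon_error(X, model):
    """Per-curve squared reconstruction error under a trained model."""
    W1, W2 = model
    Xhat = np.tanh(X @ W1) @ W2
    return ((X - Xhat) ** 2).sum(axis=1)

def kcfaec(X, K=2, iters=10, seed=0):
    """Alternate between fitting cluster-specific autoencoders and
    reassigning each curve to its best-reconstructing cluster."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, K, X.shape[0])
    for _ in range(iters):
        models = []
        for k in range(K):
            Xk = X[labels == k]
            if len(Xk) == 0:          # guard: refit on all data if a cluster empties
                Xk = X
            models.append(train_autoencoder(Xk, seed=k))
        errs = np.stack([recon_error(X, m) for m in models], axis=1)
        new_labels = errs.argmin(axis=1)
        if np.array_equal(new_labels, labels):  # converged: no reassignments
            break
        labels = new_labels
    return labels
```

Because each cluster trains its own encoder/decoder pair, each cluster can learn a different nonlinear representation space, which is the structural difference the abstract emphasizes relative to a single shared FPCA basis.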
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-113-2.pdf (Restricted Access) | 6.52 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
