NTU Theses and Dissertations Repository › 理學院 (College of Science) › 統計與數據科學研究所 (Institute of Statistics and Data Science)
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99463

Full metadata record (DC field: value — language)
dc.contributor.advisor: 丘政民 (zh_TW)
dc.contributor.advisor: Jeng-Min Chiou (en)
dc.contributor.author: 塗銘鈞 (zh_TW)
dc.contributor.author: Min-Jun Tu (en)
dc.date.accessioned: 2025-09-10T16:21:59Z
dc.date.available: 2025-09-11
dc.date.copyright: 2025-09-10
dc.date.issued: 2025
dc.date.submitted: 2025-07-29
dc.identifier.citation:
James MacQueen. Some methods for classification and analysis of multivariate observations. In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pages 281–298. University of California Press, 1967.
James O. Ramsay and Bernard W. Silverman. Functional Data Analysis. Springer, New York, 2nd edition, 2005.
Jane-Ling Wang, Jeng-Min Chiou, and Hans-Georg Müller. Functional data analysis. Annual Review of Statistics and Its Application, 3(1):257–295, 2016.
Christophe Abraham, Pierre-André Cornillon, Eric Matzner-Løber, and Nicolas Molinari. Unsupervised curve clustering using B-splines. Scandinavian Journal of Statistics, 30(3):581–595, 2003.
Jeng-Min Chiou and Pai-Ling Li. Functional clustering and identifying substructures of longitudinal data. Journal of the Royal Statistical Society Series B: Statistical Methodology, 69(4):679–699, 2007.
Qingzhi Zhong and Xinyuan Song. Functional nonlinear principal component analysis. Computational Statistics & Data Analysis, 209:108169, 2025.
Geoffrey E. Hinton and Ruslan R. Salakhutdinov. Reducing the dimensionality of data with neural networks. Science, 313(5786):504–507, 2006.
Tsung-Yu Hsieh, Yiwei Sun, Suhang Wang, and Vasant Honavar. Functional autoencoders for functional data representation learning. In Proceedings of the 2021 SIAM International Conference on Data Mining (SDM), pages 666–674. SIAM, 2021.
Sidi Wu, Cédric Beaulac, and Jiguo Cao. Functional autoencoder for smoothing and representation learning. Statistics and Computing, 34(6):203, 2024.
Fabrice Rossi and Brieuc Conan-Guez. Functional multi-layer perceptron: a non-linear tool for functional data analysis. Neural Networks, 18(1):45–60, 2005.
Aniruddha Rajendra Rao and Matthew Reimherr. Nonlinear functional modeling using neural networks. Journal of Computational and Graphical Statistics, 32(4):1248–1257, 2023.
Junwen Yao, Jonas Mueller, and Jane-Ling Wang. Deep learning for functional data analysis with adaptive basis layers. In International Conference on Machine Learning, pages 11898–11908. PMLR, 2021.
Diederik P. Kingma and Jimmy Ba. Adam: A method for stochastic optimization. In International Conference on Learning Representations (ICLR), 2015.
Lutz Prechelt. Early stopping - but when? In Neural Networks: Tricks of the Trade, pages 55–69. Springer, 2002.
Maziar Moradi Fard, Thibaut Thonet, and Eric Gaussier. Deep k-means: Jointly clustering with k-means and learning representations. Pattern Recognition Letters, 138:185–192, 2020.
Pasi Fränti and Sami Sieranoja. Clustering accuracy. Applied Computing and Intelligence, 2024.
Lawrence Hubert and Phipps Arabie. Comparing partitions. Journal of Classification, 2:193–218, 1985.
Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning. Springer, 2nd edition, 2009.
Bob Kemp, Aeilko H. Zwinderman, Bert Tuk, Hilbert A. C. Kamphuisen, and Josefien J. L. Oberye. Analysis of a sleep-dependent neuronal feedback loop: the slow-wave microcontinuity of the EEG. IEEE Transactions on Biomedical Engineering, 47(9):1185–1194, 2000.
Xiang Zhang, Ziyuan Zhao, Theodoros Tsiligkaridis, and Marinka Zitnik. Self-supervised contrastive pre-training for time series via time-frequency consistency. Advances in Neural Information Processing Systems, 35:3988–4003, 2022.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99463
dc.description.abstract (zh_TW): 本研究提出傳統 K-中心函數分群法(KCFC)的非線性延伸,建立一種嶄新的函數型資料分群方法:K-中心函數型自編碼器分群法(KCFAEC)。本方法將 KCFC 中的函數主成分分析(FPCA)替換為非線性的函數型自編碼器(FAE),後者將傳統的自編碼器架構延伸至函數型資料上。與 KCFC 相同,本方法以重建誤差作為重新指派群集的準則,透過迭代更新各群集的模型與樣本的群集標籤,達到分群目的。在此架構下,每個群集可學習自身的非線性表徵空間,藉以捕捉群集間潛在的結構差異。模擬研究與真實資料實驗(包含音素及腦電波資料)顯示,本方法的分群表現優於或相當於現有傳統方法,尤其在群集間具有明顯非線性結構差異的情境中,更能凸顯 KCFAEC 的效能,證實了本方法在實務上的價值與應用潛力。
dc.description.abstract (en): This study proposes a nonlinear extension of the traditional K-centres functional clustering (KCFC) method, introducing a novel clustering approach for functional data: K-Centres Functional Autoencoder Clustering (KCFAEC). In the proposed method, the functional principal component analysis (FPCA) used in KCFC is replaced with a nonlinear functional autoencoder (FAE), which extends the autoencoder network to functional data. As in KCFC, the reconstruction loss is used for cluster reassignment, and the clustering process alternates between updating cluster-specific models and cluster labels. This allows each cluster to learn its own nonlinear representation space, capturing latent structural differences across clusters. Simulation studies and real-data examples, including the Phoneme and EEG datasets, demonstrate that KCFAEC consistently outperforms or matches traditional methods in clustering performance, especially when clusters exhibit distinct nonlinear substructures, confirming the effectiveness and practical value of the proposed method.
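The alternating scheme described in the abstract (fit a model per cluster, then reassign each curve to the cluster whose model reconstructs it with the smallest error) can be sketched as follows. This is a minimal illustration only, not the thesis's implementation: per-cluster PCA stands in for the cluster-specific functional autoencoder, and all names (`k_centres_cluster`, `fit_pca`, `recon_error`) are hypothetical.

```python
import numpy as np

def fit_pca(X, k):
    """Fit a k-component PCA model for one cluster.

    Stand-in for the cluster-specific functional autoencoder (FAE):
    encode = project onto k principal directions, decode = map back.
    """
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def recon_error(X, model):
    """Squared reconstruction error of each curve under one cluster model."""
    mu, V = model
    scores = (X - mu) @ V.T        # encode: n x k score matrix
    X_hat = mu + scores @ V        # decode: back to n x p curves
    return ((X - X_hat) ** 2).sum(axis=1)

def k_centres_cluster(X, n_clusters=2, n_comp=2, n_iter=50, seed=0):
    """Alternate between fitting per-cluster models and reassigning each
    curve to the cluster whose model reconstructs it best."""
    rng = np.random.default_rng(seed)
    # Initial split: bin the first global PC score into quantile groups.
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    scores = (X - mu) @ Vt[0]
    edges = np.quantile(scores, np.linspace(0, 1, n_clusters + 1)[1:-1])
    labels = np.searchsorted(edges, scores)
    for _ in range(n_iter):
        models = []
        for c in range(n_clusters):
            Xc = X[labels == c]
            if len(Xc) <= n_comp:  # guard: refill a near-empty cluster
                Xc = X[rng.choice(len(X), size=n_comp + 1, replace=False)]
            models.append(fit_pca(Xc, n_comp))
        errors = np.column_stack([recon_error(X, m) for m in models])
        new_labels = errors.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                  # converged: no label changed
        labels = new_labels
    return labels
```

Swapping `fit_pca`/`recon_error` for an autoencoder's encoder-decoder pair and its reconstruction loss yields the nonlinear variant the abstract describes, since the reassignment step only needs per-curve reconstruction errors from each cluster's model.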
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-09-10T16:21:59Z. No. of bitstreams: 0 (en)
dc.description.provenance: Made available in DSpace on 2025-09-10T16:21:59Z (GMT). No. of bitstreams: 0 (en)
dc.description.tableofcontents:
致謝 (Acknowledgements) i
摘要 (Chinese Abstract) ii
Abstract iii
Contents v
List of Figures vii
List of Tables x
Chapter 1 Introduction 1
Chapter 2 Preliminaries 5
2.1 Overview of K-Centres Functional Clustering 5
2.1.1 Basic Concepts 5
2.1.2 The Clustering Procedure 7
2.2 Overview of Functional Autoencoder 8
2.2.1 Basic Concepts 8
2.2.2 Mathematical Formulation 9
Chapter 3 Methodology 12
3.1 K-Centres Functional Autoencoder Clustering 12
3.2 Functional Autoencoder Training Procedure 16
Chapter 4 Simulations 21
4.1 Experiment Settings 21
4.1.1 Data Generation 21
4.1.2 Comparison Methods and Hyperparameters 26
4.1.3 Performance Metrics 28
4.2 Experiment Results 29
4.2.1 Case 1-1 Results 30
4.2.2 Case 1-2 Results 31
4.2.3 Case 2-1 Results 32
4.2.4 Case 2-2 Results 33
4.2.5 Case 3 Results 34
4.2.6 Reconstruction Results 35
4.2.7 Effects of Network Architecture 36
4.2.8 Computation Time Comparison 37
Chapter 5 Real Data Examples 38
5.1 Phoneme Dataset 38
5.2 Sleeping EEG Dataset 40
Chapter 6 Conclusion 43
References 45
Appendix A — Supplementary Results 48
dc.language.iso: en
dc.subject: 函數型資料 (zh_TW)
dc.subject: K-中心分群法 (zh_TW)
dc.subject: 自編碼器 (zh_TW)
dc.subject: 非線性表徵 (zh_TW)
dc.subject: K-centres clustering (en)
dc.subject: Autoencoder (en)
dc.subject: Functional data (en)
dc.subject: Nonlinear representation (en)
dc.title: 透過函數型自編碼器實現K-中心函數分群法的非線性擴展 (zh_TW)
dc.title: A Nonlinear Extension of K-Centres Functional Clustering via Functional Autoencoder (en)
dc.type: Thesis
dc.date.schoolyear: 113-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 李百靈; 林書勤 (zh_TW)
dc.contributor.oralexamcommittee: Pai-Ling Li; Shu-Chin Lin (en)
dc.subject.keyword: 函數型資料, K-中心分群法, 自編碼器, 非線性表徵 (zh_TW)
dc.subject.keyword: Functional data, K-centres clustering, Autoencoder, Nonlinear representation (en)
dc.relation.page: 49
dc.identifier.doi: 10.6342/NTU202502580
dc.rights.note: 未授權 (not authorized for public access)
dc.date.accepted: 2025-07-30
dc.contributor.author-college: 理學院 (College of Science)
dc.contributor.author-dept: 統計與數據科學研究所 (Institute of Statistics and Data Science)
dc.date.embargo-lift: N/A
Appears in collections: 統計與數據科學研究所 (Institute of Statistics and Data Science)

Files in this item:
File: ntu-113-2.pdf | Size: 6.52 MB | Format: Adobe PDF | Access: restricted (未授權公開取用)