Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98746
Title: AMTF-MLP: Adaptive Multi-Scale Time-Frequency MLP for Time Series Forecasting
Authors: Ting-Ching Tai (戴廷磬)
Advisor: Sheng-De Wang (王勝德)
Keywords: Multivariate Time Series Forecasting, MLP-based Models, Frequency-Domain Analysis, Adaptive Feature Fusion, Multi-Scale Temporal Modeling, Long-Term Forecasting, Short-Term Forecasting
Publication Year: 2025
Degree: Master's
Abstract: Accurately predicting multivariate time series is essential for industrial applications, yet it poses significant challenges due to short-term transients, long-term dependencies, and hidden periodicities, alongside stringent computational-efficiency requirements. To address these issues, we introduce AMTF-MLP, a pure Multi-Layer Perceptron (MLP) architecture that integrates multi-scale time-domain mixers with frequency-domain spectral learning, unified through an adaptive fusion mechanism. AMTF-MLP processes the signal in parallel branches: a time-domain branch with hierarchical patch mixers to capture local and global temporal patterns, and a frequency-domain branch with a spectral MLP to model periodicities.
Extensive experiments on diverse public benchmarks validate the model's effectiveness and versatility. In long-term forecasting, it delivers highly competitive performance, reducing Mean Squared Error (MSE) by 9.3% to 20.4% compared with prominent Transformer models such as iTransformer and PatchTST. In high-frequency short-term scenarios, it achieves leading results on the PEMS datasets, reducing MSE by 24.3% and 40.4% against strong competitors AMD and iTransformer, respectively. Its cross-domain performance is matched by its computational efficiency: among high-performance MLP peers, it consumes 1.8× less memory than AMD while training 1.5× faster. Ablation studies confirm that each component of the design is critical to the model's efficacy. With its linear complexity and strong empirical results, AMTF-MLP offers a powerful and practical solution for real-world forecasting systems.
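The parallel-branch design described in the abstract can be illustrated with a minimal NumPy sketch. This is a hypothetical forward pass under assumed shapes and randomly initialized weights, not the authors' implementation: the patch length, hidden width, gating scheme, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # Two-layer perceptron with ReLU, applied along the last axis.
    h = np.maximum(x @ w1 + b1, 0.0)
    return h @ w2 + b2

L, C, H = 96, 7, 64      # lookback length, number of variables, hidden width
horizon = 24             # forecast length (all values are illustrative)

x = rng.standard_normal((C, L))   # one multivariate input window

# Time-domain branch: split the series into patches and mix within each patch.
P = 16                                          # assumed patch length (divides L)
patches = x.reshape(C, L // P, P)               # (C, num_patches, P)
w1, b1 = rng.standard_normal((P, H)) * 0.1, np.zeros(H)
w2, b2 = rng.standard_normal((H, P)) * 0.1, np.zeros(P)
t_feat = mlp(patches, w1, b1, w2, b2).reshape(C, L)   # back to (C, L)

# Frequency-domain branch: a spectral MLP on the real-FFT coefficients.
spec = np.fft.rfft(x, axis=-1)                             # (C, L//2 + 1), complex
spec_ri = np.concatenate([spec.real, spec.imag], axis=-1)  # real-valued view
F = spec_ri.shape[-1]
w3, b3 = rng.standard_normal((F, H)) * 0.1, np.zeros(H)
w4, b4 = rng.standard_normal((H, L)) * 0.1, np.zeros(L)
f_feat = mlp(spec_ri, w3, b3, w4, b4)                      # (C, L)

# Adaptive fusion: a learned per-channel softmax gate weighs the two branches.
gate_w = rng.standard_normal((2 * L, 2)) * 0.1
logits = np.concatenate([t_feat, f_feat], axis=-1) @ gate_w     # (C, 2)
alpha = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)  # gate weights
fused = alpha[:, :1] * t_feat + alpha[:, 1:] * f_feat           # (C, L)

# A linear head maps the fused features to the forecast horizon.
head = rng.standard_normal((L, horizon)) * 0.1
forecast = fused @ head
print(forecast.shape)   # (7, 24)
```

Every layer here is a plain matrix multiply, which is consistent with the linear complexity in sequence length that the abstract claims for MLP-based forecasters.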
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98746
DOI: 10.6342/NTU202503526
Fulltext Rights: Authorized (campus access only)
Embargo lift date: 2025-08-19
Appears in Collections: Department of Electrical Engineering

Files in This Item:
ntu-113-2.pdf, 4.92 MB, Adobe PDF (access limited to NTU IP range)


