  1. NTU Theses and Dissertations Repository
  2. College of Science
  3. Department of Psychology
Title: Idiographic Psychology without Individual Modeling by Dynamic Personalization in Deep Learning
(以動態個人化的深度學習替代個體建模的個殊心理學方法)
Author: Yen-Ting Li (李彥廷)
Advisor: Tsung-Ren Huang (黃從仁)
Keywords: Emotion Dynamics, Experience Sampling Method, Intensive Longitudinal Data, Idiographic Psychology, Deep Learning
Publication Year: 2025
Degree: Master
Abstract:
Research on emotion dynamics has increasingly shifted from static approaches to a focus on temporal patterns of emotional change and the importance of individual differences. However, the idiographic models commonly used in this field, while capable of capturing within-person dynamics, are difficult to generalize to new individuals; nomothetic models, although able to predict for new individuals, often ignore individual differences and thereby reduce predictive accuracy. In addition, traditional linear models are insufficient to fully describe the nonlinearity and regime-switching properties that characterize emotional processes. This study establishes a two-dimensional framework for model comparison to evaluate the performance of different modeling strategies in predicting emotion dynamics. The first dimension compares linear models (VAR family) and nonlinear deep learning models (LSTM family) in terms of predictive ability; the second dimension compares strategies for integrating information across individuals, including fully idiographic, pooled, and hierarchical/representation-based architectures. We use Markov Switching VAR models to generate simulated data with regime-switching properties, manipulate sample heterogeneity and sequence length, and evaluate the predictive and generalization performance of eight models. The results show that, in the prediction experiment, representation-based LSTM variants outperform the other models overall. Nonlinear models benefit more from longer sequences and achieve higher predictive accuracy. In the generalization experiment, LSTMs that directly learn participant representations from the data have greater advantages when predicting new individuals. Sample heterogeneity has a substantial impact on the generalization performance of all models, but providing accurate participant representations helps maintain better performance. 
Overall, the findings demonstrate that representation-based deep learning architectures can strike a balance between capturing individual differences and achieving cross-person generalization, offering a practically valuable compromise for modeling emotion dynamics. These results have important implications for the development of personalized emotion prediction systems, early warning indicators, and clinical intervention strategies, and lay the groundwork for future work that integrates real-world emotion-tracking data with representation learning methods.
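The data-generating process described in the abstract (a Markov Switching VAR producing regime-switching time series) can be sketched as follows. This is a minimal illustrative sketch only: the number of variables, the two coefficient matrices, the transition probabilities, and the noise level are all assumed values chosen for demonstration, not the settings actually used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed settings: 2 emotion variables, 2 regimes, a VAR(1) within each regime.
n_vars, n_regimes, T = 2, 2, 200

# Regime-specific VAR(1) coefficient matrices (chosen to be stable).
A = np.array([
    [[0.5, 0.1],
     [0.0, 0.4]],   # regime 0: weak emotional carry-over
    [[0.8, -0.2],
     [0.3, 0.7]],   # regime 1: stronger, cross-coupled dynamics
])

# Markov transition matrix between regimes (each row sums to 1).
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])

def simulate_msvar(T, A, P, noise_sd=0.5, rng=rng):
    """Simulate one person's series from a Markov Switching VAR(1)."""
    n_regimes, n_vars, _ = A.shape
    y = np.zeros((T, n_vars))          # observed emotion series
    s = np.zeros(T, dtype=int)         # latent regime sequence
    s[0] = rng.integers(n_regimes)
    for t in range(1, T):
        s[t] = rng.choice(n_regimes, p=P[s[t - 1]])   # Markov regime switch
        y[t] = A[s[t]] @ y[t - 1] + rng.normal(0, noise_sd, n_vars)
    return y, s

y, s = simulate_msvar(T, A, P)
```

Varying `A` and `P` across simulated participants would correspond to the sample-heterogeneity manipulation the abstract mentions, and `T` to the sequence-length manipulation.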
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101255
DOI: 10.6342/NTU202504865
Full-Text Permission: Authorized (campus-only access)
Electronic Full-Text Release Date: 2026-01-14
Appears in Collections: Department of Psychology

Files in This Item:
File: ntu-114-1.pdf
Size/Format: 1.59 MB, Adobe PDF
Access: restricted to NTU campus IP addresses (use the VPN service from off campus)


All items in the system are protected by copyright, with all rights reserved, unless otherwise indicated.

Contact Information
No.1 Sec.4, Roosevelt Rd., Taipei, Taiwan, R.O.C. 106
Tel: (02)33662353
Email: ntuetds@ntu.edu.tw
© NTU Library All Rights Reserved