NTU Theses and Dissertations Repository › College of Electrical Engineering and Computer Science (電機資訊學院) › Department of Electrical Engineering (電機工程學系)
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101545
Title: 基於多方安全計算實現長序列 Transformer 解碼器模型之高效率安全推論
SecLoT: Towards Efficient Secure Inference of Long-sequence Transformer Decoder Models with MPC
Authors: 林奕安
Yi-An Lin
Advisor: 吳沛遠
Pei-Yuan Wu
Keyword: 隱私維護機器學習, 多方安全計算, 安全兩方計算, 隱私維護 Transformer, 安全 Transformer 推論
Privacy Preserving Machine Learning, Secure Multiparty Computation, Two-Party Computation, Privacy Preserving Transformer, Secure Transformer Inference
Publication Year: 2026
Degree: Master's (碩士)
Abstract: 多方安全計算(Secure Multi-Party Computation, SMPC)可在不洩漏資料內容的情況下進行隱私保護推論,然而在 SMPC 設定下,針對 Transformer decoder 模型進行長序列生成的研究仍相當有限且缺乏系統性探討。本文提出 SecLoT,一個旨在提升安全 Transformer decoder 推論效率的框架,透過同時解決演算法層面與系統層面的限制,推進長序列生成在 SMPC 環境中的可行性。我們將固定矩陣大小的 causal mask self-attention 機制引入 SMPC 設定中,使自回歸解碼過程得以在不依賴安全 softmax 的情況下進行,並達成每個生成 token 幾乎固定的計算時間與通訊成本。此外,我們擴充 CrypTen 編譯器以支援更複雜的模型架構,提出無輸入範圍限制且高效率的除法與倒數平方根演算法,並設計一個具正確性保證的截斷協定,以消除既有實作中可能發生的隨機錯誤。在基於 decoder-only Transformer 的 MNIST 圖像生成實驗中,結果顯示 SecLoT 相較於以 softmax 為基礎的方法,能有效降低計算時間與通訊開銷,且此優勢隨著生成序列長度的增加而更加明顯。
Secure Multi-Party Computation (SMPC) enables privacy-preserving inference, yet long-sequence generation with Transformer decoder models has not been systematically explored under SMPC. In this work, we present SecLoT, a framework that advances secure Transformer decoder inference by addressing both algorithmic inefficiencies and system-level limitations. We adapt a fixed-size causal mask self-attention formulation to the SMPC setting, enabling autoregressive decoding without secure softmax and achieving nearly constant computation time and communication cost per generated token. In addition, we extend the CrypTen compiler to support more complex model architectures, propose domain-unrestricted and efficient algorithms for division and inverse square root, and introduce a correct truncation protocol that eliminates probabilistic errors in existing implementations. Experiments on decoder-only Transformer-based MNIST image generation demonstrate that SecLoT significantly reduces computation time and communication overhead compared to softmax-based baselines, particularly as sequence length increases.
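The abstract notes that SecLoT proposes efficient, domain-unrestricted algorithms for division and inverse square root. While the thesis's actual protocols are not reproduced here, MPC frameworks in the CrypTen family typically approximate such functions with Newton-style iterations built only from additions and multiplications, since those are the cheap operations on secret-shared values. The plaintext sketch below illustrates that general idea; the initial guesses and iteration counts are hypothetical choices for a moderate positive input range, not SecLoT's method.

```python
import numpy as np

def reciprocal(x, iters=10):
    """Approximate 1/x via Newton-Raphson: y <- y * (2 - x * y).

    Each step uses only multiplication and addition, which is why
    this style of iteration is MPC-friendly. The initial guess below
    is a rough heuristic for positive x of moderate magnitude.
    """
    y = np.exp(-x) + 0.003  # hypothetical initial guess
    for _ in range(iters):
        y = y * (2 - x * y)  # quadratic convergence toward 1/x
    return y

def inv_sqrt(x, iters=10):
    """Approximate 1/sqrt(x) via Newton-Raphson: y <- y * (3 - x * y^2) / 2.

    Again multiplication-only updates; the guess keeps y inside the
    convergence region sqrt(3/x) for moderate positive inputs.
    """
    y = np.exp(-(x / 2 + 0.2)) * 2.2 + 0.2  # hypothetical initial guess
    for _ in range(iters):
        y = y * (3 - x * y * y) / 2
    return y
```

The practical difficulty such schemes face, and which the abstract says SecLoT addresses, is that the quality of the initial guess restricts the input domain; a poor guess outside the convergence region makes the iteration diverge silently on secret-shared data.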
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/101545
DOI: 10.6342/NTU202600086
Fulltext Rights: Not authorized (未授權)
Embargo Lift Date: N/A
Appears in Collections: Department of Electrical Engineering (電機工程學系)

Files in This Item:
ntu-114-1.pdf (Restricted Access), 4.21 MB, Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
