Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/97019
Title: Enhancing Hallucination Detection in Large Language Models through Question Paraphrasing Strategies Inspired by Cognitive Linguistics (運用認知語言學增強大型語言模型之幻覺偵測)
Authors: Pin-Ping Ciou (邱品萍)
Advisor: Hung-Yun Hsieh (謝宏昀)
Keyword: Large Language Models (LLMs), Hallucination Detection, Cognitive Linguistics, Question Paraphrasing, Cognitive Challenges
Publication Year: 2025
Degree: Master's
Abstract:
Large Language Models (LLMs) have demonstrated remarkable capabilities in natural language processing tasks, but they frequently generate incorrect information, a phenomenon known as hallucination. Current approaches to hallucination detection either rely heavily on external knowledge bases or require access to model internals, which is impractical for API-based models. While recent consistency-based methods offer a promising direction through question paraphrasing, they may introduce additional instability and potential hallucinations due to their dependence on language models. To address these problems, we propose a novel cognitive-linguistics-based approach for paraphrasing questions that eliminates dependence on language models.
Inspired by cognitive linguistics research showing that humans maintain semantic understanding despite cognitive challenges, we hypothesize that robust language models should similarly preserve meaning when facing these challenges, where inconsistencies in responses would indicate potential hallucinations. Our approach introduces four transformation methods: Grammatical Errors Variation, Sentence Structure Conversion, Punctuation Modification, and Lexical Substitution with Rare Words, each targeting distinct cognitive challenges while preserving meaning. We evaluate our framework on datasets covering both factoid question answering and logical reasoning tasks. Our experimental results demonstrate significant improvements: in factoid question answering, our optimal combination of three methods achieves F1 scores of 0.755 and 0.720, surpassing the baseline's scores of 0.600 and 0.603. For logical reasoning tasks, our Punctuation Modification method alone achieves F1 scores of 0.850 and 0.955, significantly outperforming the baseline's scores of 0.649 and 0.762. These results demonstrate that our framework provides more reliable and efficient hallucination detection by eliminating multiple language model calls.
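The core idea of the consistency-based framework described above can be sketched in a few lines of Python. This is an illustrative stand-in, not the thesis's actual implementation: the exact transformation rules, the `query_llm` interface, and the agreement threshold are all assumptions. The sketch shows one plausible version of the Punctuation Modification transform (inserting superfluous punctuation while keeping every original word, and hence the meaning, intact) and a simple majority-agreement check that flags a potential hallucination when answers to the original and paraphrased questions diverge.

```python
import random


def punctuation_modification(question: str, seed: int = 0) -> str:
    """Insert superfluous punctuation between words.

    Illustrative sketch only: the thesis does not publish its exact
    transformation rules, so this is a plausible stand-in. Every
    original token is preserved, so the semantics are unchanged.
    """
    rng = random.Random(seed)
    words = question.rstrip("?.!").split()
    out = []
    for i, word in enumerate(words):
        out.append(word)
        # Randomly inject a comma or semicolon between words.
        if i < len(words) - 1 and rng.random() < 0.5:
            out.append(rng.choice([",", ";"]))
    return " ".join(out) + "?"


def flag_hallucination(answers: list[str], threshold: float = 0.5) -> bool:
    """Flag a potential hallucination when too few of the answers to
    the paraphrased questions agree with the answer to the original.

    `answers[0]` is the response to the original question; the rest
    are responses to the transformed variants.
    """
    reference = answers[0].strip().lower()
    agree = sum(a.strip().lower() == reference for a in answers)
    return agree / len(answers) < threshold
```

In use, one would send the original question plus several transformed variants to the model (via some hypothetical `query_llm` call) and pass the collected answers to `flag_hallucination`; exact string matching is a deliberate simplification, and a real system would use a softer semantic-equivalence check between answers.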
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/97019
DOI: 10.6342/NTU202500638
Fulltext Rights: Not authorized
Embargo lift date: N/A
Appears in Collections: Department of Electrical Engineering (電機工程學系)

Files in This Item:
ntu-113-1.pdf (Restricted Access), 3.23 MB, Adobe PDF