Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96077
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | 廖世偉 | zh_TW
dc.contributor.advisor | Shih-Wei Liao | en
dc.contributor.author | 王敬順 | zh_TW
dc.contributor.author | Ching-Shun Wang | en
dc.date.accessioned | 2024-10-14T16:04:57Z | -
dc.date.available | 2024-10-15 | -
dc.date.copyright | 2024-10-14 | -
dc.date.issued | 2024 | -
dc.date.submitted | 2024-09-30 | -
dc.identifier.citation | X. Bao, X. Jiang, Z. Wang, Y. Zhang, and G. Zhou. Opinion tree parsing for aspect-based sentiment analysis. In Findings of the Association for Computational Linguistics: ACL 2023. Association for Computational Linguistics, 2023.
X. Bao, Z. Wang, X. Jiang, R. Xiao, and S. Li. Aspect-based sentiment analysis with opinion tree generation. In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, IJCAI 2022, pages 4044–4050. ijcai.org, 2022.
H. Cai, R. Xia, and J. Yu. Aspect-category-opinion-sentiment quadruple extraction with implicit aspects and opinions. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 340–350. Association for Computational Linguistics, 2021.
C. Chen, Z. Teng, Z. Wang, and Y. Zhang. Discrete opinion tree induction for aspect-based sentiment analysis. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2051–2064. Association for Computational Linguistics, 2022.
L. Cui, S. Yang, and Y. Zhang. Investigating non-local features for neural constituency parsing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2065–2075. Association for Computational Linguistics, 2022.
J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171–4186. Association for Computational Linguistics, 2019.
N. Houlsby, A. Giurgiu, S. Jastrzebski, B. Morrone, Q. de Laroussilhe, A. Gesmundo, M. Attariyan, and S. Gelly. Parameter-efficient transfer learning for NLP. In Proceedings of the 36th International Conference on Machine Learning (ICML), volume 97 of Proceedings of Machine Learning Research, pages 2790–2799. PMLR, 2019.
D. P. Kingma and J. Ba. Adam: A method for stochastic optimization. In 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings, 2015.
N. Kitaev, S. Cao, and D. Klein. Multilingual constituency parsing with self-attention and pre-training. In Proceedings of the 57th Conference of the Association for Computational Linguistics, ACL 2019, Florence, Italy, July 28- August 2, 2019, Volume 1: Long Papers, pages 3499–3505. Association for Computational Linguistics, 2019.
N. Kitaev and D. Klein. Constituency parsing with a self-attentive encoder. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018, Melbourne, Australia, July 15-20, 2018, Volume 1: Long Papers, pages 2676–2686. Association for Computational Linguistics, 2018.
B. Lester, R. Al-Rfou, and N. Constant. The power of scale for parameter-efficient prompt tuning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3045–3059. Association for Computational Linguistics, 2021.
X. L. Li and P. Liang. Prefix-tuning: Optimizing continuous prompts for generation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 4582–4597. Association for Computational Linguistics, 2021.
H. Liu, D. Tam, M. Muqeeth, J. Mohta, T. Huang, M. Bansal, and C. Raffel. Few-shot parameter-efficient fine-tuning is better and cheaper than in-context learning. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
W. Liu, Z. Qiu, Y. Feng, Y. Xiu, Y. Xue, L. Yu, H. Feng, Z. Liu, J. Heo, S. Peng, Y. Wen, M. J. Black, A. Weller, and B. Schölkopf. Parameter-efficient orthogonal finetuning via butterfly factorization. In Proceedings of the 2024 International Conference on Learning Representations (ICLR), 2024.
A. Meena and T. Prabhakar. Sentence level sentiment analysis in the presence of conjuncts using linguistic analysis. In G. Amati, C. Carpineto, and G. Romano, editors, Advances in Information Retrieval, volume 4425 of Lecture Notes in Computer Science, pages 91–100, Berlin, Heidelberg, 2007. Springer.
R. Mukherjee, T. Nayak, Y. Butala, S. Bhattacharya, and P. Goyal. PASTE: A tagging-free decoding framework using pointer networks for aspect sentiment triplet extraction. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9279–9291. Association for Computational Linguistics, 2021.
M. Nikdan, S. Tabesh, E. Crncevic, and D. Alistarh. RoSA: Accurate parameter-efficient fine-tuning via robust adaptation. In Proceedings of the 41st International Conference on Machine Learning (ICML), 2024.
H. Peng, L. Xu, L. Bing, F. Huang, W. Lu, and L. Si. Knowing what, how and why: A near complete solution for aspect-based sentiment analysis. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 34, pages 8600–8607, 2020.
G. Qiu, B. Liu, J. Bu, and C. Chen. Opinion word expansion and target extraction through double propagation. Computational Linguistics, 37(1):9–27, 2011.
C. Raffel, N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li, and P. J. Liu. Exploring the limits of transfer learning with a unified text-to-text transformer. Journal of Machine Learning Research, 21(140):1–67, 2020.
D. Tang, B. Qin, X. Feng, and T. Liu. Effective LSTMs for target-dependent sentiment classification. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 3298–3307, 2016.
H. Wan, Y. Yang, J. Du, Y. Liu, K. Qi, and J. Z. Pan. Target-aspect-sentiment joint detection for aspect-based sentiment analysis. In AAAI 2020, pages 9122–9129, 2020.
Q. Wang, Z. Wen, Q. Zhao, M. Yang, and R. Xu. Progressive self-training with discriminator for aspect term extraction. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 257–268. Association for Computational Linguistics, 2021.
L. Xu, H. Li, W. Lu, and L. Bing. Position-aware tagging for aspect sentiment triplet extraction. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2339–2349. Association for Computational Linguistics, 2020.
H. Yan, J. Dai, T. Ji, X. Qiu, and Z. Zhang. A unified generative framework for aspect-based sentiment analysis. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 2416–2429. Association for Computational Linguistics, 2021.
Y. Yu, C.-H. H. Yang, J. Kolehmainen, P. G. Shivakumar, Y. Gu, S. Ryu, R. Ren, Q. Luo, A. Gourav, I.-F. Chen, Y.-C. Liu, T. Dinh, A. Gandhe, D. Filimonov, S. Ghosh, A. Stolcke, A. Rastrow, and I. Bulyko. Low-rank adaptation of large language model rescoring for parameter-efficient speech recognition. In 2023 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU), 2023.
M. Zhang and T. Qian. Convolution over hierarchical syntactic and lexical graphs for aspect level sentiment analysis. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3540–3549. Association for Computational Linguistics, 2020.
Q. Zhang, M. Chen, A. Bukharin, N. Karampatziakis, P. He, Y. Cheng, W. Chen, and T. Zhao. AdaLoRA: Adaptive budget allocation for parameter-efficient fine-tuning. In Proceedings of the 2023 International Conference on Learning Representations (ICLR), 2023.
W. Zhang, X. Li, Y. Deng, L. Bing, and W. Lam. Aspect sentiment quad prediction as paraphrase generation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9209–9219. Association for Computational Linguistics, 2021.
W. Zhang, X. Li, Y. Deng, L. Bing, and W. Lam. Towards generative aspect-based sentiment analysis. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 504–510. Association for Computational Linguistics, 2021.
-
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96077 | -
dc.description.abstract | Analyzing the comments of social media users presents a major challenge, because identifying the relationship between the subject of a comment and its sentiment is complex, especially when user comments vary greatly in length. This thesis introduces a novel opinion tree parsing model that handles the intricate interactions among the different aspects within a comment; conjunctions and semantic modifiers are incorporated during training to improve parsing accuracy. Because this makes the model more complex, and in order to improve training efficiency and manage computational demands, we implement parameter-efficient fine-tuning (PEFT) methods that reduce the number of trainable parameters while achieving comparable performance.
We evaluated the proposed model on the ACOS datasets. Given the limited availability of datasets that describe user sentiment toward specific aspects, and the difficulty of adapting large pre-trained language models (LLMs) to downstream tasks due to their resource intensity, our approach modifies how the OTP model is computed. It changes the model's loss function to focus training on strategically placed modules; with adapters added, this significantly reduces GPU memory usage and mitigates out-of-memory (OOM) problems without compromising the overall integrity of the pre-trained model. The approach not only improves training efficiency but also maintains performance close to that of the original LLM configuration.
zh_TW
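To make the opinion-tree structure described in the abstract concrete, the following toy sketch (in Python) shows how a review joined by a conjunction can be organized as a tree whose leaves yield ACOS-style (aspect, category, opinion, sentiment) quadruples. The node labels, example sentence, and category strings are illustrative assumptions, not the grammar actually defined in the thesis.

```python
# Toy illustration (not the thesis's actual grammar) of an opinion tree for
# "The pasta was great but the service was slow": a conjunction node joins
# two opinion subtrees, the kind of non-local structure the parser must handle.
from dataclasses import dataclass

@dataclass
class Quad:
    aspect: str      # the opinion target, e.g. "pasta"
    category: str    # hypothetical ACOS-style category label
    opinion: str     # the opinion term, e.g. "great"
    sentiment: str   # polarity: positive / negative / neutral

tree = ("S",
        ("QUAD", Quad("pasta", "food#quality", "great", "positive")),
        ("CONJ", "but"),
        ("QUAD", Quad("service", "service#general", "slow", "negative")))

def extract_quads(node):
    """Walk the tree and collect the quadruples stored at QUAD nodes."""
    if isinstance(node, tuple):
        if node[0] == "QUAD":
            yield node[1]
        else:
            for child in node[1:]:
                yield from extract_quads(child)

for quad in extract_quads(tree):
    print(quad)
```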
dc.description.abstract | Analyzing social media user comments presents significant challenges due to the complexity of discerning relationships between opinions and aspects, particularly when comments vary greatly in length. This paper introduces a novel Opinion Tree Parser Model that navigates the intricate interplay between different aspects within comments, using conjunctions and semantic modifiers to enhance parsing accuracy. To improve the efficiency of the training process and manage the computational demands, we have implemented Parameter-Efficient Fine-Tuning (PEFT) methods on the decoder side.
We evaluated our proposed model on the ACOS datasets. Given the limited availability of datasets that describe user sentiment toward specific aspects, and the challenge of fine-tuning large pre-trained language models (LLMs) due to their resource intensity, our approach proposes an advanced context-free opinion grammar. This method integrates an adapter to focus training on strategically placed modules, significantly reducing the GPU memory footprint and mitigating out-of-memory (OOM) issues without compromising the overall integrity of the pre-trained model. The approach not only enhances training efficiency but also maintains performance close to that of the original LLM configuration.
en
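As a rough illustration of the adapter-based PEFT strategy described in the abstract, the sketch below freezes a pre-trained sub-layer and trains only a small bottleneck adapter, in the spirit of the Houlsby et al. (2019) reference above. The class names, bottleneck width, and the plain `nn.Linear` standing in for a real transformer sub-layer are assumptions for illustration, not the thesis's implementation.

```python
# Minimal PyTorch sketch of bottleneck-adapter fine-tuning (hypothetical code,
# not the thesis's implementation): only the adapter receives gradients, which
# shrinks optimizer state and helps avoid the OOM issues mentioned above.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Down-project, nonlinearity, up-project, then a residual connection."""
    def __init__(self, d_model: int, d_bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, d_bottleneck)
        self.act = nn.ReLU()
        self.up = nn.Linear(d_bottleneck, d_model)
        # Zero-init the up-projection so the adapter starts as an identity
        # mapping and does not disturb the pre-trained model at step 0.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

class AdaptedLayer(nn.Module):
    """Wraps a frozen pre-trained sub-layer with a trainable adapter."""
    def __init__(self, pretrained_layer: nn.Module, d_model: int):
        super().__init__()
        self.layer = pretrained_layer
        for p in self.layer.parameters():
            p.requires_grad = False  # freeze the pre-trained weights
        self.adapter = Adapter(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.layer(x))

# Usage: the optimizer only sees the adapter's (few) trainable parameters.
layer = AdaptedLayer(nn.Linear(768, 768), d_model=768)
trainable = [p for p in layer.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-4)
```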
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-10-14T16:04:57Z. No. of bitstreams: 0 | en
dc.description.provenance | Made available in DSpace on 2024-10-14T16:04:57Z (GMT). No. of bitstreams: 0 | en
dc.description.tableofcontents | Acknowledgements i
Abstract (in Chinese) ii
Abstract iii
Contents v
List of Figures vii
List of Tables viii
Chapter 1 Introduction 1
Chapter 2 Related Work 4
Chapter 3 Methodology 6
3.1 A Grammar Framework for Opinions Based on Context-Free Rules 6
3.1.1 Revised Fundamental Definitions 6
3.1.2 Additional Rules 8
3.2 Opinion Tree Parser 10
3.2.1 Span Scores and Context-Aware Encoding 10
3.2.2 Tree Scores and Chart-based Decoding 11
3.2.3 Objective Functions and Training 12
3.3 Efficient Tuning Approach 13
3.4 A Comprehensive Examination of Our Proposed Model 14
Chapter 4 Evaluation 17
4.1 Dataset and Settings 17
4.1.1 Parameter Tuning and Model Configuration 18
4.1.2 Evaluation Metrics 18
4.2 Main Result 18
4.2.1 Enhanced Context-Free Opinion Tree Model 19
4.2.2 Enhanced Context-Free Opinion Tree Model with Adapters 19
4.3 Analysis and Discussion 20
4.3.1 Effect of Conjunctions and Semantic Modifiers on Opinion Grammar 21
4.3.2 Results of Different Adapters on Opinion Grammar 22
Chapter 5 Conclusion 24
References 25
Appendix A — General LLM performance on Sentiment Analysis 31
A.1 Introduction 31
A.2 GPT on Aspect-based Sentiment Analysis task 32
A.3 Analysis of GPT-4's Performance on ACOS Tasks 33
-
dc.language.iso | en | -
dc.title | Aspect-based Sentiment Analysis of User Comments Using PEFT while Maintaining Highly Efficient Training | zh_TW
dc.title | ABSA: Opinion Tree Parsing with PEFT for Aspect-based Sentiment Analysis | en
dc.type | Thesis | -
dc.date.schoolyear | 113-1 | -
dc.description.degree | Master | -
dc.contributor.oralexamcommittee | 傅楸善;李逸元;葉春超;盧瑞山 | zh_TW
dc.contributor.oralexamcommittee | Chiou-Shann Fuh;Yi-Yuan Lee;Chun-Chao Ye;Rui-Shan Lu | en
dc.subject.keyword | Aspect Extraction, Opinion Extraction, Sentiment Analysis, Natural Language Processing, Social Media User Comments | zh_TW
dc.subject.keyword | Aspect Extraction, Opinion Extraction, Aspect-based Sentiment Analysis, Natural Language Processing, Comments from Social Media Users | en
dc.relation.page | 33 | -
dc.identifier.doi | 10.6342/NTU202404405 | -
dc.rights.note | Not authorized | -
dc.date.accepted | 2024-10-01 | -
dc.contributor.author-college | College of Electrical Engineering and Computer Science | -
dc.contributor.author-dept | Department of Computer Science and Information Engineering | -
Appears in Collections: Department of Computer Science and Information Engineering

Files in This Item:
File | Size | Format
ntu-113-1.pdf (currently not authorized for public access) | 712.84 kB | Adobe PDF


Except where copyright terms are otherwise specified, all items in this system are protected by copyright, with all rights reserved.
