Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88548
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | 李瑞庭 | zh_TW
dc.contributor.advisor | Anthony J. T. Lee | en
dc.contributor.author | 潘躍升 | zh_TW
dc.contributor.author | Yueh-Sheng Pan | en
dc.date.accessioned | 2023-08-15T16:47:20Z | -
dc.date.available | 2023-11-09 | -
dc.date.copyright | 2023-08-15 | -
dc.date.issued | 2023 | -
dc.date.submitted | 2023-07-26 | -
dc.identifier.citation | Blei DM, Lafferty JD (2006) Dynamic topic models. Proceedings of the 23rd International Conference on Machine Learning. 113–120.
Blei DM, Ng AY, Jordan MI (2003) Latent Dirichlet allocation. The Journal of Machine Learning Research 3:993–1022.
Brody S, Alon U, Yahav E (2022) How attentive are graph attention networks? Proceedings of the 10th International Conference on Learning Representations. 149.
Chen X, Xu H, Zhang Y, Tang J, Cao Y, Qin Z, Zha H (2018) Sequential recommendation with user memory networks. Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining. 108–116.
Churchill R, Singh L (2022) Dynamic topic-noise models for social media. Proceedings of the 26th Pacific-Asia Conference on Knowledge Discovery and Data Mining. 429–443.
Churchill R, Singh L, Kirov C (2018) A temporal topic model for noisy mediums. Proceedings of the 22nd Pacific-Asia Conference on Knowledge Discovery and Data Mining. 42–53.
Dieng AB, Ruiz FJR, Blei DM (2019a) The dynamic embedded topic model. arXiv preprint arXiv:1907.05545.
Dieng AB, Ruiz FJR, Blei DM (2019b) Topic modeling in embedding spaces. arXiv preprint arXiv:1907.04907.
Gao S, Chen X, Ren Z, Zhao D, Yan R (2021) Meaningful answer generation of e-commerce question-answering. ACM Transactions on Information Systems 39(2):18:1-18:26.
Graves A, Wayne G, Danihelka I (2014) Neural Turing machines. arXiv preprint arXiv:1410.5401.
Graves A, Wayne G, Reynolds M, Harley T, Danihelka I, Grabska-Barwińska A, Colmenarejo SG, et al. (2016) Hybrid computing using a neural network with dynamic external memory. Nature 538(7626):471–476.
Grootendorst M (2022) BERTopic: Neural topic modeling with a class-based TF-IDF procedure. arXiv preprint arXiv:2203.05794.
Gu H, Dong X, Zhou D (2022) Dynamic key-value memory networks based on concept structure for knowledge tracing. Proceedings of the 4th International Conference on Computer Science and Technologies in Education. 290–294.
Hansen S, Pritzel A, Sprechmann P, Barreto A, Blundell C (2018) Fast deep reinforcement learning using online adjustments from the past. Advances in Neural Information Processing Systems. 10590–10600.
Hofmann T (2001) Unsupervised learning by probabilistic latent semantic analysis. Machine Learning 42(1):177–196.
Iwata T, Yamada T, Sakurai Y, Ueda N (2010) Online multiscale dynamic topic models. Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 663–672.
Kang D, Lee M (2019) Seq-DNC-seq: Context aware dialog generation system through external memory. Proceedings of the International Joint Conference on Neural Networks. 1–8.
Kim B, Kim H, Kim G (2019) Abstractive summarization of reddit posts with multi-level memory networks. Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics. 2519–2531.
Le H, Tran T, Nguyen T, Venkatesh S (2018) Variational memory encoder-decoder. Advances in Neural Information Processing Systems. 1508–1518.
Liu Q, Zhang H, Zeng Y, Huang Z, Wu Z (2018) Content attention model for aspect based sentiment analysis. Proceedings of the World Wide Web Conference. 1023–1032.
Manning C, Surdeanu M, Bauer J, Finkel J, Bethard S, McClosky D (2014) The Stanford CoreNLP natural language processing toolkit. Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics: System Demonstrations. 55–60.
Miller A, Fisch A, Dodge J, Karimi AH, Bordes A, Weston J (2016) Key-value memory networks for directly reading documents. Proceedings of the Conference on Empirical Methods in Natural Language Processing. 1400–1409.
Mimno D, Wallach HM, Talley E, Leenders M, McCallum A (2011) Optimizing semantic coherence in topic models. Proceedings of the Conference on Empirical Methods in Natural Language Processing. 262–272.
Nallapati RM, Ditmore S, Lafferty JD, Ung K (2007) Multiscale topic tomography. Proceedings of the 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 520–529.
Nowak RD (1999) Bayesian Inference in Wavelet-Based Models. New York: Springer.
Parikh R, Karlapalem K (2013) ET: Events from tweets. Proceedings of the 22nd International Conference on World Wide Web. 613–620.
Pennington J, Socher R, Manning C (2014) GloVe: Global vectors for word representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. 1532–1543.
Sukhbaatar S, Szlam A, Weston J, Fergus R (2015) End-to-end memory networks. Advances in Neural Information Processing Systems. 2431–2439.
Sutskever I, Vinyals O, Le QV (2014) Sequence to sequence learning with neural networks. Advances in Neural Information Processing Systems. 3104–3112.
Wang X, McCallum A (2006) Topics over time: A non-Markov continuous-time model of topical trends. Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 424–433.
Weston J, Chopra S, Bordes A (2015) Memory networks. Proceedings of the 3rd International Conference on Learning Representations. 8.
Ying L, Yu H, Wang J, Ji Y, Qian S (2021) Fake news detection via multi-modal topic memory network. IEEE Access 9:132818–132829.
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88548 | -
dc.description.abstract | 近年來,區塊鏈受到廣泛的關注,尤其是加密貨幣在區塊鏈生態系統中扮演著重要的腳色。創辦人、項目所有者、愛好者或名人等網紅常透過社交媒體平台(如 Twitter),公開分享他們對加密貨幣相關事件和市場趨勢的觀點,提供了許多寶貴的資訊。然而,大多數的推文長度十分簡短且包含縮寫,要從推文中有效地提取有價值的見解具有相當的挑戰性。一些既有方法並非針對短文本進行設計,或部分既有方法未能充分利用資料的特性來進行主題追蹤。為了解決上述的這些問題,我們提出了一個創新的方法,名為動態主題記憶網絡,自動從推文中提取主題的變化並進行時間主題分析。模型的框架包含四個階段,分別是詞嵌入、圖卷積、鍵值記憶網絡和時間主題分析,在無監督的方式下透過匯集詞嵌入語意關係的相關資訊,並通過關鍵組件和核心向量來生成一致性和多樣性的主題。實驗結果顯示,我們的方法在主題連貫性、主題多樣性、主題質量和質化分析上均優於現有的方法。我們的方法可以幫助區塊鏈架構師和加密貨幣交易者掌握最新動態,並可協助加密貨幣市場經理或去中心化金融經理發覺新興主題和趨勢,以擬定其行銷與定位策略。 | zh_TW
dc.description.abstract | Blockchain has received considerable attention in recent years, with cryptocurrency playing a crucial role in the blockchain ecosystem. Influencers, including founders, project owners, enthusiasts, and celebrities, publicly share their opinions on crypto-related events and market trends through social media platforms such as Twitter, providing valuable information about these ever-changing events. However, most tweets are brief and contain abbreviations, so capturing and extracting valuable insights from them can be challenging. Moreover, some previous methods are not designed for short text, while others do not fully leverage the characteristics of the text data. To resolve these problems, in this study we propose a novel approach, called Dynamic Topic Memory Network (DTMN), to automatically extract the evolution of topics from tweet posts and perform temporal topic analysis. The proposed framework contains four phases, namely token embeddings, graph convolution, key-value memory network, and temporal topic analysis; it aggregates relevant information from neighboring token dependencies and constructs coherent and diverse topics from key components and core vectors in an unsupervised manner. The experimental results show that our approach outperforms state-of-the-art methods in terms of topic coherence, topic diversity, topic quality, and qualitative evaluation. Our approach can help blockchain architects and cryptocurrency traders stay updated on the latest developments, and help crypto marketing managers or decentralized finance managers identify emerging topics and trends for their marketing and positioning strategies. | en
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T16:47:20Z. No. of bitstreams: 0 | en
dc.description.provenance | Made available in DSpace on 2023-08-15T16:47:20Z (GMT). No. of bitstreams: 0 | en
dc.description.tableofcontents | List of Figures ii
List of Tables iii
Chapter 1 Introduction 1
Chapter 2 Related Work 4
2.1 Temporal Topic Model 4
2.2 Memory Network 5
Chapter 3 The Proposed Framework 7
3.1 Token Embeddings 8
3.2 Graph Convolution 8
3.3 Key-Value Memory Network 9
3.4 Temporal Topic Analysis 11
Chapter 4 Experimental Results 12
4.1 Dataset and Experiment Setup 12
4.2 Evaluation Metric and Comparing Methods 13
4.3 Performance Evaluation 14
4.3.1 Quantitative Evaluation 14
4.3.2 Qualitative Evaluation 19
Chapter 5 Conclusions and Future Work 23
References 25
dc.language.iso | en | -
dc.title | 加密貨幣網紅社群貼文主題變化模型 | zh_TW
dc.title | Topic Evolution for Cryptocurrency Influencers on Social Media Platforms | en
dc.type | Thesis | -
dc.date.schoolyear | 111-2 | -
dc.description.degree | 碩士 (Master's) | -
dc.contributor.oralexamcommittee | 鄭麗珍;戴敏育 | zh_TW
dc.contributor.oralexamcommittee | Li-Chen Cheng;Min-Yuh Day | en
dc.subject.keyword | 主題變化, 加密貨幣, 區塊鏈, 時間主題模型, 圖注意力網路, 記憶機制 | zh_TW
dc.subject.keyword | topic evolution, cryptocurrency, blockchain, temporal topic model, graph attention network, memory mechanism | en
dc.relation.page | 27 | -
dc.identifier.doi | 10.6342/NTU202302045 | -
dc.rights.note | 同意授權(限校園內公開) (authorized; on-campus access only) | -
dc.date.accepted | 2023-07-28 | -
dc.contributor.author-college | 管理學院 (College of Management) | -
dc.contributor.author-dept | 資訊管理學系 (Department of Information Management) | -
dc.date.embargo-lift | 2028-07-25 | -
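The abstract above outlines a four-phase framework: token embeddings, graph convolution, a key-value memory network, and temporal topic analysis. As a rough illustration of the key-value memory read step mentioned there, the following is a minimal sketch assuming a PyTorch-style setting; it is not the thesis's implementation, and the names and sizes (KeyValueMemory, num_slots, dim) are illustrative assumptions only.

# Hypothetical sketch: attention-based read from a key-value memory.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KeyValueMemory(nn.Module):
    def __init__(self, num_slots: int, dim: int):
        super().__init__()
        # Learnable memory: one (key, value) pair per slot.
        self.keys = nn.Parameter(torch.randn(num_slots, dim))
        self.values = nn.Parameter(torch.randn(num_slots, dim))

    def forward(self, query: torch.Tensor) -> torch.Tensor:
        # query: (batch, dim); scaled dot-product attention over the keys.
        scores = query @ self.keys.t() / (self.keys.size(-1) ** 0.5)
        weights = F.softmax(scores, dim=-1)   # (batch, num_slots)
        # Read step: attention-weighted sum of the value slots.
        return weights @ self.values          # (batch, dim)

# Illustrative usage with assumed sizes.
memory = KeyValueMemory(num_slots=20, dim=128)
token_vectors = torch.randn(4, 128)
reading = memory(token_vectors)               # shape: (4, 128)

In key-value memory networks of this kind, the keys address the memory via attention while the values hold the content that is read out; this separation is what distinguishes the key-value formulation from a single-matrix memory.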
Appears in Collections: 資訊管理學系 (Department of Information Management)

Files in This Item:
File | Size | Format
ntu-111-2.pdf (currently not authorized for public access) | 834.47 kB | Adobe PDF


Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
