Please use this Handle URI to cite this document:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/79759

Full metadata record
| DC 欄位 | 值 | 語言 |
|---|---|---|
| dc.contributor.advisor | 魏志平(Chih-Ping Wei) | |
| dc.contributor.author | Kai-Yi Lin | en |
| dc.contributor.author | 林楷翊 | zh_TW |
| dc.date.accessioned | 2022-11-23T09:10:09Z | - |
| dc.date.available | 2021-08-20 | |
| dc.date.available | 2022-11-23T09:10:09Z | - |
| dc.date.copyright | 2021-08-20 | |
| dc.date.issued | 2021 | |
| dc.date.submitted | 2021-08-18 | |
| dc.identifier.citation | 1. Araci, D. (2019). FinBERT: Financial sentiment analysis with pre-trained language models. arXiv preprint arXiv:1908.10063. 2. Azhar, N. A., Pan, G., Seow, P. S., Koh, A., & Tay, W. Y. (2019). Text analytics approach to examining corporate social responsibility. Asian Journal of Accounting and Governance, 11, 85-96. 3. Beltagy, I., Lo, K., & Cohan, A. (2019). SciBERT: A pretrained language model for scientific text. arXiv preprint arXiv:1903.10676. 4. Capelle-Blancard, G., & Petit, A. (2019). Every little helps? ESG news and stock market reaction. Journal of Business Ethics, 157(2), 543-565. 5. Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805. 6. Delmas, M. A., & Burbano, V. C. (2011). The drivers of greenwashing. California Management Review, 54(1), 64-87. 7. Friede, G., Busch, T., & Bassen, A. (2015). ESG and financial performance: Aggregated evidence from more than 2000 empirical studies. Journal of Sustainable Finance & Investment, 5(4), 210-233. 8. Giese, G., Nagy, Z., & Lee, L. E. (2021). Deconstructing ESG ratings performance: Risk and return for E, S, and G by time horizon, sector, and weighting. The Journal of Portfolio Management, 47(3), 94-111. 9. Guo, T. (2020). ESG2Risk: A deep learning framework from ESG news to stock volatility prediction. Available at SSRN 3593885. 10. He, M., Song, Y., Xu, K., & Yu, D. (2020). On the role of conceptualization in commonsense knowledge graph construction. arXiv preprint arXiv:2003.03239. 11. Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735-1780. 12. Hutto, C., & Gilbert, E. (2014). VADER: A parsimonious rule-based model for sentiment analysis of social media text. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 8, No. 1). 13. Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980. 14. Kiriu, T., & Nozaki, M. (2020). A text mining model to evaluate firms' ESG activities: An application for Japanese firms. Asia-Pacific Financial Markets, 27(4), 621-632. 15. Lee, J., Yoon, W., Kim, S., Kim, D., Kim, S., So, C. H., & Kang, J. (2020). BioBERT: A pre-trained biomedical language representation model for biomedical text mining. Bioinformatics, 36(4), 1234-1240. 16. Liew, W. T., Adhitya, A., & Srinivasan, R. (2014). Sustainability trends in the process industries: A text mining-based analysis. Computers in Industry, 65(3), 393-400. 17. Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., & Dean, J. (2013). Distributed representations of words and phrases and their compositionality. In Advances in Neural Information Processing Systems (pp. 3111-3119). 18. Nugent, T., Stelea, N., & Leidner, J. L. (2020). Detecting ESG topics using domain-specific language models and data augmentation approaches. arXiv preprint arXiv:2010.08319. 19. Peters, M. E., Neumann, M., Iyyer, M., Gardner, M., Clark, C., Lee, K., & Zettlemoyer, L. (2018). Deep contextualized word representations. arXiv preprint arXiv:1802.05365. 20. Shahi, A. M., Issac, B., & Modapothala, J. R. (2014). Automatic analysis of corporate sustainability reports and intelligent scoring. International Journal of Computational Intelligence and Applications, 13(01), 1450006. 21. Schuster, M., & Paliwal, K. K. (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45(11), 2673-2681. 22. Serafeim, G., & Yoon, A. (2021). Stock price reactions to ESG news: The role of ESG ratings and disagreement. Harvard Business School Accounting & Management Unit Working Paper (21-079). 23. Sun, C., Qiu, X., Xu, Y., & Huang, X. (2019). How to fine-tune BERT for text classification? In China National Conference on Chinese Computational Linguistics (pp. 194-206). Springer, Cham. 24. US SIF Foundation (2020). 2020 Biennial Report on US Sustainable, Responsible and Impact Investing Trends. 25. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017). Attention is all you need. In Advances in Neural Information Processing Systems (pp. 5998-6008). 26. Vo, N. N., He, X., Liu, S., & Xu, G. (2019). Deep learning for decision making and the optimization of socially responsible investments and portfolio. Decision Support Systems, 124, 113097. 27. Zhao, C., Guo, Y., Yuan, J., Wu, M., Li, D., Zhou, Y., & Kang, J. (2018). ESG and corporate financial performance: Empirical evidence from China's listed power generation companies. Sustainability, 10(8), 2607. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/79759 | - |
| dc.description.abstract | Environmental, social, and governance (ESG) principles have become one of the most transformative issues of our time. Environmentalists, consumers, investors, and even corporations themselves have begun to pay attention to the sustainability performance of a company and its supply chain. As ESG draws growing attention, some organizations have established ESG rating agencies. However, under the current agency-based rating framework, producing a rating requires collecting a company's ESG reports and other related information, which costs rating agencies substantial time and labor. Therefore, leveraging the rapid development of natural language processing (NLP), we propose a technique that automatically classifies a company's ESG performance from news documents and other publicly available information. Specifically, we build this rating system with a vector space model and deep learning-based models. In addition, given the success of pre-trained language models on a variety of language tasks, we also investigate BERT and its domain-specific extension, ESG-BERT. In this thesis, we describe in detail the text mining models we adopt, the datasets, and the evaluation results. | zh_TW |
| dc.description.provenance | Made available in DSpace on 2022-11-23T09:10:09Z (GMT). No. of bitstreams: 1 U0001-1708202116110200.pdf: 2004576 bytes, checksum: 1a018f83eaab690a5255b3c05cf4b400 (MD5) Previous issue date: 2021 | en |
| dc.description.tableofcontents | 口試委員審定書 i 誌謝 ii Abstract iii 摘要 iv Introduction 1 1.1 Background 1 1.2 Research Motivation 2 1.3 Research Objective 4 Literature Review 6 2.1 Text Mining on ESG domain 6 2.2 Pre-trained Language Model 7 2.2.1 BERT (Bidirectional Encoder Representations from Transformers) 8 2.2.2 BERT on Specific Domains 9 Methodology 11 3.1 Vector Space Model 12 3.2 News Embedding via BERT Encoder Model 14 3.2.1 Normal Average 16 3.2.2 ESG Weighted Average 17 3.3 BERT for Classification 18 3.4 LSTM with Embedding Layer 20 Empirical Evaluation 22 4.1 Data Collection 22 4.1.1 Company Selection 22 4.1.2 Company News Data 23 4.1.3 Company ESG Scores 23 4.2 Classification Algorithms 27 4.2.1 k Nearest Neighbors 28 4.2.2 Random Forest 28 4.2.3 XGBoost 28 4.2.4 Backpropagation Neural Network 29 4.3 Experiment Setting 29 4.3.1 Evaluation Metrics 29 4.3.2 Training Strategies and Hyperparameters Setting 30 4.4 Experimental Results and Discussions 32 4.4.1 Vector Space Model 32 4.4.2 News Embedding via BERT Encoder Model 35 4.4.3 BERT for Classification 36 4.4.4 LSTM with Embedding Layer 37 4.4.5 Model Comparison and Discussions 38 4.5 Additional Experiments 39 Conclusion 42 References 44 Appendix A - E-Description Document 48 Appendix B - S-Description Document 50 Appendix C - G-Description Document 52 | |
| dc.language.iso | en | |
| dc.subject | 預訓練語言模型 | zh_TW |
| dc.subject | 環境社會治理 | zh_TW |
| dc.subject | 企業ESG表現分類預測技術 | zh_TW |
| dc.subject | 向量空間模型 | zh_TW |
| dc.subject | 深度學習 | zh_TW |
| dc.subject | 自然語言處理 | zh_TW |
| dc.subject | deep learning | en |
| dc.subject | pre-trained language model | en |
| dc.subject | natural language processing | en |
| dc.subject | ESG | en |
| dc.subject | ESG performance classification | en |
| dc.subject | text mining | en |
| dc.subject | vector space model | en |
| dc.title | 運用商業新聞與文字探勘技術預測公司永續治理表現 | zh_TW |
| dc.title | Predicting ESG Performance of Firms Using Business News: A Text Mining Approach | en |
| dc.date.schoolyear | 109-2 | |
| dc.description.degree | Master's | |
| dc.contributor.oralexamcommittee | 楊錦生(Hsin-Tsai Liu),吳家齊(Chih-Yang Tseng) | |
| dc.subject.keyword | 環境社會治理,企業ESG表現分類預測技術,向量空間模型,深度學習,自然語言處理,預訓練語言模型 | zh_TW |
| dc.subject.keyword | ESG,ESG performance classification,text mining,vector space model,deep learning,natural language processing,pre-trained language model | en |
| dc.relation.page | 53 | |
| dc.identifier.doi | 10.6342/NTU202102439 | |
| dc.rights.note | Consent granted (open access worldwide) | |
| dc.date.accepted | 2021-08-19 | |
| dc.contributor.author-college | College of Management | zh_TW |
| dc.contributor.author-dept | Graduate Institute of Information Management | zh_TW |
Appears in Collections: Department of Information Management
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| U0001-1708202116110200.pdf | 1.96 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
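The vector space model named in the abstract (Section 3.1 of the table of contents) can be read as comparing TF-IDF vectors of news documents against per-pillar description documents (cf. Appendices A-C) by cosine similarity. Below is a minimal, stdlib-only sketch under that reading; the toy token lists and function names are illustrative assumptions, not taken from the thesis.

```python
import math
from collections import Counter

def tfidf_vectors(token_docs):
    """Build one sparse TF-IDF vector (dict: term -> weight) per tokenized document."""
    n = len(token_docs)
    # Document frequency: number of documents each term appears in.
    df = Counter(term for doc in token_docs for term in set(doc))
    vectors = []
    for doc in token_docs:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log(n / df[term])  # tf * idf
            for term, count in tf.items()
        })
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * v.get(term, 0.0) for term, w in u.items())
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Toy ESG "description documents" (hypothetical stand-ins for Appendices A and C)
# and one tokenized news item.
e_desc = ["carbon", "emission", "climate", "pollution"]
g_desc = ["board", "audit", "shareholder", "governance"]
news = ["the", "firm", "cut", "carbon", "emission", "levels"]

vec_e, vec_g, vec_news = tfidf_vectors([e_desc, g_desc, news])
```

Here the news item shares "carbon" and "emission" with the E-description document and nothing with the G-description document, so its cosine similarity to the E vector is higher. In the thesis, such document representations are then fed to classifiers like kNN, Random Forest, and XGBoost (Section 4.2) rather than used as a raw argmax.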
