NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/84149

Full metadata record
DC field: value (language)
dc.contributor.advisor: 石百達 (zh_TW)
dc.contributor.advisor: PAI-TA SHIH (en)
dc.contributor.author: 陳威宇 (zh_TW)
dc.contributor.author: WEI-YU CHEN (en)
dc.date.accessioned: 2023-03-19T22:05:24Z
dc.date.available: 2023-12-26
dc.date.copyright: 2022-07-13
dc.date.issued: 2022
dc.date.submitted: 2002-01-01
dc.identifier.citation:
1. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., et al. (2017). Attention is all you need. arXiv preprint arXiv:1706.03762.
2. Borochin, P. A., Cicon, J. E., DeLisle, R. J., & Price, S. M. (2018). The effects of conference call tones on market perceptions of value uncertainty. Journal of Financial Markets, 40, 75-91.
3. Brockman, P., Li, X., & Price, S. M. (2015). Differences in conference call tones: Managers versus analysts. Financial Analysts Journal, 71(4), 24-42.
4. Davis, A. K., Ge, W., Matsumoto, D., & Zhang, J. L. (2015). The effect of manager-specific optimism on the tone of earnings conference calls. Review of Accounting Studies, 20, 639-673.
5. Young, T., Hazarika, D., Poria, S., & Cambria, E. (2017). Recent trends in deep learning based natural language processing.
6. Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473.
7. Price, S. M., Doran, J. S., Peterson, D. R., & Bliss, B. A. (2012). Earnings conference calls and stock returns: The incremental informativeness of textual tone. Journal of Banking and Finance, 36, 992-1011.
8. Kothari, S. P., Li, X., & Short, J. E. (2009). The effect of disclosures by management, analysts, and business press on cost of capital, return volatility, and analyst forecasts: A study using content analysis. The Accounting Review, 84(5), 1639-1670.
9. Saltzman, B., & Yung, J. (2018). A machine learning approach to identifying different types of uncertainty. Economics Letters, 171, 58-62.
10. Jaggi, M., et al. (2021). Text mining of StockTwits data for predicting stock prices.
11. Liu, Z., et al. (2020). FinBERT: A pre-trained financial language representation model for financial text mining. IJCAI 2020 Special Track on AI in FinTech, 4513-4519.
12. Gupta, A., et al. (2020). Comprehensive review of text-mining applications in finance.
13. Rapheal, O., Daniel, S., & Ida, P. (2021). A two-step optimised BERT-based NLP algorithm for extracting sentiment from financial news. 17th IFIP WG 12.5 International Conference (AIAI 2021), Hersonissos, 745-755.
14. Du, J., Grave, E., Gunel, B., et al. (2020). Self-training improves pre-training for natural language understanding.
15. Shapiro, A. H., & Wilson, D. (2019). Taking the Fed at its word: A new approach to estimating central bank objectives using text analysis. Federal Reserve Bank of San Francisco Working Paper 2019-02.
16. Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. BERT: Pre-training of deep bidirectional transformers for language understanding.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/84149
dc.description.abstract (zh_TW):
Interest rates represent the time value of money and are an indispensable indicator in financial activity. Given the current strength of the US financial sector, the US dollar interest rate draws the most attention from investors. The main purpose of this study is to identify the relationship between the textual content of the Beige Book and FOMC interest rate decisions. Using Beige Books published from 1983 to 2020, the text is analyzed and used for prediction with NLP methods, with BERT as the training model, in an attempt to extract from the Beige Book's descriptions of economic conditions their likely effect on overall interest rate movements and thereby predict rate decisions.

When analyzing the Beige Book text, BERT is used as the main model, combined with TF-IDF, Random Forest, and self-labeling, to train a rate-hike model and a rate-cut model separately. Among the rate-hike models, the self-labeling variant performs best, with a total accuracy of 66.7% and a recall of 66.7% for rate-hike decisions; among the rate-cut models, the combination of self-labeling and TF-IDF works best, with a total accuracy of 78.3% and a recall of 40% for rate-cut events.

The results show that, comparing the two, the rate-hike models generally predict better than the rate-cut models. The presumed main reason is that rate hikes usually accompany a warming market, so the trajectory of economic growth is relatively easy to trace in the Beige Book; by contrast, rate cuts tend to follow sudden events that shock the economy and call for stimulus, and because such events are abrupt, their context is hard to find in the Beige Book, so the rate-cut models generally perform worse. Among the data-processing methods, self-labeling clearly improves prediction results, while the improvements from TF-IDF and Random Forest are more limited.
dc.description.abstract (en):
Interest is, in other words, the cost of money, and it is one of the most important factors in finance. Because of the dominance of the US dollar, the US dollar interest rate is also a leading indicator for financial markets. The FOMC (Federal Open Market Committee) is the committee through which the Fed sets national monetary policy, including changes in interest rates, and the Beige Book is the report in which the Fed gathers anecdotal information on current economic conditions; those conditions may be the trigger of an interest rate decision. We may therefore be able to use Beige Book data to predict FOMC interest rate decisions and be better prepared for market changes.

When analyzing the text data of the Beige Book, the BERT model is used as the main model, combined with TF-IDF, Random Forest, and self-labeling, to train an interest-increasing model and an interest-decreasing model respectively (an illustrative sketch of such a pipeline follows the metadata record below). The model with self-labeling is the best at predicting interest increases, with a total accuracy of 66.7% and a recall of 66.7% for interest-increasing events; for interest decreases, the combination of self-labeling and TF-IDF is the best, with a total accuracy of 78.3% and a recall of 40% for interest-decreasing events.

The research shows that the interest-increasing models generally perform better than the interest-decreasing models. The presumed main reason is that when interest rates are raised, the market is usually heating up, so it is easier to find the context of economic growth in the Beige Book; on the other hand, when the market tends toward lower rates, sudden events have often hit the economy, and because of their suddenness it is difficult to find their context in the Beige Book, which makes the interest-decreasing models perform worse. Among the data-processing methods, self-labeling significantly improves the prediction results, while the improvements provided by TF-IDF and Random Forest are relatively limited.
dc.description.provenance (en):
Made available in DSpace on 2023-03-19T22:05:24Z (GMT). No. of bitstreams: 1
U0001-0107202202201700.pdf: 1893223 bytes, checksum: edefcc9d69687ffa2ad8074c8536e1ea (MD5)
Previous issue date: 2022
dc.description.tableofcontents:
Chapter 1: Introduction 1
  Section 1: Research Motivation 1
  Section 2: Research Objectives 1
  Section 3: Research Limitations 1
Chapter 2: Literature Review 2
  Section 1: Definitions 2
    1. Federal Funds Rate (Fed Fund Rate) 2
    2. Federal Open Market Committee (FOMC) 2
    3. Beige Book 3
  Section 2: Natural Language Processing 4
    1. Trends in NLP Models 4
    2. Google BERT Model 6
  Section 3: Applications in Finance 7
    1. Company Performance Analysis 7
    2. Economic Overview Analysis 7
Chapter 3: Research Methods 9
  Section 1: Data Processing 9
  Section 2: Data Enhancement 11
    1. TF-IDF 11
    2. Random Forest 12
  Section 3: Model Training 12
    1. BERT Fine-tuning 12
    2. Self-Labeling 14
  Section 4: Model Validation 14
Chapter 4: Results 16
  Section 1: Data Composition 16
  Section 2: Model Prediction Results 17
    1. BERT Model 17
    2. Self-labeling BERT Model 19
    3. Rate Hike/Cut Decision Prediction 24
Chapter 5: Conclusions and Suggestions 26
  Section 1: Conclusions 26
  Section 2: Suggestions for Future Research 27
References 28
dc.language.iso: zh_TW
dc.title: 以褐皮書資料預測FOMC利率決策 (zh_TW)
dc.title: Beige Book for Predicting FOMC Interest Decision (en)
dc.type: Thesis
dc.date.schoolyear: 110-2
dc.description.degree: Master's
dc.contributor.coadvisor: 蔡政安 (zh_TW)
dc.contributor.coadvisor: CHEN-AN TSAI (en)
dc.contributor.oralexamcommittee: 洪偉峰; 盧佳琪 (zh_TW)
dc.contributor.oralexamcommittee: WEI-FENG HUNG; CHIA-CHI LU (en)
dc.subject.keyword: 褐皮書, 利率預測, Bert, 文字探勘, 深度學習 (zh_TW)
dc.subject.keyword: Beige Book, Interest Predict, Bert, Text Mining, Deep Learning (en)
dc.relation.page: 29
dc.identifier.doi: 10.6342/NTU202201236
dc.rights.note: Authorization granted (campus access only)
dc.date.accepted: 2022-07-08
dc.contributor.author-college: 共同教育中心
dc.contributor.author-dept: 統計碩士學位學程
dc.date.embargo-lift: 2027-07-01
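
The abstracts above describe the modeling pipeline only at a high level. The following Python sketch (not the author's code) shows one way such a pipeline could look under stated assumptions: a Hugging Face BERT classifier is fine-tuned on Beige Book passages labeled by the subsequent FOMC decision, and the total accuracy and hike-class recall reported in the abstract are then computed. The model name ("bert-base-uncased"), example passages, labels, and hyperparameters are placeholders; the TF-IDF, Random Forest, and self-labeling variants discussed in the thesis are not shown.

# Minimal illustrative sketch: fine-tune BERT to classify Beige Book passages
# as "followed by a rate hike" (1) vs. "no hike" (0), then report the metrics
# used in the abstract. Texts and labels below are hypothetical placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertTokenizer, BertForSequenceClassification
from sklearn.metrics import accuracy_score, recall_score

texts = [
    "Economic activity expanded at a solid pace across most districts.",
    "Contacts reported weakening demand and rising layoffs.",
]
labels = [1, 0]  # 1 = rate hike followed, 0 = no hike (placeholder labels)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Tokenize the passages and build a simple training loader.
enc = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], torch.tensor(labels))
loader = DataLoader(dataset, batch_size=2, shuffle=True)

# Standard fine-tuning loop with a small learning rate.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):
    for input_ids, attention_mask, y in loader:
        optimizer.zero_grad()
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()
        optimizer.step()

# Evaluation: "Total Accuracy" and the recall of the hike class, the two
# figures quoted in the abstract (66.7% / 66.7% for the best hike model).
model.eval()
with torch.no_grad():
    logits = model(**enc).logits
preds = logits.argmax(dim=-1).tolist()
print("total accuracy:", accuracy_score(labels, preds))
print("recall (hike class):", recall_score(labels, preds, pos_label=1, zero_division=0))

In the thesis's setup, separate binary models are trained for rate hikes and rate cuts; the same evaluation code would apply to each, with pos_label marking the hike or cut class respectively.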
Appears in Collections: 統計碩士學位學程

Files in This Item:
File | Size | Format
ntu-110-2.pdf (currently not authorized for public access) | 1.85 MB | Adobe PDF