Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/72566
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 林守德(Shou-De Lin) | |
dc.contributor.author | Yeh-Wen Tsao | en |
dc.contributor.author | 曹爗文 | zh_TW |
dc.date.accessioned | 2021-06-17T07:01:02Z | - |
dc.date.available | 2024-08-07 | |
dc.date.copyright | 2019-08-07 | |
dc.date.issued | 2019 | |
dc.date.submitted | 2019-08-01 | |
dc.identifier.citation | [1] F. Blasques, P. Gorgi, and S. Koopman. Accelerating score-driven time series models. Journal of Econometrics, 2019.
[2] G. E. Box, G. M. Jenkins, G. C. Reinsel, and G. M. Ljung. Time series analysis: forecasting and control. John Wiley & Sons, 2015.
[3] D. Canaday, A. Griffith, and D. J. Gauthier. Rapid time series prediction with a hardware-based reservoir computer. Chaos: An Interdisciplinary Journal of Nonlinear Science, 28(12):123119, 2018.
[4] C. P. Chen and J. Z. Wan. A rapid learning and dynamic stepwise updating algorithm for flat neural networks and the application to time-series prediction. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 29(1):62–72, 1999.
[5] K. Cho, B. Van Merriënboer, D. Bahdanau, and Y. Bengio. On the properties of neural machine translation: Encoder-decoder approaches. arXiv preprint arXiv:1409.1259, 2014.
[6] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805, 2018.
[7] H. I. Fawaz, G. Forestier, J. Weber, L. Idoumghar, and P.-A. Muller. Deep learning for time series classification: a review. Data Mining and Knowledge Discovery, 33(4):917–963, 2019.
[8] J. H. Friedman. Greedy function approximation: a gradient boosting machine. Annals of Statistics, pages 1189–1232, 2001.
[9] X. Geng, Y. Li, L. Wang, L. Zhang, Q. Yang, J. Ye, and Y. Liu. Spatiotemporal multi-graph convolution network for ride-hailing demand forecasting. In 2019 AAAI Conference on Artificial Intelligence (AAAI'19), 2019.
[10] S. Hochreiter and J. Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, 1997.
[11] D. P. Kingma and J. Ba. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014.
[12] G. Lai, W.-C. Chang, Y. Yang, and H. Liu. Modeling long- and short-term temporal patterns with deep neural networks. In The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, pages 95–104. ACM, 2018.
[13] Y. LeCun, Y. Bengio, and G. Hinton. Deep learning. Nature, 521(7553):436, 2015.
[14] Y. Li, R. Yu, C. Shahabi, and Y. Liu. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv preprint arXiv:1707.01926, 2017.
[15] M. Majidpour, C. Qiu, P. Chu, R. Gadh, and H. R. Pota. Fast prediction for sparse time series: Demand forecast of EV charging stations for cell phone applications. IEEE Transactions on Industrial Informatics, 11(1):242–250, 2014.
[16] Y. Qin, D. Song, H. Chen, W. Cheng, G. Jiang, and G. Cottrell. A dual-stage attention-based recurrent neural network for time series prediction. arXiv preprint arXiv:1704.02971, 2017.
[17] A. J. Smola and B. Schölkopf. A tutorial on support vector regression. Statistics and Computing, 14(3):199–222, 2004.
[18] V. Vapnik, S. E. Golowich, and A. J. Smola. Support vector method for function approximation, regression estimation and signal processing. In Advances in Neural Information Processing Systems, pages 281–287, 1997.
[19] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin. Attention is all you need. In Advances in Neural Information Processing Systems, pages 5998–6008, 2017.
[20] S. Xingjian, Z. Chen, H. Wang, D.-Y. Yeung, W.-K. Wong, and W.-c. Woo. Convolutional LSTM network: A machine learning approach for precipitation nowcasting. In Advances in Neural Information Processing Systems, pages 802–810, 2015.
[21] W. Yan, H. Qiu, and Y. Xue. Gaussian process for long-term time-series forecasting. In 2009 International Joint Conference on Neural Networks, pages 3420–3427. IEEE, 2009.
[22] H. Yao, X. Tang, H. Wei, G. Zheng, and Z. Li. Revisiting spatial-temporal similarity: A deep learning framework for traffic prediction. In 2019 AAAI Conference on Artificial Intelligence (AAAI'19), 2019.
[23] H. Yao, F. Wu, J. Ke, X. Tang, Y. Jia, S. Lu, P. Gong, J. Ye, and Z. Li. Deep multi-view spatial-temporal network for taxi demand prediction. In Thirty-Second AAAI Conference on Artificial Intelligence, 2018.
[24] B. Yu, H. Yin, and Z. Zhu. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. arXiv preprint arXiv:1709.04875, 2017. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/72566 | - |
dc.description.abstract | 時間序列預測在許多領域是重要的研究問題,包含金融市場、天氣、電力消耗 以及交通阻塞。然而近期的研究使用機器學習訓練的時間非常長,需要使用複雜 的模型。因此針對這個問題,我們在這篇論文提出一個簡化的深度學習的模型以 達到有效率的表現。我們的模型單純基於卷積神經網路,並以此學習長期以及短 期的資訊。本研究總共在七個資料集上進行實驗,能在訓練時間上大幅贏過目前 最新的模型,且能獲得近乎相同甚至更好的準確率。 | zh_TW |
dc.description.abstract | Time series forecasting is an important research problem across many domains, including financial markets, weather, electricity consumption, and traffic congestion. However, most recent approaches are time-consuming to train and rely on complex models. In this thesis, we propose a simplified deep learning model that addresses this issue and delivers efficient performance. Our model uses a purely Convolutional Neural Network (CNN) structure to capture both long-term and short-term temporal features. Thorough empirical studies on seven different datasets demonstrate that our model substantially outperforms state-of-the-art methods in training time while achieving comparable or even better accuracy. | en |
dc.description.provenance | Made available in DSpace on 2021-06-17T07:01:02Z (GMT). No. of bitstreams: 1 ntu-108-R06922022-1.pdf: 4149590 bytes, checksum: 38941d8bfd230ed0b3894f3ea4e55f60 (MD5) Previous issue date: 2019 | en |
dc.description.tableofcontents | Contents
Acknowledgments ii
Abstract iii
List of Figures vii
List of Tables viii
Chapter 1 Introduction 1
Chapter 2 Related Works 5
Chapter 3 Methodology 7
3.1 Notations and Problem Formulation 7
3.2 Temporal2D 8
3.3 SpatialTemporal2D 9
3.4 Loss Function and Optimization 10
Chapter 4 Experiments 11
4.1 Datasets and Features 11
4.2 Metrics 13
4.3 Hyper-parameters and Experimental Settings 14
4.4 Results and Discussion 15
Chapter 5 Conclusions 24
Chapter 6 Future Works 25
Bibliography 27 | |
dc.language.iso | en | |
dc.title | 加速基於深度學習的時間序列預測 | zh_TW |
dc.title | A Fast Deep Learning Model for Time Series Prediction | en |
dc.type | Thesis | |
dc.date.schoolyear | 107-2 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 葉彌妍(Mi-Yen Yeh),李政德,駱宏毅 | |
dc.subject.keyword | 時間序列預測, 機器學習, 深度學習, 加速 | zh_TW |
dc.subject.keyword | Time Series Prediction, Machine Learning, Deep Learning, Accelerating | en |
dc.relation.page | 29 | |
dc.identifier.doi | 10.6342/NTU201901589 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2019-08-01 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
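The abstract above describes a forecaster built purely from convolutions that captures both short-term and long-term temporal features. This record contains no code, so the following NumPy sketch is only a hypothetical illustration of that general idea, not the thesis's Temporal2D or SpatialTemporal2D models: the function names `causal_conv1d` and `forecast_next`, the kernel sizes, the dilation factor, and the mixing weights are all invented here. It combines a short causal convolution (recent history) with a dilated causal convolution (long-range history) to produce a one-step-ahead prediction.

```python
import numpy as np

def causal_conv1d(x, w, dilation=1):
    """Causal 1-D convolution: output[t] depends only on
    x[t], x[t - dilation], ..., x[t - (k-1)*dilation]."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # zero-pad the past
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def forecast_next(x, w_short, w_long, dilation, v):
    """Mix a short-term feature and a dilated long-term feature
    at the last time step into a one-step-ahead prediction."""
    f_short = causal_conv1d(x, w_short)[-1]          # recent pattern
    f_long = causal_conv1d(x, w_long, dilation)[-1]  # long-range pattern
    return v[0] * f_short + v[1] * f_long

# Toy usage with random (untrained) kernels on a synthetic series.
rng = np.random.default_rng(0)
x = np.sin(np.arange(64) * 0.3)
w_short, w_long = rng.normal(size=3), rng.normal(size=3)
y_hat = forecast_next(x, w_short, w_long, dilation=8, v=[0.5, 0.5])
print(float(y_hat))
```

In a trained model the kernels and mixing weights would of course be learned (e.g. by minimizing squared forecast error); this sketch only shows how convolutions at two time scales can be combined.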
Appears in Collections: | 資訊工程學系 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-108-1.pdf (Restricted Access) | 4.05 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.