Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51087
Full metadata record
dc.contributor.advisor: 項傑 (Jieh Hsiang)
dc.contributor.author: Meng-Lin Huang (en)
dc.contributor.author: 黃孟霖 (zh_TW)
dc.date.accessioned: 2021-06-15T13:24:58Z
dc.date.available: 2021-08-17
dc.date.copyright: 2020-08-24
dc.date.issued: 2020
dc.date.submitted: 2020-08-18
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51087
dc.description.abstract: 數學文字題包含自然語言理解與邏輯推斷,所以一直以來都被視為人工智慧領域重要的應用。其中小學數學問題並沒有複雜的數學計算與文字敘述,研究人員能讓其設計出來的系統專注於語言理解與邏輯推斷的部分,而不需要涵蓋廣大的文法以及數學相關的 Domain Knowledge,也因此非常適合用來當作衡量的數據集。
本篇論文提出一個新的小學數學解題系統,與現今其他以神經網路建構的系統的不同之處在於此系統著重於邏輯推斷過程。我們的系統除了可以算出答案以外,還能產生出此答案的解釋,並且透過簡化題目來減少所需要設計的規則。此外,針對此系統目前處理不了的問題我們也做了錯誤分析,並設計了新的句子簡化方法,更進一步地將其與系統進行結合。最終,此系統在我們提出的小學數學數據集上得到顯著的成果。 (zh_TW)
dc.description.abstract: Math word problems involve natural language understanding and logical inference, and have therefore long been regarded as an important application of AI. Because elementary math word problems require no complicated mathematical calculation, researchers can concentrate on the tasks of understanding and inference.
In this thesis, we propose a novel elementary math word problem solver. The difference between prevailing neural-network-based systems and ours is that we focus more on the process of logical inference and on domain common sense. Our solver not only answers the question but also gives corresponding explanations. We adopt label sequences as "patterns" to identify similar sentences and problems. Solution strategies are described in natural-language scripts so that teachers can make changes at will. We greatly reduce the number of patterns needed by filtering out non-essential words in problem sentences. Since the explanations are written in natural language, error analysis is very intuitive. Our problem-solving philosophy is based on mimicking how humans learn to solve a problem. Ultimately, our system achieves significant performance on our elementary mathematics dataset. (en)
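The abstract's reduce-then-match idea can be illustrated with a minimal sketch: tag each word of a segmented problem sentence with a coarse label, filter out non-essential words (the "reduction"), and use the surviving label sequence as a pattern that selects a solution script producing both an answer and an explanation. All lexicon entries, label names, and functions below are illustrative assumptions, not code or data from the thesis.

```python
# Hypothetical lexicon mapping segmented words to coarse labels.
LEXICON = {
    "小明": "AGENT", "小華": "AGENT",
    "有": "HAVE", "共": "TOTAL",
    "蘋果": "OBJECT", "梨子": "OBJECT",
    "3": "NUM", "5": "NUM",
}
ESSENTIAL = {"AGENT", "HAVE", "TOTAL", "OBJECT", "NUM"}

def label_sequence(tokens):
    """Tag tokens, then drop words whose labels are non-essential (reduction)."""
    labeled = [(t, LEXICON.get(t, "OTHER")) for t in tokens]
    return [(t, l) for t, l in labeled if l in ESSENTIAL]

def solve(patterns, tokens):
    """Reduce the sentence, look up its label-sequence pattern, run the script."""
    reduced = label_sequence(tokens)
    script = patterns.get(tuple(l for _, l in reduced))
    if script is None:
        return None  # no pattern covers this (reduced) sentence
    return script([t for t, _ in reduced])

# One pattern with its solution script: answer plus natural-language explanation.
patterns = {
    ("AGENT", "HAVE", "NUM", "OBJECT",
     "AGENT", "HAVE", "NUM", "OBJECT", "TOTAL", "HAVE"):
        lambda t: (int(t[2]) + int(t[6]),
                   f"{t[0]}有{t[2]}個, {t[4]}有{t[6]}個, 共{int(t[2]) + int(t[6])}個"),
}

# "小明有3個蘋果, 小華有5個梨子, 共有幾個?" after word segmentation:
tokens = ["小明", "有", "3", "個", "蘋果", ",",
          "小華", "有", "5", "個", "梨子", ",", "共", "有", "幾個"]
answer, explanation = solve(patterns, tokens)
print(answer)  # 8
```

Because the measure words and question words ("個", "幾個") are filtered out before matching, one pattern covers many surface variants of the same sentence type, which is how reduction cuts down the number of patterns that must be written.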
dc.description.provenance: Made available in DSpace on 2021-06-15T13:24:58Z (GMT). No. of bitstreams: 1
U0001-1008202015540300.pdf: 15055239 bytes, checksum: 161dea2870b5c7b5c3b6f92c3f53887c (MD5)
Previous issue date: 2020 (en)
dc.description.tableofcontents:
1 緒論 1
1.1 研究目的與動機 1
1.2 數學文字題 1
1.3 主要貢獻 2
1.4 章節概要 2
2 相關研究 4
2.1 背景 4
2.1.1 循環神經網路(Recurrent Neural Network,RNN) 4
2.1.2 長短期記憶模型(Long Short Term Memory Model,LSTM) 4
2.1.3 Sequence to Sequence模型 5
2.1.4 Transformer 6
2.1.5 TF-IDF 8
2.1.6 Word2Vec 9
2.1.7 BERT 9
2.1.8 Ontology(本體論) 10
2.2 相關研究 11
2.2.1 歷史脈絡 11
2.2.2 小學數學解題系統 13
3 資料集 14
3.1 問題描述與資料集 14
4 小學數學解題系統 16
4.1 小學數學解題流程 16
4.1.1 切句與簡化(reduction) 17
4.1.2 對到框架並產生 instance map 19
4.1.3 改寫(rewrite) 20
4.1.4 跑script與整合解釋 21
5 句子過濾 24
5.1 通用過濾法 24
5.1.1 Rouge-N based 25
5.1.2 Transformer based 25
5.1.3 Pre-trained BERT based 28
5.1.4 Label Sequence 31
6 實驗 34
6.1 小學數學解題系統答題統計及分析 34
6.2 各簡化機制實驗 35
6.2.1 額外答對題數比較 35
6.2.2 句子通順度比較 38
6.2.3 通用簡化法 39
6.2.4 Label Sequence加上小學數學解題系統分析 40
7 結論與未來展望 42
7.1 結論 42
7.2 未來展望 42
Bibliography 44
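Section 5.1.1 of the table of contents names a Rouge-N based filtering method. As a generic illustration of the underlying measure (standard ROUGE-N recall, not code from the thesis), the following sketch scores how much of an original sentence's n-gram content survives in a reduced version:

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_recall(reference, candidate, n=2):
    """Fraction of reference n-grams that also appear in the candidate."""
    ref, cand = ngrams(reference, n), ngrams(candidate, n)
    total = sum(ref.values())
    if total == 0:
        return 0.0
    overlap = sum(min(count, cand[g]) for g, count in ref.items())
    return overlap / total

# A reduced sentence that keeps 4 of the original's 5 unigrams:
original = ["小明", "有", "3", "個", "蘋果"]
reduced = ["小明", "有", "3", "蘋果"]
print(round(rouge_n_recall(original, reduced, n=1), 2))  # 0.8
```

A filter built on this score would keep a candidate reduction only when its recall against the original stays above some threshold, discarding simplifications that throw away too much content.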
dc.language.iso: zh-TW
dc.subject: 數學文字問題 (zh_TW)
dc.subject: 自然語言理解 (zh_TW)
dc.subject: 句子過濾 (zh_TW)
dc.subject: 邏輯推斷 (zh_TW)
dc.subject: 深度學習 (zh_TW)
dc.subject: natural language understanding (en)
dc.subject: logic inference (en)
dc.subject: math word problem (en)
dc.subject: sentence input filtering (en)
dc.subject: deep learning (en)
dc.title: 以簡化法處理數學文字題 (zh_TW)
dc.title: A Reduction Based Approach for Math Word Problem (en)
dc.type: Thesis
dc.date.schoolyear: 108-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee: 許聞廉 (Wen-Lian Hsu), 戴鴻傑 (Hong-Jie Dai)
dc.subject.keyword: 數學文字問題, 自然語言理解, 句子過濾, 邏輯推斷, 深度學習 (zh_TW)
dc.subject.keyword: math word problem, natural language understanding, sentence input filtering, logic inference, deep learning (en)
dc.relation.page: 46
dc.identifier.doi: 10.6342/NTU202002823
dc.rights.note: 有償授權
dc.date.accepted: 2020-08-19
dc.contributor.author-college: 電機資訊學院 (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (zh_TW)
Appears in Collections: 資訊工程學系

Files in This Item:
U0001-1008202015540300.pdf (Restricted Access), 14.7 MB, Adobe PDF