Please use this Handle URI to cite this document:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/68227
Title: | Memory-Augmented Active Few-shot Learning |
Author: | Sipun Kumar Pradhan |
Advisor: | 廖世偉 |
Keywords: | Active Learning, One-shot Learning, Memory-Augmented Neural Networks |
Year: | 2017 |
Degree: | Master |
Abstract: | Interactively selecting the training samples to label from a haystack of unlabeled examples is an extremely challenging task in supervised learning. In many real-world applications, active selection of training examples can significantly reduce the number of labeled examples needed to learn a classification function. Various active-learning strategies have been proposed that iteratively select a single new example from a set of unlabeled examples, query its class label, and then retrain the current classifier. However, to reduce training time, it may be necessary to select batches of new training examples instead of single examples. My research goal in this thesis is to develop learning models that can automatically learn new facts to optimize the selection of examples without having to be retrained on the full corpus. My method applies the active-learning methodology, so the user only needs to label a minimal initial training set and the subsequently queried data.
I investigate a new class of learning models that combine active learning with few-shot learning. The main advantage of this framework is that it requires little feature engineering or domain-specific knowledge while matching or surpassing state-of-the-art results. Furthermore, it can easily be trained for use in any open-domain setting. The premise of active learning is that there are costs associated with labeling and with making incorrect predictions; reinforcement learning allows those costs to be specified explicitly and directly finds a labeling policy that optimizes them. Finally, I show that with memory augmentation my model can reach promising results and learn to perform non-trivial operations. I confirm these results by comparing my system against various well-crafted baselines on standard datasets; future work is also discussed. |
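The iterative select–query–retrain loop described in the abstract can be sketched as pool-based active learning with uncertainty sampling. Everything below (the nearest-centroid classifier, the synthetic two-class data, the budget of 10 queries) is an illustrative assumption, not the thesis's actual model:

```python
# Minimal sketch of pool-based active learning with uncertainty sampling.
# Classifier, data, and query budget are illustrative assumptions only.
import random

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def predict(centroids, x):
    # nearest-centroid classification
    return min(centroids, key=lambda c: dist2(centroids[c], x))

def uncertainty(centroids, x):
    # margin between the two closest class centroids: small margin = uncertain
    d = sorted(dist2(c, x) for c in centroids.values())
    return d[1] - d[0]

random.seed(0)
pool = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(50)] + \
       [((random.gauss(3, 1), random.gauss(3, 1)), 1) for _ in range(50)]
labeled = [pool.pop(0), pool.pop(-1)]           # minimal seed: one per class

for _ in range(10):                             # query budget of 10 labels
    centroids = {y: centroid([x for x, yy in labeled if yy == y]) for y in (0, 1)}
    # select the pool point the current model is least sure about ...
    x, y = min(pool, key=lambda xy: uncertainty(centroids, xy[0]))
    pool.remove((x, y))
    labeled.append((x, y))                      # ... and query its label, then retrain

centroids = {y: centroid([x for x, yy in labeled if yy == y]) for y in (0, 1)}
acc = sum(predict(centroids, x) == y for x, y in pool) / len(pool)
print(f"labeled {len(labeled)} of 100, accuracy on remaining pool = {acc:.2f}")
```

The loop mirrors the abstract's description: select an unlabeled example, query its class label, retrain, repeat, so the classifier is learned from far fewer labels than the full corpus would require.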
URI: | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/68227 |
DOI: | 10.6342/NTU201704312 |
Full-text license: | Paid authorization |
Appears in collections: | Department of Computer Science and Information Engineering |
Files in this item:
File | Size | Format |
---|---|---|
ntu-106-1.pdf (currently not authorized for public access) | 1.22 MB | Adobe PDF |
All documents in this system are protected by copyright, with all rights reserved, unless their copyright terms specify otherwise.