Please use this identifier to cite or link to this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/50214

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 林守德 | |
| dc.contributor.author | Yi Huang | en |
| dc.contributor.author | 黃伊 | zh_TW |
| dc.date.accessioned | 2021-06-15T12:32:47Z | - |
| dc.date.available | 2016-08-03 | |
| dc.date.copyright | 2016-08-03 | |
| dc.date.issued | 2016 | |
| dc.date.submitted | 2016-08-02 | |
| dc.identifier.citation | [1] G. Salton and M. McGill, editors. Introduction to Modern Information Retrieval. McGraw-Hill, 1983.
[2] S. Deerwester, S. Dumais, T. Landauer, G. Furnas, and R. Harshman. Indexing by latent semantic analysis. Journal of the American Society for Information Science, 41(6):391–407, 1990.
[3] W. Xu, X. Liu, and Y. Gong. Document clustering based on non-negative matrix factorization. In Proceedings of the ACM SIGIR Conference, 2003.
[4] T. Hofmann. Probabilistic latent semantic indexing. In Proceedings of the Twenty-Second Annual International SIGIR Conference, 1999.
[5] D. Blei, A. Ng, and M. Jordan. Latent Dirichlet allocation. Journal of Machine Learning Research, 3:993–1022, 2003.
[6] P. Xie and E. P. Xing. Integrating document clustering and topic modeling. In UAI, 2013.
[7] S. Xiaoping. Textual document clustering using topic models. In Semantics, Knowledge and Grids (SKG), pp. 1–4, 2014.
[8] Y. W. Teh, M. I. Jordan, M. J. Beal, and D. M. Blei. Hierarchical Dirichlet processes. Journal of the American Statistical Association, 101(476):1566–1581, 2006.
[9] D. Blei and M. Jordan. Variational inference for Dirichlet process mixtures. Bayesian Analysis, 1(1):121–144, 2006.
[10] https://github.com/blei-lab/hdp
[11] http://ana.cachopo.org/datasets-for-single-label-text-categorization
[12] http://qwone.com/~jason/20Newsgroups/
[13] http://glaros.dtc.umn.edu/gkhome/cluto/cluto/overview | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/50214 | - |
| dc.description.abstract | 本論文設計了一個結合主題模型的非參數文本分群模型。我們的模型假設群的數量是直接從數據中學習得到的。模型同時對文本表達和非參數分群兩個任務進行優化。非參數分群的部分使用的是狄利克雷過程混合模型,文本表達的部分結合的是分層狄利克雷過程。我們使用變分推斷方法來近似後驗分佈,並採用EM算法學習所有的變量。實驗驗證了模型的有效性。 | zh_TW |
| dc.description.abstract | We describe a nonparametric document clustering model that leverages topic modeling. In our model, the number of clusters is not fixed in advance but is inferred from the data. The model jointly optimizes two tasks: representing each document by its topic distribution, and nonparametric clustering in this transformed topic space. The clustering component is based on the Dirichlet process mixture model (DPM), while the topic modeling component shares a structure similar to the hierarchical Dirichlet process (HDP). We employ variational inference to approximate the intractable posterior distribution and adopt the EM algorithm for parameter learning. Experiments on a variety of datasets demonstrate the effectiveness of the model. (A minimal illustrative sketch of this pipeline is given after the metadata table below.) | en |
| dc.description.provenance | Made available in DSpace on 2021-06-15T12:32:47Z (GMT). No. of bitstreams: 1 ntu-105-R03922145-1.pdf: 1305339 bytes, checksum: 7b3883f0dc56bb2e19bd6477e88c444c (MD5) Previous issue date: 2016 | en |
| dc.description.tableofcontents | 口試委員會審定書 (Oral Examination Committee Certification)
誌謝 (Acknowledgements in Chinese)
Acknowledgements
摘要 (Abstract in Chinese)
Abstract
1 Introduction
2 Dirichlet Process Mixture Model
2.1 Mixture Model
2.2 Dirichlet Process
2.3 Dirichlet Process Mixture Model
3 DCTM
4 Variational Inference and Parameter Learning
4.1 Target Distribution and Evidence Lower Bound
4.2 Variational Inference
4.2.1 Updating q(α|λ)
4.2.2 Updating q(μ|γ)
4.2.3 Updating q(η|ζ)
4.2.4 Updating q(θ|τ)
4.2.5 Updating q(z|φ)
4.2.6 Updating q(β|δ)
4.3 Parameter Learning
4.3.1 Updating ε_k
4.3.2 Updating π
4.3.3 Updating ρ
5 Experiments
5.1 Baseline Method
5.2 Experimental Settings
5.3 Experiments on Common Datasets
5.3.1 Datasets
5.3.2 Results
5.4 Experiments on Datasets with Two-Level Cluster Structure
5.4.1 Datasets
5.4.2 Results
5.5 Influence of the Number of Topics K
5.6 Influence of the Truncation Level T
6 Conclusion and Future Works
References
Appendix A
Appendix B | |
| dc.language.iso | en | |
| dc.subject | 文本分群 | zh_TW |
| dc.subject | 非共軛模型 | zh_TW |
| dc.subject | 混合模型 | zh_TW |
| dc.subject | 非參數模型 | zh_TW |
| dc.subject | 主題模型 | zh_TW |
| dc.subject | document clustering | en |
| dc.subject | mixture model | en |
| dc.subject | nonconjugate model | en |
| dc.subject | nonparametric model | en |
| dc.subject | topic modeling | en |
| dc.title | 應用主題模型的非參數文本分群 | zh_TW |
| dc.title | Nonparametric Document Clustering with Topic Modeling | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 104-2 | |
| dc.description.degree | 碩士 (Master's degree) | |
| dc.contributor.oralexamcommittee | 張嘉惠,陳信希,鄭卜壬 | |
| dc.subject.keyword | 文本分群, 主題模型, 非參數模型, 混合模型, 非共軛模型 | zh_TW |
| dc.subject.keyword | document clustering, topic modeling, nonparametric model, mixture model, nonconjugate model | en |
| dc.relation.page | 35 | |
| dc.identifier.doi | 10.6342/NTU201601803 | |
| dc.rights.note | 有償授權 (paid authorization) | |
| dc.date.accepted | 2016-08-03 | |
| dc.contributor.author-college | 電機資訊學院 | zh_TW |
| dc.contributor.author-dept | 資訊工程學研究所 | zh_TW |
| Appears in Collections: | 資訊工程學系 | |
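To make the pipeline described in the abstract concrete, below is a minimal two-stage sketch in Python: an LDA topic representation followed by a truncated Dirichlet-process Gaussian mixture in topic space, using scikit-learn. This only approximates the idea and is not the thesis's DCTM model, which optimizes the topic representation and the clustering jointly with an HDP-style topic component; here the two stages are trained separately, and the choices of the topic count K, the truncation level T, and the 20 Newsgroups categories are illustrative assumptions rather than values taken from the thesis.

```python
# A minimal two-stage sketch: LDA topic features + truncated DP Gaussian mixture.
# This approximates the idea in the abstract; it is NOT the thesis's joint DCTM model.
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.mixture import BayesianGaussianMixture

# A small slice of 20 Newsgroups (cf. reference [12]); the category list is illustrative.
docs = fetch_20newsgroups(subset="train",
                          categories=["sci.space", "rec.autos", "talk.politics.misc"],
                          remove=("headers", "footers", "quotes")).data

# Stage 1: represent each document by its K-dimensional topic distribution.
K = 20  # assumed number of topics (the thesis studies sensitivity to K in Section 5.5)
counts = CountVectorizer(max_features=5000, stop_words="english").fit_transform(docs)
theta = LatentDirichletAllocation(n_components=K, random_state=0).fit_transform(counts)

# Stage 2: nonparametric clustering in topic space with a truncated DP mixture.
# T is the truncation level; components the data does not need receive near-zero
# weight, so the effective number of clusters is inferred rather than fixed.
T = 30  # assumed truncation level (cf. Section 5.6 on the truncation level T)
dpm = BayesianGaussianMixture(n_components=T,
                              weight_concentration_prior_type="dirichlet_process",
                              covariance_type="diag",
                              max_iter=500,
                              random_state=0)
labels = dpm.fit_predict(theta)
print("effective number of clusters:", len(np.unique(labels)))
```

Because the Dirichlet-process prior drives the weights of unneeded mixture components toward zero, the number of occupied clusters reported at the end is effectively learned from the data, which mirrors the nonparametric behaviour the abstract describes.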
Files in This Item:
| File | Access | Size | Format |
|---|---|---|---|
| ntu-105-1.pdf | Restricted Access | 1.27 MB | Adobe PDF |
