Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/37177

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 林智仁(Chih-Jen Lin) | |
| dc.contributor.author | Cheng-Yu Lee | en |
| dc.contributor.author | 李振宇 | zh_TW |
| dc.date.accessioned | 2021-06-13T15:20:41Z | - |
| dc.date.available | 2008-08-06 | |
| dc.date.copyright | 2008-08-06 | |
| dc.date.issued | 2008 | |
| dc.date.submitted | 2008-07-22 | |
| dc.identifier.citation | G. Andrew and J. Gao. Scalable training of L1-regularized log-linear models. In ICML, 2007.
S. Benson and J. J. Moré. A limited memory variable metric method for bound constrained minimization. Technical report, Argonne Lab., 2001. D. P. Bertsekas. Nonlinear Programming. Athena Scientific, Belmont, MA 02178-9998, second edition, 1999. R. H. Byrd, P. Lu, J. Nocedal, and C. Zhu. A limited memory algorithm for bound constrained optimization. SIAM J. Sci. Statist. Comput., 16:1190-1208, 1995. A. Genkin, D. D. Lewis, and D. Madigan. Large-scale Bayesian logistic regression for text categorization. Technometrics, 49(3):291-304, 2007. J. Kazama and J. Tsujii. Evaluation and extension of maximum entropy models with inequality constraints. In EMNLP, 2003. K. Koh, S.-J. Kim, and S. Boyd. An interior-point method for large-scale l1-regularized logistic regression. JMLR, 2007. To appear. P. Komarek and A. W. Moore. Making logistic regression a core data mining tool. Technical report, Carnegie Mellon University, 2005. C.-J. Lin and J. J. Moré. Newton's method for large-scale bound constrained problems. SIAM Journal on Optimization, 9:1100-1127, 1999. C.-J. Lin, R. C. Weng, and S. S. Keerthi. Trust region Newton method for large-scale logistic regression. In ICML, 2007. T. Zhang and F. J. Oles. Text categorization based on regularized linear classification methods. Information Retrieval, 4(1):5-31, 2001. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/37177 | - |
| dc.description.abstract | Logistic regression is a technique commonly applied to document classification and computational linguistics. L1-regularized logistic regression can be viewed as a form of feature selection, but its non-differentiability makes the problem harder to solve. In recent years a variety of optimization methods have been applied to this problem, yet they have not been rigorously compared against one another. In this thesis we propose a trust region Newton method and compare it with several existing optimization methods. Experimental results show that the proposed method is competitive with the latest optimization methods. A further experiment compares L1- and L2-regularized logistic regression and confirms that, at similar accuracy, L1-regularized logistic regression yields a sparser solution vector than L2-regularized logistic regression. | zh_TW |
| dc.description.abstract | Large-scale logistic regression is useful for document classification and computational linguistics. The L1-regularized form can be used for feature selection, but its non-differentiability makes training more difficult. Various optimization methods have been proposed in recent years, but no serious comparison among them has been made. In this thesis we propose a trust region Newton method and compare several existing methods. Results show that our method is competitive with state-of-the-art L1-regularized logistic regression solvers. To investigate the applicability of L1-regularized logistic regression, we also conduct an experiment showing that, compared to L2-regularized logistic regression, a sparser solution is obtained with similar accuracy. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-13T15:20:41Z (GMT). No. of bitstreams: 1 ntu-97-R95922035-1.pdf: 4053175 bytes, checksum: 7f925e0a4a25e60206f38782bf900f20 (MD5) Previous issue date: 2008 | en |
| dc.description.tableofcontents | Certification by the Oral Examination Committee i Chinese Abstract ii ABSTRACT iii LIST OF FIGURES vi LIST OF TABLES vii CHAPTER I. Introduction 1 II. Review of Existing Methods 5 2.1 Limited memory BFGS 5 2.2 Interior point methods (IPM) 8 2.3 Coordinate Descent 10 III. A Trust Region Newton Method for Large-Scale L1-Regularized Logistic Regression 15 3.1 The Framework 15 3.2 Cauchy Point 17 3.3 Newton Direction 18 3.4 Discussion and Implementation Issues 20 IV. Experiments 23 4.1 Settings 23 4.2 Issue of Obtaining Sparse Solutions via IPM 24 4.3 Comparison of Different Optimization Methods 25 4.4 Comparison of L1-regularized and L2-regularized Logistic Regression 27 V. Conclusions and Future Work 32 BIBLIOGRAPHY 34 | |
| dc.language.iso | en | |
| dc.subject | Logistic regression | zh_TW |
| dc.subject | Newton method | zh_TW |
| dc.subject | Feature selection | zh_TW |
| dc.subject | Optimization | zh_TW |
| dc.subject | L1-regularization | zh_TW |
| dc.subject | Feature selection | en |
| dc.subject | Newton method | en |
| dc.subject | L1-regularized | en |
| dc.subject | Optimization | en |
| dc.subject | Logistic regression | en |
| dc.title | A Comparison of Optimization Methods for Large-scale L1-regularized Logistic Regression | zh_TW |
| dc.title | A Comparison of Optimization Methods for Large-scale L1-regularized Logistic Regression | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 96-2 | |
| dc.description.degree | Master's | |
| dc.contributor.oralexamcommittee | 李育杰(Yuh-Jye Lee),鮑興國(Hsing-Kuo Pao) | |
| dc.subject.keyword | Logistic regression, Optimization, L1-regularization, Newton method, Feature selection | zh_TW |
| dc.subject.keyword | Logistic regression, Optimization, L1-regularized, Newton method, Feature selection | en |
| dc.relation.page | 33 | |
| dc.rights.note | Paid authorization | |
| dc.date.accepted | 2008-07-24 | |
| dc.contributor.author-college | College of Electrical Engineering and Computer Science | zh_TW |
| dc.contributor.author-dept | Graduate Institute of Computer Science and Information Engineering | zh_TW |
Appears in Collections: Department of Computer Science and Information Engineering
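The abstracts above claim that L1 regularization produces noticeably sparser weight vectors than L2 regularization at comparable accuracy. As a minimal illustration of where that sparsity comes from, the sketch below fits L1-regularized logistic regression with a plain proximal-gradient (ISTA) loop on synthetic data; it is not the trust region Newton method proposed in the thesis, and the data sizes and hyperparameters (`lam`, `step`) are illustrative assumptions, hand-picked for this toy problem.

```python
import numpy as np

# Illustrative sketch only: a basic proximal-gradient (ISTA) solver for
# L1-regularized logistic regression, NOT the thesis's trust region Newton
# method. It demonstrates why the L1 penalty yields exact zeros in the
# weight vector (i.e., feature selection).

rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = 3.0  # only the first 5 features are informative
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(n))  # labels in {-1, +1}

def grad(w):
    # gradient of the mean logistic loss (1/n) * sum_i log(1 + exp(-y_i x_i^T w))
    z = y * (X @ w)
    return -(X * (y / (1.0 + np.exp(z)))[:, None]).mean(axis=0)

lam, step = 0.1, 0.5  # L1 weight and step size (assumed, tuned for this toy data)
w = np.zeros(d)
for _ in range(1000):
    w = w - step * grad(w)                                 # gradient step on the smooth loss
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0)  # soft-thresholding: creates exact zeros

sparsity = int(np.sum(w == 0))
print(f"{sparsity} of {d} weights are exactly zero")
```

The soft-thresholding step is the mechanism behind the feature-selection behavior described in the abstract: any coordinate whose loss gradient stays below the L1 weight is driven exactly to zero, so on this toy data most of the 45 uninformative weights typically end up exactly zero, while an L2 penalty would only shrink them toward zero.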
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-97-1.pdf (restricted access) | 3.96 MB | Adobe PDF |
Unless their copyright terms state otherwise, all items in this repository are protected by copyright, with all rights reserved.
