Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/37177
Full metadata record
DC Field: Value (Language)
dc.contributor.advisor: 林智仁 (Chih-Jen Lin)
dc.contributor.author: Cheng-Yu Lee (en)
dc.contributor.author: 李振宇 (zh_TW)
dc.date.accessioned: 2021-06-13T15:20:41Z
dc.date.available: 2008-08-06
dc.date.copyright: 2008-08-06
dc.date.issued: 2008
dc.date.submitted: 2008-07-22
dc.identifier.citation:
G. Andrew and J. Gao. Scalable training of L1-regularized log-linear models. In ICML, 2007.
S. Benson and J. J. Moré. A limited memory variable metric method for bound constrained minimization. Technical report, Argonne National Laboratory, 2001.
D. P. Bertsekas. Nonlinear Programming. Athena Scientific, Belmont, MA, second edition, 1999.
R. H. Byrd, P. Lu, J. Nocedal, and C. Zhu. A limited memory algorithm for bound constrained optimization. SIAM J. Sci. Statist. Comput., 16:1190-1208, 1995.
A. Genkin, D. D. Lewis, and D. Madigan. Large-scale Bayesian logistic regression for text categorization. Technometrics, 49(3):291-304, 2007.
J. Kazama and J. Tsujii. Evaluation and extension of maximum entropy models with inequality constraints. In EMNLP, 2003.
K. Koh, S.-J. Kim, and S. Boyd. An interior-point method for large-scale L1-regularized logistic regression. JMLR, 2007. To appear.
P. Komarek and A. W. Moore. Making logistic regression a core data mining tool. Technical report, Carnegie Mellon University, 2005.
C.-J. Lin and J. J. Moré. Newton's method for large-scale bound constrained problems. SIAM Journal on Optimization, 9:1100-1127, 1999.
C.-J. Lin, R. C. Weng, and S. S. Keerthi. Trust region Newton method for large-scale logistic regression. In ICML, 2007.
T. Zhang and F. J. Oles. Text categorization based on regularized linear classification methods. Information Retrieval, 4(1):5-31, 2001.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/37177
dc.description.abstract: Logistic regression is a technique commonly applied to document classification and computational linguistics. L1-regularized logistic regression can be viewed as a form of feature selection, but its non-differentiability makes the problem more difficult. In recent years several optimization methods have been applied to this problem, yet no rigorous comparison among them has been made. In this thesis we propose a trust region Newton method and compare it with several existing optimization methods. Experimental results show that the proposed method is competitive with state-of-the-art methods. A further experiment compares L1- and L2-regularized logistic regression and confirms that, at similar accuracy, L1-regularized logistic regression yields a sparser solution vector than L2-regularized logistic regression. (zh_TW)
dc.description.abstract: Large-scale logistic regression is useful for document classification and computational linguistics. The L1-regularized form can be used for feature selection, but its non-differentiability makes training more difficult. Various optimization methods have been proposed in recent years, but no serious comparison among them has been made. In this thesis we propose a trust region Newton method and compare it with several existing methods. Results show that our method is competitive with state-of-the-art L1-regularized logistic regression solvers. To investigate the applicability of L1-regularized logistic regression, we also conduct an experiment showing that, compared to L2-regularized logistic regression, it obtains a sparser solution with similar accuracy. (en)
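For context, a minimal sketch of the optimization problem the abstracts refer to, assuming the standard formulation of L1-regularized logistic regression (the exact notation in the thesis may differ): given training instances $(x_i, y_i)$, $i = 1, \dots, l$, with labels $y_i \in \{-1, +1\}$ and a penalty parameter $C > 0$, one solves

\[
\min_{w} \; \|w\|_1 + C \sum_{i=1}^{l} \log\bigl(1 + e^{-y_i w^{\top} x_i}\bigr).
\]

The non-differentiable $\|w\|_1$ term drives many components of $w$ to exactly zero, which is why this form can serve as feature selection; replacing it with a smooth term such as $\tfrac{1}{2} w^{\top} w$ gives the L2-regularized variant used in the comparison.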
dc.description.provenance: Made available in DSpace on 2021-06-13T15:20:41Z (GMT). No. of bitstreams: 1. ntu-97-R95922035-1.pdf: 4053175 bytes, checksum: 7f925e0a4a25e60206f38782bf900f20 (MD5). Previous issue date: 2008. (en)
dc.description.tableofcontents:
Oral examination committee approval i
Chinese abstract ii
ABSTRACT iii
LIST OF FIGURES vi
LIST OF TABLES vii
CHAPTER
I. Introduction 1
II. Review of Existing Methods 5
2.1 Limited memory BFGS 5
2.2 Interior point methods (IPM) 8
2.3 Coordinate Descent 10
III. A Trust Region Newton Method for Large-Scale L1-Regularized Logistic Regression 15
3.1 The Framework 15
3.2 Cauchy Point 17
3.3 Newton Direction 18
3.4 Discussion and Implementation Issues 20
IV. Experiments 23
4.1 Settings 23
4.2 Issue of Obtaining Sparse Solutions via IPM 24
4.3 Comparison of Different Optimization Methods 25
4.4 Comparison of L1-regularized and L2-regularized Logistic Regression 27
V. Conclusions and Future Work 32
BIBLIOGRAPHY 34
dc.language.iso: en
dc.subject: 邏輯迴歸 (logistic regression) (zh_TW)
dc.subject: 牛頓法 (Newton method) (zh_TW)
dc.subject: 特徵選取 (feature selection) (zh_TW)
dc.subject: 最佳化 (optimization) (zh_TW)
dc.subject: L1正規化 (L1 regularization) (zh_TW)
dc.subject: Feature selection (en)
dc.subject: Newton method (en)
dc.subject: L1-regularized (en)
dc.subject: Optimization (en)
dc.subject: Logistic regression (en)
dc.title: 大規模L1正規化邏輯迴歸最佳化方法之比較 (zh_TW)
dc.title: A Comparison of Optimization Methods for Large-scale L1-regularized Logistic Regression (en)
dc.type: Thesis
dc.date.schoolyear: 96-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 李育杰 (Yuh-Jye Lee), 鮑興國 (Hsing-Kuo Pao)
dc.subject.keyword: 邏輯迴歸, 最佳化, L1正規化, 牛頓法, 特徵選取 (logistic regression, optimization, L1 regularization, Newton method, feature selection) (zh_TW)
dc.subject.keyword: Logistic regression, Optimization, L1-regularized, Newton method, Feature selection (en)
dc.relation.page: 33
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2008-07-24
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) (zh_TW)
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
File | Size | Format | Access
ntu-97-1.pdf | 3.96 MB | Adobe PDF | Restricted (not publicly available)