NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/83251
Full metadata record
dc.contributor.advisor: 莊裕澤 [zh_TW]
dc.contributor.advisor: Yuh-Jzer Joung [en]
dc.contributor.author: 黃翔岳 [zh_TW]
dc.contributor.author: Hsiang-Yueh Huang [en]
dc.date.accessioned: 2023-02-01T17:04:54Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-02-01
dc.date.issued: 2022
dc.date.submitted: 2023-01-17
dc.identifier.citation:
Ahsan, M., Seldon, H. L., & Sayeed, S. (2012). Personal health records: Retrieving contextual information with Google custom search. Studies in Health Technology and Informatics, 182, 10-18.
Anastasiu, C., Behnke, H., Lück, S., Malesevic, V., Najmi, A., & Poveda-Panter, J. (2021). DeepTitle--Leveraging BERT to generate Search Engine Optimized Headlines. arXiv preprint arXiv:2107.10935.
BACKLINKO. (October 10, 2021). Google’s 200 Ranking Factors: The Complete List (2022). https://backlinko.com/google-ranking-factors
BACKLINKO. (October 14, 2022). Here's what we learned about organic click through rate. https://backlinko.com/google-ctr-stats
Chen, Q., Lin, J., Zhang, Y., Yang, H., Zhou, J., & Tang, J. (2019). Towards knowledge-based personalized product description generation in e-commerce. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.
Chitrakala, S., Moratanch, N., Ramya, B., Revanth Raaj, C., & Divya, B. (2016). Concept-based extractive text summarization using graph modelling and weighted iterative ranking. International Conference on Emerging Research in Computing, Information, Communication and Applications.
Chowdhury, G. G. (2003). Natural language processing. Annual Review of Information Science and Technology, 37(1), 51-89. https://doi.org/10.1002/aris.1440370103
Cui, Y., Che, W., Liu, T., Qin, B., & Yang, Z. (2021). Pre-training with whole word masking for Chinese BERT. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 29, 3504-3514.
Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.
El-Kassas, W. S., Salama, C. R., Rafea, A. A., & Mohamed, H. K. (2021). Automatic text summarization: A comprehensive survey. Expert Systems with Applications, 165, 113679.
Gong, Y., Luo, X., Zhu, K. Q., Ou, W., Li, Z., & Duan, L. (2019). Automatic generation of Chinese short product titles for mobile display. Proceedings of the AAAI Conference on Artificial Intelligence.
Kemp, S. (February 18, 2020). DIGITAL 2020: TAIWAN. https://datareportal.com/reports/digital-2020-taiwan
Killoran, J. B. (2013). How to use search engine optimization techniques to increase website visibility. IEEE Transactions on Professional Communication, 56(1), 50-66.
Ledford, J. L. (2015). Search engine optimization bible (Vol. 584). John Wiley & Sons.
Li, H., Yuan, P., Xu, S., Wu, Y., He, X., & Zhou, B. (2020). Aspect-aware multimodal summarization for Chinese e-commerce products. Proceedings of the AAAI Conference on Artificial Intelligence.
Lin, C.-Y. (2004). ROUGE: A package for automatic evaluation of summaries. Text Summarization Branches Out.
Liu, Y. (2019). Fine-tune BERT for extractive summarization. arXiv preprint arXiv:1903.10318.
Liu, Y., & Lapata, M. (2019). Text summarization with pretrained encoders. arXiv preprint arXiv:1908.08345.
Matošević, G. (2018). Text summarization techniques for meta description generation in process of search engine optimization. Computer Science On-line Conference.
MOZ. (May 14, 2019). How Often Does Google Update Its Algorithm? https://moz.com/blog/how-often-does-google-update-its-algorithm
Nallapati, R., Zhou, B., Gulcehre, C., & Xiang, B. (2016). Abstractive text summarization using sequence-to-sequence rnns and beyond. arXiv preprint arXiv:1602.06023.
Tardy, P., et al. (July 30, 2021). rouge [Computer software]. https://github.com/pltrdy/rouge
Peters, M. E., Neumann, M., Iyyer, M., Gardner, M., Clark, C., Lee, K., & Zettlemoyer, L. (2018, June). Deep contextualized word representations. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), New Orleans, Louisiana.
Qiu, X., Sun, T., Xu, Y., Shao, Y., Dai, N., & Huang, X. (2020). Pre-trained models for natural language processing: A survey. Science China Technological Sciences, 63(10), 1872-1897.
Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving language understanding by generative pre-training.
Rognerud, J. (2008). Ultimate guide to search engine optimization: drive traffic, boost conversion rates and make lots of money. Jon Rognerud SEO.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. Advances in neural information processing systems, 30.
Wang, J., Tian, J., Qiu, L., Li, S., Lang, J., Si, L., & Lan, M. (2018). A multi-task learning approach for improving product title compression with user search log data. Proceedings of the AAAI Conference on Artificial Intelligence.
Wang, S., Zhao, X., Li, B., Ge, B., & Tang, D. (2017). Integrating extractive and abstractive models for long text summarization. 2017 IEEE International Congress on Big Data (BigData Congress).
Wu, Y., Schuster, M., Chen, Z., Le, Q. V., Norouzi, M., Macherey, W., Krikun, M., Cao, Y., Gao, Q., & Macherey, K. (2016). Google's neural machine translation system: Bridging the gap between human and machine translation. arXiv preprint arXiv:1609.08144.
Zhu, C., & Wu, G. (2011). Research and analysis of search engine optimization factors based on reverse engineering. 2011 Third International Conference on Multimedia Information Networking and Security.
未來流通. (August 17, 2011). 【商業數據圖解】2020台灣「零售&電商」產業市佔率英雄榜. https://www.mirai.com.tw/2020-taiwan-retail-ec-market-share-analysis/
未來流通. (December 14, 2020). 【商業數據圖解】2020台灣主要零售業別商品結構基因圖譜. https://www.mirai.com.tw/2021-taiwan-retail-industry-commodity-composition-analysis/
財團法人台灣網路資訊中心. (n.d.). 2020台灣網路報告. https://report.twnic.tw/2020/
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/83251
dc.description.abstract: 隨著網際網路的發展,各類產品的銷售管道已不侷限於實體店面販賣,透過電子商務,商家能接觸到比實體店面更多的顧客。隨著PCHOME、淘寶等電子商務平臺的出現,企業或一般民眾都能在平臺上販賣產品。由於同質性產品間競爭激烈,商家為了提升產品的曝光度進而增加電商轉換率,常會針對電子商務平臺設計搜尋引擎最佳化的標題,此類標題通常會以簡潔有力的方式表達產品的特色並吸引消費者目光,以及增加產品被搜索機會的字詞,其目的皆為使產品能在搜尋結果頁中占據較好的排名。但電子商務產品種類繁多,以人力撰寫搜尋引擎最佳化標題較為繁雜,且不同種類的產品標題有其偏重關注的特色。為解決上述問題,本研究將標題生成任務類比為文字摘要生成任務,使用深度學習技術實作一個搜尋引擎最佳化標題生成系統,使用者輸入產品敘述文案後,系統即可生成適合該產品的搜尋引擎最佳化標題。本研究使用BERTSUM預訓練模型,並以TaoDescribe商品敘述資料集訓練系統自產品敘述文案生成搜尋引擎最佳化標題的能力。而在最後的實驗結果中,本系統在自動評估上與其他應用於不同任務的模型有著相當的表現。在搜尋引擎最佳化方面,本系統的生成標題在搜尋結果頁的排名上與原始標題表現相當,且針對不同種類的產品皆可生成符合該產品類別特性的標題。 [zh_TW]
dc.description.abstract: With the development of the Internet, sales channels for products are no longer limited to physical stores; through e-commerce, merchants can reach far more customers. With the emergence of e-commerce platforms such as PCHOME and Taobao, both enterprises and ordinary people can sell products online. Because competition among homogeneous products is fierce, merchants often apply search engine optimization (SEO) techniques to product titles to increase product exposure and conversion rates. Such titles typically express a product's features concisely, attract consumers' attention, and include terms that make the product easier to find through search engines, all with the goal of ranking higher on the search results page. However, e-commerce products are highly diverse, writing SEO titles by hand is laborious, and different product categories emphasize different features. To address this, our study treats title generation as a text summarization task and uses deep learning to implement an SEO title generation system: given a product description, the system generates an SEO title suited to that product. We fine-tune the BERTSUM pre-trained model on the TaoDescribe product description dataset to learn to generate SEO titles from product descriptions. The results show that, under automatic evaluation, our system performs comparably to other models applied to related tasks. In terms of search engine optimization, the titles generated by our system rank on search results pages comparably to the original titles, and the system generates titles that match the characteristics of each product category. [en]
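The abstract evaluates generated titles with automatic summarization metrics, and the thesis's reference list includes ROUGE. As a minimal sketch of what the most basic such metric computes (ROUGE-1 F1, the clipped unigram overlap between a candidate title and a reference title); whitespace tokenization is an assumption here, since Chinese evaluation typically tokenizes at the character or word level, so this is illustrative rather than the thesis's exact setup:

```python
from collections import Counter

def rouge_1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: clipped unigram overlap between a candidate title and a
    reference title, using simple whitespace tokenization."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())  # unigram matches, clipped per token
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Identical titles score 1.0; fully disjoint titles score 0.0.
print(rouge_1_f1("slim fit stretch jeans", "slim fit stretch jeans"))  # 1.0
```

ROUGE-2 and ROUGE-L, also reported by the ROUGE package, follow the same precision/recall/F1 shape but over bigrams and longest common subsequences respectively.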
dc.description.provenance: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-02-01T17:04:54Z. No. of bitstreams: 0 [en]
dc.description.provenance: Made available in DSpace on 2023-02-01T17:04:54Z (GMT). No. of bitstreams: 0 [en]
dc.description.tableofcontents:
Acknowledgements i
Thesis Committee Approval ii
Chinese Abstract iii
ABSTRACT iv
Table of Contents v
List of Figures vii
List of Tables viii
Chapter 1. Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives 4
1.3 Thesis Structure 4
Chapter 2. Literature Review 5
2.1 Search Engine Optimization 5
2.2 Text Summarization 7
2.3 Natural Language Processing and Pre-trained Models 9
2.4 Evaluation Methods 11
2.5 Summary 13
Chapter 3. Research Methods 14
3.1 Research Process 14
3.2 Dataset 15
3.3 BERTSUM and Transformer Decoder Models 15
Chapter 4. Research Results 16
4.1 Hardware and Software Setup 16
4.2 Automatic Text Summarization Results 16
4.3 Title Generation Results 17
4.3.1 Result 1: "Apparel & Footwear" product (jeans) 18
4.3.2 Result 2: "Food" product (bread) 20
4.3.3 Result 3: "Furniture & Sundries" product (sofa) 23
4.3.4 Result 4: "Computers & Peripherals" product (headphones) 25
4.3.5 Result 5: "Makeup & Skincare" product (lipstick) 28
4.4 SEO Experiment Design and Results 30
4.4.1 Experiment Design 30
4.4.2 Google Programmable Search Engine 31
4.4.3 Experiment Procedure 32
4.4.4 SEO Experiment Results 35
4.4.5 Results of Other Models 42
Chapter 5. Conclusion 47
5.1 Research Results 47
5.2 Research Contributions 48
5.3 Research Limitations 48
5.4 Future Research Directions 49
Reference 50
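Chapter 4.4 compares the search-result-page ranks of the generated titles against the original titles, with ranks collected through Google Programmable Search Engine. A minimal sketch of such a rank comparison, assuming per-query ranks have already been collected; the function name and the choice to penalize a title missing from the top results with `not_found_rank=11` are illustrative assumptions, not the thesis's method:

```python
from statistics import mean

def compare_serp_ranks(original_ranks, generated_ranks, not_found_rank=11):
    """Compare search-result ranks obtained with original vs. generated
    titles. `None` means the page did not appear in the top results and is
    penalized with `not_found_rank`. Returns (mean original rank, mean
    generated rank, mean rank difference); a negative difference means the
    generated titles rank better (i.e. lower) on average."""
    orig = [not_found_rank if r is None else r for r in original_ranks]
    gen = [not_found_rank if r is None else r for r in generated_ranks]
    diffs = [g - o for g, o in zip(gen, orig)]
    return mean(orig), mean(gen), mean(diffs)
```

For example, `compare_serp_ranks([1, 3, None], [2, 3, 5])` treats the third original title as rank 11, so the generated titles come out ahead on average despite losing the first query.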
dc.language.iso: zh_TW
dc.subject: 預訓練模型 [zh_TW]
dc.subject: 文本摘要生成 [zh_TW]
dc.subject: 電子商務 [zh_TW]
dc.subject: 搜尋引擎最佳化 [zh_TW]
dc.subject: 深度學習 [zh_TW]
dc.subject: Search Engine Optimization [en]
dc.subject: Text Summarization [en]
dc.subject: Pre-trained Model [en]
dc.subject: Deep Learning [en]
dc.subject: E-commerce [en]
dc.title: 基於BERTSUM的搜尋引擎最佳化商品標題生成模型 [zh_TW]
dc.title: Search Engine Optimization For Product Title Generation Model Based On BERTSUM [en]
dc.title.alternative: Search Engine Optimization For Product Title Generation Model Based On BERTSUM
dc.type: Thesis
dc.date.schoolyear: 111-1
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 陳建錦;黃俊堯;盧信銘 [zh_TW]
dc.contributor.oralexamcommittee: Chien-Chin Chen;Chun-Yao Huang;Hsin-Min Lu [en]
dc.subject.keyword: 文本摘要生成,預訓練模型,深度學習,電子商務,搜尋引擎最佳化 [zh_TW]
dc.subject.keyword: Text Summarization,Pre-trained Model,Deep Learning,E-commerce,Search Engine Optimization [en]
dc.relation.page: 52
dc.identifier.doi: 10.6342/NTU202300134
dc.rights.note: 同意授權(全球公開) (Authorized; open access worldwide)
dc.date.accepted: 2023-01-18
dc.contributor.author-college: 管理學院 (College of Management)
dc.contributor.author-dept: 資訊管理學系 (Department of Information Management)
Appears in collections: 資訊管理學系 (Department of Information Management)

Files in this item:
U0001-0518230116585013.pdf (1.58 MB, Adobe PDF)


Items in this system are protected by copyright, with all rights reserved, unless otherwise indicated.
