Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94376
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 莊裕澤 | zh_TW |
dc.contributor.advisor | Yuh-Jzer Joung | en |
dc.contributor.author | 賴彥良 | zh_TW |
dc.contributor.author | Yen-Liang Lai | en |
dc.date.accessioned | 2024-08-15T17:09:16Z | - |
dc.date.available | 2024-08-16 | - |
dc.date.copyright | 2024-08-15 | - |
dc.date.issued | 2024 | - |
dc.date.submitted | 2024-08-06 | - |
dc.identifier.citation | Bachman, P., Hjelm, R. D., & Buchwalter, W. (2019). Learning representations by maximizing mutual information across views. Advances in neural information processing systems, 32.
Branco, P., Torgo, L., & Ribeiro, R. P. (2017). Smogn: A pre-processing approach for imbalanced regression. First international workshop on learning with imbalanced domains: Theory and applications, 36–50.
Cao, K., Wei, C., Gaidon, A., Arechiga, N., & Ma, T. (2019). Learning imbalanced datasets with label-distribution-aware margin loss. Advances in neural information processing systems, 32.
Chawla, N. V., Bowyer, K. W., Hall, L. O., & Kegelmeyer, W. P. (2002). SMOTE: Synthetic minority over-sampling technique. Journal of artificial intelligence research, 16, 321–357.
Chen, T., Kornblith, S., Norouzi, M., & Hinton, G. (2020). A simple framework for contrastive learning of visual representations. International conference on machine learning, 1597–1607.
Cui, Y., Jia, M., Lin, T.-Y., Song, Y., & Belongie, S. (2019). Class-balanced loss based on effective number of samples. Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 9268–9277.
Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019, June). BERT: Pre-training of deep bidirectional transformers for language understanding. In J. Burstein, C. Doran, & T. Solorio (Eds.), Proceedings of the 2019 conference of the North American chapter of the association for computational linguistics: Human language technologies, volume 1 (long and short papers) (pp. 4171–4186). Association for Computational Linguistics. https://doi.org/10.18653/v1/N19-1423
Ding, K., Wang, R., & Wang, S. (2019). Social media popularity prediction: A multiple feature fusion approach with deep neural networks. Proceedings of the 27th ACM International Conference on Multimedia, 2682–2686.
Dong, Q., Gong, S., & Zhu, X. (2018). Imbalanced deep learning by minority class incremental rectification. IEEE transactions on pattern analysis and machine intelligence, 41(6), 1367–1381.
Dosovitskiy, A., Beyer, L., Kolesnikov, A., Weissenborn, D., Zhai, X., Unterthiner, T., Dehghani, M., Minderer, M., Heigold, G., Gelly, S., Uszkoreit, J., & Houlsby, N. (2021). An image is worth 16x16 words: Transformers for image recognition at scale. International Conference on Learning Representations. https://openreview.net/forum?id=YicbFdNTTy
Freberg, K., Graham, K., McGaughey, K., & Freberg, L. A. (2010). Who are the social media influencers? A study of public perceptions of personality. Public Relations Review.
Gao, T., Yao, X., & Chen, D. (2021, November). SimCSE: Simple contrastive learning of sentence embeddings. In M.-F. Moens, X. Huang, L. Specia, & S. W.-t. Yih (Eds.), Proceedings of the 2021 conference on empirical methods in natural language processing (pp. 6894–6910). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.emnlp-main.552
Gelli, F., Uricchio, T., Bertini, M., Del Bimbo, A., & Chang, S.-F. (2015). Image popularity prediction in social media using sentiment and context features. Proceedings of the 23rd ACM international conference on Multimedia, 907–910.
Gong, Y., Mori, G., & Tung, F. (2022). RankSim: Ranking similarity regularization for deep imbalanced regression. International Conference on Machine Learning (ICML).
Haixiang, G., Yijing, L., Shang, J., Mingyun, G., Yuanyue, H., & Bing, G. (2017). Learning from class-imbalanced data: Review of methods and applications. Expert systems with applications, 73, 220–239.
Hayes, N. (2008). Influencer marketing: Who really influences your customers? Taylor & Francis.
He, H., & Ma, Y. (2013). Imbalanced learning: Foundations, algorithms, and applications.
He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. Proceedings of the IEEE conference on computer vision and pattern recognition, 770–778.
Hsu, C.-C., Kang, L.-W., Lee, C.-Y., Lee, J.-Y., Zhang, Z.-X., & Wu, S.-M. (2019). Popularity prediction of social media based on multi-modal feature mining. Proceedings of the 27th ACM International Conference on Multimedia, 2687–2691.
Jin, X., Gallagher, A., Cao, L., Luo, J., & Han, J. (2010). The wisdom of social multimedia: Using flickr for prediction and forecast. Proceedings of the 18th ACM international conference on Multimedia, 1235–1244.
Keramati, M., Meng, L., & Evans, R. D. (2023). ConR: Contrastive regularizer for deep imbalanced regression. CoRR, abs/2309.06651. https://doi.org/10.48550/ARXIV.2309.06651
Khosla, P., Teterwak, P., Wang, C., Sarna, A., Tian, Y., Isola, P., Maschinot, A., Liu, C., & Krishnan, D. (2020). Supervised contrastive learning. Advances in neural information processing systems, 33, 18661–18673.
Kim, S., Jiang, J.-Y., Nakada, M., Han, J., & Wang, W. (2020). Multimodal post attentive profiling for influencer marketing. Proceedings of The Web Conference 2020, 2878–2884.
Lai, X., Zhang, Y., & Zhang, W. (2020). Hyfea: Winning solution to social media popularity prediction for multimedia grand challenge 2020. Proceedings of the 28th ACM International Conference on Multimedia, 4565–4569.
McInnes, L., Healy, J., Saul, N., & Großberger, L. (2018). UMAP: Uniform manifold approximation and projection. Journal of Open Source Software, 3(29), 861. https://doi.org/10.21105/joss.00861
McParlane, P. J., Moshfeghi, Y., & Jose, J. M. (2014). "Nobody comes here anymore, it's too crowded"; predicting image popularity on flickr. Proceedings of international conference on multimedia retrieval, 385–391.
Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013). Efficient estimation of word representations in vector space. In Y. Bengio & Y. LeCun (Eds.), 1st international conference on learning representations, ICLR 2013, Scottsdale, Arizona, USA, May 2-4, 2013, workshop track proceedings. http://arxiv.org/abs/1301.3781
Perozzi, B., Al-Rfou, R., & Skiena, S. (2014). DeepWalk: Online learning of social representations. Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining, 701–710.
Radford, A., Kim, J. W., Hallacy, C., Ramesh, A., Goh, G., Agarwal, S., Sastry, G., Askell, A., Mishkin, P., Clark, J., et al. (2021). Learning transferable visual models from natural language supervision. International conference on machine learning, 8748–8763.
Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using siamese BERT-networks. In K. Inui, J. Jiang, V. Ng, & X. Wan (Eds.), EMNLP/IJCNLP (1) (pp. 3980–3990). Association for Computational Linguistics. http://dblp.uni-trier.de/db/conf/emnlp/emnlp2019-1.html#ReimersG19
Rolínek, M., Musil, V., Paulus, A., Vlastelica, M., Michaelis, C., & Martius, G. (2020). Optimizing rank-based metrics with blackbox differentiation. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 7620–7630.
Tian, Y., Sun, C., Poole, B., Krishnan, D., Schmid, C., & Isola, P. (2020). What makes for good views for contrastive learning? Advances in neural information processing systems, 33, 6827–6839.
Torgo, L., Branco, P., Ribeiro, R. P., & Pfahringer, B. (2015). Resampling strategies for regression. Expert systems, 32(3), 465–476.
Torgo, L., Ribeiro, R. P., Pfahringer, B., & Branco, P. (2013). SMOTE for regression. Portuguese conference on artificial intelligence, 378–389.
Wang, Y., Jiang, Y., Li, J., Ni, B., Dai, W., Li, C., Xiong, H., & Li, T. (2022). Contrastive regression for domain adaptation on gaze estimation. Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 19376–19385.
Wu, B., Cheng, W.-H., Liu, P., Liu, B., Zeng, Z., & Luo, J. (2019). SMP challenge: An overview of social media prediction challenge 2019. Proceedings of the 27th ACM International Conference on Multimedia.
Wu, B., Cheng, W.-H., Zhang, Y., Qiushi, H., Jintao, L., & Mei, T. (2017). Sequential prediction of social media popularity with deep temporal context networks. International Joint Conference on Artificial Intelligence (IJCAI).
Wu, B., Mei, T., Cheng, W.-H., & Zhang, Y. (2016). Unfolding temporal dynamics: Predicting social media popularity using multi-scale temporal decomposition. Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI).
Wu, J., Zhao, L., Li, D., Xie, C.-W., Sun, S., & Zheng, Y. (2022). Deeply exploit visual and language information for social media popularity prediction. Proceedings of the 30th ACM International Conference on Multimedia, 7045–7049.
Xu, K., Lin, Z., Zhao, J., Shi, P., Deng, W., & Wang, H. (2020). Multimodal deep learning for social media popularity prediction with attention mechanism. Proceedings of the 28th ACM International Conference on Multimedia, 4580–4584.
Yang, X., Kim, S., & Sun, Y. (2019). How do influencers mention brands in social media? Sponsorship prediction of Instagram posts. Proceedings of the 2019 IEEE/ACM international conference on advances in social networks analysis and mining, 101–104.
Yang, Y., Lv, H., & Chen, N. (2023). A survey on ensemble learning under the era of deep learning. Artificial Intelligence Review, 56(6), 5545–5589.
Yang, Y. (2021). Strategies and tactics for regression on imbalanced data [accessed June 22, 2024]. https://towardsdatascience.com/strategies-and-tactics-for-regression-on-imbalanced-data-61eeb0921fca
Yang, Y., Zha, K., Chen, Y., Wang, H., & Katabi, D. (2021). Delving into deep imbalanced regression. International conference on machine learning, 11842–11851.
Yen, S., & Lee, Y. (2006). Under-sampling approaches for improving prediction of the minority class in an imbalanced dataset. Lecture notes in control and information sciences, 344, 731.
Zha, K., Cao, P., Son, J., Yang, Y., & Katabi, D. (2024). Rank-N-Contrast: Learning continuous representations for regression. Advances in Neural Information Processing Systems, 36. | - |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94376 | - |
dc.description.abstract | 社群貼文的熱門程度往往反映受眾對於內容的喜愛程度,社群網紅或廠商可透過觀察貼文的按讚數變化,制定更有效的行銷策略,進而提升社群行銷的成效。因此,如何準確預測社群貼文的熱門度是一大關鍵。然而,現實世界的社群媒體資料具備不平衡的特性,極冷門與極熱門的貼文往往只具備很少的資料量,造成預測熱門度時的失準。有鑒於近年來對比學習在特徵學習的成功,以及將對比學習概念引進回歸任務的新興趨勢,本研究提出加權排序對比迴歸(Weighted-Rank Contrastive Regression) 損失函數,以解決真實世界的回歸問題中數據不平衡的問題。我們在社群媒體熱門度預測資料集 (B. Wu et al., 2019) 上進行實驗,實驗結果顯示我們的方法優於傳統方法 (僅以 L1 損失函數進行擬合) 和當前的最先進的對比迴歸方法 Rank-N-Contrast (Zha et al., 2024),尤其在處理高熱門貼文、缺乏負樣本數的異常值方面表現更佳。本研究所提出的加權排序對比迴歸損失函數不僅解決了社群媒體熱門度預測中的數據不平衡問題,更提供了一種可泛化的特徵學習方法,可推廣至其他任意的不平衡迴歸任務中。 | zh_TW |
dc.description.abstract | Social Media Popularity Prediction (SMPP) is the task of forecasting the level of engagement a social media post will receive. Accurate popularity prediction is crucial for understanding audience engagement and enabling targeted marketing strategies: the popularity of a post often reflects the audience's preference for its content, and influencers or brands can design more effective campaigns by observing changes in the number of likes, thereby enhancing the effectiveness of social media marketing. However, the inherent imbalance in real-world social media data, where certain popularity levels are underrepresented, poses a significant challenge. In this study, we leverage the recent success of contrastive learning and its growing integration into regression tasks, and introduce a weighted-rank contrastive regression loss to counteract the data imbalance. Experiments on the Social Media Prediction Dataset (B. Wu et al., 2019) demonstrate that our method outperforms the vanilla approach (fitting solely with an L1 loss) and the current state-of-the-art (SOTA) contrastive regression approach, Rank-N-Contrast (Zha et al., 2024), especially for challenging outliers with high popularity and few negative counterparts. The proposed weighted-rank contrastive regression loss not only addresses the inherent data imbalance in SMPP but also offers a robust representation learning solution that can be generalized to other real-world imbalanced regression tasks. | en |
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-08-15T17:09:16Z No. of bitstreams: 0 | en |
dc.description.provenance | Made available in DSpace on 2024-08-15T17:09:16Z (GMT). No. of bitstreams: 0 | en |
dc.description.tableofcontents | Acknowledgements i
摘要 ii
Abstract iii
Contents v
List of Figures vii
List of Tables ix
Chapter 1 Introduction 1
1.1 Background 1
1.2 Research Motivation and Objectives 4
Chapter 2 Literature Review 5
2.1 Social Media Popularity Prediction 6
2.1.1 Manually Preprocessed Features 6
2.1.2 Pretrained Models 7
2.2 Overcome the Imbalance Regression 8
2.2.1 Data Imbalance 10
2.2.2 Deep Imbalance Regression 12
2.2.3 Contrastive Regression 18
2.3 Summary 23
Chapter 3 Methodology 24
3.1 Problem Definition 24
3.2 Overview of Our Proposed Framework 24
3.3 Post Representation Extraction 25
3.4 Dense Features 26
3.5 Weighted-Rank Contrastive Regression 26
Chapter 4 Empirical Experiments 30
4.1 Dataset 30
4.2 Evaluation Metrics 30
4.3 Hyperparameters and Experimental Settings 31
4.3.1 Hyperparameters 31
4.3.2 Experimental Settings 32
4.4 Evaluation Results 32
4.5 Additional Experiments 34
4.5.1 Curated Dataset 34
Chapter 5 Conclusion 37
5.1 Conclusion 37
5.2 Limitations and Future Works 38
References 40
Appendix A — Qualitative Visualization 46
A.1 Feature Similarity 46
A.2 Feature Visualization 47 | - |
dc.language.iso | en | - |
dc.title | 加權排序對比回歸學習用於資料不平衡之社群媒體熱門度預測 | zh_TW |
dc.title | Weighted-Rank Contrastive Regression for Robust Learning on Imbalance Social Media Popularity Prediction | en |
dc.type | Thesis | - |
dc.date.schoolyear | 112-2 | - |
dc.description.degree | Master's | - |
dc.contributor.oralexamcommittee | 陳建錦;楊立偉;陳以錚 | zh_TW |
dc.contributor.oralexamcommittee | Chien-Chin Chen;Li-Wei Yang;Yi-Cheng Chen | en |
dc.subject.keyword | 社群媒體,熱門度預測,對比學習,不平衡回歸,網紅行銷, | zh_TW |
dc.subject.keyword | Social Media, Popularity Prediction, Contrastive Learning, Imbalance Regression, Influencer Marketing | en |
dc.relation.page | 48 | - |
dc.identifier.doi | 10.6342/NTU202403142 | - |
dc.rights.note | Authorized for release (open access worldwide) | - |
dc.date.accepted | 2024-08-09 | - |
dc.contributor.author-college | College of Management | - |
dc.contributor.author-dept | Department of Information Management | - |
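The abstract above names a weighted-rank contrastive regression loss, but this record does not include its formulation. For orientation only, the sketch below implements a Rank-N-Contrast-style contrastive regression objective (Zha et al., 2024, cited above) extended with a per-anchor weight; the function name `weighted_rank_contrastive_loss`, the weighting scheme, and the temperature value are assumptions for this sketch, not the thesis's actual method.

```python
# Illustrative sketch only: a Rank-N-Contrast-style contrastive regression loss
# with a hypothetical per-anchor weight. This is NOT the thesis's actual
# Weighted-Rank Contrastive Regression loss, whose formulation is not part of
# this record.
import torch
import torch.nn.functional as F


def weighted_rank_contrastive_loss(features: torch.Tensor,
                                   labels: torch.Tensor,
                                   weights: torch.Tensor,
                                   temperature: float = 2.0) -> torch.Tensor:
    """features: (N, D) post embeddings; labels: (N,) continuous targets
    (e.g. log like counts); weights: (N,) per-anchor weights (assumed here,
    e.g. inverse frequency of binned labels)."""
    feats = F.normalize(features, dim=1)                     # cosine-similarity space
    sim = feats @ feats.T / temperature                      # (N, N) scaled similarities
    label_dist = (labels[:, None] - labels[None, :]).abs()   # (N, N) label distances
    n = features.size(0)
    not_self = ~torch.eye(n, dtype=torch.bool, device=features.device)

    loss = features.new_zeros(())
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # For anchor i and "positive" j, the denominator ranges over samples
            # whose label is at least as far from y_i as y_j is (rank constraint).
            denom_mask = (label_dist[i] >= label_dist[i, j]) & not_self[i]
            log_prob = sim[i, j] - torch.logsumexp(sim[i][denom_mask], dim=0)
            loss = loss - weights[i] * log_prob               # anchor-level weighting
    return loss / (n * (n - 1))


if __name__ == "__main__":
    # Toy usage: 8 random post embeddings with continuous popularity labels.
    z = torch.randn(8, 16)
    y = torch.rand(8) * 10
    w = torch.ones(8)   # uniform weights recover a plain rank-contrast loss
    print(weighted_rank_contrastive_loss(z, y, w).item())
```

In a popularity setting, the weights could, for example, be taken as the inverse frequency of binned like counts so that rare, highly popular posts contribute more to the loss; that design choice is purely illustrative here.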
Appears in Collections: | Department of Information Management
Files in this item:
File | Size | Format |
---|---|---|
ntu-112-2.pdf | 5.98 MB | Adobe PDF |