Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91509
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | 陳銘憲 | zh_TW
dc.contributor.advisor | Ming-Syan Chen | en
dc.contributor.author | 鄭淑綾 | zh_TW
dc.contributor.author | Shu-Ling Cheng | en
dc.date.accessioned | 2024-01-28T16:19:09Z | -
dc.date.available | 2024-01-29 | -
dc.date.copyright | 2024-01-27 | -
dc.date.issued | 2023 | -
dc.date.submitted | 2023-08-10 | -
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/91509 | -
dc.description.abstract | 聯邦學習(FL)已經成為一種協作訓練框架,用於去中心化數據,但它面臨著客戶之間的數據異質性帶來的挑戰。現有方法主要集中於解決標籤偏斜或非獨立同分佈特徵變化的問題,卻忽視了客戶分群以及每個群集的數據不平衡。在這項工作中,我們提出了FedGCR(具有分群客製和分群重新加權的聯邦學習)的分層FL方案,以解決這些限制。在FedGCR中,我們使用「分群客製(GC)」來將具有相似特徵分佈的客戶進行分組,使它們可以相互學習領域特定知識,並從共享服務器模型中學習領域不變知識。此外,我們通過一種新穎的「分群重新加權(GR)」算法來解決群集大小不平衡的問題,該算法增強了不同群組之間的性能一致性。在多個數據集上的實驗評估表明,FedGCR在準確性和性能一致性方面優於現有方法。所提出的方法促進了聯邦學習的進步,使得在具有客戶相似性和群集不平衡的場景中能夠更有效地進行知識共享並提高性能。 | zh_TW
dc.description.abstract | Federated learning (FL) has emerged as a collaborative training framework for decentralized data, but it faces challenges due to data heterogeneity among clients. Existing approaches primarily focus on addressing label skew or non-IID feature shift, while neglecting client clustering and the data imbalance of each cluster. In this work, we propose FedGCR (Federated learning with Group Customization and Reweighting), a stratified FL scheme, to address these limitations. In FedGCR, we use Group Customization (GC) to group clients with similar feature distributions, allowing them to learn domain-specific knowledge from one another and domain-invariant knowledge from the shared server model. Additionally, we tackle the issue of imbalanced cluster sizes through a novel Group Reweighting (GR) algorithm, which enhances performance uniformity among different groups. Experimental evaluations on multiple datasets demonstrate that FedGCR outperforms state-of-the-art methods in terms of accuracy and performance uniformity. The proposed approach contributes to the advancement of federated learning by enabling more effective knowledge sharing and improved performance in scenarios with client similarity and imbalanced clusters. | en
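The abstract's Group Reweighting idea, adjusting server-side aggregation so that small client clusters are not drowned out by large ones, can be illustrated with a minimal sketch. The full thesis PDF is under restricted access, so this is not the author's implementation: the function name `group_reweighted_average`, the equal-per-cluster weighting scheme, and the flattened-update representation are all assumptions chosen purely for illustration.

```python
import numpy as np


def group_reweighted_average(client_updates, cluster_ids):
    """Aggregate client model updates so that each cluster contributes
    equally, regardless of how many clients it contains.

    client_updates: list of 1-D numpy arrays (flattened model deltas)
    cluster_ids:    list of cluster labels, one per client

    NOTE: illustrative sketch only, not the FedGCR algorithm itself.
    """
    clusters = sorted(set(cluster_ids))
    # First average the updates within each cluster...
    cluster_means = [
        np.mean(
            [u for u, c in zip(client_updates, cluster_ids) if c == k],
            axis=0,
        )
        for k in clusters
    ]
    # ...then average the cluster means with equal weight, so a cluster
    # holding one client counts as much as one holding a hundred.
    return np.mean(cluster_means, axis=0)
```

With three clients split as clusters {0, 0, 1}, plain per-client FedAvg would weight the majority cluster twice as heavily, while this scheme gives each cluster an equal vote; FedGCR's actual GR weighting may well differ from the uniform choice shown here.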
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-01-28T16:19:09Z. No. of bitstreams: 0 | en
dc.description.provenance | Made available in DSpace on 2024-01-28T16:19:09Z (GMT). No. of bitstreams: 0 | en
dc.description.tableofcontents:
Verification Letter from the Oral Examination Committee i
Acknowledgements ii
摘要 iii
Abstract iv
Contents vi
List of Figures viii
List of Tables ix
Chapter 1 Introduction 1
Chapter 2 Related Work 5
2.1 Non-IID Feature Shift in Federated Learning 5
2.2 Data Imbalance in Federated Learning 6
Chapter 3 Problem Formulation 7
Chapter 4 Methodology 9
4.1 Network Architecture 11
4.2 Loss Function Design 12
4.3 Server Aggregation with Group Reweighting 14
4.3.1 Clustering with Contrastively Learned Proxy 14
4.3.2 Group Reweighting 15
Chapter 5 Experiments 16
5.1 Experimental Settings 16
5.1.1 Dataset Details 16
5.1.2 Domain Imbalance Factor (DIF) and Client Data Setting 17
5.1.3 Baselines 18
5.1.4 Evaluation Metrics 19
5.1.5 Implementation Details 19
5.2 Experiment Results 20
5.2.1 Results under Varying Levels of Domain Imbalance 20
5.2.2 Experiments on One-Client-Per-Domain Setting 21
5.2.3 Clustering Quality 23
5.2.4 Inter-Domain Differences between Datasets 24
Chapter 6 Ablation Study 25
6.1 Contribution of Each Component 25
6.2 Results under Varying Levels of Inter-Cluster Similarity 26
Chapter 7 Conclusion 28
References 29
dc.language.iso | en | -
dc.subject | 域名轉移 | zh_TW
dc.subject | 公平性 | zh_TW
dc.subject | 聚類 | zh_TW
dc.subject | 聯邦式學習 | zh_TW
dc.subject | Clustering | en
dc.subject | Domain Shift | en
dc.subject | Federated Learning | en
dc.subject | Fairness | en
dc.title | 以分群客製及分群重新加權提升聯邦學習在域名轉移下的效能及公平性 | zh_TW
dc.title | Federated Learning on Imbalanced Domain Distribution via Group Customization and Group Reweighting | en
dc.type | Thesis | -
dc.date.schoolyear | 111-2 | -
dc.description.degree | 碩士 (Master's) | -
dc.contributor.oralexamcommittee | 吳沛遠;曾新穆;吳尚鴻 | zh_TW
dc.contributor.oralexamcommittee | Pei-Yuan Wu;Vincent S. Tseng;Shan-Hung Wu | en
dc.subject.keyword | 聯邦式學習,域名轉移,聚類,公平性 | zh_TW
dc.subject.keyword | Federated Learning,Domain Shift,Clustering,Fairness | en
dc.relation.page | 35 | -
dc.identifier.doi | 10.6342/NTU202301867 | -
dc.rights.note | 未授權 (Not authorized) | -
dc.date.accepted | 2023-08-11 | -
dc.contributor.author-college | 電機資訊學院 | -
dc.contributor.author-dept | 電信工程學研究所 | -
Appears in Collections:電信工程學研究所

Files in This Item:
File | Size | Format | Access
ntu-111-2.pdf | 1.04 MB | Adobe PDF | Restricted Access


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
