Maverick Matters: Client Contribution and Selection in Federated Learning
In: Biomimetic and Biohybrid Systems: 12th International Conference, Living Machines 2023, Genoa, Italy, July 10–13, 2023, Proceedings, Part I


          Abstract

Federated learning (FL) enables collaborative learning between parties, called clients, without sharing the original and potentially sensitive data. To ensure fast convergence in the presence of heterogeneous clients, it is imperative to select, in a timely manner, the clients that can effectively contribute to learning. A realistic but overlooked case of heterogeneous clients is Mavericks: clients that monopolize the possession of certain data types, e.g., children's hospitals possess most of the data on pediatric cardiology. In this paper, we address the importance of Mavericks and tackle the challenges they pose by exploring two types of client selection strategies. First, we show theoretically and through simulations that the common contribution-based approach, Shapley Value, underestimates the contribution of Mavericks and is hence ineffective as a client selection measure. Then, we propose FedEMD, an adaptive strategy with competitive overhead based on the Wasserstein distance, supported by a proven convergence bound. Because FedEMD adapts the selection probability such that Mavericks are preferentially selected when the model benefits from improvement on rare classes, it consistently ensures fast convergence in the presence of different types of Mavericks. Compared to existing strategies, including Shapley Value-based ones, FedEMD improves the convergence speed of neural network classifiers with FedAvg aggregation by 26.9%, and its performance is consistent across various levels of heterogeneity.
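          The contribution-based baseline discussed in the abstract scores each client by its Shapley Value. As a rough sketch of what such a selector computes, the snippet below uses the standard Monte Carlo permutation estimator of Shapley values; the `coverage` utility (class coverage of a coalition's pooled data) and all names are illustrative assumptions for this sketch, not the paper's setup, where the utility would typically be something like validation accuracy of the aggregated model.

```python
import numpy as np

def shapley_contributions(num_clients, utility, num_samples=500, seed=0):
    """Monte Carlo permutation estimator of Shapley values.

    `utility(coalition)` maps a list of client indices to a scalar
    score. Each sampled permutation credits every client with its
    marginal gain over the clients preceding it.
    """
    rng = np.random.default_rng(seed)
    values = np.zeros(num_clients)
    for _ in range(num_samples):
        order = rng.permutation(num_clients)
        coalition = []
        prev = utility(coalition)
        for i in order:
            coalition.append(i)
            cur = utility(coalition)
            values[i] += cur - prev   # marginal contribution of client i
            prev = cur
    return values / num_samples

# Toy utility: how many of the 5 classes the coalition's data covers.
# Client 3 plays the Maverick, holding class 4 exclusively.
client_classes = [{0, 1, 2, 3}, {0, 1, 2, 3}, {0, 1, 2, 3}, {4}]

def coverage(coalition):
    covered = set()
    for i in coalition:
        covered |= client_classes[i]
    return len(covered)

print(shapley_contributions(4, coverage))
```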
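          FedEMD itself adapts selection probabilities using the Wasserstein distance (earth mover's distance, EMD) between label distributions. The sketch below illustrates that idea only: the decaying EMD bonus, the `beta` parameter, and all function names are assumptions of this sketch, not the published algorithm, which additionally balances distances to the global and local distributions under a proven convergence bound.

```python
import numpy as np

def label_distribution(labels, num_classes):
    """Empirical class distribution of a client's local labels."""
    counts = np.bincount(labels, minlength=num_classes)
    return counts / counts.sum()

def discrete_emd(p, q):
    """1-D Wasserstein distance between two discrete distributions
    on the same ordered class support, via cumulative sums."""
    return np.abs(np.cumsum(p - q)).sum()

def fedemd_selection_probs(client_dists, global_dist, round_t, beta=0.5):
    """Illustrative adaptive selection weights: clients whose label
    distribution is far from the global one (e.g., Mavericks holding
    rare classes) are up-weighted, with the bonus decaying over
    rounds so selection flattens toward uniform as training matures."""
    emds = np.array([discrete_emd(d, global_dist) for d in client_dists])
    scores = 1.0 + beta * emds / (1.0 + round_t)   # decaying EMD bonus
    return scores / scores.sum()

# Example: 3 ordinary clients plus 1 Maverick monopolizing class 4.
num_classes = 5
rng = np.random.default_rng(0)
clients = [rng.integers(0, 4, size=200) for _ in range(3)]
clients.append(np.full(200, 4))                     # the Maverick
dists = [label_distribution(c, num_classes) for c in clients]
global_dist = label_distribution(np.concatenate(clients), num_classes)

probs = fedemd_selection_probs(dists, global_dist, round_t=0)
print(probs)   # the Maverick (last client) gets the highest probability
```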


                Author and book information

                Book Chapter
                Published: May 28 2023
                Pages: 269–282
                DOI: 10.1007/978-3-031-33377-4_21
