
      Learning in the Air: Secure Federated Learning for UAV-Assisted Crowdsensing



Most cited references (55)


          Adaptive Federated Learning in Resource Constrained Edge Computing Systems


            Blockchained On-Device Federated Learning


              Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data

Federated learning allows multiple parties to jointly train a deep learning model on their combined data, without any of the participants having to reveal their local data to a centralized server. This form of privacy-preserving collaborative learning, however, comes at the cost of a significant communication overhead during training. To address this problem, several compression methods have been proposed in the distributed training literature that can reduce the amount of required communication by up to three orders of magnitude. These existing methods, however, are of only limited utility in the federated learning setting, as they either compress only the upstream communication from the clients to the server (leaving the downstream communication uncompressed) or perform well only under idealized conditions, such as an i.i.d. distribution of the client data, which typically cannot be found in federated learning. In this article, we propose sparse ternary compression (STC), a new compression framework that is specifically designed to meet the requirements of the federated learning environment. STC extends the existing compression technique of top-k gradient sparsification with a novel mechanism to enable downstream compression as well as ternarization and optimal Golomb encoding of the weight updates. Our experiments on four different learning tasks demonstrate that STC distinctively outperforms federated averaging in common federated learning scenarios. These results advocate for a paradigm shift in federated optimization toward high-frequency low-bitwidth communication, in particular in bandwidth-constrained learning environments.
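The core compression step described in this abstract (top-k sparsification followed by ternarization) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the `sparsity` parameter and the use of the mean magnitude of the surviving entries as the shared ternary value are illustrative choices, and the lossless Golomb encoding of the nonzero positions mentioned in the abstract is omitted here.

```python
import numpy as np

def sparse_ternary_compress(update, sparsity=0.01):
    """Sketch of the sparse ternary compression idea:
    keep only the top-k entries of a weight update by magnitude,
    then replace each survivor with a shared magnitude mu times its sign,
    so every entry lies in {-mu, 0, +mu}."""
    flat = update.ravel()
    k = max(1, int(sparsity * flat.size))
    # indices of the k largest-magnitude entries
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    mu = np.mean(np.abs(flat[idx]))  # single shared magnitude
    compressed = np.zeros_like(flat)
    compressed[idx] = mu * np.sign(flat[idx])
    return compressed.reshape(update.shape)
```

Because the result has only one distinct magnitude, a client need only transmit `mu`, the signs, and the positions of the nonzero entries; encoding those positions compactly (e.g., with Golomb codes, as the abstract describes) is what yields the low-bitwidth communication.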

                Author and article information

Journal: IEEE Transactions on Network Science and Engineering (IEEE Trans. Netw. Sci. Eng.)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN: 2327-4697; 2334-329X
Publication date: April 1, 2021
Volume 8, Issue 2, Pages 1055-1069
DOI: 10.1109/TNSE.2020.3014385
© 2021
License: https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
Rights policies: https://doi.org/10.15223/policy-029, https://doi.org/10.15223/policy-037
