A Weighted Federated Averaging Framework to Reduce the Negative Influence from the Dishonest Users

Fengpan Zhao, Yan Huang, Saide Zhu, Venkata Malladi, Yubao Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Federated learning has become popular because it can train a high-performing global model without exposing clients' private data. However, most FL applications fail to consider that some of the locally trained models returned to the server may be fake, coming from attackers or dishonest users. Such fake parameters not only harm the convergence of the global model but also waste honest users' computational resources. In this paper, we propose a framework that grades each user's credit score based on the performance of the returned local model on a testing dataset. We also incorporate historical results using an exponential moving average, which assigns relatively higher weight to the most recent testing results. Experiments show that our system can efficiently and effectively identify fake local models and thereby speed up the convergence of the global model.
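The mechanism the abstract describes — scoring each user by the test performance of their returned model, smoothing the score with an exponential moving average, and weighting the federated average by those scores — can be illustrated with a minimal sketch. The function names, the EMA smoothing factor `alpha`, and the toy numbers below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def update_credit(prev_credit, test_accuracy, alpha=0.5):
    """EMA of test performance: recent results get relatively
    higher weight (hypothetical form of the paper's credit score)."""
    return alpha * test_accuracy + (1 - alpha) * prev_credit

def weighted_average(local_models, credits):
    """Aggregate local model parameters weighted by credit scores,
    so low-credit (likely dishonest) users contribute little."""
    credits = np.asarray(credits, dtype=float)
    weights = credits / credits.sum()
    stacked = np.stack(local_models)  # shape: (n_users, n_params)
    return weights @ stacked          # credit-weighted parameter average

# Toy round: user 2 returns fake parameters and scores poorly on the test set.
models = [np.array([1.0, 1.0]), np.array([1.1, 0.9]), np.array([10.0, -10.0])]
accuracies = [0.90, 0.88, 0.10]
credits = [update_credit(0.5, a) for a in accuracies]
global_params = weighted_average(models, credits)
```

Over multiple rounds, a user who repeatedly returns poorly performing models sees their credit score decay, shrinking their influence on the global model.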

Original language: English (US)
Title of host publication: Security, Privacy, and Anonymity in Computation, Communication, and Storage - 13th International Conference, SpaCCS 2020, Proceedings
Editors: Guojun Wang, Bing Chen, Wei Li, Roberto Di Pietro, Xuefeng Yan, Hao Han
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 241-250
Number of pages: 10
ISBN (Print): 9783030688509
DOIs
State: Published - 2021
Event: 13th International Conference on Security, Privacy, and Anonymity in Computation, Communication, and Storage, SpaCCS 2020 - Nanjing, China
Duration: Dec 18 2020 - Dec 20 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12382 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th International Conference on Security, Privacy, and Anonymity in Computation, Communication, and Storage, SpaCCS 2020
Country/Territory: China
City: Nanjing
Period: 12/18/20 - 12/20/20

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)
