Document detail
ID

oai:arXiv.org:2408.08655

Topic
Computer Science - Machine Learning; Computer Science - Artificial Intelligence
Author
Ding, Binbin; Yang, Penghui; Ge, Zeqing; Huang, Shengjun
Category

Computer Science

Year

2024

Listing date

8/21/2024

Keywords
backdoor attacks
Abstract

Federated learning enables multiple clients to collaboratively train machine learning models under the coordination of the server while adhering to privacy requirements.

However, the server cannot directly oversee the local training process, creating an opportunity for malicious clients to introduce backdoors.

Existing research shows that backdoor attacks activate specific neurons in the compromised model, which remain dormant when processing clean data.

Leveraging this insight, we propose a method called Flipping Weight Updates of Low-Activation Input Neurons (FLAIN) to defend against backdoor attacks in federated learning.

Specifically, after completing global training, we employ an auxiliary dataset to identify low-activation input neurons and flip the associated weight updates.

We incrementally raise the threshold for low-activation inputs and flip the weight updates iteratively, until the performance degradation on the auxiliary data becomes unacceptable.

Extensive experiments validate that our method can effectively reduce the success rate of backdoor attacks to a low level in various attack scenarios, including those with non-IID data distributions or high malicious client ratios (MCRs), while causing only minimal performance degradation on clean data.
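The defense described in the abstract — score input neurons by their activation on auxiliary data, flip the weight updates feeding low-activation neurons, and raise the threshold until accuracy on the auxiliary data degrades too far — can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's exact procedure: it assumes a single dense weight matrix shaped (out, in), uses the mean absolute value over auxiliary inputs as the activation statistic, and the `evaluate` callback, threshold schedule, and tolerance are all hypothetical placeholders.

```python
import numpy as np

def flain_sketch(w_global, w_update, aux_inputs, evaluate,
                 thresholds=(0.0, 0.05, 0.1, 0.2), max_acc_drop=0.02):
    """Illustrative FLAIN-style defense on one dense layer.

    w_global   : (out, in) aggregated global weights before the update
    w_update   : (out, in) aggregated weight update to be sanitized
    aux_inputs : (n, in) clean auxiliary inputs to this layer
    evaluate   : callback mapping a weight matrix to accuracy on aux data
    """
    # Activation of each input neuron: mean |value| over the auxiliary data.
    activation = np.mean(np.abs(aux_inputs), axis=0)

    base_acc = evaluate(w_global + w_update)
    best = w_global + w_update

    # Incrementally raise the low-activation threshold; at each step flip
    # (negate) the update columns feeding neurons below the threshold.
    for tau in thresholds:
        mask = activation <= tau            # low-activation input neurons
        flipped = w_update.copy()
        flipped[:, mask] *= -1.0            # flip their weight updates
        candidate = w_global + flipped
        # Stop once performance degradation becomes unacceptable.
        if base_acc - evaluate(candidate) > max_acc_drop:
            break
        best = candidate
    return best
```

In a real federated setting the activation statistic would be collected per layer after global training, and `evaluate` would measure clean accuracy of the full model; here both are collapsed into toy callbacks to keep the control flow visible.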

Ding, Binbin; Yang, Penghui; Ge, Zeqing; Huang, Shengjun (2024). Mitigating Backdoor Attacks in Federated Learning via Flipping Weight Updates of Low-Activation Input Neurons.
