Document details
Identifier

oai:arXiv.org:2411.03569

Subject
Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Computer Science - Cryptography and Security; Computer Science - Computer Vision and Pattern Recognition
Authors
Wang, Pengju; Liu, Bochao; Guo, Weijia; Li, Yong; Ge, Shiming
Category

Computer Science

Year

2024

Indexing date

13/11/2024

Keywords
local historical global federated knowledge learning model computer
Abstract

Federated learning is a distributed machine learning paradigm designed to protect data privacy.

However, data heterogeneity across various clients results in catastrophic forgetting, where the model rapidly forgets previous knowledge while acquiring new knowledge.

To address this challenge, personalized federated learning has emerged to customize a personalized model for each client.

However, the inherent limitation of this mechanism is its excessive focus on personalization, potentially hindering the generalization of those models.

In this paper, we present a novel personalized federated learning method that uses global and historical models as teachers and the local model as the student to facilitate comprehensive knowledge distillation.

The historical model represents the local model from the last round of client training, containing historical personalized knowledge, while the global model represents the aggregated model from the last round of server aggregation, containing global generalized knowledge.

By applying knowledge distillation, we effectively transfer global generalized knowledge and historical personalized knowledge to the local model, thus mitigating catastrophic forgetting and enhancing the general performance of personalized models.
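The dual-teacher distillation described above can be illustrated with a minimal NumPy sketch. It assumes logit-level distillation with a softmax temperature `T` and a mixing weight `alpha` balancing the two teachers; both knobs, and the exact loss form, are assumptions for illustration and may differ from the paper's formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax producing soft targets
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_div(p, q):
    # KL(p || q): discrepancy between teacher distribution p and student q
    return float(np.sum(p * (np.log(p) - np.log(q))))

def comprehensive_kd_loss(student_logits, global_logits, historical_logits,
                          T=2.0, alpha=0.5):
    """Distill from two teachers into the local (student) model:
    - the global model contributes generalized knowledge,
    - the historical model contributes personalized knowledge.
    `alpha` weighs the two terms (a hypothetical hyperparameter)."""
    q = softmax(student_logits, T)
    p_global = softmax(global_logits, T)
    p_hist = softmax(historical_logits, T)
    return alpha * kl_div(p_global, q) + (1 - alpha) * kl_div(p_hist, q)
```

When the student already matches both teachers, both KL terms vanish and the loss is zero; otherwise the gradient pulls the student's soft predictions toward a mixture of the two teachers' distributions.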

Extensive experimental results demonstrate the significant advantages of our method.

Comment: Accepted by IEEE SMC 2024

Wang, Pengju; Liu, Bochao; Guo, Weijia; Li, Yong; Ge, Shiming, 2024, Towards Personalized Federated Learning via Comprehensive Knowledge Distillation
