CP-FL: Practical Gradient Leakage Defense in Federated Learning with Compressive Privacy

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
GLOBECOM 2023 - 2023 IEEE Global Communications Conference, 2024, pp. 5165-5170
Issue Date:
2024-02-26
Abstract:
Federated learning (FL) requires clients to train models based on their local datasets. Clients usually train local models directly on their entire datasets without distinguishing which information in the data is task-relevant and which is not. Task-irrelevant information does not contribute to the learning task, but it exposes additional private information to adversaries. Studies have shown that unintended information leakage from gradients during FL iterations threatens clients' privacy. Researchers have applied differential privacy (DP) to protect clients' gradients, but DP does not help remove task-irrelevant information from the gradients. In this paper, we propose a compressive privacy federated learning (CP-FL) scheme to protect task-irrelevant information from gradient leakage attacks. In CP-FL, clients train a local compressive model according to the global task. The local compressive model constructs a new representation that extracts task-relevant information and removes task-irrelevant information from clients' data. Since the global model is updated based on the compressed representation, which eliminates the task-irrelevant information, CP-FL can effectively prevent adversaries from inferring those property values from the uploaded gradients. Moreover, with the help of a powerful local compressive model that sanitizes clients' data into a low-dimensional representation, CP-FL can use a small global model instead of a sizeable one, significantly reducing communication. Both theoretical analysis and extensive experimental results demonstrate that CP-FL can effectively defend against gradient leakage attacks while maintaining practical utility.
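The client-side pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustrative assumption, not the authors' implementation: the dimensions, the fixed `tanh` linear projection standing in for the trained compressive model, and the logistic-regression global model are all hypothetical. The point it shows is structural: only gradients of the small global model, computed on compressed representations, ever leave a client.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (illustrative only).
D_RAW, D_COMP, N_CLASSES = 64, 8, 3

def softmax(logits):
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class Client:
    """One CP-FL client: a private local compressive model, plus
    gradients computed only for the small shared global model."""

    def __init__(self):
        # Stand-in for the local compressive model; in CP-FL this
        # model is trained according to the global task, not fixed.
        self.encoder = rng.normal(0, 1.0 / np.sqrt(D_RAW), (D_RAW, D_COMP))

    def compress(self, x):
        # Map raw data into a low-dimensional representation.
        return np.tanh(x @ self.encoder)

    def local_gradient(self, w_global, x, y):
        # Cross-entropy gradient w.r.t. the small global model only;
        # raw features never touch the shared parameters.
        z = self.compress(x)                       # (n, D_COMP)
        probs = softmax(z @ w_global)              # (n, N_CLASSES)
        probs[np.arange(len(y)), y] -= 1.0
        return z.T @ probs / len(y)                # (D_COMP, N_CLASSES)

# The server holds only the small global model over compressed features.
w = np.zeros((D_COMP, N_CLASSES))
clients = [Client() for _ in range(4)]
x = rng.normal(size=(32, D_RAW))
y = rng.integers(0, N_CLASSES, size=32)

# One FedAvg-style round: average client gradients, apply the update.
grads = [c.local_gradient(w, x, y) for c in clients]
w -= 0.5 * np.mean(grads, axis=0)
print(w.shape)  # exchanged gradients are only D_COMP x N_CLASSES
```

Because the uploaded gradients live in the compressed space, an adversary reconstructing them recovers at best the low-dimensional representation, and the smaller global model also shrinks each round's upload from D_RAW x N_CLASSES to D_COMP x N_CLASSES parameters.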