Utility–Privacy Trade-Off in Distributed Machine Learning Systems
In distributed machine learning (DML), clients’ raw data are never transmitted to the server for model training; nevertheless, an attacker can still extract sensitive client information by analyzing the local gradient parameters that the clients upload. To defend against this, we apply a differential privacy (DP) mechanism to protect the clients’ local parameters. In this paper, we study the utility–privacy trade-off in DML under the DP mechanism from an information-theoretic point of view. Specifically, three cases are considered: independent clients’ local parameters with independent DP noise, and dependent clients’ local parameters with independent or dependent DP noise. Mutual information and conditional mutual information are used to characterize utility and privacy, respectively. First, we establish the relationship between utility and privacy for the three cases. Then, we derive the optimal noise variance that achieves the maximal utility under a given level of privacy. Finally, the results of the paper are further illustrated by numerical examples.
Published in: | Entropy, Volume 24, Issue 9, Article 1299 |
---|---|
Main Authors: | Xia Zeng, Chuanchuan Yang, Bin Dai |
Format: | Text |
Language: | English |
Published: | Multidisciplinary Digital Publishing Institute, 14 September 2022 |
Subjects: | differential privacy; distributed machine learning; mutual information; Gaussian noise; trade-off |
License: | https://creativecommons.org/licenses/by/4.0/ |
Online Access: | https://doi.org/10.3390/e24091299 |
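
The abstract refers to Gaussian DP noise added to each client's uploaded gradient, with mutual information serving as the utility measure. Below is a minimal sketch of that idea, assuming a standard Gaussian mechanism (clip the gradient, then add i.i.d. Gaussian noise) and using the closed-form mutual information of a Gaussian channel, I(X; X+N) = 0.5·log(1 + σ²_X/σ²_N), as a utility proxy. The function names and parameter values are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mechanism(gradient, clip_norm, noise_std):
    """Clip the local gradient to bound its sensitivity, then add
    i.i.d. Gaussian DP noise before the client uploads it."""
    norm = np.linalg.norm(gradient)
    clipped = gradient * min(1.0, clip_norm / max(norm, 1e-12))
    return clipped + rng.normal(0.0, noise_std, size=gradient.shape)

def gaussian_channel_mi(signal_var, noise_var):
    """I(X; X + N) in nats for X ~ N(0, signal_var) and an
    independent N ~ N(0, noise_var): 0.5 * log(1 + SNR)."""
    return 0.5 * np.log(1.0 + signal_var / noise_var)

# A hypothetical local gradient and its privatized version.
grad = rng.normal(size=100)
noisy_grad = gaussian_mechanism(grad, clip_norm=1.0, noise_std=0.5)

# Larger noise variance lowers the utility proxy (the server learns
# less about the true gradient) while strengthening privacy.
for noise_var in (0.1, 0.5, 1.0, 2.0):
    mi = gaussian_channel_mi(signal_var=1.0, noise_var=noise_var)
    print(f"noise_var={noise_var:.1f}  I(X; X+N)={mi:.3f} nats")
```

Sweeping the noise variance makes the trade-off visible: increasing the noise variance monotonically decreases the utility proxy while strengthening privacy, which is the tension the paper resolves by finding the noise variance that maximizes utility at a fixed privacy level.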