
Privacy against a hypothesis testing adversary / Li, Z.; Oechtering, T.; Gunduz, D. - In: IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY. - ISSN 1556-6013. - 14:6(2019), pp. 1567-1581. [10.1109/TIFS.2018.2882343]

Privacy against a hypothesis testing adversary

D. Gunduz
2019

Abstract

Privacy against an adversary (AD) that tries to detect the underlying privacy-sensitive data distribution is studied. The original data sequence is assumed to come from one of two known distributions, and the privacy leakage is measured by the probability of error of the binary hypothesis test carried out by the AD. A management unit (MU) is allowed to manipulate the original data sequence in an online fashion while satisfying an average distortion constraint. The goal of the MU is to maximize the minimal type II probability of error subject to a constraint on the type I probability of error when the AD applies a Neyman-Pearson test, or to maximize the minimal error probability when the AD applies a Bayesian test. The asymptotic exponents of the maximum minimal type II probability of error and of the maximum minimal error probability are shown to be characterized by a Kullback-Leibler divergence rate and a Chernoff information rate, respectively. The privacy performance of two particular management policies, the memoryless hypothesis-aware policy and the hypothesis-unaware policy with memory, is compared. The proposed formulation can also model adversarial example generation with minimal data manipulation to fool classifiers. Finally, the results are applied to a smart meter privacy problem, where the user's energy consumption is manipulated by adaptively using a renewable energy source in order to hide the user's activity from the energy provider.
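For intuition about the two exponents named in the abstract: in the special case of i.i.d. observations with no data manipulation, Stein's lemma gives the optimal type II error exponent as the Kullback-Leibler divergence D(P0||P1), and the optimal Bayesian error exponent is the Chernoff information C(P0, P1); the divergence rate and Chernoff information rate in the paper generalize these single-letter quantities to the managed output sequences. The sketch below, assuming finite alphabets and hypothetical distributions p0 and p1 (not taken from the paper), computes both quantities with NumPy/SciPy.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def kl_divergence(p, q):
        # D(p || q) in nats, for finite-alphabet distributions with q > 0 wherever p > 0.
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def chernoff_information(p, q):
        # C(p, q) = max over 0 <= s <= 1 of -log( sum_x p(x)^s q(x)^(1-s) ).
        p, q = np.asarray(p, float), np.asarray(q, float)
        f = lambda s: np.log(np.sum(p ** s * q ** (1.0 - s)))  # convex in s
        res = minimize_scalar(f, bounds=(0.0, 1.0), method="bounded")
        return float(-res.fun)

    # Hypothetical single-letter distributions under the two hypotheses.
    p0 = [0.7, 0.2, 0.1]  # H0
    p1 = [0.2, 0.3, 0.5]  # H1

    print("Neyman-Pearson (type II) exponent D(p0||p1):", kl_divergence(p0, p1))
    print("Bayesian error exponent C(p0, p1):", chernoff_information(p0, p1))

The inner maximization in the Chernoff information is a one-dimensional convex problem over s in [0, 1], so a bounded scalar minimizer of the log-sum suffices.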
Year: 2019
Volume: 14
Issue: 6
Pages: 1567-1581
Authors: Li, Z.; Oechtering, T.; Gunduz, D.
Files in this record:
No files are associated with this record.


Use this identifier to cite or link to this record: https://hdl.handle.net/11380/1202583
Citations
  • PMC: n/a
  • Scopus: 21
  • Web of Science: 16