Keywords:
Differential privacy
deep learning
stochastic gradient descent
Abstract:
With the wide deployment of learning techniques, private individual information is increasingly exposed through high-dimensional and high-order data. These data are typically expressed in the form of tensors, yet there is no principled way to guarantee privacy for tensor-valued queries. Conventional differential privacy is typically applied to scalar values without precisely accounting for the shape of the queried data. Since conventional mechanisms do not take the structural information of the data into account, we propose Tensor Variate Gaussian (TVG), a new (ε, δ)-differential privacy mechanism for tensor-valued queries. We further introduce two TVG-based mechanisms with improved utility, obtained by imposing unimodal differentially private noise. With the utility space available, the proposed mechanisms can be instantiated with optimized utility, and the optimization problem admits a closed-form solution that scales to large problems. Finally, we experimentally evaluate our mechanisms on a variety of datasets and models, demonstrating that TVG is superior to other state-of-the-art mechanisms on tensor-valued queries.
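For context, a minimal sketch of the conventional baseline that TVG improves upon: the classical (ε, δ)-DP Gaussian mechanism applied element-wise to a tensor-valued query, which ignores the tensor structure that TVG exploits. The function name `gaussian_mechanism`, the clipping-based sensitivity bound, and the parameter values are illustrative assumptions, not the paper's method.

```python
import numpy as np

def gaussian_mechanism(query_output, l2_sensitivity, epsilon, delta, rng=None):
    """Classical (epsilon, delta)-DP Gaussian mechanism applied element-wise
    to a tensor-valued query output.

    Baseline only: the noise is i.i.d. and does not exploit tensor structure,
    unlike the TVG mechanism proposed in the paper.
    """
    assert 0 < epsilon < 1 and 0 < delta < 1  # standard calibration regime
    rng = np.random.default_rng() if rng is None else rng
    # Standard calibration: sigma >= sqrt(2 ln(1.25 / delta)) * Delta_2 / epsilon
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    noise = rng.normal(loc=0.0, scale=sigma, size=query_output.shape)
    return query_output + noise

# Hypothetical usage: privatize a third-order tensor-valued query,
# e.g. a gradient tensor in differentially private SGD.
grad = np.random.rand(8, 16, 4)   # illustrative query output
clip_norm = 1.0                   # assumed L2 sensitivity after norm clipping
private_grad = gaussian_mechanism(grad, clip_norm, epsilon=0.5, delta=1e-5)
```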