New Research on Differential Privacy Published in PoPETs 2025
Fredrik Meisingseth, Christian Rechberger, and Fabian Schmid from Graz University of Technology have co-authored a new paper titled “Practical Two-party Computational Differential Privacy with Active Security”, published in Proceedings on Privacy Enhancing Technologies (PoPETs), Issue 2, 2025.
This work introduces the first practical protocol that guarantees computational differential privacy (CDP) in a two-party computation setting with security against active adversaries, contributing to the goals of Work Package 2 (WP2) in the CONFIDENTIAL6G project.
Why This Matters
Differential privacy is a leading method to ensure data privacy when analyzing sensitive datasets. While many cryptographic techniques support differential privacy, most are either not secure against actively malicious parties or are inefficient in real deployments.
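As a baseline for the notions discussed below, the textbook (ε, δ)-differential privacy guarantee says that a randomized mechanism M behaves almost identically on any two neighboring datasets D and D′ (datasets differing in a single record), for every set of outcomes S. This is the standard definition, not something specific to the paper:

$$\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] \;+\; \delta$$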
In real-world 6G systems—where distributed operators, verticals, or public agencies collaborate on sensitive data—there is a strong need for efficient, two-party privacy-preserving computation with strong guarantees even if one party attempts to deviate from the protocol.
Key Contributions of the Paper
The paper presents the first actively secure protocol that satisfies computational differential privacy (as opposed to information-theoretic DP) in a two-party setting.
1. New Security Definition
The authors define a formal framework combining:
- Computational differential privacy, which bounds what a computationally bounded adversary can learn from the protocol's outputs (a common formalization is sketched after this list),
- Simulation-based security in the presence of malicious adversaries, ensuring strong protocol-level guarantees.
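One common way to state the computational relaxation is the indistinguishability-style definition from the CDP literature; the paper's simulation-based formulation is more involved, so take this only as orientation. The guarantee is restricted to probabilistic polynomial-time distinguishers A and allows a slack that is negligible in the security parameter κ:

$$\Pr[A(M_{\kappa}(D)) = 1] \;\le\; e^{\varepsilon}\,\Pr[A(M_{\kappa}(D')) = 1] \;+\; \mathrm{negl}(\kappa)$$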
2. General Protocol Design
Their protocol:
- Uses additive secret sharing to divide private inputs across two parties (see the toy sketch after this list),
- Applies a differential privacy mechanism securely in a joint computation,
- Adds zero-knowledge proofs to verify correct behavior and prevent cheating.
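To make the secret-sharing step concrete, here is a minimal, non-authoritative Python sketch of two-party additive sharing over a fixed modulus. The modulus, function names, and the absence of any network or proof layer are illustrative assumptions, not details taken from the paper.

```python
import secrets

Q = 2**61 - 1  # illustrative modulus (assumption); the paper's parameters may differ


def share(value: int) -> tuple[int, int]:
    """Split a private value into two additive shares modulo Q."""
    share_a = secrets.randbelow(Q)      # uniformly random share held by party A
    share_b = (value - share_a) % Q     # party B's share completes the secret
    return share_a, share_b


def reconstruct(share_a: int, share_b: int) -> int:
    """Recombine the shares; neither share alone reveals the value."""
    return (share_a + share_b) % Q


if __name__ == "__main__":
    a1, b1 = share(42)
    a2, b2 = share(100)
    # Linear operations are local: each party simply adds its own shares.
    assert reconstruct((a1 + a2) % Q, (b1 + b2) % Q) == 142
```

Because reconstruction is just modular addition, sums and counts can be computed locally on the shares, which is what makes histogram-style statistics cheap in this setting.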
3. Noise Generation and Validation
A novel feature is a verifiable noise generation process using pseudo-randomness, ensuring that the noise added (central to DP) is both private and correctly sampled—even when generated jointly by two untrusted parties.
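The paper's mechanism makes this noise generation verifiable and jointly computed. As a plain, single-party illustration of the underlying idea (deriving discrete, DP-suitable noise from uniform randomness), one can sample two-sided geometric noise as the difference of two geometric draws. Everything below is an assumed toy sketch, not the protocol from the paper.

```python
import math
import secrets

_rng = secrets.SystemRandom()  # stand-in for the protocol's pseudo-random source (assumption)


def geometric(q: float) -> int:
    """Sample Geom(q) on {0, 1, 2, ...} with P(k) = (1 - q) * q**k via repeated Bernoulli trials."""
    k = 0
    while _rng.random() < q:
        k += 1
    return k


def two_sided_geometric(epsilon: float) -> int:
    """Discrete Laplace noise: P(k) is proportional to exp(-epsilon * |k|)."""
    q = math.exp(-epsilon)
    return geometric(q) - geometric(q)


if __name__ == "__main__":
    true_count = 128
    print(true_count + two_sided_geometric(epsilon=0.5))
```

In the two-party setting each party would contribute a share of such noise and prove in zero knowledge that it was sampled correctly; that verifiability is precisely the paper's contribution and is not captured by this toy code.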
4. Performance Evaluation
The authors implement their protocol and test it on histogram and frequency-count tasks. Results show:
- Fast runtime (sub-second for many tasks),
- Low overhead from security mechanisms,
- Practical use for secure statistics in two-party collaborations.
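For intuition about the histogram and frequency-count tasks used in the evaluation, the sketch below shows the kind of output such a protocol approximates: per-bucket counts with Laplace noise, written as plain, non-secure Python. In the actual protocol the counts and the noise are computed jointly on secret-shared data; the function names and data here are made up for illustration.

```python
import secrets
from collections import Counter

_rng = secrets.SystemRandom()


def laplace_noise(scale: float) -> float:
    """Laplace(0, scale) noise, sampled as the difference of two exponential draws."""
    return _rng.expovariate(1.0 / scale) - _rng.expovariate(1.0 / scale)


def noisy_histogram(items, epsilon: float) -> dict:
    """Per-bucket counts plus Laplace(1/epsilon) noise: the classic DP histogram query."""
    counts = Counter(items)
    return {bucket: count + laplace_noise(1.0 / epsilon) for bucket, count in counts.items()}


if __name__ == "__main__":
    traffic = ["video", "voice", "video", "iot", "video", "voice"]
    print(noisy_histogram(traffic, epsilon=1.0))
```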
Impact on CONFIDENTIAL6G
This research directly contributes to the privacy-by-design vision of CONFIDENTIAL6G, providing a real-world mechanism for:
- Secure cross-border or cross-operator data analysis,
- Edge-to-edge or operator-to-provider collaboration on analytics,
- Privacy-preserving threat intelligence sharing with verifiable protection against malicious actors.
By advancing privacy-preserving techniques with formal security and performance results, the work supports both research and deployment ambitions in 6G trust infrastructure.
👉 The full publication is available on ZENODO.