Application of Compressed Sensing in Privacy Protection
Jia Liang
Introduction to CS
1. Compressed sensing (CS): From Nyquist to CS
“Can we not just directly measure the part that will not end up being thrown away?” — Donoho
Sparse in wavelet-domain
Our Point-Of-View
Fig: An analog audio signal can reach the digital domain along three routes: Nyquist-rate sampling (high-rate), compressed sensing (low-rate), or sampling followed by compression (e.g. MP3).
1. Compressed sensing: Definition
y = Φx, where x ∈ R^{n×1} is the signal, Φ ∈ R^{m×n} is the measurement matrix, and y ∈ R^{m×1} is the measurement vector, with m < n.
Fig. 1: The schematic of the CS model.
[1] S. S. Chen, D. L. Donoho, and M. A. Saunders, “Atomic decomposition by basis pursuit,” SIAM Journal on Scientific Computing, vol. 20, 1998.
1. Compressed sensing: Applications
• CT scans
• Spatial super-resolution
Our Point-Of-View
◼ We can sample a signal below the Nyquist rate, provided we know something about the signal (e.g. that it is sparse in some basis).
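As a concrete sketch (the dimensions and the Gaussian sensing matrix are illustrative assumptions, not from the slides), a k-sparse signal can be recovered from m ≪ n random measurements with orthogonal matching pursuit (OMP):

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = Phi @ x."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        # Greedily pick the column most correlated with the residual.
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        # Re-fit all selected coefficients by least squares.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                      # ambient dim, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k) * rng.uniform(1.0, 2.0, k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = Phi @ x                               # only m << n linear measurements
x_hat = omp(Phi, y, k)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```

With a random Gaussian Φ and m well above k·log(n/k), the support is recovered with high probability and the relative error is at machine precision.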
2. CS-based cryptosystem
1. Symmetric Cryptosystem
2. Attack models
Fig: A CS-based symmetric cryptosystem. The sender (Alice) encrypts x ∈ R^{n×1} by sampling, y = Φx (encryption); the measurements y ∈ R^{m×1} travel over the channel to the receiver (Bob), who runs CS reconstruction (decryption). The measurement matrix Φ serves as the shared secret key, in the spirit of Shannon's symmetric-cipher model. The attacker (Eve) may mount:
• COA (ciphertext-only attack)
• CPA (chosen-plaintext attack)
• KPA (known-plaintext attack)
[1] T. Bianchi, V. Bioglio, and E. Magli, “Analysis of one-time random projections for privacy preserving compressed sensing,” IEEE Trans. Inf. Forensics Security, vol. 11, no. 2, pp. 313–327, Feb. 2016.
Our Point-Of-View
➢ Step 4: Quantization.
2. CS-based cryptosystem: Schemes
3. Decoding
Exercise:
Try to give the pseudocode. In five minutes, I will ask one person to present their design.
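One possible answer, sketched in Python under illustrative assumptions (the key is simply an RNG seed that regenerates the Gaussian measurement matrix, and decoding uses a bare-bones OMP; none of this is the slides' specific scheme):

```python
import numpy as np

def keyed_matrix(key, m, n):
    # The secret key seeds the Gaussian measurement matrix (shared by Alice and Bob).
    return np.random.default_rng(key).standard_normal((m, n)) / np.sqrt(m)

def encrypt(x, key, m):
    return keyed_matrix(key, m, x.size) @ x          # sampling = encryption: y = Phi x

def decrypt(y, key, n, k):
    Phi = keyed_matrix(key, y.size, n)
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):                                # bare-bones OMP decoder
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(1)
n, m, k, key = 128, 48, 4, 20230501                   # the key is just an RNG seed here
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k) * rng.uniform(1.0, 2.0, k)
y = encrypt(x, key, m)                                # Alice
x_bob = decrypt(y, key, n, k)                         # Bob, holding the correct key
print(np.linalg.norm(x - x_bob) / np.linalg.norm(x))
```

Without the key, Eve cannot regenerate Φ and faces the attack models above instead of a direct decode.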
3. Compressed learning (CL)
Fig: In compressed learning, the inference stage operates directly on the measurements y = Φx (a linear projection of the input): the input measurements are mapped through a layer Wy and then convolutional and fully-connected (fc) layers to produce the tag.
C.-Y. Chou, E.-J. Chang, H.-T. Li, and A.-Y. Wu, “Low-complexity privacy-preserving compressive analysis using subspace-based dictionary for ECG telemonitoring system,” IEEE Transactions on Biomedical Circuits and Systems, vol. 12, no. 4, pp. 801–811, Aug. 2018.
3. Compressed learning (CL): Compressive analysis
Matrix factorization: Y ≈ WA, where Y ∈ R^{d×n} is the sample set, W ∈ R^{d×d} is the dictionary, and A ∈ R^{d×n} holds the coefficients.
Given the sample set X, we obtain W and A. A test sample x is then represented as x = Wα + e, with α ∈ R^{d×1}.
Measuring gives z = Φx = Φ(Wα + e) = ΦWα + Φe, so z ≈ Θα with Θ = ΦW, Θ ∈ R^{m×d}.
The coefficients are recovered as α̂ = Θ†z — similar to the data after dimensionality reduction!
After the measurement vector is obtained, the approximate dimensionality-reduced data can be recovered by a single left matrix multiplication.
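A minimal numerical sketch of this pipeline, with a toy orthonormal dictionary W standing in for the NMF/subspace dictionary of the cited scheme, and the subspace dimension d chosen ≤ m so that Θ = ΦW has full column rank (all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, m = 64, 8, 16                        # signal dim, subspace dim, measurements (d <= m < D)
W, _ = np.linalg.qr(rng.standard_normal((D, d)))   # toy orthonormal dictionary (D x d)
alpha = rng.standard_normal(d)             # coefficients of the test sample
x = W @ alpha                              # x = W alpha  (error term e omitted)
Phi = rng.standard_normal((m, D)) / np.sqrt(m)
z = Phi @ x                                # compressive measurement z = Phi x
Theta = Phi @ W                            # Theta = Phi W  (m x d, full column rank)
alpha_hat = np.linalg.pinv(Theta) @ z      # alpha_hat = Theta^† z: one left multiplication
print(np.allclose(alpha, alpha_hat))       # True: coefficients recovered exactly
```

The classifier can then work on α̂ directly, never seeing x itself.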
Fig. 1: Low-complexity privacy-preserving scheme based on CS and NMF for image data.
① COA
Fig. 1: Energy relationship between the measurements and the original signal: (a) standard CS system; (b) the proposed scheme.
② KPA
The measurement matrix is estimated from different numbers of plaintext–ciphertext pairs; the plot shows the relative estimation error ‖Φ − Φ_eve‖₂ / ‖Φ‖₂ (with 0.01 as a reference level) versus the number of pairs.
③ COA
Fig: Signals → CS (measurement matrix) → measurements → channel → detection/classification → result, for both settings (a) and (b).
Fig: Deep-network compressed learning: the measurements y = Φx are first re-projected by a learned matrix Ψ (giving Ψy) and then passed through a deep network to predict the tag.
E. Zisselman et al., “Compressed Learning for Image Classification: A Deep Neural Network Approach,” 2018.
4. Others: CS in data collection
Background: CDG — Compressive Data Gathering
Our scheme: Parameter Design — Selection Strategy
• Selected nodes: u_k ≠ 0 for k ∈ N_P, where N_P ⊆ {1, 2, …, N} and |N_P| = P.
• Unselected nodes: the remaining N − P nodes, with u_l = 0 for l ∉ N_P.
Data Gathering
Fig: Selected-node encoding, with i = 1, …, N and k = 1, …, P; e.g. S1: b₁, w₁; S2: d, u₂; S3: b₂, w₂. The order of k is determined by the size of the index i.
• Embed the blinding factor w_k with the encoding vector b_k.
• Increase the security of node information transmission without imposing an additional burden on the sink node to recover the data.
Data Gathering
The sink receives the aggregate
  Σ_{i=1}^{N} φ_i x_i + Σ_{k=1}^{P} b_k w_k + d·u_l = y,
and since u_l = 0 for the unselected nodes, this reduces to
  Σ_{i=1}^{N} φ_i x_i + Σ_{k=1}^{P} b_k w_k = y,
i.e. in matrix form Φx + Bw = y.
Data Gathering
M. Yamaç, Ç. Dikici, and B. Sankur, “Hiding data in compressive sensed measurements: A conditionally reversible data hiding scheme for compressively sensed measurements,” Digit. Signal Prog., vol. 48, pp. 188–200, 2016.
Correctness Verification: choose F ∈ R^{m×M} satisfying FB = 0, so that Fy = FΦx + FBw = FΦx and the blinding term drops out.
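The verification step can be sketched numerically (all dimensions illustrative): taking the rows of F from the left null space of B guarantees FB = 0, so multiplying the received measurements by F removes the blinding term.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, P = 12, 20, 3                        # measurements, nodes, blinding factors
Phi = rng.standard_normal((M, N))
B = rng.standard_normal((M, P))            # encoding vectors b_k as columns
x = rng.standard_normal(N)                 # node readings
w = rng.standard_normal(P)                 # blinding factors
y = Phi @ x + B @ w                        # blinded aggregate: y = Phi x + B w

# Rows of F span the left null space of B (via SVD), so F B = 0.
U, s, Vt = np.linalg.svd(B)
F = U[:, P:].T                             # (M - P) x M, since B has rank P
print(np.allclose(F @ B, 0))               # True
print(np.allclose(F @ y, F @ (Phi @ x)))   # True: blinding term removed
```

The sink can thus verify the unblinded relation Fy = FΦx without ever learning the factors w.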
Security Analysis
Resist attacks:
• Routing Analysis
• Size Correlation Analysis
• Content Correlation Analysis
Efficiency Analysis
The table compares the computational overhead with some existing data-gathering schemes.
Performance
1 Background
• Attacks
• Encryption
• Hiding
Characteristics: double protection of the cover and the watermark; robustness to a variety of attacks.
Purpose of robust watermarking: identity authentication, copyright protection, ...
2 Research Goal
Watermarking: high robustness
• Framework: Image Owner → (key K3) → Data Hider.
Fig: Two ways to sample the original image: (A) traditional compressed sensing on the whole image — high consumption; (B) Kronecker compressed sensing on each sub-block — low consumption.
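The consumption gap comes from the Kronecker identity vec(Φ₁XΦ₂ᵀ) = (Φ₂ ⊗ Φ₁)vec(X): sampling a sub-block with two small matrices is equivalent to one large Kronecker-structured measurement matrix, but far cheaper to store and apply. A sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 16, 8                               # sub-block is n x n, sampled to m x m
X = rng.standard_normal((n, n))            # one image sub-block
Phi1 = rng.standard_normal((m, n))         # two small sensing matrices
Phi2 = rng.standard_normal((m, n))

# (A) Traditional CS on the vectorised block: one big (m*m) x (n*n) matrix.
big = np.kron(Phi2, Phi1)
y_big = big @ X.flatten(order='F')         # column-major vec(X)

# (B) Kronecker CS: two small multiplications give the same measurements.
Y = Phi1 @ X @ Phi2.T
print(np.allclose(y_big, Y.flatten(order='F')))   # True
print(big.size, Phi1.size + Phi2.size)            # 16384 vs 256 entries to store
```

For real images the blocks are larger, but the storage and computation ratio scales the same way.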
3 The Proposed Scheme — Purpose: Vacating Room
1. Image Encryption and Preprocessing
Fig: Each sub-block B_i is sampled by Kronecker compressed sensing; reference measurements and predicted measurements are computed, a global random permutation is applied, and the prediction error vacates room (even with interference).
Table 1: The NC and BER under different block sizes and sampling rates.
The watermark-extraction accuracy increases as the sampling rate rises, and smaller sub-blocks give better extraction quality.
4 Performance
[3] Shabir A. Parah, Nazir A. Loan, Asif A. Shah, Javaid A. Sheikh, and G. M. Bhat, “A new secure and robust watermarking technique based on logistic map and modification of DC coefficient,” Nonlinear Dynamics, vol. 93, no. 4, pp. 1933–1951, 2018.
[5] Yang Liu, Shanyu Tang, Ran Liu, Liping Zhang, and Zhao Ma, “Secure and robust digital image watermarking scheme using logistic and RSA encryption,” Expert Systems with Applications, vol. 97, pp. 95–105, 2018.
[10] Di Xiao, Aozhu Zhao, and Fei Li, “Robust watermarking scheme for encrypted images based on scrambling and Kronecker compressed sensing,” IEEE Signal Processing Letters, vol. 29, pp. 484–488, 2022.
5 Conclusion
Challenges
Communication cost: In centralized optimization, communication costs are relatively small and computational cost dominates. In contrast, in federated learning (FL), communication cost dominates: sharing high-dimensional gradients across iterative rounds is very costly.
Algorithm framework
CS-DP-SignSGD:
1. Sparse representation
2. Linear projection
3. Differentially private 1-bit compression
4. Signal reconstruction
4. Others: CS in Federated Learning
1 Distortion
2 Preservation
Definition 2. For any given gradient y_t^i, the compressor dpsign outputs dpsign(y_t^i, ε, δ), whose j-th entry is given by

dpsign(y_t^i, ε, δ)_j = { +1 with probability Φ((y_t^i)_j / σ); −1 with probability 1 − Φ((y_t^i)_j / σ) },

where Φ is the standard normal CDF and σ is the noise scale satisfying Φ(Δ/(2σ) − εσ/Δ) − e^ε Φ(−Δ/(2σ) − εσ/Δ) ≤ δ (Analytic Gaussian Mechanism [15]).
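A sketch of the dpsign compressor as defined above, where Φ is the standard normal CDF; σ is assumed to be given directly (solving the analytic-Gaussian condition for σ is omitted):

```python
import math
import numpy as np

def dpsign(y, sigma, rng=None):
    """dpsign compressor: entry j is +1 with probability Phi(y_j / sigma),
    and -1 otherwise, where Phi is the standard normal CDF."""
    rng = rng or np.random.default_rng()
    # Standard normal CDF, applied elementwise via erf.
    p = 0.5 * (1.0 + np.vectorize(math.erf)(y / (sigma * math.sqrt(2.0))))
    return np.where(rng.random(y.shape) < p, 1.0, -1.0)

g = np.array([3.0, -3.0, 0.0, 10.0])       # a toy gradient
out = dpsign(g, sigma=1.0, rng=np.random.default_rng(0))
print(out)                                  # every entry is +1 or -1
```

Entries with large magnitude relative to σ keep their sign with high probability, while small entries are randomized, which is where the privacy comes from.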
Accuracy evaluation
Conclusion
CS-DP-SignSGD realizes data compression in both upstream and downstream communications, which greatly improves communication efficiency, privacy protection, and Byzantine robustness.