Compressive Data Gathering for Large-Scale Wireless Sensor Networks

Chong Luo, Feng Wu, Jun Sun, and Chang Wen Chen. MobiCom 2009, Beijing, China

Outline
Background
Compressive sensing theory
New research opportunities

Compressive Data Gathering


The first complete design to apply CS theory to sensor data gathering

Conclusion

Compressive Sensing

If an N-dimensional signal d is K-sparse in a known domain Ψ (i.e. d = Ψx with only K nonzero entries in x), it can be recovered from M random measurements y = Φd by solving:

minimize ||x||_1   subject to   y = ΦΨx
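As a rough illustration of the recovery step, here is a minimal sketch (not the paper's decoder): it builds a K-sparse signal in the DCT domain, takes M Gaussian random measurements, and recovers it with orthogonal matching pursuit as a greedy stand-in for the ℓ1 solver. The sizes, the DCT basis, and the solver choice are all illustrative assumptions.

```python
import numpy as np
from scipy.fftpack import idct

# Minimal sketch: recover a K-sparse signal from M random measurements.
rng = np.random.default_rng(0)
N, M, K = 256, 64, 8

# Signal d is K-sparse in the DCT domain: d = Psi @ x
Psi = idct(np.eye(N), norm='ortho', axis=0)      # columns are inverse-DCT atoms
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.normal(size=K)
d = Psi @ x

# M random measurements: y = Phi @ d
Phi = rng.normal(size=(M, N)) / np.sqrt(M)
y = Phi @ d

# Orthogonal matching pursuit on A = Phi @ Psi (greedy stand-in for l1 recovery)
A = Phi @ Psi
residual, support = y.copy(), []
for _ in range(K):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

x_hat = np.zeros(N)
x_hat[support] = coef
d_hat = Psi @ x_hat
print("relative recovery error:", np.linalg.norm(d - d_hat) / np.linalg.norm(d))
```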

New Research Opportunities


Compressive sensing hallmarks vs. data communications research:

Universal
Same random projection operation for any compressible signal
Counterpart: random linear network coding, which achieves multicast capacity

Democratic
Potentially unlimited number of measurements
Each measurement carries the same amount of information
Counterpart: Fountain codes (a.k.a. rateless erasure codes), perfect reconstruction from N(1+ε) encoding symbols

Asymmetrical
Simple encoding, most processing at the decoder
Counterpart: distributed source coding (e.g. Slepian-Wolf coding), blind encoding with joint decoding

From Compressive Sensing to Compressive Data Gathering


The asymmetrical property makes CS a perfect match for wireless sensor networks
Compressive Sensing: from sample-then-compress to sample-with-compression
Compressive Data Gathering: from compress-then-transmit to compress-with-transmission

Data Gathering in WSNs


[Figure: sensing field of sensor nodes reporting to a sink, which connects to users via the Internet or satellite]

Challenges
Reducing global communication cost
Balancing energy consumption across nodes

Basic Idea
A simple chain topology

[Figure: chain topology s1 -> s2 -> ... -> sN forwarding toward the sink; baseline transmission relays the raw readings d1, d2, ..., while CDG forwards M weighted sums at every hop]

                     Baseline transmission    Proposed CDG
Global comm. cost    N(N+1)/2                 NM
Bottleneck load      N                        M
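To make the NM cost concrete, here is a minimal sketch of CDG forwarding along the chain, under the assumption (for illustration) that each node regenerates its column of coefficients from a seed shared with the sink. Each node adds its weighted reading to the M partial sums received from its predecessor, so every node transmits exactly M values.

```python
import numpy as np

# Sketch of CDG forwarding along a chain s1 -> s2 -> ... -> sN -> sink.
N, M = 20, 5
rng = np.random.default_rng(42)
readings = rng.normal(size=N)                  # d_1 ... d_N (placeholder data)

def coefficients(i, M, seed=1234):
    """Node i's column of the measurement matrix, regenerated from a shared seed."""
    return np.random.default_rng(seed + i).normal(size=M)

# Forwarding pass: every node transmits exactly M values.
partial = np.zeros(M)                           # what s1 starts from (nothing yet)
for i in range(1, N + 1):
    partial = partial + coefficients(i, M) * readings[i - 1]

# The sink rebuilds Phi from the same seed and runs CS recovery on `partial`.
Phi = np.column_stack([coefficients(i, M) for i in range(1, N + 1)])
assert np.allclose(Phi @ readings, partial)

print("bottleneck load: baseline =", N, ", CDG =", M)
print("global cost: baseline =", N * (N + 1) // 2, ", CDG =", N * M)
```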

Is Reconstruction Possible?
M << N

Facts
Sensor readings exhibit strong spatial correlations

According to CS theory
Reconstruction can be achieved in a noisy setting by solving:

minimize ||x||_1   subject to   ||y - ΦΨx||_2 ≤ ε

Practical Problem 1
Abnormal readings compromise data sparsity
[Figure: signal d1 in the time domain and its sparse representation in the DCT domain; signal d2 (d1 with an abnormal reading) shown in the time domain and in the DCT domain, where it is no longer sparse]

Solution: overcomplete basis
Representing the signal in an overcomplete basis (e.g. DCT atoms augmented with spike atoms) restores sparsity; the example signal d2 becomes 7-sparse.
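A small sketch of why the overcomplete basis helps, with an illustrative construction (5 DCT terms plus 2 spikes, echoing the 7-sparse example above); the dictionary [Ψ, I] and the significance threshold are assumptions for illustration.

```python
import numpy as np
from scipy.fftpack import dct, idct

N = 256
Psi = idct(np.eye(N), norm='ortho', axis=0)     # columns are DCT atoms

x_dct = np.zeros(N)
x_dct[[2, 5, 9, 14, 20]] = [5.0, -3.0, 2.0, 1.5, 1.0]
smooth = Psi @ x_dct                             # exactly 5-sparse in the DCT domain

spikes = np.zeros(N)
spikes[[40, 180]] = 8.0                          # two abnormal readings
d = smooth + spikes

def count_significant(c, tol=1e-3):
    return int(np.sum(np.abs(c) > tol * np.max(np.abs(c))))

# DCT alone: the spikes smear energy across nearly all coefficients.
print("significant coeffs, DCT only: ", count_significant(dct(d, norm='ortho')))

# Overcomplete basis [Psi, I]: d = Psi @ x_dct + spikes, only 7 nonzeros.
print("significant coeffs, [Psi, I]: ",
      count_significant(np.concatenate([x_dct, spikes])))
```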

Practical Problem 2
Some signals are not sparse in any intuitively known domain.

[Figure: sensor readings plotted against sensor index, showing no evident sparse structure]

Universal Sparsity
CS-based data representation and recovery is optimal in exploiting data sparsity
Encoder
The same random projection operation

Decoder
Select and design the representation basis
Reorder signal d to make it sparse in a known domain

Neither transform-based compression nor distributed source coding is able to exploit these special types of data sparsity
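A minimal sketch of the reordering idea, using uniform random "temperatures" as stand-in data (the 498-sensor count borrows from Example 2 below). Here the sort is applied directly for illustration; in CDG the decoder chooses the ordering, e.g. from readings at an earlier time t0.

```python
import numpy as np
from scipy.fftpack import dct

rng = np.random.default_rng(7)
N = 498                                          # number of sensors, as in Example 2
readings = rng.uniform(15.0, 30.0, size=N)       # spatially uncorrelated temperatures

def keep_top_m_error(signal, m):
    """Relative error after keeping only the m largest-magnitude DCT coefficients."""
    c = dct(signal, norm='ortho')
    kept = np.zeros_like(c)
    idx = np.argsort(np.abs(c))[::-1][:m]
    kept[idx] = c[idx]
    return float(np.linalg.norm(c - kept) / np.linalg.norm(c))

m = 20
# The sorted readings form a smooth monotone sequence, hence far more compressible.
print("relative error, original order:", round(keep_top_m_error(readings, m), 4))
print("relative error, sorted order:  ", round(keep_top_m_error(np.sort(readings), m), 4))
```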

Network Capacity Gain


Theorem: In a wireless sensor network with N nodes, CDG can achieve a capacity gain of N/M over baseline transmission, given that the sensor readings are K-sparse and M = c1·K for a small constant c1.
Mathematical proof
Simulation verification
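A quick worked instance using the numbers from Example 1 below: with N = 1000 readings that are roughly K ≈ 40-sparse, taking M = c1·K = 100 measurements (i.e. c1 = 2.5) yields a capacity gain of N/M = 1000/100 = 10 over baseline transmission.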

Example 1
CTD data from NOAA: N = 1000, K ≈ 40, M = 100

[Figure: temperature (°C) vs. depth/pressure (dbar), original and reconstructed profiles]
[Figure: reconstruction SNR (dB) vs. number of random measurements M]

Evaluation: reconstruction precision (SNR), communication reduction (99.2%), capacity gain

Example 2
Temperature data from data center
498 temperature sensors; readings exhibit little spatial correlation

[Figure: sensor readings at t0, in the original order and after reordering]

Reorder sensors according to their readings at t0

Utilizing Temporal Correlation


Sensor readings at t0 + Δt remain sparse under the ordering chosen at t0
Temperatures do not change abruptly over time

[Figure, Δt = 30 min: (a) original readings; (b) reconstruction from 0.5N measurements; (c) reconstruction from 0.3N measurements]

Conclusion
Compressive sensing is an emerging field that may bring fundamental changes to networking and data communications research.

Our contributions:
The first complete design to apply CS theory to sensor data gathering
CDG exploits universal sparsity
CDG improves network capacity

Future Work
Bring innovations to LDPC codes, network coding (NC), distributed source coding (DSC), and Fountain codes through CS theory

THANKS!
