Simulation of High-Performance Digital Communication System:

The Importance-Sampling Technique

Introduction

Digital communication system design involves defining the system architecture and modeling the algorithms and behaviors of the system that transmit information in digital form from a source to a destination. Today's digital communication systems demand high performance: fast, accurate, and efficient operation that maintains the best quality of service and tolerates as few errors (delay, distortion, corrupted data, and so on) as possible. Before building the actual system, engineers therefore simulate a model of it on computers to check the system's performance level, so that they come as close as possible to their design goal. In such simulations, the model's behavior changes from run to run according to the set of initial parameters assumed or suited to the environment. [6]

In a high-performance digital communication system, estimating the bit error rate (BER) of a transmission is a major concern for engineers, since BERs tend to be extremely small; a reliable system is expected to produce at most one bit error for every megabit of transmitted data. In the early stages, engineers used crude (standard) "Monte Carlo" simulation techniques to estimate the BER. The conventional Monte Carlo method processes a great number of samples to evaluate the BER, but only a very small fraction of them contributes to the error count. Evaluating the performance of a reliable system this way is therefore time-consuming, since a large number of trial samples must be processed before a single error is detected, which is unaffordable for a high-performance communication system. [3]

Researchers then introduced the "importance sampling" (IS) method into the Monte Carlo simulation technique. IS artificially increases the number of observed errors by biasing the input distribution; in simpler words, the outcome of the error analysis is predetermined by modifying the noise probability density function (pdf), i.e., the Gaussian noise distribution, that acts on the digital data going through the error-check process. This makes it possible to reduce the number of simulation runs required for BER estimation. The heart of the technique is the mathematical approach, or algorithm, for biasing the pdf so that bit errors are detected more quickly. In this project the student will study, simulate, and test these approaches, and will produce a report analyzing the bit error rate of each importance sampling technique. [8][9][3]

Background and Description

1. “Monte Carlo” method

Monte Carlo is the art of approximating an expectation by the sample mean of a function of simulated random variables. Monte Carlo simulation is a statistical technique that allows the modeling of a real system in order to measure its performance.

Monte Carlo methods do not use specific sampling points but instead choose points at random. The Monte Carlo estimate of the integral $I = \int_a^b f(x)\,dx$ is then

$$I \approx (b-a)\,\bar{f}, \qquad \bar{f} = \frac{1}{N}\sum_{i=1}^{N} f(x_i),$$

where the $x_i$ are randomly sampled points and $\bar{f}$ is the arithmetic mean of the values of the function $f(x)$ at the sampling points.

The standard deviation of the mean is given by

$$\sigma_{\bar{f}} = \frac{\sigma}{\sqrt{N}},$$

where

$$\sigma^2 = \frac{1}{N-1}\sum_{i=1}^{N} \left( f(x_i) - \bar{f} \right)^2,$$

which gives an estimate of the statistical error in the Monte Carlo estimate of the integral. Note that the error goes as $1/\sqrt{N}$, independent of the dimensionality of the integral.
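
As an illustration, the following Matlab sketch estimates an integral in this way and reports the 1/√N statistical error; the integrand, interval, and sample size are assumed for the example, not taken from the project brief:

    % Minimal Monte Carlo integration sketch (example integrand assumed).
    f = @(x) exp(-x.^2);             % illustrative integrand f(x)
    a = 0; b = 1;                    % integration interval
    N = 1e5;                         % number of random sample points
    x = a + (b - a).*rand(N, 1);     % x_i drawn uniformly on [a, b]
    fx = f(x);
    fbar = mean(fx);                 % arithmetic mean of f at the samples
    I = (b - a)*fbar;                % Monte Carlo estimate of the integral
    err = (b - a)*std(fx)/sqrt(N);   % statistical error, falling as 1/sqrt(N)
    fprintf('I = %.5f +/- %.5f\n', I, err);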

After the encoding/decoding process, the binary data pass through a decision block, and then the emitted and received sequences are compared. The Monte Carlo (MC) simulation counts the number of errors. These simulations require a large number of samples for BER estimation, and that number becomes enormous when very low values of the error probability must be estimated. [7][3][4][9]
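
A minimal Matlab sketch of this crude Monte Carlo BER loop follows; BPSK signaling over an additive white Gaussian noise channel and the Eb/N0 value are assumed here purely for illustration:

    % Crude Monte Carlo BER estimation for BPSK over AWGN (assumed example).
    N       = 1e6;                    % number of transmitted bits (trial samples)
    EbN0_dB = 7;                      % assumed signal-to-noise ratio per bit
    EbN0    = 10^(EbN0_dB/10);
    bits  = randi([0 1], N, 1);       % pulse pattern: random information bits
    s     = 2*bits - 1;               % BPSK mapping: 0 -> -1, 1 -> +1 (Eb = 1)
    sigma = sqrt(1/(2*EbN0));         % noise standard deviation
    r     = s + sigma.*randn(N, 1);   % received samples with Gaussian noise
    bhat  = r > 0;                    % decision block: threshold at zero
    nerr  = sum(bhat ~= bits);        % compare emitted and received sequences
    BER   = nerr/N;                   % error count divided by number of trials
    fprintf('MC BER estimate: %g (%d errors)\n', BER, nerr);

For a target BER near 10^-6 or below, on the order of 10/BER bits must be simulated before the error count is statistically meaningful, which is exactly the cost that motivates importance sampling.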

2. “Importance sampling techniques”

In 1953, a new algorithm for sampling points from a given probability function was introduced. This algorithm enabled the incorporation of "importance sampling" into Monte Carlo integration. Instead of choosing points from a uniform distribution, they are now chosen from a distribution that concentrates the points where the function being integrated is large.

Importance sampling (IS) is a simulation technique that aims to reduce the variance of the bit error rate (BER) estimator. By reducing the variance, IS estimators can achieve a given precision from shorter simulation runs. The idea behind IS is that certain values of the input random variables in a simulation have more impact on the parameter being estimated than others. If these "important" values are emphasized by being sampled more frequently, the estimator variance can be reduced. Hence, the basic idea in IS is to choose a distribution that encourages the important values. Choosing this "biasing" distribution is precisely the sampling technique to be investigated in this project: artificially generating a larger number of errors by modifying, or predetermining, the Gaussian distribution, i.e., the noise probability density function. With this type of simulation, fewer trial samples should be needed to obtain the BER. Choosing or designing a good biased distribution is the "art" of importance sampling. The reward for a good distribution can be huge run-time savings; the penalty for a bad distribution can be run times longer than for a general Monte Carlo simulation without importance sampling.

Applied directly in the simulation, this "biased" distribution would result in a biased estimator. However, there is a simple procedure whereby the simulation outputs are weighted to correct for the use of the biased distribution, and this ensures that the new IS estimator is unbiased. Hence, the "art" of designing quick simulations via IS depends entirely on the choice of the biased distribution. [8][5][1][9]
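
One simple biasing scheme, sketched below in Matlab and continuing the BPSK example above, inflates the Gaussian noise variance to provoke more errors and then weights each error by the likelihood ratio of the true pdf to the biased pdf; the scaling factor c is an assumed illustrative choice, not a recommendation:

    % Importance sampling BER sketch: variance-scaled Gaussian biasing (assumed scheme).
    N       = 1e5;                    % far fewer trials than the crude MC run
    EbN0    = 10^(7/10);              % same assumed Eb/N0 as before
    sigma   = sqrt(1/(2*EbN0));       % true noise standard deviation
    c       = 2;                      % assumed biasing factor: inflate the noise
    sigma_b = c*sigma;                % biased (wider) noise standard deviation
    bits = randi([0 1], N, 1);
    s    = 2*bits - 1;
    n    = sigma_b.*randn(N, 1);      % noise drawn from the *biased* pdf f*
    err  = ((s + n > 0) ~= bits);     % errors occur far more often under f*
    % Likelihood-ratio weight w = f(n)/f*(n) for zero-mean Gaussians:
    w = (sigma_b/sigma) .* exp(-n.^2/(2*sigma^2) + n.^2/(2*sigma_b^2));
    BER_IS = mean(err .* w);          % weighting restores an unbiased estimate
    fprintf('IS BER estimate: %g (raw biased error count: %d)\n', BER_IS, sum(err));

The choice of c illustrates the "art" mentioned above: too small a factor produces few extra errors, while too large a factor makes the weights wildly uneven and inflates the variance instead of reducing it.
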
Aims and Objectives

As the title suggests, the main objective of this project is to simulate a high-performance digital communication system using importance sampling techniques. To do so, we will:

• Define the "Monte Carlo" technique and the "Importance Sampling" technique properly, using mathematical formulae.

• Simulate a pulse pattern generator that sends bits of information to undergo the error analysis, i.e., the error detector. The aim is to generate and send an appropriate number of bits at a time, so that the model can be called a virtual high-performance system.

• Define the mathematical error detection algorithm and simulate it. The resulting error detector will give the bit error rate for the data sent to it from the pulse pattern generator by comparing the inputs with the outputs; a comparator integrated into this detector will therefore also need to be simulated.

• Simulate the standard "Monte Carlo" technique for error analysis and report the number of simulation runs, or sample trials, it requires before producing a good error estimate. This will be done with the Matlab software.

• Simulate the "importance sampling" technique with the Matlab software and choose the biased distribution (several approaches will be implemented and examined) that emphasizes the important regions of the input variables. The bit error rate for each technique used will be recorded, the techniques will be analyzed against one another, and they will also be compared with the standard "Monte Carlo" technique.

• Because the distribution of the data is biased, unbias the estimate again, mathematically, before obtaining the BER, as sketched below.
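
As a sketch of that correction step, with notation assumed here rather than prescribed by the brief: if $f(n)$ is the true noise pdf, $f^*(n)$ is the biased pdf actually sampled, and $E_i$ is 1 when trial $i$ produces an error and 0 otherwise, then the weighted estimate

$$\hat{P}_e^{\mathrm{IS}} = \frac{1}{N} \sum_{i=1}^{N} E_i \, \frac{f(n_i)}{f^*(n_i)}$$

is an unbiased estimate of the true error probability.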

Deliverables

• Literature review and research on “Monte Carlo” technique and “Importance Sampling”
technique.
• Model of a practical digital communication system with real-world high performance system
behaviors.
• Generation and transmission of at least 10 megabits at a time.
• Selection of initial regions to be assessed for [MC].
• Detection of the bit error rate by applying the simulated "Monte Carlo" technique and identification of the error probability domains, thus the simulation runs.
• Record of the number of simulation runs required for [MC], thus evaluation of the performance level.
• Selection of initial regions and identification of important sample points to be assessed for [IS].
• Detection of the bit error rate by applying the simulated "Importance Sampling" techniques, depending on the choice of biasing distribution.
• Record of the number of simulation runs required for [IS], thus evaluation of the performance level.
• A professional level research paper (thesis) containing the above deliverables.

References

[1] P. J. Smith, M. Shafi, and H. Gao, "Quick Simulation: A Review of Importance Sampling Techniques in Communications Systems," IEEE Journal on Selected Areas in Communications, vol. 15, no. 4, May 1997.

[2] W. T. Song, W. Chiu, and D. Goldsman, "An Importance Sampling Technique for Estimating Bit Error Rate in Digital Communication Systems," Technical Report, Tsing Hua University, Taiwan, R.O.C., 2005.

[3] http://en.wikipedia.org/wiki/Monte_Carlo_method

[4] http://ib.berkeley.edu/labs/slatkin/eriq/classes/guest_lect/mc_lecture_notes.pdf

[5] http://www.tcm.phy.cam.ac.uk/~ajw29/thesis/node17.html

[6] J. G. Proakis, "Digital Communications", 4th Edition, McGraw-Hill, 2000, ISBN-13: 978-0072321111.

[7] http://www.tcm.phy.cam.ac.uk/~ajw29/thesis/node16.html

[8] http://perso.telecom-paristech.fr/~gallion/documents/free_downloads_pdf/PG_conferences/PG_C102.pdf

[9] Proceedings of the 2005 Winter Simulation Conference, M. E. Kuhl, N. M. Steiger, F. B. Armstrong, and J. A. Joines, eds. http://delivery.acm.org/10.1145/1170000/1163207/p2710-song.pdf?key1=1163207&key2=2476317521&coll=GUIDE&dl=GUIDE&CFID=60747257&CFTOKEN=47791690

Project Description Summary

• Reliable digital communication systems are designed to maintain an error rate of one erroneous bit
for every megabit of transmission or better.
• This poses a serious problem for researchers trying to evaluate the performance of the systems via
simulations.
• The computer time needed to generate enough errors to make the evaluation of system performance reliable is prohibitive.
• To be able to obtain reliable results at reasonable computer time, new modified Monte Carlo
techniques have been developed.
• These are known as Importance Sampling Monte Carlo simulations in which the noise probability
density function is modified to generate a falsely high number of errors to make reliable
evaluation possible.
• This false modification is then mathematically accounted for to generate the correct result.
• The students are to compare these modified importance sampling techniques with each other, as well as with the standard Monte Carlo technique, in order to assess the computational gains obtained by employing each of them.
