Statistics [24]: Variance Reducing Techniques

Several techniques to reduce the variance of Monte Carlo estimates, including antithetic variates, control variates, stratified sampling, and importance sampling.


Antithetic Variates

Suppose we have an even number of samples, $2n$, drawn from $p(x)$. One approach is to generate correlated samples so that the variance is reduced by cancellations in their sum. The estimate:

$$\hat{\theta} = \frac{1}{n} \sum_{i=1}^{n} \frac{g(X_i) + g(Y_i)}{2},$$

where $(X_i, Y_i)$ are correlated (antithetic) pairs drawn from $p(x)$.

The variance of $\hat{\theta}$,

$$\mathrm{Var}(\hat{\theta}) = \frac{1}{4n}\left[\mathrm{Var}\big(g(X)\big) + \mathrm{Var}\big(g(Y)\big) + 2\,\mathrm{Cov}\big(g(X), g(Y)\big)\right] = \frac{\sigma^2}{2n}\,(1 + \rho),$$

where $\rho$ is the correlation between $g(X_i)$ and $g(Y_i)$.
Conclusion:

  1. When $\rho = 0$, the variance remains the same;
  2. When $\rho < 0$, the variance decreases;
  3. When $\rho > 0$, the variance increases.

Example 1

Estimate .

Assume and are two samples; let the two samples be and , then

Hence,

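As an illustration, here is a minimal Python sketch of antithetic variates, assuming a hypothetical integrand $g(x) = e^x$ on $[0, 1]$ (chosen only for illustration): each uniform draw $U$ is paired with its antithetic partner $1 - U$.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical integrand, used only for illustration; its integral on [0, 1] is e - 1.
    return np.exp(x)

n = 10_000
u = rng.random(n)

# Plain Monte Carlo with 2n independent samples (same total cost).
u2 = rng.random(n)
plain = np.concatenate([g(u), g(u2)]).mean()

# Antithetic variates: pair each U with 1 - U and average within the pair.
antithetic = ((g(u) + g(1.0 - u)) / 2.0).mean()

print(plain, antithetic)  # both estimate e - 1 ≈ 1.7183; the antithetic estimate has lower variance
```

Because $e^x$ is monotone, $g(U)$ and $g(1-U)$ are negatively correlated, so the antithetic estimate has smaller variance at the same sample budget.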

Control Variates

Suppose we have a function $h(x)$ for which $\mu_h = \mathbb{E}[h(X)]$ is known. Let

$$Y_i = g(X_i) + c\,\big(h(X_i) - \mu_h\big).$$

The estimate can be given by

$$\hat{\theta}_c = \frac{1}{n} \sum_{i=1}^{n} Y_i.$$

The variance of $\hat{\theta}_c$ is given by

$$\mathrm{Var}(\hat{\theta}_c) = \frac{1}{n}\left[\mathrm{Var}\big(g(X)\big) + c^2\,\mathrm{Var}\big(h(X)\big) + 2c\,\mathrm{Cov}\big(g(X), h(X)\big)\right].$$

Differentiate with respect to $c$ and set the derivative to zero:

$$c^* = -\frac{\mathrm{Cov}\big(g(X), h(X)\big)}{\mathrm{Var}\big(h(X)\big)}.$$

Then,

$$\mathrm{Var}(\hat{\theta}_{c^*}) = \frac{1}{n}\,\mathrm{Var}\big(g(X)\big)\,(1 - \rho^2),$$

where $\rho$ is the correlation between $g(X)$ and $h(X)$.
Example 2

Estimate .

Let ; we have

Then

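A minimal control-variates sketch in Python, assuming a hypothetical pair of functions: $g(x) = e^x$ as the target and $h(x) = x$ as the control, whose mean under $\mathrm{Uniform}(0,1)$ is known to be $1/2$. The optimal coefficient $c^*$ is estimated from the same samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical target integrand (illustration only).
    return np.exp(x)

def h(x):
    # Control variate with known mean under Uniform(0, 1): E[h(U)] = 1/2.
    return x

mu_h = 0.5
n = 10_000
u = rng.random(n)
gx, hx = g(u), h(u)

# Optimal coefficient c* = -Cov(g, h) / Var(h), estimated from the samples.
c_star = -np.cov(gx, hx)[0, 1] / np.var(hx, ddof=1)

plain = gx.mean()
controlled = (gx + c_star * (hx - mu_h)).mean()

print(plain, controlled)
# The per-sample variance drops because g(U) and h(U) are strongly correlated.
print(np.var(gx), np.var(gx + c_star * (hx - mu_h)))
```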

Stratified Sampling

Let’s divide the whole space $D$ into $M$ disjoint subspaces $D_1, \dots, D_M$; the final result is the sum of the partial results from all subspaces.

The MC estimate becomes

$$\hat{\theta} = \sum_{j=1}^{M} \frac{V_j}{n_j} \sum_{i=1}^{n_j} g(X_{ij}),$$

where $n_j$ is the number of points sampled in $D_j$, and $V_j$ is the volume of the subspace.

The variance becomes

$$\mathrm{Var}(\hat{\theta}) = \sum_{j=1}^{M} \frac{V_j^2\,\sigma_j^2}{n_j},$$

where $\sigma_j^2$ is the variance of $g(X)$ within the subspace $D_j$.

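A minimal stratified-sampling sketch in Python, assuming equal-width strata on $[0, 1]$ and the same hypothetical integrand $g(x) = e^x$ used above; each stratum contributes its volume times the within-stratum sample mean.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical integrand over [0, 1] (illustration only).
    return np.exp(x)

M = 10          # number of equal-width strata D_1, ..., D_M on [0, 1]
n_per = 1_000   # samples drawn in each stratum
edges = np.linspace(0.0, 1.0, M + 1)

estimate = 0.0
for j in range(M):
    lo, hi = edges[j], edges[j + 1]
    x = rng.uniform(lo, hi, n_per)        # sample uniformly within stratum j
    estimate += (hi - lo) * g(x).mean()   # V_j times the within-stratum average

print(estimate)  # stratified estimate of the integral over [0, 1], about e - 1
```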

Importance Sampling

The pdf under the integral, $p(x)$, may not be the best pdf for MC integration. In this case, we can use a different and simpler pdf, $q(x)$, from which we can draw the samples; $q(x)$ is called the importance density. Hence,

$$\theta = \int g(x)\,p(x)\,dx = \int g(x)\,\frac{p(x)}{q(x)}\,q(x)\,dx.$$

By generating samples $X_i \sim q(x)$, the estimate becomes

$$\hat{\theta} = \frac{1}{n} \sum_{i=1}^{n} g(X_i)\,w(X_i),$$

where

$$w(X_i) = \frac{p(X_i)}{q(X_i)}$$

is called the importance weight.

Notice that $\mathbb{E}_q[w(X)] = \int \frac{p(x)}{q(x)}\,q(x)\,dx = 1$, so $\frac{1}{n}\sum_{i=1}^{n} w(X_i) \approx 1$, and we have

$$\hat{\theta} \approx \frac{\sum_{i=1}^{n} g(X_i)\,w(X_i)}{\sum_{i=1}^{n} w(X_i)}.$$

Hence,

$$\hat{\theta} = \sum_{i=1}^{n} g(X_i)\,\tilde{w}(X_i),$$

where

$$\tilde{w}(X_i) = \frac{w(X_i)}{\sum_{j=1}^{n} w(X_j)}$$

are the normalized importance weights. We can see that the simple average becomes a weighted sum, reflecting the relative importance of each sample (point). This is the basis for particle filtering.
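A minimal sketch of normalized importance weights in Python, assuming a hypothetical setup: the target $p(x)$ is a standard normal known only up to its normalizing constant, and the importance density $q(x)$ is a wider normal. The weighted sum estimates $\mathbb{E}_p[X^2] = 1$ without ever normalizing $p$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target p(x): a standard normal known only up to a constant.
def p_unnormalized(x):
    return np.exp(-0.5 * x**2)          # the 1/sqrt(2*pi) factor is deliberately dropped

# Importance density q(x): a wider normal N(0, 2^2) that is easy to sample from.
def q_pdf(x, scale=2.0):
    return np.exp(-0.5 * (x / scale) ** 2) / (scale * np.sqrt(2.0 * np.pi))

n = 10_000
x = rng.normal(0.0, 2.0, n)             # draw samples from q

w = p_unnormalized(x) / q_pdf(x)        # unnormalized importance weights
w_tilde = w / w.sum()                   # normalized importance weights

# The simple average becomes a weighted sum; here it estimates E_p[X^2] = 1.
print(np.sum(w_tilde * x**2))
```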

The variance of $\hat{\theta}$ is given by

$$\mathrm{Var}(\hat{\theta}) = \frac{\sigma_q^2}{n},$$

where

$$\sigma_q^2 = \int \frac{g^2(x)\,p^2(x)}{q(x)}\,dx - \theta^2 \;\ge\; \left(\int |g(x)|\,p(x)\,dx\right)^2 - \theta^2.$$

The equality holds when

$$q(x) = \frac{|g(x)|\,p(x)}{\int |g(x)|\,p(x)\,dx},$$

that is, when the importance density is proportional to $|g(x)|\,p(x)$.
Example 3

Estimate .

Let and , then

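Since the specifics of the example are not reproduced here, the following Python sketch uses a hypothetical rare-event problem to illustrate the idea: estimating $P(X > 3)$ for $X \sim \mathcal{N}(0, 1)$ with an importance density shifted into the rare region.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def norm_pdf(x, loc=0.0, scale=1.0):
    z = (x - loc) / scale
    return np.exp(-0.5 * z**2) / (scale * np.sqrt(2.0 * np.pi))

# Hypothetical problem (illustration only): estimate theta = P(X > 3) for X ~ N(0, 1).
# Plain Monte Carlo: almost no samples land above 3, so the estimate is very noisy.
x_p = rng.normal(0.0, 1.0, n)
plain = np.mean(x_p > 3.0)

# Importance sampling: draw from q = N(3, 1), centered on the rare region,
# and weight each point by w = p(x) / q(x).
x_q = rng.normal(3.0, 1.0, n)
w = norm_pdf(x_q, 0.0, 1.0) / norm_pdf(x_q, 3.0, 1.0)
is_estimate = np.mean((x_q > 3.0) * w)

print(plain, is_estimate)   # true value is 1 - Phi(3) ≈ 1.35e-3
```

Concentrating samples where the integrand matters and reweighting them makes the importance-sampling estimate far less noisy than plain MC at the same sample size.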
