Source: https://desvl.xyz/2023/02/13/vague-convergence/
Vague Convergence in Measure-theoretic Probability Theory - Equivalent Conditions

Introduction

In analysis and probability theory, one studies various sorts of convergence (of random variables) for various reasons. In this post we study vague convergence, which is responsible for convergence in distribution.

Vaguely speaking, vague convergence is the weakest kind of convergence one can expect (whilst still caring about continuity whenever possible). We do not consider any dependence relation among the random variables in the sequence.

Throughout, fix a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, where $\Omega$ is the sample space, $\mathcal{F}$ the event space and $\mathbb{P}$ the probability function. Let $(X_n)_{n \ge 1}$ be a sequence of random variables on this space. Each random variable $X_n$ canonically induces a probability space $(\mathbb{R}, \mathcal{B}, \mu_n)$, where $\mathcal{B}$ is the Borel $\sigma$-algebra and $\mu_n$ is the induced Borel measure. To avoid notation hell we only consider the correspondence $X_n \leftrightarrow \mu_n$, where

$$\mu_n(B) = \mathbb{P}(X_n \in B), \quad B \in \mathcal{B}.$$

Here comes the question: if $X_n$ tends to a limit, then we would expect that $\mu_n$ converges to a limit (say $\mu$) in some sense (at least on some intervals). But is that always the case? Even if the sequence $(\mu_n)$ converges, can we even have $\mu(\mathbb{R}) = 1$? We will see through some examples that this is really not the case.

Examples: Failure of Convergence on Intervals

Let $X_n = (-1)^n/n$, then $X_n \to 0$ deterministically. For any $a < 0$, the sequence $\mu_n((a, 0])$ oscillates between $1$ and $0$, i.e. it ends up in the form

$$1, 0, 1, 0, 1, 0, \dots,$$

which does not converge at all. Likewise, for any $b > 0$, the sequence $\mu_n((0, b])$ oscillates between $0$ and $1$.
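As a quick numerical sanity check (an added sketch, not part of the original post; the function name `mu_n` is ad hoc), one can tabulate $\mu_n((a, 0])$ and $\mu_n((0, b])$ directly:

```python
# X_n = (-1)**n / n is deterministic, so mu_n is the point mass at (-1)**n / n.
def mu_n(n: int, a: float, b: float) -> float:
    """Mass that the distribution of X_n = (-1)**n / n assigns to the interval (a, b]."""
    x = (-1) ** n / n
    return 1.0 if a < x <= b else 0.0

print([mu_n(n, -2.0, 0.0) for n in range(1, 9)])  # 1.0, 0.0, 1.0, 0.0, ... : no limit
print([mu_n(n, 0.0, 1.0) for n in range(1, 9)])   # 0.0, 1.0, 0.0, 1.0, ... : no limit
```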

As another example of convergence failure, consider $a_n < 0 < b_n$ with $a_n \to 0$ and $b_n \to 0$ as $n \to \infty$, and let $(X_n)$ be the sequence of random variables having the uniform distribution on $[a_n, b_n]$. We see $X_n \to 0$ a.e., but $\mu_n((-\infty, 0]) = \frac{-a_n}{b_n - a_n}$, which is the area under the density of $X_n$ between $a_n$ and $0$, may not converge at all, or may converge to any number between $0$ and $1$.
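The following added sketch makes the claim concrete; the particular choices of $a_n$ and $b_n$ below are mine, picked only to exhibit the two behaviours:

```python
# X_n uniform on [a_n, b_n] with a_n < 0 < b_n: the mass on (-infinity, 0] is -a_n / (b_n - a_n).
def mass_left_of_zero(a_n: float, b_n: float) -> float:
    return -a_n / (b_n - a_n)

# Choice 1: a_n = -1/n, b_n = 2/n  ->  the mass equals 1/3 for every n.
print([round(mass_left_of_zero(-1 / n, 2 / n), 3) for n in range(1, 6)])

# Choice 2: swap the roles of 1/n and 2/n for even n  ->  the mass oscillates between 1/3 and 2/3.
print([round(mass_left_of_zero(-1 / n, 2 / n) if n % 2 else mass_left_of_zero(-2 / n, 1 / n), 3)
       for n in range(1, 7)])
```

In both cases $X_n \to 0$ surely, yet the mass to the left of $0$ behaves however we like.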

Example: Failure of Converging to a Probability Measure

We compose an example where $(\mu_n)$ converges to a measure $\mu$ with $\mu(\mathbb{R}) < 1$, preventing $\mu$ from being a probability measure. To do this, fix two positive numbers $\alpha$ and $\beta$ such that $\alpha + \beta \le 1$. Consider the sequence of random variables $X_n$ with

$$\mathbb{P}(X_n = -n) = \alpha, \quad \mathbb{P}(X_n = 0) = 1 - \alpha - \beta, \quad \mathbb{P}(X_n = n) = \beta.$$

Then, for every finite interval $(a, b]$, we have $\mu_n((a, b]) \to \mu((a, b])$ where

$$\mu = (1 - \alpha - \beta)\,\delta_0.$$

Then $\mu(\mathbb{R}) = 1 - \alpha - \beta < 1$. The atoms at $\pm n$ have escaped to $-\infty$ and $+\infty$.
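A small added sketch of this example, with arbitrary illustrative values for $\alpha$ and $\beta$ (the exact values do not matter):

```python
# mu_n puts mass alpha at -n, mass 1 - alpha - beta at 0, and mass beta at n.
def mu_n(n: int, alpha: float, beta: float, a: float, b: float) -> float:
    """Mass of mu_n on the interval (a, b]."""
    points = [(-n, alpha), (0, 1 - alpha - beta), (n, beta)]
    return sum(p for x, p in points if a < x <= b)

alpha, beta = 0.3, 0.2
# On a fixed finite interval the mass stabilises at 1 - alpha - beta = 0.5 ...
print([round(mu_n(n, alpha, beta, -1.5, 1.5), 3) for n in (1, 2, 5, 100)])   # 1.0, 0.5, 0.5, 0.5
# ... even though every mu_n has total mass 1; only a huge interval still sees the escaping atoms.
print([round(mu_n(n, alpha, beta, -1e6, 1e6), 3) for n in (1, 10, 10**7)])   # 1.0, 1.0, 0.5
```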

These examples inspire us to develop a weaker sense of convergence, where we only take intervals into account (because we would expect continuous functions to play a role).

Definitions

From the examples above, it is clear that we cannot expect $\mu(\mathbb{R}) = 1$ all the time. Therefore we consider $\mu(\mathbb{R}) \le 1$ instead, hence the following weakened versions of probability measure and distribution function.

Definition 1. A measure $\mu$ on $(\mathbb{R}, \mathcal{B})$ is a subprobability measure (s.p.m.) if $\mu(\mathbb{R}) \le 1$. Correspondingly, one defines the subdistribution function (s.d.f.) $F$ with respect to $\mu$ by

$$F(x) = \mu((-\infty, x]), \quad x \in \mathbb{R}.$$

When $\mu(\mathbb{R}) = 1$, there is nothing new, but even if not, we do not face many obstacles. We still see that $F$ is a right continuous, non-decreasing function with $F(-\infty) = 0$ and $F(+\infty) = \mu(\mathbb{R}) \le 1$. For brevity's sake, we will write $\mu((a, b])$ as $\mu(a, b]$ from now on, and similarly for other kinds of intervals. We also put $\mu(a, b] = 0$ when $a \ge b$, because why not.
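For a minimal illustration (my own toy example, not from the post), take $\mu = \tfrac{1}{2}\delta_0$; its s.d.f. is $F(x) = \tfrac12$ for $x \ge 0$ and $0$ otherwise, so $F(+\infty) = \tfrac12 < 1$:

```python
# s.d.f. of the s.p.m. mu = (1/2) * delta_0, namely F(x) = mu((-infinity, x]).
def F(x: float) -> float:
    return 0.5 if x >= 0 else 0.0

print(F(-1.0), F(0.0), F(10.0))  # 0.0 0.5 0.5, so F(+infinity) = 1/2 < 1
```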

Our examples also warn us that atoms are a big deal, which leads us to the following definition concerning intervals.

Definition 2. Notation being as above, an interval $(a, b]$ is called a continuous interval (of $\mu$) if neither $a$ nor $b$ is an atom of $\mu$, i.e. if $\mu(\{a\}) = \mu(\{b\}) = 0$.
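A toy check (the measure below is my own choice, not from the post): for $\mu = \tfrac12\delta_0 + \tfrac12\delta_1$, the interval $(-1, \tfrac12]$ is a continuous interval, while $(-1, 1]$ is not, since $1$ is an atom.

```python
# Atoms of mu = 0.5*delta_0 + 0.5*delta_1, and a test for continuous intervals of mu.
ATOM_MASS = {0.0: 0.5, 1.0: 0.5}

def is_continuous_interval(a: float, b: float) -> bool:
    """True iff neither endpoint of (a, b] is an atom of mu."""
    return ATOM_MASS.get(a, 0.0) == 0.0 and ATOM_MASS.get(b, 0.0) == 0.0

print(is_continuous_interval(-1.0, 0.5))  # True
print(is_continuous_interval(-1.0, 1.0))  # False: 1 is an atom of mu
```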

One can test, in our first group of examples, whether an interval with $0$ as an endpoint is a continuous interval of the limit. Now we are ready for the definition of vague convergence.

Definition 3. A sequence $(\mu_n)$ of s.p.m.'s is said to converge vaguely to an s.p.m. $\mu$ if there exists a dense subset $D \subset \mathbb{R}$ such that

$$\mu_n(a, b] \to \mu(a, b] \quad \text{whenever } a, b \in D,\ a < b.$$

We write $\mu_n \xrightarrow{v} \mu$.

Let $F_n$ be the corresponding s.d.f. of $\mu_n$ and $F$ the s.d.f. of $\mu$. Then we say that $F_n$ converges vaguely to $F$ and write $F_n \xrightarrow{v} F$.
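As a concrete instance (an added worked example), $\mu_n = \delta_{1/n}$ converges vaguely to $\mu = \delta_0$ with $D = \mathbb{R} \setminus \{0\}$; the sketch below also shows why the atom $0$ must stay out of $D$:

```python
# mu_n = point mass at 1/n, mu = point mass at 0; D = R \ {0}.
def mu_n(n: int, a: float, b: float) -> float:
    return 1.0 if a < 1 / n <= b else 0.0

def mu(a: float, b: float) -> float:
    return 1.0 if a < 0 <= b else 0.0

for a, b in [(-1.0, 2.0), (0.5, 2.0), (-2.0, -1.0)]:       # endpoints taken in D
    print((a, b), [mu_n(n, a, b) for n in (1, 2, 10, 1000)], "->", mu(a, b))

# With b = 0 (an atom of mu, hence excluded from D) the convergence fails:
print([mu_n(n, -1.0, 0.0) for n in (1, 10, 1000)], "but mu(-1, 0] =", mu(-1.0, 0.0))
```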

It would be unfair not to build the corresponding infrastructure for random variables (r.v.) in this context. We introduce the following concept, which you may have already studied in calculus-based probability theory:

Definition 4. Let $(X_n)$ be a sequence of r.v.'s with corresponding cumulative distribution functions (c.d.f.) $(F_n)$. We say $(X_n)$ converges weakly, or in distribution, to $X$ (with corresponding c.d.f. $F$) if $F_n \xrightarrow{v} F$.

In calculus-based probability theory, one studies the condition that $F_n(x) \to F(x)$ whenever $F$ is continuous at $x$. This definition is easier to understand but skips a lot of important details.
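To connect this with a familiar calculation (again an added example, not from the post): if $X_n$ is uniform on $\{0, 1/n, \dots, (n-1)/n\}$, then $X_n$ converges in distribution to the uniform distribution on $[0, 1]$; here $F$ is continuous everywhere, so $F_n(x) \to F(x)$ at every $x$:

```python
import math

def F_n(n: int, x: float) -> float:
    """c.d.f. of the uniform distribution on {k/n : 0 <= k <= n-1}."""
    if x < 0:
        return 0.0
    return min((math.floor(n * x) + 1) / n, 1.0)

def F(x: float) -> float:
    """c.d.f. of the uniform distribution on [0, 1]."""
    return min(max(x, 0.0), 1.0)

for x in (0.25, 0.5, 0.9):
    print(x, [round(F_n(n, x), 4) for n in (2, 10, 1000)], "->", F(x))
```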

Equivalent Conditions

In this section we study vague convergence from the viewpoint of measure theory, utilising $\varepsilon$-$N$ arguments most of the time. We will see that the convergence behaves quite similarly to the convergence of sequences of real numbers.

Let $(x_n)$ be a sequence of real numbers; we can recall that

  • If $(x_n)$ converges, then the limit is unique.
  • If $(x_n)$ is bounded, then it has a convergent subsequence.
  • If every subsequence of $(x_n)$ converges to $x$, then $(x_n)$ converges to $x$.

These results are natural in the context of calculus, but in the world of topology and functional analysis they cannot be taken for granted. However, s.p.m.'s enjoy all three of them (for the second point, notice that an s.p.m. is bounded in a sense anyway). Nevertheless, it would be too ambitious to include everything here and assume that the reader will finish it in one shot.

Theorem 1. Let $(\mu_n)$ and $\mu$ be s.p.m.'s. The following conditions are equivalent:

(1) $\mu_n \xrightarrow{v} \mu$.

(2) For every finite interval $(a, b]$ and every $\varepsilon > 0$, there exists an $n_0(a, b, \varepsilon)$ such that whenever $n \ge n_0$,

$$\mu(a + \varepsilon, b - \varepsilon] - \varepsilon \le \mu_n(a, b] \le \mu(a - \varepsilon, b + \varepsilon] + \varepsilon.$$

(3) For every continuous interval $(a, b]$ of $\mu$, we have

$$\mu_n(a, b] \to \mu(a, b].$$

When $\mu_n$ and $\mu$ are p.m.'s, the second condition is equivalent to the following "uniform" edition:

(4) For every $\varepsilon > 0$ and $\delta > 0$, there exists an $n_0(\varepsilon, \delta)$ such that if $n \ge n_0$, then for every interval $(a, b]$, possibly infinite:

$$\mu(a + \delta, b - \delta] - \varepsilon \le \mu_n(a, b] \le \mu(a - \delta, b + \delta] + \varepsilon.$$
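To see why the p.m. assumption is needed in (4), one can look back at the escaping-mass example (with the masses $\alpha$, $\beta$ as reconstructed above; the numbers below are illustrative). Each $\mu_n$ there is a p.m., but the vague limit is not, and on the infinite interval the upper bound of (4) fails for small $\varepsilon$:

```python
# Each mu_n in the escaping-mass example is a p.m., but its vague limit mu is not.
alpha, beta = 0.3, 0.2     # illustrative masses escaping to -infinity and +infinity
eps = 0.1

mu_n_R = 1.0               # mu_n(R) = 1 for every n
mu_R = 1 - alpha - beta    # mu(R) = 1 - alpha - beta < 1

# Applied to the infinite interval, the upper bound of (4) would force
# mu_n(R) <= mu(R) + eps for all large n, which fails whenever eps < alpha + beta:
print(mu_n_R, "<=", round(mu_R + eps, 3), "?", mu_n_R <= mu_R + eps)   # ... ? False
```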

Proof. We first study the equivalence of the first three statements. Suppose $\mu_n$ converges vaguely to $\mu$. We are given a dense subset $D$ of the real line such that $\mu_n(a, b] \to \mu(a, b]$ whenever $a, b \in D$ and $a < b$. Now fix a finite interval $(a, b]$ and $\varepsilon > 0$; since $D$ is dense, we can pick $a_1, a_2, b_1, b_2 \in D$ satisfying

$$a - \varepsilon < a_1 < a < a_2 < a + \varepsilon, \qquad b - \varepsilon < b_1 < b < b_2 < b + \varepsilon.$$

By vague convergence, there exists $n_0$ such that whenever $n \ge n_0$,

$$\left|\mu_n(a_i, b_j] - \mu(a_i, b_j]\right| < \varepsilon$$

for $i = 1, 2$ and $j = 1, 2$. It follows that

$$\mu_n(a, b] \le \mu_n(a_1, b_2] < \mu(a_1, b_2] + \varepsilon \le \mu(a - \varepsilon, b + \varepsilon] + \varepsilon,$$

and on the other hand

$$\mu_n(a, b] \ge \mu_n(a_2, b_1] > \mu(a_2, b_1] - \varepsilon \ge \mu(a + \varepsilon, b - \varepsilon] - \varepsilon.$$

Combining both, the implication $(1) \Rightarrow (2)$ is clear.

Next, we assume (2), and let $(a, b]$ be a continuous interval of $\mu$, i.e. we have $\mu(\{a\}) = \mu(\{b\}) = 0$. The relation $\mu_n(a, b] \le \mu(a - \varepsilon, b + \varepsilon] + \varepsilon$ implies that

$$\mu(a - \varepsilon, b + \varepsilon] + \varepsilon \ge \limsup_{n \to \infty} \mu_n(a, b]$$

holds for all $\varepsilon > 0$. On the other hand, as $\varepsilon \downarrow 0$ on the left hand side, $(a - \varepsilon, b + \varepsilon] \downarrow [a, b]$, and since neither $a$ nor $b$ is an atom of $\mu$ we see

$$\mu(a, b] = \mu([a, b]) \ge \limsup_{n \to \infty} \mu_n(a, b].$$

Likewise, the relation $\mu_n(a, b] \ge \mu(a + \varepsilon, b - \varepsilon] - \varepsilon$ yields

$$\liminf_{n \to \infty} \mu_n(a, b] \ge \mu(a + \varepsilon, b - \varepsilon] - \varepsilon.$$

As $\varepsilon \downarrow 0$ on the right hand side, $(a + \varepsilon, b - \varepsilon] \uparrow (a, b)$, and we obtain

$$\liminf_{n \to \infty} \mu_n(a, b] \ge \mu((a, b)) = \mu(a, b].$$

To conclude both sides, notice that

$$\mu(a, b] \le \liminf_{n \to \infty} \mu_n(a, b] \le \limsup_{n \to \infty} \mu_n(a, b] \le \mu(a, b].$$

This forces $\mu_n(a, b]$ to converge to $\mu(a, b]$, which is exactly (3). This also implies that $\mu_n(\{a\}) \to 0 = \mu(\{a\})$. To see this, pick another continuous interval $(a_0, b]$ which properly contains $(a, b]$. Then $(a_0, a]$ is another continuous interval. It follows that

$$\limsup_{n \to \infty} \mu_n(\{a\}) \le \lim_{n \to \infty} \mu_n(a_0, a] = \mu(a_0, a],$$

and the right hand side can be made arbitrarily small by letting $a_0 \uparrow a$ through non-atoms of $\mu$.

Assume (3). Notice that the set $A$ of atoms of $\mu$ has to be at most countable, therefore $D = \mathbb{R} \setminus A$ is dense in $\mathbb{R}$. On the other hand, $(a, b]$ is a continuous interval if and only if $a, b \in D$. This implies (1).


The arguments above also show that, when discussing vague convergence, one can replace $(a, b]$ with $[a, b]$, $(a, b)$ or $[a, b)$ freely, as long as $(a, b]$ is a continuous interval. It also follows that the dense set $D$ in Definition 3 can be taken to be the complement of the set of atoms of $\mu$.


For (4), as (4) implies (2) (by taking $\delta = \varepsilon$), it remains to show that (3) implies (4), assuming that $\mu_n$ and $\mu$ are p.m.'s. Indeed, it suffices to prove it on a finite interval, and we will first justify this reduction. Let $A$ denote the set of atoms of $\mu$. First of all we can pick an integer $M$ such that $\mu(-M, M] > 1 - \varepsilon/4$ (that is, the interval is so big that its measure is close enough to $1$). Pick $c, d \notin A$ such that $c < -M$ and $d > M$ (this is possible because $A^c$ is dense). For the interval $(c, d]$, we can put a finite partition

$$c = x_0 < x_1 < \cdots < x_k = d$$

such that $x_j \notin A$ and $x_j - x_{j-1} < \delta$ for all $j$. Therefore, we have

$$\sum_{j=1}^{k} \mu(x_{j-1}, x_j] = \mu(c, d] \ge \mu(-M, M] > 1 - \frac{\varepsilon}{4}.$$

By (3), there exists $n_0$ depending on $\varepsilon$ and $k$ (thereby $\delta$) such that

$$\left|\mu_n(x_{j-1}, x_j] - \mu(x_{j-1}, x_j]\right| < \frac{\varepsilon}{4k}$$

for all $1 \le j \le k$ and $n \ge n_0$. Adding over all $j$, with the endpoints replaced by any pair of partition points, we see

$$\left|\mu_n(x_i, x_j] - \mu(x_i, x_j]\right| < \frac{\varepsilon}{4} \quad \text{for all } 0 \le i < j \le k.$$

It follows that

$$\mu_n(\mathbb{R} \setminus (c, d]) = 1 - \mu_n(c, d] < 1 - \mu(c, d] + \frac{\varepsilon}{4} < \frac{\varepsilon}{2} \quad (n \ge n_0).$$

(This is where being p.m. matters.) Therefore, when $n \ge n_0$, in comparing $\mu_n(a, b]$ with $\mu(a - \delta, b + \delta]$ and $\mu(a + \delta, b - \delta]$, ignoring the part of $(a, b]$ lying outside $(c, d]$ results only in an error of at most $\varepsilon/2$. Therefore it suffices to assume that $(a, b] \subset (c, d]$ and show that

$$\mu(a + \delta, b - \delta] - \frac{\varepsilon}{2} \le \mu_n(a, b] \le \mu(a - \delta, b + \delta] + \frac{\varepsilon}{2}.$$

Since the mesh of the partition is smaller than $\delta$, there exist partition points $x_i$ and $x_j$ with $a - \delta < x_i \le a$ and $b \le x_j < b + \delta$, such that

$$\mu_n(a, b] \le \mu_n(x_i, x_j] < \mu(x_i, x_j] + \frac{\varepsilon}{4} \le \mu(a - \delta, b + \delta] + \frac{\varepsilon}{2};$$

the lower bound is obtained in the same way, this time using the smallest partition point $\ge a$ and the largest one $\le b$. This concludes the proof and demonstrates why our specific choice of the partition points is important.

We cannot give a treatment of all three points above, but the first point, the unicity of the vague limit, is now clear.

Corollary 1 (Unicity of vague limit). Notation being as in Definition 3. If there is another s.p.m. $\nu$ and another dense set $D' \subset \mathbb{R}$ such that $\mu_n(a, b] \to \nu(a, b]$ whenever $a, b \in D'$ and $a < b$, then $\mu$ and $\nu$ are identical.

Proof. Let $A$ be the union of the sets of atoms of $\mu$ and $\nu$; then if $a, b \notin A$ and $a < b$, one has $\mu_n(a, b] \to \mu(a, b]$ and $\mu_n(a, b] \to \nu(a, b]$. Therefore $\mu(a, b] = \nu(a, b]$. Since $A^c$ is dense in $\mathbb{R}$, the two must be identical.

