Convergence in Probability
Robert Baumgarth
Mathematics Research Unit, FSTC, University of Luxembourg, Maison du Nombre, 6, Avenue de la Fonte, 4364 Esch-sur-Alzette, Grand-Duché de Luxembourg

Types of Convergence. Let us start by giving definitions of the different types of convergence for a sequence of random variables: almost sure convergence, convergence in probability (stochastic convergence), convergence in L¹ and L² (and, more generally, in the r-th mean), and convergence in distribution. The usual pointwise notion of convergence for a sequence of functions is not very useful in this setting; instead, these modes give precise meaning to statements like "X and Y have approximately the same distribution".

Definition (convergence in probability). A sequence of random variables {X_n} is said to converge in probability to a random variable X as n → ∞ if for any ε > 0 we have

  lim_{n→∞} P[ω : |X_n(ω) − X(ω)| ≥ ε] = 0.

We write X_n →p X or plim X_n = X. Convergence in probability essentially means that the probability that |X_n − X| exceeds any prescribed, strictly positive value converges to zero. The basic idea behind this type of convergence is that the probability of an "unusual" outcome becomes smaller and smaller as the sequence progresses. Note that convergence in probability cannot be stated in terms of individual realisations X_n(ω), but only in terms of probabilities.

Example. Let X_n(ω) = (1 − 1/n) X(ω). Then X_n(ω) converges to X(ω) for all ω, and so with probability one; convergence with probability one (almost sure convergence) in turn implies convergence in probability.

Law of large numbers. The weak law of large numbers (WLLN) states that the sample mean converges in probability to the common expectation: S_n/n → E(X) in probability. The WLLN requires only that the random variables be uncorrelated, whereas the SLLN (almost sure convergence of the sample mean) requires independence.

Convergence in distribution. Undergraduate version of the central limit theorem: if X_1, ..., X_n are iid from a population with mean µ and standard deviation σ, then n^{1/2}(X̄ − µ)/σ has approximately a normal distribution. In particular, a Binomial(n, p) random variable has approximately a N(np, np(1 − p)) distribution. Convergence in distribution, which underlies such statements, is quite different from convergence in probability or convergence almost surely.
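The WLLN can be checked numerically. The following Monte Carlo sketch is purely illustrative (the function name and parameters are hypothetical, not from the notes): for iid Uniform(0,1) samples with mean 0.5, it estimates P(|X̄_n − 0.5| ≥ ε) and shows the estimate shrinking as n grows, which is exactly convergence in probability of the sample mean to the expectation.

```python
import random

def deviation_probability(n, eps=0.05, trials=1000, seed=0):
    """Estimate P(|sample mean of n Uniform(0,1) draws - 0.5| >= eps)."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            exceed += 1
    return exceed / trials

# The estimated deviation probability decreases toward 0 as n grows.
for n in (10, 100, 1000):
    print(n, deviation_probability(n))
```

The fixed seed makes the run reproducible; with ε = 0.05, the estimate is large for n = 10 and essentially zero by n = 1000.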
Almost sure convergence. Almost sure convergence is the probabilistic version of pointwise convergence: we only require that the set on which X_n(ω) converges to X(ω) has probability 1. Obviously, convergence for all ω implies convergence with probability one. The notation X_n →a.s. X is often used for almost sure convergence (some authors write "almost everywhere" instead), while the common notation for convergence in probability is X_n →p X or plim_{n→∞} X_n = X. Together with convergence in distribution and convergence in the r-th mean, these are the four different ways to measure convergence in probability theory.

Convergence in mean square. We say X_t → µ in mean square (or L² convergence) if E(X_t − µ)² → 0 as t → ∞.

Theorem 5.5.12. If the sequence of random variables X_1, X_2, ... converges in probability to a random variable X, the sequence also converges in distribution to X.

Using the CLT (continuous mapping theorem, delta method). The way we typically use the CLT result is to approximate the distribution of n^{1/2}(X̄_n − µ)/σ by that of a standard normal.

Example (convergence in distribution at a discontinuity). Suppose that for every ε > 0,

  P[|X_n| < ε] = 1 − (1 − ε)^n → 1 as n → ∞.

The limiting form of the distribution functions is not continuous at x = 0, so the ordinary definition of convergence in distribution cannot be immediately applied there; nevertheless, since convergence of distribution functions is only required at continuity points of the limit, it is correct to say X_n →d X, where P[X = 0] = 1.

It is easy to get overwhelmed. Just hang on and remember this: the two key ideas in what follows are "convergence in probability" and "convergence in distribution".
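One concrete construction matching the example above (an assumption for illustration; the extract does not specify how X_n is built) is X_n = min(U_1, ..., U_n) with U_i iid Uniform(0,1), for which P(X_n < ε) = 1 − (1 − ε)^n exactly. The sketch below compares an empirical estimate of this probability with the exact formula:

```python
import random

def prob_below(n, eps=0.1, trials=3000, seed=1):
    """Empirical estimate of P(min of n Uniform(0,1) draws < eps)."""
    rng = random.Random(seed)
    hits = sum(min(rng.random() for _ in range(n)) < eps for _ in range(trials))
    return hits / trials

# Empirical vs. exact probability 1 - (1 - eps)^n; both approach 1 as n
# grows, reflecting X_n -> 0 in probability (and in distribution).
for n in (1, 5, 50):
    exact = 1 - (1 - 0.1) ** n
    print(n, prob_below(n), round(exact, 4))
```

For n = 50 the exact probability 1 − 0.9^50 is already above 0.99, in line with the degenerate limit P[X = 0] = 1.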