Problem 1. Let $X_1,X_2,X_3,\ldots$ be a sequence of random variables, and let $X$ be a random variable.

- (a) (4 points) Define what it means for $X_n \to X$ in probability, and for $X_n \to X$ in distribution.
- (b) (3 points) Give (with justification) an example of a sequence of random variables such that $X_n \to X$ in distribution, but not in probability.
- (c) (3 points) Give (with justification) an example of a sequence of random variables such that $X_n \to X$ in distribution, but $\E(X_n)$ does not converge to $\E(X)$.
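For the second part, a classic candidate is $X \sim \mathrm{Bernoulli}(1/2)$ with $X_n = 1 - X$ for every $n$: each $X_n$ has the same distribution as $X$, yet $|X_n - X| = 1$ always. A quick numerical illustration (a sanity check of the claim, not a proof):

```python
import random

# Illustration (not a proof): X ~ Bernoulli(1/2) and X_n = 1 - X for all n.
# Each X_n has the same distribution as X, so X_n -> X in distribution
# trivially, but |X_n - X| = 1 always, so X_n does not converge to X in
# probability.
random.seed(1)
samples = [random.randint(0, 1) for _ in range(100000)]
x_mean = sum(samples) / len(samples)                   # ~0.5: law of X
xn_mean = sum(1 - x for x in samples) / len(samples)   # ~0.5: same law for X_n
gap = all(abs((1 - x) - x) == 1 for x in samples)      # |X_n - X| = 1 always
print(round(x_mean, 2), round(xn_mean, 2), gap)
```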

Proof. For the first part:

Problem 2. The $\textbf{Poisson Distribution}$ with parameter $\lambda$ is a distribution defined on the non-negative integers, with probabilities given by

$$\Pp(X=k)=e^{-\lambda}\frac{\lambda^k}{k!}$$


- (a) (5 points) Suppose that $X$ has this distribution. Show that the characteristic function of $X$ is given by

$$\phi(t)=\exp(\lambda(e^{it}-1))$$

- (b) (5 points) Suppose that $X_1$ and $X_2$ are independent variables, having Poisson distributions with parameters $\lambda_1$ and $\lambda_2$ respectively. Determine, with justification, the distribution of $X_1+X_2$.

You may assume the result of part a for part b even if you did not solve that part.

Proof.

Problem 3. (10 points)

Suppose that $X_1,X_2,\ldots$ are independent, identically distributed random variables satisfying

$$\Pp(0\leq X_i \leq 1)=1 \text{ and } \Pp(X_i=1)<1$$

Show that there is a constant $c>1$ (possibly depending on the common distribution of the $X_i$) such that

$$c^n\prod_{i=1}^nX_i \to 0 \text{ almost surely}$$

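A simulation sketch for one concrete case (assumptions: $X_i \sim \mathrm{Uniform}(0,1)$, so $\E(\log X_i)=-1$, and the choice $c=2<e$): writing $\log\left(c^n\prod_{i=1}^n X_i\right)=n\log c+\sum_{i=1}^n\log X_i$, the strong law drives the per-term average toward $\log 2 - 1 < 0$, so the product tends to $0$.

```python
import math
import random

# Simulation sketch (not a proof) for X_i ~ Uniform(0,1) and c = 2.
# Since E[log X_i] = -1 and log 2 < 1, the SLLN forces
#   (1/n) * log(c^n * prod X_i) = log(c) + (1/n) * sum log X_i
# toward log(2) - 1 < 0, so c^n * prod X_i -> 0 almost surely.
random.seed(0)
c, n = 2.0, 5000
log_val = 0.0  # running value of log(c^n * prod_{i<=n} X_i)
for _ in range(n):
    log_val += math.log(c) + math.log(random.random())
print(log_val / n)  # should hover near log(2) - 1, about -0.307
```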

Proof.

Problem 4. (10 points) Suppose that $X_1,X_2,\ldots$ are (not necessarily independent!) random variables satisfying the following three conditions:


- $\E(X_i)=0$ for all $i$.
- There is an absolute constant $C$ such that $\Var(X_i) \leq C$ for all $i$.
- $\E(X_iX_j) \leq 0$ for all $i \neq j$.

Show that the sample mean

$$\bar{X_n}=\frac{1}{n}\sum_{i=1}^nX_i$$

converges to $0$ in probability.
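A sketch of the key variance bound (using only the three listed conditions; the rest is Chebyshev's inequality). Since $\E(X_i)=0$, the cross terms are covariances, and

$$\Var(\bar{X_n})=\frac{1}{n^2}\left(\sum_{i=1}^n\Var(X_i)+\sum_{i\neq j}\E(X_iX_j)\right)\leq\frac{Cn}{n^2}=\frac{C}{n},$$

so $\Pp(|\bar{X_n}|\geq\epsilon)\leq\Var(\bar{X_n})/\epsilon^2\leq C/(n\epsilon^2)\to 0$ for each fixed $\epsilon>0$.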

Proof.

Problem 5. Let $A_1,A_2,\ldots$ be events satisfying $\Pp(A_i) \geq \frac{1}{2}$ for all $i$. Let $A$ be the event that infinitely many of the $A_i$ occur.

- (2 points) Suppose that the $A_i$ are independent. Why must $\Pp(A)=1$? (Just citing a result is enough here).
- (2 points) Show by example that it is possible to have $\Pp(A) <1$ if the $A_i$ are not independent.
- (6 points) Show that $\Pp(A) \geq \frac{1}{2}$ regardless of whether or not the $A_i$ are independent.
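For the last part, one standard route (a sketch, not a full argument) is the reverse Fatou lemma applied to $A=\limsup_n A_n$:

$$\Pp(A)=\Pp\Big(\limsup_{n\to\infty}A_n\Big)\geq\limsup_{n\to\infty}\Pp(A_n)\geq\frac{1}{2}.$$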

- email: kvo020@ucr.edu