
Monday, 27 July 2020

Multinomial, Exponential and Gamma distributions


Multinomial Distribution: This distribution can be regarded as a generalization of the binomial distribution.

When a trial has more than two mutually exclusive outcomes, the observations lead to the multinomial distribution.

Suppose E_1, E_2, \ldots, E_k are mutually exclusive and exhaustive outcomes of a trial with respective probabilities p_1, p_2, \ldots, p_k. The probability that E_1 occurs x_1 times, E_2 occurs x_2 times, \ldots, E_k occurs x_k times in n independent observations is given by

P[x_1, x_2, \ldots, x_k]=C. p_1^{x_1} p_2^{x_2}\ldots p_k^{x_k}, ~ where~ \sum x_i =n

and C is the number of permutations of the events E_1, E_2, \ldots, E_k, with C=\frac{n!}{x_1!x_2!\ldots x_k!}

Therefore, P[x_1,x_2,....,x_k]=\frac{n!}{\prod_{i=1}^k x_i!} \prod_{i=1}^k p_i^{x_i}, 0\leq x_i \leq n

Also, these probabilities sum to 1: since \sum p_i =1, we have (p_1+p_2+\ldots+p_k)^n=1, and the multinomial expansion of the left side consists of exactly the terms P[x_1, x_2, \ldots, x_k].
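The formula above can be sketched as a small Python function (a minimal sketch using only the standard library; the name `multinomial_pmf` and the fair-die illustration are our own, not from the notes):

```python
from math import factorial

def multinomial_pmf(xs, ps):
    """P[x_1, ..., x_k] = n!/(x_1!...x_k!) * p_1^x_1 * ... * p_k^x_k."""
    n = sum(xs)
    coeff = factorial(n)
    for x in xs:
        coeff //= factorial(x)   # builds the multinomial coefficient C
    prob = 1.0
    for x, p in zip(xs, ps):
        prob *= p ** x
    return coeff * prob

# Six rolls of a fair die, each face appearing exactly once:
print(round(multinomial_pmf([1] * 6, [1 / 6] * 6), 4))  # 0.0154
```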

Example 1. We are given a bag of marbles. Inside the bag there are 5 red marbles, 4 white marbles and 3 blue marbles. Calculate the probability that in 6 trials we choose 3 red marbles, 1 white marble and 2 blue marbles, replacing each marble after it is chosen.

Solution: Here the total number of marbles is 12. Therefore the probabilities of selecting a red, a white and a blue marble are p_r=\frac{5}{12}, p_w=\frac{4}{12} and p_b=\frac{3}{12} respectively.

Let E_r, E_w and E_b denote the events of choosing a red, a white and a blue marble. The numbers of marbles to be chosen are x_r=3, x_w=1 and x_b=2, and the number of trials is n=6=x_r+x_w+x_b.

Now the number of permutations of the events E_r, E_w and E_b is C=\frac{6!}{3!1!2!}=60

Therefore, the probability that E_r, E_w and E_b occur 3 times, 1 time and 2 times respectively is

P[x_r=3, x_w=1, x_b=2]=C. p_r^{x_r} p_w^{x_w} p_b^{x_b}

P[x_r=3, x_w=1, x_b=2]=\frac{6!}{3!1!2!} \left( \frac{5}{12}\right)^3 \left(\frac{4}{12}\right)^1 \left(\frac{3}{12}\right)^2

P[x_r=3, x_w=1, x_b=2]\approx 0.0904

Exercise: We are randomly drawing cards from an ordinary deck of cards. Every time we pick one, we place it back in the deck. We do this 5 times. What is the probability of drawing 1 heart, 1 spade, 1 club and 2 diamonds? Ans. 0.0586
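Both answers can be checked directly (a quick numerical sketch; it uses the probabilities p_r=\frac{5}{12}, p_w=\frac{4}{12}, p_b=\frac{3}{12} from Example 1 and \frac{1}{4} for each card suit):

```python
from math import factorial

# Example 1: 3 red, 1 white, 2 blue in 6 draws with replacement
C = factorial(6) // (factorial(3) * factorial(1) * factorial(2))  # 60
p_marbles = C * (5/12)**3 * (4/12)**1 * (3/12)**2
print(round(p_marbles, 4))  # 0.0904

# Exercise: 1 heart, 1 spade, 1 club, 2 diamonds in 5 draws
C = factorial(5) // (factorial(1)**3 * factorial(2))  # 60
p_cards = C * (1/4)**5
print(round(p_cards, 4))  # 0.0586
```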

Exponential and Gamma distributions

Exponential distribution: A random variable X is said to have an exponential distribution with parameter \lambda>0 if its density function is given by f_X(x)=\lambda e^{-\lambda x} for x\geq 0 (and f_X(x)=0 for x<0).

We write it as X \sim Expo(\lambda)

Gamma distribution: A random variable X is said to have a gamma distribution if its density function is given by f_X(x)=\frac{\lambda}{\Gamma(r)} (\lambda x)^{r-1}e^{-\lambda x} for all x\geq 0,

where r>0 and \lambda >0 are called the parameters of the gamma distribution.

We write it as X \sim gam(\lambda; r) or X\sim G(\lambda ; r)

Remark 1. \Gamma(r) is called the gamma function and is defined as \Gamma(r)=\int_0^{\infty} x^{r-1}e^{-x}dx

It is easy to verify that \Gamma(1)=1, \Gamma(a+1)=a\Gamma(a) for a>0, and \Gamma(n)=(n-1)! when n is a positive integer.
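These identities are easy to confirm with Python's standard `math.gamma` (a quick check, not part of the original notes; the value a = 2.5 is an arbitrary choice):

```python
from math import gamma, factorial, isclose

# Γ(1) = 1
print(gamma(1.0))  # 1.0

# Γ(a+1) = a·Γ(a)
a = 2.5
print(isclose(gamma(a + 1), a * gamma(a), rel_tol=1e-12))  # True

# Γ(n) = (n-1)! for positive integers n
print(all(isclose(gamma(n), factorial(n - 1)) for n in range(1, 11)))  # True
```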

Remark 2. Taking r=1, we see that the gamma density function becomes the exponential density function, since \Gamma(1)=1.

Theorem: If X has an exponential distribution, then  

E[X]=\frac{1}{\lambda}, var[X]=\frac{1}{\lambda^2}~ and~ M_X(t)=\frac{\lambda}{\lambda-t} ~for~ t< \lambda 

Proof: We have E[X]=\int_0^{\infty} xf_X(x) dx = \int_0^{\infty}x \lambda e^{-\lambda x}dx, \lambda>0

Integrating by parts, we get

E[X]=\lambda \left| x \left(\frac{e^{-\lambda x}}{-\lambda}\right)\right|_0^{\infty}+\int_0^{\infty} 1. e^{-\lambda x} dx = 0-\frac{1}{\lambda}\left| e^{-\lambda x}\right|_0^{\infty}=-\frac{1}{\lambda}(0-1)

E[X]=\frac{1}{\lambda} ....... (1)

Now E[X^2]=\int_0^{\infty} x^2 f_X(x) dx =\int_0^{\infty} x^2 \lambda e^{-\lambda x} dx

Integrating by parts, we get

E[X^2]=\lambda \left| x^2 \left( \frac{e^{-\lambda x}}{-\lambda}\right)\right|_0^{\infty}+ 2 \int_0^{\infty} x. e^{-\lambda x} dx=0+0+\frac{2}{\lambda} \int_0^{\infty} e^{-\lambda x}dx

E[X^2]=\frac{2}{\lambda}.\frac{1}{\lambda}=\frac{2}{\lambda^2}

Therefore, Var[X]=E[X^2]-(E[X])^2=\frac{2}{\lambda^2}-\frac{1}{\lambda^2}=\frac{1}{\lambda^2}
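A Monte Carlo sanity check of E[X]=\frac{1}{\lambda} and var[X]=\frac{1}{\lambda^2} (a sketch using Python's `random.expovariate`; the seed and sample size are arbitrary choices, not from the notes):

```python
import random
from statistics import fmean, pvariance

random.seed(0)
lam = 0.5
xs = [random.expovariate(lam) for _ in range(200_000)]

print(round(fmean(xs), 2))      # close to 1/λ = 2.0
print(round(pvariance(xs), 1))  # close to 1/λ² = 4.0
```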

The m.g.f. of X is given by

M_X(t) = E[e^{tX}]=\int_0^{\infty} e^{tx}f_X(x) dx=\int_0^{\infty} e^{tx}\left(\lambda e^{-\lambda x}\right)dx

M_X(t)=\lambda \int_0^{\infty} e^{-(\lambda-t)x}dx,~~~ \lambda >t

M_X(t)=- \frac{\lambda}{\lambda-t}\left| e^{-(\lambda-t)x}\right|_0^{\infty}=-\frac{\lambda}{\lambda-t}(0-1)

Hence, M_X(t)=\frac{\lambda}{\lambda - t}, ~ for ~\lambda > t.
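The closed form can also be checked against a direct numerical integration of E[e^{tX}] (a midpoint-rule sketch; the step size and upper cutoff are arbitrary choices that make the tail negligible):

```python
import math

lam, t = 2.0, 0.5          # requires t < λ
dx, upper = 1e-4, 40.0
n = int(upper / dx)

# midpoint Riemann sum of e^{tx}·λ·e^{-λx} = λ·e^{-(λ-t)x} over [0, 40]
numeric = lam * dx * sum(math.exp(-(lam - t) * (i + 0.5) * dx) for i in range(n))
exact = lam / (lam - t)

print(round(numeric, 4), round(exact, 4))  # both ≈ 1.3333
```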

Theorem: If X has gamma distribution with parameters r and \lambda, then

E[X]=\frac{r}{\lambda},~ var[X]=\frac{r}{\lambda^2}~ and~ M_X(t)=\left(\frac{\lambda}{\lambda-t}\right)^r~ for ~t<\lambda

Theorem: The sum of independent gamma variates with a common parameter \lambda is also a gamma variate.

Hint: Let X_i \sim G(\lambda; r_i), i=1, 2, \ldots, n, be independent. Then M_{\sum X_i}(t)=\prod_{i=1}^n M_{X_i}(t)=\left(\frac{\lambda}{\lambda-t}\right)^{\sum_{i=1}^n r_i}, so \sum_{i=1}^n X_i \sim G\left(\lambda; \sum_{i=1}^n r_i\right).
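The claim can be illustrated by simulation (a sketch with `random.gammavariate`, which takes the shape r and the scale \frac{1}{\lambda}; the parameter values are arbitrary, and matching mean and variance is only a sanity check, not a proof):

```python
import random
from statistics import fmean, pvariance

random.seed(1)
lam, r1, r2 = 2.0, 1.5, 2.5

# X1 ~ G(λ; r1), X2 ~ G(λ; r2) independent; claim: X1 + X2 ~ G(λ; r1+r2)
sums = [random.gammavariate(r1, 1 / lam) + random.gammavariate(r2, 1 / lam)
        for _ in range(200_000)]

print(round(fmean(sums), 2))      # close to (r1+r2)/λ  = 2.0
print(round(pvariance(sums), 2))  # close to (r1+r2)/λ² = 1.0
```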

Example 1. If X has exponential distribution with mean 2, find P[X<1|X<2]

Solution: We are given that \frac{1}{\lambda}=2 \Rightarrow \lambda=\frac{1}{2}

Now P[X<1|X<2]=\frac{P[(X<1)\cap (X<2)]}{P[X<2]}=\frac{P[X<1]}{P[X<2]}=\frac{\int_0^1 \lambda e^{-\lambda x}dx}{\int_0^2 \lambda e^{-\lambda x}dx}

P[X<1|X<2]=\frac{-\frac{1}{\lambda}\left|e^{-\lambda x}\right|_0^{1}}{-\frac{1}{\lambda}\left|e^{-\lambda x}\right|_0^{2}}= \frac{1-e^{-\lambda}}{1-e^{-2\lambda}}=\frac{1}{1+e^{-\lambda}}

Hence, P[X<1|X<2]=\frac{1}{1+e^{-\frac{1}{2}}}
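Numerically, the direct ratio and the simplified form agree (a quick Python check):

```python
import math

lam = 0.5  # mean 2 implies λ = 1/2
direct = (1 - math.exp(-lam)) / (1 - math.exp(-2 * lam))
simplified = 1 / (1 + math.exp(-lam))

print(round(direct, 4), round(simplified, 4))  # 0.6225 0.6225
```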

Exercises:

1. If X has exponential distribution with P[X\leq 1]=P[X>1], then find var[X].

2.  Find the median of the exponential distribution. 

3. If X\sim Expo(\lambda), find the value of k such that \frac{P[X>k]}{P[X\leq k]}=a.

