Tuesday, 28 July 2020

Bivariate Distributions

Joint Distribution function: Let X and Y be two random variables defined on the same probability space $(\Omega, \mathcal{A}, P[\cdot])$. Then (X, Y) is called a two-dimensional random variable. The joint cumulative distribution function or joint distribution function of X and Y, denoted by $F_{X,Y}(x,y)$, is defined as

$$F_{X,Y}(x,y)= P[X \leq x, Y \leq y]~ \forall x, y \in R $$

It may be observed that the joint distribution function is a function of two variables and its domain is the xy-plane. Sometimes we write $F_{X,Y}(x,y)$ as $F(x,y)$.

Properties of Joint Distribution function:

1.    If $x_1<x_2$ and $y_1<y_2$ then (rectangle rule)
$$ P[x_1 <X \leq x_2, y_1<Y\leq y_2]=F(x_2,y_2)-F(x_2,y_1)-F(x_1,y_2)+F(x_1, y_1) \geq 0$$
2.    (a) $F(-\infty, y)=\lim_{x \to -\infty} F(x,y)=0 ~\forall y \in R$
       (b) $F(x, -\infty)=\lim_{y \to -\infty}F(x,y)=0 ~\forall x \in R$
       (c) $F(\infty, \infty)= \lim_{x\to \infty, y\to \infty}F(x,y)=1$
3. $F(x, y)$ is right continuous in each argument, i.e.
$$\lim_{h\to 0_+} F(x+h, y) = \lim_{h \to 0_+} F(x, y+h)=F(x, y)$$
Remark: Any function of two variables which fails to satisfy one of the above three conditions is not a joint distribution function.
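As a quick numerical illustration (not part of the original notes), the Python sketch below checks properties 1 and 2 for the genuine joint distribution function $F(x,y)=(1-e^{-x})(1-e^{-y}),~x,y>0$, of two independent $Expo(1)$ variates; the choice of $F$ and of the test points is ours.

```python
import numpy as np

# Joint CDF of two independent Expo(1) variables (illustrative choice)
def F(x, y):
    return (1 - np.exp(-x)) * (1 - np.exp(-y)) if (x > 0 and y > 0) else 0.0

# Property 1 (rectangle rule): must be non-negative for x1 < x2, y1 < y2
x1, x2, y1, y2 = 0.5, 1.5, 0.2, 2.0
rect = F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1)
print(rect >= 0)                      # True

# Property 2: limits at -infinity are 0, limit at (+inf, +inf) is 1
print(F(-10.0, 1.0), F(1.0, -10.0))   # 0.0 0.0
print(F(50.0, 50.0))                  # ~1.0
```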

Example 1. Show that the bivariate function: 
$F(x, y) = \left\{\begin{array}{ll} e^{-(x+y)} & \quad x>0, y>0 \\ 0 & \quad otherwise\end{array}\right.$
is not a joint distribution function.
Solution: As $x \to \infty$ and $y \to \infty$, we have $F(x,y)=e^{-(x+y)} \to 0 \neq 1$, so property 2(c) fails. Hence $F(x, y)$ is not a joint distribution function.

Monday, 27 July 2020

Multinomial, Exponential and Gamma distributions

Multinomial Distribution: This distribution can be regarded as a generalization of Binomial distribution.

When there are more than two exclusive outcomes of a trial, the observation leads to Multinomial distribution.

Suppose $E_1, E_2, \ldots, E_k$ are mutually exclusive and exhaustive outcomes of a trial with respective probabilities $p_1, p_2, \ldots, p_k$. The probability that $E_1$ occurs $x_1$ times, $E_2$ occurs $x_2$ times, ..., $E_k$ occurs $x_k$ times in $n$ independent trials is given by

$$P[x_1, x_2, \ldots, x_k]=C\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}, \quad \text{where } \sum x_i =n$$

and $C$ is the number of permutations of the events $E_1, E_2,\ldots,E_k$, given by $C=\frac{n!}{x_1!x_2!\cdots x_k!}$.

Therefore, $P[x_1,x_2,....,x_k]=\frac{n!}{\prod_{i=1}^k x_i!} \prod_{i=1}^k p_i^{x_i}, 0\leq x_i \leq n$

Also, these probabilities sum to 1: summing over all $(x_1, \ldots, x_k)$ with $\sum x_i=n$ gives, by the multinomial theorem, $(p_1+p_2+\cdots+p_k)^n=1$, since $\sum p_i =1$.

Example 1. We are given a bag of marbles containing 5 red marbles, 4 white marbles and 3 blue marbles. Calculate the probability that in 6 trials we choose 3 red marbles, 1 white marble and 2 blue marbles, replacing each marble after it is chosen.

Solution: Here the total number of marbles is 12. Therefore the probabilities of selecting a red, white or blue marble are $p_r=\frac{5}{12}, p_w=\frac{4}{12}$ and $p_b=\frac{3}{12}$ respectively.

Let $E_r, E_w$ and $E_b$ denote the events of choosing a red, a white and a blue marble. The numbers of marbles to be chosen are $x_r=3, x_w=1$ and $x_b=2$, and the number of trials is $n=6=x_r+x_w+x_b$.

Now the number of permutations of the events $E_r, E_w$ and $E_b$ is $C=\frac{6!}{3!\,1!\,2!}$

Therefore, the probability that $E_r, E_w$ and $E_b$ occur 3 times, 1 time and 2 times respectively is

$P[x_r=3, x_w=1, x_b=2]=C. p_r^{x_r} p_w^{x_w} p_b^{x_b}$

$P[x_r=3, x_w=1, x_b=2]=\frac{6!}{3!\,1!\,2!} \left( \frac{5}{12}\right)^3 \left(\frac{4}{12}\right)^1 \left(\frac{3}{12}\right)^2$

$$P[x_r=3, x_w=1, x_b=2]\approx 0.0904$$
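As a cross-check (an illustrative sketch, not part of the notes), the probability of Example 1 can be reproduced with scipy.stats.multinomial; the counts and probabilities are those of the example.

```python
from scipy.stats import multinomial

# Example 1: n = 6 draws with replacement, p = (5/12, 4/12, 3/12)
p = [5/12, 4/12, 3/12]
rv = multinomial(n=6, p=p)
print(rv.pmf([3, 1, 2]))   # approximately 0.0904
```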

Exercise: We randomly draw cards from an ordinary deck of cards. Every time we pick one, we place it back in the deck. We do this 5 times. What is the probability of drawing 1 heart, 1 spade, 1 club and 2 diamonds? Ans. 0.0586

Exponential and Gamma distributions

Exponential distribution: A random variable X is said to have an exponential distribution with parameter $\lambda>0$ if its density function is given by $f_X(x)=\lambda e^{-\lambda x}$ for all $x\geq 0$.

We write it as $X \sim Expo(\lambda)$

Gamma distribution: A random variable X is said to have a gamma distribution if its density function is given by $f_X(x)=\frac{\lambda}{\Gamma(r)} (\lambda x)^{r-1}e^{-\lambda x}$ for all $x\geq 0$,

where $r>0$ and $\lambda >0$ are called the parameters of the gamma distribution.

We write it as $X \sim gam(\lambda; r)$ or $X\sim G(\lambda ; r)$

Remark 1. $\Gamma(r)$ is called the gamma function and is defined as $\Gamma(r)=\int_0^{\infty} x^{r-1}e^{-x}dx$

It is easy to verify that $\Gamma(1)=1$, $\Gamma(a+1)=a\Gamma(a)$ and $\Gamma(n)=(n-1)!$ when $n$ is a positive integer.

Remark 2. Taking $r=1$, we see that gamma density function becomes exponential density function.
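Remark 2 can be seen numerically: in scipy's parametrization, the notes' $G(\lambda; r)$ corresponds to gamma(a=r, scale=1/λ), and with $r=1$ its density coincides with the $Expo(\lambda)$ density. A small sketch (the value $\lambda=2$ is an arbitrary choice):

```python
import numpy as np
from scipy.stats import gamma, expon

lam = 2.0                        # arbitrary rate, for illustration only
x = np.linspace(0.01, 5, 200)

# G(lambda; r) in the notes' parametrization = scipy gamma(a=r, scale=1/lambda)
g = gamma.pdf(x, a=1, scale=1/lam)
e = expon.pdf(x, scale=1/lam)    # Expo(lambda) density lambda*exp(-lambda*x)

print(np.allclose(g, e))         # True: r = 1 reduces the gamma density to the exponential
```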

Theorem: If X has an exponential distribution, then  

$$E[X]=\frac{1}{\lambda}, var[X]=\frac{1}{\lambda^2}~ and~ M_X(t)=\frac{\lambda}{\lambda-t} ~for~ t< \lambda$$ 

Proof: We have $E[X]=\int_0^{\infty} xf_X(x) dx = \int_0^{\infty}x \lambda e^{-\lambda x}dx, \lambda>0$

Integrating by parts, we get

$$E[X]=\lambda \left| x \left(\frac{e^{-\lambda x}}{-\lambda}\right)\right|_0^{\infty}+\int_0^{\infty} 1\cdot e^{-\lambda x} dx = 0-\frac{1}{\lambda}\left| e^{-\lambda x}\right|_0^{\infty}=-\frac{1}{\lambda}(0-1)$$

$$E[X]=\frac{1}{\lambda} \quad \ldots (1)$$

Now $E[X^2]=\int_0^{\infty} x^2 f_X(x) dx =\int_0^{\infty} x^2 \lambda e^{-\lambda x} dx$

Integrating by parts, we get

$$E[X^2]=\lambda \left| x^2 \left( \frac{e^{-\lambda x}}{-\lambda}\right)\right|_0^{\infty}+ 2 \int_0^{\infty} x\cdot e^{-\lambda x} dx=0+0+\frac{2}{\lambda} \int_0^{\infty} e^{-\lambda x}dx, ~\text{integrating by parts again}$$

$$E[X^2]=\frac{2}{\lambda}.\frac{1}{\lambda}=\frac{2}{\lambda^2}$$

Therefore, $Var[X]=E[X^2]-(E[X])^2=\frac{2}{\lambda^2}-\frac{1}{\lambda^2}=\frac{1}{\lambda^2}$

The m.g.f. of X is given by

$$ M_X(t) = E[e^{tX}]=\int_0^{\infty} e^{tx}f_X(x) dx=\int_0^{\infty} e^{tx}\left(\lambda e^{-\lambda x}\right)dx $$

$$ M_X(t)=\lambda \int_0^{\infty} e^{-(\lambda-t)x}dx,~~~ \lambda >t$$

$$ M_X(t)=- \frac{\lambda}{\lambda-t}\left| e^{-(\lambda-t)x}\right|_0^{\infty}=-\frac{\lambda}{\lambda-t}(0-1)$$

Hence, $$ M_X(t)=\frac{\lambda}{\lambda - t}, \quad \text{for } \lambda > t.$$
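The three results of this theorem can be reproduced symbolically with sympy (a sketch, not part of the original notes; the substitution $s=\lambda-t>0$ is used to encode the condition $t<\lambda$):

```python
import sympy as sp

x, lam = sp.symbols('x lambda', positive=True)
f = lam * sp.exp(-lam * x)                       # Expo(lambda) density

EX  = sp.integrate(x * f, (x, 0, sp.oo))         # 1/lambda
EX2 = sp.integrate(x**2 * f, (x, 0, sp.oo))      # 2/lambda**2
var = sp.simplify(EX2 - EX**2)                   # 1/lambda**2

# m.g.f.: write lambda - t = s > 0 so that the integral converges
s = sp.symbols('s', positive=True)
M = sp.integrate(lam * sp.exp(-s * x), (x, 0, sp.oo))   # lambda/s = lambda/(lambda - t)

print(EX, var, M)                                # 1/lambda, lambda**(-2), lambda/s
```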

Theorem: If X has gamma distribution with parameters $r$ and $\lambda$, then

$$E[X]=\frac{r}{\lambda},~ var[X]=\frac{r}{\lambda^2}~ and~ M_X(t)=\left(\frac{\lambda}{\lambda-t}\right)^r~ for ~t<\lambda$$

Theorem: Show that the sum of independent gamma variates with the same parameter $\lambda$ is also a gamma variate.

Hint: Let $X_i \sim G(\lambda; r_i)$ for $i=1, 2, \ldots, n$ be independent. Then $M_{\sum X_i}(t)=\prod_{i=1}^n M_{X_i}(t)=\left(\frac{\lambda}{\lambda-t}\right)^{\sum r_i}$, which is the m.g.f. of $G\left(\lambda; \sum_{i=1}^n r_i\right)$.
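A Monte Carlo sanity check of this theorem (an illustrative sketch; the parameters $\lambda, r_1, r_2$ and the sample size are arbitrary choices): samples of $X_1+X_2$ should be indistinguishable from $G(\lambda;\, r_1+r_2)$.

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(0)
lam, r1, r2 = 1.5, 2.0, 3.5            # arbitrary illustrative parameters
n = 100_000

x1 = gamma.rvs(a=r1, scale=1/lam, size=n, random_state=rng)
x2 = gamma.rvs(a=r2, scale=1/lam, size=n, random_state=rng)

# Compare the empirical distribution of X1 + X2 with G(lambda; r1 + r2)
stat, pval = kstest(x1 + x2, gamma(a=r1 + r2, scale=1/lam).cdf)
print(pval > 0.05)                      # typically True, as the theorem predicts
```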

Example 1. If X has an exponential distribution with mean 2, find $P[X<1\,|\,X<2]$.

Solution: We are given that $\frac{1}{\lambda}=2 \Rightarrow \lambda=\frac{1}{2}$

Now $P[X<1|X<2]=\frac{P[(X<1)\cap (X<2)]}{P[X<2]}=\frac{P[X<1]}{P[X<2]}=\frac{\int_0^1 \lambda e^{-\lambda x}dx}{\int_0^2 \lambda e^{-\lambda x}dx}$

$P[X<1|X<2]=\frac{\left|-e^{-\lambda x}\right|_0^1}{\left|-e^{-\lambda x}\right|_0^2}= \frac{1-e^{-\lambda}}{1-e^{-2\lambda}}=\frac{1}{1+e^{-\lambda}}$

Hence, $P[X<1|X<2]=\frac{1}{1+e^{-\frac{1}{2}}}$
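The answer can be cross-checked with scipy.stats.expon (a quick sketch using $\lambda=\frac{1}{2}$ as in the example):

```python
import math
from scipy.stats import expon

lam = 0.5
X = expon(scale=1/lam)                    # Expo(1/2), i.e. mean 2

p = X.cdf(1) / X.cdf(2)                   # P[X < 1 | X < 2]
print(p, 1/(1 + math.exp(-lam)))          # both approximately 0.6225
```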

Exercises:

1. If X has an exponential distribution with $P[X\leq 1]=P[X>1],$ then find $var[X]$.

2.  Find the median of the exponential distribution. 

3. If $X\sim Expo (\lambda)$, find the value of $k$ such that $\frac{P[X>k]}{P[X\leq k]}=a.$


Thursday, 16 July 2020

Normal Distribution

Normal Distribution: A random variable X whose probability density function is given by

$$\phi (x)=\frac{1}{\sigma \sqrt{2\pi}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty <x< \infty \quad \ldots (1)$$

is called a normal variate. We also say that X is normally distributed with parameters $\mu$ and $\sigma^2$, and is denoted as $X\sim N(\mu, \sigma^2)$. The continuous distribution given by eq. (1) is called the normal distribution.

Remark 1. The cumulative distribution function of $X\sim N(\mu, \sigma^2)$ is denoted by $\Phi(x)$. Thus

$$ \Phi(x) = P[X\leq x]=\int_{-\infty}^{x}\phi (u) du$$

Remark 2. It can be verified that $\int_{-\infty}^{\infty} \phi(x)dx=  1$

Remark 3. The curve $y=\frac{1}{\sigma \sqrt{2\pi}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}$ is called the normal curve and possesses the following properties:

i.  The curve is bell shaped and symmetrical about the line $x=\mu$ 

ii.       The maximum ordinate is $\frac{1}{\sigma \sqrt{2\pi}}$, which occurs at $x=\mu$

iii.       Mean, median and mode of the normal distribution coincide i.e. at $x=\mu$

Standard Normal Variate: If $X\sim N(\mu,\sigma^2)$, then $Z=\frac{X-\mu}{\sigma}$ is called the standard normal variate, with $E[Z]=0$ and $var[Z]=1$, and is written as $Z \sim N(0,1)$.

The p.d.f. of the standard normal variate Z is given by $\phi(z)=\frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}z^2}, ~ -\infty<z<\infty$

and the corresponding distribution function is given by $\Phi(z)=P[Z \leq z] = \int_{-\infty}^{z} \phi (u)du$

Remark: Since $\int_{-\infty}^{\infty} \phi(z) dz=1$, where $\phi(z)=\frac{1}{\sqrt{2 \pi}} e^{-\frac{1}{2}z^2}$,

Hence, $\int_{-\infty}^{\infty} e^{-\frac{1}{2} z^{2}} d z=\sqrt{2 \pi}$
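This identity can be confirmed by numerical quadrature (a quick check, not part of the original notes):

```python
import numpy as np
from scipy.integrate import quad

# Integrate exp(-z^2/2) over the whole real line and compare with sqrt(2*pi)
val, err = quad(lambda z: np.exp(-0.5 * z**2), -np.inf, np.inf)
print(val, np.sqrt(2 * np.pi))   # both approximately 2.5066
```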

Theorem: If $X \sim N\left(\mu, \sigma^{2}\right)$ , then $E[X]=\mu, var[x]=\sigma^2$ and $M_{X}(t)=e^{\mu t+\frac{1}{2} \sigma^{2} t^{2}}$

Proof: The m.g.f. of X is given by $M_{X}(t)=E\left[e^{tX}\right]=\int_{-\infty}^{\infty}e^{tx}\cdot \phi(x) dx$

$M_{X}(t)=\frac{1}{\sigma \sqrt{2 \pi}}\int_{-\infty}^{\infty}e^{tx}\cdot e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}dx= \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{t(\mu+\sigma z)}\cdot e^{-\frac{1}{2} z^2}dz$, where $z=\frac{x-\mu}{\sigma}$

$M_{X}(t)=\frac{e^{t \mu}}{\sqrt{2 \pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}\left(z^2-2t\sigma z \right)}dz=\frac{e^{t\mu}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}\left((z-\sigma t)^{2}-\sigma^{2} t^{2}\right)} d z$

$M_{X}(t)=\frac{e^{t \mu+\frac{1}{2} \sigma^2  t^2}}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}(z-\sigma t)^{2}} d z \quad(\text{put } u=z-\sigma t)$

$M_{X}(t)=\frac{e^{t \mu+\frac{1}{2} \sigma^{2} t^{2}}}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2} u^{2}} d u=e^{t \mu+\frac{1}{2} \sigma^{2} t^{2}} \times 1=e^{t \mu+\frac{1}{2} \sigma^{2} t^{2}} \quad \ldots (1)$

Hence, $M_{X}(t)=e^{t \mu+\frac{1}{2} \sigma^{2} t^{2}}$ is the required m.g.f. of the normal distribution.

Differentiating (1) with respect to $t$, we get

$$M_{X}^{\prime}(t)=e^{t\mu+\frac{1}{2}\sigma^2 t^2}(\mu + t \sigma^2) \Rightarrow E[X] =\left. M_{X}^{\prime}(t)\right|_{t=0}=\mu \quad \ldots (2)$$

Differentiating (2) with respect to $t$, we get

$$M_{X}^{\prime \prime} (t)=e^{t\mu +\frac{1}{2}\sigma^2 t^2} (\mu+t\sigma^2)^2 + e^{t\mu +\frac{1}{2}\sigma^2 t^2}\sigma^2$$

$$ E[X^2]=\left. M_{X}^{\prime \prime}(t)\right|_{t=0}=\mu^2+\sigma^2 \quad \ldots (3)$$

Hence, $var[X]=E[X^2]-(E[X])^2=\mu^2 +\sigma^2-\mu^2=\sigma^2$
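The same moments can be recovered by differentiating the m.g.f. symbolically with sympy (a sketch; $\mu$ and $\sigma$ are kept symbolic):

```python
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)
M = sp.exp(mu * t + sp.Rational(1, 2) * sigma**2 * t**2)   # normal m.g.f.

EX  = sp.diff(M, t).subs(t, 0)        # mu
EX2 = sp.diff(M, t, 2).subs(t, 0)     # mu**2 + sigma**2
var = sp.simplify(EX2 - EX**2)        # sigma**2
print(EX, sp.expand(EX2), var)
```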

Theorem: If $X\sim N(\mu, \sigma^2),$ then $P[a<X<b]=\Phi\left(\frac{b-\mu}{\sigma}\right)-\Phi\left(\frac{a-\mu}{\sigma}\right)$

Theorem: If $X\sim N(0, 1)$, then $P[a <X <b]=\Phi(b)-\Phi(a)=\int_a^b \phi(z)dz$

Area under the normal curve: The definite integral $\xi(z)=\int_0^z \phi(u)du$ is called the normal probability integral; it gives the area under the standard normal curve between the ordinates $Z=0$ and $Z=z$. By symmetry, it also equals the area between the ordinates $Z=-z$ and $Z=0$.

The values of $\xi(z)=\frac{1}{\sqrt{2\pi}}\int_0^z e^{-\frac{1}{2}u^2}du$ for different values of $z$, at intervals of 0.01, are tabulated in the standard normal table.


Important Results:


1.    $P[-z_1 \leq Z \leq z_1 ]=2P[0\leq Z \leq z_1]=2P[-z_1 \leq Z \leq 0]$
2.    $P[-z_1 \leq Z \leq 0]=P[0 \leq Z \leq z_1]$
3.    $P[Z\leq -z_1]=P[z_1 \leq Z]$
4.    $P[Z\leq z_1]=0.5+P[0\leq Z \leq z_1]$
5.    $P[-z_1 \leq Z]=0.5+P[-z_1 \leq Z \leq 0]$
6.    $P[0\leq Z \leq z_1]\approx 0.5$ and $P[-z_1\leq Z \leq 0]\approx 0.5$ whenever $z_1\geq 2.5$ (approximately)

    While solving problems, we are generally given that $X$ is a normal variate with mean $\mu$ and standard deviation $\sigma$. We convert $X$ into the standard normal variate $Z$ by means of the transformation $Z=\frac{X-\mu}{\sigma}$, and then express the required probability in the form $P[0\leq Z \leq z]$ in order to make use of the normal table.
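In modern practice the table lookup $P[0\leq Z \leq z]=\xi(z)$ can be replaced by scipy.stats.norm; a small sketch (the helper name table_area is ours):

```python
from scipy.stats import norm

def table_area(z):
    """Area under the standard normal curve from 0 to z, i.e. P[0 <= Z <= z]."""
    return norm.cdf(z) - 0.5

print(table_area(0.6))    # ~0.2257, matching the tabulated value xi(0.6)
print(table_area(1.0))    # ~0.3413
print(table_area(1.28))   # ~0.3997, i.e. about 0.4
```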

Example 1. If X is normally distributed with mean 2 and variance 1, find $P[|X-2|<1]$.

Solution: We have $\mu=2$ and $\sigma=1$. Let $Z=\frac{X-\mu}{\sigma}=X-2$

Now, $P[|X-2|<1]=P[2-1<X<2+1]=P[1<X<3]=P[1-2<X-2<3-2]=P[-1<Z<1]$

$\Rightarrow P[|X-2|<1]=2P[0<Z<1]$

$\Rightarrow P[|X-2|<1]=2 \times 0.3413 = 0.6826$ (from the table)

Example 2: If $X$ is a normal random variable with mean $50$ and variance $100$, find $P[Y \leq 3137]$,

where $Y=X^2+1$ and $\xi(0.6)=0.2258$.

Solution: Given that $\mu =50$ and $\sigma=10$.

Therefore, $P[Y\leq 3137]=P[X^2+1 \leq 3137]=P[X^2 \leq 3136]$

$$ P[Y\leq 3137]=P[X^2 \leq 56^2] = P[-56 \leq X \leq 56]$$

$$ P[Y \leq 3137]= P\left[\frac{-56-50}{10}\leq \frac{X-50}{10} \leq \frac{56-50}{10}\right]=P[-10.6 \leq Z \leq 0.6]$$ 

$$P[Y\leq 3137] = P[-10.6\leq Z \leq 0] + P[0\leq Z \leq 0.6], ~\text{by putting}~ Z=\frac{X-\mu}{\sigma}$$

$$ P[Y\leq 3137]=0.5 +\xi(0.6) = 0.5 +0.2258$$

$$ P[Y\leq 3137] = 0.7258$$
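A direct check of Example 2 with scipy.stats.norm (a sketch; $X\sim N(50, 10^2)$ as in the example):

```python
from scipy.stats import norm

# P[Y <= 3137] = P[-56 <= X <= 56] with X ~ N(50, 10^2)
p = norm.cdf(56, loc=50, scale=10) - norm.cdf(-56, loc=50, scale=10)
print(p)   # approximately 0.7257
```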

Example 3: If $\log_e X$ is normally distributed with mean 1 and variance 4, find $P[\frac{1}{2} <X <2]$. Given that $\log_e 2=0.693$.

Solution: $P[\frac{1}{2}<X<2] = P[\log_e\frac{1}{2} < \log_e X<\log_e 2]=P[-0.693<\log_e X<0.693]$

$$ P[\frac{1}{2}<X<2] =P[-0.693 < Y < 0.693], ~\text{where}~ Y=\log_e X \sim N(1, 4)$$

$$P[\frac{1}{2}<X<2]=P\left[\frac{-0.693-1}{2}<\frac{Y-1}{2}<\frac{0.693-1}{2}\right], ~\text{taking}~ Z=\frac{Y-\mu}{\sigma}=\frac{Y-1}{2} \sim N(0, 1)$$

$$P\left[\frac{1}{2}  <X<2 \right] = P[-0.85 < Z <-0.15] = P[0.15<Z<0.85], ~\text{by symmetry}$$

$$P\left[\frac{1}{2}  <X<2 \right] =\xi(0.85) - \xi(0.15)=0.3023- 0.0596 = 0.2427$$

Example 4. The local authorities of Chapra installed 2000 electric lamps in the streets of the city. The lamps have an average life of 1000 burning hours with an S.D. of 200 hours. Assume that the lives of the lamps are normally distributed. (i) What number of lamps might be expected to fail in the first 700 burning hours? (ii) After what period of burning hours would we expect that (a) 10% of the lamps would have failed, and (b) 10% of the lamps would still be burning?

Solution: (i) Let X denote the life of lamps, then $X\sim N(1000, 200^2)$

$$P[X < 700]=P\left[ \frac{X-1000}{200}<\frac{700-1000}{200}\right] =P[Z<-1.5],~ where ~ Z=\frac{X-\mu}{\sigma}=\frac{X-1000}{200}$$ 

$$ P[X<700] = P[Z>1.5]=0.5-P[0\leq Z \leq 1.5] = 0.5 - 0.433, ~ from~table$$

$$P[X<700] = 0.067$$

Thus the number of lamps expected to fail in the first 700 burning hours $=2000 \times 0.067=134$.

(ii) (b) Let $t$ denote the period in hours after which 10% of the lamps would still be burning. Then

$P[X> t] = 0.1 \Rightarrow P[X \leq t] =1.0 -0.1 =0.9$

$\Rightarrow P\left[ \frac{X-1000}{200}\leq \frac{t-1000}{200} \right] = 0.9 \Rightarrow P\left[Z \leq \frac{t-1000}{200}\right]=0.9$

$\Rightarrow P \left[0 \leq Z \leq \frac{t-1000}{200}\right]=0.9-0.5=0.4=P[0\leq Z \leq 1.28]$

$\Rightarrow \frac{t-1000}{200}=1.28 \Rightarrow t=1256$ (from the table)

Hence 10% of the lamps would still be burning after 1256 burning hours.

(a) By symmetry of the normal curve about the mean, 10% of the lamps would have failed after $1000-256 = 744$ burning hours.
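All three answers of Example 4 can be verified with norm.cdf and norm.ppf (a sketch; $X\sim N(1000, 200^2)$):

```python
from scipy.stats import norm

X = norm(loc=1000, scale=200)

print(2000 * X.cdf(700))   # ~133.6 lamps expected to fail within 700 hours
print(X.ppf(0.90))         # ~1256.3 hours: 10% of lamps still burning beyond this time
print(X.ppf(0.10))         # ~743.7 hours: 10% of lamps have failed by this time
```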

Example 5. Of a large group of men, 5% are under 60 inches in height and 40% are between 60 and 65 inches. Assuming a normal distribution, find the mean height and standard deviation.

Solution: If $X \sim N(\mu, \sigma^2)$, then we are given $P[X<60]=0.05$ and $P[60<X<65]=0.40$

$\Rightarrow P[X < 60 ] = 0.05$ and $P[X<65]=0.05+0.40=0.45$

$\Rightarrow P\left[\frac{X-\mu}{\sigma}<\frac{60-\mu}{\sigma}\right]=0.05$ and $P\left[\frac{X-\mu}{\sigma}<\frac{65-\mu}{\sigma}\right]=0.45$ (putting $Z=\frac{X-\mu}{\sigma}$)

$\Rightarrow P\left[Z<\frac{60-\mu}{\sigma}\right]=0.05$ and $P\left[Z< \frac{65-\mu}{\sigma}\right]=0.45$

$\Rightarrow P\left[\frac{60-\mu}{\sigma}<Z<0 \right] = 0.5-0.05$ and $P\left[\frac{65-\mu}{\sigma}<Z<0\right]=0.5-0.45$

$\Rightarrow P\left[\frac{60-\mu}{\sigma}<Z<0 \right] = 0.45$ and $P\left[\frac{65-\mu}{\sigma}<Z<0\right]=0.05$

$\Rightarrow P\left[0<Z<\frac{\mu-60}{\sigma}\right] = 0.45$ and $P\left[0<Z<\frac{\mu-65}{\sigma}\right]=0.05$ (due to symmetry)

$\Rightarrow P\left[0<Z<\frac{\mu-60}{\sigma}\right]=P[0<Z<1.645]$ and $P\left[0<Z<\frac{\mu-65}{\sigma}\right]=P[0<Z<0.13]$ (from the table)

$\Rightarrow \frac{\mu-60}{\sigma}=1.645$ and $\frac{\mu-65}{\sigma}=0.13$

$\Rightarrow \mu= 65.42$ and $\sigma = 3.29.$
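Example 5 amounts to solving two simultaneous quantile equations; with exact $z$-values from norm.ppf the answer is essentially the same (a sketch, not part of the original notes):

```python
import numpy as np
from scipy.stats import norm

# P[X < 60] = 0.05 and P[X < 65] = 0.45 give two linear equations in mu, sigma:
#   60 = mu + z1*sigma,  65 = mu + z2*sigma
z1, z2 = norm.ppf(0.05), norm.ppf(0.45)          # about -1.645 and -0.126
A = np.array([[1.0, z1], [1.0, z2]])
mu, sigma = np.linalg.solve(A, np.array([60.0, 65.0]))
print(mu, sigma)                                  # approximately 65.4 and 3.3
```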

Exercises:

1.      X is a normal variate with mean 30 and S.D. 5. Find the probability that: 

        (i) $26\leq X \leq 40$,  (ii) $X\geq 15$ and  (iii) $|X-30|>5$  

2. If $\log_{10}X\sim N(4, 2^2)$ and $\log_{10}1202=3.08$, $\log_{10}8318=3.92$, then find $P[1.202<X<83180000].$

3.  Let $X$ be the life in hours of a radio tube. Assume that $X$ is normally distributed with mean 200 and variance $\sigma^2$. A purchaser of such radio tubes requires that at least 90% of the tubes have lives exceeding 150 hours. What is the largest value that $\sigma$ can take and still have the purchaser satisfied?

4. Suppose that the diameters of the shafts manufactured by a certain machine are normal random variables with mean 10 cm and S.D. 0.1 cm. If for a given application the shaft must meet the requirement that its diameter fall between 9.9 and 10.2 cm, what proportion of the shafts made by this machine will meet the requirement?

5. Skulls are classified as A, B and C according as the length-breadth index is under 75, between 75 and 80, and over 80. Find approximately the mean and S.D. of a series in which A are 58%, B are 38% and C are 4%. (Assume that the distribution is normal and use the table.)

6. Show that for a normal distribution $\beta_1=0$ and $\beta_2=3$

7. Fit a normal curve to the following data (i.e. find $\mu$, $\sigma$ and the normal p.d.f.):

    $x$ (mid-point)            2                4                6                8                10

    $f$ (frequency)            1                4                6                4                  1


Questions for 1st Sem

Topic: Beta and Gamma Functions

Q1. Evaluate $\int_0^1 x^4 (1-\sqrt{x})dx$

Q2. Evaluate $\int_0^1 (1-x^3)^{-\frac{1}{2}}dx$

Q3. Show that $\...