Thursday 25 June 2020

Continuous Random Variable and Expectation

Continuous Random Variable and Probability Density Function: A random variable $X$ with distribution function $F_X(.)$ is called continuous if there exists a function $f_X(.):R \rightarrow [0,\infty)$ such that $$F_X(x)=\int_{-\infty}^{x}f_X(t)dt~~ for~ all~ x\in R  ....(1)$$

The function $f_X(.)$ or $f(x)$ is called the probability density function (p.d.f.) of $X$, or simply the density function of $X$. From relation (1), we observe that $f_X(x)=\frac{dF_X(x)}{dx}$.

Properties of p.d.f.:

  1. $f(x)\geq 0$ for all $x \in R$
  2. $\int_{-\infty}^{\infty} f(x) dx=1$
  3. $P[a< X \leq b]=\int_a^b f(x) dx$ for $a<b$.

Various measures of central tendency, dispersion, moments and expected value:

Let $X$ be a random variable with density function $f_X(x).$

    1. Mean of $X$, denoted by $\mu_x$ or $E(X)$, is defined as:

        $\mu_x$ or $E(X)=\int_{-\infty}^{\infty} xf_X(x) dx.$ ($X$ is a continuous random variable)

        Similarly, $E(X^2)=\int_{-\infty}^{\infty} x^2 f_X(x)dx$

        If $X$ is a discrete random variable with mass points $x_1, x_2,...., x_n,...;$ then

        $\mu_x$ or $E(X)=\sum_{i=1}^{\infty} x_i f_X(x_i)$

     2. Variance of $X$, denoted by $\sigma_X^2$ or $var[X],$ is defined as 

        $\sigma_X^2$ or $var[X]=\int_{-\infty}^{\infty}(x-\mu_x)^2f_X(x) dx$  ($X$ is a continuous random variable and $\mu_x$ is the mean)

        If $X$ is discrete random variable with mass points $x_1, x_2, ....x_n,.....;$ then

        $\sigma_X^2$ or $var[X]=\sum_{i=1}^{\infty}(x_i -\mu_x)^2 f_X(x_i).$

         Also $var[X]=E[X^2]-(E[X])^2$. This is a useful formula for determining $var[X]$ (see the numerical sketch after this list).

    3. Standard deviation of $X$, denoted by $\sigma_x$ is defined as $\sigma_x =+\sqrt{Var[X]}$

    4. Median (M) of a continuous random variable $X$ is given by the relation

        $\int_{-\infty}^M f(x)dx=\frac{1}{2}=\int_M^{\infty} f(x) dx$

    5. Mean deviation about the mean $\mu_x$ is defined as M.D. $=\int_{-\infty}^{\infty} |x-\mu_x|f_X(x)dx$

    6. The first and third quartiles, denoted by $Q_1$ and $Q_3$ respectively are given by

$$\int_{-\infty}^{Q_1}f(x)dx=\frac{1}{4}~and~\int_{-\infty}^{Q_3} f(x)dx=\frac{3}{4}$$

    7. Mode is the value of $X$ for which $f(x)$ is maximum. The modal value of $x$ is given by the relations:

$$f_X'(x)=0~ and~ f_X''(x)<0.$$

    8. The expectation or expected value of the function $g(X)$ of a random variable $X$ with $f_X(x)$ as p.d.f., denoted by $E[g(X)]$, is defined as:

            i) $E[g(X)]=\sum_{n=1}^{\infty} g(x_n)f_X(x_n)$, where $X$ is a discrete random variable with mass points $x_1, x_2,...., x_n,.......$ (provided the series is absolutely convergent).

            ii) $E[g(X)]=\int_{-\infty}^{\infty} g(x)f_X(x)dx$, where $X$ is a continuous random variable (provided $\int_{-\infty}^{\infty} |g(x)|f_X(x)dx < \infty$).
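
The measures listed above can be checked numerically for a concrete density. Below is a minimal sketch (assuming SciPy is available, with an arbitrarily chosen rate $\tau = 2$) that computes the mean, variance, standard deviation and median of the exponential density $f(x)=\tau e^{-\tau x}$, $x\geq 0$, the same density used in Example 2 below.

```python
import numpy as np
from scipy import integrate, optimize

tau = 2.0                                   # illustrative rate parameter (assumption)
f = lambda x: tau * np.exp(-tau * x)        # density on [0, infinity)

# Mean: E[X] = integral of x f(x) dx
mean, _ = integrate.quad(lambda x: x * f(x), 0, np.inf)

# Variance via the shortcut var[X] = E[X^2] - (E[X])^2
ex2, _ = integrate.quad(lambda x: x**2 * f(x), 0, np.inf)
var = ex2 - mean**2
sd = np.sqrt(var)

# Median M solves integral_0^M f(x) dx = 1/2
cdf = lambda m: integrate.quad(f, 0, m)[0]
median = optimize.brentq(lambda m: cdf(m) - 0.5, 0, 10)

print(mean, 1 / tau)              # both ~ 0.5
print(var, 1 / tau**2)            # both ~ 0.25
print(median, np.log(2) / tau)    # both ~ 0.3466
```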

Properties of Expectation:

  1. $E[c]=c$, c being a constant.
  2. $E[c.g(x)]=c.E[g(x)],$ c being a constant.
  3. $E[c_1.g_1(x)+c_2.g_2(x)]=c_1.E[g_1(x)]+c_2.E[g_2(x)],$ where $c_1$ and $c_2$ are any real constants.
  4. $E[g_1(x)]\leq E[g_2(x)]$, provided $g_1(x)\leq g_2(x)~ \forall x\in R$.
  5. If $g(x)=x$ then $E[g(x)]=E[X]$ is the mean of $X$.
  6. If $g(x)=(x-\mu_x)^2$, then $E[g(x)]=var[X]$.
  7. If $g(x)=(x-\mu_x)^r$, then $E[g(x)]=\mu_r$, which is the rth moment about the mean (the rth central moment).
  8. If $g(x)=(x-a)^r$, then $E[g(x)]=\mu_r^{'}$, which is rth moment about the point $x=a$.
  9. If $g(x)=x^r$, then $E[g(x)]=E[X^r]=\mu_r^{'}$, which is the rth moment about the point $x=0$.
Example 1. The function $$f(x)=\alpha e^{-\alpha x}I_{(0, \infty)}(x), \quad \alpha >0$$ is a p.d.f.

Solution: Consider $$\int_{-\infty}^{\infty} f(x) dx = \int_{-\infty}^{\infty} \alpha e^{-\alpha x} I_{(0, \infty)}(x)dx =\int_0^{\infty}\alpha e^{-\alpha x}dx=\left[ -e^{-\alpha x}\right]_0^{\infty}=-(0-1)=1$$

Hence the given function is a p.d.f.

Example 2. Let $X$ be a continuous random variable with p.d.f. $$f(x)=\tau e^{-\tau x} ~ for~ x\geq 0.$$ Find $E[X]$.

$$ E[X]=\int_0^{\infty} x.f(x) dx = \int_0^{\infty} x. \tau e^ {-\tau x} dx$$
$$=\tau \left[ \left[ \frac{x e^{-\tau x}}{-\tau}\right]_0^{\infty} +\frac{1}{\tau} \int_0^ {\infty} e^{-\tau x}dx\right] = \tau \left[0-\frac{1}{\tau ^2}\left[e^{-\tau x}\right]_0^{\infty}\right]=\tau \cdot \frac{1}{\tau^2}=\frac{1}{\tau}$$
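
A brief symbolic confirmation of both results (a sketch, assuming SymPy is available; Example 1's $\alpha$ and Example 2's $\tau$ play the same role):

```python
import sympy as sp

x, a = sp.symbols('x alpha', positive=True)
f = a * sp.exp(-a * x)                       # density of Examples 1 and 2

print(sp.integrate(f, (x, 0, sp.oo)))        # 1        -> f is a p.d.f. (Example 1)
print(sp.integrate(x * f, (x, 0, sp.oo)))    # 1/alpha  -> E[X] = 1/tau  (Example 2)
```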
 

Wednesday 24 June 2020

Distribution Functions

    

Random Variable: A random variable on a given probability space $(\Omega, \tilde{A}, P[.])$, denoted by $X$, is a function $X(.):\Omega \rightarrow R$ such that the set $\{w \in \Omega : X(w) \leq x \}$ belongs to $\tilde{A}$ for each $x\in R$.

It is customary to write the set $\{w \in \Omega: X(w)\leq x\}$ as $\{X \leq x\}$.

Remarks 
  1.  A random variable $X$ is a function such that $X(w)$ is a real number for each outcome $w$ of $\Omega$.
  2. For each real number $x$, the set $\{w\in \Omega : X(w) \leq x\}$ is an event, since it belongs to $\tilde{A}$.
  3. Every real-valued function defined on $\Omega$ may not be a random variable.

Example 1. The sample space of the experiment of tossing a coin is $\Omega = \{H, T\}$.

Define $X(.) : \Omega \rightarrow R$ as   $X(H)=1$ and $X(T)=0$.    .......(1)
Thus $X(.)$ associates a real number with each outcome of the experiment. 
Now we show that $A_x=\{w\in \Omega : X(w)\leq x\} \in \tilde{A}$ for each $x\in R$.
We know $\tilde{A}=\{\phi, \{H\}, \{T\}, \Omega\}$
Using (1), we see that for $x<0$, $A_x = \phi \in \tilde{A}$,
                                    for $0\leq x <1$, $A_x=\{T\}\in \tilde{A}$,
                                    for $x\geq 1$, $A_x = \{H, T\}=\Omega \in \tilde{A}$.
Hence $A_x \in \tilde{A}$ for each $x\in R$ and so $X(.)$ is a random variable.

Example 2. Consider a sample space  $(\Omega, \tilde{A}, P[.])$, where $\Omega = \{a, b, c, d\}$ and $\tilde{A}=\{\phi, \{a\}, \{b, c, d\}, \Omega\}$
Define $X(.) : \Omega \rightarrow R$ as $X(a)=0, X(b)=0$ and $X(c)=X(d)=1$    .......(1)
Thus $X(.)$ associates a real number with each outcome of the experiment. 
Now we check whether $A_x=\{w\in \Omega : X(w)\leq x\}$ belongs to $\tilde{A}$ for each $x\in R$.
We know $\tilde{A}=\{\phi, \{a\}, \{b, c, d\}, \Omega\}$
Using (1), we see that for $x<0$, $A_x = \phi \in \tilde{A}$,
                                      for $0 \leq x < 1$, $A_x = \{a, b\} \notin \tilde{A}$.
Hence $A_x \notin \tilde{A}$ for some $x\in R$ and so $X(.)$ is not a random variable.


Exercises:

  1. The sample space of the experiment of tossing a die is $\Omega=\{1, 2, 3, 4, 5, 6\}$. Show that the function $X(.) : \Omega \rightarrow R$ defined as $X(w)=w;~ w=1, 2, 3, 4, 5, 6$ is a random variable.

  2. The constant function $X(.):\Omega \rightarrow R$ defined as $X(w)= k~ \forall~ w~ \in \Omega$, is a random variable.

  3. Give two examples of real-valued functions on a sample space, one of which is a random variable and the other is not.

Indicator Function: If A is any subset of $\Omega$, then the indicator function of $A$, denoted by $I_A(.)$, is a function $I_A(.): \Omega \rightarrow \{0, 1\}$ such that 
$$I_A(w) = \left\{ \begin{array}{ll} 1, & if~ w\in A\\0, & if~ w\notin A \end{array}\right.$$
Examples:  
  1. $I_{(0,1)}(x)=\left \{ \begin{array}{ll} 1, & if~0<x<1 \\ 0, & otherwise \end{array} \right.$
  2.  The function $f:R\rightarrow R$ defined by $f(x) = \left\{\begin{array}{ll}0, & for~ x\leq 0\\1, & for~ 0<x<1\\2, & for~ x\geq 1\end{array}\right.$
can be written as $f(x) = I_{(0,1)}(x)+2I_{[1, \infty)}(x).$

Remark:  $I_A$ is a random variable.

Properties of Indicator Function:

  1. $I_{\Omega}(w) = 1, I_{\phi}(w)=0$
  2. $I_{\bar{A}}(w) = 1-I_A(w)$ for each $A\in \tilde{A}$.
  3. $I_{A\cup B}(w) = \max\{I_A(w), I_B(w)\}$
  4. $I_A^2(w) = I_A (w)$ for $A \in \tilde{A}$
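
The definition and the properties above can be illustrated on a small finite sample space; the sets below are purely illustrative (a sketch in plain Python):

```python
# A minimal sketch of indicator functions on a finite sample space (illustrative choice).
Omega = {'a', 'b', 'c', 'd'}
A = {'a', 'b'}
B = {'b', 'c'}

def indicator(S):
    """Return I_S as a function of an outcome w."""
    return lambda w: 1 if w in S else 0

I_A, I_B = indicator(A), indicator(B)
I_union = indicator(A | B)          # I_{A union B}
I_comp = indicator(Omega - A)       # I_{A-bar}

for w in Omega:
    assert I_comp(w) == 1 - I_A(w)              # property 2
    assert I_union(w) == max(I_A(w), I_B(w))    # property 3
    assert I_A(w) ** 2 == I_A(w)                # property 4
```
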
Cumulative Distribution Function: The cumulative distribution function, or simply distribution function, of a random variable X, denoted by $F_X(.)$, is defined as a function $F_X(.):R\rightarrow [0,1]$ such that
$F_X(x)=P[X\leq x], \forall x\in R$.
Recall that $\{X\leq x\}=\{w:X(w) \leq x\}$.
This implies that $0\leq F_X(x)\leq 1$. Sometimes we write $F(x)$ instead of $F_X(x)$.

Properties of Distribution Function:

    If $F_X(.)$ is the distribution function of the random variable $X$ and $a < b,$ then
  1. $F_X(a) \leq F_X(b)~for~ a<b \Rightarrow$ Distribution function is monotonically increasing function.
  2. $F_X(.)$ is continuous from the right, i.e., $\lim_{h\rightarrow 0^+} F_X(x+h)=F_X(x)$.
  3. $F_X(-\infty) = lim_{x\rightarrow -\infty} F_X(x)=0$ and $F_X(\infty) = lim_{x\rightarrow\infty} F_X(x)=1.$
Discrete random variable: A random variable $X$ is said to be discrete if the range of $X$ is countable, i.e. $X$ takes the values $x_1, x_2, x_3,...x_n,...$. The countable values of $X$ are called the mass points of $X$. The distribution function of a discrete random variable $X$ is called a discrete distribution function.

Discrete Density Function: If $X$ is a discrete random variable with distinct values $x_1, x_2,..., x_n,....$ then the function $f_X(.):R\rightarrow [0, 1],$ satisfying

$$f_X(x) = \left\{ \begin{array}{ll}P[X=x_i], & if~ x=x_i,~ i=1, 2,...\\ 0, & if~x\neq x_i~ \forall i \in N\end{array}\right.$$
is called the discrete density function of $X$, or the probability mass function, or the probability function of $X$. The value $f_X(x_i)$ or $f(x_i)$ is called the mass associated with the mass point $x_i$, and the mass function satisfies
  1. $f(x_i)>0~ for~ i=1, 2,....., $
  2. $f(x)=0~for~x\neq x_i, i=1, 2,....$
  3. $\sum_{i=1}^{\infty} f(x_i) = 1$ 

Example 1. A random variable has the following probability distribution:

| $x$    | 0   | 1    | 2    | 3    | 4    | 5     | 6     | 7     | 8     |
|--------|-----|------|------|------|------|-------|-------|-------|-------|
| $p(x)$ | $k$ | $3k$ | $5k$ | $7k$ | $9k$ | $11k$ | $13k$ | $15k$ | $17k$ |
  1. Determine the value of $k$.
  2. Find $P[X < 4], P[X\geq 5], P[0<X<4]$
  3. Find the distribution function.
  4. Find the smallest value of $x$ for which $P[X \leq x]>\frac{1}{2}$.
Solution: 
  1. Here $f(x)=p(x) \Rightarrow \sum p(x) = 1 \Rightarrow [k+3k+5k+7k+9k+11k+13k+15k+17k]=1$  $\Rightarrow k=\frac{1}{81}$.
  2. $P[X<4]=P[X=0]+P[X=1]+P[X=2]+P[X=3]=k+3k+5k+7k=\frac{16}{81}$
     $P[X\geq 5] = P[X=5] + P[X= 6]+ P[X=7] +P[X=8]=11k+13k+15k+17k = \frac{56}{81}$
     $P[0<X<4]=P[X=1] + P[X=2]+P[X=3] = 3k +5k+7k = \frac{15}{81}$
  3. If $F(x)$ is the distribution function then
| $x$               | 0    | 1    | 2    | 3     | 4     | 5     | 6     | 7     | 8 |
|-------------------|------|------|------|-------|-------|-------|-------|-------|---|
| $f(x)=p(x)$       | 1/81 | 3/81 | 5/81 | 7/81  | 9/81  | 11/81 | 13/81 | 15/81 | 17/81 |
| $F(x)=P[X\leq x]$ | 1/81 | 4/81 | 1/9  | 16/81 | 25/81 | 4/9   | 49/81 | 64/81 | 1 |

        4. Since $\frac{4}{9} < \frac{1}{2}$ and $\frac{49}{81}>\frac{1}{2}$, we find from the distribution table that the smallest $x$ for which $F(x) = P[X\leq x] >\frac{1}{2}$ is $x=6$.
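
A short sketch reproducing these calculations with exact fractions (Python standard library only):

```python
from fractions import Fraction
from itertools import accumulate

# p(x) = (2x+1) k for x = 0,...,8; the coefficients sum to 81, so k = 1/81.
coeffs = [2 * x + 1 for x in range(9)]          # 1, 3, 5, ..., 17
k = Fraction(1, sum(coeffs))                    # 1/81
p = [c * k for c in coeffs]

print(sum(p[:4]))                               # P[X < 4]     = 16/81
print(sum(p[5:]))                               # P[X >= 5]    = 56/81
print(sum(p[1:4]))                              # P[0 < X < 4] = 15/81

F = list(accumulate(p))                         # distribution function F(x)
print(next(x for x, Fx in enumerate(F) if Fx > Fraction(1, 2)))   # smallest x with F(x) > 1/2 -> 6
```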


Exercises:

  1.  Let $p(x)$ be the probability function of a discrete random variable $X$ which assumes the values $x_1, x_2, x_3, x_4$   such that $2p(x_1)=3p(x_2)=p(x_3)=5p(x_4)$. Find probability distribution and cumulative distribution of $X$.

  2. Let $X$ be a random variable such that $P[X=-2]=P[X=-1], P[X=2]=P[X=1]$ and $P[X>0]=P[X<0]=P[X=0]$. Obtain the probability mass function and its distribution function.

  3. A random variable $X$ can take all non-negative integral values, and the probability that $X$ takes the value $r$ is proportional to $\alpha^r$ where $(0<\alpha <1)$. Find $P[X=0]$.

  4. A random variable $X$ takes values 0, 1, 2, ....., with probability proportional to $(x+1)(\frac{1}{5})^x$. Find the probability that $X\leq 5.$



*** THE END ***

Conditional Probability, Independent Events and Bayes' Rule

Conditional Probability: Let A and B be two events in a probability space $(\Omega, \tilde{A}, P[.])$. The conditional probability of event A given event B, denoted by $P[A/B]$, is defined by

$$P[A/B]=\frac{P[AB]}{P[B]}, if ~P[B] >0$$

Similarly $P[B/A]=\frac{P[AB]}{P[A]}, if ~P[A] >0$ is the conditional probability of event B given event A.

From the above relation, we see that $P[AB]=P[A/B]P[B]= P[B/A]P[A]$.

Independent Events: Two events A and B defined on a probability space $(\Omega, \tilde{A}, P[.])$ are said to be independent if $P[AB]=P[A\cap B]=P[A] P[B]$.

Similarly: Any $n$ events $A_1, A_2,...., A_n$ defined on a probability space $(\Omega, \tilde{A}, P[.])$ are said to be independent if and only if the following conditions are satisfied:

$P[A_i A_j] = P[A_i]P[A_j], ~ for ~ i\neq j$

$P[A_i A_j A_k] = P[A_i] P[A_j]P[A_k]~ for i\neq j, i\neq k, j\neq k$

....

....

$P[A_1 A_2 .... A_n]=P[A_1]P[A_2]P[A_3]....P[A_n]$.

Theorem: Show that the following conditions are equivalent:

  1. $P[AB]=P[A]P[B]$
  2. $P[A/B]= P[A], ~if ~P[B]>0$.
  3. $P[B/A]= P[B], ~if~ P[A] >0 $
Proof. First, we prove that $(1) \Rightarrow (2)$

Let $P[AB]=P[A]P[B]$                                                                ....(1)
By definition $P[A/B]= \frac{P[AB]}{P[B]}$                             ....(2)
From (1) and (2), $P[A/B]=P[A]$. Hence $(1) \Rightarrow (2)$.

Next we show that $(2) \Rightarrow (3)$
Let $P[A/B]=P[A]$                                                                        ....(3)
By definition of conditional probability, we have
$P[AB]=P[A/B]P[B]=P[B/A]P[A]$                                               ....(4)
Therefore, using (3) in (4), $P[B/A]P[A]=P[A]P[B],$ 
which gives $P[B/A]=P[B]$, since $P[A]>0$.
Hence $(2) \Rightarrow (3)$
Lastly, we show that $(3)\Rightarrow (1)$
$P[B/A]=P[B]$, where $P[A]>0 $.                                                .....(5)
From (4) and (5), we have $P[AB]= P[A] P[B]$ for $P[A]>0$ 
If $P[A]=0,$ then $P[A]P[B]=0$ and also $P[AB] = 0$, since $AB \subseteq A$.
Hence $P[AB] = P[A]P[B].$

Theorem: If A and B are independent events, then

  1. A and $\bar{B}$ are independent.
  2.  $\bar{A}$ and B are independent and
  3. $\bar{A}$ and $\bar{B}$ are independent.
Example 1. If A and B are independent and $P[A] = P[B] =\frac{1}{2},$ what is $P[A\bar{B} \cup \bar{A}B]$?

Solution: Since A and B are independent, all of $A, B, \bar{A},$ and $\bar{B}$ are independent.

Consequently, $P[A\bar{B}]=P[A]P[\bar{B}]=\frac{1}{2}(1-\frac{1}{2})=\frac{1}{4}$, and similarly $P[\bar{A}B]=\frac{1}{4}$.

Now $P[A\bar{B}\cup\bar{A}B] = P[A\bar{B}]+P[\bar{A}B]-P[A\bar{B}\cap\bar{A}B] = \frac{1}{4}+\frac{1}{4}-P[\phi]=\frac{1}{2}$, as $A\bar{A}=\phi$ and $\bar{B}B=\phi$.
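
For a concrete check, one can model A and B as the results of two independent fair coin flips and enumerate the four equally likely outcomes (an illustrative construction, not part of the original example):

```python
from fractions import Fraction
from itertools import product

# Outcome (a, b): a = 1 means A occurs, b = 1 means B occurs; the flips are independent and fair.
Omega = list(product([0, 1], repeat=2))
prob = {w: Fraction(1, 4) for w in Omega}

# Event  A B-bar  union  A-bar B  (exactly one of A, B occurs)
event = [w for w in Omega if (w[0] and not w[1]) or (not w[0] and w[1])]
print(sum(prob[w] for w in event))      # 1/2
```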

Example 2. Five percent of the people have high blood pressure. Of the people with high blood pressure, $75\%$ drink alcohol; whereas only $50\%$ of the people without high blood pressure drink alcohol. What percent of the drinkers have high blood pressure?

Solution: Let  A denote the event that people have high blood pressure and B denote the people who drink alcohol.

We have $P[A]=0.05, P[B/A]=0.75, P[B/\bar{A}]=0.50$

We have to find $P[A/B]=\frac{P[A\cap B]}{P[B]}$              ......(1)

We know $B = AB \cup \bar{A}B$ and $AB \cap \bar{A}B =\phi$

$\Rightarrow P[B] = P[AB] + P[\bar{A}B]= P[A\cap B] + P[\bar{A}\cap B]$....... (2)

Now $P[B/A]=0.75 \Rightarrow \frac{P[B\cap A]}{P[A]}=0.75 \Rightarrow P[A\cap B]= 0.75 \times 0.05 = 0.0375$ .........(3)

Also $P[B/\bar{A}]=0.50 \Rightarrow \frac{P[B\cap \bar{A}]}{P[\bar{A}]}=0.50 \Rightarrow P[B\cap \bar{A}]=\frac{1}{2}(1-P[A])=\frac{1}{2}(1-0.05)=0.475$ ....... (4)

Therefore, by (2), $P[B]=P[A\cap B]+P[B\cap \bar{A}]=0.0375 +0.475 = 0.5125$, using (3) and (4).

Hence, by (1), $P[A/B]=\frac{P[A\cap B]}{P[B]}=\frac{0.0375}{0.5125}=\frac{3}{41}\approx 0.073$

Hence the required percentage is about $7.3\%$.
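
The same computation in a few lines of plain Python (no extra libraries):

```python
p_A = 0.05             # P[high blood pressure]
p_B_given_A = 0.75     # P[drinks alcohol | high blood pressure]
p_B_given_notA = 0.50  # P[drinks alcohol | no high blood pressure]

# Theorem of total probability, then Bayes' formula
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)
p_A_given_B = p_B_given_A * p_A / p_B

print(p_B)             # 0.5125
print(p_A_given_B)     # ~0.0732, i.e. about 7.3%
```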

Exercises:

  1. If $P[A] = P[B] = P[B/A]=0.5$, are A and B independent?
  2. If $P[A]=a, P[B]=b,$ then show that $P[A/B] \geq (a+b-1)/b$.
  3. Suppose A and B are events for which $P[A]=p_1$, $P[B]=p_2$, and $P[A \cap B]=p_3.$ Evaluate
      1. $P[\bar{A}\cap B]$
      2. $P[\bar{A}\cup B]$
      3. $P[A\cap \bar{B}]$
      4. $P[\bar{A}\cap \bar{B}]$
      5. $P[\overline{A \cap B}]$
      6. $P[\overline{A \cup B}]$
      7. $P[\bar{A}\cup \bar{B}]$
      8. $P[A/B]$
      9. $P[B/\bar{A}]$
      10. $P[\bar{A}/\bar{B}]$
      11. $P[\bar{A}\cap (A\cup B)]$
      12. $P[A\cup (\bar{A}\cap B)]$
  4. Suppose an urn contains $M$ balls of which $K$ are black and $M-K$ are white. A sample of size $n$ is drawn with replacement. Find the probability that the $j$-th ball drawn is black, given that the sample contains $k$ black balls.

Theorem of Total Probability:

Let $B_1, B_2,...., B_n$ be a collection of mutually disjoint events in the probability space $(\Omega, \tilde{A}, P[.])$ such that $\Omega = \cup_{j=1}^n B_j$ and $P[B_j]>0, j=1, 2,...,n.$

Then $P[A]= \sum_{j=1}^n P[A/B_j]P[B_j]$ for each $A \in \tilde{A}$

Proof: We have $A = A \cap \Omega = A \cap (\cup_{j=1}^n B_j)=\cup_{j=1}^n (AB_j)$. Since the events $AB_j$ are mutually disjoint, $P[A]=\sum_{j=1}^{n} P[AB_j]$    .... (1)

By definition, $P[AB_j]=P[A/B_j]P[B_j]$   ......(2)

From (1) and (2), we get $P[A] =\sum_{j=1}^n P[A/B_j]P[B_j]$.

Corollary: If $A, B \in \tilde{A};$ then $P[A]=P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}], P[B]>0$

Proof : We have $\Omega = B \cup \bar{B}$, where $B$ and $\bar{B}$ are mutually disjoint.

Hence by the above theorem, $P[A]=P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}], P[B]>0$

Bayes' Theorem:

Let $B_1, B_2,....., B_n$ be a collection of mutually disjoint events in the probability space $(\Omega, \tilde{A}, P[.])$  such that $\Omega = \cup_{j=1}^n B_j$ and $P[B_j]>0, j=1, 2,....., n.$

Then for each $A \in \tilde{A}$ satisfying $P[A]>0$, we have

$P[B_k/A] = \frac{P[A/B_k]P[B_k] }{\sum_{j=1}^n P[A/B_j]P[B_j]}$; this is known as Bayes' formula.

Proof: By the definition of conditional probability, we have

$P[B_k/A]=\frac{P[B_k A]}{P[A]}$ and $P[A/B_k]=\frac{P[AB_k]}{P[B_k]}$ with $P[A]>0$ and $P[B_k]>0$            ......(1)

Using these two, we obtain $P[B_k/A]=\frac{P[A/B_k]P[B_k]}{P[A]}$ ......(2)

By the theorem of total probability, $P[A]=\sum_{j=1}^n P[A/B_j]P[B_j]$, and hence from (2),

$P[B_k/A]=\frac{P[A/B_k]P[B_k]}{\sum_{j=1}^n P[A/B_j]P[B_j]}$

Hence proved.

Corollary: If $A, B \in \tilde{A}$, then $P[B/A]=\frac{P[A/B]P[B]}{P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}]}, P[B]> 0$.

Proof: We have $\Omega = B \cup \bar{B}$, where $B$ and $\bar{B}$ are mutually disjoint.

Hence by Bayes' theorem, $P[B/A]=\frac{P[A/B]P[B]}{P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}]}, P[B]> 0$.

Example 1: Suppose $B_1, B_2$ and $B_3$ are mutually exclusive and exhaustive events. If $P[B_k]=\frac{1}{3}$ and $P[A/B_k]=\frac{k}{6}$ for $k=1, 2, 3$, what is $P[A]$?

Solution:  By the theorem of total probability, we have

$$ P[A] = \sum_{k=1}^3 P[A/B_k]P[B_k]=\sum_{k=1}^3 \frac{k}{6}\times \frac{1}{3} =\frac{1}{3}.$$

Example 2. The probability that a person can hit the target is 3/5 and the probability that another person can hit the same target is 2/5. But the first person can fire 8 shots in a given time while the second person fires 10 shots. They fire together. What is the probability that the second person shoots the target?

Solution: Let E denote the event that the target is hit, and let $E_1$ and $E_2$ denote the events that the shot is fired by the first person and by the second person, respectively. We are given

$$ P[E/E_1]=\frac{3}{5}~ and ~ P[E/E_2] = \frac{2}{5}$$

The ratio of the shots of the first person to those of the second person in the same time is $\frac{8}{10}=\frac{4}{5}$. Thus $P[E_1]=\frac{4}{5}P[E_2].$ By Bayes' theorem we get

$$P[E_2/E]=\frac{P[E/E_2][P[E_2]}{P[E/E_1]P[E_1]+P[E/E_2]P[E_2]}=\frac{\frac{2}{5} P[E_2]}{\frac{3}{5}\times \frac{4}{5}P[E_2]+\frac{2}{5}P[E_2]}$$

$$P[E_2/E]=\frac{5}{11}$$
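
A quick check of the arithmetic with exact fractions; only the ratio $P[E_1]:P[E_2]$ matters, so $P[E_2]$ is given an arbitrary positive weight (an assumption for the sketch):

```python
from fractions import Fraction

p_E2 = Fraction(1)                      # arbitrary positive weight
p_E1 = Fraction(4, 5) * p_E2            # first person fires 8 shots to the second's 10

num = Fraction(2, 5) * p_E2             # P[E/E_2] P[E_2]
den = Fraction(3, 5) * p_E1 + Fraction(2, 5) * p_E2
print(num / den)                        # 5/11
```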

Example 3. An urn contains 10 white and three black balls, while another urn contains 3 white and 5 black balls. Two balls are drawn from the first urn and put into the second urn and then a ball is drawn from the latter. What is the probability that it is a white ball?

Solution: The two balls drawn from the first urn may be:

(i) both white or (ii) both black or (iii) one white and one black.

Let these events be denoted by A, B, C respectively. Then 

$$P[A]=\frac{10C_2}{13C_2}=\frac{15}{26}, ~~ P[B]=\frac{3C_2}{13C_2}=\frac{1}{26}, ~~ P[C]=\frac{10C_1 \times 3C_1}{13C_2}=\frac{10}{26}$$

After the transfer, the second urn contains, respectively: (i) 5 white and 5 black balls, or (ii) 3 white and 7 black balls, or (iii) 4 white and 6 black balls.

Let W denote the event of drawing a white ball from the second urn in the above three cases. Then

 $$P[W/A]=\frac{5}{10}, ~~ P[W/B]=\frac{3}{10},~~ P[W/C]=\frac{4}{10}$$

Hence, $P[W] = P[W/A]P[A]+P[W/B]P[B]+P[W/C]P[C] $

$$=\frac{5}{10}\times \frac{15}{26} + \frac{3}{10}\times\frac{1}{26}+\frac{4}{10}\times\frac{10}{26}=\frac{118}{260}=\frac{59}{130}$$
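
The whole calculation can be verified with exact fractions (Python standard library only):

```python
from fractions import Fraction
from math import comb

total = comb(13, 2)
P_A = Fraction(comb(10, 2), total)                 # both white:      15/26
P_B = Fraction(comb(3, 2), total)                  # both black:       1/26
P_C = Fraction(comb(10, 1) * comb(3, 1), total)    # one of each:     10/26

P_W = Fraction(5, 10) * P_A + Fraction(3, 10) * P_B + Fraction(4, 10) * P_C
print(P_W)                                          # 59/130
```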

Exercises:

  1. An urn contains $a$ white and $b$ black balls, while another urn contains $c$ white and $d$ black balls. One ball is transferred from the first urn into the second urn and then a ball is drawn from the latter. What is the probability that it will be a white ball?
  2. Three urns $A_1, A_2, A_3$ contain respectively 3 red, 4 white, 1 blue; 1 red, 2 white, 3 blue; 4 red, 3 white, 2 blue balls. One urn is chosen at random and a ball is withdrawn. It is found to be red. Find the probability that it comes from the urn  $A_2$.  
  3. An insurance company insured 2000 scooter drivers, 4000 car drivers, and 6000 truck drivers. The probability of an accident involving a scooter, a car, and a truck are 0.01, 0.03 and 0.15 respectively. One of the insured people meets with an accident. What is the probability that he is a scooter driver?
  4.  In a bolt factory, machines A, B, C manufacture respectively 25, 35 and 40 percent of the total output. Of their output, 5, 4 and 2 percent respectively are defective bolts. A bolt is drawn from the produce and is found defective. What are the probabilities that it was manufactured by A, B and C?
  5. Suppose that in answering a question in a multiple choice test, an examinee knows the answer with probability $p$ or guesses with probability $1-p$. Assume that the probability of answering a question correctly is unity for an examinee who knows the answer and $1/m$ for an examinee who guesses, where $m$ is the number of multiple-choice alternatives. Show that the probability that an examinee knew the answer to a question, given that he has correctly answered it, is $\frac{mp}{1+(m-1)p}$.
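
A numerical sanity check of the formula in Exercise 5, for one illustrative choice of $p$ and $m$ (the values are assumptions, not part of the exercise):

```python
p, m = 0.3, 4                              # illustrative values (assumption)

p_correct = p * 1 + (1 - p) * (1 / m)      # total probability of a correct answer
p_knows_given_correct = p / p_correct      # Bayes' rule: P[knows | correct]

print(p_knows_given_correct)               # ~0.6316
print(m * p / (1 + (m - 1) * p))           # same value, from the closed form
```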

*** THE END ***

Classical Approach to Probability



Random Experiment and Events: A random experiment is an experiment which, when repeated under essentially identical conditions, does not give a unique result but may result in any one of several possible outcomes. These outcomes are known as events or cases. Events are denoted by A, B, C, etc. For example:
  1. Getting a head (H) or a tail (T) is an event when we toss a coin.
  2. Getting any of six faces: 1, 2, 3, 4, 5, 6 is an event when we throw a die.
  3. Getting an ace or a king or a queen is an event when we draw a card from a pack of well-shuffled cards.

Exhaustive events: All possible outcomes of a random experiment (or trial) are known as exhaustive events. For example:

  1. There are two exhaustive events head (H) and tail(T) when tossing a coin.
  2. There are six exhaustive events, 1, 2, 3, 4, 5, 6 when we throw a die.

Favourable events: The events which cause the happening of a particular event A, are called the favourable events to the event A. For example:

  1. There are three favourable events for the occurrence of an even number (or an odd number) in the throwing of a die.
  2. When we draw a card from a pack of cards, there are four favourable events for drawing an ace; there are 12 favourable events for drawing a face card (king, queen, or jack).

Mutually exclusive events: Such events where the occurrence of one rules out the occurrence of the other, are called mutually exclusive events. For example:

  1. In tossing a coin there are two mutually exclusive events, for if the head comes up in a trial, then the tail cannot come up in the same trial, and vice versa.
  2. There are 52 mutually exclusive events in drawing a card from a pack of cards.

Equally likely events: Events are said to be equally likely if none of them is expected to occur in preference to the others. For example:

  1. In tossing a coin, there are two equally likely events H and T.
  2. In tossing a die, there are six equally likely events 1, 2, 3, 4, 5, 6.

Classical Definition of Probability: If there are $n$ exhaustive, mutually exclusive and equally likely events, out of which $m$ are favourable to the happening of an event A, then the probability of happening of A, denoted by P[A], is defined as 

$$P[A]=\frac{Number~of~favourable~cases}{Number~of~exhaustive~cases}=\frac{m}{n}$$

Example 1. Find the probability of getting an even number in a throw of a single die.

Solution: Total number of exhaustive cases $n=6$

Total number of favourable cases $m=3$

Required probability $=\frac{m}{n}=\frac{3}{6}=\frac{1}{2}$

Example 2. From a pack of 52 cards, two cards are drawn at random. Find the chance that one is a king and the other a queen.

Solution: Total number of cases is $n=52C_2$

Since there are 4 kings and 4 queens, so the number of favourable cases is $m = 4C_1 \times 4C_1$ 

Required Probability $=\frac{m}{n}=\frac{4C_1 \times 4C_1}{52C_2}=\frac{16}{1326}=\frac{8}{663}$
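
A quick check with `math.comb` from the Python standard library:

```python
from math import comb
from fractions import Fraction

# One king and one queen out of two cards drawn from 52
print(Fraction(comb(4, 1) * comb(4, 1), comb(52, 2)))   # 8/663
```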

Exercises:

  1. In a single throw with two dice, find the probability of getting a total of 10.
  2. Two cards are drawn at random from a well-shuffled pack of 52 cards. Find the probability of getting two aces.
  3. If $n$ biscuits are distributed at random among $N$ beggars, find the chance that a particular beggar receives $r~(r< n)$ biscuits.
  4. What is the chance that a year selected at random will contain 53 Saturdays? 

*** The End ***

Questions for 1st Sem

Topic: Beta and Gamma Function

Q1. Evaluate $\int_0^1 x^4 (1-\sqrt{x})dx$

Q2. Evaluate $\int_0^1 (1-x^3)^{-\frac{1}{2}}dx$

Q3. Show that $\...