
Wednesday, 24 June 2020

Conditional Probability, Independent Events and Bayes' Rule

Conditional Probability: Let A and B be two events in a probability space $(\Omega, \tilde{A}, P[.])$. The conditional probability of event A given event B, denoted by $P[A/B]$, is defined by

$$P[A/B]=\frac{P[AB]}{P[B]}, ~\text{if}~ P[B] >0$$

Similarly $P[B/A]=\frac{P[AB]}{P[A]}, ~\text{if}~ P[A] >0$ is the conditional probability of event B given event A.

From the above relation, we see that $P[AB]=P[A/B]P[B]= P[B/A]P[A]$.
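As a quick illustration, here is a minimal Python sketch of the definition and of the relation above; the numerical values are made up for illustration and are not taken from the text.

```python
# A minimal sketch of P[A/B] = P[AB] / P[B]; the numbers below are hypothetical.

def conditional(p_ab: float, p_b: float) -> float:
    """Return P[A/B] = P[AB] / P[B], assuming P[B] > 0."""
    if p_b <= 0:
        raise ValueError("P[B] must be positive")
    return p_ab / p_b

p_ab, p_b = 0.12, 0.30                 # hypothetical joint and marginal probabilities
p_a_given_b = conditional(p_ab, p_b)   # 0.4
print(p_a_given_b)
print(p_a_given_b * p_b)               # recovers P[AB] = 0.12, as in the relation above
```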

Independent Events: Two events A and B defined on a probability space $(\Omega, \tilde{A}, P[.])$ are said to be independent if $P[AB]=P[A\cap B]=P[A] P[B]$.

More generally, any $n$ events $A_1, A_2,...., A_n$ defined on a probability space $(\Omega, \tilde{A}, P[.])$ are said to be independent if and only if the following conditions are satisfied:

$P[A_i A_j] = P[A_i]P[A_j] ~ \text{for} ~ i\neq j$

$P[A_i A_j A_k] = P[A_i] P[A_j]P[A_k] ~ \text{for} ~ i\neq j, i\neq k, j\neq k$

....

....

$P[A_1 A_2 .... A_n]=P[A_1]P[A_2]P[A_3]....P[A_n]$.
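These product conditions can be verified directly on a small sample space. The sketch below is an assumed example (not from the text): it enumerates three tosses of a fair coin and checks the conditions for the events $A_i$ = "the $i$-th toss shows heads".

```python
# Checking the product conditions for independence on the sample space of
# three fair coin tosses (an illustrative example, not from the text).
from itertools import product, combinations

omega = list(product("HT", repeat=3))      # 8 equally likely outcomes

def prob(event):
    return len(event) / len(omega)         # uniform probability measure

# A[i] is the event "the (i+1)-th toss shows heads"
A = [{w for w in omega if w[i] == "H"} for i in range(3)]

# Pairwise products: P[A_i A_j] = P[A_i] P[A_j] for i != j
for i, j in combinations(range(3), 2):
    assert abs(prob(A[i] & A[j]) - prob(A[i]) * prob(A[j])) < 1e-12

# Triple product: P[A_1 A_2 A_3] = P[A_1] P[A_2] P[A_3]
assert abs(prob(A[0] & A[1] & A[2]) - prob(A[0]) * prob(A[1]) * prob(A[2])) < 1e-12
print("A_1, A_2, A_3 satisfy all the product conditions")
```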

Theorem: Show that the following conditions are equivalent:

  1. $P[AB]=P[A]P[B]$
  2. $P[A/B]= P[A], ~\text{if} ~P[B]>0$.
  3. $P[B/A]= P[B], ~\text{if}~ P[A] >0$.
Proof. First, we prove that $(1) \Rightarrow (2)$

Let $P[AB]=P[A]P[B]$                                                                ....(1)
By definition $P[A/B]= \frac{P[AB]}{P[B]}$                             ....(2)
From (1) and (2), $P[A/B]=P[A]$. Hence $(1) \Rightarrow (2)$.

Next we show that $(2) \Rightarrow (3)$
Let $P[A/B]=P[A]$                                                                        ....(3)
By the definition of conditional probability, we have
$P[AB]=P[A/B]P[B]=P[B/A]P[A]$                                               ....(4)
From (3) and (4), $P[B/A]P[A]=P[A]P[B]$.
Since $P[A]>0$, dividing both sides by $P[A]$ gives $P[B/A]=P[B]$.
Hence $(2) \Rightarrow (3)$.
Lastly, we show that $(3)\Rightarrow (1)$
Let $P[B/A]=P[B]$, where $P[A]>0$.                                                .....(5)
From (4) and (5), we have $P[AB]=P[B/A]P[A]=P[B]P[A]$, i.e. $P[AB]= P[A] P[B]$ for $P[A]>0$.
If $P[A]=0$, then $P[A]P[B]=0$, and since $AB \subseteq A$, also $P[AB]=0$.
Hence $P[AB] = P[A]P[B]$ in either case.

Theorem: If A and B are independent events, then

  1. A and $\bar{B}$ are independent.
  2.  $\bar{A}$ and B are independent and
  3. $\bar{A}$ and $\bar{B}$ are independent.
Example 1. If A and B are independent and $P[A] = P[B] =\frac{1}{2},$ what is $P[A\bar{B} \cup \bar{A}B]$?

Solution: Since A and B are independent, the pairs $(A, \bar{B})$, $(\bar{A}, B)$ and $(\bar{A}, \bar{B})$ are also independent, by the theorem above.

Consequently, $P[A\bar{B}]=P[A]P[\bar{B}]=\frac{1}{2}(1-\frac{1}{2})=\frac{1}{4}$ and, similarly, $P[\bar{A}B]=P[\bar{A}]P[B]=\frac{1}{4}$.

Now $P[A\bar{B}\cup \bar{A}B] = P[A\bar{B}]+P[\bar{A}B]-P[A\bar{B}\bar{A}B] = \frac{1}{4}+\frac{1}{4}-P[\phi]=\frac{1}{2}$, since $A\bar{A}=\phi$ and $\bar{B}B=\phi$ imply $A\bar{B}\bar{A}B=\phi$.
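The answer can also be checked by simulation. The following Monte Carlo sketch (an illustration, not part of the original solution) should print a value close to the exact answer $\frac{1}{2}$.

```python
# Monte Carlo check of Example 1: P[A B-bar union A-bar B] for independent
# A, B with P[A] = P[B] = 1/2.  Exact answer: 1/2.
import random

random.seed(0)
trials = 100_000
hits = 0
for _ in range(trials):
    a = random.random() < 0.5          # event A occurs
    b = random.random() < 0.5          # event B occurs, independently of A
    if (a and not b) or (b and not a): # A B-bar or A-bar B
        hits += 1
print(hits / trials)                   # close to 0.5
```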

Example 2. Five percent of the people have high blood pressure. Of the people with high blood pressure, $75\%$ drink alcohol; whereas, only $50\%$ of the people without high blood pressure drink alcohol. What percent of the drinkers have high blood pressure?

Solution: Let A denote the event that a person has high blood pressure and B the event that a person drinks alcohol.

We have $P[A]=0.05, P[B/A]=0.75, P[B/\bar{A}]=0.50$

We have to find $P[A/B]=\frac{P[A\cap B]}{P[B]}$              ......(1)

We know $B = AB \cup \bar{A}B$ and $AB \cap \bar{A}B =\phi$

$\Rightarrow P[B] = P[AB] + P[\bar{A}B]= P[A\cap B] + P[\bar{A}\cap B]$....... (2)

Now $P[B/A]=0.75 \Rightarrow \frac{P[B\cap A]}{P[A]}=0.75 \Rightarrow P[A\cap B]= 0.75 \times 0.05 = 0.0375$ .........(3)

Also $P[B/\bar{A}]=0.50 \Rightarrow \frac{P[B\cap \bar{A}]}{P[\bar{A}]}=0.50 \Rightarrow P[B\cap \bar{A}]=\frac{1}{2}(1-P[A])$

$\Rightarrow P[B\cap \bar{A}] = \frac{1}{2}(1-0.05)=0.475$ .........(4)

Therefore, by (2), (3) and (4), $P[B]=0.0375 +0.475 = 0.5125$.

Hence, by (1), $P[A/B]=\frac{P[A\cap B]}{P[B]}=\frac{0.0375}{0.5125}=\frac{3}{41}\approx 0.073$

Hence the required percentage is about $7.3\%$.
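The same computation, written out in Python (a sketch that simply re-traces the arithmetic above):

```python
# Example 2 re-computed: P[A/B] via P[AB] and the total probability P[B].
p_a = 0.05               # P[A]: high blood pressure
p_b_given_a = 0.75       # P[B/A]: drinks alcohol given high blood pressure
p_b_given_not_a = 0.50   # P[B/A-bar]: drinks alcohol given normal blood pressure

p_ab = p_b_given_a * p_a                     # 0.0375
p_b = p_ab + p_b_given_not_a * (1 - p_a)     # 0.0375 + 0.475 = 0.5125
print(p_ab / p_b)                            # P[A/B] = 3/41 ≈ 0.0732
```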

Exercises:

  1. If $P[A] = P[B] = P[B/A]=0.5$, are A and B independent?
  2. If $P[A]=a, P[B]=b,$ then show that $P[A/B] \geq (a+b-1)/b$.
  3. Suppose A and B are events for which $P[A]=p_1$, $P[B]=p_2$, and $P[A \cap B]=p_3.$ Evaluate
      1. $P[\bar{A}\cap B]$
      2. $P[\bar{A}\cup B]$
      3. $P[A\cap \bar{B}]$
      4. $P[\bar{A}\cap \bar{B}]$
      5. $P[\overline{A \cap B}]$
      6. $P[\overline{A \cup B}]$
      7. $P[\bar{A}\cup \bar{B}]$
      8. $P[A/B]$
      9. $P[B/\bar{A}]$
      10. $P[\bar{A}/\bar{B}]$
      11. $P[\bar{A}\cap (A\cup B)]$
      12. $P[A\cup (\bar{A}\cap B)]$
  4. Suppose an urn contains $M$ balls of which $K$ are black and $M-K$ are white. A sample of size $n$ is drawn with replacement. Find the probability that the $j$-th ball drawn is black, given that the sample contains $k$ black balls.

Theorem of Total Probability:

Let $B_1, B_2,...., B_n$ be a collection of mutually disjoint events in the probability space $(\Omega, \tilde{A}, P[.])$ such that $\Omega = \cup_{j=1}^n B_j$ and $P[B_j]>0, j=1, 2,...,n.$

Then $P[A]= \sum_{j=1}^n P[A/B_j]P[B_j]$ for each $A \in \tilde{A}$

Proof: We have $A = A \cap \Omega = A \cap (\cup_{j=1}^n B_j)=\cup_{j=1}^n AB_j$. Since the $B_j$ are mutually disjoint, so are the $AB_j$, and hence $P[A] = \sum_{j=1}^{n} P[AB_j]$ .... (1)

By definition, $P[AB_j]=P[A/B_j]P[B_j]$   ......(2)

From (1) and (2), we get $P[A] =\sum_{j=1}^n P[A/B_j]P[B_j]$.

Corollary: If $A, B \in \tilde{A}$, then $P[A]=P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}]$, provided $0 < P[B] < 1$.

Proof : We have $\Omega = B \cup \bar{B}$, where $B$ and $\bar{B}$ are mutually disjoint.

Hence, by the above theorem, $P[A]=P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}]$, provided $0 < P[B] < 1$.
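In code, the theorem of total probability is a single weighted sum. The helper below is a sketch; the numbers in the call are hypothetical, chosen only to illustrate its use.

```python
# Theorem of total probability: P[A] = sum_j P[A/B_j] * P[B_j],
# where B_1, ..., B_n partition the sample space.

def total_probability(p_a_given_b, p_b):
    assert abs(sum(p_b) - 1.0) < 1e-9, "the B_j must partition the sample space"
    return sum(pa * pb for pa, pb in zip(p_a_given_b, p_b))

# Hypothetical partition into three events:
print(total_probability([0.2, 0.5, 0.9], [0.3, 0.5, 0.2]))  # 0.49
```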

Bayes' Theorem:

Let $B_1, B_2,....., B_n$ be a collection of mutually disjoint events in the probability space $(\Omega, \tilde{A}, P[.])$  such that $\Omega = \cup_{j=1}^n B_j$ and $P[B_j]>0, j=1, 2,....., n.$

Then for each $A \in \tilde{A}$ satisfying $P[A]>0$, we have

$P[B_k/A] = \frac{P[A/B_k]P[B_k] }{\sum_{j=1}^n P[A/B_j]P[B_j]}$; this is known as Bayes' formula.

Proof: By the definition of conditional probability, we have

$P[B_k/A]=\frac{P[B_k A]}{P[A]}$ and $P[A/B_k]=\frac{P[AB_k]}{P[B_k]}$ with $P[A]>0$ and $P[B_k]>0$            ......(1)

Using these two, we obtain $P[B_k/A]=\frac{P[A/B_k]P[B_k]}{P[A]}$ ......(2)

By the theorem of total probability, $P[A]=\sum_{j=1}^n P[A/B_j]P[B_j]$; substituting this into (2), we get

$P[B_k/A]=\frac{P[A/B_k]P[B_k]}{\sum_{j=1}^n P[A/B_j]P[B_j]}$

Hence proved.
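Bayes' formula translates directly into code. The sketch below uses the numbers of Example 1 further down as a check ($P[B_k]=\frac{1}{3}$ and $P[A/B_k]=\frac{k}{6}$), for which $P[B_1/A]=\frac{1}{6}$.

```python
# Bayes' formula: P[B_k/A] = P[A/B_k] P[B_k] / sum_j P[A/B_j] P[B_j].

def bayes(k, p_a_given_b, p_b):
    """Return P[B_k/A]; k is a 0-based index into the partition B_1, ..., B_n."""
    p_a = sum(pa * pb for pa, pb in zip(p_a_given_b, p_b))   # total probability of A
    return p_a_given_b[k] * p_b[k] / p_a

# P[B_k] = 1/3 and P[A/B_k] = k/6 for k = 1, 2, 3 (Example 1 below):
print(bayes(0, [1/6, 2/6, 3/6], [1/3, 1/3, 1/3]))   # P[B_1/A] = 1/6
```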

Corollary: If $A, B \in \tilde{A}$, then $P[B/A]=\frac{P[A/B]P[B]}{P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}]}$, provided $0 < P[B] < 1$.

Proof: We have $\Omega = B \cup \bar{B}$, where $B$ and $\bar{B}$ are mutually disjoint.

Hence, by Bayes' theorem, $P[B/A]=\frac{P[A/B]P[B]}{P[A/B]P[B]+P[A/\bar{B}]P[\bar{B}]}$, provided $0 < P[B] < 1$.

Example 1: Suppose $B_1, B_2$ and $B_3$ are mutually exclusive and exhaustive events. If $P[B_k]=\frac{1}{3}$ and $P[A/B_k]=\frac{k}{6}$ for $k=1, 2, 3$, what is $P[A]$?

Solution:  By the theorem of total probability, we have

$$ P[A] = \sum_{k=1}^3 P[A/B_k]P[B_k]=\sum_{k=1}^3 \frac{k}{6}\times \frac{1}{3} =\frac{1}{3}$$

Example 2. The probability that one person can hit a target is 3/5 and the probability that another person can hit the same target is 2/5. The first person can fire 8 shots in a given time while the second person fires 10 shots. They fire together. If the target is hit, what is the probability that it was hit by the second person?

Solution: Let E denote the event that the target is hit, and let $E_1$ and $E_2$ denote the events that a shot is fired by the first person and by the second person, respectively. We are given

$$ P[E/E_1]=\frac{3}{5} \quad\text{and}\quad P[E/E_2] = \frac{2}{5}$$

The ratio of the shots of the first person to those of the second person in the same time is $\frac{8}{10}=\frac{4}{5}$. Thus $P[E_1]=\frac{4}{5}P[E_2]$. By Bayes' theorem we get

$$P[E_2/E]=\frac{P[E/E_2]P[E_2]}{P[E/E_1]P[E_1]+P[E/E_2]P[E_2]}=\frac{\frac{2}{5} P[E_2]}{\frac{3}{5}\times \frac{4}{5}P[E_2]+\frac{2}{5}P[E_2]}$$

$$\Rightarrow P[E_2/E]=\frac{5}{11}$$
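The arithmetic of this example in Python (a sketch; only the ratio $P[E_1]:P[E_2]=4:5$ matters, so proportional weights are used):

```python
# Example 2 re-computed: posterior probability that the second person hit the target.
w_e1, w_e2 = 4, 5            # weights proportional to P[E_1] and P[E_2] (8 and 10 shots)
p_hit_1, p_hit_2 = 3/5, 2/5  # P[E/E_1] and P[E/E_2]

posterior_2 = (p_hit_2 * w_e2) / (p_hit_1 * w_e1 + p_hit_2 * w_e2)
print(posterior_2)           # 5/11 ≈ 0.4545
```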

Example 3. An urn contains 10 white and 3 black balls, while another urn contains 3 white and 5 black balls. Two balls are drawn from the first urn and put into the second urn and then a ball is drawn from the latter. What is the probability that it is a white ball?

Solution: The two balls drawn from the first urn may be:

(i) both white or (ii) both black or (iii) one white and one black.

Let these events be denoted by A, B, C respectively. Then 

$$P[A]=\frac{{}^{10}C_2}{{}^{13}C_2}=\frac{15}{26}, ~~ P[B]=\frac{{}^{3}C_2}{{}^{13}C_2}=\frac{1}{26}, ~~ P[C]=\frac{{}^{10}C_1 \times {}^{3}C_1}{{}^{13}C_2}=\frac{10}{26}$$

In these three cases the second urn will then contain, respectively: (i) 5 white and 5 black balls, (ii) 3 white and 7 black balls, or (iii) 4 white and 6 black balls.

Let W denote the event of drawing a white ball from the second urn. Then, in the above three cases,

 $$P[W/A]=\frac{5}{10}, ~~ P[W/B]=\frac{3}{10},~~ P[W/C]=\frac{4}{10}$$

Hence, $P[W] = P[W/A]P[A]+P[W/B]P[B]+P[W/C]P[C] $

$$=\frac{5}{10}\times \frac{15}{26} + \frac{3}{10}\times\frac{1}{26}+\frac{4}{10}\times\frac{10}{26}=\frac{118}{260}=\frac{59}{130}$$
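A Monte Carlo check of this example (a sketch, not part of the original solution); it should print a value close to the exact answer $\frac{59}{130}\approx 0.454$.

```python
# Simulating Example 3: transfer two balls from urn 1 to urn 2, then draw one ball.
import random

random.seed(1)
trials = 200_000
white_draws = 0
for _ in range(trials):
    urn1 = ["W"] * 10 + ["B"] * 3            # first urn: 10 white, 3 black
    urn2 = ["W"] * 3 + ["B"] * 5             # second urn: 3 white, 5 black
    urn2.extend(random.sample(urn1, 2))      # move two random balls to the second urn
    if random.choice(urn2) == "W":
        white_draws += 1
print(white_draws / trials)                  # close to 59/130 ≈ 0.454
```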

Exercises:

  1. An urn contains $a$ white and $b$ black balls, while another urn contains $c$ white and $d$ black balls. One ball is transferred from the first urn to the second urn and then a ball is drawn from the latter. What is the probability that it will be a white ball?
  2. Three urns $A_1, A_2, A_3$ contain respectively 3 red, 4 white, 1 blue; 1 red, 2 white, 3 blue; 4 red, 3 white, 2 blue balls. One urn is chosen at random and a ball is withdrawn. It is found to be red. Find the probability that it comes from the urn  $A_2$.  
  3. An insurance company insured 2000 scooter drivers, 4000 car drivers, and 6000 truck drivers. The probabilities of an accident involving a scooter, a car, and a truck are 0.01, 0.03 and 0.15 respectively. One of the insured people meets with an accident. What is the probability that he is a scooter driver?
  4. In a bolt factory, machines A, B, C manufacture respectively 25, 35 and 40 percent of the total output. Of their output 5, 4 and 2 percent, respectively, are defective bolts. A bolt is drawn at random from the product and is found to be defective. What are the probabilities that it was manufactured by A, B, or C?
  5. Suppose that in answering a question in a multiple choice test, an examinee knows the answer with probability $p$ or guesses with probability $1-p$. Assume that the probability of answering a question correctly is unity for an examinee who knows the answer and $1/m$ for the examinee who guesses, where $m$ is the number of multiple-choice alternatives. Show that the probability that an examinee knows the answer to a problem, given that he has correctly answered it, is $\frac{mp}{1+(m-1)p}$.

*** THE END ***
