Continuous Random Variable and Probability Density Function: A random variable $X$ with distribution function $F_X(\cdot)$ is called continuous if there exists a function $f_X(\cdot):R \rightarrow [0,\infty)$ such that $$F_X(x)=\int_{-\infty}^{x}f_X(t)\,dt~~ \text{for all } x\in R ....(1)$$
The function $f_X(\cdot)$ or $f(x)$ is called the probability density function (p.d.f.) of $X$, or simply the density function of $X$. Differentiating relation (1), we observe that $f_X(x)=\frac{dF_X(x)}{dx}$ (wherever the derivative exists).
Properties of p.d.f.:
- $f(x)\geq 0$ for all $x \in R$
- $\int_{-\infty}^{\infty} f(x) dx=1$
- $P[a< X \leq b]=\int_a^b f(x)\,dx$ for $a<b$.
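These three properties can be verified numerically for a specific density. The sketch below is an illustration of my own (not from the notes): it takes the exponential density $f(x)=\lambda e^{-\lambda x}$ for $x\geq 0$ with $\lambda=2$, and approximates the integrals with a simple midpoint rule.

```python
import math

LAM = 2.0  # assumed rate parameter of an exponential density

def f(x):
    # Exponential density: non-negative everywhere, as property 1 requires.
    return LAM * math.exp(-LAM * x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    # Midpoint rule; adequate for this smooth, fast-decaying integrand.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Property 2: total probability is 1 ([0, 50] covers the effective support).
total = integrate(f, 0.0, 50.0)

# Property 3: P[a < X <= b] as an integral, checked against F(b) - F(a).
p_ab = integrate(f, 0.5, 1.5)
exact = math.exp(-LAM * 0.5) - math.exp(-LAM * 1.5)
print(total, p_ab, exact)
```

The same check works for any density with a known closed-form CDF; only `f` and the comparison value change.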
Various measures of central tendency, dispersion, moments and expected value:
Let $X$ be a random variable with density function $f_X(x).$
1. Mean of $X$, denoted by $\mu_x$ or $E(X)$, is defined as:
$\mu_x$ or $E(X)=\int_{-\infty}^{\infty} xf_X(x) dx.$ ($X$ is a continuous random variable)
Similarly, $E(X^2)=\int_{-\infty}^{\infty} x^2 f_X(x)dx$
If $X$ is a discrete random variable with mass points $x_1, x_2,...., x_n,...;$ then
$\mu_x$ or $E(X)=\sum_{i=1}^{\infty} x_i f_X(x_i)$
2. Variance of $X$ denoted by $\sigma_x^2$ or $var[X],$ is defined as
$\sigma_X^2$ or $var[X]=\int_{-\infty}^{\infty}(x-\mu_x)^2f_X(x)\,dx$ ($X$ is a continuous random variable and $\mu_x$ is the mean)
If $X$ is a discrete random variable with mass points $x_1, x_2, ....x_n,.....;$ then
$\sigma_X^2$ or $var[X]=\sum_{i=1}^{\infty}(x_i -\mu_x)^2 f_X(x_i).$
Also $var[X]=E[X^2]-[E(X)]^2$. This is a useful formula for determining $var[X]$.
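The shortcut $var[X]=E[X^2]-[E(X)]^2$ can be checked against the defining integral. A minimal sketch, again assuming the exponential density with $\lambda=2$, for which the exact values are $E(X)=1/\lambda=0.5$ and $var[X]=1/\lambda^2=0.25$:

```python
import math

LAM = 2.0  # assumed rate parameter of an exponential density

def f(x):
    return LAM * math.exp(-LAM * x)

def integrate(g, a, b, n=100_000):
    # Midpoint rule over an interval that covers the effective support.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

mean = integrate(lambda x: x * f(x), 0.0, 50.0)               # E[X]
ex2 = integrate(lambda x: x * x * f(x), 0.0, 50.0)            # E[X^2]
var_direct = integrate(lambda x: (x - mean) ** 2 * f(x), 0.0, 50.0)
var_shortcut = ex2 - mean ** 2                                # E[X^2] - [E(X)]^2
print(mean, var_direct, var_shortcut)
```

Both routes give the same variance; the shortcut only needs $E[X]$ and $E[X^2]$, which is why it is usually the faster computation by hand.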
3. Standard deviation of $X$, denoted by $\sigma_x$ is defined as $\sigma_x =+\sqrt{Var[X]}$
4. Median (M) of a continuous random variable $X$ is given by the relation
$\int_{-\infty}^M f(x)dx=\frac{1}{2}=\int_M^{\infty} f(x) dx$
5. Mean deviation about the mean $\mu_x$ is defined as M.D. $=\int_{-\infty}^{\infty} |x-\mu_x|f_X(x)dx$
6. The first and third quartiles, denoted by $Q_1$ and $Q_3$ respectively are given by
$$\int_{-\infty}^{Q_1}f(x)dx=\frac{1}{4}~\text{and}~\int_{-\infty}^{Q_3} f(x)dx=\frac{3}{4}$$
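Since the median and quartiles are defined by equations of the form $F_X(x)=p$, they can be found numerically by inverting the CDF. A sketch of my own, assuming the exponential CDF $F(x)=1-e^{-\lambda x}$ with $\lambda=2$ (so the exact median is $\frac{\ln 2}{\lambda}$), using bisection:

```python
import math

LAM = 2.0  # assumed rate; exponential CDF is F(x) = 1 - exp(-LAM * x)

def F(x):
    return 1.0 - math.exp(-LAM * x)

def quantile(p, lo=0.0, hi=50.0, tol=1e-10):
    # Bisection: F is increasing, so F(x) = p has a unique root in [lo, hi].
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

q1 = quantile(0.25)       # first quartile
median = quantile(0.50)   # median: F(M) = 1/2
q3 = quantile(0.75)       # third quartile
print(q1, median, q3)
```

The same routine yields any percentile by changing `p`; no property of the exponential beyond monotonicity of $F$ is used.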
7. Mode is the value of $X$ for which $f(x)$ is maximum. The modal value of $x$ is given by the relations:
$$f_X^{'}(x)=0~ \text{and}~ f_X^{''}(x)<0.$$
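For instance (an illustrative example, not from the notes), take the density $f(x)=6x(1-x)$ on $0\leq x\leq 1$. Then
$$f^{'}(x)=6-12x=0 \implies x=\frac{1}{2}, \qquad f^{''}(x)=-12<0,$$
so the mode of this distribution is $x=\frac{1}{2}$.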
8. The expectation or expected value of a function $g(X)$ of a random variable $X$ with p.d.f. $f_X(x)$, denoted by $E[g(X)]$, is defined as:
i) $E[g(X)]=\sum_{n=1}^{\infty} g(x_n)f_X(x_n)$, where $X$ is a discrete random variable with mass points $x_1, x_2,...., x_n,.......$ (provided the series is absolutely convergent).
ii) $E[g(X)]=\int_{-\infty}^{\infty} g(x)f_X(x)\,dx$, where $X$ is a continuous random variable (provided $\int_{-\infty}^{\infty} |g(x)|f_X(x)\,dx < \infty$).
Properties of Expectation:
- $E[c]=c$, c being a constant.
- $E[c.g(x)]=c.E[g(x)],$ c being a constant.
- $E[c_1.g_1(x)+c_2.g_2(x)]=c_1.E[g_1(x)]+c_2.E[g_2(x)],$ where $c_1$ and $c_2$ are any real constants.
- $E[g_1(x)]\leq E[g_2(x)]$, provided $g_1(x)\leq g_2(x)~ \forall x\in R$.
- If $g(x)=x$ then $E[g(x)]=E[X]$ is the mean of $X$.
- If $g(x)=(x-\mu_x)^2$, then $E[g(x)]=var[X]$.
- If $g(x)=(x-\mu_x)^r$, then $E[g(x)]=\mu_r$, which is the $r$th moment about the mean $\mu_x$.
- If $g(x)=(x-a)^r$, then $E[g(x)]=\mu_r^{'}$, which is rth moment about the point $x=a$.
- If $g(x)=x^r$, then $E[g(x)]=E[X^r]=\mu_r^{'}$, which is the $r$th moment about the point $x=0$.
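These moment relations can also be checked numerically by choosing $g$ accordingly. A sketch assuming the exponential density with $\lambda=2$, whose central moments are known in closed form ($\mu_2=1/\lambda^2$ and $\mu_3=2/\lambda^3$, both equal to $0.25$ here):

```python
import math

LAM = 2.0        # assumed rate of an exponential density
MU = 1.0 / LAM   # mean of the distribution

def f(x):
    return LAM * math.exp(-LAM * x)

def integrate(g, a, b, n=200_000):
    # Midpoint rule; the integrand decays fast, so [0, 50] covers the support.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

def central_moment(r):
    # mu_r = E[(X - mu)^r], i.e. g(x) = (x - mu)^r in the expectation.
    return integrate(lambda x: (x - MU) ** r * f(x), 0.0, 50.0)

mu2 = central_moment(2)  # exact: 1 / LAM**2
mu3 = central_moment(3)  # exact: 2 / LAM**3
print(mu2, mu3)
```

Note that `central_moment(2)` reproduces $var[X]$, consistent with the second bullet above.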