Title: MA523 Dr. Imad Khamis
1. MA523, Dr. Imad Khamis
- Chapter 4. Bivariate Distributions
- Fall 2006
2. Two- and Higher-Dimensional R.V.'s
- In many statistical investigations, one is frequently interested in studying the relationship between two or more r.v.'s, such as the relationship between annual income and yearly savings per family, or the relationship between occupation and hypertension. In this chapter we consider n-dimensional vector-valued r.v.'s; however, we start with the case n = 2.
3.
- Let E be a statistical experiment with a sample space S. Let X(s) and Y(s) be two functions, each assigning a real number to each outcome s ∈ S, by (X, Y)(s) = (X(s), Y(s)).
- We call the pair (X, Y) a two-dimensional random vector or a bivariate r.v. The joint cumulative distribution function of (X, Y) is defined by F(x, y) = P(X ≤ x, Y ≤ y), for all (x, y) ∈ R².
4.
- Definition: (X, Y) is called a two-dimensional discrete r.v. if the possible values of (X, Y) are finite or countable, i.e., R(X, Y) = {(x1, y1), (x2, y2), ...}.
- Similarly, (X, Y) is said to be a continuous two-dimensional r.v. if (X, Y) assumes a continuum of values in some subset of the Euclidean plane R², i.e., R(X, Y) = {(x, y) : (x, y) ∈ A}, where A ⊆ R².
5. Definition
- Let (X, Y) be a 2-dim. discrete r.v. For each value (xi, yj), the probability P(X = xi, Y = yj) = p(xi, yj) is called the joint probability (mass) function (jpmf), and it satisfies the conditions p(xi, yj) ≥ 0 for all (xi, yj) and Σi Σj p(xi, yj) = 1.
- Definition: The joint cumulative distribution function (jcdf) is given by F(x, y) = Σ_{xi ≤ x} Σ_{yj ≤ y} p(xi, yj).
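- As a quick illustration (not part of the original slides), the short Python sketch below stores a jpmf as a dictionary, checks the two conditions above, and evaluates the jcdf; it reuses the jpmf p(x, y) = x²y/7 from the example on slide 8. The helper name F is our own.

    # Sketch: a jpmf as a dict, the two jpmf conditions, and the jcdf.
    p = {(1, 1): 1/7, (1, 2): 2/7, (2, 1): 4/7}   # p(x, y) = x^2 * y / 7

    assert all(v >= 0 for v in p.values())         # p(xi, yj) >= 0
    assert abs(sum(p.values()) - 1.0) < 1e-12      # probabilities sum to 1

    def F(x, y):
        # jcdf: F(x, y) = sum of p(xi, yj) over xi <= x and yj <= y
        return sum(v for (xi, yj), v in p.items() if xi <= x and yj <= y)

    print(F(1, 2))   # P(X <= 1, Y <= 2) = 1/7 + 2/7 = 3/7
    print(F(2, 2))   # = 1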
6.
- Definition
- A bivariate r.v. (X, Y) is called a continuous r.v. if its jcdf F is twice differentiable, with f(x, y) = ∂²F(x, y)/∂x∂y.
- f(., .) is called the joint probability density function (jpdf) of (X, Y) and satisfies the following conditions:
- f(x, y) ≥ 0 for all (x, y) ∈ R²,
- ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1,
- P(X ≤ x, Y ≤ y) = F(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du.
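- These conditions can also be checked numerically. The sketch below (an illustration, assuming SciPy is available) uses scipy.integrate.dblquad on the density f(x, y) = x² + xy/3 from the problem on slide 12 to confirm that it integrates to 1 and to evaluate F(x, y) inside the support.

    # Sketch: checking the jpdf conditions numerically with SciPy.
    from scipy.integrate import dblquad

    # dblquad integrates func(y, x): the inner variable comes first.
    f = lambda y, x: x**2 + x*y/3

    total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 2)   # over 0<=x<=1, 0<=y<=2
    print(total)                                            # approximately 1.0

    def F(x, y):
        # F(x, y) = P(X <= x, Y <= y) for 0 <= x <= 1, 0 <= y <= 2
        val, _ = dblquad(f, 0, x, lambda s: 0, lambda s: y)
        return val

    print(F(0.5, 1.0))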
7. Independence of R.V.'s
- Definition: Two r.v.'s X and Y are called independent if
- P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B), for all A, B.
- ⟺ F(x, y) = FX(x) FY(y), for all x, y;
- ⟺ p(x, y) = P(X = x) P(Y = y), for all x, y (discrete case);
- ⟺ f(x, y) = fX(x) fY(y), for all x, y (continuous case).
8.
- Example: p(x, y) = x²y/7 for (x, y) ∈ {(1, 1), (1, 2), (2, 1)}, and 0 otherwise.
- p(1, 1) = 1/7, pX(1) = 1/7 + 2/7 = 3/7, pY(1) = 1/7 + 4/7 = 5/7.
- Since p(1, 1) ≠ pX(1) pY(1), X and Y are not independent (i.e., X and Y are dependent).
- Example: (X, Y) has jpdf f(x, y) = x² e^(-x(y+1)) if x > 0, y > 0, and 0 otherwise.
- Here fX(x) = x e^(-x) for x > 0 and fY(y) = 2/(y+1)³ for y > 0.
- Since f(x, y) ≠ fX(x) fY(y), X and Y are not independent.
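- The second example can be checked symbolically. The sketch below (our own illustration, assuming SymPy is available) recovers the marginals of f(x, y) = x² e^(-x(y+1)) and confirms that their product differs from f, so X and Y are not independent.

    # Sketch: marginals of f(x, y) = x^2 * exp(-x*(y+1)) and the independence check.
    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f = x**2 * sp.exp(-x*(y + 1))

    fX = sp.integrate(f, (y, 0, sp.oo))    # x*exp(-x)
    fY = sp.integrate(f, (x, 0, sp.oo))    # 2/(y + 1)**3
    print(fX, fY)

    # The product fX*fY is not equal to f, so X and Y are dependent.
    print(sp.simplify(f - fX*fY) == 0)     # False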
9. Marginal and Conditional Distributions
- Discrete case: notice that
- P(X = xi) = pX(xi) = P(X = xi, Y = y1) + P(X = xi, Y = y2) + ... = Σj p(xi, yj).
- Similarly,
- pY(yj) = P(Y = yj) = Σi p(xi, yj).
- pX(xi) and pY(yj) are called the marginal pmf's of X and Y, respectively.
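- A short sketch of these sums (our own illustration, reusing the jpmf from slide 8): the marginal pmf's are obtained by summing the jpmf over the other variable.

    # Sketch: marginal pmf's from a jpmf stored as a dict.
    from collections import defaultdict

    p = {(1, 1): 1/7, (1, 2): 2/7, (2, 1): 4/7}

    pX, pY = defaultdict(float), defaultdict(float)
    for (xi, yj), prob in p.items():
        pX[xi] += prob    # pX(xi) = sum over j of p(xi, yj)
        pY[yj] += prob    # pY(yj) = sum over i of p(xi, yj)

    print(dict(pX))       # pX(1) = 3/7, pX(2) = 4/7
    print(dict(pY))       # pY(1) = 5/7, pY(2) = 2/7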
10. Exercise
- P(X > Y) = p(1, 0) + p(2, 0) + p(2, 1) = 2/5.
- P(X + Y ≤ 2) = p(1, 0) + p(1, 1) + p(2, 0) = 7/25.
- Ex. 5. For 0 ≤ x ≤ 4, 0 ≤ y ≤ 4, 0 ≤ x + y ≤ 4,
11. Continuous case
- Similarly, in the continuous case we define the marginal pdf's of X and Y. Let X and Y be two continuous r.v.'s with jpdf f(x, y). The marginal pdf's of X and Y are defined by
- fX(x) = ∫_{-∞}^{∞} f(x, y) dy and fY(y) = ∫_{-∞}^{∞} f(x, y) dx.
- Now, P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy.
12. Problem
- Let (X, Y) have jpdf f(x, y) = x² + xy/3, 0 ≤ x ≤ 1, 0 ≤ y ≤ 2; find P(X ≥ 1/2).
- fX(x) = ∫_0^2 (x² + xy/3) dy = [x²y + xy²/6]_0^2 = 2x² + 2x/3.
- Therefore P(X ≥ 1/2) = ∫_{1/2}^1 (2x² + 2x/3) dx = 5/6.
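- The computation above can be verified symbolically; a minimal sketch, assuming SymPy is available:

    # Sketch: fX(x) and P(X >= 1/2) for f(x, y) = x^2 + x*y/3.
    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**2 + x*y/3

    fX = sp.integrate(f, (y, 0, 2))                        # 2*x**2 + 2*x/3
    print(sp.integrate(fX, (x, sp.Rational(1, 2), 1)))     # 5/6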
13. Conditional Distributions
- Definition
- Let (X, Y) be a discrete r.v. with jpmf p(xi, yj). The conditional pmf's are defined by
- p(xi | yj) = P(X = xi | Y = yj) = p(xi, yj) / pY(yj), for pY(yj) ≠ 0.
14.
- Definition: The conditional pdf of X given that Y = y is defined by fX|Y(x | y) = f(x, y) / fY(y), for all y such that fY(y) ≠ 0.
- The conditional cdf of X given that Y = y is defined by FX|Y(x | y) = P(X ≤ x | Y = y) = ∫_{-∞}^x fX|Y(u | y) du.
- Therefore, P(a ≤ X ≤ b | Y = y) = ∫_a^b fX|Y(x | y) dx.
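- As an illustration (not from the slides, SymPy assumed), the sketch below forms the conditional pdf fX|Y(x | y) for the density of slide 12 and checks that it integrates to 1 in x for every fixed y.

    # Sketch: conditional pdf f_{X|Y}(x | y) = f(x, y) / fY(y).
    import sympy as sp

    x, y = sp.symbols('x y', nonnegative=True)
    f = x**2 + x*y/3                       # 0 <= x <= 1, 0 <= y <= 2

    fY = sp.integrate(f, (x, 0, 1))        # 1/3 + y/6
    f_cond = sp.simplify(f / fY)           # algebraically (6*x**2 + 2*x*y)/(y + 2)
    print(f_cond)

    print(sp.simplify(sp.integrate(f_cond, (x, 0, 1))))   # 1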
15. Problem 10, p. 327
- (X, Y) has jpdf f(x, y) = (1/2) e^(-x), for x ≥ 0, |y| < x.
16. Conditional Expectation
- The conditional expectation of a r.v. X given that Y = y is defined in the discrete case by
- E(X | Y = yj) = Σi xi p(xi | yj),
- and in the continuous case by
- E(X | Y = y) = ∫_{-∞}^{∞} x fX|Y(x | y) dx.
17. Ex. continued
- Conditional Variance: Furthermore, the conditional variance of X given Y = y is defined by
- V(X | Y = y) = E(X² | Y = y) - [E(X | Y = y)]².
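- A worked sketch for Problem 10 of slide 15 (our own illustration, SymPy assumed): given X = x, Y is uniform on (-x, x), so E(Y | X = x) = 0 and V(Y | X = x) = x²/3.

    # Sketch: conditional mean and variance of Y given X = x for Problem 10.
    import sympy as sp

    x = sp.symbols('x', positive=True)
    y = sp.symbols('y', real=True)
    f = sp.Rational(1, 2) * sp.exp(-x)                # jpdf on |y| < x

    fX = sp.integrate(f, (y, -x, x))                  # x*exp(-x)
    f_cond = sp.simplify(f / fX)                      # 1/(2*x): uniform on (-x, x)

    EY  = sp.integrate(y * f_cond, (y, -x, x))        # 0
    EY2 = sp.integrate(y**2 * f_cond, (y, -x, x))     # x**2/3
    print(EY, sp.simplify(EY2 - EY**2))               # V(Y | X = x) = x**2/3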
18. Covariance and joint probability
- The covariance measures the strength of the linear relationship between two variables.
- The covariance:
- cov(X, Y) = Σ_{i=1}^{N} [Xi - E(X)][Yi - E(Y)] P(XiYi),
- where
- X = discrete variable X, Xi = the ith outcome of X,
- Y = discrete variable Y, Yi = the ith outcome of Y,
- P(XiYi) = probability of occurrence of the condition affecting the ith outcome of X and the ith outcome of Y (a small numerical sketch follows below).
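- The sketch uses a made-up probability table, chosen purely for illustration; none of the numbers come from the slides.

    # Sketch: cov(X, Y) = sum over i of [Xi - E(X)][Yi - E(Y)] * P(XiYi).
    outcomes = [(-100, 200, 0.2), (100, 50, 0.5), (250, -50, 0.3)]   # (Xi, Yi, P(XiYi))

    EX = sum(xi * p for xi, _, p in outcomes)
    EY = sum(yi * p for _, yi, p in outcomes)
    cov = sum((xi - EX) * (yi - EY) * p for xi, yi, p in outcomes)
    print(EX, EY, cov)    # here cov < 0: the made-up X and Y move in opposite directions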
19. The Sample Covariance
- For paired sample data (X1, Y1), ..., (Xn, Yn), the sample covariance is cov(X, Y) = Σ_{i=1}^{n} (Xi - X̄)(Yi - Ȳ) / (n - 1).
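- A minimal sketch of the sample covariance on made-up data (NumPy assumed); np.cov uses the same n - 1 denominator by default.

    # Sketch: sample covariance sum((xi - xbar)*(yi - ybar)) / (n - 1).
    import numpy as np

    x = np.array([2.0, 4.0, 6.0, 8.0])
    y = np.array([1.0, 3.0, 2.0, 5.0])

    cov_xy = ((x - x.mean()) * (y - y.mean())).sum() / (len(x) - 1)
    print(cov_xy)                # 11/3, the same value as np.cov(x, y)[0, 1]
    print(np.cov(x, y)[0, 1])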
20. Interpreting Covariance
- Covariance between two random variables:
- cov(X, Y) > 0: X and Y tend to move in the same direction.
- cov(X, Y) < 0: X and Y tend to move in opposite directions.
- cov(X, Y) = 0: X and Y are uncorrelated; independent r.v.'s always have zero covariance, but zero covariance does not by itself imply independence (see the sketch below).
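- A tiny sketch of the last point (our own example): take X uniform on {-1, 0, 1} and Y = X². Their covariance is 0 even though Y is completely determined by X.

    # Sketch: zero covariance without independence (Y = X^2, X uniform on {-1, 0, 1}).
    xs = [-1, 0, 1]
    p = 1/3

    EX  = sum(x * p for x in xs)            # E(X)  = 0
    EY  = sum(x**2 * p for x in xs)         # E(Y)  = 2/3
    EXY = sum(x * x**2 * p for x in xs)     # E(XY) = 0
    print(EXY - EX * EY)                    # cov(X, Y) = 0
    # yet P(X = 1, Y = 0) = 0 differs from P(X = 1)*P(Y = 0) = 1/9, so X, Y are dependent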