Chapter 1
Probability Concepts
1.1 The given sets are:
A = {1,2,3,4}, B = {0,1,2,3,4,5,6,7,8},
C = {x | x real and 1 ≤ x < 3}, D = {2,4,7}, E = {4,7,8,9,10}.
We observe that:
A is finite and countable. D is finite and countable.
B is finite and countable. E is finite and countable.
C is infinite and uncountable.
1.2 By inspection,
(a) A ∩ B = A = {1,2,3,4}.
(b) A ∪ B ∪ D ∪ E = {0,1,2,3,4,5,6,7,8,9,10}.
(c) (B ∪ E) ∩ D = D = {2,4,7}.
(d) B - E = {0,1,2,3,5,6}.
(e) A ∩ B ∩ D ∩ E = {4}.
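These identities can be checked mechanically with Python's built-in set type (a quick sketch using the sets of Problem 1.1):

```python
# Check of the set identities in Problem 1.2 using Python's set
# operations (& is intersection, | is union, - is difference).
A = {1, 2, 3, 4}
B = {0, 1, 2, 3, 4, 5, 6, 7, 8}
D = {2, 4, 7}
E = {4, 7, 8, 9, 10}

assert A & B == A                         # (a) since A is a subset of B
assert A | B | D | E == set(range(11))    # (b) {0,1,...,10}
assert (B | E) & D == D                   # (c)
assert B - E == {0, 1, 2, 3, 5, 6}        # (d)
assert A & B & D & E == {4}               # (e)
print("all set identities verified")
```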
1.3 The universal set is U = {0,1,2,3,4,5,6,7,8,9,10,11,12}. The subsets are
A = {0,1,4,6,7,9}, B = {2,4,6,8,10,12} and C = {1,3,5,7,9,11}. By inspection,
(a) A ∩ B = {4,6}. (b) (A ∪ B) ∩ C = A ∩ C = {1,7,9}.
Signal Detection and Estimation
(c) (B ∪ C)ᶜ = {0}. (d) A - B = {0,1,7,9}.
(e) (A ∪ B) ∩ (A ∪ C) = A ∪ (B ∩ C) = A = {0,1,4,6,7,9}.
(f) A ∩ Cᶜ = {0,4,6}. (g) Bᶜ = {0,1,3,5,7,9,11} = C ∪ {0}.
(h) B ∩ Cᶜ = B = {2,4,6,8,10,12}.
1.4 Applying the definitions, we shade the corresponding region of a Venn diagram for
each of (a) A - B, (b) (A ∪ B) ∩ C, (c) A ∩ B ∩ C ∩ D, (d) Aᶜ, and (e) A ∩ B.
(Venn diagrams omitted.)
1.5 This is easily verified by Venn diagrams: if A ⊂ B and B ⊂ C, then A ⊂ C.
1.6 By inspection, B and C are mutually exclusive.
1.7 Let R, W and B denote red ball drawn, white ball drawn and blue ball drawn,
respectively. The urn contains 10 red, 3 white and 7 blue balls.
(a) P(R) = (number of red balls)/(total number of balls) = 10/(10 + 3 + 7) = 10/20 = 1/2 = 0.5.
(b) P(W) = 3/20 = 0.15. (c) P(B) = 7/20 = 0.35.
(d) P(Rᶜ) = 1 - P(R) = 1/2 = 0.5.
(e) P(R ∪ W) = (10 + 3)/20 = 13/20 = 0.65.
1.8 The same urn holds 10 red, 3 white and 7 blue balls. Let
B1 ≡ first ball drawn is blue,
W2 ≡ second ball drawn is white,
R3 ≡ third ball drawn is red.
(a) The ball is replaced before the next draw, so the events are independent and hence
P(B1 ∩ W2 ∩ R3) = P(B1)P(W2 | B1)P(R3 | B1 ∩ W2) = P(B1)P(W2)P(R3)
= (7/20)(3/20)(10/20) = 210/8000 = 0.02625.
(b) Since the ball is not replaced, the sample size changes with each draw and thus the
events are dependent. Hence,
P(B1 ∩ W2 ∩ R3) = P(B1)P(W2 | B1)P(R3 | B1 ∩ W2) = (7/20)(3/19)(10/18) = 0.0307.
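The two products above can be evaluated exactly with rational arithmetic (a quick sketch):

```python
from fractions import Fraction as F

# Problem 1.8: urn with 10 red, 3 white, 7 blue balls (20 total).
# (a) Draws with replacement are independent.
p_with = F(7, 20) * F(3, 20) * F(10, 20)
# (b) Without replacement the sample size shrinks after each draw.
p_without = F(7, 20) * F(3, 19) * F(10, 18)

print(p_with, float(p_with))        # 21/800 0.02625
print(p_without, float(p_without))  # 7/228 ≈ 0.0307
```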
1.9 Let R1 and R2 denote drawing a red ball from boxes B1 and B2 respectively, and let
W1 and W2 denote drawing a white ball from B1 and B2. Box B1 contains 10 red, 3 white
and 7 blue balls; box B2 contains 2 red, 6 white and 1 blue ball.
(a) P(R1 ∩ R2) = P(R1)P(R2 | R1) = P(R1)P(R2) since the events are independent. Hence,
P(R1 ∩ R2) = (10/20)(2/9) = 1/9 = 0.111.
(b) Similarly, P(W1 ∩ W2) = P(W1)P(W2) = (3/20)(6/9) = 1/10 = 0.1.
(c) Since we can draw a different color from each box in two ways,
P(W ∩ B) = P(W1 ∩ B2) + P(B1 ∩ W2) = (3/20)(1/9) + (7/20)(6/9) = 0.25.
1.10 Let B1 and B2 denote Box 1 and Box 2 respectively, and let B denote drawing a
black ball and W a white ball. Box B1 contains 4 white and 2 black balls; box B2
contains 3 white and 5 black balls.
Let B2 be the larger box; then P(B2) = 2P(B1). Since P(B1) + P(B2) = 1, we obtain
P(B1) = 1/3 and P(B2) = 2/3.
(a) P(1B | B2) = 5/8 = 0.625.
(b) P(1B | B1) = 2/6 = 0.3333.
(c) This is the total probability of drawing a black ball. Hence,
P(1B) = P(1B | B2)P(B2) + P(1B | B1)P(B1) = (5/8)(2/3) + (2/6)(1/3) = 0.5278.
(d) Similarly, the probability of drawing a white ball is
P(1W) = P(1W | B2)P(B2) + P(1W | B1)P(B1) = (3/8)(2/3) + (4/6)(1/3) = 0.4722.
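The total-probability sums can be checked exactly; the two results must add to 1 since a drawn ball is either black or white (a quick sketch):

```python
from fractions import Fraction as F

# Problem 1.10: P(B2) = 2 P(B1), so P(B1) = 1/3 and P(B2) = 2/3.
p_b1, p_b2 = F(1, 3), F(2, 3)
# Box B1: 4 white, 2 black; box B2: 3 white, 5 black.
p_black = F(5, 8) * p_b2 + F(2, 6) * p_b1
p_white = F(3, 8) * p_b2 + F(4, 6) * p_b1

print(float(p_black))  # ≈ 0.5278
print(float(p_white))  # ≈ 0.4722
assert p_black + p_white == 1
```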
1.11 In four tosses __ __ __ __, we need three 1s while the remaining toss is not a 1;
for example, the sequence (not 1, 1, 1, 1). Any one such sequence has probability
(1/6)(1/6)(1/6)(5/6) = (1/6)³(5/6),
and there are C(4,3) = 4!/(3!1!) = 4 ways of obtaining it. Therefore, the probability of
obtaining three 1s in four tosses is
4 (1/6)³ (5/6) = 0.01543.
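This is the binomial probability with n = 4, k = 3 and success probability 1/6, which can be evaluated directly:

```python
from math import comb

# Problem 1.11: exactly three 1s in four tosses of a fair die.
p = comb(4, 3) * (1 / 6) ** 3 * (5 / 6)
print(round(p, 5))  # 0.01543
```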
1.12 Let R, W and G represent drawing a red ball, a white ball and a green ball,
respectively. The probability of selecting Urn A is P(Urn A) = 0.6, while P(Urn B) = 0.2
and P(Urn C) = 0.2, since P(Urn A) + P(Urn B) + P(Urn C) = 1.
(a) P(1W | Urn B) = P(1W ∩ Urn B)/P(Urn B) = 30/100 = 0.3.
(b) P(1G | Urn B) = 40/100 = 0.4.
(c) P(Urn C | R) = P(Urn C ∩ R)/P(R). Also,
P(R | Urn C) = P(Urn C ∩ R)/P(Urn C), so that
P(Urn C | R) = P(R | Urn C)P(Urn C)/P(R).
We need to determine the total probability of drawing a red ball, which is
P(R) = P(R | Urn A)P(Urn A) + P(R | Urn B)P(Urn B) + P(R | Urn C)P(Urn C)
= (30/100)(0.6) + (30/100)(0.2) + (40/100)(0.2) = 0.32.
Thus, P(Urn C | R) = (0.4)(0.2)/0.32 = 0.25.
1.13 In drawing k balls with replacement, the probability that the sample drawn does not
contain a particular ball, the event Ei, i = 0, 1, 2, …, 9, is
P(Ei) = (9/10)^k, and P(Ei ∩ Ej) = (8/10)^k for i ≠ j.
(a) P(A) = P(neither ball 0 nor ball 1) = P(E0 ∩ E1) = 8^k/10^k.
(b) P(B) = P(ball 1 does not appear but ball 2 does) = P(E1) - P(E1 ∩ E2)
= (9/10)^k - (8/10)^k = (9^k - 8^k)/10^k.
(c) P(AB) = P(E0 ∩ E1 ∩ E2ᶜ) = P(E0 ∩ E1) - P(E0 ∩ E1 ∩ E2)
= (8^k - 7^k)/10^k, since P(E0 ∩ E1 ∩ E2) = (7/10)^k.
(d) P(A ∪ B) = P(A) + P(B) - P(AB) = (9^k - 8^k + 7^k)/10^k.
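For a small k the closed forms can be confirmed by brute-force enumeration of every equally likely ordered sample (a sketch for k = 3):

```python
from itertools import product

# Problem 1.13 for k = 3: enumerate all 10^3 ordered samples drawn with
# replacement from balls numbered 0..9 and count the events directly.
k = 3
samples = list(product(range(10), repeat=k))
n = len(samples)

p_a = sum(0 not in s and 1 not in s for s in samples) / n
p_b = sum(1 not in s and 2 in s for s in samples) / n
p_ab = sum(0 not in s and 1 not in s and 2 in s for s in samples) / n

assert p_a == 8 ** k / 10 ** k
assert p_b == (9 ** k - 8 ** k) / 10 ** k
assert p_ab == (8 ** k - 7 ** k) / 10 ** k
print("closed forms confirmed for k =", k)
```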
1.14 We have
f_X(x) = (1/2)e^{-x} + (1/2)δ(x - 3) for x ≥ 0, and f_X(x) = 0 for x < 0.
(a) ∫ f_X(x)dx = ∫_0^∞ [(1/2)e^{-x} + (1/2)δ(x - 3)]dx = 1/2 + 1/2 = 1.
Hence, f_X(x) is a density function.
(b) P(X = 1) = 0 (the probability at a point where the distribution is continuous is
zero), whereas P(X = 3) = 1/2 = 0.5 because of the impulse at x = 3.
P(X ≥ 1) = ∫_1^∞ f_X(x)dx = (1/2)∫_1^∞ e^{-x}dx + 1/2 = (1/2)e^{-1} + 1/2 = 0.6839.
1.15 The density function is f_X(x) = 1/4 for -1 ≤ x < 1, f_X(x) = 1/8 for -3 ≤ x < -1
and for 1 ≤ x < 3, and 0 otherwise.
(a) The cumulative distribution function of X over the whole range of x is
F_X(x) = ∫ f_X(u)du = ∫_{-3}^x (1/8)du = x/8 + 3/8 for -3 ≤ x < -1,
F_X(x) = 1/4 + ∫_{-1}^x (1/4)du = x/4 + 1/2 for -1 ≤ x < 1,
F_X(x) = 3/4 + ∫_1^x (1/8)du = x/8 + 5/8 for 1 ≤ x < 3,
and F_X(x) = 1 for x ≥ 3 (F_X(x) = 0 for x < -3).
(b) Calculating the area from the graph, we obtain
P(|X| < 1) = 2(1/4) = 1/2.
1.16 The density function is f_X(x) = 1/4 for -2 ≤ x ≤ 2, and 0 otherwise.
(a) P(X ≤ x) = F_X(x) = ∫_{-2}^x (1/4)du = x/4 + 1/2 for -2 ≤ x < 2.
(b) P(|X| ≤ 1) = ∫_{-1}^1 (1/4)dx = 1/2.
(c) E[X] = 0 by symmetry, and E[X²] = 2∫_0^2 x²(1/4)dx = 4/3, so σ_x² = 4/3.
(d) Φ_x(ω) = E[e^{jωX}] = ∫_{-2}^2 (1/4)e^{jωx}dx = (e^{j2ω} - e^{-j2ω})/(j4ω) = sin 2ω / 2ω.
Collecting the pieces of Problem 1.15,
F_X(x) = 0 for x < -3; x/8 + 3/8 for -3 ≤ x < -1; x/4 + 1/2 for -1 ≤ x < 1;
x/8 + 5/8 for 1 ≤ x < 3; and 1 for x ≥ 3.
1.17 The density function is the triangle f_X(x) = x for 0 ≤ x ≤ 1, f_X(x) = 2 - x for
1 ≤ x ≤ 2, and 0 otherwise.
(a) P(1/2 < X < 3/2) = ∫_{1/2}^1 x dx + ∫_1^{3/2} (2 - x)dx = 3/4 = 0.75.
(b) E[X] = ∫_0^1 x² dx + ∫_1^2 x(2 - x)dx = 1/3 + 2/3 = 1, as can be seen from the graph.
(c) The MGF of X is
M_x(t) = E[e^{tX}] = ∫_0^1 x e^{tx}dx + ∫_1^2 (2 - x)e^{tx}dx = (1/t²)(e^{2t} - 2e^t + 1).
(d) dM_x(t)/dt = [2t(e^{2t} - e^t) - 2(e^{2t} - 2e^t + 1)]/t³.
Using L'Hôpital's rule as t → 0, we obtain dM_x/dt at t = 0, namely E[X] = 1, in
agreement with (b).
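The closed-form MGF can be checked against direct numerical integration of e^{tx} over the triangular density (a sketch using a simple midpoint rule):

```python
from math import exp

# Problem 1.17: check M_x(t) = (e^{2t} - 2e^t + 1)/t^2 numerically.
def f(x):
    return x if x <= 1 else 2 - x  # triangular density on [0, 2]

def integrate(g, a, b, n=20000):
    h = (b - a) / n  # midpoint rule
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

t = 0.7
mgf_numeric = integrate(lambda x: exp(t * x) * f(x), 0.0, 2.0)
mgf_closed = (exp(2 * t) - 2 * exp(t) + 1) / t ** 2
print(abs(mgf_numeric - mgf_closed) < 1e-6)  # True
```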
1.18 Here f_X(x) = αx² + β for 0 ≤ x ≤ 1 (0 otherwise), with E[X] = 2/3.
(a) E[X] = ∫_0^1 x(αx² + β)dx = α/4 + β/2 = 2/3, and
∫ f_X(x)dx = ∫_0^1 (αx² + β)dx = α/3 + β = 1.
Solving the 2 equations in 2 unknowns, we obtain α = 2 and β = 1/3.
(b) E[X²] = ∫_0^1 x²(2x² + 1/3)dx = 23/45 = 0.511.
Then, the variance of X is σ_x² = E[X²] - E²[X] = 23/45 - 4/9 = 3/45 = 0.0667.
(Figure for Problem 1.17: the triangular density f_X(x) on [0, 2], peaking at f_X(1) = 1.)
1.19 (a) E[XY] = Σ_{i,j} x_i y_j P(x_i, y_j)
= (1/12)[(-1)(-1) + (-1)(+1) + (+1)(-1) + (+1)(+1)] + (1/6)[0] = 0.
E[X] = Σ_i x_i P(X = x_i), where the marginal probabilities are
P(X = -1) = 4/12 = 1/3, P(X = 0) = 2/6 = 1/3 and P(X = 1) = 4/12 = 1/3.
Hence, the mean of X is E[X] = (-1)(1/3) + (0)(1/3) + (1)(1/3) = 0.
Similarly, E[Y] = (-1)(1/3) + (0)(1/3) + (1)(1/3) = 0. Therefore, E[XY] = E[X]E[Y].
(b) We observe that P(X = 1, Y = 1) = 1/12 ≠ P(X = 1)P(Y = 1) = 1/9; thus X and Y are
not independent.
1.20 (a) ∫∫ f_XY(x,y)dxdy = ∫_0^2 ∫_0^2 k(x + y)dxdy = 8k = 1 ⇒ k = 1/8.
(b) The marginal density functions of X and Y are:
f_X(x) = (1/8)∫_0^2 (x + y)dy = x/4 + 1/4 for 0 ≤ x ≤ 2,
f_Y(y) = (1/8)∫_0^2 (x + y)dx = y/4 + 1/4 for 0 ≤ y ≤ 2.
(c) P(X < 1 | Y < 1) = P(X < 1, Y < 1)/P(Y < 1)
= [(1/8)∫_0^1 ∫_0^1 (x + y)dxdy] / [∫_0^1 (y/4 + 1/4)dy] = (1/8)/(3/8) = 1/3.
(d) E[X] = ∫_0^2 x(x + 1)/4 dx = 7/6 = E[Y].
To determine ρ_xy, we solve for
E[XY] = (1/8)∫_0^2 ∫_0^2 xy(x + y)dxdy = 4/3.
Also, E[X²] = E[Y²] = ∫_0^2 x²(x + 1)/4 dx = 5/3, so that
σ_x² = σ_y² = 5/3 - (7/6)² = 11/36 and σ_x = σ_y = √11/6.
Thus, the correlation coefficient is
ρ_xy = (E[XY] - E[X]E[Y])/(σ_x σ_y) = (4/3 - 49/36)/(11/36) = -1/11 = -0.0909.
(e) We observe from (d) that X and Y are correlated and thus they are not independent.
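The moments above can be cross-checked numerically on a fine midpoint grid (a sketch; the tolerances reflect the grid error of the rule):

```python
# Problem 1.20: moments of f_XY(x,y) = (x + y)/8 on [0,2]^2 via a grid.
n = 400
h = 2.0 / n
pts = [(i + 0.5) * h for i in range(n)]

m = mx = mxy = mx2 = 0.0
for x in pts:
    for y in pts:
        w = (x + y) / 8 * h * h
        m += w; mx += x * w; mxy += x * y * w; mx2 += x * x * w

assert abs(m - 1) < 1e-6       # density integrates to 1
assert abs(mx - 7 / 6) < 1e-4  # E[X] = 7/6
assert abs(mxy - 4 / 3) < 1e-4 # E[XY] = 4/3
var = mx2 - mx * mx            # sigma_x^2 = 11/36
rho = (mxy - mx * mx) / var    # sigma_x = sigma_y by symmetry
print(round(rho, 4))  # ≈ -0.0909
```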
1.21 (a) ∫∫ f_XY(x,y)dxdy = ∫_0^4 ∫_1^5 kxy dxdy = 96k = 1 ⇒ k = 1/96.
(b) P(X ≥ 3, Y ≤ 2) = (1/96)∫_0^2 ∫_3^5 xy dxdy = (1/96)(8)(2) = 1/6 ≈ 0.1667.
P(1 < X < 2, 2 < Y < 3) = (1/96)∫_2^3 ∫_1^2 xy dxdy = 5/128 = 0.03906.
(c) P(1 < X < 2 | 2 < Y < 3) = P(1 < X < 2, 2 < Y < 3)/P(2 < Y < 3) = (5/128)/∫_2^3 f_Y(y)dy,
where f_Y(y) = ∫_1^5 (xy/96)dx = y/8 for 0 ≤ y ≤ 4. Therefore,
P(1 < X < 2 | 2 < Y < 3) = (5/128)/(5/16) = 1/8 = 0.125.
(d) The conditional density is f_{X|Y}(x|y) = f_XY(x,y)/f_Y(y) = (xy/96)/(y/8) = x/12
for 1 ≤ x ≤ 5, so that
E[X | Y = y] = ∫_1^5 x f_{X|Y}(x|y)dx = ∫_1^5 (x²/12)dx = 31/9 = 3.444.
1.22 (a) We first find the constant k. Hence,
∫_1^2 ∫_1^3 kxy dxdy = 6k = 1 ⇒ k = 1/6.
(b) The marginal densities of X and Y are
f_X(x) = ∫_1^2 (xy/6)dy = x/4 for 1 < x < 3, and
f_Y(y) = ∫_1^3 (xy/6)dx = 2y/3 for 1 < y < 2.
Since f_X(x)f_Y(y) = (x/4)(2y/3) = xy/6 = f_XY(x,y), X and Y are independent.
1.23 We first determine the marginal density functions of X and Y from
f_XY(x,y) = 16y/x³ for x > 2, 0 < y < 1:
f_X(x) = ∫_0^1 (16y/x³)dy = 8/x³ for x > 2,
f_Y(y) = ∫_2^∞ (16y/x³)dx = 2y for 0 < y < 1.
Then, the mean of X is E[X] = ∫_2^∞ x(8/x³)dx = 4,
and the mean of Y is E[Y] = ∫_0^1 y(2y)dy = 2/3.
For the density of Problem 1.22, the event X + Y < 3 corresponds to the part of the
support below the line x + y = 3, which crosses it at (1,2) and (2,1). Hence,
P(X + Y < 3) = ∫_1^2 ∫_1^{3-y} (xy/6)dxdy = 7/48 = 0.1458.
1.24 We first find the constant k of f_Y(y) = kye^{-3y}, y ≥ 0, to be
∫_0^∞ kye^{-3y}dy = k/9 = 1 ⇒ k = 9.
X is independent of Y with f_X(x) = 2e^{-2x}, x ≥ 0, so f_XY(x,y) = 18ye^{-3y}e^{-2x}.
(a) P(X + Y > 1) = 1 - P(X + Y ≤ 1) = 1 - ∫_0^1 ∫_0^{1-y} 18ye^{-3y}e^{-2x}dxdy
= 9e^{-2} - 14e^{-3} ≈ 0.521.
(b) P(1 < X < 2, Y ≤ 1) = ∫∫ f_XY(x,y)dxdy = P(1 < X < 2)P(Y ≤ 1)
= (e^{-2} - e^{-4})(1 - 4e^{-3}) = e^{-2} - e^{-4} - 4e^{-5} + 4e^{-7}.
(c) P(1 < X < 2) = ∫_1^2 2e^{-2x}dx = e^{-2} - e^{-4}.
(d) P(Y ≤ 1) = ∫_0^1 9ye^{-3y}dy = 1 - 4e^{-3}.
(e) P(1 < X < 2 | Y ≤ 1) = P(1 < X < 2, Y ≤ 1)/P(Y ≤ 1)
= (e^{-2} - e^{-4} - 4e^{-5} + 4e^{-7})/(1 - 4e^{-3}) = e^{-2} - e^{-4}.
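Part (a) can be verified numerically, assuming the densities reconstructed above (the inner integral over x is taken in closed form and the outer one by the midpoint rule):

```python
from math import exp

# Problem 1.24(a): P(X + Y <= 1) = ∫_0^1 9y e^{-3y} (1 - e^{-2(1-y)}) dy.
n = 20000
h = 1.0 / n
p_le_1 = sum(
    9 * y * exp(-3 * y) * (1 - exp(-2 * (1 - y))) * h
    for y in ((i + 0.5) * h for i in range(n))
)
closed = 9 * exp(-2) - 14 * exp(-3)  # derived value of P(X + Y > 1)
print(round(closed, 4))  # ≈ 0.5211
assert abs((1 - p_le_1) - closed) < 1e-6
```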
1.25 (a) Using f_X(x) = e^{-x}, x ≥ 0, and Y = g(X) = X²/2,
E[Y] = E[g(X)] = ∫ g(x)f_X(x)dx = ∫_0^∞ (x²/2)e^{-x}dx = 1.
(b) We use the transformation of random variables (the fundamental theorem) to find the
density function of Y. With x = √(2y) the only root in the support of X, and dy/dx = x,
f_Y(y) = f_X(√(2y))/√(2y) = (1/√(2y)) e^{-√(2y)} for y > 0.
Then, the mean of Y is E[Y] = ∫_0^∞ y f_Y(y)dy = 1.
Both results (a) and (b) agree.
1.26 (a) To find the constant k, we solve
∫∫ f_XY(x,y)dxdy = ∫_0^3 ∫_y^3 kxy dxdy = (81/8)k = 1 ⇒ k = 8/81.
(b) The marginal density function of X is
f_X(x) = ∫_0^x (8/81)xy dy = (4/81)x³ for 0 ≤ x ≤ 3.
(c) The marginal density function of Y is
f_Y(y) = ∫_y^3 (8/81)xy dx = (4/81)y(9 - y²) for 0 ≤ y ≤ 3.
(d) f_{Y|X}(y|x) = f_XY(x,y)/f_X(x) = [(8/81)xy]/[(4/81)x³] = 2y/x² for 0 ≤ y ≤ x, and
0 otherwise, while
f_{X|Y}(x|y) = f_XY(x,y)/f_Y(y) = 2x/(9 - y²) for y ≤ x ≤ 3, and 0 otherwise.
1.27 The density function of Z = X + Y is the convolution of the densities of X and Y:
f_Z(z) = f_X(z) * f_Y(z) = ∫ f_X(x)f_Y(z - x)dx,
where f_X(x) = e^{-x}, x ≥ 0, and Y is uniform with f_Y(y) = 1/4 for 0 ≤ y ≤ 4. The
limits of integration are 0 to z for 0 ≤ z ≤ 4, and z - 4 to z for z > 4.
f_Z(z) = ∫_0^z (1/4)e^{-x}dx = (1/4)(1 - e^{-z}) for 0 ≤ z ≤ 4,
f_Z(z) = ∫_{z-4}^z (1/4)e^{-x}dx = (1/4)(e⁴ - 1)e^{-z} for z > 4,
and f_Z(z) = 0 otherwise.
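Both branches can be checked by numerically convolving the two densities (a sketch, assuming as above an exponential X and a uniform Y on [0, 4]):

```python
from math import exp

# Problem 1.27: numerically evaluate ∫ f_X(x) f_Y(z - x) dx and compare
# with the two closed-form branches at sample points.
def fz(z, n=20000):
    lo, hi = max(0.0, z - 4.0), z   # support of f_Y(z - x) = 1/4
    h = (hi - lo) / n
    return sum(exp(-(lo + (i + 0.5) * h)) * 0.25 for i in range(n)) * h

for z in (0.5, 2.0, 3.9):
    assert abs(fz(z) - 0.25 * (1 - exp(-z))) < 1e-6
for z in (4.1, 6.0):
    assert abs(fz(z) - 0.25 * (exp(4) - 1) * exp(-z)) < 1e-6
print("convolution branches verified")
```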
1.28 The density function of Z = X + Y is the discrete convolution
f_Z(z) = Σ_y f_Y(y) f_X(z - y),
where f_X(1) = 0.4, f_X(2) = 0.2, f_X(3) = 0.4 and f_Y(1) = 0.3, f_Y(2) = 0.5,
f_Y(3) = 0.2. Sliding f_X(-x) past f_Y(y), we obtain

z        2     3     4     5     6
f_Z(z)   0.12  0.26  0.30  0.24  0.08

Note that Σ_i f_Z(z_i) = 0.12 + 0.26 + 0.30 + 0.24 + 0.08 = 1.0, as expected.
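The table can be reproduced directly by summing products of the two mass functions (a quick sketch):

```python
# Problem 1.28: discrete convolution of the two probability mass functions.
fx = {1: 0.4, 2: 0.2, 3: 0.4}
fy = {1: 0.3, 2: 0.5, 3: 0.2}

fz = {}
for x, px in fx.items():
    for y, py in fy.items():
        fz[x + y] = fz.get(x + y, 0.0) + px * py

expected = {2: 0.12, 3: 0.26, 4: 0.30, 5: 0.24, 6: 0.08}
assert all(abs(fz[z] - p) < 1e-9 for z, p in expected.items())
assert abs(sum(fz.values()) - 1.0) < 1e-9
print(fz)
```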
1.29 (a) With X and Y independent, each exponential with f(u) = e^{-u}, u ≥ 0, the
distribution of Z = XY is
F_Z(z) = P(XY ≤ z) = ∫_0^∞ ∫_0^{z/y} e^{-(x+y)}dxdy = ∫_0^∞ e^{-y}(1 - e^{-z/y})dy
= 1 - ∫_0^∞ e^{-y}e^{-z/y}dy for z ≥ 0.
Therefore,
f_Z(z) = dF_Z(z)/dz = ∫_0^∞ (1/y)e^{-y - z/y}dy for z ≥ 0, and 0 otherwise.
(b) For Z = X + Y,
f_Z(z) = ∫ f_Y(y)f_X(z - y)dy = ∫_0^z e^{-y}e^{-(z-y)}dy = ze^{-z} for z ≥ 0, and
0 otherwise.
1.30 With X and Y independent and uniform on (0,1), the density function of Z = XY is
f_Z(z) = ∫ (1/|y|) f_XY(z/y, y)dy = ∫_z^1 (1/y)dy = -ln z = ln(1/z) for 0 < z < 1, and
0 otherwise.
(Bar plot of f_Z(z) for Problem 1.28 omitted; it shows the values 0.12, 0.26, 0.30,
0.24, 0.08 at z = 2, 3, 4, 5, 6.)
1.31 (a) The joint density is f_XY(x,y) = e^{-x} for x ≥ 0 and 0 ≤ y ≤ 1. The marginal
density function of X is
f_X(x) = ∫_0^1 e^{-x}dy = e^{-x} for x ≥ 0.
(b) The marginal density function of Y is
f_Y(y) = ∫_0^∞ e^{-x}dx = 1 for 0 ≤ y ≤ 1.
(c) Since f_X(x)f_Y(y) = f_XY(x,y), X and Y are statistically independent.
(d) For Z = X + Y, f_Z(z) = ∫ f_Y(y)f_X(z - y)dy. For 0 ≤ z < 1,
f_Z(z) = ∫_0^z e^{-x}dx = 1 - e^{-z},
while for z ≥ 1 the integration runs from z - 1 to z.
1.32 For Z = X/Y with X and Y independent, each exponential with unit parameter,
F_Z(z) = P(Z ≤ z) = P(X/Y ≤ z) = P(X ≤ Yz) = ∫_0^∞ ∫_0^{yz} e^{-x}e^{-y}dxdy
= ∫_0^∞ e^{-y}(1 - e^{-yz})dy = 1 - 1/(1 + z) for z > 0.
Hence, the density function is
f_Z(z) = dF_Z(z)/dz = 1/(1 + z)² for z > 0, and 0 for z < 0.
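The derived CDF can be checked by Monte Carlo simulation (a sketch, assuming unit-parameter exponentials as above; the seed makes the run reproducible):

```python
import random

# Problem 1.32: empirical check that Z = X/Y (X, Y iid exponential)
# has F_Z(z) = 1 - 1/(1 + z).
random.seed(1)
n = 200_000
count = {1.0: 0, 3.0: 0}
for _ in range(n):
    z = random.expovariate(1.0) / random.expovariate(1.0)
    for t in count:
        if z <= t:
            count[t] += 1

for t, c in count.items():
    assert abs(c / n - (1 - 1 / (1 + t))) < 0.01
print("empirical CDF matches 1 - 1/(1+z)")
```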
1.33 (a) Solving the integral
∫_1^2 ∫_1^3 k x1 x2 dx1 dx2 = 6k = 1, we obtain k = 1/6.
(Problem 1.31(d), continued: for z ≥ 1, f_Z(z) = ∫_{z-1}^z e^{-x}dx = (e - 1)e^{-z}.)
(b) The Jacobian of the transformation is
J(x1,x2) = det [ ∂y1/∂x1  ∂y1/∂x2 ; ∂y2/∂x1  ∂y2/∂x2 ] = det [ 1  0 ; x2²  2x1x2 ] = 2x1x2,
where y1 = x1 and y2 = x1x2².
Hence,
f_{Y1Y2}(y1,y2) = f_{X1X2}(x1,x2)/|J(x1,x2)| = (x1x2/6)/(2x1x2) = 1/12 for (y1,y2) ∈ D,
and 0 otherwise, where D is the domain of definition.
Side 1: x1 = 1 ⇒ y1 = 1, and y2 = x2² runs from y2 = 1 (x2 = 1) to y2 = 4 (x2 = 2).
Side 2: x1 = 3 ⇒ y1 = 3, and y2 = 3x2² runs from y2 = 3 (x2 = 1) to y2 = 12 (x2 = 2).
Side 3: x2 = 2 ⇒ y2 = 4x1 = 4y1, from (y1,y2) = (1,4) to (3,12).
Side 4: x2 = 1 ⇒ y2 = x1 = y1, from (1,1) to (3,3).
Therefore, D is the region 1 ≤ y1 ≤ 3, y1 ≤ y2 ≤ 4y1, bounded by the lines y2 = y1 and
y2 = 4y1.
1.34 (a) With f_{X1X2}(x1,x2) = e^{-(x1+x2)} for x1 > 0 and x2 > 0, the marginal
density functions of X1 and X2 are
f_{X1}(x1) = ∫_0^∞ e^{-(x1+x2)}dx2 = e^{-x1} for x1 > 0, and 0 otherwise,
f_{X2}(x2) = ∫_0^∞ e^{-(x1+x2)}dx1 = e^{-x2} for x2 > 0, and 0 otherwise.
Since f_{X1X2}(x1,x2) = f_{X1}(x1)f_{X2}(x2), X1 and X2 are independent.
(b) With y1 = x1 + x2 and y2 = x1/x2, the joint density function of (Y1,Y2) is given by
f_{Y1Y2}(y1,y2) = f_{X1X2}(x1,x2)/|J(x1,x2)| for (y1,y2) ∈ D, and 0 otherwise.
(Figure for Problem 1.33: the region D in the (y1,y2) plane, bounded by y2 = y1 and
y2 = 4y1 between y1 = 1 and y1 = 3.)
The Jacobian of the transformation is given by
J(x1,x2) = det [ ∂y1/∂x1  ∂y1/∂x2 ; ∂y2/∂x1  ∂y2/∂x2 ] = det [ 1  1 ; 1/x2  -x1/x2² ]
= -(x1 + x2)/x2², so |J(x1,x2)| = (x1 + x2)/x2².
Hence,
f_{Y1Y2}(y1,y2) = x2² e^{-(x1+x2)}/(x1 + x2), but y1 = x1 + x2 and y2 = x1/x2 ⇒ x1 = y2x2.
Thus, f_{Y1Y2}(y1,y2) = (x2²/y1)e^{-y1}. Also, x2 = y1 - x1 = y1 - y2x2 ⇒ x2 = y1/(1 + y2).
Making the respective substitutions, we obtain
f_{Y1Y2}(y1,y2) = [y1²/(1 + y2)²](1/y1)e^{-y1} = y1 e^{-y1}/(1 + y2)²
for y1 > 0 and y2 > 0.
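As a sanity check, the marginals implied by the derived joint density, y1 e^{-y1} for Y1 and 1/(1 + y2)² for Y2, can be compared with simulated draws of X1 and X2 (a sketch with a fixed seed):

```python
import random
from math import exp

# Problem 1.34: with X1, X2 iid exponential(1), the derived joint density
# implies P(Y1 <= 1) = 1 - 2e^{-1} and P(Y2 <= 1) = 1/2.
random.seed(7)
n = 200_000
c1 = c2 = 0
for _ in range(n):
    x1, x2 = random.expovariate(1.0), random.expovariate(1.0)
    c1 += (x1 + x2) <= 1.0
    c2 += (x1 / x2) <= 1.0

assert abs(c1 / n - (1 - 2 * exp(-1))) < 0.01
assert abs(c2 / n - 0.5) < 0.01
print("marginals consistent with y1*exp(-y1) and 1/(1+y2)^2")
```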