MATH 246 Probability and Random Processes
Solution to Final Examination
Fall 2004 Course Instructor: Prof. Y. K. Kwok

Time allowed: 100 minutes
1. Now $E[X \mid Y = 1] = 3$, $E[X \mid Y = 2] = 5 + E[X]$ and $E[X \mid Y = 3] = 7 + E[X]$. We then have
\[ E[X] = \frac{1}{3}\big(3 + 5 + E[X] + 7 + E[X]\big) \]
so that $E[X] = 15$.
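The recursion above can be sanity-checked by simulation. A minimal sketch, under the interpretation suggested by the three conditional expectations (outcome 1 ends the process after 3 time units; outcomes 2 and 3 cost 5 and 7 units respectively and restart it; the three outcomes are equally likely — this scenario is an assumption, since the problem statement is not reproduced here):

```python
import random

def sample_X():
    # One realization of X: accumulate restart costs until outcome 1 occurs.
    t = 0.0
    while True:
        outcome = random.randint(1, 3)   # each outcome has probability 1/3
        if outcome == 1:
            return t + 3.0               # outcome 1 ends after 3 time units
        t += 5.0 if outcome == 2 else 7.0  # outcomes 2 and 3 restart

random.seed(0)
n = 200_000
estimate = sum(sample_X() for _ in range(n)) / n  # should be close to 15
```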
2.
\[ f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy = \begin{cases} \dfrac{2}{\pi}\sqrt{1 - x^2}, & -1 \le x \le 1 \\ 0, & \text{otherwise} \end{cases} \]
\[ f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx = \begin{cases} \dfrac{2}{\pi}\sqrt{1 - y^2}, & -1 \le y \le 1 \\ 0, & \text{otherwise.} \end{cases} \]
When $-1 \le x \le 1$ and $-1 \le y \le 1$,
\[ f_X(x) f_Y(y) \ne f_{XY}(x, y), \]
so that $X$ and $Y$ cannot be independent. On the other hand,
\[ E[X] = \int_{-1}^{1} \frac{2x}{\pi}\sqrt{1 - x^2}\, dx = 0, \]
\[ E[Y] = \int_{-1}^{1} \frac{2y}{\pi}\sqrt{1 - y^2}\, dy = 0, \]
\[ E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{XY}(x, y)\, dx\, dy = \int_{-1}^{1} \left( \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{xy}{\pi}\, dy \right) dx = 0,
\]
so that
\[ \mathrm{COV}(X, Y) = E[XY] - E[X]E[Y] = 0. \]
Hence, $X$ and $Y$ are uncorrelated.
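A short Monte Carlo sketch of the uncorrelated-but-dependent conclusion, assuming the joint density is uniform on the unit disk $x^2 + y^2 \le 1$ (i.e. $f_{XY} = 1/\pi$ there — an assumption consistent with the marginals above, since the original problem statement is not reproduced):

```python
import random

# Rejection sampling from the unit disk (assumed joint density 1/pi).
random.seed(0)
xs, ys = [], []
while len(xs) < 200_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1.0:
        xs.append(x)
        ys.append(y)

n = len(xs)
mean = lambda v: sum(v) / n
# Covariance of X and Y: should be near 0 (uncorrelated).
cov_xy = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
# Dependence shows up in the squares: on the disk,
# Cov(X^2, Y^2) = E[X^2 Y^2] - E[X^2]E[Y^2] = 1/24 - 1/16 = -1/48 < 0.
cov_sq = (mean([x * x * y * y for x, y in zip(xs, ys)])
          - mean([x * x for x in xs]) * mean([y * y for y in ys]))
```

A nonzero `cov_sq` alongside a vanishing `cov_xy` illustrates that zero correlation does not imply independence.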
3. Let $T_S$ and $T_L$ denote the times between now and the next earthquake in San Francisco and Los Angeles, respectively. To calculate $P[T_S > T_L]$, we condition on $T_L$:
\begin{align*}
P[T_S > T_L] &= \int_0^{\infty} P[T_S > T_L \mid T_L = y]\, f_{T_L}(y)\, dy \\
&= \int_0^{\infty} P[T_S > y]\, \lambda_2 e^{-\lambda_2 y}\, dy \\
&= \int_0^{\infty} e^{-\lambda_1 y}\, \lambda_2 e^{-\lambda_2 y}\, dy = \frac{\lambda_2}{\lambda_1 + \lambda_2}.
\end{align*}
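The closed form $\lambda_2/(\lambda_1 + \lambda_2)$ is easy to verify numerically. A minimal sketch with illustrative rates $\lambda_1 = 1$, $\lambda_2 = 2$ (these values are not from the problem), for which the answer is $2/3$:

```python
import random

# Monte Carlo check of P[T_S > T_L] = lam2 / (lam1 + lam2)
# for independent exponential times with illustrative rates.
random.seed(0)
lam1, lam2 = 1.0, 2.0
n = 200_000
hits = sum(random.expovariate(lam1) > random.expovariate(lam2)
           for _ in range(n))
estimate = hits / n   # should be close to lam2/(lam1+lam2) = 2/3
```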
4.
\[ f_X(x) = \begin{cases} \dfrac{1}{2}, & -1 < x < 1 \\ 0, & \text{otherwise,} \end{cases} \qquad f_Y(y) = \begin{cases} \dfrac{1}{2}, & 0 < y < 2 \\ 0, & \text{otherwise.} \end{cases} \]
The density of the quotient $Z = X/Y$ is
\[ f_Z(z) = \int_{-\infty}^{\infty} |y|\, f_{XY}(yz, y)\, dy, \]
and observe that
\[ f_{XY}(yz, y) = \begin{cases} \dfrac{1}{4}, & -1 < yz < 1 \text{ and } 0 < y < 2 \\ 0, & \text{otherwise.} \end{cases} \]
$f_{XY}(yz, y)$ is non-zero over the shaded region (figure in the original, not reproduced here).
(i) $z > \dfrac{1}{2}$:
\[ f_Z(z) = \int_0^{1/z} \frac{y}{4}\, dy = \frac{1}{8z^2}. \]
(ii) $-\dfrac{1}{2} \le z \le \dfrac{1}{2}$:
\[ f_Z(z) = \int_0^{2} \frac{y}{4}\, dy = \frac{1}{2}. \]
(iii) $z < -\dfrac{1}{2}$:
\[ f_Z(z) = \int_0^{-1/z} \frac{y}{4}\, dy = \frac{1}{8z^2}. \]
In summary,
\[ f_Z(z) = \begin{cases} \dfrac{1}{2}, & |z| \le \dfrac{1}{2} \\ \dfrac{1}{8z^2}, & |z| > \dfrac{1}{2}. \end{cases} \]
As a check,
\[ \int_{-\infty}^{\infty} f_Z(z)\, dz = \int_{-\infty}^{-1/2} \frac{1}{8z^2}\, dz + \int_{-1/2}^{1/2} \frac{1}{2}\, dz + \int_{1/2}^{\infty} \frac{1}{8z^2}\, dz = \frac{1}{4} + \frac{1}{2} + \frac{1}{4} = 1.
\]
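The summary density can also be checked by simulation. Integrating it gives $P(|Z| \le 1/2) = 1/2$ and $P(Z > 1) = \int_1^\infty dz/(8z^2) = 1/8$; a minimal sketch, assuming $Z = X/Y$ with $X \sim U(-1,1)$ and $Y \sim U(0,2)$ independent as above:

```python
import random

# Monte Carlo check of the derived density of Z = X/Y.
random.seed(0)
n = 200_000
zs = [random.uniform(-1, 1) / random.uniform(0, 2) for _ in range(n)]
p_mid = sum(abs(z) <= 0.5 for z in zs) / n   # should be close to 1/2
p_tail = sum(z > 1.0 for z in zs) / n        # should be close to 1/8
```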
5. (a) Assume $t_1 < t_2$.
\begin{align*}
C_N(t_1, t_2) &= E[(N(t_1) - \lambda t_1)(N(t_2) - \lambda t_2)] \\
&= E[(N(t_1) - \lambda t_1)\{[N(t_2) - N(t_1) - \lambda(t_2 - t_1)] + N(t_1) - \lambda t_1\}] \\
&= E[[N(t_1) - \lambda t_1][N(t_2) - N(t_1) - \lambda(t_2 - t_1)]] + \mathrm{var}(N(t_1)).
\end{align*}
By the independent increments property, $N(t_1) - \lambda t_1$ and $N(t_2) - N(t_1) - \lambda(t_2 - t_1)$ are independent so that
\[ E[[N(t_1) - \lambda t_1][N(t_2) - N(t_1) - \lambda(t_2 - t_1)]] = E[N(t_1) - \lambda t_1]\, E[N(t_2) - N(t_1) - \lambda(t_2 - t_1)]. \]
Furthermore, using the stationary increments property, we have
\[ E[N(t_2) - N(t_1)] = E[N(t_2 - t_1)] = \lambda(t_2 - t_1) \]
so that
\[ E[N(t_2) - N(t_1) - \lambda(t_2 - t_1)] = 0. \]
Hence,
\[ C_N(t_1, t_2) = \lambda t_1 = \mathrm{var}(N(t_1)) = \lambda \min(t_1, t_2). \]
When $t_2 \le t_1$, we can show similarly that
\[ C_N(t_1, t_2) = \lambda t_2 = \lambda \min(t_1, t_2). \]
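The covariance formula $C_N(t_1, t_2) = \lambda \min(t_1, t_2)$ can be checked by simulating the two increments directly. A minimal sketch with illustrative values $\lambda = 2$, $t_1 = 0.5$, $t_2 = 1.3$ (chosen for the example, not from the problem), for which the covariance should be about $\lambda t_1 = 1$:

```python
import math
import random

def poisson(mu):
    # Knuth's method for drawing one Poisson(mu) sample.
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

random.seed(0)
lam, t1, t2 = 2.0, 0.5, 1.3
n = 200_000
n1s, n2s = [], []
for _ in range(n):
    a = poisson(lam * t1)              # N(t1)
    b = a + poisson(lam * (t2 - t1))   # N(t2), via an independent increment
    n1s.append(a)
    n2s.append(b)

m1, m2 = sum(n1s) / n, sum(n2s) / n
cov = sum(x * y for x, y in zip(n1s, n2s)) / n - m1 * m2  # approx lam*t1 = 1.0
```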
(b) Consider
\begin{align*}
P[N(t) = 1 \mid N(1) = 1] &= \frac{P[N(t) = 1,\ N(1) - N(t) = 0]}{P[N(1) = 1]} \\
&= \frac{P[N(t) = 1]\, P[N(1 - t) = 0]}{P[N(1) = 1]} \quad \text{(independent increments and stationary increments properties)} \\
&= \frac{\lambda t\, e^{-\lambda t}\, e^{-\lambda(1 - t)}}{\lambda e^{-\lambda}} = t.
\end{align*}
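The result $P[N(t) = 1 \mid N(1) = 1] = t$ says the single arrival is uniformly distributed on $[0, 1]$, and the rate cancels out. A minimal simulation sketch with illustrative values $\lambda = 1$ and $t = 0.4$ (chosen for the example):

```python
import random

# Simulate a rate-lam Poisson process on [0, 1] via exponential gaps,
# condition on N(1) = 1, and estimate P[N(t) = 1 | N(1) = 1].
random.seed(0)
lam, t = 1.0, 0.4
hits = total = 0
for _ in range(200_000):
    arrivals = []
    s = random.expovariate(lam)          # first arrival time
    while s < 1.0:
        arrivals.append(s)
        s += random.expovariate(lam)     # next inter-arrival gap
    if len(arrivals) == 1:               # condition on N(1) = 1
        total += 1
        hits += arrivals[0] < t          # event N(t) = 1
estimate = hits / total                  # should be close to t = 0.4
```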
6. mean $= E[X(t)] = E[A \cos t + B \sin t] = \cos t\, E[A] + \sin t\, E[B] = 0$
autocovariance $= E[\{X(t_1) - m_X(t_1)\}\{X(t_2) - m_X(t_2)\}] = E[X(t_1) X(t_2)] = E[(A \cos t_1 + B \sin t_1)(A \cos t_2 + B \sin t_2)]$