
STATS217: Introduction to Stochastic Processes 1 Homework 1

Instructor: Tselil Schramm Due: January 19, 2023

Homework 1: discrete-time Markov chains and stationary distributions

This homework assignment is due on January 19, 2023 at 10:00 AM on Gradescope. It is worth 20% of
your total homework grade. You may work in groups, but you must write up your own answers. Try to
keep your answers clear and concise; the answer key solutions solve each of these problems in a few lines
at most.

0. Optional, yet highly recommended, warm-up (0pts). Practice the basics by doing problems 1.6,
1.12, 1.36(i), and 1.37 from Durrett (second edition).

1. Markov or not? (10 points) For each of the following random processes, decide whether it is a
discrete-time Markov chain. If not, explain why. If it is a discrete-time Markov chain, describe the
state space Ω and the transition probabilities 𝐏𝐫[𝑋1 = 𝑥 ∣ 𝑋0 = 𝑦] for each 𝑥, 𝑦 ∈ Ω.

(a) In an infinite population of individuals, at time 0, patient zero infects Pois(𝜆) people with a
disease. Then, at each time 𝑡 ⩾ 1, each of the individuals infected at time 𝑡 − 1 independently
infects Pois(𝜆) people. Reinfections never occur.
i. The process (𝑋𝑡 )𝑡⩾0 where 𝑋𝑡 = the number of people who contract the disease at time 𝑡.
ii. The process (𝑆𝑡 )𝑡⩾0 where 𝑆𝑡 = the number of people who have ever had the disease (by
time 𝑡).
(b) Two balloons are connected by a straw. There are 𝑁 = 𝑋𝑡 + 𝑌𝑡 gas atoms in the system at all
times 𝑡 ∈ Z+ , where 𝑋𝑡 is the number of atoms in the right-hand balloon and 𝑌𝑡 is the number
in the left-hand balloon. At each time, one of the 𝑁 gas atoms is chosen uniformly at random
and moved through the straw into the opposite balloon.
(c) The process (𝑋𝑡 )𝑡⩾0 defined as follows: let 𝑌0 , 𝑌1 , … be a sequence of independent samples from
Binom(𝑛, 𝑝), and let 𝑋𝑡 = max{𝑌0 , … , 𝑌𝑡 }.
(d) The process (𝑋𝑡 )𝑡⩾0 defined as follows: let 𝑌0 , 𝑌1 , … be a discrete-time Markov chain over {0, 1, … , 𝑛}
with transition matrix 𝑃 and let 𝑋𝑡 = max{𝑌0 , … , 𝑌𝑡 }.
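For intuition (not required for any solution), the infection process in 1(a) can be simulated directly. The sketch below assumes 𝑋0 ∼ Pois(𝜆) counts the people infected by patient zero at time 0; the function names are illustrative.

```python
import math
import random

def sample_poisson(lam, rng):
    """Draw one Pois(lam) sample via Knuth's multiplication method (lam > 0)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_infections(lam, steps, rng):
    """Simulate 1(a): X[t] = new infections at time t, S[t] = cumulative total.
    Assumes X[0] ~ Pois(lam), the number infected by patient zero at time 0."""
    X = [sample_poisson(lam, rng)]
    for _ in range(steps):
        # each person infected last step independently infects Pois(lam) people
        X.append(sum(sample_poisson(lam, rng) for _ in range(X[-1])))
    S, total = [], 0
    for x in X:
        total += x
        S.append(total)
    return X, S
```

Running `simulate_infections(1.5, 10, random.Random(0))` traces one realization of both processes at once, which makes the relationship 𝑆𝑡 = 𝑋0 + ⋯ + 𝑋𝑡 concrete.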

2. Chains beget Chains? (6 points) Suppose that (𝑋𝑡 )𝑡⩾0 and (𝑌𝑡 )𝑡⩾0 are two independent discrete-
time Markov chains on state space Ω with transition matrices 𝑃 and 𝑄 respectively.

(a) Is the process 𝑍𝑡 = (𝑋𝑡 , 𝑌𝑡 ) a discrete-time Markov chain? If so, what is its transition matrix (in
terms of 𝑃 and 𝑄)? If not, explain why not.
(b) Suppose Ω ⊂ Z. Is the process 𝑍𝑡 = 𝑋𝑡 ⋅ 𝑌𝑡 a discrete-time Markov chain? If so, what is its
transition matrix (in terms of 𝑃 and 𝑄)? If not, explain why not.
(c) Let 𝑘 ⩾ 2 be an integer. Is the fast-forward process 𝑍𝑡 = 𝑋𝑘𝑡 a discrete-time Markov chain? If
so, what is its transition matrix (in terms of 𝑃)? If not, explain why not.
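As a numeric sanity check only (it does not answer the problem), one can build natural candidate matrices on small examples and confirm they are stochastic: the Kronecker product is a candidate for the joint chain in (a), and 𝑃 raised to the 𝑘-th power for the fast-forward chain in (c). The matrix values are illustrative.

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(A, k):
    """A^k for an integer k >= 1 by repeated multiplication."""
    R = A
    for _ in range(k - 1):
        R = mat_mul(R, A)
    return R

def kron(A, B):
    """Kronecker product: rows and columns indexed by pairs of states."""
    return [[A[i][ip] * B[j][jp] for ip in range(len(A)) for jp in range(len(B))]
            for i in range(len(A)) for j in range(len(B))]

# small example transition matrices (illustrative values)
P = [[0.9, 0.1], [0.4, 0.6]]
Q = [[0.5, 0.5], [0.2, 0.8]]
joint = kron(P, Q)    # candidate transition matrix for Z_t = (X_t, Y_t)
fast = mat_pow(P, 3)  # candidate transition matrix for Z_t = X_{3t}
```

Every row of both candidates sums to 1, consistent with (but of course not proving) their being transition matrices.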

3. A condition for converging to uniform. (5 points) Show that if 𝑃 and 𝑃 ⊤ are both transition
matrices of finite, discrete-time Markov chains, then the uniform distribution over the state space is
stationary for 𝑃.
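The claim in problem 3 is easy to check numerically on a doubly stochastic example (a matrix whose rows and columns all sum to 1, so that 𝑃 and 𝑃 ⊤ are both transition matrices); the sketch below is illustrative, not a proof.

```python
# A doubly stochastic matrix: P and its transpose are both transition matrices.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]
n = len(P)
uniform = [1.0 / n] * n
# Stationarity means pi P = pi, where (pi P)(y) = sum_x pi(x) P(x, y).
piP = [sum(uniform[x] * P[x][y] for x in range(n)) for y in range(n)]
```

Here `piP` comes out equal to `uniform`, since each entry is 1/𝑛 times a column sum of 𝑃.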

4. Like a cycle. (5 points) Show that if 𝑃 is an irreducible discrete-time Markov chain over Ω in which
every state has period 𝑏, then Ω can be partitioned into 𝑏 sets 𝑆1 , … , 𝑆𝑏 so that 𝑃(𝑥, 𝑦) = 0 unless
there is some 𝑖 such that 𝑥 ∈ 𝑆𝑖 and 𝑦 ∈ 𝑆(𝑖+1) mod 𝑏 .
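For a concrete instance of problem 4, the simple random walk on a 4-cycle has period 𝑏 = 2, and the parity classes give exactly such a partition; the check below is illustrative only.

```python
# Simple random walk on the 4-cycle {0, 1, 2, 3}: every state has period 2.
n = 4
P = [[0.0] * n for _ in range(n)]
for i in range(n):
    P[i][(i + 1) % n] = 0.5
    P[i][(i - 1) % n] = 0.5
# Candidate partition S_1 = even states, S_2 = odd states (b = 2):
# every transition with P(x, y) > 0 must change parity.
alternates = all(P[x][y] == 0.0 or (x % 2) != (y % 2)
                 for x in range(n) for y in range(n))
```

Here `alternates` is `True`: mass only flows between the two parity classes, cycling 𝑆1 → 𝑆2 → 𝑆1 → ⋯ as the problem describes.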

5. All pairs at large times. (5 points) Show that if 𝑃 is the transition matrix of an irreducible and
aperiodic discrete-time Markov chain over finite state space Ω, then there exists some integer 𝑡 ⩾ 1
such that 𝑃 𝑡 (𝑥, 𝑦) > 0 for all 𝑥, 𝑦 ∈ Ω.
Hint: Lemma 1.16 in Durrett (second edition) proves that if the period of state 𝑥 is 1, then there exists some 𝑠 ⩾ 0 so that
𝑃 𝑟 (𝑥, 𝑥) > 0 for all 𝑟 ⩾ 𝑠. You can use that lemma without proof (provided that you attest to have read it).
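Problem 5's conclusion can be observed numerically: for the lazy walk on a directed 3-cycle below (irreducible, and aperiodic thanks to the self-loops), the smallest 𝑡 with 𝑃 𝑡 (𝑥, 𝑦) > 0 for all pairs is already 𝑡 = 2. The example is illustrative only.

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Lazy walk on a directed 3-cycle: stay w.p. 1/2, step forward w.p. 1/2.
P = [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5]]
# Find the smallest t such that every entry of P^t is strictly positive.
power, t = P, 1
while not all(entry > 0 for row in power for entry in row):
    power = mat_mul(power, P)
    t += 1
```

The loop terminates with `t == 2`: one lap of laziness is already enough to connect every ordered pair of states with positive probability.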

6. Time reversal (6 points). Suppose 𝑃 is the transition matrix of an irreducible, discrete-time
Markov chain with stationary distribution 𝜋. Consider the matrix

𝑃̂(𝑥, 𝑦) = (𝜋(𝑦)/𝜋(𝑥)) 𝑃(𝑦, 𝑥).

(a) Prove that 𝑃̂ is a valid transition matrix for a Markov chain.
(b) Give and prove a formula for 𝜇, the stationary distribution of 𝑃̂.
(c) Give a necessary and sufficient condition on 𝑃 for 𝑃̂ to satisfy the detailed balance condition.
(d) Let 𝑋0 , … , 𝑋𝑛 be a discrete-time Markov chain sampled with 𝑋0 ∼ 𝜋 and transitions according
to 𝑃. Let 𝑌0 , … , 𝑌𝑛 be sampled with 𝑌0 ∼ 𝜇 and transitions according to 𝑃̂. Prove that
𝐏𝐫[𝑋0 = 𝑎0 , 𝑋1 = 𝑎1 , … , 𝑋𝑛 = 𝑎𝑛 ] = 𝐏𝐫[𝑌0 = 𝑎𝑛 , 𝑌1 = 𝑎𝑛−1 , … , 𝑌𝑛 = 𝑎0 ].
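The objects in problem 6 can be sanity-checked numerically. For the biased walk on a 3-cycle below, the columns of 𝑃 also sum to 1, so 𝜋 is uniform and 𝑃̂ reduces to 𝑃 ⊤ , the walk run backwards; the example is illustrative, not a solution.

```python
# Biased walk on the 3-cycle: from i, step forward w.p. 0.8, back w.p. 0.2.
n = 3
P = [[0.0] * n for _ in range(n)]
for i in range(n):
    P[i][(i + 1) % n] = 0.8
    P[i][(i - 1) % n] = 0.2
# P is doubly stochastic, so the uniform distribution is stationary for it.
pi = [1.0 / n] * n
# Time reversal as defined in problem 6: P_hat(x, y) = pi(y) / pi(x) * P(y, x)
P_hat = [[pi[y] / pi[x] * P[y][x] for y in range(n)] for x in range(n)]
```

Each row of `P_hat` sums to 1 (as part (a) asks in general), and since 𝜋 is uniform here, `P_hat` is exactly the transpose of `P`: the reversed chain steps backward around the cycle with probability 0.8.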
