Daniel W. Stroock, An Introduction to Markov Processes, 2nd ed., Graduate Texts in Mathematics, Springer-Verlag Berlin Heidelberg, 2014. DOI: 10.1007/978-3-642-40523-5_1
1. Random Walks, a Good Place to Begin
Daniel W. Stroock, Department of Mathematics, Massachusetts Institute of Technology, Cambridge, MA, USA
Abstract
In order to introduce as soon as possible some of the ideas that underlie the whole book, Chap. 1 provides an introduction to Bernoulli random walks in one and higher dimensions. Emphasis is placed on the calculation of passage times and the dependence of recurrence properties on dimension.
The purpose of this chapter is to discuss some examples of Markov processes that can be understood even before the term Markov process is. Indeed, anyone who has been introduced to probability theory will recognize that these processes all derive from consideration of elementary coin tossing.
1.1 Nearest Neighbor Random Walks on $\mathbb{Z}$
Let $p$ be a fixed number from the open interval $(0,1)$, and suppose that $\{B_n : n \ge 1\}$ is a sequence of mutually independent, $\{-1,1\}$-valued random variables, each of which equals $1$ with probability $p$. That is, for any $n \ge 1$ and any $E = (\epsilon_1,\dots,\epsilon_n) \in \{-1,1\}^n$,
$$\mathbb{P}(B_1 = \epsilon_1,\dots,B_n = \epsilon_n) = p^{N(E)} q^{\,n - N(E)}, \quad\text{where } q \equiv 1 - p \text{ and } N(E) \equiv \#\{1 \le m \le n : \epsilon_m = 1\} = \frac{n + S_n(E)}{2} \text{ with } S_n(E) \equiv \sum_{m=1}^{n} \epsilon_m. \tag{1.1.1}$$
Next, set
$$X_0 = 0 \quad\text{and}\quad X_n = \sum_{m=1}^{n} B_m \quad\text{for } n \ge 1.$$
The above family of random variables $\{X_n : n \ge 0\}$ is often called a nearest neighbor random walk on $\mathbb{Z}$. Nearest neighbor random walks are examples of Markov processes, but the description that I have just given is the one which would be given in elementary probability theory, as opposed to a course, like this one, devoted to stochastic processes. When studying stochastic processes, the description should emphasize the dynamic nature of the family. Thus, a stochastic process oriented description might replace (1.1.1) by
$$\mathbb{P}(X_0 = 0) = 1 \quad\text{and}\quad \mathbb{P}\bigl(X_n - X_{n-1} = \epsilon \mid X_0,\dots,X_{n-1}\bigr) = \begin{cases} p & \text{if } \epsilon = 1, \\ q & \text{if } \epsilon = -1, \end{cases} \quad\text{for } n \ge 1, \tag{1.1.2}$$
where $\mathbb{P}(\,\cdot \mid X_0,\dots,X_{n-1})$ denotes conditional probability given $X_0,\dots,X_{n-1}$ (conditioning is treated in more detail later in the book). Specifically, it says that the process starts from $0$ at time $n = 0$ and proceeds so that, at each time $n \in \mathbb{Z}^+$, it moves one step forward with probability $p$ or one step backward with probability $q$, independent of where it has been before time $n$.
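To make the dynamic description (1.1.2) concrete, here is a minimal simulation sketch in Python (my addition, not part of the text); the function name `random_walk` and the chosen parameters are illustrative.

```python
import random

def random_walk(n_steps, p, seed=0):
    """Simulate X_0, ..., X_{n_steps}: at each time the walk moves +1
    with probability p and -1 with probability q = 1 - p, independently
    of everything that happened before, as in (1.1.2)."""
    rng = random.Random(seed)
    path = [0]                       # X_0 = 0 with probability 1
    for _ in range(n_steps):
        step = 1 if rng.random() < p else -1
        path.append(path[-1] + step)
    return path

# One sample path of length 10 with p = 0.6
print(random_walk(10, 0.6))
```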
1.1.1 Distribution at Time n
In this subsection I will present two approaches to computing $\mathbb{P}(X_n = m)$. The first computation is based on the description given in (1.1.1). Indeed, from that description it is clear that $\mathbb{P}(X_n = m) = 0$ unless $|m| \le n$ and $m$ has the same parity as $n$. In addition, it is obvious that
$$\mathbb{P}(X_n = n) = p^n \quad\text{and}\quad \mathbb{P}(X_n = -n) = q^n.$$
Finally, given $m \in \{-n,\dots,n\}$ with the same parity as $n$ and a string $E = (\epsilon_1,\dots,\epsilon_n) \in \{-1,1\}^n$ with (cf. (1.1.1)) $S_n(E) = m$, one has $N(E) = \frac{n+m}{2}$, and so
$$\mathbb{P}(B_1 = \epsilon_1,\dots,B_n = \epsilon_n) = p^{\frac{n+m}{2}} q^{\frac{n-m}{2}}.$$
Hence, because, when $0 \le k \le n$, $\binom{n}{k} \equiv \frac{n!}{k!\,(n-k)!}$ is the binomial coefficient "$n$ choose $k$", there are $\binom{n}{\frac{n+m}{2}}$ such strings $E$, we see that
$$\mathbb{P}(X_n = m) = \binom{n}{\frac{n+m}{2}}\, p^{\frac{n+m}{2}} q^{\frac{n-m}{2}} \quad\text{for } m \in \{-n, -n+2, \dots, n-2, n\}. \tag{1.1.3}$$
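As a quick sanity check on (1.1.3), one can evaluate the closed form numerically; the snippet below is my own illustration (the name `prob_X_n` is not from the text) and uses Python's `math.comb` for the binomial coefficient.

```python
from math import comb

def prob_X_n(n, m, p):
    """P(X_n = m) from (1.1.3): zero unless |m| <= n and m has the same
    parity as n; otherwise binom(n, (n+m)/2) p^((n+m)/2) q^((n-m)/2)."""
    if abs(m) > n or (n - m) % 2 != 0:
        return 0.0
    k = (n + m) // 2
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The probabilities over m = -n, ..., n sum to 1, as they must:
n, p = 6, 0.6
print(sum(prob_X_n(n, m, p) for m in range(-n, n + 1)))   # -> 1.0
```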
Our second computation of the same probability will be based on the more dynamic description given in (1.1.2). Namely, from (1.1.2) we see that $\mathbb{P}(X_n = m)$ equals
$$p\,\mathbb{P}(X_{n-1} = m - 1) + q\,\mathbb{P}(X_{n-1} = m + 1).$$
That is, if $(P_n)_m \equiv \mathbb{P}(X_n = m)$, then $(P_0)_m$ equals $1$ when $m = 0$ and $0$ otherwise, and
$$(P_n)_m = p\,(P_{n-1})_{m-1} + q\,(P_{n-1})_{m+1} \quad\text{for } n \ge 1. \tag{1.1.4}$$
Obviously, one can use (1.1.4) plus induction on $n$ to see that $(P_n)_m = 0$ unless $m = 2\ell - n$ for some $0 \le \ell \le n$, and that $(C_n)_0 = (C_n)_n = 1$ and $(C_n)_\ell = (C_{n-1})_{\ell-1} + (C_{n-1})_\ell$ for $1 \le \ell < n$, where $(C_n)_\ell \equiv p^{-\ell} q^{\,\ell - n} (P_n)_{2\ell - n}$. In other words, the family $\{(C_n)_\ell : n \ge 0 \text{ and } 0 \le \ell \le n\}$ is given by Pascal's triangle, and therefore the $(C_n)_\ell$ are the binomial coefficients $\binom{n}{\ell}$, which recovers (1.1.3).
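The recursion (1.1.4) can also be run directly. The sketch below (again my own, with illustrative names) propagates $(P_n)_m$ from $(P_0)_m$ and confirms that the result matches the closed form (1.1.3).

```python
from math import comb

def distribution_by_recursion(n, p):
    """Return {m: (P_n)_m} computed from (1.1.4), starting from
    (P_0)_0 = 1 and (P_0)_m = 0 for m != 0."""
    q = 1 - p
    P = {0: 1.0}
    for k in range(1, n + 1):
        # (P_k)_m = p (P_{k-1})_{m-1} + q (P_{k-1})_{m+1}
        P = {m: p * P.get(m - 1, 0.0) + q * P.get(m + 1, 0.0)
             for m in range(-k, k + 1)}
    return P

# Agreement with (1.1.3): (P_n)_{2l-n} = binom(n, l) p^l q^(n-l)
n, p = 5, 0.3
P = distribution_by_recursion(n, p)
assert all(abs(P[2 * l - n] - comb(n, l) * p**l * (1 - p)**(n - l)) < 1e-12
           for l in range(n + 1))
print("recursion agrees with the binomial formula")
```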
1.1.2 Passage Times via the Reflection Principle
More challenging than the computation in Sect. 1.1.1 is that of finding the distribution of the first passage time to a point $a \in \mathbb{Z}\setminus\{0\}$. That is, given $a \in \mathbb{Z}\setminus\{0\}$, set
$$\zeta^{\{a\}} \equiv \inf\{n \ge 0 : X_n = a\},$$
where, as usual, the infimum over the empty set is taken to be $+\infty$. Then $\zeta^{\{a\}}$ is the first passage time to $a$, and our goal here is to find its distribution. Equivalently, we want an expression for $\mathbb{P}(\zeta^{\{a\}} = n)$, and clearly, by the considerations in Sect. 1.1.1, we need only worry about $n$'s which satisfy $n \ge |a|$ and have the same parity as $a$.
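Before computing this distribution exactly, one can estimate it by simulation. The sketch below is my own (the cutoff `max_steps` is an arbitrary stand-in for $\zeta^{\{a\}} = \infty$); it illustrates that the empirical mass sits on $n \ge |a|$ with the parity of $a$.

```python
import random
from collections import Counter

def first_passage_time(a, p, rng, max_steps=10_000):
    """Return the first n with X_n = a, or None if a is not reached
    within max_steps."""
    x = 0
    for n in range(1, max_steps + 1):
        x += 1 if rng.random() < p else -1
        if x == a:
            return n
    return None

rng = random.Random(1)
a, p, trials = 2, 0.6, 50_000
counts = Counter(first_passage_time(a, p, rng) for _ in range(trials))
for n in range(1, 9):
    print(n, counts[n] / trials)   # nonzero only for even n >= 2 here
```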
Again I will present two approaches to this problem, the first of which is based on (1.1.1). To this end, assume that $a \in \mathbb{Z}^+$, suppose that $n \ge a$ has the same parity as $a$, and observe first that
$$\mathbb{P}\bigl(\zeta^{\{a\}} = n\bigr) = \mathbb{P}\bigl(\zeta^{\{a\}} > n-1 \,\&\, X_n = a\bigr) = \mathbb{P}\bigl(\zeta^{\{a\}} > n-1 \,\&\, X_{n-1} = a-1 \,\&\, B_n = 1\bigr) = p\,\mathbb{P}\bigl(\zeta^{\{a\}} > n-1 \,\&\, X_{n-1} = a-1\bigr),$$
where the middle equality holds because a walk that has not visited $a$ by time $n-1$ cannot be at $a+1$ at time $n-1$ (to reach $a+1$ from $0$ it would have had to pass through $a$), and the last equality uses the independence of $B_n$ from $B_1,\dots,B_{n-1}$. Hence, it suffices for us to compute $\mathbb{P}(\zeta^{\{a\}} > n-1 \,\&\, X_{n-1} = a-1)$. For this purpose, note that for any $E \in \{-1,1\}^{n-1}$ with $S_{n-1}(E) = a-1$, the event $\{(B_1,\dots,B_{n-1}) = E\}$ has probability $p^{\frac{n+a}{2}-1} q^{\frac{n-a}{2}}$.
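For small $n$, the displayed identity $\mathbb{P}(\zeta^{\{a\}} = n) = p\,\mathbb{P}(\zeta^{\{a\}} > n-1 \,\&\, X_{n-1} = a-1)$ can be verified by brute-force enumeration over sign strings, using (1.1.1) for the probability of each string. The helper names below are mine and purely illustrative.

```python
from itertools import product

def string_prob(E, p):
    """Probability of a sign string E, per (1.1.1): p^N(E) q^(|E|-N(E))."""
    n_plus = sum(1 for e in E if e == 1)
    return p ** n_plus * (1 - p) ** (len(E) - n_plus)

def partial_sums(E):
    s, out = 0, []
    for e in E:
        s += e
        out.append(s)
    return out

def check_identity(a, n, p):
    """Compare P(zeta^{a} = n) with p * P(zeta^{a} > n-1 & X_{n-1} = a-1)."""
    lhs = sum(string_prob(E, p) for E in product((-1, 1), repeat=n)
              if (S := partial_sums(E))[-1] == a and a not in S[:-1])
    rhs = p * sum(string_prob(E, p) for E in product((-1, 1), repeat=n - 1)
                  if (S := partial_sums(E))[-1] == a - 1 and a not in S)
    return lhs, rhs

print(check_identity(a=2, n=6, p=0.6))   # the two numbers agree
```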