
Durrett - Essentials of Stochastic Processes


Durrett, Essentials of Stochastic Processes
  • Book:
    Essentials of Stochastic Processes
  • Author:
    Richard Durrett
  • Publisher:
    Springer International Publishing
  • Year:
    2016;2018
  • City:
    Cham

Essentials of Stochastic Processes: description

Building upon the previous editions, this textbook is a first course in stochastic processes taken by undergraduate and graduate students (MS and PhD students from math, statistics, economics, computer science, engineering, and finance departments) who have had a course in probability theory. It covers Markov chains in discrete and continuous time, Poisson processes, renewal processes, martingales, and option pricing. One can only learn a subject by seeing it in action, so there are a large number of examples and more than 300 carefully chosen exercises to deepen the reader's understanding.
Drawing from teaching experience and student feedback, there are many new examples and problems with solutions that use the TI-83 to eliminate the tedious details of solving linear equations by hand, and the collection of exercises is much improved, with many more biological examples. Originally included in previous editions, material too advanced for this first course in...


© Springer International Publishing Switzerland 2016
Richard Durrett, Essentials of Stochastic Processes, Springer Texts in Statistics, DOI 10.1007/978-3-319-45614-0_1

1. Markov Chains
Richard Durrett
Mathematics, Duke University, Durham, North Carolina, USA
1.1 Definitions and Examples
The importance of Markov chains comes from two facts: (i) there are a large number of physical, biological, economic, and social phenomena that can be modeled in this way, and (ii) there is a well-developed theory that allows us to do computations. We begin with a famous example, then describe the property that is the defining feature of Markov chains.
Example 1.1 (Gambler's Ruin).
Consider a gambling game in which on any turn you win $1 with probability p = 0.4 or lose $1 with probability 1 - p = 0.6. Suppose further that you adopt the rule that you quit playing if your fortune reaches $N. Of course, if your fortune reaches $0 the casino makes you stop.
Let X_n be the amount of money you have after n plays. Your fortune X_n has the Markov property. In words, this means that given the current state X_n, any other information about the past is irrelevant for predicting the next state X_{n+1}. To check this for the gambler's ruin chain, we note that if you are still playing at time n, i.e., your fortune X_n = i with 0 < i < N, then for any possible history of your wealth i_{n-1}, i_{n-2}, ..., i_1, i_0

P(X_{n+1} = i + 1 | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = 0.4

since to increase your wealth by one unit you have to win your next bet. Here we have used P(B | A) for the conditional probability of the event B given that A occurs. Recall that this is defined by
P(B | A) = P(B ∩ A) / P(A)
If you need help with this notion, see Sect. A.1 of the appendix.
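The Markov property makes the chain straightforward to simulate one step at a time. The following Python sketch (my own illustration, not from the book; the function names are invented) runs the gambler's ruin chain until it is absorbed at 0 or N, and uses repeated plays to estimate the chance of leaving as a winner:

```python
import random

def play_until_absorbed(i, N, p=0.4, rng=random):
    """Run the gambler's ruin chain from fortune i until it hits 0 or N.
    At each step the fortune goes up 1 with probability p, down 1 otherwise."""
    x = i
    while 0 < x < N:
        x += 1 if rng.random() < p else -1
    return x

def win_probability(i, N, p=0.4, trials=100_000, seed=1):
    """Monte Carlo estimate of P(reach N before 0, starting from i)."""
    rng = random.Random(seed)
    wins = sum(play_until_absorbed(i, N, p, rng) == N for _ in range(trials))
    return wins / trials
```

With i = 3, N = 5, and p = 0.4 the estimate comes out near 0.36, matching the exact exit probability ((0.6/0.4)^3 - 1)/((0.6/0.4)^5 - 1) ≈ 0.360 from the standard gambler's ruin calculation.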
Turning now to the formal definition, we say that X_n is a discrete time Markov chain with transition matrix p(i, j) if for any j, i, i_{n-1}, ..., i_0,

P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = p(i, j)        (1.1)
Here and in what follows, boldface indicates a word or phrase that is being defined or explained.
In formulating (1.1) we have restricted our attention to the temporally homogeneous case in which the transition probability

p(i, j) = P(X_{n+1} = j | X_n = i)

does not depend on the time n.
Intuitively, the transition probability gives the rules of the game. It is the basic information needed to describe a Markov chain. In the case of the gambler's ruin chain, the transition probability has

p(i, i + 1) = 0.4,   p(i, i - 1) = 0.6,   if 0 < i < N
p(0, 0) = 1,   p(N, N) = 1
When N =5 the matrix is
or the chain be represented pictorially as Example 12 Ehrenfest Chain - photo 6
or the chain be represented pictorially as
Example 12 Ehrenfest Chain This chain originated in physics as a model for - photo 7
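For readers who want to experiment, here is a small Python helper (my own, not the book's) that builds the gambler's ruin matrix for any N and checks that each row sums to 1, as a transition matrix must:

```python
def ruin_matrix(N, p=0.4):
    """Transition matrix of the gambler's ruin chain on states 0, 1, ..., N."""
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    P[0][0] = P[N][N] = 1.0          # absorbing barriers at 0 and N
    for i in range(1, N):
        P[i][i + 1] = p              # win the next bet
        P[i][i - 1] = 1 - p          # lose the next bet
    return P

P = ruin_matrix(5)
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)  # every row sums to 1
```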
Example 1.2 (Ehrenfest Chain).
This chain originated in physics as a model for two cubical volumes of air connected by a small hole. In the mathematical version, we have two urns, i.e., two of the exalted trash cans of probability theory, in which there are a total of N balls. We pick one of the N balls at random and move it to the other urn.
Let X_n be the number of balls in the left urn after the nth draw. It should be clear that X_n has the Markov property; i.e., if we want to guess the state at time n + 1, then the current number of balls in the left urn X_n is the only relevant information from the observed sequence of states X_n, X_{n-1}, ..., X_1, X_0. To check this we note that

P(X_{n+1} = i + 1 | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = (N - i)/N

since to increase the number we have to pick one of the N - i balls in the other urn. The number can also decrease by 1 with probability i/N. In symbols, we have computed that the transition probability is given by

p(i, i + 1) = (N - i)/N,   p(i, i - 1) = i/N,   for 0 ≤ i ≤ N

with p(i, j) = 0 otherwise. When N = 4, for example, the matrix is
        0     1     2     3     4
  0     0     1     0     0     0
  1   1/4     0   3/4     0     0
  2     0   2/4     0   2/4     0
  3     0     0   3/4     0   1/4
  4     0     0     0     1     0
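For a quick check of these formulas, here is a sketch (my own helper, not the book's, using exact fractions) that builds the Ehrenfest matrix for any N:

```python
from fractions import Fraction

def ehrenfest_matrix(N):
    """Transition matrix of the Ehrenfest chain on states 0..N:
    p(i, i+1) = (N - i)/N, p(i, i-1) = i/N, and p(i, j) = 0 otherwise."""
    P = [[Fraction(0)] * (N + 1) for _ in range(N + 1)]
    for i in range(N + 1):
        if i < N:
            P[i][i + 1] = Fraction(N - i, N)  # a ball from the other urn moves left
        if i > 0:
            P[i][i - 1] = Fraction(i, N)      # a ball from the left urn moves out
    return P

P = ehrenfest_matrix(4)   # reproduces the N = 4 case from the text
```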
In the first two examples we began with a verbal description and then wrote down the transition probabilities. However, one more commonly describes a Markov chain by writing down a transition probability p ( i , j ) with
  (i) p(i, j) ≥ 0, since they are probabilities.
  (ii) ∑_j p(i, j) = 1, since when X_n = i, X_{n+1} will be in some state j.
The equation in (ii) is read "sum p(i, j) over all possible values of j." In words the last two conditions say: the entries of the matrix are nonnegative and each ROW of the matrix sums to 1.
Any matrix with properties (i) and (ii) gives rise to a Markov chain, X n . To construct the chain we can think of playing a board game. When we are in state i , we roll a die (or generate a random number on a computer) to pick the next state, going to j with probability p ( i , j ).
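The board-game description translates directly into code. Here is a minimal Python sketch (my own illustration; `random.choices` plays the role of the die) that steps through a chain given any transition matrix:

```python
import random

def simulate(P, start, steps, seed=0):
    """Simulate a Markov chain with transition matrix P (a list of rows).
    From state i, the next state j is drawn with probability P[i][j]."""
    rng = random.Random(seed)
    states = list(range(len(P)))
    path = [start]
    i = start
    for _ in range(steps):
        i = rng.choices(states, weights=P[i])[0]
        path.append(i)
    return path

# Example: the gambler's ruin chain with N = 5 from Example 1.1.
P = [[1.0, 0,   0,   0,   0,   0],
     [0.6, 0,   0.4, 0,   0,   0],
     [0,   0.6, 0,   0.4, 0,   0],
     [0,   0,   0.6, 0,   0.4, 0],
     [0,   0,   0,   0.6, 0,   0.4],
     [0,   0,   0,   0,   0,   1.0]]
path = simulate(P, start=3, steps=20)
```

Once the path reaches 0 or 5 it stays there, since those rows put all their probability on the current state.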
Example 1.3 (Weather Chain).
Let X n be the weather on day n in Ithaca, NY, which we assume is either: 1 = rainy , or 2 = sunny . Even though the weather is not exactly a Markov chain, we can propose a Markov chain model for the weather by writing down a transition probability
        1     2
  1   0.6   0.4
  2   0.2   0.8
The table says, for example, the probability a rainy day (state 1) is followed by a sunny day (state 2) is p(1, 2) = 0.4. A typical question of interest is
Q.
What is the long-run fraction of days that are sunny?
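One way to probe such questions numerically (a sketch of my own, ahead of the theory developed later) is to raise the transition matrix to a high power: row i of P^n gives P(X_n = j | X_0 = i), and if all rows converge to a common vector, its entries are the long-run fractions. The rainy row (0.6, 0.4) comes from the text; the sunny row (0.2, 0.8) is assumed here for illustration:

```python
import numpy as np

P = np.array([[0.6, 0.4],   # rainy today: rainy / sunny tomorrow
              [0.2, 0.8]])  # sunny today (assumed values)

# Row i of P^n holds P(X_n = j | X_0 = i). For large n the two rows
# agree, so the chain forgets where it started.
Pn = np.linalg.matrix_power(P, 50)
```

For this particular matrix both rows come out close to (1/3, 2/3), i.e., about two thirds of days are sunny in the long run.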
Example 1.4 (Social Mobility).
Let X_n be a family's social class in the nth generation, which we assume is either 1 = lower, 2 = middle, or 3 = upper. In our simple version of sociology, changes of status are a Markov chain with the following transition probability

        1     2     3
  1   0.7   0.2   0.1
  2   0.3   0.5   0.2
  3   0.2   0.4   0.4
Q.
Do the fractions of people in the three classes approach a limit?
Example 1.5 (Brand Preference).
Suppose there are three types of laundry detergent, 1, 2, and 3, and let X n be the brand chosen on the n th purchase. Customers who try these brands are satisfied and choose the same thing again with probabilities 0.8, 0.6, and 0.4, respectively. When they change they pick one of the other two brands at random. The transition probability is
        1     2     3
  1   0.8   0.1   0.1
  2   0.2   0.6   0.2
  3   0.3   0.3   0.4
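The verbal recipe, stay with a given probability and otherwise pick one of the other brands uniformly at random, determines the matrix completely. A quick Python sketch (my own helper, not the book's) builds it from the satisfaction probabilities:

```python
def brand_matrix(stay):
    """Transition matrix for brand choice: from brand i, keep brand i with
    probability stay[i]; otherwise switch to each other brand with equal
    probability (1 - stay[i]) / (k - 1)."""
    k = len(stay)
    return [[stay[i] if i == j else (1 - stay[i]) / (k - 1)
             for j in range(k)] for i in range(k)]

P = brand_matrix([0.8, 0.6, 0.4])  # the three detergents of Example 1.5
```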
