Markov Models
An Introduction to Markov Models
By Steven Taylor
Copyright 2017 by Steven Taylor - All rights reserved.
This document aims to provide exact and reliable information regarding the topic and issue covered. The publication is sold with the understanding that the publisher is not required to render accounting, legal, or other qualified professional services. If legal or other expert advice is necessary, a practiced professional should be consulted.
From a Declaration of Principles which was accepted and approved equally by a Committee of the American Bar Association and a Committee of Publishers and Associations.
It is not legal to reproduce, duplicate, or transmit any part of this document by either electronic means or in printed format. Recording of this publication is strictly prohibited, and storage of this document is not allowed without written permission from the publisher. All rights reserved.
The information provided herein is stated to be truthful and consistent: any liability, whether from inattention or otherwise, arising from the use or misuse of any policies, processes, or directions contained within is the sole responsibility of the recipient reader. Under no circumstances will the publisher be held legally responsible for any reparation, damages, or monetary loss due to the information herein, whether directly or indirectly.
Respective authors own all copyrights not held by the publisher.
The information herein is offered for informational purposes only and is universal as such. It is presented without contract or any type of guarantee or assurance.
Any trademarks are used without consent, and their publication is without the permission or backing of the trademark owners. All trademarks and brands within this book are used for clarification only; they are the property of their respective owners, who are not affiliated with this document.
A Markov model is a stochastic model used to describe systems that change randomly over time, under the assumption that future states depend only on the current state, and not on the events that preceded it. Markov models are used in a wide variety of situations, though most fall into four categories; the categorization depends mainly on whether each sequential state can or cannot be observed, and whether the model is to be adjusted on the basis of the observations made.
The most basic Markov model is the Markov chain. It models the state of a system with a random variable that changes over time, such that the distribution of this variable depends only on the distribution of the previous state.
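To make this concrete, the following sketch simulates a small Markov chain in Python. The two weather states and their transition probabilities are invented purely for illustration; the point is that the next state is drawn from a distribution that depends only on the current state.

```python
# A minimal sketch of a Markov chain: the next state depends only on
# the current state. States and probabilities are hypothetical.
import random

# transition[s] gives P(next state | current state s)
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(current):
    """Draw the next state using only the current state's distribution."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
```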
Markov models are valuable when a decision problem involves a risk that is continuous over time and when the timing of events matters. In addition, Markov models are very valuable when important events may take place more than once. Representing such settings with traditional decision trees is complex and may call for impracticable simplifying assumptions. In medical settings, for instance, Markov models assume that a patient is always in one of a finite number of distinct states of health, called Markov states. All events are then modelled as transitions from one health state to another.
Techniques for evaluating a Markov model include matrix algebra, cohort simulation, and Monte Carlo simulation. A more recently developed representation, the Markov-cycle tree, uses a tree depiction of clinical events and may be evaluated either as a cohort simulation or as a Monte Carlo simulation. The capacity to represent repetitive or recurring events is a powerful advantage of the Markov model. Moreover, allowing both transition probabilities and utilities to vary with time permits clinical settings to be represented more accurately.
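As an illustration of the Monte Carlo approach, the sketch below simulates many individual trajectories through a hypothetical three-state model and averages their survival. The states and probabilities are assumptions made up for this example, not taken from any clinical source.

```python
# A sketch of Monte Carlo evaluation: instead of tracking a whole
# cohort at once, many individual trajectories are simulated one at a
# time and their outcomes averaged. All numbers are illustrative.
import random

# P(next state | current state); "dead" is absorbing
transition = {
    "well": [("well", 0.90), ("sick", 0.07), ("dead", 0.03)],
    "sick": [("sick", 0.85), ("dead", 0.15)],
}

def simulate_one(max_cycles=200):
    """Count the cycles lived by one simulated patient starting 'well'."""
    state, cycles = "well", 0
    for _ in range(max_cycles):
        if state == "dead":
            break
        cycles += 1
        r, cum = random.random(), 0.0
        for nxt, p in transition[state]:
            cum += p
            if r < cum:
                state = nxt
                break
    return cycles

trials = 10_000
life_expectancy = sum(simulate_one() for _ in range(trials)) / trials
print(f"estimated life expectancy: {life_expectancy:.1f} cycles")
```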
There are various ways in which a decision analyst can assign values to the terminal nodes of a decision tree. In some cases the outcome measure is a crude life expectancy; in others it is a quality-adjusted life expectancy. One method for approximating life expectancy is the declining exponential approximation of life expectancy (DEALE), which works out a patient's specific mortality rate for a particular combination of patient characteristics and comorbid diseases. Life expectancies can also be obtained from standard life tables or Gompertz models of survival. Besides these sources, the Markov model, introduced to medical decision making by Beck and Pauker in 1983, can also serve to estimate an individual's life expectancy. It applies Markov chains and processes to depict prognosis in healthcare and medical applications. Since its introduction in the 1980s, many Markov models have surfaced and have been used with increasing regularity in published decision analyses. The advent of the computer has bolstered the effective application of these models by making their development, construction, and evaluation easier and more feasible. These reasons justify a re-examination of the Markov model. This book serves both as a review of the theory behind the Markov model of prognosis and as a practical guide to the construction of Markov models using microcomputer decision-analytic software.
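As a rough sketch of the DEALE's arithmetic: the method assumes survival declines exponentially at a constant mortality rate, so life expectancy is simply the reciprocal of the total rate, which is the sum of a baseline (age-, sex-, and race-specific) rate and the excess rates contributed by each comorbid disease. The rates in the example below are invented for illustration.

```python
# A minimal sketch of the DEALE: with exponentially declining survival,
# life expectancy is the reciprocal of the total mortality rate.
# The rates used here are hypothetical.

def deale_life_expectancy(baseline_rate, *excess_rates):
    """Life expectancy = 1 / (baseline rate + disease-specific excess rates)."""
    total_rate = baseline_rate + sum(excess_rates)
    return 1.0 / total_rate

# e.g. a baseline mortality rate of 0.02/yr plus two comorbid
# conditions adding 0.05/yr and 0.03/yr of excess mortality:
print(deale_life_expectancy(0.02, 0.05, 0.03))  # -> 10.0 years
```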
Markov models are particularly useful when a decision problem involves a risk that is ongoing over time. Some clinical examples are the risk of hemorrhage while on anticoagulant therapy, the risk of rupture of an abdominal aortic aneurysm, and the risk of mortality in any person, whether sick or healthy. There are two important consequences of events that have ongoing risk. First, the times at which the events will occur are uncertain. This has important implications because the utility of an outcome often depends on when it occurs. For example, a stroke that occurs immediately may have a different impact on the patient than one that occurs ten years later. For economic analyses, both costs and utilities are discounted, so that later events have less impact than earlier ones. The second consequence is that a given event may occur more than once. As the next example illustrates, representing events that may recur or whose timing is uncertain is not easy with a conventional tree model.
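A brief sketch of how discounting works in practice: a cost or utility accruing t years in the future is divided by (1 + r)^t, so later events weigh less than earlier ones. The 3 percent rate used below is a common convention, not a requirement of the method.

```python
# Discounting: a cost or utility accruing 'years' from now is divided
# by (1 + rate) ** years. The 3% rate is an illustrative convention.

def discounted(value, years, rate=0.03):
    """Present value of a cost or utility accruing 'years' from now."""
    return value / (1.0 + rate) ** years

print(discounted(1.0, 0))   # 1.0   -- a stroke now counts in full
print(discounted(1.0, 10))  # ~0.74 -- the same event ten years later
```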
The Markov model provides a far more convenient way of modelling prognosis for clinical problems with ongoing risk. The model assumes that the patient is always in one of a finite number of states of health, referred to as Markov states. All events of interest are modelled as transitions from one state to another. Each state is assigned a utility, and the contribution of this utility to the overall prognosis depends on the length of time spent in the state. In our example of a patient with a prosthetic heart valve, these states are WELL, DISABLED, and DEAD. For simplicity in this example, we assume that either a bleed or a non-fatal embolus results in the same state (DISABLED) and that the disability is permanent. The time horizon of the analysis is divided into equal increments of time, called Markov cycles. During each cycle, the patient may make a transition from one state to another. Later figures will show a commonly used representation of Markov processes, called a state-transition diagram, in which each state is represented by a circle. Arrows connecting two different states indicate allowed transitions, and arrows leading from a state to itself indicate that the patient may remain in that state in consecutive cycles. Only certain transitions are allowed. For instance, a patient in the WELL state may make a transition to the DISABLED state, but a transition from DISABLED back to WELL is not allowed. A patient in either the WELL or the DISABLED state may die, making a transition to the DEAD state. A patient in the DEAD state, however, obviously cannot make a transition to any other state; therefore, a single arrow emanates from the DEAD state, leading back to itself. It is assumed that a patient in a given state can make only a single state transition during a cycle.
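The following sketch evaluates this three-state model as a cohort simulation. The state structure follows the text (no transition from DISABLED back to WELL, and DEAD is absorbing), but the transition probabilities and the per-cycle utilities are invented for illustration.

```python
# Cohort simulation of the WELL / DISABLED / DEAD model described
# above. The structure follows the text; the numbers are illustrative.

STATES = ["WELL", "DISABLED", "DEAD"]

# TRANSITION[i][j] = P(state j next cycle | state i this cycle)
TRANSITION = [
    [0.90, 0.07, 0.03],  # WELL -> WELL / DISABLED / DEAD
    [0.00, 0.85, 0.15],  # DISABLED cannot return to WELL
    [0.00, 0.00, 1.00],  # DEAD is absorbing
]

UTILITY = {"WELL": 1.0, "DISABLED": 0.7, "DEAD": 0.0}  # per cycle

cohort = [1.0, 0.0, 0.0]      # the whole cohort starts WELL
quality_adjusted_cycles = 0.0

for _ in range(600):
    # credit each state's utility for the fraction of the cohort in it
    quality_adjusted_cycles += sum(
        cohort[i] * UTILITY[s] for i, s in enumerate(STATES)
    )
    # advance one cycle: new distribution = old distribution x TRANSITION
    cohort = [
        sum(cohort[i] * TRANSITION[i][j] for i in range(3))
        for j in range(3)
    ]

print(f"quality-adjusted life expectancy: {quality_adjusted_cycles:.1f} cycles")
```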