Operational Research
Prabhat Mittal
[email protected]
Markov Processes
Introduction
A Markov process (chain) is a stochastic (probabilistic) process with the property that the probability of a transition from a given state to any future state depends only on the present state and not on the manner in which that state was reached. Markov processes have become a versatile tool for solving management problems, especially in the area of marketing.
Management Applications
• Markov analysis is widely used in examining and predicting consumer behaviour in terms of brand loyalty and switching patterns to other brands.
• Markov processes have also been employed in the study of equipment maintenance and failure problems, and in analysing accounts receivable that will ultimately become bad debts.
• They are also useful in the study of stock market price movements.
Basic concepts of Markov Process
• A Markov process is a sequence of n experiments in which each experiment has m possible outcomes a1, a2, …, am. Each individual outcome is called a state, and the probability that a particular outcome occurs depends only on the outcome of the preceding experiment.
• The probability of moving from one state to another, or remaining in the same state, in a single time period is called a transition probability. Because the probability of moving to a state depends on what happened in the preceding state, a transition probability is a conditional probability.
Transition Matrix
• The transition probabilities between the states a1, a2, …, am can be arranged in matrix form; such a matrix is called a one-step transition probability matrix, denoted by P:

             a1    a2   ...  a m
      a1  [  p11   p12  ...  p1m ]
  P = a2  [  p21   p22  ...  p2m ]
      .   [  .     .    ...  .   ]
      am  [  pm1   pm2  ...  pmm ]

• The matrix P is a square matrix whose elements are non-negative probabilities and the sum of the elements of each row is unity:

  Σ(j=1 to m) pij = 1,  for i = 1, 2, ..., m;   0 ≤ pij ≤ 1
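The two conditions above (non-negative entries, each row summing to one) can be checked mechanically. A minimal Python sketch, not part of the original slides, with illustrative matrix values:

```python
def is_stochastic(P, tol=1e-9):
    """Return True if P is a valid one-step transition matrix:
    square, entries in [0, 1], and each row summing to unity."""
    m = len(P)
    for row in P:
        if len(row) != m:                      # must be square
            return False
        if any(p < 0 or p > 1 for p in row):   # 0 <= pij <= 1
            return False
        if abs(sum(row) - 1.0) > tol:          # row sum must be 1
            return False
    return True

# Illustrative 2-state matrix
P = [[0.8, 0.2],
     [0.6, 0.4]]
print(is_stochastic(P))                          # True
print(is_stochastic([[0.5, 0.6], [0.3, 0.7]]))   # False: first row sums to 1.1
```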
Transition Matrix ctd.
• In general, any matrix whose elements are non-negative and whose elements in each row (or in each column) sum to one is called a transition matrix, a stochastic matrix or a probability matrix.
• Thus a transition matrix is a square stochastic matrix (the number of rows equals the number of columns), and it gives a complete description of the Markov process.
Transition Diagram
• The transition probabilities can also be represented by two types of diagram. Consider the matrix

      0.7  0.3  0
  T = 0.3  0.4  0.3
      0    0.3  0.7

The transition diagram shows the transition probabilities, or shifts, that can occur in any particular situation.

[Transition diagram: states x1, x2, x3, with arcs labelled by the probabilities in T above.]

The arrows from each state indicate the possible states to which the process can move from that state. A zero element in the matrix indicates that the corresponding transition is impossible.
Probability tree diagram
• For the same matrix T, the probability tree diagram traces each starting state to its possible next states:

  x1 → x1 (0.7), x2 (0.3)
  x2 → x1 (0.3), x2 (0.4), x3 (0.3)
  x3 → x2 (0.3), x3 (0.7)
Example-I
Two manufacturers A and B are competing with each other in a restricted market. Over the years, A's customers have exhibited a high degree of loyalty, as measured by the fact that they keep using A's product 80 percent of the time. Former customers who purchased from B have switched back to A 60 percent of the time.
(a) Construct and interpret the state transition matrix in terms of retention and loss, and retention and gain.
(b) Calculate the probability of a customer purchasing A's product at the end of the second period.
Example 1 Solution
Transition matrix for the next purchase (n = 1):

                          A      B
  Present       A       0.80   0.20
  purchase      B       0.60   0.40
  (n = 0)

Rows give retention and loss; columns give retention and gain.
P(A1|A0) = prob. a customer using A at present will use A in future
P(B1|A0) = prob. a customer using A at present will use B in future

[Transition diagram: A retains 0.8 and loses 0.2 to B; B retains 0.4 and loses 0.6 to A.]
[Probability tree: A → A (0.8), B (0.2); B → A (0.6), B (0.4).]
Example 1 Solution
Second-stage transition matrix for the next purchase (n = 2), P² = P·P:

                          A      B
  Present       A       0.76   0.24
  purchase      B       0.72   0.28
  (n = 0)

[Transition diagram and probability tree for the two-step probabilities 0.76, 0.24, 0.72 and 0.28.]
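The second-stage probabilities can be verified numerically. A small Python sketch (Python and the helper name `mat_mult` are additions, not from the slides):

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# One-step matrix from Example-I (rows and columns ordered A, B)
P = [[0.8, 0.2],
     [0.6, 0.4]]
P2 = mat_mult(P, P)   # two-step (second purchase) probabilities
print([[round(p, 2) for p in row] for row in P2])  # [[0.76, 0.24], [0.72, 0.28]]
```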
Steady State (Equilibrium) Conditions
• Given that the process being modelled as a Markov process has certain properties, it is possible to analyse its long-run behaviour and determine the probabilities of outcomes after steady-state conditions have been reached. For example, after the Markov process has been in operation for a long time period, a given outcome will tend to occur a fixed percentage of the time.
• However, for a Markov chain to reach steady-state conditions, the chain must be
  • ergodic
  • regular
Steady State (Equilibrium) Conditions
• An ergodic Markov chain has the property that it is possible to go from any state to any other state in a finite number of steps, regardless of the present state.
• A regular chain is a special type of ergodic chain: one whose transition matrix P has, for some power of P, only non-zero positive probabilities. Thus all regular chains are ergodic, but not all ergodic chains are regular.
• The easiest way to check whether an ergodic chain is regular is to continue squaring the transition matrix P until all zeros are removed.
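The squaring test described above can be sketched in Python (the function name `is_regular` and the sample matrices are illustrative, not from the slides):

```python
def is_regular(P, max_squarings=10):
    """Square P repeatedly; if some power of P has no zero entries,
    the chain is regular. Gives up after max_squarings squarings."""
    n = len(P)
    Q = [row[:] for row in P]
    for _ in range(max_squarings):
        if all(Q[i][j] > 0 for i in range(n) for j in range(n)):
            return True
        # Q <- Q.Q, i.e. successive powers P^2, P^4, P^8, ...
        Q = [[sum(Q[i][k] * Q[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return False

print(is_regular([[0.0, 1.0], [0.5, 0.5]]))  # True: P^2 has no zeros
print(is_regular([[0.0, 1.0], [1.0, 0.0]]))  # False: a periodic chain
```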
Example-II
Determine whether the following transition matrix is an ergodic Markov chain.

                      Future state
                   1     2     3     4
  Present    1    1/3   1/3    0    1/3
  state      2     0    1/2   1/4   1/4
             3    1/4    0    1/2   1/4
             4     0     0    1/3   2/3
Example-II (Solution)
Check whether it is possible to go from each state to all other states and back. From state 1, it is possible to go directly to every other state except state 3; state 3 can be reached via state 2 (1 → 2 → 3). Therefore it is possible to go from state 1 to any other state. Similarly, from state 2 it is possible to go directly to states 3 and 4 but not to state 1; state 1 can be reached via state 3 (2 → 3 → 1). From state 3, state 1 can be reached directly. Finally, from state 4 it is possible to go to state 3, and then from state 3 to state 1. Hence the above transition matrix is an ergodic Markov chain, since it is possible to go from state 1 to all other states, and from all other states back to state 1.
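The reachability argument above can be automated by treating positive-probability transitions as arcs of a directed graph and checking that every state reaches every other. A Python sketch (the function names are illustrative):

```python
from collections import deque

def reachable(P, start):
    """Breadth-first search: states reachable from `start`
    by following positive-probability transitions."""
    n, seen, queue = len(P), {start}, deque([start])
    while queue:
        i = queue.popleft()
        for j in range(n):
            if P[i][j] > 0 and j not in seen:
                seen.add(j)
                queue.append(j)
    return seen

def is_ergodic(P):
    """Ergodic: every state can reach every other state."""
    n = len(P)
    return all(reachable(P, i) == set(range(n)) for i in range(n))

# Example-II matrix (fractions written as floats)
P = [[1/3, 1/3, 0, 1/3],
     [0, 1/2, 1/4, 1/4],
     [1/4, 0, 1/2, 1/4],
     [0, 0, 1/3, 2/3]]
print(is_ergodic(P))   # True, agreeing with the argument above
```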
Example-III
Determine whether the following transition matrix is an ergodic and a regular Markov chain, where × denotes some positive probability.

                      Future state
                   1     2     3     4
  Present    1     0     ×     ×     0
  state      2     ×     0     0     ×
             3     ×     0     0     ×
             4     0     ×     ×     0
Example-III (Solution)

      0  ×  ×  0
  P = ×  0  0  ×
      ×  0  0  ×
      0  ×  ×  0

Squaring P gives the complementary pattern of zeros:

       ×  0  0  ×
  P² = 0  ×  ×  0
       0  ×  ×  0
       ×  0  0  ×

Note that P raised to an even-numbered power (P², P⁴, P⁸, …) gives the pattern of P² above, while P raised to an odd-numbered power gives the pattern of the original matrix. Since no power of P has all elements non-zero and positive, the given matrix is not regular.
It is, however, ergodic, since it is possible to go from state 1 to state 2 or state 3; from state 2 to state 1 or state 4; from state 3 to state 1 or state 4; and from state 4 to state 2 or state 3. Thus every state can be reached from every other state.
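The alternating zero pattern of the powers can be checked numerically by substituting any positive value for ×. A Python sketch (the value x = 0.5 and the helper names are illustrative):

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def zero_pattern(M):
    """1 where an entry is positive, 0 where it is zero."""
    return [[1 if p > 0 else 0 for p in row] for row in M]

# Example-III matrix with x = 1/2 (any positive value gives the same pattern)
x = 0.5
P = [[0, x, x, 0],
     [x, 0, 0, x],
     [x, 0, 0, x],
     [0, x, x, 0]]
P2 = mat_mult(P, P)
P3 = mat_mult(P2, P)
print(zero_pattern(P2))  # the zeros move but never disappear
print(zero_pattern(P3))  # an odd power: same pattern as P itself
```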
Example-IV
The number of units of an item that are withdrawn from inventory on a day-to-day basis is a Markov chain process in which requirements for tomorrow depend on today's requirements.

                  Tomorrow
             5     10    12
  Today  5  0.6   0.4   0.0
        10  0.3   0.3   0.4
        12  0.1   0.3   0.6

a. Develop a two-day transition matrix.
b. Comment on how a two-day transition matrix might be helpful to a manager who is responsible for inventory management.
Example-IV (Solution)
Two-day transition matrix, P⁽²⁾ = P·P:

             5      10     12
        5  0.48   0.36   0.16
       10  0.31   0.33   0.36
       12  0.21   0.31   0.48

Imagine that each morning a manager must place an order for inventory replenishment, and that because of delivery lead time an order placed today arrives two days later. The two-day transition matrix can then be used to guide ordering decisions. For example, if today the manager experiences a demand for 5 units, then two days later (when the replenishment stock arrives in response to today's order) the probability of needing five units is 0.48, ten units is 0.36 and twelve units is 0.16.
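The two-day matrix can be reproduced with a few lines of Python (the helper name `two_step` is an illustrative addition):

```python
def two_step(P):
    """Two-day transition matrix P.P, rounded to 2 d.p. for display."""
    n = len(P)
    P2 = [[sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
          for i in range(n)]
    return [[round(p, 2) for p in row] for row in P2]

# Rows and columns correspond to demands of 5, 10 and 12 units
P = [[0.6, 0.4, 0.0],
     [0.3, 0.3, 0.4],
     [0.1, 0.3, 0.6]]
print(two_step(P))  # [[0.48, 0.36, 0.16], [0.31, 0.33, 0.36], [0.21, 0.31, 0.48]]
```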
Example-V
A manufacturing company has a certain piece of equipment that is inspected at the end of each day and classified as just overhauled, good, fair or inoperative. If the item is inoperative it is overhauled, a procedure that takes one day. Denote the four classifications as states 1, 2, 3 and 4 respectively, and assume that the working condition of the equipment follows a Markov process with the following transition matrix:

                  Tomorrow
              1    2    3    4
  Today  1    0    ¾    ¼    0
         2    0    ½    ½    0
         3    0    0    ½    ½
         4    1    0    0    0

It costs Rs.125 on average to overhaul a machine, and Rs.75 in production is lost if a machine is found inoperative. Using the steady-state probabilities, compute the expected per-day cost of maintenance.
Example-V (Solution)
The given matrix P represents an ergodic, regular Markov process, so it will reach steady-state equilibrium. Let p1, p2, p3 and p4 be the steady-state probabilities, representing the proportion of time that the machine will be in states 1, 2, 3 and 4 respectively. Using the steady-state equation R = RP:

                                         0  ¾  ¼  0
  (p1, p2, p3, p4) = (p1, p2, p3, p4) ·  0  ½  ½  0
                                         0  0  ½  ½
                                         1  0  0  0
Example-V (Solution)
Finding the steady-state probabilities requires solving the simultaneous equations
  p1 = p4
  p2 = ¾ p1 + ½ p2
  p3 = ¼ p1 + ½ p2 + ½ p3
  p4 = ½ p3
together with p1 + p2 + p3 + p4 = 1.
Solving these equations gives the steady-state probabilities p1 = 2/11, p2 = 3/11, p3 = 4/11 and p4 = 2/11. Thus, on average, the machine will be overhauled two out of every 11 days, will be in good condition three out of every 11 days, will be in fair condition four out of every 11 days, and will be found inoperative at the end of the day two out of every 11 days. Hence the average cost per day of maintenance is
  = (2/11)(125) + (2/11)(75) = Rs.36.36
Excel Worksheet
[Embedded worksheet not reproduced.]
Example-VI
There are three dairies A, B and C in a small town which supply all the milk consumed in the town. Assume that the initial consumer sample is composed of 1000 respondents distributed over the three dairies. It is known by all the dairies that consumers switch from one dairy to another because of advertising, price and dissatisfaction. All the dairies maintain records of the number of their customers and of the dairy from which each new customer was gained. The following table illustrates the flow of customers over an observation period of one month. Assume that the matrix of transition probabilities remains fairly stable and that at the beginning of period one the market shares are A = 25%, B = 45% and C = 30%. Construct the state transition probability matrix to analyse the problem.

          Period one          Changes during the period     Period two
  Dairy   No. of customers    Gain         Loss             No. of customers
  A          250               62           50                 262
  B          450               53           60                 443
  C          300               50           55                 295
Example-VI (Solution)
To determine the retention probabilities (the probability of remaining in the same state), the customers retained over the period under review are divided by the number of customers at the beginning of the period.

          Period one          Number   Number     Probability
  Dairy   No. of customers    lost     retained   of retention
  A          250               50       200       200/250 = 0.80
  B          450               60       390       390/450 = 0.86
  C          300               55       245       245/300 = 0.81
Example-VI (Solution) ctd.
To determine the gain and loss probabilities, it is necessary to show the gains and losses among the dairies in order to complete the matrix of transition probabilities.

          Period one          Gain from        Loss to        Period two
  Dairy   No. of customers   A    B    C     A    B    C      No. of customers
  A          250             0   35   27     0   25   25         262
  B          450            25    0   28    35    0   25         443
  C          300            25   25    0    27   28    0         295
Example-VI (Solution) ctd.
• The first row shows that dairy A retains 80% of its customers, loses 10% (25 customers) to dairy B and loses 10% (25) to dairy C.
• The second row shows that dairy B loses 8% of its customers to dairy A, retains 86.6% of its own customers and loses 5.5% to dairy C.
• The third row shows that dairy C loses 9% of its customers to dairy A, loses 9.3% to dairy B and retains 81% of its own customers.

Matrix of transition probabilities (rows: retention and loss; columns: retention and gain):

                     A                 B                 C
  A       200/250 = 0.80     25/250 = 0.10     25/250 = 0.10
  B        35/450 = 0.08    390/450 = 0.86     25/450 = 0.055
  C        27/300 = 0.09     28/300 = 0.093   245/300 = 0.81
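The matrix above can be built directly from the customer counts, and the stated initial market shares (A = 25%, B = 45%, C = 30%) can then be projected one period ahead. A Python sketch (the one-period projection is an added illustration; the counts are those of the tables above):

```python
customers = {'A': 250, 'B': 450, 'C': 300}
# losses[i][j]: customers dairy i lost to dairy j during the month
losses = {'A': {'B': 25, 'C': 25},
          'B': {'A': 35, 'C': 25},
          'C': {'A': 27, 'B': 28}}

dairies = ['A', 'B', 'C']
P = []
for i in dairies:
    lost = sum(losses[i].values())
    # diagonal: retained / initial; off-diagonal: lost to j / initial
    row = [(customers[i] - lost) / customers[i] if i == j
           else losses[i].get(j, 0) / customers[i] for j in dairies]
    P.append(row)

shares = [0.25, 0.45, 0.30]   # initial market shares A, B, C
next_shares = [sum(shares[i] * P[i][j] for i in range(3)) for j in range(3)]
print([[round(p, 3) for p in row] for row in P])
print([round(s, 3) for s in next_shares])   # [0.262, 0.443, 0.295]
```

The projected shares of 26.2%, 44.3% and 29.5% match the period-two customer counts (262, 443, 295 out of 1000) in the table above.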