
LECTURE – 1
FORECASTING DEMAND IN SERVICES
Learning Objective
 To discuss various methods of forecasting demand in services
7.1 Managing demand in services
 There is no option of an inventory buffer to meet variations in service demand
 Perishable nature of services: simultaneous production and consumption of services
 Fixed capacity of the service system restricts the flexibility to meet demand
 Rooms in a hotel and seats in an airplane
 Seasonality in demand for some services and spur-of-the-moment decisions of customers, that is, unpredictability of demand
 A heart attack emergency cannot be planned
 Visiting a hill station during the summer season can be planned
 Personalized services take varying service times
7.1.1 Forecasting demand for services
 Forecasting demand forms the basis for planning activities. Forecasting involves estimating a future event by systematically combining past data in some predetermined way.
 Estimates the number of units of service that could be sold
 Number of customers
 Number of hours of service supplied
 Units of service product supplied (liters of petrol, number of caller tunes, number of transactions)
 Various forecasting methods can be adopted to forecast the demand for services, as shown in Figure 7.1.
FIGURE 7.1: VARIOUS FORECASTING METHODS TO FORECAST DEMAND FOR SERVICES
(Judgmental or subjective methods: Delphi method, Cross impact analysis, Historical analogy; Time series methods: Moving averages, Weighted moving averages, Exponential smoothing; Association or causal methods: Regression and/or econometric models)
7.2 Subjective or qualitative forecasting methods are used where
 No past data is available
 If some data is available, it cannot be used for a long-run forecast
 Mostly used for new technologies or newly introduced products
 The patterns can be trend, seasonality, cycle, and regular and irregular variations, as shown in Figure 7.2.
 Trend: A gradual increase (upward movement) or decrease (downward movement) in observations over time.
 Cycle: An unpredictable long-term cycling behavior. This behavior may be due to the business cycle or the service product life cycle.
 Seasonality: A predictable short-term cycling behavior due to time of day, week, month, season or year.
 Random error: The remaining variation that cannot be explained by the other components; also called residual variation.
 Irregular variations: Variations due to irregular circumstances that do not reflect any typical behavior.
 Level: Short-term patterns that are not repetitive in nature.
FIGURE 7.2: TIME SERIES FORECASTING PATTERNS (upward trend, downward trend, cycle, seasonality, irregular variations)
7.2.1 Delphi Method
 An expert-opinion-based forecasting method proposed by Olaf Helmer
 Questions are repeatedly or iteratively put to diverse experts independently until the experts arrive at a consensus
Steps in Delphi Method
1. The administrator prepares some questions using scales such as the Likert scale, along with some open-ended questions.
2. Send the questionnaire to experts in the area. The experts are not allowed to interact with each other.
3. The experts are expected to give numerical estimates as per the proposed scale.
4. The test administrator tabulates the responses into quartiles (see the sketch after these steps). This completes round 1 of the Delphi method.
5. The administrator sends the findings from round 1, along with some updated questions based on the open-ended responses, to the experts.
6. The experts are expected to reconsider their answers and to justify their opinions.
Steps (2) to (6) are repeated till all the experts arrive at a consensus.
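As an illustration of step 4, a minimal Python sketch of tabulating numerical estimates into quartiles is given below; the expert estimates shown are hypothetical.

import statistics

# Hypothetical numerical estimates collected from eight experts in round 1
estimates = [120, 135, 150, 150, 160, 170, 180, 200]

# Quartile cut points (Q1, median, Q3) that the administrator can report back to the panel
q1, median, q3 = statistics.quantiles(estimates, n=4)
print(q1, median, q3)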
7.2.2 Cross Impact Analysis
 The main assumption in this method is that the occurrence of some future event is related to the occurrence of an earlier event.
 The earlier and future events are correlated.
 The conditional probabilities are estimated for the events and are revised over a series of iterations by the experts.
7.2.3 Historical Analogy
 To forecast the growth pattern of a new service, it is assumed that it may follow the pattern of a similar concept for which data are available.
7.3 Quantitative forecasting methods
 Short-term forecasts where the future of a data set is assumed to be a function of the past of that set
 An ordered sequence of observations taken at regular intervals of time
 The past data set presents an identifiable pattern over time
 Cannot include new factors in the future
7.3.1 Time Series Forecasting: Moving Averages
 Let’s forecast the demand for a service
 The N-period moving average for time period t is found by adding the actual observations or demand during the most recent N periods and dividing by N:
Forecast at t = (Ot-1 + Ot-2 + … + Ot-N) / N
 For each next time period's forecast, the most recent observation is added and the oldest observation is dropped.
 It helps in smoothing out short-term irregularities, also called the level.
 Each observation is weighted equally. For a 3-period moving average, all three recent observations have a weight of 1/3.
Example
A hospital wants to forecast the number of surgeries to be performed in the month of December. The observed number of surgeries for the same year from January to November is given in Table 7.1. What is the forecasted number of surgeries the hospital can expect for December?
TABLE 7.1: N-PERIOD MOVING AVERAGE FORECAST OF NUMBER OF SURGERIES IN A HOSPITAL

Month (t)   Surgeries done in hospital (Ot)   3-month MA forecast at t   4-month MA forecast at t
Jan         15                                -                          -
Feb         17                                -                          -
Mar         18                                -                          -
Apr         21                                (15+17+18)/3               -
May         28                                (17+18+21)/3               (15+17+18+21)/4
Jun         31                                (18+21+28)/3               (17+18+21+28)/4
Jul         35                                (21+28+31)/3               (18+21+28+31)/4
Aug         33                                (28+31+35)/3               (21+28+31+35)/4
Sep         23                                (31+35+33)/3               (28+31+35+33)/4
Oct         28                                (35+33+23)/3               (31+35+33+23)/4
Nov         19                                (33+23+28)/3               (35+33+23+28)/4
Dec         -                                 (23+28+19)/3               (33+23+28+19)/4

Forecast at t = (Ot-1 + Ot-2 + … + Ot-N) / N, where Ot is the actual observation at time period t.
The number of surgeries forecasted for the month of December with the 3-month moving average is
F(December) = (23 + 28 + 19) / 3 ≈ 23
The number of surgeries forecasted for the month of December with the 4-month moving average is
F(December) = (33 + 23 + 28 + 19) / 4 ≈ 26
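A minimal Python sketch of these two calculations is given below; the surgery counts are those of Table 7.1 and the function name is illustrative.

def moving_average_forecast(observations, n):
    # Forecast for the next period: mean of the most recent n observations
    return sum(observations[-n:]) / n

# Surgeries observed from January to November (Table 7.1)
surgeries = [15, 17, 18, 21, 28, 31, 35, 33, 23, 28, 19]
print(round(moving_average_forecast(surgeries, 3)))  # 3-month forecast for December: 23
print(round(moving_average_forecast(surgeries, 4)))  # 4-month forecast for December: 26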
7.4 Time Series Forecasting: Weighted Moving Average
 Used when the demand data or observations follow some trend or pattern
 Gives different weights to different observations
 Responds to changes, as recent observations are emphasized or given more importance
Month (t)   Surgeries done in hospital (Ot)   3-month weighted MA forecast at t
Jan         15                                -
Feb         17                                -
Mar         18                                -
Apr         21                                [3×18 + 2×17 + 1×15]/6
May         28                                [3×21 + 2×18 + 1×17]/6
Jun         31                                [3×28 + 2×21 + 1×18]/6
Jul         35                                [3×31 + 2×28 + 1×21]/6
Aug         33                                [3×35 + 2×31 + 1×28]/6
Sep         23                                [3×33 + 2×35 + 1×31]/6
Oct         28                                [3×23 + 2×33 + 1×35]/6
Nov         19                                [3×28 + 2×23 + 1×33]/6
Dec         -                                 [3×19 + 2×28 + 1×23]/6

Forecast at t = (wt-1·Ot-1 + wt-2·Ot-2 + … + wt-N·Ot-N) / (wt-1 + wt-2 + … + wt-N)
In the above example, the weight given to the most recent observation is wt-1 = 3, to the next most recent wt-2 = 2, and to the next-to-next most recent wt-3 = 1.
The forecast for the month of December is
F(December) = (3×19 + 2×28 + 1×23) / 6 ≈ 23
In this example, more weight is given to the most recent observation, that is, the one for the month of November.
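A similar sketch reproduces the weighted moving average forecast for December with weights 3, 2 and 1; the function name is again illustrative.

def weighted_moving_average_forecast(observations, weights):
    # weights are listed from the most recent observation to the oldest, e.g. [3, 2, 1]
    recent = observations[-len(weights):][::-1]  # most recent first
    return sum(w * o for w, o in zip(weights, recent)) / sum(weights)

# Surgeries observed from January to November (Table 7.1)
surgeries = [15, 17, 18, 21, 28, 31, 35, 33, 23, 28, 19]
print(round(weighted_moving_average_forecast(surgeries, [3, 2, 1])))  # December: 23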
7.5 Time Series Forecasting: Exponential Smoothing
 Smooths out blips in the data
 Requires only the most recent observation
 At the same time, old data or observations are never dropped or lost; in fact, older observations are given progressively less weight
Ft = Ft-1 + α(Ot-1 - Ft-1)
 where Ft is the smoothed forecast value for period t, Ot is the actual observed value for period t, and α is the smoothing constant, mostly assigned a value between 0.1 and 0.5.
 The term (Ot-1 - Ft-1) represents the forecast error (the difference between the actual observation and the forecast value that was calculated in the prior period).
 Hence, it is also called a feedback system, where the forecast error is used to correct the previous smoothed or forecast value.
Example
In January, the number of surgeries to be performed in February was predicted to be 100. The actual number of surgeries performed was 120. Using α = 0.3, the forecast for the month of March using exponential smoothing is
Forecast(March) = 100 + 0.3 (120-100)
= 100 + 0.3 (20)
= 106
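The same March forecast can be reproduced with a short Python function; the function name is illustrative.

def exponential_smoothing_forecast(prev_forecast, prev_actual, alpha):
    # Next forecast = previous forecast + alpha * (previous period's forecast error)
    return prev_forecast + alpha * (prev_actual - prev_forecast)

print(exponential_smoothing_forecast(100, 120, 0.3))  # forecast for March: 106.0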
7.6 Association or Causal Forecasting Method
 Association or causal forecasting methods help in capturing trends in data.
 Considers several independent variables that are related to the dependent variable being predicted.
 Independent variables can be many factors that relate to the dependent variable.
 Linear regression analysis is the most commonly used quantitative causal forecasting model.
 Example: The sales of spare parts for auto vehicles depend on the age of the vehicle, seasonal changes, distance covered, etc.
The forecast expression for exponential smoothing can also be written as
Ft+1 = αOt + (1 - α)Ft
If we substitute for Ft in the above expression, we get
Ft+1 = αOt + (1 - α)[αOt-1 + (1 - α)Ft-1]
and similarly we can substitute for Ft-1. That means that in the exponential smoothing forecast method, the last period's forecast captures the entire information about past demand. It can also be seen that the maximum weight is given to the last period's demand and progressively lower weights are given to the individual demand points as one goes further back in time.
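The geometric decay of these weights can be seen by expanding the recursion, as in the sketch below; the value α = 0.3 is chosen only for illustration.

alpha = 0.3  # illustrative smoothing constant
# Expanding Ft+1 = alpha*Ot + (1 - alpha)*Ft repeatedly gives a weight of
# alpha * (1 - alpha)**k on the observation k periods in the past.
weights = [alpha * (1 - alpha) ** k for k in range(5)]
print([round(w, 3) for w in weights])  # [0.3, 0.21, 0.147, 0.103, 0.072]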
Example

Time period (t)   Demand (Ot)   Forecast (Ft) with α = 0.1
1                 10            10
2                 18            10
3                 29            11
4                 15            13
5                 30            13
6                 12            15
7                 16            14
8                 8             14
9                 22            14
10                14            15
11                15            15
12                27            15
13                30            16
14                23            17
15                15            18
16                20            18
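The table can be reproduced with the sketch below; the forecasts are rounded to whole numbers as in the table, and the first forecast is initialized to the first observation (an assumption consistent with the table).

demand = [10, 18, 29, 15, 30, 12, 16, 8, 22, 14, 15, 27, 30, 23, 15, 20]
alpha = 0.1
forecast = demand[0]  # F1 is taken equal to the first observation
for t, d in enumerate(demand, start=1):
    print(t, d, round(forecast))                  # time period, demand, forecast
    forecast = forecast + alpha * (d - forecast)  # update for the next period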
Regression Model
 In a linear regression model, there can be n independent variables Xi related to the dependent variable Y, as expressed below:
Y = a0 + a1X1 + a2X2 + … + anXn
where a0, a1, a2, …, an are the coefficients determined by using the regression equations.
 The least squares method can be utilized to forecast the dependent variable Ŷ, related to a single independent variable X:
Ŷ = a0 + a1X
where a0 represents the y-axis intercept and a1 represents the slope of the regression line.
 The values of a0 and a1 are determined so that the best-fit line Ŷ = a0 + a1X represents Y within the range of the observations Yi and Xi.
Least Squares Method
We have the data on Yi and Xi. Define the error Ei as
Ei = a0 + a1Xi - Yi
 Determine a0 and a1 in such a way that the sum of the squared errors over all the observations is minimized, i.e.,
SS(a0, a1) = Σ [a0 + a1Xi - Yi]²
where the sum runs over the observations i = 1 to n.
 To minimize SS, we set its partial derivatives with respect to a0 and a1 equal to zero, which gives the following normal equations:
n·a0 + (Σ Xi)·a1 = Σ Yi … (1)
(Σ Xi)·a0 + (Σ Xi²)·a1 = Σ XiYi … (2)
 Equations (1) and (2) give two linear equations in a0 and a1, which can be solved to get
a1 = [n·Σ XiYi - (Σ Xi)(Σ Yi)] / [n·Σ Xi² - (Σ Xi)²]
which can also be written as
a1 = [Σ XiYi - n·X̄·Ȳ] / [Σ Xi² - n·(X̄)²]
where X̄ and Ȳ are the averages of the observations Xi and Yi respectively, and
a0 = Ȳ - a1·X̄
In the regression model,
a0 is the level component of the forecast
a1 is the trend component of the forecast
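A minimal Python sketch that computes a0 and a1 from these expressions is given below; the function name is illustrative.

def least_squares_fit(x, y):
    # Fit y = a0 + a1*x using the solution of the normal equations
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)
    a1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a0 = (sum_y - a1 * sum_x) / n  # equivalent to Y-bar - a1 * X-bar
    return a0, a1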
Example
A software development company wants to forecast its revenues for the next year. The manager of the company wants to conduct a causal analysis to examine whether the number of hours spent by employees per day has an impact on revenues. The manager collects data for the past six years and applies linear regression analysis in the following manner. Every year, the number of working hours was increased by one hour.
Number of hours spent per day   Revenues earned (Rs. 00000)
6                               70
7                               71.5
8                               75
9                               76.5
10                              77.9
11                              80.2
In this example, the dependent variable is revenues and the independent variable is the number of hours. We will apply the least squares method to the regression equation
Ŷ = a0 + a1X
where Ȳ is the average of revenues for the last six years and X̄ is the average number of hours per day. The forecast for the next year, with 12 hours per day, is represented by
Ŷ = a0 + a1(12)
We need to determine a0 and a1 from
a0 = Ȳ - a1·X̄
a1 = [Σ XiYi - n·X̄·Ȳ] / [Σ Xi² - n·(X̄)²]
          Yi       Xi      XiYi      Xi²
          70       6       420       36
          71.5     7       500.5     49
          75       8       600       64
          76.5     9       688.5     81
          77.9     10      779       100
          80.2     11      882.2     121
Total     451.1    51      3870.2    451
Average   75.2     8.5     645       75.2
Here n = 6, Ȳ = 451.1/6 ≈ 75.2, X̄ = 51/6 = 8.5, Σ XiYi = 3870.2 and Σ Xi² = 451.
Substituting these values, we get
a1 = [Σ XiYi - n·X̄·Ȳ] / [Σ Xi² - n·(X̄)²]
   = (3870.2 - 3834.4) / (451 - 433.5)
   = 35.8 / 17.5
   ≈ 2.05
a0 = Ȳ - a1·X̄
   = 75.2 - (2.05)(8.5)
   ≈ 57.8
The forecasted revenues for the next year, if the number of working hours is increased from 11 to 12, are
Ŷ = 57.8 + (2.05)(12) ≈ 82.4 (Rs. 00000)