
Copyright © 2020 American Scientific Publishers. All rights reserved. Printed in the United States of America.
Nanoscience and Nanotechnology Letters, Vol. 12, 54–61, 2020

Bayesian Structural Time Series


Abdullah M. Almarashi∗ and Khushnoor Khan∗
Department of Statistics, Faculty of Science, King Abdulaziz University, Jeddah, 21589, Saudi Arabia

The current study focused on modeling time series with the Bayesian Structural Time Series (BSTS) technique on a univariate data set. Real-life secondary data on stock prices for Flying Cement covering a period of one year were used for the analysis. Statistical results were based on simulation procedures using the Kalman filter and Markov Chain Monte Carlo (MCMC). Though the current study involved stock price data, the same approach can be applied to complex engineering processes involving lead times. Results from the current study were compared with the classical Autoregressive Integrated Moving Average (ARIMA) technique. The Bayesian posterior sampling distributions were worked out with the BSTS package run in R. Four BSTS models were applied to a real data set to demonstrate the working of the BSTS technique. The predictive accuracy of the competing models was assessed using forecast plots and the Mean Absolute Percent Error (MAPE). An easy-to-follow approach was adopted so that both academicians and practitioners can easily replicate the mechanism. Findings from the study revealed that, for short-term forecasting, both ARIMA and BSTS are equally good, but for long-term forecasting, BSTS with local level is the most plausible option.

Keywords: Bayesian, Time Series, Structural, ARIMA, State-Space, Prior, Posterior.

1. INTRODUCTION

A time series is a stochastic process indexed by time, and the most commonly used technique for forecasting a time series is the autoregressive integrated moving average (ARIMA) methodology. The traditional ARIMA technique relies on the data alone, without any knowledge of the mechanism from which the data were generated. Structural models, on the other hand, examine whether the patterns predicted by each component (linear, seasonal, random) correspond to what is expected. This feature makes structural models more flexible in the prediction process. Furthermore, the stationarity requirement of the ARIMA approach implies differencing the series under study [1], but the exact order of integration is hard to work out. It is because of such artefacts that precise prediction models cannot be suggested. As noted in Ref. [2], the main advantage of structural models is that they can be run efficiently with specific software without much mathematical hassle, yet they are taught in only a limited number of institutions of higher learning. The same is not true for the ARIMA approach, which is widely taught and practiced for forecasting time series data. The usual problems with the ARIMA technique are handling missing observations when they occur, and independent variables, which are cumbersome to deal with because ARIMA is atheoretical in nature; seasonal effects and random fluctuations are not addressed as well as in structural models [3].

It has been proposed that in analyzing time series with missing observations, one should employ the state-space representation and recursive equations using the Kalman filter. The working of state-space models is Markovian in nature, as the future state depends only on the present state; computations are iterative, so high-dimensional models can be worked out without tedious mathematical procedures. Good introductions to state-space models are available [3–5], many results were produced in what is now a classic reference [6], and excellent, more advanced treatments of the subject also exist [7, 8]. These models form a broadly useful class of time series models, commonly referred to as "structural time series," "state-space models," "Kalman filter models," and "dynamic linear models," among others. BSTS was proposed by Scott and Varian [9, 10], who explored it as a technique for feature selection, time series forecasting, and inference of causal relationships. Mokilane et al. [11] applied BSTS to long-term electricity demand forecasting in South Africa while quantifying uncertainties. An analysis of sustainable technology using BSTS, with application to artificial intelligence technologies, was proposed by Jun [12].

∗Authors to whom correspondence should be addressed.

54    Nanosci. Nanotechnol. Lett. 2020, Vol. 12, No. 1    1941-4900/2020/12/054/008    doi:10.1166/nnl.2020.3083


Though BSTS is a new area in modeling and predicting time-generated variables, it has gained currency in diversified fields of study, ranging from industrial engineering to predicting electricity demand and studying the behavior of stock returns. For studies of causality and further in-depth treatments, see [13–16]. After going through the literature, we did not come across any study that has compared four BSTS models (local level, local linear, seasonal and semi-local linear) on a single real data set at the same time. Hence, the current study is the first to study the effect of these four BSTS models, alongside the ARIMA technique, in forecasting the stock returns of a real data set. The present study focuses on the application of the four BSTS models to the stock prices of Flying Cement, and it also compares the results with the classical ARIMA model. An easy-to-follow approach involving little mathematical hassle is demonstrated so that practitioners can easily apply the models in diversified fields of study.

1.1. Format of the Paper
The current study is organized as follows: Section 2 introduces the competing forecasting techniques (ARIMA and structural time series) and the Bayesian framework on structural time series, together with the equations used to derive the results; the criteria for comparing the competing models are also explained there. Section 3 produces and explains the results obtained with the appropriate statistical tools. Section 4 compares the outcomes of the competing models in two different ways, and Section 5 concludes the study.

2. METHODS
2.1. ARIMA
The procedure for fitting the ARIMA approach of Box and Jenkins [1] to stationary data is summarized as follows:
• Construct correlograms and work out the p and q terms: the PACF indicates the AR (p) terms and the ACF indicates the MA (q) terms.
• Fit the model with the terms worked out in the first step.
• Find the residuals and perform diagnostic tests. If the residuals are IID (independently and identically distributed), the fitted model is considered suitable; otherwise the process is repeated with other AR and MA terms.
• Apply the model for forecasting and compare different forecasting models using information criteria such as AIC, SBC and BIC.

The ARIMA (p, d, q) model is useful for modelling a short-memory process and is specified as follows:

    φ_p(B)(1 − B)^d (Y_t − x_tᵀβ) = θ_q(B) ε_t    (1)

where

    φ_p(B) = 1 − φ_1 B − φ_2 B² − ⋯ − φ_p B^p,  with  B^p Y_t = Y_{t−p},
    θ_q(B) = 1 + θ_1 B + θ_2 B² + ⋯ + θ_q B^q,
    (1 − B)^d = Σ_{j=0}^{d} (−1)^j C(d, j) B^j

The ARIMA (p, d, q) models are based on the choice of optimum values of p and q, and are useful for modeling the mean of a process given that the variance is constant.

Abbasi et al. [17] dealt with the ARIMA analysis of the data under study, and Almarashi et al. [18] studied the GARCH modelling of the same data in detail. Reference [17] used secondary stock price data from Ref. [19] covering 1st January 2016 to 30th January 2017. For model estimation and validation, the data set was split into two parts: 1st January 2016 to 31st December 2016 (the in-sample) for fitting the proposed models, and 1st January 2017 to 20th January 2017 (the out-of-sample) for cross-validation. The predicted values and the MAPE for the ARIMA (1,2,1) model on the secondary data from Ref. [18] are exhibited in Table I.

ARIMA is an atheoretical approach to forecasting time series; that is, there are no a priori expectations (no theoretical underpinnings) in ARIMA modelling. The value of MAPE indicates that ARIMA (1,2,1) predicts well, but the model is unable to theorize the time series itself. All one can know is that the ARIMA model fits well using a combination of AR, lagged and MA terms.

Table I. Cross validation of ARIMA model for one month.

MM/DD/YY   Original   Ln (Original)   Predicted   Ln (Predicted)
01/02/17   15.00      2.71            15.05       2.71
01/03/17   15.26      2.72            15.10       2.71
01/04/17   14.95      2.70            15.32       2.73
01/05/17   14.50      2.67            15.10       2.72
01/06/17   14.35      2.66            14.66       2.68
01/09/17   14.44      2.67            14.46       2.67
01/10/17   14.40      2.67            14.50       2.67
01/11/17   14.40      2.67            14.49       2.67
01/12/17   14.33      2.66            14.48       2.67
01/13/17   14.14      2.65            14.42       2.67
01/16/17   14.45      2.67            14.24       2.66
01/17/17   14.20      2.65            14.47       2.67
01/18/17   14.25      2.66            14.31       2.66
01/19/17   14.54      2.68            14.31       2.66
01/20/17   14.30      2.66            14.56       2.68
MAPE: 1.214%

Figure 1 displays the forecasted values along with the original data; the light grey area depicts the 95% confidence interval and the slightly darker grey area the 80% confidence interval. The dark blue line indicates the predicted values for the next 15 days of stock returns.

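The binomial expansion of the differencing operator (1 − B)^d given with Eq. (1) can be checked numerically. The sketch below is illustrative only (the paper's computations were done in R): it applies d-th order differencing to a short series both via the binomial coefficients and by repeated first differences, and confirms the two agree.

```python
from math import comb

def diff_coeffs(d):
    """Coefficients of (1 - B)^d: (-1)^j * C(d, j) for j = 0..d."""
    return [(-1) ** j * comb(d, j) for j in range(d + 1)]

def difference(series, d):
    """Apply (1 - B)^d to a series using the binomial coefficients."""
    c = diff_coeffs(d)
    return [sum(c[j] * series[t - j] for j in range(d + 1))
            for t in range(d, len(series))]

def repeated_first_diff(series, d):
    """Apply first differencing d times."""
    for _ in range(d):
        series = [series[t] - series[t - 1] for t in range(1, len(series))]
    return series

y = [15.00, 15.26, 14.95, 14.50, 14.35, 14.44, 14.40]
assert diff_coeffs(2) == [1, -2, 1]      # (1 - B)^2 = 1 - 2B + B^2
assert all(abs(a - b) < 1e-12
           for a, b in zip(difference(y, 2), repeated_first_diff(y, 2)))
```

With d = 2, as used by the ARIMA (1,2,1) model above, both routes yield the same twice-differenced series.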

Fig. 1. Original time series of stock returns for 245 days along with the forecasts of 20 days from ARIMA (1, 2, 1).

2.2. Structural Time Series
Structural time series models are in fact the building blocks of BSTS. In a structural time series model the data come from some unobserved process known as the state space, and the observed data are generated from the state space with added noise. It is this unobserved state space, rather than the observed data, which is modelled. The underlying unobserved components responsible for generating the data, such as trend, seasonal, cycle, and the effects of explanatory and intervention variables, are identified separately before being used in a state-space model. Reference [20], while enumerating the advantages of structural models over ARIMA, writes: "the ARIMA methodology constitutes itself as a kind of black-box in which the adopted model depends entirely from the data, without a prior analysis of the structure underlying the generating system. Structural models are, from this point of view, more transparent as they allow checking if the predicted behavior by the model for each component corresponds to what is expected from the data." What follows is a brief introduction to the state-space model and its components, with some basic notation.

2.3. Bayesian Framework on Structural Time Series
Bayesian structural time series is an amalgamation of time series models in state-space representation and Bayesian statistics. BSTS was originally proposed by Scott and Varian [9]. The first step in applying the Bayesian framework to the structural time series model is specifying the prior distribution for each parameter in the model (i.e., for the error variances). The error variances are treated as parameters in all the equations and, assuming independent inverse Gamma priors, have conditionally independent inverse Gamma full conditional distributions. The second step is to obtain the posterior distributions. These are worked out numerically using simulation techniques run on software. The current study utilizes the following two techniques:
• Kalman filter and Kalman smoothing
• the Markov Chain Monte Carlo (MCMC) algorithm.

Deriving results mathematically using BSTS used to be quite cumbersome, but thanks to modern computational facilities such as R, BSTS can now be run without much effort through the bsts R package written by Steven L. Scott at Google. In BSTS, the forecasts are worked out from the posterior predictive distribution, while Kalman filtering and MCMC procedures are used for fitting the model. Filtering simply means updating our knowledge of the system whenever a new observation is added. Apart from having multiple applications in diversified areas of study, BSTS's optimum use is in the analysis of time series data. The current study mainly relies on the Kalman filter given by Harvey [5] and Kalman [21], which is mainly concerned with time series decomposition. Here the researcher adds different state variables to the model, usually starting with the local level state, then the local linear trend state, the seasonal state, and, if need be, the regression state. Readers interested in the computational details of the Kalman filter can see [2] or numerous other sources. For analyzing the Flying Cement stock prices, we start with the local level, then add the local linear trend and the monthly seasonal component, and lastly the semi-local linear trend state component. All the states in the four models depend only on the previous state, so they form a Markov chain.

2.4. State-Space Models
2.4.1. State-Space Model
A structural time series model can be described by a pair of equations relating y_t to a vector of latent state variables α_t:

    y_t = Z_tᵀ α_t + ε_t,  ε_t ∼ N(0, H_t)    (2)

For example, with a trend component T_t and a seasonal component S_t,

    α_t = [T_t, S_t]ᵀ,  Z_t = [1, 1]ᵀ

α_t is the state of the system at time t. Equation (2) is called the observation equation, because it links the observed data y_t with the unobserved latent state α_t.

    α_{t+1} = T_t α_t + R_t η_t,  η_t ∼ N(0, Q_t)    (3)

Equation (3) is called the state (or transition) equation, because it defines how the latent state evolves over time. The model matrices Z_t, T_t and R_t typically contain a mix of known values (often 0 and 1) and unknown parameters. Equations (2) and (3) together are called a state-space model [9]. One can directly interpret a structural time series model in terms of the components it is made of.

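For the simplest special case, the local level model introduced in the next section (where Z_t, T_t and R_t all collapse to the scalar 1), the Kalman filter recursions implied by Eqs. (2) and (3) reduce to a few lines of arithmetic. The Python sketch below is illustrative only; the paper's analysis relied on the Kalman filter inside the bsts R package, and the variance values here are arbitrary.

```python
def local_level_filter(ys, sigma_obs2, sigma_level2, a0=0.0, p0=1e6):
    """Kalman filter for the local level model:
       y_t = mu_t + eps_t,   mu_{t+1} = mu_t + eta_t.
    Returns the filtered means of mu_t, i.e., the knowledge of the
    state updated as each new observation arrives (filtering)."""
    a, p = a0, p0                  # predicted state mean and variance
    filtered = []
    for y in ys:
        f = p + sigma_obs2         # variance of the one-step forecast error
        k = p / f                  # Kalman gain
        a = a + k * (y - a)        # update mean with the new observation
        p = p * (1 - k)            # update variance
        filtered.append(a)
        p = p + sigma_level2       # predict next state variance
    return filtered

ys = [15.00, 15.26, 14.95, 14.50, 14.35, 14.44]
mus = local_level_filter(ys, sigma_obs2=0.05, sigma_level2=0.01)
# each filtered mean is a weighted average of past information and the
# new observation, so it stays inside the range of the data seen so far
assert all(min(ys[:t + 1]) - 1e-4 <= m <= max(ys[:t + 1]) + 1e-4
           for t, m in enumerate(mus))
```

The large initial variance p0 plays the role of a diffuse prior on the starting level; the first filtered mean is then essentially the first observation.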

For instance, one may study the classical decomposition, where a time series is the addition or multiplication of four components, namely trend, season, cycle and regression [22], depending on the pattern exhibited in the time series plot.

2.4.2. Local Level Model
The local level model is the simplest structural time series model. It assumes the trend is a random walk:

    y_t = μ_t + ε_t,  ε_t ∼ N(0, σ²_ε)    (4)
    μ_{t+1} = μ_t + η_t,  η_t ∼ N(0, σ²_η)    (5)

In the local level model the matrices Z_t, T_t and R_t of Eqs. (2) and (3) collapse to the scalar value 1. The parameters of the model are the variances of the error terms (σ²_ε, σ²_η).

2.4.3. Local Linear Trend Component
The local linear trend assumes that both the mean and the slope follow random walks. The equations for the mean are:

    y_t = μ_t + ε_t,  ε_t ∼ N(0, σ²_ε)    (6)
    μ_{t+1} = μ_t + δ_t + η_t,  η_t ∼ N(0, σ²_η)    (7)

and the equation for the slope is:

    δ_{t+1} = δ_t + ν_t,  ν_t ∼ N(0, σ²_ν)    (8)

Here there are three parameters of the model (error variances): (σ²_ε, σ²_η, σ²_ν).

2.4.4. Seasonal State for Monthly Data
A seasonal state component for daily data represents the contribution of each month to the annual seasonal cycle, i.e., the "January, February, March, …" effect, with 12 months as 12 seasons. There is a change of one step at the start of each month, and thereafter the contribution of that month is constant over the course of the month. The state of this model is an 11-vector γ_t, where the first element is the contribution to the mean for the current month and the remaining elements are the values for the 10 most recent months. When t is the first day in the month, then

    y_t = μ_t + γ_{t,1} + ε_t,  ε_t ∼ N(0, σ²_ε)    (9)
    γ_{t+1,1} = −Σ_{i=1}^{11} γ_{t,i} + ω_t,  ω_t ∼ N(0, σ²_ω)    (10)

and the remaining elements are shifted down one. When t is any other day, then γ_{t+1} = γ_t. If the local level, trend and seasonal components are used simultaneously, then there are four parameters, i.e., four error variance terms (σ²_ε, σ²_η, σ²_ν, σ²_ω).

2.4.5. Semi-Local Linear Trend Model
Though the local linear trend and the semi-local linear trend are somewhat similar, the latter is more useful for long-term forecasting. It assumes that the level component moves according to a random walk, but the slope component moves according to an AR(1) process centered on a potentially nonzero value D. The equation for the level is:

    μ_{t+1} = μ_t + δ_t + η_t,  η_t ∼ N(0, σ²_η)    (11)

The equation for the slope is:

    δ_{t+1} = D + ρ(δ_t − D) + ν_t,  ν_t ∼ N(0, σ²_ν)    (12)

This model assumes the slope δ_t follows a stationary AR(1) process rather than a random walk, and this is where it differs from the local linear trend. As far as controlling variability is concerned, a stationary AR(1) is preferable to a random walk. Moreover, the AR(1) model is the better choice when long-term forecasts are required, and it often produces more realistic estimates. The long-run slope of the trend component is the parameter D, and δ_t will eventually revert to D. However, δ_t can have short-term autoregressive deviations from the long-term trend, with memory determined by ρ: values of ρ close to 1 lead to long deviations from D.

There are four independent components of the prior distribution for the semi-local linear trend:
• an inverse Gamma prior on σ²_η,
• an inverse Gamma prior on σ²_ν,
• a Gaussian prior on the long-run slope parameter D, and
• a potentially truncated Gaussian prior on the AR(1) coefficient ρ. If the prior on ρ is truncated to (−1, 1), the slope will exhibit short-term stationary variation around the long-run slope D.

2.5. Pros and Cons of BSTS
• Bayesian time series models are considerably more transparent than ARIMA models. They also enable improved handling of uncertainty, which is a vital feature for future planning. Bayesian time series models fit naturally with sequential learning and decision making, and they lead directly to exact small-sample results.
• The precision of the forecasting results may in some cases be slightly inferior to that of an ARIMA model.

2.6. Data/Analysis
For the proposed BSTS technique, secondary data consisting of 260 days of stock prices for Flying Cement, collected by Ref. [17], are used. For extracting the empirical results, the R package bsts, written and developed by Ref. [9], is employed. As a first step, models with local level, local linear, seasonal (monthly) and semi-local linear trend components are fitted to the Flying Cement stock returns data. In the second step, the posterior distributions of our model time series are generated and displayed through several plots.

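The mean-reverting behavior of the slope in Eq. (12) is easy to see by iterating the recursion. In the Python sketch below (illustrative only; the values of D, ρ and the starting slope are arbitrary, and the noise term ν_t is switched off to isolate the mean reversion) the deviation of the slope from the long-run value D shrinks geometrically by a factor ρ per step.

```python
def iterate_slope(delta0, D, rho, steps):
    """Iterate delta_{t+1} = D + rho * (delta_t - D) with no noise.
    The deviation from D shrinks by a factor rho at each step."""
    deltas = [delta0]
    for _ in range(steps):
        deltas.append(D + rho * (deltas[-1] - D))
    return deltas

D, rho = 0.2, 0.9
deltas = iterate_slope(delta0=2.0, D=D, rho=rho, steps=100)
deviations = [abs(d - D) for d in deltas]
# deviations from the long-run slope shrink monotonically ...
assert all(b < a for a, b in zip(deviations, deviations[1:]))
# ... and after 100 steps the slope is essentially at D
assert deviations[-1] < 1e-4
```

With ρ close to 1, the same code shows much slower decay, matching the remark above that values of ρ near 1 produce long deviations from D.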

Lastly, the predictive ability of the competing models is compared using graphical displays and by computing the MAPE. The form of MAPE for the current study is:

    MAPE = (100/n) Σ |(Observed − Forecast)/Observed|    (13)

MAPE is used under the assumption that the forecast with the smallest past errors is most likely to have the smallest future errors.

3. RESULTS
The data for the study behaved as displayed in Figure 2. The upper panel shows the original time series of the stock returns [log (stock prices)], varying from 1.0 to 2.8. The lower panel of Figure 2 displays the monthly seasonality; this aspect was in fact overlooked by the previous authors when using the ARIMA model. We begin with a simple BSTS model, the local level, and gradually increase the complexity by adding the local linear trend, the monthly seasonal trend and the semi-local linear trend.

Fig. 2. Upper panel showing original data, lower panel showing monthly seasonal time series.

3.1. Expectation of Posterior Distributions and Forecasts
The plots displayed in Figures 3–6 are the expectations of the posteriors of the state components (left panels) and the forecasts of the original data (right panels) for the four BSTS models. The blue bubbles represent the original data. The posterior distributions in the four BSTS models were extracted using 1000 MCMC iterations. It was quite difficult to tell, merely by looking at the expectation-of-posterior plots, which BSTS model was the most plausible, as the original and posterior distributions overlapped. The plots shown in the right panels are the original data series along with the forecasts of 20 days (i.e., one month), with blue lines indicating the median of the predictive distribution and green dashed lines the 95% prediction interval. The grey shading is the posterior predictive density; its spread indicates the range within which the values of the stock returns may lie on the y-axis. Table II shows the 95% confidence intervals of the 15-day forecasts for ARIMA and the four BSTS models.

Table II. 95% Confidence/prediction intervals of competing models.

Models                    95% CI       Rank w.r.t. width of CI
ARIMA                     2.52–2.89    2
Local level               2.51–2.86    1
Local linear level        2.32–3.20    4
Monthly seasonal level    2.25–3.25    5
Semi-local linear level   2.51–2.90    3

Fig. 3. Expectation of posteriors and forecasts along with original data for the local level model.

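Eq. (13) translates directly into code. The small Python helper below is illustrative only: the MAPE figures reported in Tables I and III come from the authors' R runs, and the numbers used here are made-up values.

```python
def mape(observed, forecast):
    """Mean Absolute Percent Error, as in Eq. (13):
    the mean of |(observed - forecast) / observed|, times 100."""
    if len(observed) != len(forecast):
        raise ValueError("series must have equal length")
    n = len(observed)
    return 100 * sum(abs((o - f) / o)
                     for o, f in zip(observed, forecast)) / n

# hypothetical 4-day forecast check
observed = [100.0, 200.0, 50.0, 80.0]
forecast = [110.0, 190.0, 50.0, 76.0]
# per-day errors: 10%, 5%, 0%, 5%  ->  MAPE = 5%
assert abs(mape(observed, forecast) - 5.0) < 1e-9
```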

Fig. 4. Expectation of posteriors and forecasts along with original data for the local linear model.

Looking at the plots shown in Figures 3–6, all four models appear to fit the data well, and it is difficult to differentiate among the competing models. To overcome this difficulty, the predictive accuracy of the competing models is evaluated using (a) a combined graph of the four models' cumulative predictive errors, and (b) computation and comparison of the MAPE of the competing models.

3.2. Comparison Among Competing Models (Diagnostics)
The original stock returns are shown in the lower panel and the cumulative absolute errors in the upper panel of Figure 7. The last points of the lines in the top panel indicate the mean absolute prediction error for the competing models. Here again it was observed that the local level and semi-local linear trend models have the minimum cumulative predictive errors.

Fig. 5. Expectation of posteriors and forecasts along with original data for the monthly seasonal model.

Fig. 6. Expectation of posteriors and forecasts along with original data for the semi-local linear model.

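The diagnostic in Figure 7 is based on cumulative absolute prediction errors. The Python sketch below shows how such curves can be computed and compared (illustrative only; the two forecast series are invented, not the Flying Cement data).

```python
def cumulative_abs_errors(observed, forecast):
    """Running total of |observed_t - forecast_t|; the final value,
    divided by n, is proportional to the mean absolute error."""
    total, curve = 0.0, []
    for o, f in zip(observed, forecast):
        total += abs(o - f)
        curve.append(total)
    return curve

observed   = [14.5, 14.3, 14.4, 14.6, 14.2]
forecast_a = [14.4, 14.4, 14.3, 14.5, 14.3]   # model A: small errors
forecast_b = [14.0, 14.9, 13.9, 15.1, 13.7]   # model B: larger errors
curve_a = cumulative_abs_errors(observed, forecast_a)
curve_b = cumulative_abs_errors(observed, forecast_b)
# model A's cumulative error curve ends lower, so A predicts better
assert curve_a[-1] < curve_b[-1]
assert all(x <= y for x, y in zip(curve_a, curve_a[1:]))  # non-decreasing
```

Plotting such curves for all competing models on one axis reproduces the kind of comparison shown in the upper panel of Figure 7.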

Fig. 7. Cumulative absolute error in the upper panel and original stock returns in the lower panel.

Table III. Competing models in terms of MAPE.

                                MAPE (%), forecasts (number of iterations = 1000)
Models                          15 days   20 days   30 days   40 days
ARIMA                           1.62      1.12      6.14      23.17
Local level                     1.29      1.69      2.75      20.75
Local linear level              2.87      2.86      6.00      32.42
Local linear level + seasonal   4.14      2.15      2.44      40.50
Semi-local linear trend level   2.88      1.30      6.01      13.29

Note: Italicized values are the lowest MAPE.

Table III gives the MAPE for the five competing models; the values in the table were not calculated by a seed-setting procedure but are the mean values of 1000 simulations repeated five times. For short-term forecasting, ARIMA and the local level BSTS model have the best predictive ability, followed by the local linear trend and the semi-local linear trend, but for long-term forecasting the local level BSTS model stands out from the rest with the minimum MAPE of 2.75%.

4. CONCLUSION
A zero-error mechanism for forecasting time series data is still a far cry; no statistical tool can remove inherent uncertainty or extract clear signals from limited data. BSTS to some extent optimizes the chances of success in forecasting. Overall, it is fair to conclude that the BSTS model offers practitioners a very good alternative strategy for modeling or forecasting time series in a step-wise manner without mathematical hassle. However, the following limitations should be noted:
(1) The present study focused on comparing different models using a univariate time series, which may limit the generalizability of the study. For more in-depth analysis, multivariate series should also be studied.
(2) For enhanced comprehension of predictive models in time series, future studies may also apply and compare Dynamic Linear Models, apart from ARIMA and structural models.
(3) On the same data set, a comparative analysis of exponential smoothing and Bayesian analysis may be conducted in future studies using some robust R packages.

Acknowledgments: The authors are highly indebted to Professor Muhammad Qaiser Shahbaz for providing the requisite impetus to carry out the current research, and to all the reviewers, without whose feedback this work would have stalled.

References and Notes
1. Box, G.E.P. and Jenkins, G.M., 1976. Time Series Analysis: Forecasting and Control. San Francisco, Holden-Day.
2. Durbin, J. and Koopman, S.J., 2001. Time Series Analysis by State Space Methods. Oxford, UK, Oxford University Press.
3. Brockwell, P.J. and Davis, R.A., 1991. Time Series: Theory and Methods. New York, NY, USA, Springer.
4. Hamilton, J.D., 1994. Time Series Analysis. Princeton, New Jersey, Princeton University Press.
5. Harvey, A.C., 1989. Forecasting, Structural Time Series Models and the Kalman Filter. Cambridge, United Kingdom, Cambridge University Press.
6. Anderson, B.D.O. and Moore, J.B., 1979. Optimal Filtering. New Jersey, Prentice-Hall.
7. Caines, P.E., Meyn, S.P. and Mosca, E., 1987. On the Steady State Behavior of Stochastic Processes with Applications to ARMAX Systems. 26th IEEE Conference on Decision and Control, Los Angeles, California, USA. pp.1185–1188.
8. Benveniste, A., Hannan, E.J. and Deistler, M., 1989. The statistical theory of linear systems. Statistical Papers, 30, pp.239–242.
9. Scott, S.L. and Varian, H.R., 2014. Predicting the present with Bayesian structural time series. International Journal of Mathematical Modelling and Numerical Optimisation, 5(1–2), pp.4–23.
10. Scott, S.L. and Varian, H.R., 2015. Bayesian Variable Selection for Nowcasting Economic Time Series. Economics of Digitization, Cambridge, MA, NBER Press. pp.119–136.
11. Mokilane, P., Debba, P., Yadavalli, V. and Sigauke, C., 2019. Bayesian structural time-series approach to a long-term electricity demand forecasting. Applied Mathematics and Information Sciences, 13, pp.189–199.
12. Jun, S., 2019. Bayesian structural time series and regression modeling for sustainable technology management. Sustainability, 11(18), pp.1–12.
13. Brodersen, K.H., Gallusser, F., Koehler, J., Remy, N. and Scott, S.L., 2015. Inferring causal impact using Bayesian structural time-series models. The Annals of Applied Statistics, 9(1), pp.247–274.
14. George, E.I. and McCulloch, R.E., 1997. Approaches for Bayesian variable selection. Statistica Sinica, 7, pp.339–373.
15. Ghosh, J. and Clyde, M.A., 2011. Rao-Blackwellization for Bayesian variable selection and model averaging in linear and binary regression: A novel data augmentation approach. Journal of the American Statistical Association, 106(495), pp.1041–1052.


16. Peters, G.P., Weber, C.L., Guan, D. and Hubacek, K., 2007. China's growing CO2 emissions: A race between increasing consumption and efficiency gains. Environmental Science & Technology, 41, pp.5939–5944.
17. Abbasi, U., Almarashi, A.M., Khan, K. and Hanif, S., 2017. Forecasting cement stock prices using ARIMA model: A case study of Flying Cement. Science International, 29(5), pp.1039–1043.
18. Almarashi, A.M., Abbasi, U., Saman, H., Alzahrani, M.R. and Khushnoor, K., 2018. Modeling volatility in stock prices using ARCH/GARCH. Science International, 30(1), pp.89–94.
19. Flying Cement Limited Stock Prices Data, (https://pkfinance.info/kse/stock/FLYNG).
20. Joao, T.J., 2009. Structural time series models and the Kalman filter: A concise review. FEUNL Working Paper Series. Universidade Nova de Lisboa, Faculdade de Economia.
21. Kalman, R.E., 1960. A new approach to linear filtering and prediction problems. Journal of Basic Engineering, 82, pp.35–45.
22. Scott, S.L., 2017. Fitting Bayesian Structural Time Series with the bsts R package. (http://www.unofficialgoogledatascience.com/2017/07/fitting-bayesian-structural-time-series.html).

Received: 6 January 2020. Accepted: 4 May 2020.
