
Journal of Decision Systems


Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/tjds20

A Dashboard to Support Management of Business Analytics Capabilities

David Santiago Rivera & Graeme Shanks

Department of Computing and Information Systems, The University of Melbourne, Melbourne, Australia

Published online: 30 Jan 2015.

To cite this article: David Santiago Rivera & Graeme Shanks (2015): A Dashboard to
Support Management of Business Analytics Capabilities, Journal of Decision Systems, DOI:
10.1080/12460125.2015.994335

To link to this article: http://dx.doi.org/10.1080/12460125.2015.994335

Journal of Decision Systems, 2015
http://dx.doi.org/10.1080/12460125.2015.994335

A Dashboard to Support Management of Business Analytics Capabilities

David Santiago Rivera* and Graeme Shanks

Department of Computing and Information Systems, The University of Melbourne, Melbourne, Australia

(Received 13 October 2014; accepted 3 November 2014)

Business analytics (BA) systems create value and provide competitive advantage for organisations. They involve technology and data infrastructure, BA capabilities and business processes that provide useful insights and support decision-making. To provide value and competitive advantage, BA capabilities should be valuable, rare and inimitable, and have organisational support (VRIO). In this paper, we develop and evaluate a prototype dashboard for the VRIO assessment of BA capabilities. The dashboard is intended to support the strategic management of BA capabilities. We discuss implications of the prototype dashboard for researchers and practitioners and suggest directions for future research.
Keywords: business analytics; resource-based view; information dashboard design;
design science

1. Introduction
Business analytics (BA) systems involve the use of BA capabilities and technologies to
collect, transform, analyse and interpret data to support decision-making (Cosic,
Shanks, & Maynard, 2012; Davenport & Harris, 2007). BA capabilities and technolo-
gies comprise data warehousing, reporting, online analytical processing (OLAP), dash-
boarding, data visualisation, predictive modelling and forecasting systems. BA systems
provide value to organisations by improving business processes, supporting decision-
making (Carte, Schwarzkopf, Shaft, & Zmud, 2005; Kohavi, Rothleder, & Simoudis,
2002; Piccoli & Watson, 2008) and providing competitive advantage (Davenport &
Harris, 2007).
While a number of research studies have explained how and why BA systems can
provide value to organisations using the resource-based view (RBV; e.g. Shanks &
Bekmamedova, 2012), few have addressed the strategic management of BA capabili-
ties. To provide strategic value and competitive advantage, BA capabilities should be
valuable, rare and inimitable, and have the support of the organisation (VRIO; Barney,
1997). In order to strategically manage BA capabilities, managers need to understand
and develop the BA capabilities within their organisations along these four dimensions.
In this paper, we discuss how each of these dimensions might be measured. We design,
develop and evaluate a prototype tool to enable the measures to be visualised on a
dashboard to support their strategic management.

*Corresponding author. Email: [email protected]

© 2015 Taylor & Francis



There are three motivations for our research. First, BA systems can provide compet-
itive advantage and are an important strategic investment for many organisations
(Davenport, Harris, & Morison, 2010). Second, BA is consistently ranked highly
amongst the concerns of chief information officers who need to understand how to
manage the development of BA capabilities (Hagerty, Sallam, & Richardson, 2012).
Third, although there are a number of BA capability frameworks (e.g. Cosic et al.,
2012), there are currently no tools available to support the measurement and strategic
management of BA capabilities. Providing managers with information about the BA
capabilities in their organisations should therefore be of great value.
The research question we explore in this paper is: How can the management of BA
capabilities be effectively supported?
To answer this question, we use a design science research approach and develop
and evaluate a prototype dashboard for the VRIO assessment of BA capabilities.
The paper is organised as follows. First we discuss the background to the study,
including BA systems and capabilities, the resource-based view and capability assess-
ment, and dashboard design. We then describe the design science research approach
used in the study. Following that, we describe the design and development of the proto-
type dashboard to support assessment of BA capabilities. We then present the evalua-
tion of the prototype dashboard. Finally, we discuss limitations of the study, and
implications of the prototype dashboard for researchers and practitioners, and suggest
directions for future research.

2. Background
Four important areas of the literature are analysed in this section. First, we discuss BA
systems and explain how they provide business value. Second, we discuss the RBV
and define the VRIO dimensions of capabilities. Third, we define BA capabilities based
on the framework of Cosic et al. (2012). Fourth, we discuss the dashboard design
principles we used in designing the BA capability dashboard.

2.1. Business analytics systems and business value


BA systems enable managers and other decision-makers to interpret organisational data
to gain insights that are used to improve business processes and support decision-making
(Davenport & Harris, 2007). BA systems provide value and competitive advantage to
organisations in many areas including marketing, customer relationship management,
manufacturing, production planning, and supply chain operations (Davenport & Harris,
2007; Kohavi et al., 2002). BA systems include data warehousing, reporting, OLAP,
dashboarding and visualisation, and predictive modelling and forecasting systems
(Davenport & Harris, 2007).
Business value has been achieved from BA systems in areas including: marketing
applications that reduce customer attrition, and increase customer profitability and
response rates of marketing campaigns (Kohavi et al., 2002); manufacturing and pro-
duction planning applications that optimise order delivery (Kohavi et al., 2002); and
remote diagnostics and replenishment applications (Allmendinger & Lombreglia, 2005).
Business value is achieved when BA systems work synergistically with other organisa-
tional systems (Asadi Someh & Shanks, 2013). This requires well-developed BA capa-
bilities and strong integration effort from management to initiate, support and maintain
the interaction among BA capabilities and other organisational resources (Asadi Someh
& Shanks, 2013). High-quality management of BA capabilities is crucial in gaining benefits from BA systems.

2.2. Resource-based view


From the RBV perspective, organisations may be conceptualised as collections of
resources that enable them to succeed and compete. Resources comprise assets, includ-
ing hardware, software, data and people, and capabilities, including competencies
(skills) and practices (routines). Information systems researchers have used the RBV to
understand and explain how investments in IT lead to organisational benefits and com-
petitive advantage (Barney, 1991; Kraaijenbrink, Spender, & Groen, 2010; Wade &
Hulland, 2004). As IT assets may be considered to be commodities, we focus on capa-
bilities. To be strategically important and achieve competitive advantage, capabilities
should be valuable, rare and inimitable, and have the support of the organisation
(Barney, 1995; Barney, 1997).


Valuable capabilities are recognised as strengths of an organisation, enabling it to
exploit opportunities and neutralise threats. A capability is valuable when it allows an
organisation to devise and implement strategies that will improve its efficiency and
effectiveness (Barney, 1997). Rare capabilities are scarce and not possessed by an
organisation’s competitors. Capabilities that are both valuable and rare will provide an
organisation with competitive advantage. Capabilities that are valuable but not rare are
still important, providing benefits and enabling organisations to survive (Barney, 1997).
Inimitable capabilities are expensive to imitate or substitute. Capabilities that are valu-
able, rare and inimitable can provide sustained competitive advantage to an organisation
(Barney, 1997). Organisations that have capabilities that are valuable, rare and inimita-
ble must still support the exploitation of these capabilities to achieve competitive
advantage. Organisational support is visible through the provision of funding and
strong management support for capabilities (Barney, 1995; Barney, 1997).
In turbulent business environments, organisational capabilities that are relatively static need to evolve and develop (Pavlou & El Sawy, 2006). Dynamic
capabilities were conceptualised to focus on ‘resource renewal’, reconfiguring and
renewing capabilities into new organisational capabilities (Teece, Pisano, & Shuen,
1997). Examples of dynamic capabilities include new product development routines
and knowledge management practices (Eisenhardt & Martin, 2000). Dynamic capabili-
ties continually build, integrate and reconfigure internal and external capabilities to
address rapid changes in turbulent business environments.
Clearly, organisations need to understand and manage their capabilities in order to
achieve value and competitive advantage. Capabilities may be measured in several
ways, including by the levels of value, rarity, inimitability and organisational support.
Another commonly used measure is maturity (Lahrmann, Marx, Winter, & Wortmann,
2011). In this paper, we focus on the levels of value, rarity, inimitability and
organisational support for various BA capabilities, and develop a prototype dashboard
comprising information about these levels.

2.3. Business analytics capabilities


A number of BA capability frameworks exist in both the academic and practitioner
literature. We adopt the BA capability framework of Cosic et al. (2012), as it is
comprehensive and comprises BA capabilities that develop continuously rather than within a simple staged model (Becker, Knackstedt, & Pöppelbuß, 2009).
The Business Analytics Capability Maturity Model (BACMM) of Cosic et al.
(2012) comprises 16 BA capabilities grouped into four capability areas: governance,
culture, technology and people (see Tables 1–4 below). The governance capability area
is defined as the mechanism for managing the use of BA resources within an organisa-
tion, and the assignment of decision rights and accountabilities to align BA initiatives
with organisational objectives. The culture capability area is defined as the tacit and
explicit organisational norms, values and behavioural patterns that form over time and
lead to systematic ways of gathering, analysing and disseminating data. The technology
capability area is defined as the development and use of hardware, software and data
within BA activities. The people capability area is defined as the skills and knowledge
of the individuals within an organisation who use BA as part of their job function (Cosic et al., 2012).
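For reference in the remainder of the paper, the 16 capabilities and their grouping into the four capability areas (detailed in Tables 1–4 below) can be represented as a simple two-level structure. The following Python dictionary is our own illustrative representation and is not part of the BACMM itself.

```python
# Two-level grouping of the 16 BACMM capabilities (Cosic et al., 2012) into
# four capability areas; this dictionary is an illustrative representation only.
BACMM_CAPABILITIES = {
    "Governance": ["Decision rights", "Strategic alignment",
                   "Dynamic BA capabilities", "Change management"],
    "Culture": ["Evidence-based management", "Embeddedness",
                "Executive leadership and support", "Flexibility and agility"],
    "Technology": ["Data management", "Systems integration",
                   "Reporting and visualisation BA technology",
                   "Discovery BA technology"],
    "People": ["Technology skills and knowledge", "Business skills and knowledge",
               "Management skills and knowledge", "Entrepreneurship and innovation"],
}

# Sanity check: the framework comprises 16 capabilities in total.
assert sum(len(v) for v in BACMM_CAPABILITIES.values()) == 16
```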
Table 1. Governance capability area (Cosic et al., 2012).

BA capability | Description
Decision rights | The assignment of decision rights and accountabilities, determining who is responsible for making each kind of decision, who will provide input for the decision and how these people will be held accountable
Strategic alignment | The alignment of an organisation's BA initiatives with its business strategy, largely determined by the level of understanding that exists between the strategic and BA managers
Dynamic BA capabilities | The continuous renewal of an organisation's BA resource base and organisational capabilities in order to respond to changes in dynamic business environments; it involves searching for, selecting, funding and implementing new opportunities
Change management | Managing people who are affected by BA initiatives so that they accept and embrace technological and process changes

BA: business analytics.

Table 2. Culture capability area (Cosic et al., 2012).

BA capability | Description
Evidence-based management | A culture where formal authority, reputation, intuition and ad hoc decision-making are superseded by decisions based on data and quantitative analysis
Embeddedness | The extent to which BA has permeated the social fabric of the organisation and has become ingrained into people's values and daily work habits
Executive leadership and support | The ability of the senior managers within an organisation to infuse a passion for BA and data-driven decision-making throughout the organisation
Flexibility and agility | The level of change readiness within an organisation, relating to how receptive an organisation's non-managerial BA personnel are to changes in the business environment

BA: business analytics.

Table 3. Technology capability area (Cosic et al., 2012).

BA capability | Description
Data management | Management of an integrated and high-quality data resource, which is crucial to the success of BA
Systems integration | The seamless integration of BA systems with operational systems in order to exploit the capabilities of both systems
Reporting and visualisation BA technology | The development and utilisation of reports, dashboards, scorecards, online analytical processing (OLAP) and data visualisation technologies to display output information in a format that is readily understood by decision-makers
Discovery BA technology | The development and utilisation of sophisticated statistical and data-mining software applications to explore data and identify useful correlations, patterns and trends and extrapolate them to forecast what is likely to occur in the future

BA: business analytics.
Table 4. People capability area (Cosic et al., 2012).

BA capability | Description
Technology skills and knowledge | The skills and knowledge of BA technology specialists, including statistics, data management, reporting and visualisation, discovery BA technologies and information technology in general
Business skills and knowledge | The skills and knowledge of BA business specialists, including sales, finance, marketing, supply chain and production business systems
Management skills and knowledge | The skills and knowledge of management specialists, who are responsible for BA initiatives and projects, both enterprise-wide and in local business units
Entrepreneurship and innovation | The skills and knowledge of technology, business and management personnel to use BA technologies to develop innovative and more effective processes and products that result in better organisational performance and create competitive advantage

BA: business analytics.

2.4. Dashboard design principles


A dashboard is a means of presenting information to decision-makers, with a focus on
visual communication, so that important information is consolidated and arranged on a
screen in order to be easily monitored (Few, 2006). Summaries and exceptions are
included, together with the ability to readily access further relevant information if
required. A number of important dashboard design principles may be used for effective
dashboard design. These include principles of visual perception, choices of display media and things to avoid (Few, 2006).
Visual perception principles concern four aspects of dashboard design: colour, form,
spatial position and motion (Ware, 2000). Variations in hue and intensity may be used
to highlight information features and attract the user’s attention. The use of colour to
attract the attention of decision-makers should be done with care, as it is effective only
when few things are highlighted (Few, 2006). Objects that are near each other, con-
nected in some way or spatially aligned with each other are perceived to belong to the
same group. Information located near the centre or top left of the display is considered
to be more important (Few, 2006).

Several display media may be used in dashboard design, including graphs, icons,
text and organisers such as tables and spatial maps. Different types of graphs and icons
may be used to communicate different aspects of information; for example, line graphs
are particularly useful for trends and cycles in volatile information. Icons may be used
to communicate the simple meaning of facts in a clear manner. For example, up/down icons may be used to indicate whether a measure has increased or decreased. Some informa-
tion is better communicated using simple text, while spatial maps are ideally suited to
display geographic information (Few, 2006).
Things to avoid in dashboard design include using more than one screen to display
the dashboard, separating data that needs to be compared, providing data in the wrong
context, inappropriate displays such as complex three-dimensional graphs that are diffi-
cult for humans to understand, and the use of meaningless variety and complexity in
the design (Few, 2006).
3. Research approach
A design science research approach was used in the study. Design science involves the
creation and evaluation of innovative artefacts to enable organisations to accomplish
information-related tasks (Hevner, March, Park, & Ram, 2004; Kuechler & Vaishnavi,
2012). Development of the artefact may be considered ‘proof by demonstration’
(Moody & Shanks, 2003). The artefact in this study is a prototype dashboard to support
the effective management of BA capabilities. It is based in RBV theory, including BA
capabilities and their VRIO measurement, together with dashboard design principles.
Evaluation of the artefact in design science research involves demonstrating the util-
ity, quality and efficacy of the artefact (Hevner et al., 2004). In this study, the artefact
was evaluated using a ‘one-shot case study’ involving a group of surrogate decision-
makers (Moody & Shanks, 2003). The population for the empirical evaluation was
middle-level decision-makers. A sample was formed by opportunistically recruiting a
group of 20 people, each with a minimum of 3 years’ experience in information sys-
tems and BA. Each participant was provided with a brief demonstration of the dash-
board, and then asked to follow a script that required them to use the dashboard for
several decision-making tasks involving BA capabilities. Finally, each participant was
asked to complete a small survey about use of the dashboard.
The survey contained 12 questions about the use of the dashboard, each involving a
5-point Likert scale (ranging from 1 – ‘strongly disagree’ to 5 – ‘strongly agree’). The
first four questions were about the perceived ease of use of the dashboard. These ques-
tions were adapted from the original perceived ease-of-use questions in the technology
acceptance model (Davis, 1989; Moody & Shanks, 2003). The next five questions were
about the perceived usefulness of the dashboard. These questions were adapted from
the original perceived usefulness questions in the technology acceptance model (Davis,
1989; Moody & Shanks, 2003). The final three questions were about the perceived rel-
evance of the dashboard to practice in managing BA capabilities. These were adapted
from a data-quality dashboard-evaluation study reported in Moody and Shanks (2003).
The survey questions can be seen in Table 5 below.
Perceptual measures are widely used in social science research (Neuman, 2010).
They enable subjective measures to be quantified and used in the measurement of con-
structs. Data from the survey were analysed using one-sample t-tests that compared the scores obtained for each question with the mid-point of the scale (3, 'neither agree nor disagree'), assuming normally distributed values.

Table 5. Results of the evaluation of the dashboard.

Question | Mean response | Standard deviation | t-value | Significance level
1. I found the BA capability tool easy to learn | 4 | 0.649 | 6.89 | α < 0.01
2. I found the BA capability measures clear and easy to understand | 3.9 | 0.968 | 4.15 | α < 0.01
3. I found the dashboard flexible to interact with | 4.2 | 0.41 | 13.08 | α < 0.01
4. Overall, I found the BA capability tool easy to use | 4.4 | 0.503 | 12.46 | α < 0.01
5. Using the BA capability tool helps me to assess and compare BA capability effectiveness | 3.7 | 0.865 | 3.62 | α < 0.01
6. Using the BA capability tool helps to understand how BA capabilities change over time | 4.6 | 0.503 | 14.24 | α < 0.01
7. Using the BA capability tool helps to understand which BA capabilities need to be developed in the future | 4.05 | 0.686 | 6.84 | α < 0.01
8. The BA capability tool helps to compare BA capabilities in different parts of the organisation | 4.15 | 0.745 | 6.9 | α < 0.01
9. Overall, I found the BA capability tool useful | 4.2 | 0.41 | 13.08 | α < 0.01
10. The BA capability tool helps to understand the different BA capabilities needed for successful use of BA | 3.6 | 0.821 | 3.27 | α < 0.01
11. The BA capability tool would help practitioners in managing BA initiatives | 4.3 | 0.657 | 8.85 | α < 0.01
12. Understanding the importance and value of BA capabilities is highly relevant to practitioners | 4.45 | 0.759 | 8.54 | α < 0.01

The research approach used was evaluated according to the design science research
guidelines of Hevner et al. (2004). The evaluation is presented below in the discussion
section of the paper.

4. Design and development of prototype dashboard


The prototype dashboard was developed using a MySQL database to store the underly-
ing data about BA capabilities, together with a dashboard developed using Excel, and
comprising basic online analytical processing capabilities. An agile, incremental
development approach was used in building the prototype dashboard.

4.1. Architecture of the prototype dashboard


The architecture comprises three parts: data sources, data mart and dashboard (see
Figure 1 below). Data about BA capabilities is sourced from within the organisation
and loaded periodically into the data mart. The data mart is designed using a
dimensional model, comprising a fact table and four dimensions, and is discussed in
section 4.2 below (Moody & Kortink, 2003). The dashboard sources data from the data
mart and displays it according to dashboard design principles (Few, 2006).

Figure 1. Data mart and dashboard architecture.
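The following Python fragment sketches the periodic load step of this architecture under simplified assumptions. It is not the authors' implementation (which used a MySQL database and an Excel dashboard); the file name, table name and column names are hypothetical.

```python
# Illustrative periodic load of BA capability assessments into a data mart table.
# File, table and column names are hypothetical.
import csv
import sqlite3

conn = sqlite3.connect("ba_capability_mart.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS vrio_fact (
        capability_id INTEGER, stakeholder_id INTEGER, department_id INTEGER,
        month TEXT, value REAL, rarity REAL, inimitability REAL, org_support REAL
    )
""")

# Each source row is one stakeholder's assessment of one capability for one month.
with open("capability_assessments.csv", newline="") as f:
    rows = [
        (int(r["capability_id"]), int(r["stakeholder_id"]), int(r["department_id"]),
         r["month"], float(r["value"]), float(r["rarity"]),
         float(r["inimitability"]), float(r["org_support"]))
        for r in csv.DictReader(f)
    ]

conn.executemany("INSERT INTO vrio_fact VALUES (?, ?, ?, ?, ?, ?, ?, ?)", rows)
conn.commit()
conn.close()
```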

4.2. The dimensional model


Dimensional modelling was introduced by Ralph Kimball (Kimball & Ross, 2013) and
is widely used for data mart and data warehouse design. It is a constrained and simpli-
fied version of entity relationship modelling, intended for the design of retrieval-only
databases. A dimensional model comprises a fact table (containing the measures used
by decision-makers) and several dimension tables (the concepts by which the measures
will be analysed). The dimensional model to support management of BA capabilities
comprises one fact table and four dimension tables (see Figure 2 below).

Figure 2. Dimensional model.
The fact table contains VRIO measures for BA capabilities. These include a mea-
sure for each of value, rarity, inimitability and organisational support, together with a
VRIO index comprising the average of these four measures. Each measure used a linear
scale from 0 to 5, with 0 representing no score for the measure and 5 representing the
maximum score for the measure.
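As a concrete reading of this definition, the sketch below computes the VRIO index as the average of the four measures for a single assessment, with each measure constrained to the 0–5 scale. The function and argument names are ours, not the authors'.

```python
# VRIO index for one assessment: the average of the four 0-5 measures (illustrative).
def vrio_index(value: float, rarity: float, inimitability: float, org_support: float) -> float:
    scores = (value, rarity, inimitability, org_support)
    if not all(0 <= s <= 5 for s in scores):
        raise ValueError("each VRIO measure must lie on the 0-5 scale")
    return sum(scores) / len(scores)

print(vrio_index(4, 3, 2, 5))  # 3.5
```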
The four dimension tables were capability, stakeholder, department and time. The capability dimension includes a base granularity of individual BA capabilities, and a two-level hierarchy of BA capability area and overall BA capability. Data in the proto-
type dashboard system is based on Cosic et al. (2012), and includes 16 BA capabilities
and four capability areas. For example, the four BA capabilities evidence-based
management (Pfeffer & Sutton, 2006), BA embeddedness (Shanks & Bekmamedova,
2012), executive leadership and support and flexibility and agility are grouped into the
BA culture capability area (Cosic et al., 2012). This enabled the measures in the fact
table to be reported for either individual capabilities or at the aggregated level of
capability areas.
The stakeholder dimension includes a base granularity of individuals within an
organisation, and a two-level hierarchy of stakeholder role (job type) and stakeholder
type (IT or business focus). This enabled the measures in the fact table to be reported
for either individual stakeholders or at the aggregated level of different stakeholder
roles or stakeholder type.
The department dimension includes a base granularity of individual departments
within an organisation, and a two-level hierarchy of division and overall company. This
enabled the measures in the fact table to be reported for either individual departments
or at the aggregated levels of division and overall company.
The time dimension comprises a base granularity of month, and a two-level hierar-
chy of quarter and year. This enabled the measures in the fact table to be reported for
either individual months or at the aggregated levels of quarter and year.
The design of the dimensional model supports the management of BA capabilities
using VRIO measures that may be reported at multiple levels of detail along the four dimen-
sions. The measures in the fact table may be ‘sliced and diced’ and ‘rolled up and
drilled down’ in multiple ways. This provides data that is suitable for flexible
presentation and manipulation on the dashboard (Kimball & Ross, 2013).
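A minimal pandas sketch of this kind of roll-up and slicing is shown below, assuming a fact table already joined to the capability and time dimensions; the column names and example values are hypothetical.

```python
# Roll the VRIO index up from individual capabilities to capability areas per quarter,
# and slice by capability area (column names and values are hypothetical).
import pandas as pd

fact = pd.DataFrame({
    "capability":      ["Data management", "Systems integration", "Embeddedness"],
    "capability_area": ["Technology", "Technology", "Culture"],
    "quarter":         ["2014-Q3", "2014-Q3", "2014-Q3"],
    "vrio_index":      [3.5, 2.75, 4.0],
})

# 'Roll up': aggregate the measure to the capability-area level for each quarter.
rollup = fact.groupby(["capability_area", "quarter"])["vrio_index"].mean()
print(rollup)

# 'Slice': restrict the fact data to a single capability area.
technology_only = fact[fact["capability_area"] == "Technology"]
print(technology_only)
```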

4.3. The dashboard


The dashboard was designed using dashboard design principles (Few, 2006). Specifi-
cally, the dashboard was split into four main areas. The display of BA capability mea-
sures over time using a line graph was considered to be very important and located in
the top left corner of the dashboard. The area immediately below this was used to
display selected capabilities and their variation over several time periods, using bar
graphs. The third area was on the right side of the dashboard, and this was used to dis-
play the most recent measure together with several recent values, using a table. Finally
the fourth area was at the bottom of the screen and was used for controls. These were
able to select particular BA capabilities, levels of granularity and various other means
of displaying the data. A screenshot of the dashboard is shown below in Figure 3.

Figure 3. Screenshot of dashboard.
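As an illustration only, a rough equivalent of this four-area layout could be sketched with matplotlib as below; the prototype itself was built in Excel, and the values shown here are placeholder data.

```python
# Rough four-area dashboard layout (illustrative; the prototype used Excel, not matplotlib).
import matplotlib.pyplot as plt

fig = plt.figure(figsize=(10, 6))
grid = fig.add_gridspec(3, 2)

trend = fig.add_subplot(grid[0, 0])      # top left: BA capability measures over time
trend.plot([1, 2, 3, 4], [2.5, 3.0, 3.2, 3.6])
trend.set_title("VRIO index over time")

bars = fig.add_subplot(grid[1, 0])       # below: selected capabilities across periods
bars.bar(["Q1", "Q2", "Q3"], [3.0, 3.4, 3.6])
bars.set_title("Selected capability by quarter")

recent = fig.add_subplot(grid[0:2, 1])   # right side: most recent and prior values as a table
recent.axis("off")
recent.table(cellText=[["Q2", "3.4"], ["Q3", "3.6"]],
             colLabels=["Period", "VRIO index"], loc="center")

controls = fig.add_subplot(grid[2, :])   # bottom: placeholder for selection controls
controls.axis("off")
controls.text(0.5, 0.5, "Capability and granularity controls", ha="center")

fig.tight_layout()
plt.show()
```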

5. Evaluation of prototype tool


The dashboard was evaluated by a group of participants who were provided with a
brief demonstration of the dashboard, and then asked to follow a script that required
them to use the dashboard for several decision-making tasks involving BA capabili-
ties. They were then asked to complete a small survey about use of the dashboard.
The survey questions related to the ease of use, usefulness and relevance to practice of
the dashboard. Responses from 20 participants were analysed by comparing their
scores for each question to the mid-point of the scale, assuming normally distributed values.

A summary of the results obtained from the study is shown below in Table 5. It
can be seen that the responses to each of the questions were significantly higher than
the mid-point of the scale. Overall, the evaluation showed that the dashboard was
considered easy to use, useful and relevant to practice.

6. Discussion
The BA capability dashboard is aimed at supporting the management of BA capabilities
within organisations. It provides managers with a flexible means of understanding the
levels of value, rarity, inimitability and organisational support of BA capabilities. In
particular, managers can analyse BA capabilities in relation to time (when), stakeholder
(who) and department (where).
Measures for the value, rarity, inimitability and organisational support of each BA
capability must first be collected from relevant stakeholders. The measures could use a
linear scale from 0 to 5, reflecting the strength of each capability. Data should be col-
lected periodically, probably every 6 months, to enable meaningful time series analysis.
Data should also be collected from a variety of stakeholders throughout the organisa-
tion, to enable meaningful analysis across different stakeholder types and organisational
departments.
Analysis of BA capability levels using the time dimension provides insight into
which BA capabilities are improving and which are stagnating or becoming worse. This
insight should help managers to understand the strengths and weaknesses in their orga-
nisation’s BA capabilities, and to decide which BA capabilities to invest in. Further-
more, it will provide input to the search and select routines in an organisation’s
dynamic capabilities (Teece, 2009; Teece et al., 1997), and help organisations to gain
insight into which innovations and changes are required.
Analysis of BA capability levels using the stakeholder dimension provides insight
into the different perceptions of BA capabilities by stakeholder roles and types. For
example, discovering that IT-type stakeholders perceive BA capabilities in the BA tech-
nology area to be poor might carry more weight in decision-making than the percep-
tions of business management-type stakeholders. The insight that various types of
stakeholders have differing perceptions might lead to further training rather than an
investment in improving BA capabilities.
Analysis of BA capability levels using the department dimension provides insight
into which parts of an organisation have differing perceptions of BA capability levels.
Investments in BA capabilities can then be better targeted.

6.1. Guidelines for design science research


Hevner et al. (2004) provided seven guidelines for the evaluation of design science
research. The first guideline requires that design science research must produce a viable
artefact. We have achieved this with the development of the prototype dashboard to
support the management of BA capabilities. The second guideline requires that design
science research should address relevant problems. We have justified the importance of
the dashboard based on the importance of BA systems in providing competitive advan-
tage for organisations (Davenport et al., 2010), the consistently high ranking of BA
among the concerns of chief information officers (Hagerty et al., 2012) and the lack of
tools available to support the management of BA capabilities. The third guideline con-
cerns the rigour of evaluation of the artefact. We have designed the evaluation study to
examine the utility and potential utility of the artefact, and applied rigorous quantitative
data-analysis techniques. The fourth guideline concerns the research contributions from
the study. We argue that there are important contributions to knowledge in the design
of the artefact and the approach to its evaluation. The artefact combines innovative
insights into the definition and measurement of BA capabilities within organisations.
The evaluation approach provides a useful means of initial evaluation of dashboard
artefacts in a ‘laboratory setting’. The evaluation instrument is based on previously
used and validated measurement scales. The fifth guideline concerns the rigour of the
research design. We have used rigorous methods in both the design and evaluation of
the artefact. The dashboard is based on the RBV, and uses a set of BA capabilities that
have been developed using a combination of existing work and empirical study (Cosic
et al., 2012). The one-shot case study approach used in the empirical evaluation has
been previously used by Moody and Shanks (2003) and incorporates a measurement
instrument based on previously validated scales (Davis, 1989; Moody & Shanks,
2003). The sixth guideline concerns design as a search process. We argue that the
application of dashboard design principles, in addition to the underlying theoretical
base of the BA capabilities and measures, utilises available means to ensure that the artefact is effective. The final guideline concerns the communication of the design research.
We are reporting our research outcomes to both technology-oriented and management-oriented audiences, and in both practitioner and academic outlets.

6.2. Implications for researchers


There are a number of implications for researchers from the design, development and
evaluation of the BA capability dashboard. First, it is an example of a BA-based tool
to support the management of BA capabilities. Researchers should explore the design
of the dimensional model and consider alternative dimensions and hierarchies within
the dimensions. For example, while we have used the BA capability model of Cosic
et al. (2012) in this study, researchers should consider using other BA capability and
maturity models (see e.g. Lahrmann et al., 2011; Rajteric, 2010).
Second, the design and layout of the dashboard needs further work. There are many
design options and visualisation considerations when displaying information to support
decision-makers (Tufte & Graves-Morris, 1983). Different display types, better use of
icons and colour, and improved recognition and highlighting of exceptions need to be
explored (Ware, 2000). Empirical evaluation of the artefact that focuses on the human-
machine interface should be conducted to refine and enhance the presentation of
information on the dashboard.
Third, the set of BA capabilities used in the prototype from Cosic et al. (2012) pro-
vides a broad and complete range of BA capabilities and BA capability areas. This
might be too extensive for the needs of some organisations, and so researchers should
explore simpler sets of BA capabilities and how they may be grouped when organisa-
tional requirements are more focused. Understanding which capabilities are important
to organisations in various industry sectors or under particular circumstances requires
further empirical work involving action research and in-depth case studies.
Fourth, the use of perceptual VRIO measures of each BA capability level is a use-
ful starting point for the prototype dashboard system. However, alternative measures
should be considered, including more refined scales for each measure. Another aspect
of measurement is maturity. There are many BA maturity models that might provide
insight into alternative measures (Cosic et al., 2012; Lahrmann et al., 2011).
Researchers should also explore the possibility of establishing benchmarks for capabilities in various types of organisation that might be used when planning for
investment in BA systems.

6.3. Implications for practitioners


The dashboard should provide practitioners with a useful decision-support tool when
managing BA capabilities. Practitioners will benefit from a knowledge of which BA
capabilities are most developed within different parts of their organisations, and the dif-
ferences in perceptions between stakeholder roles and types. Practitioners may benefit
from aggregate BA capability measures for organisations within different industry
sectors. These may provide useful benchmarks for use in managing BA capabilities.

7. Conclusion
The design, development and evaluation of the BA capability dashboard has demon-
strated its potential to support management of BA capabilities. There are a number of
limitations of the study. First, the prototype tool needs further refinement. It is currently
a useful starting point to demonstrate the utility of a dashboard for managing BA capa-
bilities, but needs further development to be of use in practice. Second, the evaluation
was limited, although appropriate at this initial stage of development of the dashboard.
More extensive evaluation involving further refined one-shot case studies and,
eventually, action research and in-depth case studies is required.
While BA assets may be readily acquired and are commodities that do not necessarily
provide competitive advantage, organisations need to carefully manage their BA capabili-
ties to successfully compete in turbulent environments (Shanks & Bekmamedova, 2012).
The successful management of BA capabilities, supported by the prototype dashboard for
the VRIO assessment of BA capabilities, has the potential to significantly enhance the
decision support infrastructure within organisations.

Acknowledgements
An earlier version of this paper was presented at the DSS2014 International Federation for Infor-
mation Processing (IFIP) Working Group 8.3 International Conference on DSS. The authors
acknowledge with thanks the feedback provided by the anonymous reviewers of that paper. This
research was supported under the Australian Research Council’s Discovery Projects funding
scheme.

References
Allmendinger, G., & Lombreglia, R. (2005). Four strategies for the age of smart services.
Harvard Business Review, 83, 131–145.
Asadi Someh, I., & Shanks, G. (2013). The role of synergy in achieving value from business
analytics systems. Proceedings of the thirty fourth international conference on information
systems, AISeL.
Barney, J. (1991). Firm resources and sustained competitive advantage. Journal of Management,
17, 99–120.
Barney, J. (1995). Looking inside for competitive advantage. The Academy of Management
Executive, 9, 49–61.
Barney, J. (1997). Gaining and sustaining competitive advantage. Reading: Addison-Wesley.
Becker, J., Knackstedt, R., & Pöppelbuß, J. (2009). Developing maturity models for IT
management – a procedure model and its application. Business & Information Systems
Engineering, 3, 213–222.
Carte, T., Schwarzkopf, A., Shaft, T., & Zmud, R. (2005). Advanced business intelligence at
cardinal health. MIS Quarterly Executive, 4, 413–424.
Cosic, R., Shanks, G., & Maynard, S. (2012). Towards a business analytics capability maturity
model. Proceedings of the 23rd Australasian conference on information systems, 1–11.
Geelong, Australia.
Davenport, T. H., & Harris, J. G. (2007). Competing on analytics. Boston, MA: Harvard
Business School Press.
Davenport, T. H., Harris, J. G., & Morison, R. (2010). Analytics at work. Boston, MA: Harvard
Business School Press.
Davis, F. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information
technology. MIS Quarterly, 13, 319–340.
Eisenhardt, K. M., & Martin, J. A. (2000). Dynamic capabilities: What are they? Strategic
Management Journal, 21, 1105–1121.
Few, S. (2006). Information dashboard design. Sebastopol, CA: O'Reilly.
Hagerty, J., Sallam, R. L., & Richardson, J. (2012, Feb 6). Magic quadrant for business
intelligence platforms. Gartner Research Note G00225500.
Hevner, A., March, S., Park, J., & Ram, S. (2004). Design science in information systems
research. MIS Quarterly, 28, 75–105.
Kimball, R., & Ross, M. (2013). The data warehouse toolkit: The complete guide to dimensional modelling (3rd ed.). New York: Wiley.
Kohavi, R., Rothleder, N., & Simoudis, E. (2002). Emerging trends in business analytics.
Communications of the ACM, 45, 45–48.
Kraaijenbrink, J., Spender, J., & Groen, A. (2010). The resource-based view: A review and
assessment of its critiques. Journal of Management, 36, 349–372.
Kuechler, W., & Vaishnavi, V. (2012). A framework for theory development in design science
research: Multiple perspectives. Journal of the Association for Information Systems, 13,
395–423.
Lahrmann, G., Marx, F., Winter, R., & Wortmann, F. (2011). Business intelligence maturity:
Development and evaluation of a theoretical model. Proceedings Hawaii International
Conference on System Sciences, 1–10.
Moody, D., & Kortink, M. (2003). From ER models to dimensional models: Bridging the gap between OLTP and OLAP design. Business Intelligence Journal, Summer, 7–24.
Moody, D., & Shanks, G. (2003). Improving the quality of data models: Empirical validation of
a quality management framework. Information Systems, 28, 619–650.
Neuman, W. (2010). Social research methods: Qualitative and quantitative approaches (7th ed.). Boston, MA: Pearson.
Pavlou, P. A., & El Sawy, O. (2006). From IT competence to competitive advantage in turbulent
environments: The case of new product development. Information Systems Research, 17,
198–227.
Pfeffer, J., & Sutton, R. (2006). Evidence-based management. Harvard Business Review, 84,
62–68.
Piccoli, G., & Watson, R. (2008). Profit from customer data by identifying strategic opportunities
and adopting the ‘born digital’ approach. MIS Quarterly Executive, 7, 113–122.
Rajteric, I. H. (2010). Overview of business intelligence maturity models. International Journal
of Human Science, 15, 47–67.
Shanks, G., & Bekmamedova, N. (2012). Achieving benefits with business analytics systems: An
evolutionary process perspective. Journal of Decision Systems, 21, 231–244.
Teece, D. J. (2009). Dynamic capabilities and strategic management. Oxford: Oxford University Press.
Teece, D. J., Pisano, G., & Shuen, A. (1997). Dynamic capabilities and strategic management.
Strategic Management Journal, 18, 509–533.
Tufte, E., & Graves-Morris, P. (1983). The visual display of quantitative information. Cheshire,
CT: Graphics Press.
Wade, M., & Hulland, J. (2004). The resource-based view and information systems research:
Review, extension, and suggestions for future research. MIS Quarterly, 28, 107–142.
Ware, C. (2000). Information visualization. San Francisco, CA: Morgan Kaufmann.
