AI in Services
https://www.emerald.com/insight/1757-5818.htm
Alfredo Pérez-Rueda
Management, Facultad de Ciencias Sociales y del Trabajo, Universidad de Zaragoza, Zaragoza, Spain
Daniel Belanche
Marketing and Market Research, Facultad de Economía y Empresa, Universidad de Zaragoza, Zaragoza, Spain, and
Luis V. Casaló
Marketing and Market Research, Facultad de Empresa y Gestión Pública de Huesca, Universidad de Zaragoza, Huesca, Spain
Abstract
Purpose – The automation of services is rapidly growing, led by sectors such as banking and
financial investment. The growing number of investments managed by artificial intelligence (AI)
suggests that this technology-based service will become increasingly popular. This study examines
how customers’ technology readiness and service awareness affect their intention to use analytical AI
investment services.
Design/methodology/approach – Hypotheses were tested with a data set of 404 North American-based
potential customers of robo-advisors. In addition to technology readiness dimensions, the potential customers’
characteristics were included in the framework as moderating factors (age, gender and previous experience
with financial investment services). A post-hoc analysis examined the roles of service awareness and the
financial advisor’s name (i.e., robo-advisor vs. AI-advisor).
Findings – The results indicated that customers’ technological optimism increases, and insecurity decreases,
their intention to use robo-advisors. Surprisingly, feelings of technological discomfort positively influenced
robo-advisor adoption. This interesting finding challenges previous insights into technology adoption and
value co-creation as analytical AI puts customers into a very passive role and reduces barriers to technology
adoption. The research also analyzes how consumers become aware of robo-advisors, and how this influences
their acceptance.
Originality/value – This is the first study to analyze the role of customers’ technology readiness in the
adoption of analytical AI. The authors link the findings to previous technology adoption and automated
services’ literature and provide specific managerial implications and avenues for further research.
Keywords Artificial intelligence, Technology readiness, Optimism, Innovativeness, Discomfort, Insecurity,
Awareness, Intention to use, Robot, Robo-advisor, Financial services, FinTech
Paper type Research paper
The authors wish to thank Prof. Russell Belk (York University) for his valuable suggestions about this research project.
Funding: The authors are grateful for the financial support received from Gobierno de Aragón (Research Group “METODO” S20_20R) and Ministerio de Ciencia, Innovación y Universidades (PID2019-105468RB-I00).
Journal of Service Management, Vol. 33 No. 2, 2022, pp. 293-320. © Emerald Publishing Limited, 1757-5818. DOI 10.1108/JOSM-10-2020-0378
1. Introduction
Technological advances in robotics and artificial intelligence (AI) are radically changing
service provision (Belanche et al., 2020a; Lu et al., 2020; Robinson et al., 2020). Huang and Rust
(2018) predicted that automated technology will gradually replace workers in tasks requiring
mechanical, analytical, intuitive and even empathetic intelligence. Wirtz et al. (2018) proposed
that AI software that works autonomously and learns over time can be distinguished from
service robots depending on manifestation (virtual or physical), level of anthropomorphism
(from none to high) and task orientation. For companies, AI data and knowledge are likely to
become important sources of competitive advantage, based on economies of scale and scope,
leading to “winners-take-all” markets (Wirtz et al., 2018).
The banking and finance industry has become a prototypical example of the AI
technological revolution worldwide, as a sector leading internal and customer-oriented
automation processes (Caron, 2019). In this regard, financial technology (FinTech) has
revolutionized the finance industry by increasing user value and firms’ revenues in the last
decade (Huang and Rust, 2021; Kumar et al., 2019; Goldstein et al., 2019).
Within the scope of AI-based financial services, this study focuses on robo-advisor agents,
that is, agents which automate or assist in managing investments by replacing human
advisory services and/or the customer’s own management, a recent innovation in the finance
industry (Goldstein et al., 2019). The assets under robo-advisor management are expected to
grow annually by 27.0%, reaching US$ 2,552bn in 2023, while the number of robo-advisor
users is expected to grow by 75.4% year-on-year until 2023 (Statista, 2019). As distinct from
most mechanical-based automation (e.g. robots), these innovative services are based on
analytical AI, which has been defined as the ability to process, and learn from, information for
problem-solving purposes (Huang and Rust, 2018). Nonetheless, the penetration rate of robo-
advisors among customers is still relatively low (Jung et al., 2018b; Belanche et al., 2020a). To
address this challenge, there is a need to better understand how to integrate AI into service
offerings. To this end, several experts in the AI adoption domain (e.g. Mende et al., 2019; van
Doorn et al., 2017; Belanche et al., 2020a) have suggested that the technology readiness index
(TRI) is a suitable framework, hitherto unexplored, in this novel context.
To address this research gap, we apply the TRI to examine the role of technology readiness in
explaining intention to use robo-advisors, prototypical examples of analytical AI services already
accessible to a wide spectrum of customers. Unlike technology acceptance models based on
customer motivations, the TRI (Parasuraman, 2000) captures consumers’ positive (i.e. optimism
and innovativeness) and negative (i.e. discomfort and insecurity) mental readiness regarding
technologies. The TRI dimensions assess crucial consumer perceptions in the financial sector,
such as their enthusiasm (optimism), perceived control (or discomfort) and service reliability (or
insecurity); these often determine the initial investment decisions that can lead to successful long-
term relational exchanges (Clark-Murphy and Soutar, 2004). The framework also categorizes
users based on their propensity to embrace technologies as it links personality and technology
use (Walczuch et al., 2007) and facilitates the design of segmentation variables (Victorino et al.,
2009). Therefore, as robo-advisors represent a disruptive technological advance, the TRI seems
particularly suited to understanding customer willingness to use AI-based services.
As a complement to this framework, we propose that customers with higher awareness of
robo-advisors may be more willing to use these innovative services. Service awareness, an
important variable in other service domains (e.g. Andersen et al., 2000; Crist et al., 2007), has
been neglected in previous research based on the assumption that customers are fully aware of
the available technologies (Venkatesh et al., 2003); however, these assumptions may be
misplaced as these innovations are just starting to penetrate the market. In addition, to increase
the practical implications of our research, we propose that renaming “robo-advisors” as
“AI-advisors”, a more accurate and sophisticated description, unrelated to robots, may increase
their acceptance by potential adopters. Finally, previous studies have found that individual
characteristics are key in explaining technology usage and proposed them as moderating
variables (Sun and Zhang, 2006; Blut and Wang, 2019). Therefore, we include age, gender and
previous investment experience in the model as moderating variables (Venkatesh et al., 2003).
The present study’s contribution is threefold. First, we empirically investigate the
adoption of a prototypical analytical AI service, financial robo-advisors. Most of the previous
research into automation has been conceptual in nature, and the growing number of empirical
studies in the field focus on mechanical-AI (e.g. service robots). However, analytical AI
represents a more advanced stage in the development of intelligent automation skills (Huang
and Rust, 2018), with distinctive features that make it ideal for service personalization and
optimal productivity (e.g. it learns from, and adapts to, data, Belanche et al., 2020b; Huang and
Rust, 2021). Due to the disruptive nature of analytical AI and its multiple social and economic
implications, there is an urgent need for more research and analysis in this fast-growing area.
Second, by drawing on the TRI framework, we assess to what extent regular customers are
ready to embrace an autonomous technology that performs analytical tasks traditionally carried
out by humans (i.e. investing customers’ money). This is the first study to apply the TRI
framework to identify if analytical AI adoption differs from the adoption of previous technological
innovations, research which has been repeatedly called for by scholars in the field (e.g. Mende
et al., 2019; van Doorn et al., 2017; Belanche et al., 2020a). In contrast to the previous new
technology adoption literature, our study revealed that technological discomfort does not hinder,
but instead promotes, the adoption of robo-advisors. That is, customers who feel overwhelmed by
technology are more likely to use analytical AI as it is a simple system that requires minimal user
participation. This important finding suggests, in the case of analytical AI, there is a need to
reconsider previous theoretical technology adoption and value co-creation axioms as users may
not, in the future, play such active roles in value creation and decision-making.
Third, our research identified consumer awareness as a critical, but frequently ignored,
factor in adoption; thus, to identify how consumers become aware of robo-advisors, a post-
hoc analysis was conducted. In summary, this research advances the understanding of
customers’ decisions about the use of analytical AI services and can help managers design
better strategies for the successful introduction of these innovations.
The remainder of this work is structured as follows. First, we review the previous robo-
advisor literature. Second, we develop the research model’s hypotheses to explain customers’
intention to use analytical AI services. Third, we describe the data collection procedure and
the measurement validation. Next, we present the results of the empirical and the post-hoc
analyses. Finally, we discuss the main conclusions, the theoretical and practical implications,
the study’s limitations and further research lines.
2. Literature review
AI has been defined as “machines exhibiting facets of human intelligence” (Huang and Rust,
2018, p. 155). Previous research in the service domain has posited that customers approach
AI-related services differently to how they approach traditional services (Grewal et al., 2017).
Unlike other technologies (e.g. self-service technologies), AI-based systems operate
autonomously or with few instructions, often replacing humans (Belanche et al., 2020b; De
Keyser et al., 2019). Thus, companies must understand how to introduce AI technologies to
reduce barriers to their use by customers (Mazurek and Małagocka, 2019) to improve
management practices and product offerings (Kumar et al., 2019).
3. Hypotheses formulation
3.1 Technology readiness
Personality differences are regarded in management and marketing theories as important
human behavior determinants. Prior new technology acceptance literature has argued that
individuals’ reactions to technology are diverse (Mick and Fournier, 1998; Ratchford, 2020).
This can be explained by the positive and negative feelings technology triggers in customers
(Parasuraman and Colby, 2015). In this regard, Parasuraman (2000) developed the TRI,
defining technology readiness as “people’s propensity to embrace and use new technologies
for accomplishing goals in home life and at work” (p. 308). The TRI captures consumers’
positive and negative mental readiness regarding technology; it has previously been
employed to explain the adoption of innovations such as self-service technology in airports
(Liljander et al., 2006), C2C platforms (Lu et al., 2012) and mobile payment systems (Martens
et al., 2017). Technology readiness (later improved and renamed TRI 2.0; Parasuraman and
Colby, 2015) is measured through four dimensions: two motivators, optimism and
innovativeness, and two inhibitors, discomfort and insecurity. Prior research has
underlined the independence of the four dimensions as each measures the extent of a
person’s openness to technology differently (Lu et al., 2012).
Technological optimism represents “a positive view of technology and a belief that it
offers people increased control, flexibility, and efficiency in their lives” (Parasuraman and
Colby, 2015, p. 60). This definition can be extended to AI as people may perceive it as a “hell”
or a “heaven” (Kaplan and Haenlein, 2020). Optimists accept situations and are more willing
than pessimistic technology users to use new technologies (Lu et al., 2012), perceiving them as
functional and trustworthy and overlooking possible negative outcomes (Walczuch
et al., 2007). Thus, optimistic customers are more positively predisposed toward new
technologies (Godoe and Johansen, 2012). In the financial sector, more enthusiastic consumers
tend to look for new investment opportunities (Clark-Murphy and Soutar, 2004), for example,
robo-advisors. Thus, we propose that:
H1. Customers’ technological optimism has a positive effect on their intention to use
financial robo-advisors.
Technological innovativeness has been defined as “a tendency to be a technology pioneer and
thought leader” (Parasuraman and Colby, 2015, p. 60). Innovators are willing to try new
technologies (Martens et al., 2017) and related services (Rodriguez-Ricardo et al., 2018). Highly
innovative people tend to be open-minded and exhibit greater willingness to use technologies,
including innovative financial services, for example, mobile payment (Oliveira et al., 2016).
Furthermore, innovativeness is an antecedent of adoption intentions; innovative customers
generally have a positive impression of technology functionality even when its potential
value is uncertain (Prodanova et al., 2021). Thus, we propose:
H2. Customers’ technological innovativeness has a positive effect on their intention to
use financial robo-advisors.
Technology discomfort has been defined as “a perceived lack of control over technology and a
feeling of being overwhelmed by it” (Parasuraman and Colby, 2015, p. 60). People who
experience discomfort with technologies perceive them as complicated and unable to satisfy
their needs (Lu et al., 2012). Customers experiencing high levels of discomfort in an unknown
technology environment can feel averse toward using new technology-based products and
services (Tsang et al., 2004). The feeling of lacking control or the capability to deal with
technologies can result in rejection of innovative systems. Customers who feel discomfort in
surrendering control to an automated system may not want to use robo-advisor services.
Thus, the following hypothesis is proposed:
H3. Customers’ technological discomfort has a negative effect on their intention to use
financial robo-advisors.
Finally, technology insecurity has been defined as “distrust of technology, stemming from
skepticism about its ability to work properly and concerns about its potential harmful
consequences” (Parasuraman and Colby, 2015, p. 60). Users need at least a rudimentary
understanding of how AI systems function to have confidence in them (Kaplan and Haenlein,
2019). Customers with high levels of technology insecurity may avoid using them (Lu et al.,
2012). Prior studies have concluded that, in the finance industry, insecure customers tend to
refuse to adopt new technology-based services (Oliveira et al., 2016). Thus, we posit:
H4. Customers’ technological insecurity has a negative effect on their intention to use
financial robo-advisors.
3.2 Awareness
Service awareness has been defined as “being conscious of, having knowledge of, or being
informed about a given service” (Crist et al., 2007, p. 212). Awareness has not hitherto been
closely examined in the technology acceptance literature, but it may be particularly important
in the study of recently launched technological services such as robo-advisors. In advertising,
awareness refers to product/brand recognition (Hellofs and Jacobson, 1999) and indicates that
customers have paid attention to, and are conscious of, information provided about a new
product or service.
The scarce literature on service awareness focuses on specific domains. In the elderly care
sector, most studies understand awareness as the customers’ knowledge about the existence
of the service (Andersen et al., 2000). In this sense, individuals with more knowledge about
elderly care services are more aware of their options and use them to a greater extent than
those hearing about the services for the first time (Crist et al., 2007). Thus, individuals’
awareness of services is related to the information they have received or researched and to their
experiences. Previous research in an organizational context has shown that when employees
are aware of their company’s corporate social responsibility activities, this encourages them
to contribute to the company’s efforts (Raub and Blunschi, 2014). Applying this concept to
this research domain, we propose that customers who are aware of robo-advisors will be more
willing to use them. Accordingly, we propose:
H5. Customers’ awareness has a positive effect on their intention to use financial robo-
advisors.
4. Method
4.1 Data collection
The study data were collected through an online survey designed and hosted by
SurveyMonkey. An invitation and a link to the questionnaire were sent to US-based English-
speaking consumers aged between 20 and 85 years. This sampling process was conducted using a
reputable consumer panel comprising over 70,000 consumers unrelated to any specific
banking provider. The participants were paid US$ 1.20. Only those respondents who
completed the whole questionnaire in a reasonable timeframe were rewarded/considered for
analysis. After removing 14 records due to incomplete responses, the final sample was 404.
The sample was very similar in sociodemographic terms to North American consumers aged
between 20 and 85 (US Census Bureau Office, 2020). Table 1 shows the participants’ and US
population’s demographics.
The invitation link asked the panelists to participate in a study about financial services.
Replicating other experimental design manipulation procedures, for around half of the sample
the advisor was referred to as a “robo-advisor”; whereas for the other half, it was referred to as
an “AI-advisor”. Some 207 participants were randomly assigned to the AI-advisor scenario, and 197 to the robo-advisor scenario.

Figure 1. Research model. Technology readiness dimensions (optimism, H1 +; innovativeness, H2 +; discomfort, H3 –; insecurity, H4 –) and awareness (H5 +) are modeled as antecedents of intention to use analytical AI (robo-advisors); customer characteristics (age, H6; gender, H7; previous investment experience, H8) act as moderators, with ease of use, usefulness and FinTech name as control variables. Note(s): Solid lines represent direct effects; broken lines represent moderating effects

The name given to the service was the only difference between
the scenarios. Both subgroups were similar in terms of gender and age distribution. Specifically,
χ² tests confirmed that there were no significant differences in gender (χ² = 0.329, 1 d.f., p > 0.1) or
age (χ² = 10.105, 12 d.f., p > 0.1) distributions between the two groups (an illustrative check is sketched after this paragraph). The questionnaires were
adapted to the different financial service names. The study’s website gave the participants the
kind of information normally presented to consumers considering using robo-advisors. First,
they were provided with a basic general description of financial robo-advisors (or AI-advisor);
this explained that they were new autonomous financial advisors that evaluated the market
through analytical AI and adapted their advice to the customers’ profiles. As robo-advisors
have no anthropomorphic appearance, the description included four illustrative screenshots of
a real financial advisor interface, showing graphs and rates adapted to avoid brand familiarity
bias (i.e. colors, fonts and figures were altered, and company names omitted). The questionnaire
included TRI-related scales of optimism, innovativeness, discomfort and insecurity; it also
asked the respondents about their level of awareness of these financial services. They were also
asked to indicate the perceived ease of use and usefulness of, and their intention to use, robo-
advisors/AI-advisors. The final questions covered sociodemographics (age and gender) and
previous investment experience.
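The scenario-equivalence checks above can be reproduced with standard statistical tooling. The sketch below is only illustrative: it applies SciPy’s chi-square test of independence to hypothetical count tables (the paper does not report the per-cell counts), with the two scenarios as rows and the gender or age categories as columns.

```python
# Illustrative group-equivalence checks (hypothetical counts, not the study's
# raw data): chi-square tests comparing the AI-advisor and robo-advisor
# scenarios on gender and age-bracket distributions.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: scenario (AI-advisor, n = 207; robo-advisor, n = 197); columns: woman, man
gender_table = np.array([[104, 103],
                         [ 96, 101]])
chi2_g, p_g, df_g, _ = chi2_contingency(gender_table, correction=False)
print(f"Gender: chi2 = {chi2_g:.3f}, d.f. = {df_g}, p = {p_g:.3f}")

# Rows: scenario; columns: the 13 age brackets used in the questionnaire
rng = np.random.default_rng(0)
age_table = rng.integers(5, 25, size=(2, 13))  # placeholder counts only
chi2_a, p_a, df_a, _ = chi2_contingency(age_table, correction=False)
print(f"Age: chi2 = {chi2_a:.3f}, d.f. = {df_a}, p = {p_a:.3f}")
```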
4.2 Measurement instrument
All scales used in the study were adapted from the previous literature. Appendix shows the
scale items employed for the robo-advisors (as aforementioned, the scales were also adapted
to the “AI-advisor” name). Specifically, we incorporated Parasuraman and Colby’s (2015) TRI;
this uses 16 items (four per construct) to measure customers’ levels of optimism,
innovativeness, discomfort and insecurity. The consumers’
awareness of robo-advisor services was measured using a three-item scale adapted from
Raub and Blunschi (2014) and Collins (2007). Following previous research, awareness was
measured using the following question: “How did you know about robo-advisors?” (Kangis
and Passa, 1997). Perceptions of ease of use and usefulness were measured using scales
developed by Davis et al. (1989) and Bhattacherjee (2000), four items per variable. The
participants’ intention to use robo-advisors was measured using three items adapted from
Bhattacherjee (2000). All scales used self-reported measures based on seven-point Likert-type
response formats, from 1 (“completely disagree”) to 7 (“completely agree”). The demographic
questions covered age (1 = 20–24 years, 2 = 25–29, 3 = 30–34, 4 = 35–39, 5 = 40–44, 6 = 45–49,
7 = 50–54, 8 = 55–59, 9 = 60–64, 10 = 65–69, 11 = 70–74, 12 = 75–79, 13 = 80–84) and gender
(1 = woman, 0 = man), and the participants were asked about previous investment experience
(1 = yes, 0 = no).
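As an illustration of how the measures described above could be assembled for analysis, the following sketch averages each construct’s Likert items into composite scores; the item and column names are assumptions made for illustration, not the study’s actual variable names.

```python
# Hypothetical illustration of coding the questionnaire into analysis variables;
# item and column names are assumptions, not the study's actual data files.
import pandas as pd

LIKERT_ITEMS = {
    "optimism":       ["opt1", "opt2", "opt3", "opt4"],
    "innovativeness": ["inn1", "inn2", "inn3", "inn4"],
    "discomfort":     ["dis1", "dis2", "dis3", "dis4"],
    "insecurity":     ["ins1", "ins2", "ins3", "ins4"],
    "ease_of_use":    ["eou1", "eou2", "eou3", "eou4"],
    "usefulness":     ["use1", "use2", "use3", "use4"],
    "awareness":      ["awa1", "awa2", "awa3"],
    "intention":      ["int1", "int2", "int3"],
}

def add_composites(responses: pd.DataFrame) -> pd.DataFrame:
    """Average the 7-point Likert items of each construct into a composite score.

    The frame is also expected to carry the coded demographics described in the
    text: age bracket (1 = 20-24, ..., 13 = 80-84), gender (1 = woman, 0 = man)
    and previous investment experience (1 = yes, 0 = no).
    """
    out = responses.copy()
    for construct, items in LIKERT_ITEMS.items():
        out[construct] = out[items].mean(axis=1)
    return out
```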
5. Results
5.1 Structural model
To test the hypotheses and the structural model, the SmartPLS algorithm, followed by
bootstrapping with 5,000 subsamples, was used (Hair et al., 2011). The results are presented in
Table 4. As to the technology readiness-related hypotheses, the results indicated that
customers’ optimism significantly influenced their intention to use robo-advisors (β = 0.187,
p < 0.01), supporting Hypothesis 1. In turn, customers’ level of innovativeness did not have a
significant effect on use intentions (β = 0.015, p > 0.10); thus, Hypothesis 2 is not supported.
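SmartPLS implements the full PLS-SEM algorithm (measurement and structural models estimated iteratively); the sketch below does not reproduce that procedure, but it illustrates the bootstrapping idea behind the reported significance tests by re-estimating a simplified regression of intention on four composite scores over 5,000 resamples of simulated data.

```python
# Minimal illustration of bootstrap inference on structural coefficients
# (simplified OLS on composite scores; this is NOT the SmartPLS PLS-SEM
# algorithm, and the data below are simulated, not the study's sample).
import numpy as np

rng = np.random.default_rng(42)
n = 404
X = rng.normal(size=(n, 4))                 # optimism, innovativeness, discomfort, insecurity
beta_true = np.array([0.19, 0.02, 0.08, -0.09])
y = X @ beta_true + rng.normal(size=n)      # stand-in for intention to use

def paths(Xs, ys):
    # OLS estimates of the structural paths for one (re)sample
    return np.linalg.lstsq(Xs, ys, rcond=None)[0]

boot = np.empty((5000, 4))
for b in range(5000):
    idx = rng.integers(0, n, size=n)        # resample respondents with replacement
    boot[b] = paths(X[idx], y[idx])

point = paths(X, y)
t_values = point / boot.std(axis=0, ddof=1)  # bootstrap t-values, as PLS software reports
for name, coef, t in zip(["optimism", "innovativeness", "discomfort", "insecurity"],
                         point, t_values):
    print(f"{name:>15}: beta = {coef:+.3f}, t = {t:.2f}")
```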
Table 3. Discriminant validity (Fornell–Larcker criterion)
1 2 3 4 5 6 7 8 9 10 11 12
1. Optimism 0.887
2. Innovativeness 0.431 0.879
3. Discomfort 0.162 0.033 0.774
4. Insecurity 0.307 0.264 0.329 0.772
5. Awareness 0.193 0.334 0.170 0.011 0.923
6. Intention to use 0.418 0.308 0.087 0.217 0.456 0.952
7. Ease of use 0.345 0.315 0.092 0.203 0.300 0.501 0.940
8. Usefulness 0.333 0.161 0.048 0.151 0.259 0.644 0.557 0.952
9. Age 0.187 0.327 0.002 0.149 0.109 0.213 0.110 0.144 1.000
10. Gender 0.053 0.013 0.010 0.099 0.121 0.069 0.101 0.052 0.096 1.000
11. Previous investment experience 0.182 0.272 0.060 0.082 0.442 0.262 0.277 0.151 0.019 0.070 1.000
12. FinTech name 0.018 0.010 0.058 0.069 0.060 0.045 0.014 0.022 0.020 0.029 0.018 1.000
1. Optimism
2. Innovativeness 0.475
3. Discomfort 0.214 0.171
4. Insecurity 0.379 0.252 0.544
5. Awareness 0.213 0.358 0.180 0.078
6. Intention to use 0.449 0.326 0.065 0.172 0.489
7. Ease of use 0.368 0.344 0.127 0.142 0.321 0.523
8. Usefulness 0.354 0.168 0.042 0.116 0.275 0.672 0.576
9. Age 0.195 0.346 0.114 0.114 0.113 0.217 0.112 0.147
10. Gender 0.055 0.045 0.019 0.144 0.127 0.070 0.103 0.053 0.096
11. Previous investment experience 0.192 0.278 0.060 0.075 0.463 0.268 0.283 0.154 0.019 0.070
12. FinTech name 0.027 0.011 0.076 0.068 0.064 0.046 0.014 0.023 0.020 0.029 0.018
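For reference, in a Fornell–Larcker table the diagonal conventionally shows the square root of each construct’s average variance extracted (AVE), and discriminant validity requires this value to exceed the construct’s correlations with every other construct. A standard statement of the criterion (with standardized indicators), not quoted from the paper, is:

```latex
% Fornell-Larcker criterion (standard formulation with standardized indicators,
% not quoted from the paper): AVE of construct i over its K_i loadings
% \lambda_{ik}, and the discriminant-validity condition against every other
% construct j with inter-construct correlation r_{ij}.
\mathrm{AVE}_i = \frac{1}{K_i}\sum_{k=1}^{K_i} \lambda_{ik}^{2},
\qquad
\sqrt{\mathrm{AVE}_i} > \lvert r_{ij} \rvert \quad \text{for all } j \neq i .
```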
Table 4. Results, estimated coefficients and significance

Hypotheses β p-value Supported

TRI variables
Optimism (H1) 0.187*** 0.000 Supported
Innovativeness (H2) 0.015 0.775 Not Supported
Discomfort (H3) 0.079* 0.094 Not Supported¹
Insecurity (H4) 0.094** 0.036 Supported
Awareness (H5) 0.221*** 0.000 Supported

Moderating effects
Age × Optimism (H6a) 0.042 0.334 Not Supported
Age × Innovativeness (H6b) 0.025 0.619 Not Supported
Age × Discomfort (H6c) 0.027 0.586 Not Supported
Age × Insecurity (H6d) 0.043 0.367 Not Supported
Age × Awareness (H6e) 0.083* 0.074 Not Supported¹
Gender × Optimism (H7a) 0.058 0.135 Not Supported
Gender × Innovativeness (H7b) 0.090** 0.043 Supported
Gender × Discomfort (H7c) 0.008 0.877 Not Supported
Gender × Insecurity (H7d) 0.033 0.468 Not Supported
Gender × Awareness (H7e) 0.090** 0.024 Supported
Experience × Optimism (H8a) 0.008 0.845 Not Supported
Experience × Innovativeness (H8b) 0.048 0.288 Not Supported
Experience × Discomfort (H8c) 0.054 0.258 Not Supported
Experience × Insecurity (H8d) 0.100** 0.040 Supported
Experience × Awareness (H8e) 0.082 0.151 Not Supported

Control variables
TAM variables
Usefulness 0.413*** 0.000
Ease of use 0.125** 0.011
Sociodemographics
Age 0.076** 0.050
Gender 0.001 0.976
Previous investment experience 0.025 0.556
FinTech name (direct effect)
FinTech name (AI vs robo-advisor) 0.033 0.334
FinTech name (moderating effects)
FinTech name × Optimism 0.026 0.492
FinTech name × Innovativeness 0.025 0.597
FinTech name × Discomfort 0.027 0.571
FinTech name × Insecurity 0.064 0.174
FinTech name × Awareness 0.066 0.127
FinTech name × Age 0.001 0.973
FinTech name × Gender 0.000 0.993
FinTech name × Experience 0.043 0.262

Note(s): ¹Significant effect, contrary to expected; ***p < 0.01; **p < 0.05; *p < 0.10
5.2 Post-hoc analysis: the roles of awareness and financial advisor name
To better understand consumer awareness, we carried out a post-hoc analysis. As is usual in
research into service awareness in other fields (e.g. Kangis and Passa, 1997), the questionnaire
asked the respondents if they had previously been aware of robo-advisor services. Some
54.95% of the sample reported that they had not been very aware of robo-advisors, and
45.05% that they had learned about robo-advisors through the following means: 32.11% from
press and news reports, 26.42% from financial services ads, 12.20% because their bank had
directly offered them the service, 11.38% through their own means (i.e. through online
searches and their own experiences), 8.94% from other customers and 8.94% through other
means (e.g. specialized investment magazines and blogs). We analyzed to what extent robo-
advisor adoption was affected by how consumers became aware of them. As the categories
were not exclusive, independent t-tests were conducted; taking one example, we compared
intention to use robo-advisors by the customer group who learned about them through
advertising with those who did not learn about them through advertising. The results
showed that awareness of robo-advisors by almost any means increased intention to use,
supporting Hypothesis 5. The analyses showed that customers with higher intention to use
robo-advisors had learned about the services through their own means (M = 4.64, t = 4.35,
p < 0.01), followed by customers who received information directly from their banks
(M = 4.40, t = 3.72, p < 0.01) and from other customers (M = 4.30, t = 2.74, p < 0.01);
participants who had been exposed to robo-advisor advertisements (M = 3.81, t = 2.51,
p < 0.05) and related news items (M = 3.78, t = 2.70, p < 0.01) also had increased intentions to
use the services, but to a lesser extent. Being aware of robo-advisors by other means did not
significantly influence use intention (M = 3.55, t = 0.66, p > 0.10). Figure 2 illustrates use
intentions based on how customers became aware of the services.
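The channel comparisons above rely on independent-samples t-tests. The sketch below illustrates one such comparison with SciPy on simulated 7-point intention scores (group sizes and means are hypothetical; Welch’s correction is shown, although the paper does not specify the variance assumption).

```python
# Illustrative awareness-channel comparison on simulated 7-point intention
# scores (group sizes and means are hypothetical, not the study's data):
# independent-samples t-test of respondents who learned about robo-advisors
# through their own means vs. everyone else.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
own_means_group = rng.normal(loc=4.6, scale=1.4, size=55).clip(1, 7)
other_respondents = rng.normal(loc=3.3, scale=1.5, size=349).clip(1, 7)

t_stat, p_val = ttest_ind(own_means_group, other_respondents, equal_var=False)  # Welch's t
print(f"Own means: M = {own_means_group.mean():.2f}, t = {t_stat:.2f}, p = {p_val:.4f}")
```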
In addition, we used the participants’ responses to test whether the financial advisor’s
name had a differential effect on behavioral intentions among those aware and unaware of the
service. We performed a 2 (aware vs unaware) × 2 (AI-advisor vs robo-advisor) analysis of
variance (ANOVA) to assess the interaction effect between both factors on intention to use.
The results of the ANOVA confirmed that consumer awareness significantly increased
intention to use (F = 25.82; p < 0.01), whereas the financial advisor’s name did not (F = 1.17;
p > 0.10). The interaction effect between both variables on intention to use was also
significant (F = 25.82; p < 0.01), as Figure 3 shows. Specifically, the FinTech name did not
significantly affect customers unaware of the service (M_AI-advisor = 2.78, M_Robo-advisor = 3.02,
t = 1.07, p > 0.10). However, the financial service’s name was important for customers aware
of the service; they presented higher use intentions for AI-advisors than for robo-advisors
(M_AI-advisor = 4.08, M_Robo-advisor = 3.47, t = 2.37, p < 0.05).
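A 2 × 2 between-subjects ANOVA of this kind can be specified, for instance, with statsmodels; the sketch below runs the awareness × name model on simulated data and is not a re-analysis of the study’s sample.

```python
# Illustrative 2 (aware vs. unaware) x 2 (AI-advisor vs. robo-advisor) ANOVA on
# intention to use, run on simulated data rather than the study's sample.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 404
data = pd.DataFrame({
    "aware": rng.choice(["aware", "unaware"], size=n),
    "name": rng.choice(["AI-advisor", "robo-advisor"], size=n),
})
# Simulated scores with an awareness main effect and an awareness x name interaction
base = 3.0 + (data["aware"] == "aware") * 0.8
bump = ((data["aware"] == "aware") & (data["name"] == "AI-advisor")) * 0.6
data["intention"] = (base + bump + rng.normal(scale=1.3, size=n)).clip(1, 7)

model = smf.ols("intention ~ C(aware) * C(name)", data=data).fit()
print(anova_lm(model, typ=2))  # F-tests for the two main effects and the interaction
```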
6. Discussion
6.1 Theoretical implications
Robo-advisor services represent a prototypical example of analytical AI, which is already
shaking the financial services industry. However, some customers may be reluctant to start
using such a disruptive technology. To understand more about this underexplored field, we
investigated whether customers are ready to embrace robo-advisors. To that end, we
analyzed the extent to which the direct effect of technology readiness factors and awareness,
and the moderation effects of age, gender, previous experience and the FinTech advisor’s
name influenced customers’ decisions to use analytical AI to manage their investments. This
is the first study to test the effects of technology readiness on consumers’ preferences for analytical AI, which had been identified as a research gap (Belanche et al., 2020a; Mende et al., 2019; van Doorn et al., 2017).

Figure 2. Mean values of intention to use robo-advisors based on how potential customers became aware of them (awareness channels: own means, bank, other customers, advertising, news, other means, non-aware)

Figure 3. Service awareness and FinTech name interaction effect on intention to use (unaware vs aware customers; AI-advisor vs robo-advisor)
As a novel, important finding, technological discomfort, that is, the lack of control and
overwhelming complexity of technologies experienced by some customers, had a positive and
significant effect on intention to use robo-advisors. This result is particularly interesting
because technological discomfort was originally proposed as a TRI factor that inhibited
adoption (Parasuraman, 2000; Parasuraman and Colby, 2015). Indeed, discomfort has been
shown to reduce the adoption of investment software among employees (Walczuch et al.,
2007). AI systems are based on automation, thus the user does not need to deal with the
demanding, and often problematic, tasks of understanding and operating the technology,
tasks that must be undertaken when using other systems; analytical AI carries out the tasks
for the user (Belanche et al., 2020b). Therefore, paradoxically, AI systems may be particularly
embraced by customers who suffer higher levels of technological discomfort because
automation removes the need for learning and dealing with awkward technological
processes. This finding provides a thought-provoking insight into the AI adoption literature
as customers may no longer need to play active roles in the service process. A theoretical
contribution of the present study is the finding that technological discomfort, previously
regarded as a key barrier to technology adoption, might act as a driver of acceptance. In turn,
several important drivers of adoption, such as the user’s self-efficacy, or perceived control,
may act in the future as barriers to AI adoption (i.e. more skillful users might avoid using AI).
Furthermore, this finding questions some service-dominant logic value co-creation axioms
such as “The customer is always a co-creator of value”, and that “Value is always uniquely
and phenomenologically determined by the beneficiary” (Vargo and Lusch, 2016, p. 8). In the
context of analytical AI robo-advisors, the customer no longer plays an active role, that is, the
technology creates the value. When frontline employees are replaced by automated agents
interacting socially with customers (e.g. assistive robots), the participation of the different
actors often leads to value co-creation (Čaić et al., 2018). However, when automated agents
replace the customers’ role because of their greater skills and performance, value is created
because of the customer’s reduced participation. In other words, lower customer participation
and co-creation results in higher service value.
Turning to the other TRI variables (Parasuraman and Colby, 2015), as hypothesized,
customers’ optimism had a positive impact, and insecurity had a negative impact, on robo-
advisor adoption. In general, more technologically optimistic customers will be more likely to
use robo-advisors as they believe they will help them better perform their tasks and increase
their quality of life. Thus, our results suggested that having an overall positive opinion about
the benefits of technology encourages customers to embrace innovations, as was the case
with previous technologies when first launched onto the market (Liljander et al., 2006). In
addition, optimism may be increasingly important in the post-COVID-19 era, during which
individuals perceived that robot and AI-based technologies increased their quality of life
(Gonzalez-Jimenez, 2020).
In turn, insecurity concerns reduced consumers’ behavioral intentions to use robo-
advisors. Thus, customers who worry about the harmful consequences of technologies, such
as increased technology dependence or reduced personal interaction quality, may avoid using
AI-based financial services. This finding is in line with recent research that has indicated
there is a need to explore people’s fear of service robots and AI, especially when these
innovations have human skills or appearances (Mende et al., 2019; Ransbotham et al., 2018).
Indeed, it seems that anxiety felt toward AI and robots will become a field of increasing
interest among scholars and should be added to the already wide body of knowledge about
customers’ awareness of the harmful consequences of technology (e.g. smartphone or social
media addiction (Jiang et al., 2018; Sanz-Blas et al., 2019)). An interesting moderating effect
observed was that insecurity inhibited behavioral intentions to use robo-advisors more
among customers without investment experience than among those with investment
experience. This finding suggests that investment experience lowers the barriers to
acceptance of this new technology; thus, experienced customers, who will be less troubled by
technology threats, may be a good target sector for robo-advisors. This finding is in line with
previous research that found that consumers must have confidence in both the vendor (firm)
and the technology to adopt new technology-based services (Belanche et al., 2014) such as
robo-advisors (Cheng et al., 2019).
In contrast, innovativeness did not have a significant direct effect on behavioral
intentions. That is, in general, customers’ technology innovativeness is not a critical driver of
robo-advisor adoption. Perhaps, in such an evolving market, the more innovative consumers
are looking for even more creative investment alternatives such as cryptocurrencies and/or
crowdfunding. To the extent that robo-advisors reduce the active role of customers through
technology, they may also attract less innovative customers. Nevertheless, the influence of
innovativeness is significant among men and exerts a positive effect on their intentions to use
robo-advisors. This finding is consistent with previous studies that have suggested that
innovative men are more likely to use new service technologies (Kalinic et al., 2019).
Therefore, innovative men may be a particularly fertile target segment for robo-advisor
service adoption.
In an additional theoretical contribution, we identified that customer awareness positively
influenced intention to use robo-advisors, shedding light on a factor ignored in the previous
technology adoption literature. Our study found that service awareness (i.e. being conscious
of and having knowledge of) significantly increased the acceptance of new and disruptive
services among many customer groups. The moderating effects observed revealed that
awareness is particularly important for younger customers and women. These findings align
with previous research; as women report lower levels of self-efficacy when dealing with new
technologies (Sun and Zhang, 2006), awareness may be especially important for them as it
increases confidence in technologies. On the other hand, awareness is more important for
younger than for older customers. This may be because younger users are more autonomous
in their decision-making (Sun and Zhang, 2006) and, thus, may rely to a greater extent on their
acquired knowledge, that is, awareness, to decide whether to adopt this new technology. This
finding is consistent with previous research into social communications which has suggested
that younger people and women tend to incorporate available social information into their
decision-making processes (Sorce et al., 2005; Burke, 2001). The study’s post-hoc analysis
increased the understanding of how customers become aware of the existence of robo-
advisors, and to what extent this information affects use intentions. These findings
contribute to the existing knowledge in the field as previous studies have found that
subjective norms (i.e. other’s opinions) affect robo-advisor adoption (Belanche et al., 2019);
however, we identified that when customers become aware of robo-advisors through their
own means, or directly from their banks, they are more likely to adopt the service. In addition,
the results of the post-hoc analysis revealed that for customers who were previously aware of
these services, the name “AI-advisor” created higher use intention than did the name “robo-
advisor”. Thus, although the standard industry name, robo-advisor, may be effective with
first-time users, those with previous knowledge of these FinTech initiatives were more
attracted by the term “AI-advisor”.
The control variables included in the model also affected customers’ intentions to use
robo-advisors. As proposed in technology adoption models (e.g. the TAM; Davis, 1989),
customers’ perceptions of robo-advisors as easy to use and useful increased their behavioral
intentions. Customers tend to use technologies that perform well and do not require too much
effort to operate. These effects are consistent with those observed in previous research
(Bhattacherjee, 2000; Belanche et al., 2019), and can be understood in economic terms as
customers’ decisions in financial markets are often based on the cost-benefit paradigm,
including nonmonetary and psychological costs (Lee and Cunningham, 2001). As to the direct
effects of demographic factors, older customers had lower use intention. This finding is
consistent with previous research that indicated that older customers have more negative
attitudes toward online-based services and service robots (Hudson et al., 2017; Onorato, 2018).
Thus, our study confirms that robo-advisors are more likely to be popular among younger
users, and that older consumers may prefer traditional financial advice channels (Woodyard
and Grable, 2018). Our results also suggested that men and women are equally inclined to use
robo-advisors. Although the previous literature has found that women have more negative
attitudes than men toward technological services (Chen and Huang, 2016), this was not the
case in this analysis of a representative sample of North American men and women. Finally,
we found that prior investment experience did not affect customers’ intention to use
automated financial advisors. Perhaps customers with previous experience feel they do not
need to switch to FinTech alternatives. In any case, these findings reinforced the proposals
that robo-advisors democratize access to financial investment services (Belanche et al.,
2020a), and that FinTech services should be oriented toward both men and women, with or
without previous investment experience.
References
Andersen, R., Bozzette, S., Shapiro, M. and St. Clair, P. (2000), “Access of vulnerable groups to
antiretroviral therapy among persons in care for HIV disease in the United States”, Health
Services Research, Vol. 35, pp. 389-416.
Anderson, J. and Gerbing, D. (1988), “Structural equation modeling in practice: a review and
recommended two-step approach”, Psychological Bulletin, Vol. 103 No. 3, pp. 411-423.
Bacile, T.J. (2020), “Digital customer service and customer-to-customer interactions: investigating the
effect of online incivility on customer perceived service climate”, Journal of Service
Management, Vol. 31 No. 3, pp. 441-464.
Bagozzi, R.P. (1994), “Structural equation models in marketing research: basic principles”, in Bagozzi,
R.P. (Ed.), Principles of Marketing Research, Blackwell Publishers, Oxford, pp. 125-140.
Baumgartner, H. and Homburg, C. (1996), “Applications of structural equation modeling in marketing
and consumer research: a review”, International Journal of Research in Marketing, Vol. 13 No. 2,
pp. 139-161.
Belanche, D., Casalo, L.V. and Flavian, C. (2012), “Understanding the influence of social information
sources on e-government adoption”, Information Research, Vol. 17 No. 3, available at: http://
informationr.net/ir/17-3/paper531.html#.YC-yZ-hKg2w (accessed 18 February 2021).
Belanche, D., Casalo, L.V., Flavian, C. and Schepers, J. (2014), “Trust transfer in the continued usage of
public e-services”, Information and Management, Vol. 51 No. 6, pp. 627-640.
Belanche, D., Casalo, L. and Flavian, C. (2019), “Artificial intelligence in FinTech: understanding robo-
advisors adoption among customers”, Industrial Management and Data Systems, Vol. 119 No. 7,
pp. 1411-1430.
Belanche, D., Casalo, L.V., Flavian, C. and Schepers, J. (2020a), “Service robot implementation: a
theoretical framework and research agenda”, The Service Industries Journal, Vol. 40 Nos 3-4,
pp. 203-225.
Belanche, D., Casalo, L.V., Flavian, C. and Schepers, J. (2020b), “Robots or frontline employees?:
exploring customers’ attributions of responsibility and stability after service failure or success”,
Journal of Service Management, Vol. 31 No. 2, pp. 267-289.
Belanche, D., Casalo, L.V., Schepers, J. and Flavian, C. (2021), “Examining the effects of robots’
physical appearance, warmth, and competence in frontline services: the Humanness-Value-
Loyalty model”, Psychology and Marketing, forthcoming.
Ben-David, D. and Sade, O. (2018), “Robo-Advisor adoption, willingness to pay, and trust - an
experimental investigation”, available at: https://ssrn.com/abstract53361710 (accessed 25
June 2020).
Bhattacherjee, A. (2000), “Acceptance of e-commerce services: the case of electronic brokerages”, IEEE
Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, Vol. 30 No. 4,
pp. 411-420.
Blut, M. and Wang, C. (2019), “Technology readiness: a meta-analysis of conceptualizations of the
construct and its impact on technology usage”, Journal of the Academy of Marketing Science,
Vol. 48, pp. 649-669.
Brenner, M. (2020), 17 Digital Marketing Trends You Need to Know for 2021, Marketing Insider
Groups, available at: https://marketinginsidergroup.com/marketing-strategy/marketing-trends/
(accessed 18 February 2021).
Breuer, R. and Brettel, M. (2012), “Short-and long-term effects of online advertising: differences between
new and existing customers”, Journal of Interactive Marketing, Vol. 26 No. 3, pp. 155-166.
Burke, R.J. (2001), “Information sources: is there a gender issue?”, Corporate Communications: An
International Journal, Vol. 6 No. 1, pp. 7-12.
Čaić, M., Odekerken-Schröder, G. and Mahr, D. (2018), “Service robots: value co-creation and
co-destruction in elderly care networks”, Journal of Service Management, Vol. 29 No. 2, pp. 178-205.
Caron, M.S. (2019), “The transformative effect of AI on the banking industry”, Banking and Finance
Law Review, Vol. 34 No. 2, pp. 169-214.
Chawla, D. and Joshi, H. (2020), “The moderating role of gender and age in the adoption of mobile
wallet”, Foresight, Vol. 22 No. 4, pp. 483-504.
Chen, N.H. and Huang, S.C.T. (2016), “Domestic technology adoption: comparison of innovation
adoption models and moderators”, Human Factors and Ergonomics in Manufacturing and
Service Industries, Vol. 26 No. 2, pp. 177-190.
Cheng, X., Guo, F., Chen, J., Li, K., Zhang, Y. and Gao, P. (2019), “Exploring the trust influencing
mechanism of robo-advisor service: a mixed method approach”, Sustainability, Vol. 11
No. 18, pp. 1-20.
Clark-Murphy, M. and Soutar, G.N. (2004), “What individual investors value: some Australian
evidence”, Journal of Economic Psychology, Vol. 25 No. 4, pp. 539-555.
Cole, C. and Balasubramanian, S. (1993), “Age differences in consumers’ search for information: public
policy implications”, Journal of Consumer Research, Vol. 20 No. 1, pp. 157-169.
Collins, C.J. (2007), “The interactive effects of recruitment practices and product awareness on job
seekers’ employer knowledge and application behaviors”, Journal of Applied Psychology, Vol. 92
No. 1, pp. 180-190.
Constantinides, E. (2004), “Influencing the online consumer’s behavior: the Web experience”, Internet
Research, Vol. 14 No. 2, pp. 111-126.
Crist, J.D., Michaels, C., Gelfand, D.E. and Phillips, L.R. (2007), “Defining and measuring service
awareness among elders and caregivers of Mexican descent”, Research and Theory for Nursing
Practice, Vol. 21 No. 2, pp. 119-134.
Davcik, N.S. (2014), “The use and misuse of structural equation modeling in management research”,
Journal of Advances in Management Research, Vol. 11 No. 1, pp. 47-81.
Davis, F.D. (1989), “Perceived usefulness, perceived ease of use, and user acceptance of information
technology”, MIS Quarterly, Vol. 13 No. 3, pp. 319-340.
Davis, F., Bagozzi, R. and Warshaw, P. (1989), “User acceptance of computer technology: a comparison
of two theoretical models”, Management Science, Vol. 35, pp. 982-1003.
Dayan, Y. (2019), “How robo-advisors are single-handedly democratizing the investing world”,
available at: https://www.moneyunder30.com/how-robo-advisors-are-single-handedly-
democratizing-the-investing-world (accessed 20 June 2020).
De Keyser, A., Köcher, S., Alkire (née Nasr), L., Verbeeck, C. and Kandampully, J. (2019), “Frontline
service technology infusion: conceptual archetypes and future research directions”, Journal of
Service Management, Vol. 30 No. 1, pp. 156-183.
Dietrich, O. (2016), Understanding the Ageing Consumer: Exploring Strategies for Overcoming
Innovation Resistance, Doctoral dissertation, University of Gloucestershire.
Eagly, A.H. and Karau, S.J. (2002), “Role congruity theory of prejudice toward female leaders”,
Psychological Review, Vol. 109, pp. 573-598.
Faloon, M. and Scherer, B. (2017), “Individualization of robo-advice”, The Journal of Wealth
Management, Vol. 20 No. 1, pp. 30-36.
Faqih, K.M. (2016), “An empirical analysis of factors predicting the behavioral intention to adopt
Internet shopping technology among non-shoppers in a developing country context: does
gender matter?”, Journal of Retailing and Consumer Services, Vol. 30, pp. 140-164.
Fishbein, M. and Ajzen, I. (1975), Belief, Attitude, Intention, and Behavior: an Introduction to Theory
and Research, Addison-Wesley, Reading, MA.
Fornell, C. and Bookstein, F. (1982), “Two structural equation models: LISREL and PLS applied to
consumer exit-voice theory”, Journal of Marketing Research, Vol. 19, pp. 440-452.
Fornell, C. and Larcker, D.F. (1981), “Evaluating structural equation models with unobservable
variables and measurement error”, Journal of Marketing Research, Vol. 18 No. 3, pp. 39-50.
Glaser, F., Iliewa, Z., Jung, D. and Weber, M. (2019), “Towards designing robo-advisors for
unexperienced investors with experience sampling of time-series data”, in Davis, F.D., Riedl, R.,
vom Brocke, J., Leger, P.M. and Randolph, A.B. (Eds), Information Systems and Neuroscience,
Springer, Cham, pp. 133-138.
Godoe, P. and Johansen, T. (2012), “Understanding adoption of new technologies: technology readiness
and technology acceptance as an integrated concept”, Journal of European Psychology Students,
Vol. 3 No. 1, pp. 38-52.
Goldstein, I., Jiang, W. and Karolyi, G.A. (2019), “To FinTech and beyond”, The Review of Financial
Studies, Vol. 32 No. 5, pp. 1647-1661.
Gonzalez-Jimenez, H. (2020), “Robots in daily life: a post COVID-19 perspective”, in Bunkanwanicha,
P., Coeurderoy, R. and Ben Slimane, S. (Eds), Managing a Post-Covid-19 Era, ESCP Impact
Papers, pp. 277-282.
Grewal, D., Roggeveen, A.L. and Nordfält, J. (2017), “The future of retailing”, Journal of Retailing,
Vol. 93 No. 1, pp. 1-6.
Hair, J.F., Ringle, C.M. and Sarstedt, M. (2011), “PLS-SEM: indeed a silver bullet”, Journal of Marketing
Theory and Practice, Vol. 19 No. 2, pp. 139-152.
Hair, J.F., Ringle, C.M. and Sarstedt, M. (2013), “Partial least squares structural equation modeling:
rigorous applications, better results and higher acceptance”, Long Range Planning, Vol. 46 Nos
1-2, pp. 1-12.
Hair, J.F. Jr, Hult, G.T.M., Ringle, C. and Sarstedt, M. (2016), A Primer on Partial Least Squares
Structural Equation Modeling (PLS-SEM), Sage, Thousand Oaks, CA.
Hauk, N., Hüffmeier, J. and Krumm, S. (2018), “Ready to be a silver surfer? A meta-analysis on the
relationship between chronological age and technology acceptance”, Computers in Human
Behavior, Vol. 84, pp. 304-319.
He, J. and Freeman, L.A. (2019), “Are men more technology-oriented than women? The role of gender
on the development of general computer self-efficacy of college students”, Journal of
Information Systems Education, Vol. 21 No. 2, pp. 203-212.
Hellofs, L.L. and Jacobson, R. (1999), “Market share and customers’ perceptions of quality: when can
firms grow their way to higher versus lower quality?”, Journal of Marketing, Vol. 63 No. 1,
pp. 16-25.
Henseler, J., Ringle, C.M. and Sinkovics, R.R. (2009), “The use of partial least squares path modeling in
international marketing”, Advances in International Marketing, Vol. 20 No. 1, pp. 277-319.
Henseler, J., Ringle, C.M. and Sarstedt, M. (2015), “A new criterion for assessing discriminant validity
in variance-based structural equation modeling”, Journal of the Academy of Marketing Science,
Vol. 43 No. 1, pp. 115-135.
Hoffman, L.W. (1972), “Early childhood experiences and women’s achievement motives”, Journal of
Social Issues, Vol. 28 No. 2, pp. 129-155.
Hofstede, G. (2011), “Dimensionalizing cultures: the Hofstede model in context”, Online Readings in
Psychology and Culture, Vol. 2 No. 1, pp. 1-26.
Hogreve, J., Bilstein, N. and Hoerner, K. (2019), “Service recovery on stage: effects of social media
recovery on virtually present others”, Journal of Service Research, Vol. 22 No. 4, pp. 421-439.
Hu, L.T. and Bentler, P.M. (1998), “Fit indices in covariance structure modeling: sensitivity to
underparameterized model misspecification”, Psychological Methods, Vol. 3 No. 4, pp. 424-453.
Huang, M.H. and Rust, R.T. (2018), “Artificial intelligence in service”, Journal of Service Research,
Vol. 21, No. 2, pp. 155-172.
Huang, M.H. and Rust, R.T. (2021), “Engaged to a robot? The role of AI in service”, Journal of Service
Research, Vol. 24 No. 1, pp. 30-41.
Huang, M.H., Rust, R.T. and Maksimovic, V. (2019), “The feeling economy: managing in the next
generation of artificial intelligence (AI)”, California Management Review, Vol. 61 No. 4,
pp. 43-65.
Hudson, J., Orviska, M. and Hunady, J. (2017), “People’s attitudes to robots in caring for the elderly”,
International Journal of Social Robotics, Vol. 9 No. 2, pp. 199-210.
Ji, M. (2017), “Are robots good fiduciaries? Regulating robo-advisors under the investment advisers
act of 1940”, Columbia Law Review, Vol. 117, pp. 1543-1583.
Jiang, Q., Li, Y. and Shypenka, V. (2018), “Loneliness, individualism, and smartphone addiction among
international students in China”, Cyberpsychology, Behavior, and Social Networking, Vol. 21
No. 11, pp. 711-718.
Jung, D., Dorner, V., Glaser, F. and Morana, S. (2018a), “Robo-advisory”, Business and Information
Systems Engineering, Vol. 60 No. 1, pp. 81-86.
Jung, D., Dorner, V., Weinhardt, C. and Pusmaz, H. (2018b), “Designing a robo-advisor for risk-averse,
low-budget consumers”, Electronic Markets, Vol. 28 No. 3, pp. 367-380.
Jung, D., Glaser, F. and Köpplin, W. (2019), “Robo-advisory: opportunities and risks for the future of financial
advisory”, in Nissen, V. (Ed.), Advances in Consulting Research, Springer, Cham, pp. 405-427.
Kalinic, Z., Liebana-Cabanillas, F.J., Muñoz-Leiva, F. and Marinkovic, V. (2019), “The moderating
impact of gender on the acceptance of peer-to-peer mobile payment systems”, International
Journal of Bank Marketing, Vol. 38 No. 1, pp. 138-158.
Kangis, P. and Passa, V. (1997), “Awareness of service charges and its influence on customer
expectations and perceptions of quality in banking”, Journal of Services Marketing, Vol. 11,
No. 2, pp. 105-117.
Kaplan, A. and Haenlein, M. (2019), “Siri, Siri, in my hand: who’s the fairest in the land? On the
interpretations, illustrations, and implications of artificial intelligence”, Business Horizons,
Vol. 62 No. 1, pp. 15-25.
Kaplan, A. and Haenlein, M. (2020), “Rulers of the world, unite! the challenges and opportunities of
artificial intelligence”, Business Horizons, Vol. 63 No. 1, pp. 37-50.
Klink, R.R. (2009), “Gender differences in new brand name response”, Marketing Letters, Vol. 20 No. 3,
pp. 313-326.
Kumar, V., Rajan, B., Venkatesan, R. and Lecinski, J. (2019), “Understanding the role of artificial
intelligence in personalized engagement marketing”, California Management Review, Vol. 61
No. 4, pp. 135-155.
Lee, M. and Cunningham, L.F. (2001), “A cost/benefit approach to understanding service loyalty”,
Journal of Services Marketing, Vol. 15 No. 2, pp. 113-130.
Liljander, V., Gillberg, F., Gummerus, J. and Van Riel, A. (2006), “Technology readiness and the
evaluation and adoption of self-service technologies”, Journal of Retailing and Consumer
Services, Vol. 13 No. 3, pp. 177-191.
Lu, J., Wang, L. and Hayes, L.A. (2012), “How do technology readiness, platform functionality and
trust influence C2C user satisfaction?”, Journal of Electronic Commerce Research, Vol. 13 No. 1,
pp. 50-69.
Lu, V.N., Wirtz, J., Kunz, W.H., Paluch, S., Gruber, T., Martins, A. and Patterson, P.G. (2020), “Service
robots, customers and service employees: what can we learn from the academic literature and
where are the gaps?”, Journal of Service Theory and Practice, Vol. 30 No. 3, pp. 361-391.
Luksyte, A., Unsworth, K.L. and Avery, D.R. (2018), “Innovative work behavior and sex-based
stereotypes: examining sex differences in perceptions and evaluations of innovative work
behavior”, Journal of Organizational Behavior, Vol. 39 No. 3, pp. 292-305.
Martens, M., Roll, O. and Elliott, R. (2017), “Testing the technology readiness and acceptance model for
mobile payments across Germany and South Africa”, International Journal of Innovation and
Technology Management, Vol. 14 No. 06, pp. 1-19.
Mazurek, G. and Małagocka, K. (2019), “Perception of privacy and data protection in the context of the
development of artificial intelligence”, Journal of Management Analytics, Vol. 6 No. 4, pp. 344-364.
McCann, B. (2020), “Robo advisers keep adding on services”, available at: https://www.wsj.com/
articles/robo-advisers-keep-adding-on-arms-11583331556 (accessed 14 June 2020).
Mende, M., Scott, M.L., van Doorn, J., Grewal, D. and Shanks, I. (2019), “Service robots rising: how
humanoid robots influence service experiences and elicit compensatory consumer responses”,
Journal of Marketing Research, Vol. 56 No. 4, pp. 535-556.
Meuter, M.L., Bitner, M.J., Ostrom, A.L. and Brown, S.W. (2005), “Choosing among alternative service
delivery modes: an investigation of customer trial of self-service technologies”, Journal of
Marketing, Vol. 69 No. 2, pp. 61-83.
Mick, D.G. and Fournier, S. (1998), “Paradoxes of technology: consumer cognizance, emotions, and
coping strategies”, Journal of Consumer Research, Vol. 25 No. 2, pp. 123-143.
Moore, G.C. and Benbasat, I. (1991), “Development of an instrument to measure the perceptions of
adopting an information technology innovation”, Information Systems Research, Vol. 2 No. 3,
pp. 192-222.
Nunnally, J.C. and Bernstein, I.H. (1994), Psychometric Theory, McGraw-Hill, New York, NY.
Oliveira, T., Thomas, M., Baptista, G. and Campos, F. (2016), "Mobile payment: understanding the
determinants of customer adoption and intention to recommend the technology”, Computers in
Human Behavior, Vol. 61, pp. 404-414.
Onorato, D.A. (2018), “Robots, unions, and aging: determinants of robot adoption evidence from OECD
countries”, Atlantic Economic Journal, Vol. 46 No. 4, pp. 473-474.
Ostrom, A.L., Fotheringham, D. and Bitner, M.J. (2019), "Customer acceptance of AI in service
encounters: understanding antecedents and consequences”, in Maglio, P.P., Kieliszewski, C.A.,
Spohrer, J.C., Lyons, K., Patrício, L. and Sawatani, Y. (Eds), Handbook of Service Science,
Springer, Cham, Vol. II, pp. 77-103.
Parasuraman, A. (2000), “Technology Readiness Index (TRI): a multiple-item scale to measure
readiness to embrace new technologies”, Journal of Service Research, Vol. 2 No. 4, pp. 307-320.
Parasuraman, A. and Colby, C.L. (2015), “An updated and streamlined technology readiness index:
TRI 2.0”, Journal of Service Research, Vol. 18 No. 1, pp. 59-74.
Prodanova, J., San-Martín, S. and Jimenez, N. (2021), "Are you technologically prepared for mobile
shopping?”, The Service Industries Journal, Vol. 41 Nos 9-10, pp. 648-670.
Ransbotham, S., Gerbert, P., Reeves, M., Kiron, D. and Spira, M. (2018), “Artificial intelligence in
business gets real”, available at: https://sloanreview.mit.edu/projects/artificial-intelligence-in-
business-gets-real/.
Ratchford, B.T. (2020), “The history of academic research in marketing and its implications for the
future”, Spanish Journal of Marketing - ESIC, Vol. 24 No. 1, pp. 3-36.
Raub, S. and Blunschi, S. (2014), “The power of meaningful work: how awareness of CSR initiatives
fosters task significance and positive work outcomes in service employees”, Cornell Hospitality
Quarterly, Vol. 55 No. 1, pp. 10-18.
Robinson, S., Orsingher, C., Alkire, L., De Keyser, A., Giebelhausen, M., Papamichail, K.N., Shams, P.
and Temerak, M.S. (2020), “Frontline encounters of the AI kind: an evolved service encounter
framework”, Journal of Business Research, Vol. 116, pp. 366-376.
Rodriguez-Ricardo, Y., Sicilia, M. and Lopez, M. (2018), “What drives crowdfunding participation? The
influence of personal and social traits”, Spanish Journal of Marketing-ESIC, Vol. 22 No. 2, pp. 163-182.
Roldan, J.L. and Sanchez-Franco, M.J. (2012), “Variance-based structural equation modeling: guidelines
for using partial least squares in information systems research”, in Mora, M., Gel-Man, O.,
Steenkamp, A. and Raisinghani, M.S. (Eds), Research Methodologies, Innovations, and
Philosophies in Software Systems Engineering and Information Systems, Information Science
Reference, Hershey, PA, pp. 193-221.
Salthouse, T.A. (2004), “What and when of cognitive aging”, Current Directions in Psychological
Science, Vol. 13 No. 4, pp. 140-144.
Sanz-Blas, S., Buzova, D. and Miquel-Romero, M.J. (2019), “From Instagram overuse to instastress and
emotional fatigue: the mediation of addiction”, Spanish Journal of Marketing-ESIC, Vol. 23
No. 2, pp. 143-161.
Sorce, P., Perotti, V. and Widrick, S. (2005), “Attitude and age differences in online buying”,
International Journal of Retail and Distribution Management, Vol. 33 No. 2, pp. 122-132.
Statista (2019), “Robo-advisors”, available at: https://www.statista.com/outlook/337/100/robo-advisors/
worldwide (accessed 26 October 2019).
Sun, H. and Zhang, P. (2006), “The role of moderating factors in user technology acceptance”,
International Journal of Human-Computer Studies, Vol. 64 No. 2, pp. 53-78.
Tenenhaus, M., Vinzi, V.E., Chatelin, Y.M. and Lauro, C. (2005), “PLS path modeling”, Computational
Statistics and Data Analysis, Vol. 48 No. 1, pp. 159-205.
Tertilt, M. and Scholz, P. (2018), “To advise, or not to advise—how robo-advisors evaluate the risk
preferences of private investors”, The Journal of Wealth Management, Vol. 21 No. 2, pp. 70-84.
Trecet, J. (2019), "Un robot a cargo de tus inversiones: así funcionan los robo advisors" [A robot in charge of your investments: this is how robo-advisors work], available at: https://www.businessinsider.es/como-funciona-robo-advisor-como-saber-ti-388261 (accessed 04 April 2019).
Tsang, M.M., Ho, S.C. and Liang, T.-P. (2004), “Consumer attitudes toward mobile advertising: an
empirical study”, International Journal of Electronic Commerce, Vol. 8 No. 3, pp. 65-78.
US Census Bureau (2020), "Age and sex composition in the United States: 2019", available at: https://www.census.gov/data/tables/2019/demo/age-and-sex/2019-age-sex-composition.html (accessed 18 June 2020).
van Doorn, J., Mende, M., Noble, S.M., Hulland, J., Ostrom, A.L., Grewal, D. and Petersen, J.A. (2017),
“Domo arigato Mr. Roboto: emergence of automated social presence in organizational frontlines
and customers’ service experiences”, Journal of Service Research, Vol. 20 No. 1, pp. 43-58.
Vargo, S.L. and Lusch, R.F. (2016), “Institutions and axioms: an extension and update of service
dominant logic”, Journal of the Academy of Marketing Science, Vol. 44 No. 1, pp. 5-23.
Venkatesh, V. (2000), “Determinants of perceived ease of use: integrating control, intrinsic motivation,
and emotion into the technology acceptance model”, Information Systems Research, Vol. 11
No. 4, pp. 342-365.
Venkatesh, V. and Davis, F.D. (2000), “A theoretical extension of the technology acceptance model:
four longitudinal field studies”, Management Science, Vol. 46 No. 2, pp. 186-204.
Venkatesh, V. and Morris, M.G. (2000), “Why don’t men ever stop to ask for directions? Gender, social
influence, and their role in technology acceptance and usage behavior”, MIS Quarterly, Vol. 24
No. 1, pp. 115-139.
Venkatesh, V., Morris, M.G., Davis, G.B. and Davis, F.D. (2003), “User acceptance of information
technology: toward a unified view”, MIS Quarterly, Vol. 27 No. 3, pp. 425-478.
Victorino, L., Karniouchina, E. and Verma, R. (2009), “Exploring the use of the abbreviated technology
readiness index for hotel customer segmentation”, Cornell Hospitality Quarterly, Vol. 50 No. 3,
pp. 342-359.
Walczuch, R., Lemmink, J. and Streukens, S. (2007), “The effect of service employees’ technology
readiness on technology acceptance”, Information and Management, Vol. 44 No. 2, pp. 206-215.
Wang, Y.S., Wu, M.C. and Wang, H.Y. (2009), “Investigating the determinants and age and gender
differences in the acceptance of mobile learning”, British Journal of Educational Technology,
Vol. 40 No. 1, pp. 92-118.
Wirtz, J., Patterson, P.G., Kunz, W.H., Gruber, T., Lu, V.N., Paluch, S. and Martins, A. (2018), “Brave
new world: service robots in the frontline”, Journal of Service Management, Vol. 29 No. 5,
pp. 907-931.
Woodyard, A.S. and Grable, J.E. (2018), “Insights into the users of robo-advisory firms”, Journal of
Financial Service Professionals, Vol. 72 No. 5, pp. 56-66.
Xiao, L. and Kumar, V. (2019), “Robotics for customer service: a useful complement or an ultimate
substitute?”, Journal of Service Research, September, pp. 1-21.
Yoganathan, V., Osburg, V.S., Kunz, W.H. and Toporowski, W. (2021), “Check-in at the Robo-desk:
effects of automated social presence on social cognition and service implications”, Tourism
Management, Vol. 85, p. 104309.
Appendix
Scale items
Optimism
(1) New technologies contribute to a better quality of life
(2) Technology gives me more freedom of mobility
(3) Technology gives people more control over their daily lives
(4) Technology makes me more productive in my personal life
Innovativeness
(1) Other people come to me for advice on new technologies
(2) In general, I am among the first in my circle of friends to acquire new technology when it
appears
(3) I can usually figure out new high-tech products and services without help from others
(4) I keep up with the latest technological developments in my areas of interest
Discomfort
(1) When I get technical support from a provider of a high-tech product or service, I sometimes feel
as if I am being taken advantage of by someone who knows more than I do
(2) Technical support lines are not helpful because they do not explain things in terms that I
understand
(3) Sometimes, I think that technology systems are not designed for use by ordinary people
(4) There is no such thing as a manual for a high-tech product or service that’s written in plain
language
Insecurity
(1) People are too dependent on technology to do things for them
(2) Too much technology distracts people to a point that is harmful
(3) Technology lowers the quality of relationships by reducing personal interaction
(4) I do not feel confident doing business with a service that can only be reached online
Service Awareness