Time to act mature—Gearing eHealth evaluations towards technology readiness levels

Digital Health
Volume 8: 1–5
© The Author(s) 2022
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/20552076221113396
journals.sagepub.com/home/dhj
Abstract
It is challenging to design a proper eHealth evaluation. In our opinion, the evaluation of eHealth should be a continuous
process, wherein increasingly mature versions of the technology are put to the test. In this article, we present a model
for continuous eHealth evaluation, geared towards technology maturity. Technology maturity can be determined best via
Technology Readiness Levels, of which there are nine, divided into three phases: the research, development, and deployment
phases. For each phase, we list and discuss applicable activities and outcomes on the end-user, clinical, and societal front.
Instead of focusing on a single perspective, we recommend blending the end-user, health, and societal perspectives. With this
article we aim to contribute to the methodological debate on how to create the optimal eHealth evaluation design.
Keywords
the effects and costs of eHealth from a multidimensional perspective. The strong points of MAST are the involvement of all the actors and the assessment of outcomes in seven domains: (1) health problem and description of the application; (2) safety; (3) clinical effectiveness; (4) patient perspectives; (5) economic aspects; (6) organizational aspects; and (7) socio-cultural, ethical, and legal aspects. Another commonly used framework is the five-stage model for comprehensive research on telehealth by Fatehi et al.14 This framework outlines five important stages for an eHealth intervention: concept development, service design, pre-implementation, implementation, and post-implementation. By outlining these stages, this framework addresses the difference between the assessment of prototypes and the evaluation of mature technology. The assessment of prototypes helps to identify the required improvements, while the evaluation of a mature technology aims to measure the overall success factors and performance after implementation.

The endorsement of an iterative approach and the focus on multiple perspectives are strong points of these current frameworks for streamlining the set-up of eHealth evaluations. While these frameworks are useful, we foresee three major limitations. The first limitation is that the current frameworks are only applicable to fully mature technologies and offer no solution for technology that is still in development. Therefore, the applicability of these frameworks is limited. The second limitation is that these frameworks do not provide a clear method for determining technology maturity. When these frameworks are used with immature technologies to evaluate the value of an eHealth service, results are likely to be overly negative or, at the very least, biased. Finally, the third limitation is the over-representation of the clinical perspective. Most of the articles that report on the use of these frameworks only present the results of a single perspective.15 In previous eHealth evaluation studies the clinical perspective is over-represented, and findings related to usability, the user experience, technology acceptance, and costs are rarely addressed.5 To overcome these limitations, and based on our experience within the field of eHealth evaluation, we present our position toward eHealth evaluations and a model for the continuous evaluation of eHealth, aligned to technology maturity levels and incorporating different evaluation perspectives.

Our position towards eHealth evaluation

Evaluation, defined here as the collection, interpretation, and presentation of information in order to determine the value of a result or process,16 becomes both a possibility and a necessity as soon as technology development starts. Evaluation should be a continuous process, whereby the evaluation setup is geared towards the maturity of the technology. There is no need to wait with the evaluation until the technology is mature; the evaluation can start from the first concept. In other disciplines, such as software development, in which agile SCRUM is a common approach, continuous evaluation is already common practice. Next, we think that, instead of focusing on a single perspective, evaluations should incorporate a multitude of complementary evaluation perspectives. In our opinion, these perspectives are (1) the end-user, (2) the health, and (3) the societal perspective. The end-user perspective focuses on the task-technology fit (which differs per type of end-user), usability, the user experience (UX), and technology acceptance, to ensure that a technology is suitable for the intended end-users and their context; the health (or clinical) perspective should safeguard the health benefits that one derives from using the technology; and the societal perspective should ensure that the technology can be implemented with the support of relevant stakeholders and is durable.
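As a purely illustrative aside, the three perspectives and the focus areas named above can be captured in a small lookup structure. The Python sketch below is our own rendering and not part of the published model; it only restates the focus areas from the preceding paragraph and invents no additional measures.

    # Illustrative sketch only: the three evaluation perspectives and the
    # focus areas named in the text above.
    EVALUATION_PERSPECTIVES: dict[str, list[str]] = {
        "end-user": [
            "task-technology fit (differs per type of end-user)",
            "usability",
            "user experience (UX)",
            "technology acceptance",
        ],
        "health": [
            "health benefits derived from using the technology",
        ],
        "societal": [
            "implementation with the support of relevant stakeholders",
            "durability",
        ],
    }

    def focus_areas(perspective: str) -> list[str]:
        """Return the focus areas for a given evaluation perspective."""
        return EVALUATION_PERSPECTIVES[perspective]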
A model for continuous eHealth evaluation

Our model for continuous eHealth evaluation addresses both the maturity of the technology as the starting point for eHealth evaluations and the inclusion of the different evaluation perspectives. An overview of the suggested activities for the three perspectives in each phase is provided in Table 1.

Research phase

During the research phase, the technology is immature and the new concept, often in the form of a low-fidelity prototype, is discussed with potential end-users (end-user perspective) (e.g. as in van Velsen et al.20 and Jansen-Kosterink et al.21). These discussions aim to gauge the end-users' reactions towards the basic concepts and main functionality of the prototype. The main aim of the continuous evaluations in the research phase is to optimize the new concept and technology. As the technology mainly consists of ideas and simple prototypes at this stage, applying an iterative approach in this phase is crucial. Quick rounds of testing-redesign-testing should ensure the proper focus of the innovation. The work of Schnall and colleagues22–24 on their Health Information Technology Usability Evaluation Scale (Health-ITUES) fits very well with this phase.
Table 1. An overview of the activities on the end-user, health, and societal perspective for the research, development, and deployment phase.

Deployment phase, TRL 7
  Health perspective: Large-scale clinical study into the use, acceptance, health benefits, and safety of the technology within the daily clinical context.
  Societal perspective: Discussions with relevant stakeholders to assess the forecast of financial and extra-financial value.

Deployment phase, TRL 8
  Health perspective: Long-term monitoring of the health benefits and safety of the technology within the broad clinical context.
  Societal perspective: Strengthen the model of financial and extra-financial value with the outcomes of clinical studies.
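To make the model concrete, the following Python sketch encodes the mapping from Technology Readiness Levels to evaluation phases together with the deployment-phase activities from Table 1. It is our own illustrative rendering, not part of the published model: the grouping of TRLs 1–3, 4–6, and 7–9 into the research, development, and deployment phases is an assumption consistent with the abstract and with TRLs 7 and 8 appearing under the deployment phase in Table 1, and the assignment of activities to the health and societal perspectives follows our reading of the table.

    # Illustrative sketch only; see the caveats in the paragraph above.
    from dataclasses import dataclass

    # Assumed grouping of the nine TRLs into the three evaluation phases.
    PHASE_BY_TRL = {
        **{trl: "research" for trl in (1, 2, 3)},
        **{trl: "development" for trl in (4, 5, 6)},
        **{trl: "deployment" for trl in (7, 8, 9)},
    }

    @dataclass
    class EvaluationActivity:
        trl: int
        health_perspective: str
        societal_perspective: str

    # Deployment-phase rows taken from Table 1.
    TABLE_1_DEPLOYMENT_ROWS = [
        EvaluationActivity(
            trl=7,
            health_perspective="Large-scale clinical study into the use, acceptance, "
                               "health benefits, and safety of the technology within "
                               "the daily clinical context.",
            societal_perspective="Discussions with relevant stakeholders to assess the "
                                 "forecast of financial and extra-financial value.",
        ),
        EvaluationActivity(
            trl=8,
            health_perspective="Long-term monitoring of the health benefits and safety "
                               "of the technology within the broad clinical context.",
            societal_perspective="Strengthen the model of financial and extra-financial "
                                 "value with the outcomes of clinical studies.",
        ),
    ]

    def phase_for(trl: int) -> str:
        """Return the evaluation phase to which a given TRL belongs."""
        return PHASE_BY_TRL[trl]

Under this assumed grouping, for example, phase_for(5) returns "development" and phase_for(8) returns "deployment".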
7. Laplante C and Peng W. A systematic review of e-health interventions for physical activity: an analysis of study design, intervention characteristics, and outcomes. Telemed J e-Health 2011; 17: 509–523.
8. Granja C, Janssen W and Johansen MA. Factors determining the success and failure of eHealth interventions: systematic review of the literature. J Med Internet Res 2018; 20: e10235.
9. Broens TH, Huis in 't Veld RM, Vollenbroek-Hutten MM, et al. Determinants of successful telemedicine implementations: a literature study. J Telemed Telecare 2007; 13: 303–309.
10. Eysenbach G. What is e-health? J Med Internet Res 2001; 3: E20.
11. van Dyk L. A review of telehealth service implementation frameworks. Int J Environ Res Public Health 2014; 11: 1279–1298.
12. Kidholm K, Ekeland AG, Jensen LK, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care 2012; 28: 44–51.
13. Lampe K, Mäkelä M, Garrido MV, et al. The HTA core model: a novel method for producing and reporting health technology assessments. Int J Technol Assess Health Care 2009; 25: 9–20.
14. Fatehi F, Smith AC, Maeder A, et al. How to formulate research questions and design studies for telehealth assessment and evaluation. J Telemed Telecare 2016; 23: 759–763.
15. Kidholm K, Clemensen J, Caffery LJ, et al. The model for assessment of telemedicine (MAST): a scoping review of empirical studies. J Telemed Telecare 2017; 23: 803–813.
16. Shaw I, Shaw IGR, Greene JC, et al. The SAGE handbook of evaluation. Thousand Oaks, CA: Sage, 2006.
17. Liu L, Stroulia E, Nikolaidis I, et al. Smart homes and home health monitoring technologies for older adults: a systematic review. Int J Med Inf 2016; 91: 44–59.
18. Roach DP and Neidigk S. Does the maturity of structural health monitoring technology match user readiness? Albuquerque, NM: Sandia National Laboratories, 2011.
19. Lyng KM, Jensen S and Bruun-Rasmussen M. A paradigm shift: sharing patient reported outcome via a national infrastructure. In: MEDINFO 2019: health and wellbeing e-networks for all. IOS Press, 2019, pp. 694–698.
20. van Velsen L, Evers M, Bara C-D, et al. Understanding the acceptance of an eHealth technology in the early stages of development: an end-user walkthrough approach and two case studies. JMIR Form Res 2018; 2: e10474.
21. Jansen-Kosterink S, van Velsen L and Cabrita M. Clinician acceptance of complex clinical decision support systems for treatment allocation of patients with chronic low back pain. BMC Med Inform Decis Mak 2021; 21: 37.
22. Brown W 3rd, Yen PY, Rojas M, et al. Assessment of the health IT usability evaluation model (Health-ITUEM) for evaluating mobile health (mHealth) technology. J Biomed Inform 2013; 46: 1080–1087.
23. Cho H, Yen PY, Dowding D, et al. A multi-level usability evaluation of mobile health applications: a case study. J Biomed Inform 2018; 86: 79–89.
24. Schnall R, Rojas M, Bakken S, et al. A user-centered model for designing consumer mobile health (mHealth) applications (apps). J Biomed Inform 2016; 60: 243–251.
25. Olde Keizer RACM, van Velsen L, Moncharmont M, et al. Using socially assistive robots for monitoring and preventing frailty among older adults: a study on usability and user experience challenges. Health Technol (Berl) 2019; 9: 595–605.
26. Kosterink SM, Huis in 't Veld RM, Cagnie B, et al. The clinical effectiveness of a myofeedback-based teletreatment service in patients with non-specific neck and shoulder pain: a randomized controlled trial. J Telemed Telecare 2010; 16: 316–321.
27. Talboom-Kamp E, Ketelaar P and Versluis A. A national program to support self-management for patients with a chronic condition in primary care: a social return on investment analysis. Clin eHealth 2021; 4: 45–49.
28. Sadegh SS, Khakshour Saadat P, Sepehri MM, et al. A framework for m-health service development and success evaluation. Int J Med Inf 2018; 112: 123–130.
29. Scherr TF, Moore CP, Thuma P, et al. Evaluating network readiness for mHealth interventions using the beacon mobile phone app: application development and validation study. JMIR Mhealth Uhealth 2020; 8: e18413.