WISA2010-P035
ABSTRACT
The Water Services Authority (WSA) orientated electronic Water Quality Management
System (eWQMS) has been shown to assist WSAs to meet their responsibilities, improve
water quality awareness, build capacity, and meet the Department of Water Affairs' (DWA)
need to monitor and regulate the operation of WSAs in a proactive, cooperative
governance fashion. The implementation of the eWQMS has been supported by the
stewardship of the Institute of Municipal Engineering of Southern Africa (IMESA) and
endorsement by the South African Local Government Association (SALGA). Since the
establishment and ongoing maintenance of the eWQMS at all WSAs in South Africa, the
eWQMS has undergone continuous sector-directed development based on a list of
prioritised needs. Consequently, the eWQMS has addressed numerous water services
sector data and information requirements. In order to ensure that (i) the eWQMS remains
functional and relevant to WSAs, and (ii) DWA receives credible WSA water quality related
data and information, ongoing refinement and development of the eWQMS is necessary.
This has included ensuring alignment and associated data transfer between the eWQMS
and DWA’s Blue Drop System (BDS). In particular, to enable effective regulation,
appropriate data Quality Assurance/Quality Control (QA/QC) processes are essential.
Using a systems engineering approach, the QA/QC process for the eWQMS-BDS has
included review of value integrity, referential integrity and statistical integrity, resulting in
the implementation of further data validation measures to enhance data credibility.
Considering the above, this paper will present the successful systems engineering process
followed and highlight new eWQMS features implemented to further aid users with
ensuring provision of credible water quality data/information.
1. INTRODUCTION
The electronic Water Quality Management System (eWQMS) has been established and
maintained at all Water Services Authorities (WSAs) in South Africa for a period of
approximately 3 years. The eWQMS is used by WSAs to improve management of water
quality (and associated water services). In addition, the eWQMS transfers data to the
Department of Water Affairs’ (DWA) Drinking Water Quality Regulation System (DWQRS),
now known as the Blue Drop System (BDS), on a regular basis. This data is used by DWA
to both monitor and regulate performance of WSAs.
The eWQMS has undergone continuous sector-directed development and has
consequently addressed numerous water services sector data and information
requirements. Indeed, as the eWQMS initiative has gained momentum and success with
WSAs, the number of sector-requested/suggested developments has grown. A project was therefore
initiated to ensure that the existing key activities of the DWA-IMESA-SALGA eWQMS
initiative continue to be supported, and that the platform be further expanded to extend the
benefits to other related aspects of Water Services. By DWA proactively and formally
supporting and overseeing eWQMS system modifications, (i) DWA receives credible WSA
water quality related data and information, and (ii) the eWQMS remains functional and
relevant to WSAs. The focus of recent project activities has been on improving eWQMS-
BDS data alignment via review of systems engineering, systems integration and data
quality assurance/quality control. Importantly, this process has included an independent
review of both the eWQMS and BDS, with recommendations where processes,
procedures, etc can be improved.
Considering the above, this paper will show the process followed, provide feedback
regarding accomplishments to date and highlight future needs.
The eWQMS is a well-proven, comprehensive Water Quality Management tool, which has
been successfully used by WSAs, Regional and National DWA offices, and the public to
manage water quality. The eWQMS is a novel Open Source Software-based system which
is accessible via the internet (www.wqms.co.za), and is a very useful means for allowing a
range of participating parties (including WSAs, Provincial and National Government, etc) to
guide the tracking, reviewing and improving of water quality. Importantly, the eWQMS has
been developed in a “bottom up” approach with WSAs, IMESA, DWA and the Water
Research Commission (WRC). Features of the eWQMS include (a) Data input (via
internet, spreadsheet or import from LIMS), (b) Management Dashboard (highlights
sample sites satisfying and/or failing water quality requirements), (c) Compliance Overview
(summary of legislative compliance), (d) Data Analysis (generate tables and graphs), (e)
Reports (archive of WQM reports), (f) Summary Reports (automatically generated reports),
(g) Information (water related information and references), (h) Infrastructure (capture
details of water system), (i) Administration (configure and manage system set-up) and (j)
Risk Toolbox (perform self-assessments of WQM, water supply system infrastructure, etc).
The success of the eWQMS initiative can largely be attributed to the approach utilized and
considered during eWQMS development and implementation. In particular, understanding
the needs of WSAs and sector partners has led to significant system
modification/development to ensure that users' needs are continuously met. The
following diagram summarises the context of the eWQMS and its environment.
Figure 1: eWQMS environment and associated stakeholders
The eWQMS has been deployed to all WSAs in South Africa, and is provided routinely
with water quality data, either directly through manual (web-based) capture of data, or
through automated methods (e.g. Laboratory Information Systems (LIMS)). The WSAs are
supported in the use of the system by the eWQMS Team through a network of provincial
infrastructure. The main objective of this domain, historically, has been the improved
management of water resources in a municipal environment and the resulting success of
this initiative has been recognized locally and internationally.
Several frameworks exist for the management and governance of information systems. It was
proposed and accepted that a governance model based on the Information Technology
Infrastructure Library (ITIL) (which is widely adopted in the United Kingdom and is also
fairly commonly implemented in South Africa) be used. The proposed governance
framework is scaled down considerably to align with the specific requirements of eWQMS.
The above aspects will be discussed in greater detail in the following sections.
As complex, distributed systems (such as the eWQMS) require more coordination and
formality, and as the eWQMS Team needs to maintain, use, and control the knowledge
base provided by such an approach, it was recommended that the eWQMS systems
engineering function be more formalized (i.e. not in ‘agile programming’ terms that are less
formal). Furthermore, it was stressed that this does not mean that the systems engineering
function must be over-elaborate or cumbersome, with the main objective being to achieve
an acceptable level of maturity using a minimum or adequate level of formality. The
systems engineering function should operate on four levels of detail and reach.
Considering the above, the main findings from the independent review process, were:
• Systems engineering documentation in the normal sense of the word, and measured
against what can be seen as good practice, did not exist, with documentation for user
requirements, specifications, etc largely in e-mail and issue tracker attachments.
• It was not possible for a third party to test/verify that the eWQMS complies with DWA
requirements, since no documentation was under configuration control and releases of
the system did not reflect which version of specification/requirements was being
implemented (both DWA and eWQMS Team documents).
• Test schedules and formal acceptance of test results were lacking.
• Issue resolution mechanisms had not been agreed by all stakeholders and did not
cover non-system issues adequately.
• There was substantial communication outside of agreed channels.
The above aspects have received significant recent attention, and have consequently led
to a much improved and aligned systems engineering process. In particular, the following
systems engineering framework has been adopted:
• Three levels of detail are required for specification and testing. These reflect, broadly,
the user’s, the designer’s and the developer’s involvement.
• Acceptance should be tested against the appropriate level of specification as follows:
o User acceptance is tested formally against user requirements
o Factory acceptance, site acceptance and system tests are performed against
the systems requirements specification
o Internal tests (unit tests) are performed against detailed design documentation
• Development should ideally also proceed with client-endorsed acceptance tests
already available.
• The concept of measured releases with traceability to an applicable set of
specifications, requirements, and acceptance tests is of great importance.
• Configuration control and management need not include the unit designs or unit
acceptance tests (unless requested by the client).
The systems development success is, to a large extent, dependent on efficient issue
resolution, and a number of guidelines for achieving this efficiency were considered.
In basic terms, quality assurance should measure the conformance of actual events to the
processes, benchmarks and specifications applicable to the events. In the context of the
eWQMS, quality assurance covers a number of such standardized processes and
specifications.
Data quality was a focus area, and was assessed against three main frameworks:
• Data validity – measuring, from a systemic point of view, the integrity of data in a
database, the completeness of the data, whether it meets range criteria, and similar,
mostly automated, methods of evaluation (i.e. best practice vs. current outcome).
• Content (meaning) – assessing whether the content of the data tables in eWQMS
meet with the requirements expressed by DWA and a systemic assessment of potential
content-related errors (i.e. internal requirement vs. current outcome).
• Accuracy – assessing whether data have been correctly entered or measured. This
was not assessed in detail, given that without local knowledge it is not possible to
determine this accuracy automatically or by inspection.
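The automated data validity checks described above can be illustrated with a short sketch. Note that the record layout, field names and range criteria below are hypothetical illustrations only; they are not the actual eWQMS schema, and the bounds shown are not SANS 241 limits.

```python
from datetime import date

# Illustrative range criteria for a few determinands (example bounds only,
# not SANS 241 limits).
RANGE_CRITERIA = {
    "pH": (0.0, 14.0),
    "Turbidity (NTU)": (0.0, 1000.0),
}

REQUIRED_FIELDS = ("sample_point", "determinand", "sample_date", "value")

def validate_transaction(txn):
    """Return a list of validity issues for one water quality transaction."""
    issues = []
    # Completeness: every required field must be present and non-empty
    for field in REQUIRED_FIELDS:
        if txn.get(field) in (None, ""):
            issues.append(f"missing {field}")
    # Range criteria: the measured value must fall within configured bounds
    bounds = RANGE_CRITERIA.get(txn.get("determinand"))
    if bounds and txn.get("value") is not None:
        lo, hi = bounds
        if not (lo <= txn["value"] <= hi):
            issues.append(f"value {txn['value']} outside range {bounds}")
    return issues

txn = {"sample_point": "SP-001", "determinand": "pH",
       "sample_date": date(2009, 11, 23), "value": 17.2}
print(validate_transaction(txn))  # ['value 17.2 outside range (0.0, 14.0)']
```

In practice such checks would run automatically on each data submission, with flagged transactions routed into the query-tracking process described later in this paper.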
When considering the above, a key question is: “What constitutes acceptable data
quality?” This is not a simple question, because the answer depends at least on the
application of the data and on the volume of data. Very few references for data quality
assessment could be found, and the benchmark below was developed using literature
references and a summary of the independent reviewer's experience.
The following key outcomes were noted at the time of the independent review:
1 A transaction is a water quality measurement for a specific determinand, for a specific sample point, for a specific date, as measured in a specific laboratory.
Aspect: Referential Integrity (cont) – % Correct
• Wrong Water Type Assigned: 99.966%
• Transaction Entries – Empty Keys: >99.999%
• Transaction Entries – Null without reason: 100.000%
• Transaction Entries – No Matching Determinand: 100.000%
• Transaction Entries – No Matching Sample: 99.958%
• Transaction Entries – No Matching Analysis: 100.000%
• Sample Entries – No Sample Point: 99.992%
• Sample Points – No Area: 100.000%
Aspect: Content (meaning) – Content is not satisfactory when measured against the
requirements provided by DWA, or against systemic benchmarks:
• Sample Points – robustly implemented (ease of making errors)? Poor2
• Sample Point Classification – treated water/untreated water and treatment works,
boreholes, etc? Poor3
• Laboratory References – well populated for analysis results? Poor4
• Laboratory Methods – well populated for analysis results? Poor5
• Area References/Assignment – reference structure is recursive but a simpler area
reference is required: Adequate/Poor6
• Determinand References – robustly implemented (ease of making errors): Good
Aspect: Accuracy – This is not easily measured automatically, but confidence can be
improved through attention to the above two aspects7
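Referential integrity checks such as "Transaction Entries – No Matching Determinand" and "No Matching Sample" lend themselves to automation. The sketch below is a minimal illustration of this kind of test; the reference tables, field names and sample data are hypothetical and do not reflect the actual eWQMS schema.

```python
# Hypothetical reference tables (the real eWQMS schema will differ).
determinands = {"pH", "Turbidity", "E. coli"}
sample_points = {"SP-001", "SP-002"}

transactions = [
    {"id": 1, "determinand": "pH", "sample_point": "SP-001"},
    {"id": 2, "determinand": "Colour", "sample_point": "SP-001"},    # no matching determinand
    {"id": 3, "determinand": "Turbidity", "sample_point": "SP-999"}, # no matching sample point
]

def referential_integrity(transactions, determinands, sample_points):
    """Flag transactions whose keys have no matching reference entry and
    report the percentage of records passing each check."""
    n = len(transactions)
    no_det = [t["id"] for t in transactions if t["determinand"] not in determinands]
    no_sp = [t["id"] for t in transactions if t["sample_point"] not in sample_points]
    return {
        "no_matching_determinand": no_det,
        "no_matching_sample": no_sp,
        "pct_determinand_correct": round(100.0 * (n - len(no_det)) / n, 3),
        "pct_sample_correct": round(100.0 * (n - len(no_sp)) / n, 3),
    }

report = referential_integrity(transactions, determinands, sample_points)
```

The "% Correct" figures in the table above are exactly this kind of ratio, computed over the full transaction set rather than a handful of example records.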
Currently, key stakeholders are informed on a weekly basis of these data checks and
progress in addressing data queries. Examples of outputs communicated to these
stakeholders are shown below.
[Chart: eWQMS Data – Queries Raised with WSAs; number of queries raised per day, 1 November 2009 to 1 December 2009]
Figure 7: Tracking data queries raised with WSAs (as at 23 November 2009)
[Chart: Percentage Data Queries Per Province, August – November 2009: EC 19%, KZN 20%, WC 17%, LP 14%, NC 12%, MP 10%, NW 5%, FS 2%, GP 1%]
[Chart: Number of queries by category: total queries (August to November); queries followed-up with WSAs; queries confirmed as correct values; queries confirmed as incorrect values and corrected; queries remaining (<1 month old); queries remaining (1 – 2 months old); queries remaining (2 – 3 months old); queries remaining (>3 months old)]
Figure 9: Tracking the status of data queries with WSAs (as at 23 November 2009)
Besides the introduction of improved error trapping and resolution (by both systems and
people) described above, a number of usability improvements to data capture will shortly
be introduced. In particular, consideration is given to:
• On-screen validation including (i) future dates not allowed, (ii) not applicable
negative values not allowed and (iii) informal bounds (warning messages)
• Automated validation when loading spreadsheet/csv files
• Inclusion of a data status (to provide greater confidence in data quality)
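The three on-screen validation rules listed above can be sketched as follows. The determinand names, the set of non-negative determinands and the informal bounds are illustrative assumptions for this example, not the actual eWQMS configuration.

```python
from datetime import date

# Illustrative configuration (not actual eWQMS settings).
INFORMAL_BOUNDS = {"pH": (6.0, 9.0)}           # warning-only ("informal") bounds
NON_NEGATIVE = {"pH", "Turbidity", "E. coli"}  # negative values physically impossible

def check_entry(determinand, value, sample_date, today):
    """Return (errors, warnings) for a single captured result."""
    errors, warnings = [], []
    if sample_date > today:                        # (i) future dates not allowed
        errors.append("sample date is in the future")
    if determinand in NON_NEGATIVE and value < 0:  # (ii) inapplicable negatives rejected
        errors.append("negative value not allowed")
    bounds = INFORMAL_BOUNDS.get(determinand)      # (iii) informal bounds give a warning
    if bounds and not (bounds[0] <= value <= bounds[1]):
        warnings.append(f"value {value} outside typical range {bounds}")
    return errors, warnings

errors, warnings = check_entry("pH", 10.5, date(2009, 11, 23), today=date(2009, 11, 30))
```

The distinction between errors (which block capture) and warnings (which allow capture but prompt the user to confirm) is what makes the informal bounds usable without rejecting genuine but unusual measurements.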
The above sections have shown that significant progress and improvements have been
made to the eWQMS and its alignment with the BDS, thus assisting with improved water
quality management throughout South Africa. The independent review process has shown,
however, that challenges remain. Many of the recommendations for improvement have
already been implemented and several mitigation measures are underway. The following
status is therefore noted:
Table 4: Key outcomes from independent review process and issues requiring attention
Data Quality
• Investigation of and corrections to data determined to be erroneous or suspect:
Process to continuously monitor the status of data and to follow up on suspect values
is in place. Historical data marked as suspect has been corrected. Improved validation
is being developed.
• Improvements to the content of data: Systems specification to implement Laboratory
References is complete, development required. Follow-up on misallocated sample
points is underway. Water type alignment between eWQMS and BDS has not started.
• Improved confidence level and accuracy: Processes such as validation, confirmation
of suspect values, and others assist with defining the confidence level of the data.
Governance
• Improvements to strategic level aspects/institutional alignment: Process has not
started.
• Improvements to policies and guidelines: Several policies, guidelines, and processes
have been defined, but monitoring has not explicitly started.
• Improvements to the planning and monitoring of execution: Financial and high-level
task monitoring is in place, but needs to extend to systems engineering and issue
resolution monitoring.
Systems Engineering
• Proposal of and agreement on a process: Proposals have been made and formally
adopted.
• Definition of roles and responsibilities: This must still be formally agreed.
• Creation of a mutually accessible repository: This must still be done.
• Implementation: New systems development tasks follow the proposed process.
Issue Resolution
• Agreement on a mechanism: Proposals have been made and have been adopted.
• Implementation: Portal has been provided and is in regular use.
• Agreement to classification/processes: Proposals have been made but must still be
formally adopted by all stakeholders.
In addition to the above, and of equal importance, is the need for on-going collation and
prioritisation of system development requirements for the benefit of WSAs and the water
sector. It is envisaged that new features/functions will continue to be introduced on at least
a bi-annual (6-monthly) basis. Noted examples include the need for:
• Enhanced security
• Improved data management (e.g. data input scheduling, data submission tracking)
• Enhanced laboratory referencing of data
• Enhanced tracking of data source (e.g. laboratories, WSA vs. DWA, etc)
• Usability improvements (i.e. continue to make eWQMS easy and simple to use)
• Enhanced data/information displays and reports
• Enhanced infrastructure components (e.g. classification of water/wastewater
treatment works, diagrammatically linking infrastructure components)
• Web-enablement of additional water system assessments
• Improved WSA based public website
• Enhanced in-field measurement data capture (e.g. operational monitoring and
analysis methods and data entry via mobile phone application)
• Continuous alignment with current and new BDS requirements
• Improved data exchange between eWQMS, DWA systems and other systems
• Development or integration of other water services related modules (asset
management, human resources, operations and management, etc)
• Integration of new SANS 241 requirements
In order to ensure that the eWQMS remains relevant, the above aspects should be
considered in subsequent projects.
13. ACKNOWLEDGEMENTS
The entire South African Water Sector, including WSAs, DWA, IMESA, WRC, SALGA, etc,
who provide valuable inputs to improve the eWQMS, is thanked for its important
contributions to the continued success of the eWQMS initiative.
14. REFERENCES
1. Kahn, B.K., Strong, D.M. and Wang, R.Y. (2002) Information Quality Benchmarks:
Product and Service Performance, Communications of the ACM, Vol. 45, pp. 184–192,
April 2002.
2. MBV Equsys (2009) Review of eWQMS and Its Interaction with DWQRS. Prepared for
the Department of Water Affairs and Emanti Management, October 2009.
3. de Souza, P.F., Wensley, A., Manus, L. and Delport, E. (2009) Electronic Water Quality
Management System: New Developments and Direction. Paper presented at the 2nd
Drinking-Water Quality Management Conference, Port Elizabeth, 10 – 13 May 2009.