WISA2010-P035


ELECTRONIC WATER QUALITY MANAGEMENT SYSTEM


(eWQMS) NEW DEVELOPMENTS: USE OF A SYSTEMS
ENGINEERING PROCESS TO AID IMPROVED DATA QUALITY
ASSURANCE/QUALITY CONTROL
PF de Souza, W Hugo (1), A Wensley (2) and K Kuhn (2)
Emanti Management, PO Box 1264, Stellenbosch, 7599, South Africa
[email protected], Tel: +27 21 880 2932, Fax: +27 21 880 2931
(1) MBV Equsys
(2) Water Services Directorate, Department of Water Affairs

ABSTRACT
The Water Services Authority (WSA) orientated electronic Water Quality Management
System (eWQMS) has been shown to assist WSAs to meet their responsibilities, improve
water quality awareness, build capacity, and meet the Department of Water Affairs'
(DWA) need to monitor and regulate the operation of WSAs in a proactive, cooperative
governance fashion. The implementation of the eWQMS has been supported by the
stewardship of the Institute of Municipal Engineering of Southern Africa (IMESA) and
endorsement by the South African Local Government Association (SALGA). Since the
establishment and ongoing maintenance of the eWQMS at all WSAs in South Africa, the
eWQMS has undergone continuous sector directed development based on a list of
prioritised needs. Consequently, the eWQMS has addressed numerous water services
sector data and information requirements. In order to ensure that (i) the eWQMS remains
functional and relevant to WSAs, and (ii) DWA receives credible WSA water quality related
data and information, ongoing refinement and development of the eWQMS is necessary.
This has included ensuring alignment and associated data transfer between the eWQMS
and DWA’s Blue Drop System (BDS). In particular, to enable effective regulation,
appropriate data Quality Assurance/Quality Control (QA/QC) processes are essential.
Using a systems engineering approach, the QA/QC process for the eWQMS-BDS has
included review of value integrity, referential integrity and statistical integrity, resulting in
the implementation of further data validation measures to enhance data credibility.
Considering the above, this paper will present the successful systems engineering process
followed and highlight new eWQMS features implemented to further aid users with
ensuring provision of credible water quality data/information.

1. INTRODUCTION

The electronic Water Quality Management System (eWQMS) has been established and
maintained at all Water Services Authorities (WSAs) in South Africa for a period of
approximately 3 years. The eWQMS is used by WSAs to improve management of water
quality (and associated water services). In addition, the eWQMS transfers data to the
Department of Water Affairs’ (DWA) Drinking Water Quality Regulation System (DWQRS),
now known as the Blue Drop System (BDS), on a regular basis. This data is used by DWA
to both monitor and regulate performance of WSAs.

The eWQMS has undergone continuous sector-directed development and has
consequently addressed numerous water services sector data and information
requirements. Indeed, as the eWQMS initiative gains momentum and success with
WSAs, the number of sector requested/suggested developments grows. A project was therefore
initiated to ensure that the existing key activities of the DWA-IMESA-SALGA eWQMS
initiative continue to be supported, and that the platform be further expanded to extend the
benefits to other related aspects of Water Services. By DWA proactively and formally
supporting and overseeing eWQMS system modifications, (i) DWA receive credible WSA
water quality related data and information, and (ii) the eWQMS remains functional and
relevant to WSAs. The focus of recent project activities has been on improving eWQMS-
BDS data alignment via review of systems engineering, systems integration and data
quality assurance/quality control. Importantly, this process has included an independent
review of both the eWQMS and BDS, with recommendations where processes,
procedures, etc can be improved.

Considering the above, this paper will show the process followed, provide feedback
regarding accomplishments to date, and highlight future needs.

2. THE ELECTRONIC WATER QUALITY MANAGEMENT SYSTEM (EWQMS)

The eWQMS is a well proven comprehensive Water Quality Management tool, which has
been successfully used by WSAs, Regional and National DWA offices, and the public to
manage water quality. The eWQMS is a novel Open Source Software based system which
is accessible via the internet (www.wqms.co.za), and is a very useful means for allowing a
range of participating parties (including WSAs, Provincial and National Government, etc) to
guide the tracking, reviewing and improving of water quality. Importantly, the eWQMS has
been developed in a “bottom up” approach with WSAs, IMESA, DWA and the Water
Research Commission (WRC). Features of the eWQMS include (a) Data input (via
internet, spreadsheet or import from LIMS), (b) Management Dashboard (highlights
sample sites satisfying and/or failing water quality requirements), (c) Compliance Overview
(summary of legislative compliance), (d) Data Analysis (generate tables and graphs), (e)
Reports (archive of WQM reports), (f) Summary Reports (automatically generated reports),
(g) Information (water related information and references), (h) Infrastructure (capture
details of water system), (i) Administration (configure and manage system set-up) and (j)
Risk Toolbox (perform self-assessments of WQM, water supply system infrastructure, etc).

The success of the eWQMS initiative can largely be attributed to the approach adopted
during eWQMS development and implementation, including:

• Raising awareness with regard to Water Quality Management
• Building on existing Good Practice (i.e. not counter-productive)
• Bottom-up approach (i.e. the system must be useful to users)
• A proven system (easy to use, robust, reliable, secure)
• Driving progressive improvement in water quality
• Enabling intervention in areas facing public health threats
• Providing strategic data related to the quality of water services to WSAs, DWA, etc
• Satisfying WSA Governance Requirements
• Supporting DWA's regulatory function and satisfying other role player requirements
• Undergoing iterative enhancements via WSA and sector feedback

In particular, understanding the needs of WSAs and sector partners has led to significant
system modification/development to ensure that users' needs are continuously met. The
following diagram summarises the context of the eWQMS and its environment.
Figure 1: eWQMS environment and associated stakeholders

The eWQMS has been deployed to all WSAs in South Africa, and is provided routinely
with water quality data, either directly through manual (web-based) capture of data, or
through automated methods (e.g. Laboratory Information Systems (LIMS)). The WSAs are
supported in use of the system by the eWQMS Team through a network of provincial
infrastructure. The main objective of this initiative, historically, has been the improved
management of water resources in a municipal environment, and the resulting success
has been recognized locally and internationally.

3. INDEPENDENT REVIEW PROCESS: GOVERNANCE FRAMEWORK

The focus of recent activities has been on improving eWQMS-BDS data alignment via
review of systems engineering, systems integration and data quality assurance.
Importantly, this process has included an independent review of both the eWQMS and
BDS, with recommendations where processes, procedures, etc can be improved. Several
frameworks exist for the management and governance of information systems. It was
proposed and accepted that a governance model based on the Information Technology
Infrastructure Library (ITIL) (which is widely adopted in the United Kingdom and is also
fairly commonly implemented in South Africa) be used. The proposed governance
framework is scaled down considerably to align with the specific requirements of eWQMS.

Figure 2: eWQMS and BDS governance framework


The above comprehensive governance framework covers the typical aspects by which
information systems are managed and operated. Considering the above, the main focus
to-date has been on both strategic alignment and the systems engineering function (see
orange sections in the previous figure). In particular, the following critical aspects (and
associated tasks) of the governance framework were initially considered:

1. Systems Engineering Process
• Agree a systems engineering process and align with stakeholder comments
• Assess the compliance of the current situation, and raise issues if gaps exist
• Implement mitigation for serious deficiencies
2. Issue Resolution
• Establish an agreed issue resolution mechanism with all important stakeholders
• Survey and formalize all known issues
• Categorize and plan issue mitigation once root causes/issue categories are known
• Monitor progress
3. Quality Assurance
• Define a set of critical user requirements and performance specifications
• Assess the system in respect of critical user requirements, and raise issues if gaps
exist
• Implement mitigation and monitoring measures as required

The above aspects will be discussed in greater detail in the following sections.

4. IMPROVED SYSTEMS ENGINEERING PROCESS

As complex, distributed systems (such as the eWQMS) require more coordination and
formality, and as the eWQMS Team needs to maintain, use, and control the knowledge
base provided by such an approach, it was recommended that the eWQMS systems
engineering function be more formalized (i.e. moving away from the less formal 'agile
programming' style). It was stressed, however, that this does not mean the systems
engineering function must be over-elaborate or cumbersome; the main objective is to
achieve an acceptable level of maturity using a minimum, adequate level of formality. The
systems engineering function should operate on four levels of detail and reach, as follows:

1. Strategic level systems engineering function (e.g. translating organizational strategy,
   business requirements, etc into systems architecture (e.g. Business Requirements
   Specifications (BRS)), evaluating the impact of environmental drivers, assisting with
   alignment of systems with organizational objectives, periodic audit of technical aspects
   of delivery, and assistance with risk assessment and mitigation).
2. Policy/abstract level systems engineering (e.g. establishment of a systems engineering
process/methodology with objectives of making sure that processes are based on
international best practice including standardized methodology/approach, templates,
etc and that processes are implemented properly with associated documentation in
place to guide on-going development and interoperability with external systems).
3. Planning-level tasks, related to the life cycle elements of the methodology or systems
engineering process (e.g. documenting impact assessments, developing User
Requirements Specifications (URS) documentation, developing and/or researching
systems specifications, developing test procedures to prove compliance with user
requirements and system specifications (e.g. User Acceptance Tests (UAT), etc).
4. Execution and implementation-level tasks (e.g. establishment of dictionaries and
system meta-data, configuration management, process definitions, issue resolution
mechanisms, delivery against agreed system specifications, etc).
The funding level of a project has a direct bearing on the amount of time and money that
can be spent on systems engineering. It must be recognised, though, that the systems
development effort results in an asset, and systems engineering has, as its main objective,
to make sure that the asset meets requirements, can be safeguarded in future, and that
development risks are adequately addressed. Considering this, a balance is required in
terms of two conflicting objectives:

1. There is an obvious requirement of minimum diligence, so that:
   a. A record of systems construction is available in sufficient detail to be transferred
      to a third party
   b. Systems developers have sufficient detail to construct a system that can meet
      user requirements in an objective assessment
   c. Users have sufficient detail to confirm that the documented system requirements
      and specifications meet their needs
2. On the other hand, the systems engineering function and its level of detail are strongly
   influenced by the size of a project and the available budget.

Considering the above, the main findings from the independent review process were:

• Systems engineering documentation in the normal sense of the word, and measured
against what can be seen as good practice, did not exist, with documentation for user
requirements, specifications, etc largely in e-mail and issue tracker attachments.
• It was not possible for a third party to test/verify that the eWQMS complies with DWA
requirements, since no documentation was under configuration control and releases of
the system did not reflect which version of specification/requirements was being
implemented (both DWA and eWQMS Team documents).
• Test schedules and formal acceptance of test results were lacking.
• Issue resolution mechanisms had not been agreed by all stakeholders and did not
cover non-system issues adequately.
• There was substantial communication outside of agreed channels.

The above aspects have received significant recent attention and have consequently led
to a much improved and aligned systems engineering process. In particular, the following
systems engineering framework has been adopted:

Figure 3: eWQMS (and associated BDS) systems engineering framework


Considering the above, the improved process requires the following:

• Three levels of detail are required for specification and testing. These reflect, broadly,
the user’s, the designer’s and the developer’s involvement.
• Acceptance should be tested against the appropriate level of specification as follows:
o User acceptance is tested formally against user requirements
o Factory acceptance, site acceptance and system tests are performed against
the systems requirements specification
o Internal tests (unit tests) are performed against detailed design documentation
• Development should ideally also proceed with client-endorsed acceptance tests
already available.
• The concept of measured releases with traceability to an applicable set of
specifications, requirements, and acceptance tests is of great importance.
• Configuration control and management need not include the unit designs or unit
acceptance tests (unless requested by the client).

5. IMPROVED ISSUE RESOLUTION PROCESS

Systems development success depends, to a large extent, on efficient issue resolution.
To achieve this efficiency, the following guidelines were considered:

• Issues need to be contextualized in terms of their formality. There is a large difference
between an issue raised by a test team under test conditions and an informal issue
raised by a user under non-test conditions. Specifically, the contractual implications are
very different in terms of the responsibilities of the service provider/systems developer.
• Issues also need to be contextualized in terms of severity. Again, there is a big difference
between an issue (raised in a live environment) that prevents users from executing
system functions and an issue raised by testers in a non-live environment.
• Furthermore, the root cause of the issue is of vital importance. Although the details of
the root cause terminology may differ, the central point remains: issues are often not
due to programming error, but to faulty data, mis-specification of requirements,
inappropriate use of the system, etc.
• Finally, issues must be assigned to releases on the basis of severity. This is a critical
component of prioritized mitigation and improvement of systems. The following protocol
has been accepted and implemented by both DWA and the eWQMS Teams.

Table 1: Categorizing Issues

Priority  Name            Description                                            Release
1         Blocker         Show-stopping issue – no one can use the              Ad-Hoc
                          service/application
2         Critical Issue  Use of the service/application is severely            Ad-Hoc
                          impacted but some functions are available
3         Major Issue     A specific, important function fails/is unavailable   Next Minor Release
4         Minor Issue     Impacts on user experience but the functionality      Next Major Release
                          is available; workarounds are required but the
                          service/application is functional
5         Optional        Will improve the service/application but is not       Some Major Release
                          essential for all users
6         Trivial         Limited improvements for some users, or               Never
                          disagreement on benefits
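The release-assignment protocol in Table 1 can be sketched as a simple lookup. The function and data structure below are illustrative assumptions, not the actual eWQMS/BDS tracker implementation:

```python
# Hypothetical sketch of the Table 1 categorization protocol:
# each issue priority maps to a name and a target release.
PRIORITY_RELEASE = {
    1: ("Blocker", "Ad-Hoc"),
    2: ("Critical Issue", "Ad-Hoc"),
    3: ("Major Issue", "Next Minor Release"),
    4: ("Minor Issue", "Next Major Release"),
    5: ("Optional", "Some Major Release"),
    6: ("Trivial", "Never"),
}

def assign_release(priority: int) -> str:
    """Return the release an issue is scheduled for, given its priority."""
    name, release = PRIORITY_RELEASE[priority]
    return release
```

In this scheme, only Blocker and Critical issues trigger ad-hoc releases; everything else is batched into planned releases, which is what makes the prioritized mitigation described above workable.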
Consequently, an appropriate issue management system that is visible to all participants
has been established to assist with issue resolution (see below).

Figure 4: DWA Water Services Issue Tracker for eWQMS-BDS issues/queries

6. IMPROVED QUALITY ASSURANCE (QA)

In basic terms, quality assurance should measure the conformance of actual events to the
processes, benchmarks and specifications applicable to the events. In the context of
eWQMS, quality assurance covers a number of such standardized processes and
specifications (see below):

Table 2: Quality assurance aspects

Aspect               Process/Procedure
Data Quality         Value Integrity, Referential Integrity and Statistical Integrity
Integration Quality  Verification of Delivery, Verification of Process and
                     Assessment of Compliance with Specification
Development Quality  Assessment of Software Engineering Quality
Delivery Quality     Assessment of Systems Engineering Quality
Governance Quality   Assessment of maturity in respect of major perspectives

Considering the above, the following is noted:

• Data quality has been a focus area for the project.
• Integration quality has been improved via better monitoring of data imports, and the
flagging and resolution of issues.
• Both development quality and delivery quality have been addressed via assessment
and improvement to systems engineering quality.
• Deficiencies in governance quality have been highlighted to all concerned parties.
These deficiencies are not from the perspective of diligence in terms of project
oversight and financial management, but from the perspective of alignment of
stakeholder requirements and translation of this alignment into a positive force for the
benefit of all. It is anticipated that an appropriate forum will be established where all
major stakeholder perspectives and requirements can be raised.

Data quality was a focus area, and was assessed against three main frameworks:
• Data validity – measuring, from a systemic point of view, the integrity of data in a
database, the completeness of the data, whether it meets range criteria, and similar,
mostly automated, methods of evaluation (i.e. best practice vs. current outcome).
• Content (meaning) – assessing whether the content of the data tables in eWQMS
meet with the requirements expressed by DWA and a systemic assessment of potential
content-related errors (i.e. internal requirement vs. current outcome).
• Accuracy – assessing whether data have been correctly entered or measured. This
was not assessed in detail, given that without local knowledge it is not possible to
determine this accuracy automatically or by inspection.
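The "mostly automated" data-validity checks described in the first framework above (completeness, range criteria, date validity) could be sketched as follows. Field names, the bounds structure and the issue messages are assumptions for illustration, not the actual eWQMS schema:

```python
from datetime import date

def check_transaction(txn: dict, bounds: dict) -> list:
    """Return a list of validity issues found in one transaction.

    txn    -- one water quality measurement record (assumed field names)
    bounds -- informal (low, high) bounds per determinand
    """
    issues = []
    # Completeness: every required field must be present.
    for field in ("sample_point", "determinand", "sample_date", "value"):
        if txn.get(field) is None:
            issues.append(f"missing {field}")
    value = txn.get("value")
    determinand = txn.get("determinand")
    # Negative values are invalid for concentration-type determinands.
    if value is not None and value < 0:
        issues.append("negative value")
    # Range criteria: compare against informal bounds, if defined.
    if value is not None and determinand in bounds:
        lo, hi = bounds[determinand]
        if not (lo <= value <= hi):
            issues.append(f"outside informal bounds [{lo}, {hi}]")
    # Date validity: a sample cannot be taken in the future.
    sample_date = txn.get("sample_date")
    if sample_date is not None and sample_date > date.today():
        issues.append("future date")
    return issues
```

A clean record returns an empty list; anything else is a candidate for the query-logging process described in Section 7.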

When considering the above, a key question is: “What constitutes acceptable data
quality?” This is not a simple question, because the answer depends at least on the
application of the data and on the volume of data. Very few references for data quality
assessment could be found, and the benchmark below was developed using literature
references and a summary of the independent reviewer's experience.

Table 3: Data quality acceptability

Assessment    Bounds     Reference/Comments
Excellent     99%+       Unusual level of quality (Kahn, B. et al., 2002)
Good          95%–99%    Intervention required for transaction or monitoring systems
Adequate      90%–95%    Suitable for decision support, planning and policy making
Intermediate  75%–90%    Requires more sensitivity analysis for decision support
                         systems; not usable in transaction systems
Poor          Below 75%  Unusable
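The benchmark bands in Table 3 translate naturally into a small classifier. The sketch below uses the table's boundaries; the function name and the treatment of exact boundary values are assumptions:

```python
def quality_band(percent_correct: float) -> str:
    """Classify a %-correct score into the Table 3 acceptability bands."""
    if percent_correct >= 99.0:
        return "Excellent"
    if percent_correct >= 95.0:
        return "Good"
    if percent_correct >= 90.0:
        return "Adequate"
    if percent_correct >= 75.0:
        return "Intermediate"
    return "Poor"
```

Applied to the value-integrity results in Table 4 (all above 99.8%), every check would fall in the "Excellent" band, consistent with the review's conclusion on data validity.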

The following key outcomes were noted at the time of the independent review:

Table 4: Outcome of data quality checks

Data Validity: Data quality from a validity perspective is high, and can be characterised
as excellent measured against Table 3 benchmarks.

  Value Integrity                                      % Correct
  • Informal Bounds: Drinking Water                    99.955%
  • Informal Bounds: Waste Water                       99.956%
  • Date Range Values                                  99.818%
  • Determinand Counts (duplicate determinands)        99.997%
  • Negative Values                                    99.994%

  Referential Integrity                                % Correct
  • Multiple Determinands                              99.926%
  • Empty Transactions (1)                             98.039%
  • Wrong Water Type Assigned                          99.966%
  • Transaction Entries – Empty Keys                   >99.999%
  • Transaction Entries – Null without reason          100.000%
  • Transaction Entries – No Matching Determinand      100.000%
  • Transaction Entries – No Matching Sample           99.958%
  • Transaction Entries – No Matching Analysis         100.000%
  • Sample Entries – No Sample Point                   99.992%
  • Sample Points – No Area                            100.000%

Content (meaning): Content is not satisfactory measured against the requirements
provided by DWA, or against systemic benchmarks.

  • Sample Points – robustly implemented (ease of making errors)?       Poor (2)
  • Sample Point Classification – treated water/untreated water and
    treatment works, boreholes, etc?                                    Poor (3)
  • Laboratory References – well populated for analysis results?        Poor (4)
  • Laboratory Methods – well populated for analysis results?           Poor (5)
  • Area References/Assignment – reference structure is recursive
    but a simpler area reference is required                            Adequate/Poor (6)
  • Determinand References – robustly implemented (ease of making
    errors)                                                             Good

Accuracy: This is not easily measured automatically, but confidence can be improved
through attention to the above two aspects (7).

Notes:
(1) A transaction is a water quality measurement for a specific determinand, for a specific
sample point, for a specific date, as measured in a specific laboratory.
(2) The Poor rating results mainly from the possibility that assigning the wrong water type
to a sample point invalidates all data associated with the sample point.
(3) The eWQMS structure for classification of sample points does not completely align with
the current BDS requirement.
(4) As the eWQMS adoption rate at WSAs has increased, the overall quality of laboratory
references has deteriorated, with many of the smaller WSAs not providing this information.
(5) eWQMS makes provision for storing this information, but it is poorly provided at
present, and the BDS requires some extensions.
(6) Area references for sample points are properly implemented in eWQMS, but do not
completely align with current BDS requirements.
(7) Specifically, all transactions that are followed up manually because of suspected
inaccuracy are now marked as having a higher confidence level (see Sections 7 and 8 for
more).

With reference to the above independent review, further improvements in data
quality/integrity were introduced; these are discussed in greater detail in the next section.

7. IMPROVED DATA QUALITY AND INTEGRITY

A number of improvements have been noted, including:

• Implementation of a Unique ID which uniquely identifies and traces each individual
  analysis performed by a WSA, loaded onto the eWQMS and transferred to the BDS
• Improved data checks
  o Daily access of the eWQMS File Transfer Portal (FTP) and data download
  o Daily check that the files contain expected data, and checks of file content
  o Regular checks of the BDS, including sample points, determinands and data displayed
  o Checks for possible wastewater sample points reflected as drinking water
  o Checks for/corrections of future dates (current/historical)
  o Checks for/corrections of negative values (current/historical)
  o Checks for/corrections of failing values (current/historical), with WSA follow-ups
  o Logging issues for resolution if it is suspected that an issue exists
• Data upload to BDS
  o Development of a BDS upload facility and regular upload of data by the eWQMS
    Team to the BDS, with feedback on successful data transfer
• Development and implementation of a QA database to assist with the above
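A Unique ID of the kind noted above could, for example, be derived from the fields that define a transaction (sample point, determinand, date and laboratory). The hashing scheme shown here is an assumption for illustration, not the actual eWQMS implementation:

```python
import hashlib

def transaction_id(sample_point: str, determinand: str,
                   sample_date: str, laboratory: str) -> str:
    """Derive a stable, traceable identifier for one analysis.

    The four fields follow the paper's definition of a transaction;
    normalisation (strip/upper) makes the ID insensitive to casing
    and stray whitespace in source data.
    """
    key = "|".join([sample_point.strip().upper(),
                    determinand.strip().upper(),
                    sample_date,  # assumed ISO format, e.g. "2009-11-23"
                    laboratory.strip().upper()])
    # A truncated SHA-1 digest gives a compact, deterministic ID.
    return hashlib.sha1(key.encode("utf-8")).hexdigest()[:16]
```

Because the same analysis always yields the same ID, a record can be traced from the WSA's upload through the eWQMS to the BDS, and duplicates can be detected on import.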

Currently, key stakeholders are informed on a weekly basis of these data checks and
progress in addressing data queries. Examples of outputs communicated to these
stakeholders are shown below.
[Figure: bar chart of the number of data queries raised with WSAs per day, 1 November
to 1 December 2009; daily counts range from 0 to 105 queries]
Figure 7: Tracking data queries raised with WSAs (as at 23 November 2009)
[Figure: pie chart of the percentage of data queries per province, August to November
2009: KZN 20%, EC 19%, WC 17%, LP 14%, NC 12%, MP 10%, NW 5%, FS 2%, GP 1%]
Figure 8: Tracking data queries per province (as at 23 November 2009)


[Figure: bar chart of the status of eWQMS "high value" data queries, August to November
2009, showing total queries, queries followed up with WSAs, queries confirmed as correct
values, queries confirmed as incorrect values and corrected, and queries remaining
(<1 month, 1-2 months, 2-3 months and >3 months old)]
Figure 9: Tracking the status of data queries with WSAs (as at 23 November 2009)

8. IMPROVED DATA CAPTURING MECHANISM

Besides the introduction of improved error trapping and resolution (by both systems and
people) described above, a number of usability improvements to data capture will shortly
be introduced. In particular, consideration is given to:

• On-screen validation including (i) future dates not allowed, (ii) not applicable
negative values not allowed and (iii) informal bounds (warning messages)
• Automated validation when loading spreadsheet/csv files
• Inclusion of a data status (to provide greater confidence in data quality)
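The planned automated validation of spreadsheet/csv uploads might look like the sketch below, where future dates and inapplicable negative values are rejected while informal-bounds breaches only raise warnings. Column names and the exact rejection/warning split are assumptions for illustration:

```python
import csv
import io
from datetime import date, datetime

def validate_csv(text: str, bounds: dict):
    """Validate uploaded csv rows; return (accepted, rejected, warnings).

    bounds maps a determinand name to an informal (low, high) range.
    """
    accepted, rejected, warnings = [], [], []
    for row in csv.DictReader(io.StringIO(text)):
        d = datetime.strptime(row["date"], "%Y-%m-%d").date()
        v = float(row["value"])
        if d > date.today():
            rejected.append((row, "future date"))       # hard error
            continue
        if v < 0:
            rejected.append((row, "negative value"))    # hard error
            continue
        lo, hi = bounds.get(row["determinand"], (float("-inf"), float("inf")))
        if not (lo <= v <= hi):
            warnings.append((row, "outside informal bounds"))  # soft warning
        accepted.append(row)
    return accepted, rejected, warnings
```

Separating hard errors from warnings mirrors the on-screen behaviour described above: impossible values are blocked outright, while unusual but possible values are flagged for follow-up rather than discarded.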

It is anticipated that the aforementioned functionality will be made available to eWQMS
users via an eWQMS release before the end of 2009.

9. FEATURES/FUNCTIONS TO IMPROVE WATER QUALITY MANAGEMENT

Besides the improvements to systems, processes and procedures described in the
preceding sections, a number of additional features/functions have been developed over
the last 12 months with assistance through this project. By including these, the
eWQMS has supported WSAs in obtaining Blue Drop-Green Drop Certification. This
includes assisting WSAs with the following aspects:

• Data submission to BDS – transferring data to the BDS on behalf of WSAs
• Data accuracy – running daily checks of data submitted to the eWQMS and flagging
potential issues of concern with WSAs
• Notification of water quality failures – by e-mail, viewing website or eWQMS Team
representative
• Water quality failure issue resolution – capturing corrective actions
• Monitoring programme details, laboratory details – easy for WSAs to capture
• Correct monitoring frequency of all towns/communities – display data available over
a 12-month period
• Sample points map – colour coded to indicate water quality failures and activated
by loading GPS co-ordinates for sample points
• Collecting sufficient samples – check that at least 1 sample per 10 000 persons
• DWQM Performance Assessment – display of BASIC WQM criteria
• Process Assessment/Audit – web enablement of WRC Infrastructure
Assessment/Audit Tools
• Asset Register – capture basic inventory/asset register
• Operational Measurements – WRC Operation Information Tool (OIT) and use of
mobile phone technology for capturing water quality data in the field.
• SANS 241 – Analysis and compliance tools to see status
• Templates Download – guidelines (e.g. WHO Water Safety Plan Manual) and
references (e.g. registration of works, registration of process controllers).
• eWQMS Helpdesk for eWQMS, water quality management, etc queries
• Green Drop System – most of the features/functions noted are already available for
the wastewater component of the eWQMS
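The "sufficient samples" check noted in the list above (at least 1 sample per 10 000 persons) can be sketched as follows. The exact eWQMS rule, monitoring period and rounding behaviour are assumptions for illustration:

```python
import math

def min_samples_required(population: int, per_persons: int = 10_000) -> int:
    """Minimum number of samples for a community of the given population,
    assuming at least one sample per 10 000 persons per period."""
    return max(1, math.ceil(population / per_persons))

def has_sufficient_samples(population: int, samples_taken: int) -> bool:
    """True if the monitoring programme meets the minimum sample count."""
    return samples_taken >= min_samples_required(population)
```

Note the floor of one sample: even the smallest community requires at least one sample per period, which is why `max(1, ...)` is applied before comparison.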

Figure 10: Examples of eWQMS supporting Blue Drop-Green Drop Certification

10. SUMMARY OF MAIN FINDINGS FROM INDEPENDENT REVIEW PROCESS

The above sections have shown that significant progress and improvements have been
made to the eWQMS and its alignment with the BDS, thus assisting with improved water
quality management throughout South Africa. The independent review process has shown,
however, that challenges remain. Many of the recommendations for improvement have
already been implemented and several mitigation measures are underway. The following
status is therefore noted:
Table 5: Key outcomes from independent review process and issues requiring attention

Aspect: Data Quality
• Investigation of and corrections to data determined to be erroneous or suspect –
  A process to continuously monitor the status of data and to follow up on suspect
  values is in place. Historical data marked as suspect has been corrected. Improved
  validation is being developed.
• Improvements to the content of data – The systems specification to implement
  Laboratory References is complete; development is required. Follow-up on
  misallocated sample points is underway. Water type alignment between eWQMS and
  BDS has not started.
• Improved confidence level and accuracy – Processes such as validation, confirmation
  of suspect values, and others assist with defining the confidence level of the data.

Aspect: Governance
• Improvements to strategic level aspects/institutional alignment – Process has not
  started.
• Improvements to policies and guidelines – Several policies, guidelines and processes
  have been defined, but monitoring has not explicitly started.
• Improvements to the planning and monitoring of execution – Financial and high-level
  task monitoring is in place, but needs to extend to systems engineering and issue
  resolution monitoring.

Aspect: Systems Engineering
• Proposal of and agreement on a process – Proposals have been made and formally
  adopted.
• Definition of roles and responsibilities – This must still be formally agreed.
• Creation of a mutually accessible repository – This must still be done.
• Implementation – New systems development tasks follow the proposed process.

Aspect: Issue Resolution
• Agreement on a mechanism – Proposals have been made and adopted.
• Implementation – A portal has been provided and is in regular use.
• Agreement to classification/processes – Proposals have been made but must still be
  formally adopted by all stakeholders.

In order to ensure continual improvement, the above aspects should be considered in
subsequent projects.

11. eWQMS DIRECTION: NEW DEVELOPMENT CONSIDERATIONS

In addition to the above, and of equal importance, is the need for on-going collation and
prioritisation of system development requirements for the benefit of WSAs and the water
sector. It is envisaged that new features/functions will continue to be introduced on at least
a bi-annual (6-monthly) basis. Noted examples include the need for:

• Enhanced security
• Improved data management (e.g. data input scheduling, data submission tracking)
• Enhanced laboratory referencing of data
• Enhanced tracking of data source (e.g. laboratories, WSA vs. DWA, etc)
• Usability improvements (i.e. continue to make eWQMS easy and simple to use)
• Enhanced data/information displays and reports
• Enhanced infrastructure components (e.g. classification of water/wastewater
treatment works, diagrammatically linking infrastructure components)
• Web-enablement of additional water system assessments
• Improved WSA based public website
• Enhanced in-field measurement data capture (e.g. operational monitoring and
analysis methods, and data entry via mobile phone application)
• Continuous alignment with current and new BDS requirements
• Improved data exchange between eWQMS, DWA systems and other systems
• Development or integration of other water services related modules (asset
management, human resources, operations and management, etc)
• Integration of new SANS 241 requirements

In order to ensure that the eWQMS remains relevant, the above aspects should be
considered in subsequent projects.

12. CONCLUSIONS AND WAY FORWARD

The independent review and assessment of the eWQMS-BDS highlighted a number of
deficiencies in current processes and practices, including gaps in (i) data quality, (ii)
integration quality, (iii) development quality, (iv) delivery quality and (v) governance
quality. Subsequently, both the eWQMS and BDS Teams have proactively addressed
these gaps,
with the successful systems engineering process resulting in a number of new eWQMS
features implemented to further aid users with ensuring provision of credible water quality
data/information. Although significant positive progress has already been made, a number
of steps still need to be implemented to ensure that accurate data continues to be
available for accurately determining and regulating the water quality management
performance of WSAs. These will be addressed in the near future.

13. ACKNOWLEDGEMENTS

The entire South African water sector (including WSAs, DWA, IMESA, WRC and SALGA),
whose members provide valuable inputs to improve the eWQMS, is thanked for its
important contributions to the continued success of the eWQMS initiative.

14. REFERENCES

1. Kahn, B.K., Strong, D.M. and Wang, R.Y. (2002) Information Quality Benchmarks:
Product and Service Performance. Communications of the ACM, Vol. 45, pp. 184-192,
April 2002.
2. MBV Equsys (2009) Review of eWQMS and Its Interaction with DWQRS. Prepared for
the Department of Water Affairs and Emanti Management, October 2009.
3. PF de Souza, A Wensley, L Manus and E Delport (2009) Electronic Water Quality
Management System: New Developments and Direction. Paper presented at the 2nd
Drinking-Water Quality Management Conference, Port Elizabeth, 10 – 13 May 2009.
