Data Integrity
Arjan Timmerman
Complete data
Default Strings are key to using consistent comments and proper reasons WHY
Reasons WHY for ~23 categories to be filled (2 categories are for Sign Off)
They help the user identify the reason WHY a change was proposed
Reasons WHY are hard to write down!
– Remember: the who, what and when are already captured by the system
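The idea of predefined default strings can be sketched in code. This is a hypothetical illustration only: the category names, reason texts and helper function are assumptions for the sketch, not Empower identifiers.

```python
# Hypothetical sketch: enforcing predefined "reason WHY" default strings
# for change comments. Categories and reason texts are illustrative.

DEFAULT_REASONS = {
    "processing": [
        "Baseline incorrectly placed",
        "Wrong peak identified",
        "Integration parameters adjusted per SOP",
    ],
    "sample": [
        "Sample weight corrected per lab notebook",
    ],
}

def build_comment(category: str, default_reason: str, detail: str = "") -> str:
    """Return a change comment that always starts with an approved default string."""
    if default_reason not in DEFAULT_REASONS.get(category, []):
        raise ValueError(f"{default_reason!r} is not an approved reason for {category!r}")
    return f"{default_reason}. {detail}".strip()
```

Forcing the comment to start with an approved string keeps the reason WHY consistent across users while still allowing free-text detail.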
[Diagram: roles around the regulated company – System Owner, Department Analysts, Manager, Vendors / Consultants, Reviewers, Quality IT]
Use an overview sheet to identify the differences
Over what time period / which projects would you need to compare data – stability projects?
The privileges to lock and unlock channels are separate, so you can control when results are reprocessed.
Different locks are possible
[Diagram: Empower security model – User Types contain Privileges; Users are assigned a User Type and are part of User Groups; Users (as owner) and User Groups have access to Projects, Systems and Nodes]
Make sure the privileges for creating, adjusting and copying methods are set properly
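A privilege check like the one above can be modelled as a simple matrix of user types to privileges. The role names and privilege identifiers below are assumptions for illustration, not actual Empower privilege names.

```python
# Illustrative user-type privilege matrix. Role and privilege names are
# assumptions for this sketch, not Empower's actual identifiers.

PRIVILEGES = {
    "Analyst": {"acquire", "process"},
    "Method Developer": {"acquire", "process", "create_method",
                         "alter_method", "copy_method"},
    "Reviewer": {"review", "sign_off"},
}

def can(user_type: str, privilege: str) -> bool:
    """True if the given user type holds the named privilege."""
    return privilege in PRIVILEGES.get(user_type, set())
```

Reviewing such a matrix periodically makes it easy to spot analysts who can alter or copy methods when they should not.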
A well defined SOP for processing will help, but does that project show all the data?
[Diagram: Project A – Sample with Result Sets and Results (1–3), where one Result is Reported; Project B – initial and repeat Sample Sets with Result Sets and Results, where some data is not reported or addressed]
Data review is more effective with an outlined process to determine whether there is a problem
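One check in such an outlined review process is whether every result in a project was either reported or addressed. A minimal sketch, assuming a flat export of results with illustrative field names:

```python
# Hypothetical review helper: flag samples that carry results which were
# never reported or addressed. The record layout is an assumption for
# this sketch, not an Empower export format.

def unreported_results(results):
    """results: list of dicts with 'sample', 'result_id' and 'reported' keys.
    Returns the sorted sample names that have at least one unreported result."""
    flagged = set()
    for r in results:
        if not r["reported"]:
            flagged.add(r["sample"])
    return sorted(flagged)
```

Running such a check per project surfaces the "Project B" situation above, where repeat injections exist but their results were never reported.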
©2017 Waters Corporation COMPANY CONFIDENTIAL 32
Acquiring Samples SOP
FDA inspectors are being trained to treat multiple results as an indication that users are reintegrating into acceptance.
However, this conclusion can only be confirmed by looking at the actual
integration for each iteration
– Good documentation of “why” you reprocessed is essential
– Getting it right first time, all the time, is unrealistic
o If the data looks too good, it probably is
[Chromatograms: seven example pairs of integrations labelled "Good" vs "Not Good" – peaks at 1.249 and 2.132 minutes, absorbance (AU) vs time (minutes)]
Why?
– To uncover possible cases of fraudulent behaviour
o Reprocessing data multiple times
o Altering metadata to make results pass
o Hiding or altering metadata on reports sent to QA
o Uncovering persistent suspicious behaviour around security of data
o Ensuring only authorised users have access to certain functionality
o Deletion of data
o Altering system policies / configuration / settings without change control procedures
Biggest issue: audit trails are often a log of all activity (to comply) and are not always designed for easy review
– But it is expected that inspectors will look at the audit trails
– Adding Audit trails to reports is not practical or sufficient
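Because the raw trail logs everything, a practical review step is to filter it down to the event types worth a closer look. The record format and event names below are assumptions for the sketch; a real Empower audit-trail export will differ.

```python
from collections import Counter

# Sketch of condensing a raw audit-trail export into a reviewable summary.
# The (user, action, object) tuple format and the action names are
# assumptions for illustration, not Empower's export schema.

REVIEW_EVENTS = {"Delete", "Reprocess", "Policy Change"}

def summarize(entries):
    """entries: iterable of (user, action, object) tuples.
    Returns a Counter of (user, action) pairs for review-worthy actions."""
    return Counter((user, action) for user, action, _ in entries
                   if action in REVIEW_EVENTS)
```

A summary like this turns "a log of all activity" into a short list of who reprocessed or deleted what, and how often, which is what a reviewer actually needs.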
[Diagram: audit-trail review in Empower – the System Audit Trail, instrument acquisition logs, processing audit trails, archived calibration curves and all calculated peak values are created by Empower; individual and manual results are verified against the e-data and compared with what was approved, with display links in Review and the Audit Viewer (using View As…)]
Audit trails tell us WHO did WHAT, WHEN and WHY (when defined by
the user comments)
They have two primary purposes:
– Give a history to the data, to help decide if it can be trusted
– They should deter wrongdoing
o Without review, they are not a deterrent
Make sure the privileges for creating and adjusting Reports are set properly
If everybody is creating reports...
– What is shown on the report?
– Do you rely on these results?
– Are you using electronic signatures?
– Do you need to insert audit trails?
Guidance on Backup
– EU and PIC/S Annex 11: Section 7, Data Storage
– MHRA guidance: Data Retention section, Backup
– FDA guidance: Question 1 (a)
– WHO guidance: Annex 5, Retention of Original Records
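A daily backup procedure is only useful if someone verifies it ran. A minimal sketch of such a check, assuming a hypothetical backup directory and file-naming convention (the `empower_YYYYMMDD.bak` name is an invented example):

```python
import datetime
import hashlib
import os

# Illustrative daily-backup verification. The directory layout and the
# "empower_YYYYMMDD.bak" naming convention are assumptions for this sketch.

def backup_ok(backup_dir: str, date: datetime.date) -> bool:
    """True if a non-empty backup file for the given date exists."""
    path = os.path.join(backup_dir, f"empower_{date:%Y%m%d}.bak")
    return os.path.isfile(path) and os.path.getsize(path) > 0

def checksum(path: str) -> str:
    """SHA-256 of a backup file, for comparing a restore against the original."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Recording the checksum alongside the backup gives an objective way to demonstrate, at inspection, that the archived copy still matches what was written.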
Do you Archive your Empower data?
Guidance on Archive
– EU and PIC/S Annex 11: Section 17, Archiving
– MHRA guidance: Data Retention section, Archive
– FDA guidance: Question 1 (a), a bit cryptic
– WHO guidance: Annex 5, Retention of Original Records
Inspectors want to see that you have implemented the controls that
Empower provides for you
– Unique Usernames for audit trails
– Default strings for reasons WHY you change objects
– Password expiry and history
– Limited access to delete objects in the database
– Audit Trail review
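The password-expiry control in the list above amounts to a simple date comparison. A sketch, where the 90-day window is an illustrative value and not a recommendation from the slides:

```python
import datetime

# Sketch of a password-expiry check like the policy control listed above.
# The 90-day window is an illustrative assumption, not a recommendation.

EXPIRY_DAYS = 90

def password_expired(last_changed: datetime.date, today: datetime.date) -> bool:
    """True if the password is older than the configured expiry window."""
    return (today - last_changed).days >= EXPIRY_DAYS
```

The same comparison, run over all accounts, produces the evidence an inspector would ask for: that no active account has outlived the expiry policy.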
Outside Empower procedures are as important
– Training
– Daily Backup of data & Long Term Archiving
– SOPs for Acquisition, Review, Reporting etc.
Validation of the entire system, including software, to demonstrate
“fit for intended use” based on a clear URS is a key aspect
– Including a clear Change Control procedure
Component Names
RRT x.yz
Custom Fields
Instrument Methods
Method Sets
Plot Scaling
Processing Methods
Reporting Methods
Results
Sample Information
System Policies
Systems
User Groups
Users