
SCOPE Joint Action
Stakeholder Event
WP4 Tools for Measuring and Improving
Quality of Reports in National ADR
Databases
Adriana Andrić, Petar Mas, HALMED
20 - 21 March 2017
London
We are going to talk about…
• The importance of measuring and improving
quality of reports
• Steps for establishing quality assurance/internal
audit procedure
• Supplementary tools
What is quality?
Quality: degree to which a set of inherent
characteristics of an object fulfils requirements
• can be used with adjectives such as poor, good or
excellent
• “inherent”, as opposed to “assigned”, means existing
in the object
ISO 9000:2015
Why Quality Assurance
Matters
Only reports of good
quality can produce
reliable signals!
Aim of audit activities
[Diagram: nested layers – medicine risks, pharmacovigilance, audit activities]
Audit activities "should verify (…) the appropriateness and effectiveness of the implementation and operation of a pharmacovigilance system"*
* Guideline on good pharmacovigilance practices,
Module IV – Pharmacovigilance audits (Rev 1)
Where does quality
assurance fit?
[Diagram: quality assurance spans the whole reporting chain, from raising awareness through to signal detection]
• Reporter: HCP (physician, pharmacist, nurse, etc.); non-HCP (patient, lawyer, family member, etc.)
• Receiver: regional PhV centre; national PhV centre; MAH
• Database: local database; national ADR database; EudraVigilance; VigiBase
• Aims to provide an overview of tools and to encourage NCAs to use these tools in their databases
• Target audience: primarily PhV staff working at EU NCAs
• The document covers:
1. Introduction
2. SCOPE survey results
3. Procedure for monitoring and improving the quality of reports in national ADR databases
- MHRA case study
- Checklist
4. Supplementary tools
- EudraVigilance feedback report
- vigiGrade completeness score
- Clinical documentation tool
5. Conclusions
Quality auditing:
MHRA case study
• Quality auditing of ADR reports was introduced in
2007 coinciding with the launch of the Sentinel
database
• Data processing by ~18 staff
• 3 workflow steps:
– 1) validity
– 2) data entry
– 3) quality assurance
Courtesy of Charlotte Goldsmith, MHRA
Quality auditing:
MHRA case study
• 100 direct cases monthly
– All fatal reports
– Made up of patient and HCP reports
• Audited by senior staff
• Errors discussed at meeting
• Results and statistics calculated
• Feedback to teams
Courtesy of Charlotte Goldsmith, MHRA
MHRA case study:
Error categories
(categories ordered by error seriousness, A most serious)

Category A
• Influences: signal detection, published data, confidentiality
• Examples: incorrect/missing drug; incorrect/missing reaction; patient name in narrative

Category B
• Influences: signal assessment, information provided to MAHs in ICSRs
• Examples: incorrect/missing drug details (form, route, dose, etc.); incorrect reaction outcomes; incorrect reporter type

Category C
• Influences: administrative aspects, neatness of the ICSR record
• Examples: spelling and grammar mistakes; missing database flags; missing reporter title

Courtesy of Charlotte Goldsmith, MHRA
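
For NCAs that record audit findings electronically, a minimal sketch of how such an A/B/C taxonomy might be encoded is given below (Python). The class and field names are illustrative assumptions, not MHRA's actual schema.

```python
# A minimal sketch of an A/B/C error taxonomy for recording audit
# findings. Names are illustrative, not MHRA's actual schema.
from dataclasses import dataclass
from enum import Enum

class ErrorCategory(Enum):
    A = "Affects signal detection, published data or confidentiality"
    B = "Affects signal assessment or information provided to MAHs"
    C = "Administrative; affects neatness of the ICSR record"

@dataclass
class AuditFinding:
    case_id: str             # ICSR identifier
    category: ErrorCategory  # A, B or C
    description: str         # e.g. "Incorrect reporter type"

finding = AuditFinding("CASE-001", ErrorCategory.B, "Incorrect reporter type")
print(f"{finding.case_id}: category {finding.category.name}")
```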
MHRA case study:
Statistics
• Errors per person
• Error type (A, B, C)
• Error reason:
– Interpretation
– Omission
– Procedural
• Report type (electronic, paper)
[Chart: number of errors per month, January–April, broken down by A/B/C error type]
[Chart: distribution of errors between reports – number of reports per report type, showing total cases audited and Type A/B/C errors]
Courtesy of Charlotte Goldsmith, MHRA
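
As an illustration of the statistics step, the sketch below tallies audit findings by month, error type and error reason; the dictionary field names are assumptions, not MHRA's data model.

```python
# Tallying audit findings by month/category and by error reason,
# as in the monthly statistics above. Field names are illustrative.
from collections import Counter

findings = [
    {"month": "January",  "category": "B", "reason": "Omission"},
    {"month": "January",  "category": "C", "reason": "Procedural"},
    {"month": "February", "category": "A", "reason": "Interpretation"},
]

errors_by_month = Counter((f["month"], f["category"]) for f in findings)
errors_by_reason = Counter(f["reason"] for f in findings)

print(errors_by_month)   # Counter({('January', 'B'): 1, ...})
print(errors_by_reason)  # Counter({'Omission': 1, ...})
```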
MHRA case study:
Feedback
• Report circulated
• Recommendations provided
• A and B errors reclassified
• Guidance manual updated
Courtesy of Charlotte Goldsmith, MHRA
MHRA case study:
Improvements over time
Error reason
• Gaining a better understanding of why errors were occurring
• Indication of the root cause of the error
Types of reports
• Official roll-out of patient reporting in 2008
• Integration of Yellow Card reporting into clinical systems
Vigilance Competency Framework
• VCF was introduced in 2012
• Producing high-quality reports is one of the criteria needed to progress through the VCF
Courtesy of Charlotte Goldsmith, MHRA
MHRA case study:
Summary
• Quality auditing enables us to have confidence in our data
– Internal: signal assessment
– External: academic research and data provision
• Allows us to have a quantitative measure of quality to aid individual development
• Measures the error reason alongside report type
• Monthly report provides transparency across the whole team
• Resource saving: aids efficiency in ADR processing, also resulting in a reduced number of MAH queries
Courtesy of Charlotte Goldsmith, MHRA
Checklist: How to establish
the procedure?
• Aim/purpose, scope, responsibilities for and frequency of performing the review
• Criteria for selecting the sample of reports for review (see the sketch after this list)
• Guidance for reviewers and additional references to be used for reviewing the reports
• Error classification
• Allocation of reports and the timelines for review
• How the decision on classification of error and/or overall quality review should be made
• Types of recommendations for improvement
• Templates for recording the errors and the overall review
• How the report results should be fed back to assessors
• How corrections to the ICSRs and other recommended improvements are implemented and reviewed
• Describe the entire process in an SOP
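
For the sample-selection item above, one possible approach, in the spirit of the MHRA case study (all fatal reports plus a random top-up to a fixed monthly quota), is sketched below. The quota of 100 and the field names are assumptions taken from the case study, not a prescribed SCOPE algorithm.

```python
# Illustrative sample selection: include all fatal reports, then top up
# with a random draw until the monthly quota is reached.
import random

def select_audit_sample(reports, quota=100, seed=None):
    rng = random.Random(seed)
    fatal = [r for r in reports if r.get("fatal")]
    others = [r for r in reports if not r.get("fatal")]
    top_up = rng.sample(others, min(max(quota - len(fatal), 0), len(others)))
    return fatal + top_up

reports = [{"id": i, "fatal": i % 50 == 0} for i in range(400)]
sample = select_audit_sample(reports, quota=100, seed=42)
print(len(sample), "reports selected for review")  # 100
```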
Topics for discussion
• What should be the criteria for selecting the sample of reports, and who should be reviewing the reports?
• How should errors be classified?
• How could the audit results be used to improve the processing of reports?
• How can resources be managed effectively while ensuring the impact is minimised?
Supplementary tools
• EV Feedback Report
• vigiGrade Completeness Score
• Clinical Documentation Tool (ClinDoc)
EV Feedback Report
• 10 reports produced each month for all organisations which send
reports to EV
• No specific timeframes for producing the reports
• The case narratives and free text fields checked for any information that
should be provided in the structured fields – internal consistency and
correct coding
• Due attention given to correct coding of the medicinal product and
active substance names and adherence to expedited reporting
timelines
• A report provided summarising the review, its outcome and potential
findings including a list of suggested actions for improvement
• The organisation under review requested to review the report and
invited to send their comments back to EMA
• A corrected follow-up version of an ICSR should be submitted ASAP
vigiGrade Completeness
Score
• Accounts for ten dimensions within an ICSR, each with a different level of importance:
– Essential
– Important
– Supportive
• Can range from 0.07 to 1 and is calculated from several field scores
• Calculated for each ICSR, but usually given as an average for all ICSRs submitted from one country over time
• Can be compared with other time intervals or with other countries
• Not a direct indicator of the quality of data processing
• A sudden, unexpected drop in the completeness score is indicative of possible systematic errors
• Can be provided for different subgroups of ICSRs (e.g. ICSRs with company IDs and ICSRs with authority numbers)
• Expected to improve with the development of natural language processing techniques
vigiGrade Completeness
Score
Dimensions of the vigiGrade Completeness Score:
• Time-to-onset – time from treatment start to the suspected ADR
• Indication – indication for treatment with the drug
• Outcome – outcome of the suspected ADR in the patient
• Sex – patient sex
• Age – patient's age at onset of the suspected ADR
• Dose – dose of the drug(s)
• Country – country of origin
• Primary reporter – occupation of the person who reported the case (e.g. physician, pharmacist)
• Report type – type of report (e.g. spontaneous report, report from study, other)
• Free text information – comments
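
To make the scoring mechanics concrete, here is a sketch of a multiplicative completeness score over the ten dimensions above. The penalty values are assumptions chosen only so that the minimum matches the stated range (0.5 × 0.8⁹ ≈ 0.07); the authoritative weights are defined by the vigiGrade method itself, not here.

```python
# A sketch of a multiplicative completeness score: start at 1 and
# multiply in a penalty for each missing dimension. Penalty values are
# illustrative assumptions consistent with the stated 0.07-1 range.
PENALTIES = {
    "time_to_onset": 0.5,  # treated here as the most important dimension
    "indication": 0.8, "outcome": 0.8, "sex": 0.8, "age": 0.8,
    "dose": 0.8, "country": 0.8, "primary_reporter": 0.8,
    "report_type": 0.8, "free_text": 0.8,
}

def completeness_score(icsr: dict) -> float:
    score = 1.0
    for dimension, penalty in PENALTIES.items():
        if not icsr.get(dimension):  # dimension missing or empty
            score *= penalty
    return score

icsr = {"sex": "F", "age": 63, "outcome": "recovered"}
print(round(completeness_score(icsr), 2))  # 0.13: 7 dimensions missing
```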
ClinDoc
• Aims to assess the clinical documentation of ICSRs
• The unit of analysis is an ICSR, and assessment is performed case by case
• Four domains for assessment: ADR, chronology, suspected drug, patient characteristics
• The assessor indicates which subdomains are relevant for assessment of the specific ICSR, and afterwards indicates whether this information is present or not
• The score given to each domain is the proportion of information present in relation to the information deemed relevant
• The final score is based on the average of the percentages scored per domain and is categorised as poor, moderate or well
• Can be used to compare the clinical quality of reports between different reporting groups or different means/sources of reporting
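
A sketch of the scoring rule as described: the per-domain proportion of relevant information present is averaged into a final score, which is then banded. The band thresholds (50% and 80%) are illustrative assumptions, not values taken from the ClinDoc tool.

```python
# ClinDoc-style scoring: per-domain proportion of relevant items present,
# averaged into a final score and banded. Thresholds are assumptions.
def domain_score(relevant: list, present: set) -> float:
    """Proportion of relevant items actually documented in the ICSR."""
    return sum(item in present for item in relevant) / len(relevant)

def clindoc_score(domains: dict, present: set) -> tuple:
    scores = [domain_score(rel, present) for rel in domains.values() if rel]
    final = sum(scores) / len(scores)
    band = "poor" if final < 0.5 else "moderate" if final < 0.8 else "well"
    return final, band

domains = {  # relevant subdomains per domain (illustrative)
    "ADR": ["reaction term", "seriousness"],
    "chronology": ["onset date", "dechallenge"],
    "suspected drug": ["dose", "route"],
    "patient characteristics": ["age", "sex"],
}
present = {"reaction term", "seriousness", "onset date", "age", "sex"}
print(clindoc_score(domains, present))  # (0.625, 'moderate')
```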
Take home messages
• Measuring and improving quality of reports is
important
• It should be done through regular quality
assurance/internal audit procedure
• Supplementary tools can also be used
Only reports of good quality
can produce reliable signals!
Questions?
Contact:
[email protected]
[email protected]