A Study Among Key Stakeholders on the Use of Real World Data and Evidence in Clinical Research
February 15, 2017
Clinical Innovation Seminar
Mary Jo Lamberti, PhD
Senior Research Fellow
Tufts CSDD
About Tufts CSDD
• An independent, academic, non-profit research group at Tufts University
• Our mission is to develop strategic and actionable information to help stakeholders in the research-based drug development industry improve quality, efficiency, and performance
Primary Objectives
• To monitor and report on the development, regulation, and utilization of new drugs and biopharmaceuticals
• To explore the economic, legal, scientific, and public policy issues affecting pharmaceutical and biopharmaceutical innovation worldwide
• To raise the level of national and international debate on issues related to new drug and biopharmaceutical product development and regulation
• To hold forums and educational programs that bring together the perspectives of government, industry, academia, and the public health community
Real World Evidence Study
• Real world evidence is becoming increasingly important to drug development and patient safety and has recently been cited as a significant priority of the FDA.
• Currently, little to no data is available characterizing sponsors' and contract service providers' current and planned uses of real world evidence to support development and post-marketing safety applications.
• Tufts CSDD is gathering insights on how the industry is responding to this challenge, including:
  – Priority areas of investment
  – Impressions of the effectiveness of current and emerging technology solutions
  – The potential impact of incorporating new technologies involving data from wearables and other sensor devices, mobile health, social media, and big data
Real World Evidence Study Aims
• The aims of this study include an examination of the current and planned uses of:
  – Real world evidence and data
  – Operational approaches
  – Return on investment measurements
• Characterize current real world data uses, sources of data, and how data is being integrated
• Characterize planned and anticipated future sources and uses of data
• Capture value measures and the impact of real world data on development practices and performance
Interview Methodology
• Tufts CSDD conducted interviews with 19 respondents from
five stakeholder groups across 16 organizations
• Respondents held various roles and responsibilities
– The majority occupied director level positions and above
• Respondents represented the following stakeholders:
– Regulators
– EHR Vendors
– Payers
– Health Systems
– Patient Advocacy Groups
Definitions Provided
• Real World Evidence is derived from real-world data sources which
include:
– Electronic health records (EHRs) used within provider settings
– Laboratory information systems
– Pharmacy and radiology systems
– Administrative claims systems and registries
– Patient-generated data captured on home-based and wearable
monitoring devices
– Patient information-sharing networks and social media
• Real World Data is gathered from sources outside of randomized
controlled trials reflecting the actual experiences of patients during
routine patient care.
Interview Results Summary
• No common definition of real world evidence/data.
• Largest challenges reported were lack of standardization of data,
data privacy, and issues with using unstructured data.
• 3 out of 5 stakeholders participated in federated data efforts to
support real world evidence such as FDA’s Sentinel database.
– Regulators, Payers, and Health Systems
• 3 out of 5 stakeholders enabled deterministic linkage to other data sources (e.g., administrative claims data and EMR data)
– EHR Vendors, Payers, and Health Systems
• Most valuable insights were gathered from stakeholder use cases.
Use Cases: Regulators
• FDA Use Cases
  – Safety label changes
  – Manage persistence of care (treatment)
• EMA Use Cases
  – Effectiveness studies
  – Safety cases
• TGA Use Cases
  – Pre/post market
    • Linking prescribing and hospital data for more than 200,000 patients (anti-epileptic drugs in pregnancy)
  – To support product registration in alpha 1-antitrypsin deficiency (rare disease)
  – To develop a joint registry on hip replacement (medical devices)
  – Change in scheduling (classification of drug)
    • Xanax for panic disorder
  – Data from police reports to examine the effect of specific benzodiazepines (narcotic levels)
  – Data from coronial databases (reported deaths) to look at overdoses
  – Funding
Use Cases: EHR Vendors
• Integration with other types of data for predictive analytics:
– Managing populations
– Predicting potential outcomes
– Giving patients tools to help their well-being/treatment
decisions/options
• “Use of EHR data to support supplemental pragmatic studies
for drugs and devices”
• “For example, a specialty medicine, a pulmonary hypertension
agent, was approved two years ago and was getting covered.
Then three years later they spent money to find out the RWE
and costs as there is more competition. So it takes competition
for engagement.”
Use Cases: Payers
• Comparative effectiveness is the biggest use case.
• Cost data most critical for coverage decisions
– “Would be used in payer standpoint.”
– Focus on therapeutic areas where costs are highest
(biologics, oncology, rheumatology, cardiovascular,
diabetes)
Use Cases: Health Systems
• Use data to improve patterns of care at hospitals
• Opiates Example
– Assessment of two competing drug therapies for opiate
addiction for operational improvement purposes
Use Cases: Patient Advocacy Groups
• RWE was used to support drug approval
– FDA recently granted accelerated approval for first drug for
Duchenne muscular dystrophy based on draft guidance
and benefit/risk data from Parent Project Muscular
Dystrophy organization (contingent on conducting a clinical
trial)
– “Will hope it will lead to other approvals by regulatory
agencies.”
Survey Methodology
• Tufts CSDD is currently conducting a web survey across 75-100
global pharmaceutical, biotechnology companies and CROs
• Survey on use of real world evidence:
– Examines current data uses, sources of data, and how data
is being integrated.
– Benchmarks operational support of real world data use
(e.g., functions, personnel, roles and responsibilities, skill
sets)
– Measures of return on investment/performance areas
impacted by real world data use
Topic Areas - Survey
• Functions/roles/skills perceived as most critical
• Budgets – current and future – to support the function
• Use of technologies, standards/models
• Measures of return on investment/impact of real world data
• Use of real world evidence to support label changes
• Data sources to support NDAs
• Types of partnerships important to current and future uses of real world evidence
• Significant challenges perceived with use of real world evidence
Sample Questions - Survey
• Have you created a RWE function within your organization?
• Approximately how much have you budgeted per year to support this
function?
• Where do you foresee the highest potential return on investment of real
world evidence occurring in the next year? In 3 years?
• Do you have direct experience with regulatory agencies accepting
observational data for label changes? In what cases have they accepted
observational data?
• Have you ever used one of the following sources of data in support of a
New Drug Application to a regulatory authority? (Current vs.
Planned/anticipated use of data in three years)
– (claims data, EHR clinical data, social media data, etc.)
• Which of these is your organization routinely buying now? Planning in
three years?
• What are the most significant challenges perceived with the use of real
world evidence?
Tufts CSDD Data on mHealth
• Tufts CSDD – DIA conducted a study quantifying the
adoption and impact of patient centric initiatives
• Study examined key primary areas of impact:
  – Return on engagement
  – Mapping the landscape of patient centric initiatives
  – Management strategy and practices
  – Guidance and frameworks
• Additional slides available on DIA website
(http://www.diaglobal.org)
mHealth Definitions
mHealth technologies include wearable devices, apps for clinical data collection, smartphone applications, and text messaging
– i.e., mobile phones, patient monitoring devices, tablets, smartphone apps, and other wireless devices
– Case studies do not include telemedicine
Metrics Collected Focus on…

[Stacked bar chart: Number of Metrics / Qualitative Feedback Collected per theme; segments show Quantitative, Qualitative: Benefit, and Qualitative: Challenge]

– Patient Adherence to Device: 32
– Patient Reach: 24
– Patient Sentiment / Engagement: 22
– Other: 19
– Cost: 16
– Device Feedback: 11
– Differences in Clinical Outcomes: 11
– Data Quality: 9
– Case Study Not Generalizable: 7
– Real Time Disease Management: 7
– Disease Understanding: 6
– Patient Privacy: 5
– Study Timing: 4
– IRB Concerns: 2
– Internal Resistance: 2

"Other" includes:
– Study protocol concerns (patient blinding; others using device; objectivity of collected data)
– Legal barriers (e.g., data ownership)
– Regulatory barriers
– Site sentiment
– Screen failure rates
– Using popular apps and devices

Note: metric counts add up to 177 and not 163 because many qualitative feedback statements spanned multiple themes.
Reported Costs
• Cost varies by sophistication of app and wearable device (n=16)
  – $70–$250 per wearable device
  – ~$30K for bare-bones app development
  – ~$45–50K total cost (pilot studies)
• Four studies report cost savings; two report 50% cost savings
  – Case study one: no wearable used; using Apple ResearchKit
  – Case study two: reduction attributed to remote monitoring and mHealth
• Five studies report high costs due to:
  – Programming
  – IRB fees
  – Cost of handling data
Summary
• Two Tufts CSDD initiatives are assessing industry use of 'big data' and the integration of data sources
• There is no consistent definition of RWE; challenges reported include patient privacy and data standardization
• mHealth case studies were positive overall (stronger patient adherence, engagement, and reach); challenges reported include patient privacy, cost, and the ability to generalize findings across drug development
Thank You
Mary Jo Lamberti, PhD
[email protected]