Admin · April 19, 2022 · 4 min read

RWE Guidance: NICE’s RWE Framework

This month, the UK’s National Institute for Health and Care Excellence (NICE) released its draft Real-World Evidence (RWE) Framework for public consultation. This RWE guidance is part of NICE’s larger five-year strategic plan, which focuses on the use of RWE to close evidence gaps, improve NICE’s decision making, and help get innovative health technologies to patients. The RWE Framework “describes best practices for planning, conducting, and reporting RWE studies” and is one of the most thorough guidelines released by a major stakeholder to date. 

Here, we will outline the key takeaways from the guidance and discuss how NICE’s RWE Framework fits into the larger push toward comprehensive RWE guidance.

Key insights from NICE’s RWE Framework 

Separate best practices for descriptive studies vs. comparative effectiveness studies 

There are many use cases for RWE in health technology assessments (HTAs), and NICE specifically separates best practices for comparative effectiveness RWE studies from other quantitative RWE studies (e.g., descriptive studies of treatment utilization or the natural history of disease). While there is substantial overlap in best practices, comparative effectiveness RWE studies involve more nuanced considerations and likely require a higher level of rigor. NICE dedicates a substantial portion of the Framework to this topic.

NICE created a Data Suitability Assessment Tool (DataSAT) 

An important consideration for any RWE study is whether the data are fit for purpose (i.e., of sufficient quality and relevant to the research question). NICE expects RWE researchers to transparently justify the selection of the real-world data (RWD) source, either through the use of previously published frameworks, like the Structured Process to Identify Fit-for-Purpose Data (SPIFD), or through the use of NICE’s DataSAT. NICE recognizes that tools like SPIFD can help researchers both identify and justify the selection of a dataset; the DataSAT, however, is focused on justifying the selection of the dataset to the NICE committees that will evaluate the study. Similar to previous guidance from the FDA and EMA, the DataSAT focuses on data provenance, quality, and relevance.

Best Practices for Comparative Effectiveness RWE Studies 

NICE recommends that RWE study developers follow the target trial approach when designing an RWE study. Whenever possible, a new-user design should be used to limit selection bias, and both modified intention-to-treat (i.e., as-started) and as-treated causal effects should be estimated. Confounders should be identified through published literature and expert opinion, and should be controlled for using a variety of methods, with propensity score matching or multivariable regression preferred when many potential confounders exist. Sensitivity analyses should be used to examine the robustness of the results, varying multiple dimensions of the study (e.g., follow-up time, model specifications, use of covariates, and algorithms for defining outcomes and covariates). Transparency and reporting are also key components of NICE’s best practices, including pre-registration of the study protocol, justification of why an RWE study is needed, justification of the dataset selection, reporting of study design diagrams, and clear reporting of methods and results so end users can understand and potentially replicate the study. At Aetion, we agree that these best practices are critical to generating reliable RWE.
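To make the confounding-control step concrete, below is a minimal sketch of 1:1 nearest-neighbour propensity score matching on simulated data. It is purely illustrative: the confounders (age, comorbidity score), effect sizes, and the simple greedy matching routine are hypothetical assumptions for this example, not drawn from NICE’s Framework or any particular RWE platform.

```python
# Illustrative sketch only: 1:1 nearest-neighbour propensity score matching
# on simulated data. Variable names and effect sizes are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000

# Simulated baseline confounders (hypothetical: age and a comorbidity score).
age = rng.normal(60, 10, n)
comorbidity = rng.poisson(2, n)

# Treatment assignment depends on the confounders (confounding by indication).
logit = -4 + 0.05 * age + 0.3 * comorbidity
treated = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Outcome depends on treatment and confounders; the true treatment effect is -0.5.
outcome = 2 + 0.03 * age + 0.4 * comorbidity - 0.5 * treated + rng.normal(0, 1, n)

# Step 1: estimate propensity scores from the measured confounders.
X = np.column_stack([age, comorbidity])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: greedy 1:1 nearest-neighbour matching on the propensity score,
# without replacement (assumes there are more controls than treated patients).
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
available = set(control_idx)
matches = []
for i in treated_idx:
    candidates = np.array(sorted(available))
    j = candidates[np.argmin(np.abs(ps[candidates] - ps[i]))]
    matches.append((i, j))
    available.remove(j)

# Step 3: estimate the average treatment effect in the treated from matched pairs.
att = np.mean([outcome[i] - outcome[j] for i, j in matches])
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"Naive difference in means: {naive:.2f}")
print(f"Matched estimate (true effect -0.5): {att:.2f}")
```

In practice, matching quality would also be assessed (e.g., by checking covariate balance), and, consistent with NICE’s recommendations, sensitivity analyses would vary the model specification and covariate definitions to test the robustness of the estimate.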

How NICE’s RWE Framework fits into existing stakeholder recommendations and what gaps continue to exist 

NICE’s guidance comes on the heels of other recently released guidance from the MHRA, FDA, HAS, and EMA. Guidance from regulators has mostly focused on RWD quality, while HAS’s guidance is a high-level overview of conducting RWE studies specifically for drug and device re-evaluation.

NICE’s guidance is the most detailed RWE guidance to date and attempts to fill multiple gaps in prior recommendations for RWE generation. See below for how NICE’s RWE Framework matches up against our previously published pillars of comprehensive RWE guidance.

Pillar 1: Minimally sufficient justification/explanation of each RWE element (e.g., data quality, fit-for-purpose study design)
NICE’s RWE Framework: While the Framework does a good job of defining RWE study elements, minimal criteria for acceptance are not included. For example, NICE’s guidance on data quality lists completeness as essential but does not quantify what level of completeness is typically sufficient.

Pillar 2: A step-by-step process for researchers to follow to meet decision-maker expectations
NICE’s RWE Framework: In most cases, NICE outlines high-level, step-by-step considerations for study design, justifying fit-for-purpose data selection, analytical methods, and transparency and reporting. However, more detailed processes are necessary for comprehensive guidance.

Pillar 3: Checklists and tools that allow researchers to cross-check their work
NICE’s RWE Framework: The Framework does a good job of referencing previously published recommendations and tools that researchers should follow (e.g., STaRT-RWE, SPIFD, ROBINS-I), and NICE created the DataSAT for researchers to document their justification of data source selection. However, not all RWE study elements have accompanying checklists, so additional work is needed here.

NICE’s RWE Framework is a “living guidance” and will be updated as RWE recommendations and methodologies evolve over time, and as demonstration projects help NICE further refine its expectations for “high quality” RWE.

What NICE’s RWE Framework means for biopharma

With the flurry of activity around development of RWE guidance, NICE’s RWE Framework is a great resource for researchers developing RWE studies, especially comparative effectiveness studies. Similar to the FDA, NICE encourages open dialogue with manufacturers planning to submit RWE; we all learn by doing. Transparency and traceability are major themes throughout NICE’s guidance, so manufacturers should be prepared to provide as much documentation and justification of each step in the RWE study design and execution process as possible—for example, by using a validated and transparent RWE platform to run the analyses.
