Admin · Jul 29, 2021 · 8 min read

An update on efforts to standardize and democratize RWE: Q&A with Dr. Nirosha Mahendraratnam Lederer

As real-world evidence (RWE) adoption continues to ramp up across health care, researchers have increasingly recognized the importance of implementing standardized methods for evaluating and analyzing real-world data (RWD). By developing these standards and making them accessible to a wide range of researchers, the field can give regulators, payers, providers, and patients confidence that the RWE generated is high quality and credible enough to inform decision-making.

Nirosha Mahendraratnam Lederer, Ph.D., has recently been involved in several efforts to advance RWE standards, and to expand the knowledge base regarding when and how RWE can inform regulatory decisions. In an interview, she discussed three of these initiatives. 

Dr. Lederer is the Director of RWE Strategy at Aetion. She oversees Aetion’s engagements with the FDA and other government entities, advises clients on generating evidence for regulatory decision-making, and supports policy and standard-setting efforts. Prior to joining Aetion, Dr. Lederer led the RWE portfolio at the Duke-Margolis Center for Health Policy and served in the U.S. FDA’s Oncology Center of Excellence.

Responses have been edited for clarity and length. 

Q: In a recent paper you authored with collaborators from the Duke-Margolis Center for Health Policy, you explored the use of RWE to support regulatory decision-making on the effectiveness of medical products. What were your findings? 
A: While RWE has great potential to complement clinical trial findings with additional evidence on medical product effectiveness, we’re still working out the circumstances in which it can most effectively inform regulatory decision-making. In the paper we published in Clinical Pharmacology & Therapeutics, we aimed to identify and characterize instances in which RWE was included in submission packages to the FDA to support effectiveness. We identified 34 instances of effectiveness RWE submitted to the agency between 1954 and 2020; over half were for rare disease or pediatric populations, and one-fourth were for oncology indications. In addition, RWE was included in the product label 59 percent of the time. In instances where the submitted RWE studies didn’t contribute to the FDA’s decision, the agency cited data relevancy concerns and lack of pre-specification of study design and analysis plans as common issues. 

Overall, this work has shown us that in order to continue to realize the potential for RWE to support effectiveness decision-making—including in therapeutic areas beyond oncology and rare disease—we need standardized methods for evaluating and analyzing data. This will help ensure that the evidence submitted is regulatory-grade, especially as we continue to capture patient-relevant outcomes and include them in research. 

Q: You also recently discussed democratizing patient-centered outcomes research (PCOR) in a presentation to the National Academies of Sciences, Engineering, and Medicine (NASEM). What is PCOR? And what role does RWE play in that space?
A: While there are many different types of RWD—from claims and electronic health records to data collected through mobile health applications—not all of it is necessarily patient-centered, and therefore neither is all of the RWE generated from it. PCOR, according to NASEM, “provides decision makers with objective, scientific evidence on the effectiveness of treatments, services, and other interventions used in health care.” It is about identifying and analyzing the research questions that are most relevant to patients. For example, we all know, from a clinical perspective, that keeping your blood pressure under control is important to lowering the risk of cardiovascular events. But a patient who may not see a day-to-day impact of blood pressure on their lifestyle is more interested in learning how managing it affects their daily life: how difficult is it to take this medication, and how expensive is it? PCOR tries to address those questions by generating RWE to deliver insights that matter most to patients. 

Q: What role does data availability and access play in enabling or hindering progress in PCOR? 
A: There are two issues at play here. One is: are the data being collected? And if so, are they of sufficient quality? Data collection is a complex topic—it’s important to understand not only the data element of interest, but also how the data are collected, formatted, and stored. None of these factors are standardized, which can make it difficult to combine data and use them for research. 
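To make that concrete, here is a minimal, purely illustrative Python sketch (the field names and layouts are hypothetical, not drawn from any specific data source) of why unharmonized formats complicate research: the same blood pressure reading can arrive in different shapes from a claims feed and an EHR extract, and has to be mapped onto one shared schema before the records can be combined and analyzed.

```python
from datetime import datetime

# Illustrative only: the same clinical fact arrives in different shapes
# from different real-world data sources (all field names are hypothetical).
claims_record = {"patient": "A123", "svc_date": "2021-03-02", "sbp": "142"}
ehr_record = {"pid": "A123", "observed": "02/03/2021",
              "vital": {"name": "systolic_bp", "value": 142, "unit": "mmHg"}}

def to_common_schema(record):
    """Map a source-specific record onto one shared (hypothetical) schema."""
    if "svc_date" in record:  # claims-style layout: flat fields, values stored as strings
        return {
            "patient_id": record["patient"],
            "date": datetime.strptime(record["svc_date"], "%Y-%m-%d").date(),
            "systolic_bp_mmhg": int(record["sbp"]),
        }
    return {  # EHR-style layout: nested vitals, day-first dates
        "patient_id": record["pid"],
        "date": datetime.strptime(record["observed"], "%d/%m/%Y").date(),
        "systolic_bp_mmhg": record["vital"]["value"],
    }

harmonized = [to_common_schema(r) for r in (claims_record, ehr_record)]
print(harmonized)  # both records now share one format and can be pooled
```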

The second issue is that sometimes we collect usable data, but it’s not accessible for research. This can be due to a number of factors, including individuals’ and organizations’ hesitancy to share data, as well as HIPAA. Patient privacy is of utmost importance, but HIPAA regulations have not modernized with innovations in research and technology. Therefore, when working with health care data, you have to be very careful about the patient information you provide, because you don’t want deidentified data to be traceable back to an individual. As a result, you often have to make tradeoffs between patient characteristics like geographic granularity, setting of care, and race—three of the most important variables in addressing disparities in quality of care. We must continue to address this challenge in order to truly unlock the power of RWE in PCOR. 
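As a rough illustration of that tradeoff, here is a small Python sketch, with hypothetical fields and not representing any particular deidentification standard: generalizing quasi-identifiers such as ZIP code and age lowers re-identification risk, but the same step erases the geographic granularity that disparities research depends on.

```python
def coarsen_for_release(record):
    """Generalize quasi-identifiers before sharing data (illustrative, hypothetical fields)."""
    released = dict(record)
    released.pop("name", None)                          # drop direct identifiers entirely
    released["zip"] = record["zip"][:3] + "XX"          # keep only the 3-digit ZIP prefix
    released["age"] = f"{(record['age'] // 10) * 10}s"  # bucket exact age into decades
    return released

patient = {"name": "Jane Doe", "zip": "27701", "age": 47,
           "race": "Black", "setting": "outpatient"}
print(coarsen_for_release(patient))
# {'zip': '277XX', 'age': '40s', 'race': 'Black', 'setting': 'outpatient'}
# Re-identification risk drops, but so does the neighborhood-level detail
# needed to study geographic disparities in quality of care.
```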

Q: What challenges exist to making high quality research methods accessible to a wide range of researchers? How do you recommend we overcome them to democratize PCOR? 
A: Deciding whether RWE is fit for the purpose of answering a given research question depends on the stakeholder—the level of rigor required by a regulator might be different from that required by another decision maker. To generate credible evidence, we need access to the right data and the right analytic techniques. And while there is great opportunity for RWE, we’ve also seen that increased access to RWE research methods, without a similarly democratized approach to the high quality tools, platforms, and resources needed to conduct these studies, brings an increase in the number of poorly conducted studies. 

Science shouldn’t be proprietary. If we could create and employ a “toolbox” of centralized resources for generating high quality RWE studies, it could unlock the potential for more researchers not only to conduct this type of research, but also to do it correctly. Such a toolbox would include tools for both conducting and evaluating research, and would guide researchers as they identify their research question, collect data, design the study, analyze the data, and publish the results. 

Q: Why is transparency important to advancing real-world PCOR? How can researchers ensure their studies are transparent?
A: Transparency is the best way to build credibility and trust in RWE studies. Because there are so many potential pitfalls to avoid, researchers and reviewers need to have a full view into all choices made during an analysis: What data sources were used? How were the data transformed? How was the study designed? What methods were used to analyze the data? 

It’s also important to preregister all studies publicly—for example, on ClinicalTrials.gov or the RWE Evidence Registry—to remove any concerns about cherry-picking or manipulating results. This is included in the “publishing” piece of the PCOR toolbox, and it is where RWE platforms are especially helpful in enabling transparency through auto-documentation of study design choices.
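As a simple sketch of what such auto-documentation could look like (a hypothetical structure, not Aetion’s or any registry’s actual format), the key design choices can be captured as a machine-readable protocol record at the time the study is specified, so reviewers can compare what was pre-specified against what was ultimately run.

```python
import json
from datetime import date

# Hypothetical, minimal protocol record: capturing design choices up front
# makes them auditable alongside the preregistration entry.
protocol = {
    "registered_on": date.today().isoformat(),
    "data_sources": ["claims_db_v5", "ehr_extract_2021q2"],  # hypothetical source names
    "transformations": ["map_to_common_schema", "deduplicate_patients"],
    "design": {"type": "new-user cohort", "comparator": "active",
               "follow_up_days": 365},
    "analysis": {"model": "Cox proportional hazards",
                 "adjustment": "propensity score matching"},
}

print(json.dumps(protocol, indent=2))  # archive this record with the study registration
```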

Q: What does a world in which RWE research is democratized look like?
A: Science is an iterative process: ideas are generated and evaluated, then reproduced and replicated, both independently and collaboratively, to ensure that a result is real and accurate. By democratizing good RWE practices—and, in turn, making sure the RWE output is credible—we are helping drive towards higher value, more efficient, and better quality health care. This will move us toward a learning health care system in which all institutions have access to the proper methods and tools to generate useful evidence. 

One example of democratizing RWE has been the COVID-19 Evidence Accelerator, organized by the Reagan-Udall Foundation for the FDA and Friends of Cancer Research. Recognizing the urgent need for answers about COVID-19 and its interventions, this group of regulators, academics, data providers, biopharma organizations, technology companies, and others came together to share a common protocol, analyze the same research questions in parallel, and create a common space for scientific communication. The opportunity for different stakeholders to learn about data and methods they normally wouldn’t be exposed to enhances the field for everyone, and similar collaborations will be key to democratizing research post-COVID. 

Q: What other collaborative efforts exist that aim to create shared learning environments for RWE research? 
A: One initiative that recently launched in this space is the Digital Health Measurement Collaborative Community (DATAcc), hosted by the Digital Medicine Society. This large, multi-stakeholder group aims to standardize approaches to measuring health using digital technologies, and, in turn, realize the potential of digital health measurement tools in driving improvements in outcomes for patients. Members include the FDA as well as other government, health system, medical technology, biopharma, and policy organizations.

With so many different digital tools for health measurement—mobile apps, fitness trackers, and others—it can be difficult to standardize their data output for use in research. DATAcc will develop tools such as best practices, frameworks, and pilot projects to address questions about how to best leverage digital health technologies: How will they complement current methods of data collection and measurement? Could they help us measure outcomes we haven’t been able to capture before? Powered by the group’s collective expertise, DATAcc aims to enable stakeholders to analyze these data and use them to inform decision-making. Aetion is proud to be a founding member of this group, and I look forward to what we learn together.
