Pink Sheet is part of Pharma Intelligence UK Limited


PCORI Should Consider Treatment Variations, Financial Consequences In Research Prioritization – Wilensky

Executive Summary

Project HOPE’s Gail Wilensky proposed two key parameters for prioritizing research during a recent workshop hosted by the Patient-Centered Outcomes Research Institute: variations in treatment across the nation and financial consequences of treatment. Other speakers also offered advice on prioritization, while PCORI staff revealed challenges faced in the prioritization process.

As the Patient-Centered Outcomes Research Institute begins its process for prioritizing targeted research projects, Project HOPE Senior Fellow Gail Wilensky proposed two aspects that should be considered in prioritization: variations in treatment across the nation and the financial consequences of treatment choices.

Speaking Dec. 5 at the PCORI-sponsored “Methodology Workshop for Prioritizing Specific Research,” Wilensky suggested treatment variation as one of the prioritization criteria, for example how treatment of a particular condition varies by treatment setting or geography. Wilensky is a former administrator of the Health Care Financing Administration, now known as CMS.

“Those variations have consequences,” Wilensky said. “They can have consequences in terms of patients’ outcomes … and presumably they will have consequences also in terms of the economic implications of these alternatives.”

In discussing economic implications, Wilensky was not necessarily referring to cost-effectiveness research, but to understanding how economic conditions factor into variations in treatment across different settings. That understanding, she said, would help bring about smarter spending on health care.

“I will leave it for you to decide how to weigh those [criteria], but at the end of the day, if what you’re looking at can’t pass those general tests … then you need to think hard, even if it’s a popular area, why you are giving limited resources” to fund research, Wilensky said.

PCORI Finds Process Challenges

Wilensky’s comments followed a presentation by PCORI Senior Scientist Rachael Fleurence, who discussed the results of a pilot aimed to test a prioritization process that will be used as the institute more actively engages patients and other stakeholders to generate targeted research subjects (Also see "PCORI To Seek Patient/Stakeholder Input In Generating Future Research Topics" - Pink Sheet, 1 Oct, 2012.). The institute launched its own targeted research agenda at its November meeting, choosing three specific topics with two more yet to come, but those topics were chosen using an internal process separate from the pilot being discussed (Also see "PCORI Initiates A Targeted Approach To Research At November Meeting" - Pink Sheet, 3 Dec, 2012.).

The pilot process enlisted 35 volunteers spanning a range of stakeholder categories, including patients, caregivers and industry, with many self-identifying as wearing multiple stakeholder hats. They ultimately were asked to rank 10 proposed research topics. Participants were given a “topic guide” that offered more details about the 10 topics: indoor air pollution, obesity, back pain in the elderly, prostate cancer, falls in the elderly, antipsychotics in young adults, Clostridium difficile, breast cancer, coronary disease and effectiveness of treating multiple chronic conditions. Assembling that guide proved to be a challenge all on its own.

Fleurence described the process of assembling the topic briefs as a “humbling exercise. … A lot of our pilot members struggled with the level of consistency between the different topic briefs and how we operationalized the criteria to actual examples. This is no easy task. I think that’s definitely where a lot of our work and our efforts will have to focus on, is how to provide good information to a diverse group that’s going to be working on thinking about what research questions really matter to PCORI.”

Effectively engaging stakeholders, particularly patients, has been a consistent challenge throughout PCORI’s existence, including in the application review process (Also see "Patients Could Play A Role In PCORI Research Application Reviews – Exec Director Selby" - Pink Sheet, 7 May, 2012.).

The volunteers were split into two groups. The first used two separate software programs, Survey Gizmo and Expert Choice, to prioritize the 10 topics, while the second group used only the Survey Gizmo program. Survey Gizmo simply allowed users to rank the topics, while Expert Choice allowed for more in-depth rankings, taking into account weighting based on the criteria established in other PCORI funding review processes.
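The distinction between the two tools, a simple ranked list versus scores weighted by established review criteria, can be sketched roughly as follows. The criteria names, weights and topic scores here are purely hypothetical illustrations, not PCORI's actual values.

```python
# Hypothetical criteria-weighted prioritization, loosely analogous to the
# in-depth rankings described above. All names, weights and scores are
# illustrative assumptions, not real PCORI data.
CRITERIA_WEIGHTS = {
    "patient_centeredness": 0.40,
    "disease_burden": 0.35,
    "feasibility": 0.25,
}

topics = {
    "coronary disease": {"patient_centeredness": 8, "disease_burden": 9, "feasibility": 6},
    "indoor air pollution": {"patient_centeredness": 4, "disease_burden": 3, "feasibility": 5},
}

def weighted_score(scores):
    """Combine per-criterion scores into one number using the fixed weights."""
    return sum(CRITERIA_WEIGHTS[c] * v for c, v in scores.items())

# Highest weighted score first, i.e. the prioritized order.
ranked = sorted(topics, key=lambda t: weighted_score(topics[t]), reverse=True)
print(ranked)
```

Changing the weights changes the ordering, which is precisely why, as discussed later in the workshop, who sets the weights matters.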

The group tasked with prioritizing using two software programs came up with identical top and bottom topics through both programs: treatment of coronary artery disease and indoor air pollution interventions, respectively. Both programs also produced an identical rank for the second prioritized item, studying biomarkers for the prevention of breast cancer, as well as for the sixth topic, the efficacy of antipsychotics in adolescents and children.

For the group using only Survey Gizmo, other than indoor air pollution interventions coming in last on the prioritization list, none of the other rankings lined up with either list generated from the group using both software programs.
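One simple way to quantify how much two such priority lists agree is a rank-distance measure. The sketch below uses the Spearman footrule with made-up topic orderings; the lists shown are illustrative, not the pilot's actual results.

```python
# Spearman footrule distance between two rankings of the same topics:
# the sum of how far each topic moved between the two lists.
# A distance of 0 means the lists are in identical order.
def footrule(rank_a, rank_b):
    pos_b = {topic: i for i, topic in enumerate(rank_b)}
    return sum(abs(i - pos_b[t]) for i, t in enumerate(rank_a))

# Hypothetical orderings, not the pilot's real output.
list_1 = ["coronary disease", "breast cancer", "falls", "indoor air pollution"]
list_2 = ["coronary disease", "falls", "breast cancer", "indoor air pollution"]

print(footrule(list_1, list_2))  # only the middle two topics swapped
```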

The differences between the three generated lists echo concerns raised at the November PCORI board of governors meeting, where three targeted research projects were approved from a list generated through an internal prioritization process. Board member Harlan Krumholz, Yale University, suggested that if the process used to generate the three approved topics were run a second time, a different set of final topics would likely result.

Fleurence noted that PCORI’s next steps will be to take feedback from the pilot to tweak the process and prepare it for implementation by the institute’s advisory panels. Those panels will include patients and other stakeholders and will make recommendations to the board regarding which topics should be researched.

Other Prioritization Recommendations

The workshop also served as a forum for interested stakeholders to offer up additional recommendations to PCORI staff related to prioritization.

Bobby Dubois, chief scientific officer at the National Pharmaceutical Council, offered a series of recommendations. First, he noted that prioritization results will depend on how criteria used to make the selections are weighted, whether an emphasis should be placed on patient-centeredness, disease burden or other criteria that are used in the prioritization and selection process. He suggested the board play a role in setting those weights.

Second, Dubois suggested being very transparent throughout the prioritization process, so that someone who submits a research proposal knows how it is ranked as it moves through the process. If an application is good but does not make the cut, the submitter would then have the opportunity to tweak and resubmit the proposal to make it more relevant.

Third, he called for testing the reliability of the process, especially as membership of the advisory panels evolves and changes over time. Fourth, Dubois noted that the process of putting together 10 briefs for the pilot process took a few months, and said that the scale needed for prioritization of actual research topics will require an “industrial strength” process.

Dubois’ fifth point was that the cost of conducting research needs to be a consideration in prioritization, so that those engaged know how to weigh a high-profile topic that might require substantial resources, potentially taking away from other projects, against one that might not be as exciting but would cost less and potentially have a similar impact. Also on the topic of cost, he noted that time costs, i.e. the time it will take to complete a study, should be a prioritization consideration.

Finally, he suggested that PCORI engage in what he called “meta-prioritization,” such as making it clear what percentage of the institute’s resources are going to be dedicated to rare disease research versus conditions that affect a wider population and what percentage of research will be investigator driven versus stakeholder driven.

Veronica Goff, vice president at the National Business Group on Health, also made a time-related recommendation, but hers focused on prioritizing studies that can be produced quickly with “actionable results.”

She also encouraged PCORI not to undervalue the communication and dissemination of research, but to be “fearless” in how it considers potential barriers to implementation, suggesting that PCORI not worry about whether consideration of payment models will have an impact on research findings.

Wilensky made a similar point, noting that some studies, no matter how much buy-in and prioritization input is received from patients and stakeholders, will be controversial, especially if the results go against conventional wisdom.

“You need to be sure that you are able to say, ‘We have input from affected parties,’” she said, emphasizing that all affected parties, not just patients, need to lend their voice. “How you disseminate your findings will be very important.”

Sally Morton, chair of the Department of Biostatistics at the University of Pittsburgh Graduate School of Public Health, said that transparency is key to credibility and specifically highlighted the release of rating data so it is clear how the topics were rated. She also said that PCORI needs to understand and figure out how to address variability within the prioritized rankings set by reviewers.

To illustrate that point, she offered a hypothetical group of nine raters looking at two projects. For project A, all nine raters scored the project a 15, for an average score of 15. For project B, three raters scored the project zero, three scored it 15 and three scored it 30, also producing an average score of 15. The final scores may be identical, but how they were reached is very different, so that is something PCORI needs to account for as it refines the prioritization process.

Despite some challenges identified throughout the workshop, PCORI Executive Director Joe Selby expressed optimism that, based on the feedback received, “we are, in fact, going to be able to take those key concepts and put them into measures … that preserve value of those ideas, but in a clearly, more tenable package.” He predicted that the prioritization process would be a continually evolving one, but ultimately it would be the “most ambitious and most successful prioritization process on the planet because it really does go through the very hard work of considering topics from such a broad range.”
