Research Design Service: East Midlands

Latest News

Core outcome set - response to COVID-19

  Created on Friday, 01 May 2020. Posted in Priorities | Methods

Many clinical trials and systematic reviews are already underway, or will begin shortly, to strengthen the evidence base for the COVID-19 response. Core outcome sets (COS), which specify the outcomes that should be measured and reported in these studies, will ensure that this evidence base contains, as a minimum, the key information decision makers need about the effects of interventions.

COMET core outcome sets for trials and systematic reviews

COVID-19 Research and Platform (Complex Innovative Design) Trials

  Created on Thursday, 30 Apr 2020. Posted in Methods

A platform study allows several experimental treatments to be studied simultaneously. In addition, new experimental treatments can be integrated into an existing study rather than setting up a new one, which makes for more efficient research.
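The arm management behind a platform design can be sketched in a few lines (a hypothetical illustration only, not any specific trial's protocol; the class and method names are invented for this example):

```python
import random

class PlatformTrial:
    """Toy sketch of a platform trial's arm management: experimental arms
    share one control arm and can be opened or closed while the study
    continues recruiting (illustrative only)."""

    def __init__(self):
        self.arms = ["control"]  # the control arm stays open throughout

    def add_arm(self, name):
        """Open a new experimental arm within the existing study."""
        self.arms.append(name)

    def drop_arm(self, name):
        """Close an arm, e.g. after a pre-planned futility analysis."""
        self.arms.remove(name)

    def randomise(self, rng=random):
        """Allocate the next participant equally across the open arms."""
        return rng.choice(self.arms)

trial = PlatformTrial()
trial.add_arm("treatment_A")
trial.add_arm("treatment_B")   # added without setting up a new study
trial.drop_arm("treatment_A")  # dropped when the evidence warrants it
```

The point of the sketch is the efficiency gain the post describes: new treatments enter (and leave) one shared infrastructure rather than each requiring a fresh trial.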

Insights into adaptive studies and other complex and novel methods

  Created on Thursday, 30 Apr 2020. Posted in Methods

The Research and Development Forum's slides from their "Research but not as we know it: Managing novel methods in research" symposium (March 2020)

Questionnaire design review service (QuestReview™)

  Created on Tuesday, 19 Mar 2019. Posted in Methods

Health Outcomes Insights offers a questionnaire design review service (QuestReview™) that provides a detailed appraisal and report, based on best practice, on any weaknesses in a questionnaire’s design. This is an excellent service for researchers who want to design surveys but lack experience in doing so.

The IDEAL Collaboration

  Created on Monday, 30 Jan 2017. Posted in Toolkit/Database | Methods

The IDEAL Collaboration is an initiative to improve the quality of research in surgery.

  • The model describes the stages of innovation in surgery: Idea, Development, Exploration, Assessment, Long-term study
  • A set of recommendations, developed by experts in evidence-based surgery, is provided at each stage of the model, for example on improving methodology and on addressing the challenges of randomized controlled trials in surgery.
  • The IDEAL Framework describes the stages through which interventional therapy innovation normally passes, the characteristics of each stage and the study design types the collaboration recommends for each.

Important tip: recent HTA-commissioned calls have mentioned this framework!

HS&DR 15/140/01 Challenges, solutions and future directions in the evaluation of service innovations in health care and public health

  Created on Tuesday, 14 Jun 2016. Posted in Literature | Methods

This report is the output from a meeting in London in June 2015 that was funded by The Health Foundation, The Medical Research Council, the National Institute for Health Research and Universities UK. It brings together current approaches and methodologies for the evaluation of complex interventions in healthcare, and considers future challenges. The report covers 8 themes, including patient-reported outcome measures, healthcare equity, both qualitative and quantitative methods, and evidence synthesis.

Major funders collaborate to produce first in-depth guide on evaluating healthcare system innovations

  Created on Tuesday, 24 May 2016. Posted in Methods

Entitled ‘Challenges, solutions and future directions in the evaluation of service innovations in health care and public health’, the book is the result of a partnership between the MRC, the National Institute for Health Research (NIHR) and the Health Foundation, together with Universities UK and AcademyHealth.

“This book will enable researchers, healthcare providers and policy makers to discover the most up-to-date methodologies for evaluation research, to prioritise these methods and implement them in the most appropriate manner to transform the provision of healthcare and public health.”

New free online guide: A quick guide why not to use A+B designs

  Created on Tuesday, 10 May 2016. Posted in Methods

The Adaptive Designs Working Group of the MRC Network of Hubs for Trials Methodology Research has produced an online guide on why not to use A+B designs.

The document summarises the main arguments for why 3+3 and similar rule-based A+B designs are inappropriate for phase I dose escalation studies. These designs are still widely used, even though superior model-based designs such as the continual reassessment method (CRM) are available.
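The rule-based character of the 3+3 design the guide argues against can be seen directly in code (a simplified sketch for illustration; real protocols also specify de-escalation and how the maximum tolerated dose is declared):

```python
def three_plus_three(n_treated, n_dlt):
    """Decision rule of the classic 3+3 design at a single dose level
    (simplified, illustrative sketch).

    n_treated: patients treated at this dose so far (3 or 6)
    n_dlt:     dose-limiting toxicities (DLTs) observed among them
    """
    if n_treated == 3:
        if n_dlt == 0:
            return "escalate"   # 0/3 DLTs: move up to the next dose
        if n_dlt == 1:
            return "expand"     # 1/3 DLTs: treat 3 more at this dose
        return "stop"           # >=2/3 DLTs: dose is too toxic
    if n_treated == 6:
        if n_dlt <= 1:
            return "escalate"   # <=1/6 DLTs: move up to the next dose
        return "stop"           # >=2/6 DLTs: stop; MTD is the dose below
    raise ValueError("3+3 cohorts contain 3 or 6 patients per dose")

print(three_plus_three(3, 1))  # "expand"
```

Note that each decision depends only on the DLT count at the current dose, ignoring information from other dose levels; model-based designs such as the CRM instead update a dose–toxicity model with all accumulated data.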

Stepped Wedge Randomized Controlled Trials – read the series now!

  Created on Monday, 21 Sep 2015. Posted in Literature | Methods

Guest edited by Professor David Torgerson, this new series reviews previous stepped wedge studies, explores key issues of design and forms a comprehensive overview of the current state of knowledge of the design.

There is often a tension between policy makers wishing to implement a novel intervention and researchers wanting time to do a controlled evaluation. One trial design that can address this tension is the stepped wedge randomised trial. In a stepped wedge trial, all groups or clusters of participants eventually receive the intervention, but the implementation is staggered in a random fashion, allowing a robust evaluation to take place. Recently there has been an upsurge in interest in the design, which is not yet well understood.
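The staggered, randomised rollout can be sketched as follows (an illustrative toy with invented function names; real stepped wedge trials randomise according to a pre-specified protocol and analyse outcomes with models that account for time):

```python
import random

def stepped_wedge_schedule(clusters, n_steps, seed=None):
    """Randomly assign clusters to crossover steps (illustrative sketch).

    Clusters start in the control condition and cross over to the
    intervention at their allocated step; by the final step, every
    cluster receives the intervention."""
    rng = random.Random(seed)
    order = clusters[:]
    rng.shuffle(order)                     # randomise the crossover order
    per_step = -(-len(order) // n_steps)   # ceiling division
    return {c: 1 + i // per_step for i, c in enumerate(order)}

def condition(schedule, cluster, period):
    """Condition of a cluster during a given measurement period
    (period 0 is the all-control baseline)."""
    return "intervention" if period >= schedule[cluster] else "control"

schedule = stepped_wedge_schedule(["A", "B", "C", "D"], n_steps=4, seed=1)
```

The scheme captures the design's key property: every cluster contributes both control and intervention periods, while the random crossover order preserves a fair comparison.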

Commonly asked questions about how to approach evaluation of quality improvement in health care

  Created on Wednesday, 01 Apr 2015. Posted in Methods

Evaluation is an essential part of quality improvement and when done well it can help to solve problems, inform decision making and build knowledge. While evaluation comes in many shapes and sizes, its key purpose is to help us to develop a deeper understanding of how best to improve health care.

People involved in quality improvement often ask us about how to approach evaluation. Inspired by the most commonly asked questions, this guide is intended to assist those new to evaluation by suggesting methodological and practical considerations and providing resources to support further learning.

Ten questions covered in the guide

  1. Why do an evaluation?
  2. What are the different types of evaluation?
  3. What are the design considerations for an evaluation?
  4. What are we comparing our intervention with?
  5. How does evaluation differ from other forms of measurement?
  6. What practical issues should we consider?
  7. When should we start and finish an evaluation?
  8. How do we cope with changes in the intervention when the evaluation is underway?
  9. Should we do the evaluation ourselves or commission an external team?
  10. How do we communicate evaluation findings?

The Health Foundation guide is not prescriptive or step-by-step as people and organisations will have very diverse evaluation needs. Instead, it aims to stimulate your thinking and support your plans for an evaluation that delivers what you need.