We have over 15 systematic reviewers working on a wide range of projects contributing to excellence in health care research worldwide.
Our systematic reviewers work closely with the information specialists, statisticians and health economic modellers throughout the various Research Centres across the whole school.
We also work on reviews for Cochrane, industry, and charitable organisations.
If you require a quote for our services in systematic review, please email firstname.lastname@example.org or telephone +44 114 222 6125.
Our reviewers come from diverse educational backgrounds, including information resources, psychology, nursing, medicine and pure science. Their experience in systematic review encompasses a broad range of clinical areas and includes diagnostic and prognostic reviews. Please click on the staff members below to view their research interests, PhD topics, reviewing experience and publications.
- Andrew Booth
- Fiona Campbell
- Anna Cantrell
- Chris Carroll
- Katy Cooper
- Munira Essat
- Emma Hock
- Sue Harnan
- Jo Leaviss
- Marrissa Martyn-St James
- Abdullah Pandor
- Edith Poku
- Louise Preston
- Alison Scope
- Emma Simpson
- Katie Sworn
- Lesley Uttley
ScHARR Information Specialists have a wealth of experience in systematic literature searching and evidence retrieval for all types of knowledge synthesis, including systematic reviews.
Please click on the staff members to view their research interests, information retrieval experience and publications.
Research in systematic review methods
In addition to commissioned systematic review work to inform health care, our systematic reviewers are working on innovations in methodology for evidence syntheses. Papers discussing methodology of systematic reviews authored by staff can be found on their individual web pages.
Drawing on our extensive experience of working with partners, we offer a range of systematic review products for consultancy and the NHS that can be tailored to the needs of customers. We can focus on a range of study types, including RCTs, cohort studies, case-control studies, cross-sectional data, diagnostic studies, prognostic studies and qualitative studies. We can also provide mixed-methods syntheses of qualitative and quantitative evidence.
- Broad systematic review: Produced to the standards required for submission to reimbursement bodies such as NICE or to a peer-reviewed journal, and can provide information for strategic planning in industry. Employs the highest standard of review methodologies. Can include multiple interventions, broadly or narrowly defined populations and multiple outcomes. Analysis can include narrative synthesis, meta-analysis (quantitative) or meta-synthesis (qualitative). Estimated time: 6-12 months
- Narrow systematic review: Produced to the standards outlined above, but narrower in focus, e.g. only RCTs, one intervention and fewer outcomes. Estimated time: 1-6 months, depending on available evidence
- Systematic reviews to support network meta-analysis/mixed treatment comparisons: Produced to the standards required for submission to reimbursement bodies such as NICE. Where multiple interventions are to be compared, a wider and more comprehensive search and inclusion strategy is required. We recommend: a clearly defined population, use of RCTs only, clearly stated interventions, and a network limited to an agreed level and an agreed number of outcomes. Estimated time: 6-12 months
- Review of systematic reviews: Where multiple systematic reviews are already published, a quick approach to inform strategy or plan a submission to NICE might be to review the existing reviews. Estimated time: 1-3 months
- Update of systematic review: Produced to the standards required for submission to reimbursement bodies such as NICE. Can only be used to update existing high-quality reviews. The search strategy is replicated from the date of the last search. Data are extracted from any new studies and, if the data allow, a new meta-analysis or meta-synthesis is performed. Estimated time: 2-6 weeks
- Updating a review (quick): Employs a limited search strategy, or the addition of known new data, such as data provided by the funder. Estimated time: 2-4 weeks
- Rapid review: Where timescales are tight, a rapid review provides a methodologically transparent review, using limited search techniques, little or no quality assessment and limited data extraction. These reviews may miss some evidence. Estimated time: 1-3 months
- Review of model parameters (systematic): Where a model parameter is key, or data are difficult to identify, a systematic search provides a high-quality and accurate source of data. Includes quality assessment and data extraction. Estimated time: 1 week - 1 month
- Review of model parameters (quick): As above, but with limited searches, no quality assessment and a focus on one outcome. Transparent methods, but may miss some evidence. Estimated time: 1 week
- Research reports: Where no synthesis of evidence is required. Research reports employ good-quality searches, quality assessment and a summary of each study. Estimated time: 1-3 months
- Mapping reviews: To outline the available evidence in a broad healthcare topic or to identify research gaps. Include limited systematic searches, limited data extraction, and a variety of outputs, which can include numerical or narrative summaries of the available studies, their outcomes, study design, etc. Estimated time: 1-3 months
- Systematic searches: a systematic search strategy will be designed, translated across a number of databases, and search results will be provided. Grey literature searching can also be provided if required. Estimated time: 2-4 weeks.
- Search string validation: peer-review of existing search strings can be provided. Estimated time: 2 weeks.
If you require a quote for our services in systematic review, please email email@example.com or telephone +44 114 222 6125. Together with ScHARR's Information Knowledge Exchange, we will scope the review protocol and provide an estimated price. Turnaround from request to quote is usually 2 weeks. It will be helpful if you can provide us with as much of the following information as you have available.
Scoping information for the review question
- Title of review
- Population of interest
- Intervention(s) of interest
- Comparator(s) of interest
- Outcome(s) to be included
- Study types to be included
- Indication of type of review
- Timescale (start date and end date)
- Why do we need a decision tool for rapid review approaches?
Rapid reviews are increasingly used within health technology assessment (HTA) due to the need for timely evidence, combined with resource constraints. There are many published articles on rapid review methods, but little pragmatic guidance on which methods to select.
- What does the STARR Decision Tool cover?
The aim of the STARR Decision Tool is to outline high-level approaches to the rapid review process, rather than defining detailed review methods. The Decision Tool covers the following areas:
- Interaction with review commissioners - it is important that there is a common understanding between those undertaking and commissioning the rapid review, as to the purpose of the review, the questions to be answered, and the trade-off between time and scope.
- Understanding the evidence base - assessment of the volume and type of evidence available will help inform discussions with commissioners about the review scope, which rapid review methods are most appropriate and the feasibility of the review in the given timescales.
- Data extraction and synthesis methods - decisions around how much data should be extracted and presented, and in what format, will involve an assessment of the nature and complexity of the evidence base and requirements of review commissioners.
- Reporting of rapid review methods - clear reporting of the rapid review methods used, and the impact this may have on the findings, are essential elements of a rapid review.
- Development and validation of STARR
The STARR Decision Tool was initially developed based on experience in undertaking rapid reviews, together with existing research on rapid review methodology. STARR has been validated using the Delphi method to gain consensus among researchers who have either i) published rapid reviews or ii) undertaken methodological research in rapid reviewing. This was to ensure that the most important components are included in the tool.
This study has ethics approval from the ScHARR Ethics Committee (application number 017096) and is funded by the ScHARR Research Stimulation Fund.
- Contact information
For more information on the STARR Decision Tool or to give us feedback, please contact: firstname.lastname@example.org.
- Publications and outputs relating to STARR
- Best Oral Presentation Award at Health Technology Assessment International (HTAi) 2019: Martyn-St James M, Pandor A, Kaltenthaler E, Wong R, Cooper K, Dimairo M, O’Cathain A, Campbell F, Booth A. Developing A Decision Tool For Selecting Approaches For Rapid Reviews. HTAi Annual Meeting 2019, Cologne.
- Pandor A, Kaltenthaler E, Martyn-St James M, Wong R, Cooper K, Dimairo M, O’Cathain A, Campbell F, Booth A. Delphi consensus reached to produce a decision tool for SelecTing Approaches for Rapid Reviews (STARR), Journal of Clinical Epidemiology 2019; 114:22-29. doi: https://doi.org/10.1016/j.jclinepi.2019.06.005.
- Workshop at Health Technology Assessment International (HTAi) 2018: Kaltenthaler E, Cooper K, Martyn-St James M, Pandor A, Clowes M, Wong R. From Evidence to Action - Rapid Review Methods for HTA. HTAi Annual Meeting 2018, Vancouver.
- Case study from another research group who used draft STARR Decision Tool: Negro A, Camerlingo Maria Domenica, Maltoni S, Trimaglio F. Challenges Of Rapid Reviews In Health Technology Assessment: Case Study From An Italian Region. Abstract PP097. HTAi Annual Meeting 2017, Rome.
- Workshop at Health Technology Assessment International (HTAi) 2017: Kaltenthaler E, Cooper K, Martyn-St James M, Pandor A, Wong R. Selecting Rapid Review Methods for HTA. HTAi Annual Meeting 2017, Rome.
- Presentation at the Liverpool Evidence Synthesis Network (LivEN): Kaltenthaler E. Selecting rapid review methods for HTA. Liverpool Evidence Synthesis Network (LivEN) 2017.
- Kaltenthaler E, Cooper K, Pandor A, Martyn-St James M, Chatters R, Wong R. The use of rapid review methods in health technology assessments: 3 case studies. BMC Medical Research Methodology 2016; 16(1):26.