
Trinity College Dublin, The University of Dublin


Doing a Systematic Review

A concise guide to the steps involved in systematic, scoping and related reviews

Screening your results

Screening is evaluating the studies against your research question and inclusion/exclusion criteria. Initially, this is done by looking only at the titles and abstracts (“TI/AB screening”).

We recommend using Covidence to screen your results. Trinity has a site licence, which means any review with at least one Trinity member can use it.

Getting started with Covidence

If you haven't already, register for our institutional account in Covidence and create a blank review. Invite your co-reviewers to join the review.

Next, in your new review, you will need to import the results from your searches. You may be exporting directly from the databases, or you may have decided to save the results into one or more EndNote Libraries. In either case, importing your results is very similar.

If you have saved your results into EndNote Libraries:

  • Open your first EndNote Library.
  • Open the Export function (File > Export on Windows). Select XML as the file format, and don't tick Export Selected References (we want to export them all).

  • Save the XML file somewhere obvious, like your Desktop.
  • Go to Covidence, and click Import on the main page.
  • On the Import page, choose Screen as the destination, and tell Covidence which database you used to find these results. Browse to select your XML file, and then click Import.

  • Wait for the file to upload; Covidence will then return to the main screen automatically.
  • You can now repeat the process with your next EndNote Library.

Covidence will automatically compare and deduplicate results based on title, author, DOI and so on. In our tests, it caught over 98% of duplicates compared with a manually deduplicated version of the same review – a very high level of accuracy for an automated system, and likely better than most other methods.
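Covidence's matching logic is not public, but the idea of deduplicating on DOI or normalised title can be sketched as follows. This is an illustrative assumption about how such matching might work, not Covidence's actual algorithm; the record format (plain dicts with `title` and `doi` keys) is hypothetical.

```python
import re

def normalise(title: str) -> str:
    """Lower-case a title and strip punctuation/extra spaces for comparison."""
    return re.sub(r"[^a-z0-9 ]", "", title.lower()).strip()

def deduplicate(records):
    """Keep the first occurrence of each record, matching on DOI or normalised title."""
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        doi = (rec.get("doi") or "").lower()
        title = normalise(rec.get("title", ""))
        if (doi and doi in seen_dois) or (title and title in seen_titles):
            continue  # duplicate: skip it
        if doi:
            seen_dois.add(doi)
        if title:
            seen_titles.add(title)
        unique.append(rec)
    return unique
```

Real tools also use fuzzy matching on authors and years, which is why the occasional duplicate still slips through and is worth a manual spot-check.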

Title/abstract screening

Ideally, each study is screened by two people. If they make the same decision (Yes or Maybe – include; No – reject), the study moves into the Included or Excluded group. If the screeners make different decisions – “conflicts” – then a final consensus decision has to be made to move the study into the correct group. Having two screeners at each stage minimises the risk of bias, but is not always feasible (especially for students).
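The dual-screening rule above can be written out as a small decision table: Yes and Maybe both count as "include", agreement moves the study on, and disagreement flags a conflict for consensus. A minimal sketch (the labels are assumptions for illustration, not Covidence's API):

```python
def combine(decision_a: str, decision_b: str) -> str:
    """Combine two screeners' TI/AB decisions into an outcome."""
    include = {"yes", "maybe"}  # Maybe counts as include at this stage
    a_in = decision_a.lower() in include
    b_in = decision_b.lower() in include
    if a_in and b_in:
        return "included"
    if not a_in and not b_in:
        return "excluded"
    return "conflict"  # screeners disagree: resolve by consensus
```

For example, one "Yes" and one "No" yields a conflict, while "Yes" plus "Maybe" moves the study straight to Included.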

Before you start screening as a team, you must have a shared understanding of what counts as relevant; otherwise you will get constant conflicts. Some conflicts are inevitable, but normally only a few per cent of decisions.

Screening full text

Once you have screened at TI/AB, you will almost certainly have excluded the vast majority of articles as irrelevant. The next step for the ones you have deemed relevant is to find the full text and read the whole thing – or at least, read until you reach an exclusion reason.

Hence, you need to find the full texts of these, upload them into Covidence, and then screen at the full-text level. You come up with reasons why studies should be excluded (wrong age group, wrong methodology, etc.). These may be a bit more detailed than the TI/AB reasons – “cannot disaggregate ages”, for example, when you are looking for adolescent studies. These are input into Covidence's settings, so they can be chosen from a drop-down list. Once you hit an exclusion reason, you can stop reading and exclude the study.
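The "stop at the first exclusion reason" rule can be sketched as an ordered list of criteria checked in turn. The reasons and predicates below are hypothetical examples you would replace with your own review's criteria:

```python
def first_exclusion_reason(study, criteria):
    """criteria: ordered list of (reason, predicate) pairs; return the first that applies."""
    for reason, applies in criteria:
        if applies(study):
            return reason  # stop reading: exclude with this reason
    return None  # no reason applies: include at full text

# Hypothetical criteria for a review of adolescent studies
criteria = [
    ("wrong age group", lambda s: s.get("min_age", 0) >= 18),
    ("cannot disaggregate ages", lambda s: not s.get("ages_reported", True)),
]
```

Keeping the criteria in a fixed order also means every screener records the same reason for the same study, which makes the PRISMA flow diagram counts consistent.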

The ones you include move forward to the extraction stage. Again, this is best done with two screeners to minimise bias.

Finding PDFs using Google Scholar

If you go directly to Google Scholar off campus, it won't know you are affiliated with Trinity, so it can't give you access to Trinity's subscriptions. The easiest way to get the Trinity version of Google Scholar is to install our PDF helper app, Lean Library:


Extraction

The ones that have passed the full-text level and are still deemed relevant are now “extracted” by putting information from the studies into forms or a spreadsheet. There are normally two parts to this in a systematic review:

  1. Risk of bias (quality of article) section – evaluate the article against certain criteria.
  2. Data extraction section – the findings of the study. You extract the data from each study, using standardised headings so they can be compared.

Make sure you have chosen "Extraction 2.0" in Covidence's settings. This is an improved version of their extraction tool that allows you to add and subtract sections to their forms, even after extraction has started.

Risk-of-bias and quality appraisal

In a systematic review, once you have selected the relevant studies you must evaluate their quality, as not all will have a sufficiently rigorous methodology to avoid biased results. This is done using forms designed for particular types of study, such as RCTs or other systematic reviews, where you pick answers that, when totted up, give you an indication of the quality.

You then have a record of whether the study is of low, middling or high quality (or conversely, has a high, middling or low risk of bias).
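Totting up checklist answers into a low/middling/high rating might look like the sketch below. The Yes-counting and the 80%/50% thresholds are illustrative assumptions only; real appraisal tools (CASP, JBI, RoB 2, etc.) each have their own scoring rules, and some discourage numeric totals altogether.

```python
def quality_rating(answers):
    """answers: list of 'Yes'/'No' checklist responses; more Yes = lower risk of bias."""
    score = sum(1 for a in answers if a.lower() == "yes")
    proportion = score / len(answers)
    if proportion >= 0.8:      # assumed cut-off, adjust per checklist
        return "high quality (low risk of bias)"
    if proportion >= 0.5:
        return "middling quality (middling risk of bias)"
    return "low quality (high risk of bias)"
```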

You can use the checklists available via the links below to help construct a form in Covidence's extraction section, or use Google Forms or Excel if you prefer.

Additional resources on critical appraisal

See the following links for further help on doing a critical appraisal of the literature:

Data extraction

Data extraction is pulling together the information in the study - the population, methodology, outcomes etc. - so that it can be compared with the same fields of information in the other studies.

Covidence's extraction section has an inbuilt Data Extraction form with many useful fields already present, to which extra fields can be added and irrelevant ones removed; or, you can construct a form entirely from scratch. Alternatively, you can use Google Forms or Excel if you prefer, but this won't be so easy for multiple screeners.
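The point of standardised headings is that every study's row lines up under the same columns. A minimal sketch of that idea, writing extracted data to CSV text for comparison in a spreadsheet (the field names here are examples only; real forms depend on the review):

```python
import csv
import io

# Hypothetical standardised headings for the extraction form
FIELDS = ["study_id", "population", "methodology", "outcomes"]

def extraction_table(studies):
    """Write extracted data as CSV text: one row per study, fixed headings."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    for study in studies:
        writer.writerow(study)  # missing fields are left blank
    return buf.getvalue()
```

Fixing the column set up front is what makes later synthesis possible: any field one study lacks shows up as an explicit blank rather than silently disappearing.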

If using Covidence, you can continue to make changes to the template even after your team has started working with studies. All changes made to the template will be applied to all studies, regardless of what stage of work they are at. It may therefore be necessary to return to studies to do further extraction or assessment, or to re-do consensus. The system will advise you of what further work is required when you publish an updated template.

Other tools for managing evidence synthesis

In most cases we recommend using Covidence for screening your results. However, at a pinch you can do this within EndNote, or consider the following tools. Please note that we are unable to offer support for using these products.


Lists of references can be exported from EndNote into Excel for screening. This is quite straightforward if you have the Tab Delimited style installed.
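Once exported with a tab-delimited style, the file can be read back programmatically as well as in Excel. A hedged sketch using Python's standard `csv` module; the column names ("Title", "Author") depend entirely on which EndNote output style you used, so treat them as assumptions:

```python
import csv

def load_references(tsv_text):
    """Parse tab-delimited reference rows into dicts, one per reference."""
    reader = csv.DictReader(tsv_text.splitlines(), delimiter="\t")
    return list(reader)
```

From there you could add a "Decision" column per screener and merge the two files to find conflicts, mirroring the dual-screening workflow described earlier.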


Rayyan is a free online tool that can be used for screening and coding of studies. It will pre-populate inclusion and exclusion criteria, although these can be customised. You can also tag and filter to code and organize references.


Although primarily a tool for qualitative evidence synthesis from interviews, focus groups and open-ended survey questions, NVivo can be used for drawing out themes from literature gathered as part of a systematic review. A free trial version can be downloaded for two weeks, and while a few areas in College have it installed, there is unfortunately no site licence. See the relevant IT Services page.


A web-based software program for managing and analysing data in literature reviews. It is suitable for all types of systematic review (meta-analysis, framework synthesis, thematic synthesis, etc.) but also has features that would be useful in any literature review. It manages references, stores PDF files and supports qualitative and quantitative analyses such as meta-analysis and thematic synthesis. It also contains some ‘text mining’ technology for making systematic reviewing more efficient. Sign up for a month-long free trial; subscription fees apply after that.