Dr Gillian Leng, Professor David Haslam, and Mr Tahir Mahmood explain how the NHS Evidence Accreditation Scheme helps the user identify high-quality guidance

In the February issue of Guidelines in Practice, Dr Gillian Leng, Chief Operating Officer for NHS Evidence, explained how this new service provides high-quality clinical information.1 NHS Evidence was launched in April 2009 to provide staff with access to the full range of evidence relating to healthcare and social care.2 From the initial concept and throughout the development of NHS Evidence, we considered how best to provide access to information from a range of sources, and the key question in our minds was, ‘How do users know what is best quality?’

Early research made it clear that users did not want content filtered on their behalf, but they did want a way to differentiate the quality of search results. Our solution was to develop the NHS Evidence Accreditation Scheme: users would be able to see everything available through NHS Evidence and a mark (see Box 1) would highlight which organisations meet the highest standards of guidance production.

We knew that there was a demand for this kind of service—NHS Evidence is provided by the National Institute for Health and Care Excellence (NICE) and over the years there have been frequent requests from guidance producers for an accreditation, or kite-marking, scheme. Organisations want to raise the standards to which they produce guidance, so it is good news that the UK has a body that sets explicit benchmarks, allowing organisations to make an assessment of the processes they are using. Our conclusion is that these standards are being widely applied, even if organisations are not going as far as to apply for accreditation. For instance, we know from our discussions with some guidance producers that they have reviewed their processes in line with the accreditation criteria.

Box 1: NHS Evidence Accreditation Mark

Setting up the accreditation process

When we were developing our accreditation scheme in 2008/09, we started by looking at whether any other countries had developed a similar framework. We found a few mechanisms for assessing guideline quality and some significant international collaborative work describing best practice for developing guidance (see Box 2). However, there was nothing that fitted the bill for precisely what we required: namely, a formal, robust process of applying for accreditation and a process for assessing an application against transparent criteria leading to a mark for those organisations that meet them.

We recognised that the accreditation process would require many additional elements—an independent advisory committee to take the decision, supported by peer input, and an appeals mechanism. We were able to draw on the experience and expertise of NICE to develop an initial process, which was subject to public consultation. In November 2008, we ran a workshop with key guidance developers, including the Medical Royal Colleges, to test the demand for such a process. This was a critical step for NHS Evidence, as the fundamental success of the scheme would hinge on external interest and support for the process. Much to our relief, the workshop participants all expressed interest in the proposed process.

Box 2: Appraisal of Guidelines for Research and Evaluation (AGREE) Collaboration

AGREE is an international collaboration of researchers and policy makers who seek to improve the quality and effectiveness of clinical practice guidelines by establishing a shared framework for their development, reporting, and assessment.

The collaboration has the participation of a core of European countries—Denmark, Finland, France, Germany, Italy, the Netherlands, Spain, Switzerland, and the United Kingdom—as well as Canada, New Zealand, and the USA. It is coordinated by the Health Care Evaluation Unit at St George’s Hospital Medical School in London (www.agreecollaboration.org). Contact: Françoise Cluzeau (email: f.cluzeau@sghms.ac.uk).

How the accreditation process works

At the heart of the accreditation process is an independent advisory committee, chaired by Professor David Haslam. This committee was set up to have a wide membership including appropriate expertise in a range of relevant areas—today the committee has about 30 members including commissioners, people from lay groups, clinicians, experts in evidence synthesis and health economics, and social care staff, all of whom give their time freely. They meet every 6 weeks or so to make decisions about accreditation applications and to maintain an overview of NHS Evidence.

Broadly, the accreditation process involves a guidance producer making an initial submission, with supporting information covering six domains:

  • Scope and purpose
  • Stakeholder involvement
  • Rigour of development
  • Clarity and presentation
  • Applicability
  • Editorial independence.

The submission undergoes scrutiny from an internal accreditation team employed by NHS Evidence and at least two external advisers, usually clinicians regarded as international experts. Our assessment of the submission, together with the input from the external advisers, then goes back to the organisation, which has 20 working days to respond.

After receiving comments from the guidance producer, the independent advisory committee becomes involved. It considers all the evidence—the overview, the external advisers’ reports, and the guidance producer’s feedback—before making a draft accreditation decision at an advisory committee meeting, which is held in public. This decision is based on a discussion in which wide participation by the advisory committee is encouraged, followed by a secret ballot. The decision is then included in a draft accreditation report that is open for public consultation for 20 working days. In the last stage, the advisory committee uses feedback from the consultation to help with its final decision, which is notified to the guidance producer and then to the public via the NHS Evidence website (www.evidence.nhs.uk).

Making decisions

On balance, the applications received so far have been of high quality and, for the most part, guidance producers are performing to a high standard. However, in some cases, there is a lack of documentation, which is critical because the committee needs clear evidence that demonstrates the quality of the processes used by the guidance producer.

The advisory committee’s experience to date is that the accreditation decisions are not necessarily straightforward. There are various criteria within each domain, 25 in total, but meeting the criteria is not simply a yes/no decision—there can be areas of uncertainty where it is either not clear from the submission that the organisation meets the criteria or where the criteria may not apply in full to the organisation. As the accreditation process matures, the committee will develop further experience in determining which elements are critical, and where more flexibility can be applied. For example, the advisory committee has been very firm on making sure guidance producers deal adequately with issues around conflicts of interest and it has clear views on patient and service user involvement. However, there are cases in which there may be good reasons for not involving patients and users at every stage and we have been open to hearing why (see the case study, Box 3).

NHS Evidence has now established an accreditation decision database, which helps to build a picture of draft and final decisions, allowing patterns and trends to be identified, and assists in ensuring that the decision-making is consistent. It therefore allows us to build up a library of good practice. Early findings suggest that:

  • our decision-making is consistent across and within applications
  • the weakest domains in the submissions received so far have been around rigour of development, applicability, and how organisations take into account the implementation of their guidance
  • based on a very small dataset, uncertainty in some criteria has a strong influence on accreditation decisions. The three most important areas are:
    • describing the clinical question
    • detailing the patient group and target audience
    • describing the strengths, limitations, and uncertainties of the evidence used to develop the guidance.

The future for NHS Evidence Accreditation

Currently, many of the very largest guidance producers (e.g. NICE, Royal Colleges) have applied for and been awarded accreditation. It would be good to see smaller organisations coming forward too. We are also considering the development of accreditation schemes for other products used extensively in healthcare and social care decision making, for example, the clinical content of decision support systems and summaries of the effectiveness of new drugs and systematic reviews.

NHS Evidence has received international interest in its accreditation scheme, with several international guideline developers enquiring whether they can be accredited through the scheme. In theory we could accredit them, and in many ways it would be desirable to do so because we know that some international guidelines, such as those developed by the European Society of Cardiology, are used by clinicians in this country. There is no charge for UK guideline producers who go through the accreditation process because we see the service as an important part of quality assurance for the NHS. However, where there is international interest, it may be necessary to levy a charge for the accreditation service.

We would like to reach the point where any clinician or commissioner who needs guidance about what constitutes best practice in almost any area will find NHS Evidence the best place to go for accredited information.

Guidance producers who are interested in applying for NHS Evidence accreditation can attend a workshop in September 2010. For more information, contact margaret.derry@nice.org.uk

Box 3: Case study—Royal College of Obstetricians and Gynaecologists

The Royal College of Obstetricians and Gynaecologists (RCOG) was among the first organisations to apply for NHS Evidence Accreditation, says Mr Tahir Mahmood, Vice President, Standards, Royal College of Obstetricians and Gynaecologists

The RCOG has been producing guidance for 20 years and is committed to working to the very highest standards. Our guidelines are based on a review of published literature, and evidence is assessed using criteria similar to those used for developing SIGN guidelines.a The guideline developers follow our policy statementb and receive multi-professional input, for example from patient representatives, the Department of Health, and NICE. We currently have over 50 active clinical guidelines that deal with areas of uncertainty in everyday practice and they have proven to be very popular. In just one month in 2010, there were 115,000 hits on the guidelines section of the RCOG website and more than 20,000 people viewed our guideline on deep vein thrombosis prophylaxis.c

Having worked closely with Lord Darzi on his NHS next stage review report,d we were very keen to be among the first to apply for and win accreditation from NHS Evidence. We produce guidelines that patients read and that our members use. If our guidelines are to influence purchasing and commissioning to drive up quality, they must be used by commissioners. That is the only way we will really start to tackle variations in the care provided (e.g. Caesarean section rates).

The accreditation process was very challenging, perhaps because we made our submission so early in the process and there were no other organisations we could talk to about getting it right. When we started to measure our own performance, we found that we were lacking in some areas. For example, we have no health economist and lack multi-professional authorship. We have been able to discuss this with NHS Evidence, saying that as a charity we cannot afford to employ a health economist for each guideline and also arguing that our guidelines are about best evidence; it is up to commissioners to decide how to implement them.

In the past, the guidelines prioritisation process lacked input from patient groups; topics were selected by clinicians based on the perceived need for clinical care. However, our process now involves input from our consumer group (lay members, healthcare professionals) as well. Additionally, we do have a capacity issue, as we cannot produce an evidence-based guideline in every clinical area that may be deemed important to patients. The approach adopted by the RCOG is that we should have guidelines in those areas where uncertainty affects a large critical mass of patients. Once a topic has been agreed, a representative of our consumer forum is involved in the guideline development process, along with open public consultation via the internet. We are now looking at how we can include patients as co-authors for our guidelines.

Now that we have been awarded the NHS Evidence Accreditation Mark, I can say it was all worthwhile. We have proved our process to ourselves and to the wider world, and shown that it is robust. We know what we are doing meets the highest standards. It is good to do something well and to be recognised for that.

bRoyal College of Obstetricians and Gynaecologists. Development of RCOG green-top guidelines policies and processes. Clinical governance advice No. 1a. London: RCOG, 2006. Available at: www.rcog.org.uk/green-top-development
cRoyal College of Obstetricians and Gynaecologists. Reducing the risk of thrombosis and embolism during pregnancy and the puerperium. London: RCOG, 2009. Available at: www.rcog.org.uk/womens-health/clinical-guidance/reducing-risk-of-thrombosis-greentop37
dDepartment of Health. High quality care for all: NHS next stage review. London: DH, 2008. Available at: www.dh.gov.uk



  1. Leng G. NHS Evidence provides easy access to quality clinical information. Guidelines in Practice 2010; 13 (2): 43–44.
  2. NHS Evidence website. www.evidence.nhs.uk