When guidance starts arriving from NICE, PCGs will need to plan carefully how best to implement it locally, explain Dr Berry Beaumont and Nick Pahl


   

If the NHS goal of improving clinical effectiveness is to be achieved, it is essential that initiatives such as NICE succeed in providing guidance that can be put into practice.

The problem is that although there are guidelines for just about anything, they are unlikely to have the automatic trust of local clinicians jaded by the plethora of information with which they are bombarded daily.

The GP Commissioning Group in Camden and Islington, London, successfully applied for funds to establish a commissioning pilot which went live in April 1998.

 

One of the projects set up with this additional funding aimed to improve clinical effectiveness. A further aim was for primary and secondary care clinicians to work together to influence both management in, and referrals by, primary care, and to achieve uniformity of effective clinical practice across the three main hospital providers.

The plan was to develop and implement guidance on patient management in seven clinical areas, based on the available evidence for good practice. The areas chosen were the management of:

Menorrhagia
Serous otitis media (glue ear)
Recurrent tonsillitis
Varicose veins
Benign prostatic hypertrophy.

The place of magnetic resonance imaging (MRI) before knee arthroscopy, and prophylactic removal of wisdom teeth, were also considered.

Most of these clinical areas are the 'bread and butter' of everyday GP practice. They were also highlighted in the NHS Executive Clinical Effectiveness Indicators document,1 and reflected local concerns from GPs in the Commissioning Group.

 

Careful consideration was given to the process to be followed (Figure 1, below).2 It was considered important that the project should distance itself from controversial issues around rationing and waiting lists.

Instead, the focus was on improving clinical practice, emphasising that any guidance produced was only a 'guide' to appropriate management, and that advice should not be seen as prescriptive. Clinicians would remain free to refer to, and to treat in, secondary care according to their judgment of each patient's needs.

Figure 1: Process for improving clinical practice* [flow chart]

 

A meeting of GPs was convened locally, and a GP with a special interest was found to steer the work in each clinical area. Funds from the pilot allowed locum reimbursement for their time, which was crucial for GP involvement.

The project officer (NP) then carried out a search for best practice in each clinical area, together with local hospital activity analysis. Key issues were identified.

These were presented to each steering group, comprising the lead GP, consultants from local trusts, a prescribing adviser and a public health consultant from the health authority.

 

Each group discussed the evidence base for appropriate management of the condition under review, and decided which issues in primary and secondary care needed to be addressed.

Although this often revealed entrenched views among clinicians (sometimes contrary to strong research evidence) and differences in the clinical cultures of general practice and hospital care, it was generally viewed as a very useful opportunity to attempt consensus on best practice.

Discussion usually focused on the care pathway, set out as a GP algorithm flowchart, and produced some constructive agreement on how conditions should be managed in general practice, when referral to secondary care was appropriate, and what should then happen at hospital.

Consensus was easier to reach in some clinical areas than others, but an optimum care pathway was devised for most conditions. There was less disagreement when considering a clinical area that had recently been examined in detail by one of the medical Royal Colleges.

 

A variety of educational approaches was used to disseminate guidance on good practice, including presentations at local primary care forums and workshops open to all primary care professionals in the Camden and Islington area, where participants were encouraged to comment on the guidance.

The project was closely monitored throughout by the lead GP (BB) of the Commissioning Group Pilot.

A key achievement of the project was the high attendance at workshops – on average 40 people attended. Attendees were GPs (principals and non-principals) and practice nurses. Attendance was certainly boosted by PGEA accreditation and locum reimbursement for GPs, which were major 'carrots'.

Workshops focused on:

Primary care management
The finalisation of GP algorithms
Case histories
New evidence presented by local GPs and hospital specialists.

Every workshop was evaluated, with generally very positive feedback: participants felt the workshops were relevant to GP needs, with an appropriate level of involvement and clear, concise presentations and format.

 

The main problem for the project was that the Commissioning Pilot was originally intended to run for 2 years, but halfway through the first year it became clear that funding would cease in April 1999 with the advent of PCGs. This left no time to monitor and evaluate the degree to which the guidance was being implemented and was bringing about change in practice.

Discussions will be taking place with PCGs and the local MAAG to see whether this can be organised through the new clinical governance arrangements.

The truncated timescale of the project also caused other problems. GPs were overwhelmed by the number of meetings taking place and the amount of guidance being produced over a very short space of time (4 months). Ideally, each initiative should be phased over a longer period, say at 3-monthly intervals, to allow time for considered participation and assimilation.

Of those GPs who did not attend any meetings, the majority only learned of the guidance through passive dissemination which, it is well recognised, rarely results in behaviour change.

Practice visits would have been a useful way to increase a sense of ownership of the guidance, as would an electronic format for guidance, for loading onto practice computers. However, this was not feasible in the shortened project.

 

Lessons learned from this project should be useful to PCGs planning to take on a commissioning role and implement packed clinical governance agendas. The project's experience is in line with the conclusions of a recent publication by the NHS Centre for Reviews and Dissemination at the University of York:3

There should be careful preparation of any clinical effectiveness project, with ownership of the project by clinicians.
There should be multifaceted interventions targeting different barriers to change – barriers which may not be the same for particular localities or individuals.
There must be adequate resources and people with the skills to implement change.
There has to be a systematic approach to monitoring and evaluation of change, and sensitive management of the process.

Although many clinicians fear that guidance undermines their professional autonomy, others see it as an inevitable step in the new NHS.

Experience with our project suggests that PCGs may find it less difficult to agree on what is best clinical practice than to bring about change in practice in both primary and secondary care.

Commissioning needs to become a more powerful tool for influencing clinical practice. However these issues are approached, successful implementation of an effective clinical practice programme depends on adequate time and resources, for which competition is fierce in the current climate of reorganisation and change.

Key points
If clinical guidance is to be implemented effectively, the process must be carefully managed, with appropriate consultation with stakeholders
Dissemination of guidance must take into account the local situation and be multifaceted and adequately resourced
PCGs need to be aware of local obstacles to change and the need to monitor and evaluate change

 

  1. NHS Executive. Clinical Effectiveness Indicators: a consultation document. May 1998.
  2. Sawka CA, Goel V, Mahut CA et al. Development of a patient decision aid for choice of surgical treatment for breast cancer. Health Expectations 1998; 1: 23-36.
  3. NHS Centre for Reviews and Dissemination, University of York. Getting evidence into practice. Effective Health Care 1999; 5(1).

Guidelines in Practice, December 1999, Volume 2
© 1999 MGP Ltd