No-one can work on more than a very limited number of projects. This article discusses when you can safely consider the job of introducing a guideline done, so that you can free up time to move on to a new one.
Other articles in this series have made it clear that you cannot assume that simply distributing information will change behaviour, yet many attempts to introduce new guidelines stop at just this.
Even if your plan includes the follow-up work recommended, you cannot assume that new ways of working will be adopted. Only by monitoring progress through measurement, and seeing that new patterns of behaviour have been adopted and sustained, can you start to think of moving on.
Turning your attention elsewhere too early may allow the situation to drift back quite quickly to the way it was before you started: the natural tendency of people is to revert to previous, familiar behaviour patterns.
You may occasionally have to admit defeat, despite your best efforts. People may be reluctant to change in line with a guideline for reasons that you cannot influence. Good planning minimises the chance of this happening, but does not eliminate it entirely.
Assuming that you have had the success you deserve with your project, there are four questions to ask:
1. Is the guideline widely accepted?
In your contacts with colleagues during the course of implementation, you will develop an accurate sense of whether the guideline has been understood and accepted by them.
If it has not, any change the indicators may reveal is likely to be due to the Hawthorne effect: people change because they are being watched, and go back to their earlier ways of working once the watching stops.
2. Is the change maintained?
Through monitoring, you will be able to track what is happening to key components of the guideline. You want not only to see a change in these indicators, but also to track them for long enough to be satisfied that, having changed, they have stayed at the new level for some time.
Although you cannot be sure whether the Hawthorne effect is operating until you disband the implementation team, people usually get used to being monitored, and its influence diminishes over time; this probably accounts for the 'post-guideline blip'.
3. Are new systems and ways of working fully bedded in?
Has the organisation itself accepted the new ways of working? For example:
- Have all the old forms been disposed of?
- Has everyone been trained on the new computer system?
- Are new arrangements for collecting path specimens or returning results always used?
- Have the old notices been taken down?
Look for symptoms of a failure to adopt the new working practices: if you see telltale signs, beware!
4. Have other areas of care become high priority?
This final question may involve balancing the need to finish off the job properly with the need to start another one which is now high on the agenda.
This is a matter for judgment, but take care not to succumb to the temptation to take on a new project just because it is new.
If members of the implementation team are enthusiasts, be aware that they will tend to get bored with old things and curious about new ones, and make sure that this outlook does not colour their judgment too much.
You should also put in place a system for dealing with predictable changes in circumstances. For example, you may need to make sure that new trainees are familiarised with the guideline when they arrive in the practice; similar steps may be needed when a new receptionist or practice nurse arrives.
When you decide that the time has come to focus your attention on something new, you should consider withdrawing from the current project gradually.
You might stop monitoring data for a trial period, say 3 months, and then collect further data to see whether the position is still satisfactory. This will help to distinguish the Hawthorne effect from real commitment to the new way of working.
If you are in any doubt about whether the change is permanent, it may be useful to plan further 'spot checks' to alert you to the need to engage more intensively once again.
Once you have disbanded the implementation team and signed off the project, it may be difficult to get things going again.
Every guideline you adopt should have a review date. This may be based on a general standard (e.g. annual review) or on considerations specific to the guideline in question (e.g. when the results of an important study are expected to be published).
As well as these predetermined dates, you should consider reviewing a guideline when:
- New research is published
- New facilities become available (e.g. a clinical computer system, new near-patient-testing equipment etc.)
- New national policies are published
- Expectations of the public or clinicians change markedly.
Success in implementing a guideline requires a plan that includes:
- Induction of new staff
- A review that is either predetermined or prompted by unpredictable changes in research, facilities, policies or expectations.
Even when the implementation team has been disbanded or has moved on to other things, someone should have the responsibility of keeping an eye on matters that may be relevant to the guideline, and of advising the local lead when the guideline needs attention once again.