What we learned: ingredients for successful data-enabled, collaborative improvement in general practice

The case studies in this briefing represent very different approaches to data-driven improvement. In this section, we draw out some of the learning from these. Some important differences are underpinned by organisational structure (see Table 1), and these have implications for the practicalities of sharing data and the motivations for participating practices.

These approaches sit along a spectrum. At one end are the provider organisations (Modality and ATM), which face lower barriers in relation to data governance and the cooperation of constituent practices, yet have to fund the data extraction and analysis themselves. At the other end are the CEG and EQUIP, which work with independent practices on a voluntary basis and offer additional quality improvement support.

Table 1: Commonalities and differences across the case study organisations

|  | CEG | Modality | ATM | EQUIP |
| --- | --- | --- | --- | --- |
| Organisation type | Academic unit | GP provider | GP provider | Quality improvement group |
| Information governance: data-sharing agreements | Required | Not required | Not required | Required |
| Participation from practices | Voluntary | Mandatory | Mandatory | Voluntary |
| Bespoke support | Data analysts, practice facilitators | Data analyst, information governance manager | Business intelligence manager | Data analyst, quality improvement coaches |
| Mechanisms to enable improvement | Individual practice support from facilitators | Clinical meetings | Clinical meetings | Individual practice support from quality improvement coaches, shared Life QI platform |
| Funding | Research grants, CCG | Self-funded | Self-funded | Research grants, CCG |

1. Getting the basics right: accessing and analysing data

Collating, analysing and sharing data are fundamental to the models studied in this briefing.

All of the case study groups can run searches on all participating practices from a central location, rather than each practice running a search and then submitting the data. This has two benefits: it improves data consistency, because the analyst runs the same search across all practices, and it saves time for practices, which are spared the additional work of submitting data.

Running searches is more straightforward if all practices within an improvement group use the same electronic health record system. For example, the CEG encouraged all Tower Hamlets practices to use EMIS Web, which means a single search works across them all. Data-sharing approaches to improvement are still possible where practices use different electronic health record systems, but equivalent searches need to be created in each system.
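As a purely illustrative sketch (not the tooling used by any of the case study organisations), the pattern described above, in which one centrally defined search is run against every participating practice, might look something like this. The practice list, database files and field names are all hypothetical.

```python
# Illustrative sketch only: one centrally defined search executed against
# every participating practice's records, rather than each practice running
# its own version of the search and submitting results. All names are hypothetical.
import sqlite3

# Hypothetical per-practice extracts; in reality access would be via the EHR
# supplier's reporting tools, under appropriate information governance.
PRACTICE_DATABASES = {
    "practice_A": "practice_a_extract.db",
    "practice_B": "practice_b_extract.db",
}

# A single search definition, written once by the central analyst so that
# every practice is measured in exactly the same way.
SEARCH_SQL = """
    SELECT COUNT(*) AS eligible,
           SUM(CASE WHEN on_guideline_treatment = 1 THEN 1 ELSE 0 END) AS treated
    FROM hypertension_register
"""

def run_search_for_all_practices():
    """Run the same search against each practice and collate the results centrally."""
    results = {}
    for practice, db_path in PRACTICE_DATABASES.items():
        with sqlite3.connect(db_path) as conn:
            eligible, treated = conn.execute(SEARCH_SQL).fetchone()
            results[practice] = {"eligible": eligible, "treated": treated}
    return results

if __name__ == "__main__":
    for practice, counts in run_search_for_all_practices().items():
        print(practice, counts)
```

Where practices use different record systems, the same pattern applies, but an equivalent query would need to be maintained for each system.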

Adequate information governance to protect patient data is essential to data-sharing approaches to quality improvement. Some organisational forms may hold advantages in this area. As non-provider organisations, the CEG and EQUIP require data-sharing agreements with all participating practices, and both noted that relationship building had been a vital, if time-consuming, step in securing these agreements. Provider organisations, by contrast, can access data from all practices in their organisation within the remit of direct patient care. Modality and ATM therefore do not require data-sharing agreements to centrally analyse data from any of their constituent practices.

All of the case study organisations employ staff beyond the usual scope of the general practice workforce to support their improvement work. Data analysts devise, execute and interpret searches, while practice facilitators and quality improvement coaches support practices to make improvements. Several groups described challenges in recruiting to these roles. There is a known shortage of analytical capacity in UK health care services, and some organisations trained people specifically for these roles. For example, the CEG developed its own analysts and practice facilitators, and EQUIP designed an on-the-job programme to train its quality improvement coaches. All of the case study groups meet the cost of these posts themselves, though the source of their revenue varies. Modality and ATM generate their own income as GP providers, whereas the CEG and EQUIP receive income from competitive grants and from their local CCG.

All four organisations use dashboards to present data to practices, though they take different approaches to making data identifiable. At the start of its work, the CEG presented data back to practices so they could compare their performance against local peers, but each practice could only identify itself. This was important to begin with, although practices subsequently asked to be identifiable to each other so they could better share learning. The dashboards at ATM, EQUIP and the CEG all provide role-specific data, so that individuals see the information most relevant to them; for example, GPs see different information to practice managers. The aim is to avoid overwhelming users with data and to make the dashboards easier to engage with.
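As a simple illustration of the role-specific views and identifiability choices described above (a hedged sketch, not any organisation's actual dashboard), the same practice-level results can be filtered by role, and comparator practices can be masked where they are not yet identifiable to one another. All field names, roles and figures below are hypothetical.

```python
# Illustrative sketch only: shaping one set of practice-level results into
# role-specific dashboard views, with an option to mask practice identities.
# Field names, roles and figures are hypothetical, not the case studies' own.

RESULTS = [
    {"practice": "practice_A", "bp_controlled_pct": 78.2, "recalls_overdue": 41},
    {"practice": "practice_B", "bp_controlled_pct": 84.5, "recalls_overdue": 12},
]

# Each role sees only the fields most relevant to it, to avoid data overload.
ROLE_FIELDS = {
    "gp": ["bp_controlled_pct"],
    "practice_manager": ["recalls_overdue"],
}

def dashboard_view(role, viewer_practice, identifiable=False):
    """Return rows for one role; mask other practices' names unless identifiable."""
    view = []
    for row in RESULTS:
        name = row["practice"]
        if not identifiable and name != viewer_practice:
            name = "peer practice"  # comparators stay visible but unnamed
        view.append({"practice": name,
                     **{field: row[field] for field in ROLE_FIELDS[role]}})
    return view

if __name__ == "__main__":
    print(dashboard_view("gp", viewer_practice="practice_A"))
```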

At ATM and Modality, monthly clinical meetings are used as a forum to discuss the quality dashboards, identify unwarranted variation and agree actions. Where possible, these meetings include the same clinicians each month, so that continuity is maintained. Understanding when variation is warranted or unwarranted, even within relatively local areas, can be challenging, and these meetings help to build consensus among peers and to share ideas for improvement.

The CEG employs practice facilitators, each of whom has responsibility for a small group of practices in the same geographical locality. Facilitators act as links between data analysts and practices, helping teams to interpret and act on their data. Facilitators are then able to take learning from teams back to the CEG to continue a cycle of learning and improvement. At EQUIP, quality improvement coaches work alongside practices to provide support. Practices share reflections on progress and lessons learned with other participating practices via an online platform, so that the whole network learns together.

2. Choosing areas for data-driven improvement

The breadth of targets for improvement in these case studies shows that data sharing can inform improvement across a range of clinical outcomes and processes. The sites have made careful choices about where to focus their efforts.

Both the CEG and EQUIP make sure that chosen improvement goals have the support of the clinical communities in which they will be applied. For the CEG, this involves engaging clinicians at local meetings to identify areas for improvement that will have the support of the local GP community. The EQUIP team facilitates a ‘data wall’ exercise: all staff at a practice come together with quality improvement coaches around a wall, on which their practice performance data is presented. They then collaborate to identify opportunities for improvement and agree which to prioritise.

All sites use incentive alignment as a tool to increase GP engagement. In England this often means choosing areas for improvement that will result in additional payment for practices, either from the Quality and Outcomes Framework (QOF) or from locally enhanced service specifications (services not deemed essential under most GP contracts and for which extra payment is made if provided). For GP providers, aligning improvement efforts with financial incentives may be particularly appealing, since they benefit directly from improved outcomes. For non-provider organisations, incentive alignment can be an effective way to engage practices, as financial benefits may flow directly to individual practices.

Financial incentives complement professional motivations. Competition between practices can act as a motivator because it appeals to professional pride and the desire to do better for patients. At ATM, for example, creating a sense of competitiveness between practices is viewed as a desirable element of the organisational culture, whereas the CEG and EQUIP place more emphasis on collaborative approaches to learning.

The CEG and EQUIP both highlighted the importance of picking goals that are ambitious but achievable. For example, reducing teenage pregnancies is likely to be more complicated than improving adherence to existing blood pressure medication guidelines. The chosen areas of focus must also be evidence based and measurable – it can otherwise be challenging to directly attribute improvements in patient outcomes to improved care processes or adherence to clinical guidelines.
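To illustrate what 'measurable' can mean in practice, the following sketch computes a simple indicator of the kind such groups might track month on month: the share of patients on a hypertension register whose latest systolic reading is below a target. The threshold and readings are hypothetical and for illustration only.

```python
# Illustrative sketch only: a simple, measurable indicator of the kind these
# groups might track month on month. The threshold and readings are hypothetical.

TARGET_SYSTOLIC = 140  # hypothetical blood pressure target, for illustration

def bp_control_rate(latest_systolic_readings):
    """Percentage of patients whose latest systolic reading is below the target."""
    if not latest_systolic_readings:
        return 0.0
    controlled = sum(1 for reading in latest_systolic_readings
                     if reading < TARGET_SYSTOLIC)
    return 100.0 * controlled / len(latest_systolic_readings)

if __name__ == "__main__":
    print(bp_control_rate([132, 151, 128, 145, 138]))  # 60.0
```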
