What are learning health systems and why do they matter?


1.1. What are learning health systems?

A learning health system (LHS) is a way of describing a team, provider or group of providers in the health and care system that, working with a community of stakeholders, has developed the ability to learn from its own delivery of routine care and improve as a result. At its most fundamental, an LHS comprises a set of activities and assets that enable continuous learning and improvement of services.

LHSs are an important method for improving the quality, efficiency and effectiveness of health and care services. But there are many different types of LHS, ranging from the clinical microsystem level to the national level and everything in between. As a result, the term tends to be used in many different ways. We think that this can, on occasion, be a stumbling block to making progress in this field – with people sometimes talking at cross purposes or using terminology in overly restrictive ways.

For that reason, we begin this chapter by developing an analytical framework for understanding the key components of LHSs and for characterising the variety of types that exist.

Common aspects of learning health systems

While there are many different types of LHS, they all have some key factors in common, although important variation can emerge in relation to each of them.

  • The provision of services. At the core of an LHS sits a service provider or providers, and the desire to improve service provision and outcomes drives the LHS’s activity. An associated factor is that a key source of data from which an LHS learns is data generated from routine service provision (for example, clinical, operational or patient-reported data). This is one important way in which an LHS differs from many types of research or trials that rely purely on bespoke data collection. (Another important difference is the continuous, iterative nature of the learning that takes place within LHSs – discussed further below.) The presence of a provider means the improvement that LHSs undertake can be endogenous (driven from within) rather than simply exogenous (externally driven by factors such as policy or regulation).

One possible source of variation among LHSs is therefore the type and sector of the provider in question and the nature of services being delivered (for example, health care, public health, social care or community services). What is being improved will also affect who the ‘service users’ might be on any particular occasion (for example, patients, staff, carers or citizens).

  • The learning community and improvement ambition. An LHS is driven by a learning community that has been formed around a common ambition of improving services and outcomes. Not everyone in the learning community will necessarily be involved in every stage of the LHS (for example, patients might be involved in formulating ideas and trialling service changes but not in data analysis; data analysts might be involved in generating learning from data but not implementing service changes; and so on). However, all share and contribute to the common ambition in some way.

Another critical source of LHS variation is therefore the nature of the learning community, and its corresponding improvement ambition:

  • They could be place based, and if so they could exist over a range of geographies (for example, based around an individual provider organisation, a health economy or the whole NHS).
  • They could be condition based (for example, improving care for people with cystic fibrosis).
  • They could be thematic (for example, improving procurement, adopting a particular technology or reducing a particular type of medical error).
  • They could combine these properties (for example, improving asthma care for young people in London).
  • The learning and improvement cycle. LHSs effect change through iterative learning cycles based on generating and learning from data and formulating and testing service changes. Despite the huge diversity of LHSs, their learning and improvement cycles tend to be based on a common set of stages, illustrated in Figure 1. And at each stage of the cycle, the same types of activities tend to be going on: measuring outcomes, formulating hypotheses, analysing performance, designing improvements, implementing service changes and so on (see Figure 2). The cycle is then repeated, allowing each subsequent iteration to test and evaluate the service changes implemented in the previous iteration. These activities are the ‘bread and butter’ of LHSs. And it is by focusing on how to do these activities well that we can support the development of LHSs, whatever their form.

Source: The Health Foundation’s Insight & Analysis Unit

Importantly, if it is the presence of certain activities that constitutes an LHS, then it does not really matter whether the individuals involved think of it as an LHS or not. As we will see with some of the case studies explored in this report, something can be an LHS even if the practitioners involved do not use that terminology or did not set out explicitly to develop an LHS.

Figure 1 is not intended to be a systematic analysis of the learning and improvement cycle, but simply a useful way to think about the constituent activities that go on in an LHS. It broadly corresponds to Charles Friedman’s influential three-stage characterisation of the learning cycle: practice to data; data to knowledge; and knowledge to practice (indicated in the figure).

Source: The Health Foundation’s Insight & Analysis Unit

However, something the classic tripartite LHS schema does not always make explicit is how values and intentions get into the cycle – how questions get asked and priorities for improvement get determined. And this is a more fundamental issue than just the initial selection of an improvement ambition – the need to ask questions, gather and aggregate views, reconcile differences, and make judgements and course corrections is an intrinsic part of the learning process. For that reason, we think that it is important to consider problem definition and solution design as explicit parts of the learning and improvement cycle – signified in Figures 1 and 2 by the actions ‘ask questions’ and ‘identify and agree potential improvements’. Indeed, as we will discuss further later, it is these fundamentally human processes of convening, interacting, deliberating and making decisions that make the social infrastructure of LHSs just as important as their technical infrastructure.

Differences in the way each stage of the learning and improvement cycle happens can be another important source of variation between LHSs. Each stage can differ in scale and intensity (appropriate to the LHS’s goals) – that is, in the depth and granularity of the work going on, the number of people involved, the timescale, the cost and so on. For example, the data analysis involved in a learning and improvement cycle could range from reading a patient feedback form to a lengthy research study involving novel and complex analytics, while the service changes could range from putting up a sign in a waiting room to redesigning a whole health care pathway.

The scale and intensity of different stages of the learning and improvement cycle will therefore greatly affect what the LHS looks like in practice. In particular, the greater the scale or intensity required, the more likely it is that different individuals will lead different aspects of the cycle, or that these stages will happen in different environments. While at the smallest scale, the stages of an LHS could be executed by a single individual in a single environment (for example, a clinician using real-time feedback in a mobile app to optimise their practice), at the other end of the spectrum might be a learning and improvement cycle that involves lengthy research projects, specialist engagement exercises, the lab-based development of new technology or data tools, or the cross-organisational implementation of new clinical pathways.

In summary, variation in the three aspects of LHSs outlined here – the nature of services provided, the nature of the learning community and improvement ambition, and the scale and intensity of each stage of the learning and improvement cycle – makes many different types of LHS possible. Figure 3 illustrates some of this diversity, using a selection of the case studies presented in this report.

Source: The Health Foundation’s Insight & Analysis Unit

These six case studies are presented at the end of Chapter 1.

1.2. The assets underpinning learning health systems

Another approach to thinking about LHSs is to consider the capabilities and infrastructure that typically underpin them – in other words, going beyond thinking about the activities of the learning and improvement cycle to consider the assets on which these activities rely.

These assets, illustrated in Figure 4, include:

  • data, data analytics capability and research capability, including skilled researchers and analysts
  • technology, including data platforms, tools and systems, as well as an organisation’s wider digital maturity
  • learning communities and networks, along with mechanisms, spaces and support for convening, deliberating and sharing knowledge
  • improvement capability and culture, along with resources to enable the planning, design and implementation of improvements to care.

 

Source: The Health Foundation’s Insight & Analysis Unit

As the scale of an LHS increases, these assets will tend to become more visible and significant. They are not static, of course: they will develop and mature over time with successive iterations of the learning and improvement cycle and successive projects, and they need to be continually nurtured.

Importantly, a provider or learning community might exhibit some of the components of an LHS but not all of them. And it is worth emphasising that the activities, capabilities and infrastructure required for successful LHSs are valuable in their own right, even when they are not being used as part of a full LHS.

Focusing on these kinds of assets and how they can be successfully developed should therefore be an important aim in itself. In practice, when thinking about how to support the development and evolution of LHSs, rather than trying to create entire LHSs in a single step, it can be more effective to focus on developing one or two components first – for example, data analytics or improvement capability. In many cases, it will be about identifying what components are already in place and building on these. LHSs are not ‘all or nothing’ in this respect. Nevertheless, ultimately, it will be by bringing all these different components together and ensuring they are working in partnership that the LHS will become more than the sum of its parts.

Furthermore, there are other important elements of learning infrastructure in health and care that complement the concept of LHSs described here, and on which LHSs rely to connect and share learning (see Box 3 for further discussion).

Box 3: The relationship between learning health systems and other types of systematic learning in health and care

LHSs as described here are only one way in which systematic learning and improvement can happen in health and care. Other approaches exist that are not necessarily provider centred, nor involve learning and improvement cycles, nor even focus on a particular improvement ambition. Examples include the role of networks in spreading innovation – for example, Q, a community of thousands of people across the UK and Ireland collaborating to improve health and care – or peer learning through clinical communities. These kinds of wider approaches may use similar infrastructure and capabilities as LHSs (networks, data, improvement capability and so on), but they often go beyond the reach and focus of individual LHSs.

An exploration of these broader approaches to learning is beyond the scope of this report. But it is worth noting that they may play a very important role in helping to create an environment in which LHSs can flourish. While the endogenous aspect of LHSs (driving change from within) is one of their strengths, it does mean there is a need for linking mechanisms between them. Without this, there is a risk of siloed improvement efforts.

These broader approaches can be particularly important in bridging between different LHSs – helping them learn from each other and tackle unwarranted variation between different providers. They may also provide critical pieces of infrastructure on which LHSs rely, for example, platforms for research and consultation like Thiscovery (see case study 13 in the next chapter). So, LHSs as described here should not be considered in isolation from the health and care system’s wider learning infrastructure.

1.3. How learning health systems can help

Why should providers and learning communities be supported to adopt an LHS approach?

First, LHSs can support the delivery of externally led change through national programmes by providing the means to implement changes, test them and iteratively adapt and refine them, working with the very patients and staff the changes apply to. For example, in England, there are significant opportunities to develop LHS capabilities within integrated care systems as a way of supporting the successful adaptation and embedding of new pathways and models of care. In this guise, LHSs can be thought of as sophisticated ‘implementation mechanisms’ that can help deliver national priorities for service transformation.

But LHSs also matter because they are critical for enabling locally led service change. Many of the challenges in health and care cannot be solved by top-down change programmes alone. And while local systems face common problems and many solutions are generalisable, there are also problems and solutions that are specific to individual contexts, which those closest to them will need to diagnose and solve.

By creating the capability to learn and improve from within, LHSs can turn providers into ‘engines of innovation and improvement’, driving improvement in a way that is not reliant on national initiatives or investment. And over the long term, endogenous, continuous improvement has the potential to achieve more than a series of centrally led improvement initiatives and may in many cases be more effective in achieving sustainable quality and efficiency gains.

Another reason for the growing currency of LHSs is that they are an important way to capitalise on the increasing availability of data and analytical tools. In short, our ability to learn has never been greater. Crucially, developments in data and data analytics are giving providers themselves the power to gain insights about pressing challenges and how to solve them, reducing the need for external analytic capability. The increasing sophistication of technologies such as artificial intelligence also presents further significant opportunities for data-driven service improvement.

National policy has been slower to focus on LHSs than other drivers of health care improvement (such as targets, incentives and competition). However, interest in LHS approaches has grown over the past decade as the limitations of these more traditional, top-down policy levers have become apparent. In England, for example, the 2013 Berwick review set out a vision for the NHS to become ‘a system devoted to continual learning and improvement of patient care’ and this led to the government proposing that the NHS should become ‘the world’s largest learning organisation’. More recently, the 2021 Integration and Innovation White Paper contained the ambition of ‘accelerating [the system’s] ability to learn, adapt and improve’, while NHS England have argued that integrated care systems should become ‘consciously learning systems’. Meanwhile, Healthcare Improvement Scotland sees the development of ‘human learning systems’ as a key part of its approach to quality management. So there now appears to be acceptance at the national level that building a culture of continuous learning and improvement is essential for improving quality, efficiency and effectiveness.

For all these reasons, LHSs are an idea whose time has come. Not only are there increasing opportunities to deploy them and increasing interest from national policymakers, but we will not be able to adequately solve the huge challenges that services are facing unless we fully exploit the potential of providers to learn and improve. Box 4 explores where the greatest potential for further development might lie.

The chapter concludes with six case studies from the UK of LHSs of varying scale and focus. And to gain insights from other countries where LHS approaches are used, Box 5 gives three international examples.

Box 4: Development opportunities for learning health systems – what our respondents said

Given the diversity of possible types of LHS, we asked our survey respondents how developed they thought LHS approaches currently were across different levels of the health and care system – with an eye to understanding where the greatest potential for further development might lie.

The results, shown in Figure 5, suggest that LHSs have scope for development at all levels. But respondents felt that LHSs across multiple local providers, such as integrated care systems and provider collaboratives, were the least developed – perhaps unsurprisingly given the nascent state of these structures.


Case study 1: Flow Coaching Academy

The effective movement of patients between departments and organisations, along pathways of care, and around the wider health and care system, is an essential part of delivering safe, timely and high quality care. Poor flow is a major contributing factor to adverse outcomes, readmissions and higher mortality rates, whereas good flow can improve outcomes and waiting times, reduce duplication and improve efficiency.

Set up by Sheffield Teaching Hospitals NHS Foundation Trust in 2016, the Flow Coaching Academy empowers teams to improve flow through a common purpose, language and quality improvement method. Through open, inclusive and non-hierarchical safe spaces called ‘Big Rooms’, teams collaboratively identify, develop and test local solutions informed by qualitative and quantitative data. Critically, each Big Room starts with a patient story to make sure their voice is a central part of the process – whether through a clinician telling a patient story or inviting patients to Big Room meetings.

The Flow Coaching Academy’s Roadmap for Improvement and ‘5Vs Framework’ underpin each Big Room, giving teams a way to assess a pathway and develop a shared understanding. Flow coaches, who have undertaken a one-year action-learning programme to develop relational and technical skills, including data analysis and coaching, work with teams to identify and achieve sustainable improvements to care within and across pathways.

The Flow Coaching Academy has delivered training to nearly 400 coaches from NHS trusts, clinical networks, charitable organisations and health boards across the UK. It has developed a network of local academies and training is currently taking place in Northumbria, Lancashire & South Cumbria and Sheffield.

The Big Room approach shows the importance of creating a learning culture where teams have the tools, opportunity and time to collectively define and implement improvements to service delivery. Part of the success of the Big Room, which emphasises that improvement is ‘20% technical and 80% relational’, is the focus on building multidisciplinary teams and a shared understanding, empowering all members to contribute.

Coaches encourage teams to take ownership of both the learning and improvement process and the data that inform it, which helps to develop better relationships across professional disciplines, including between clinical staff and data analysts. This is essential in building understanding of service performance.

Case study 2: PINCER – a pharmacist-led intervention to reduce medication errors

Medication errors, such as mistakes with prescriptions, preparation or dispensing, occur more than 237 million times a year in England. While most are minor, in an estimated 1.8 million cases these medication errors could lead to serious patient harm.

Researchers at the Universities of Nottingham, Manchester and Edinburgh developed PINCER, a pharmacist-led intervention that combines clinical audit tools with quality improvement methodology and educational outreach. Through the PINCER online resource centre, pharmacists can download searches to run on GP clinical systems that identify patients at risk of medication error. Pharmacists can compare their data to other practices across the country and then work with practice teams to improve prescribing processes and reduce potential harms.

PINCER goes beyond simple feedback tools by providing training through action learning sets that give participants the resources and skills needed to drive improvement and embed changes into everyday practice. Pharmacists develop skills in using quality improvement tools and strategies, root cause analysis, action plan development and delivering feedback. The action learning sets model has also provided participants with informal peer networks to support continuing development. More than 2,350 health care professionals have now been trained to deliver PINCER, including 1,785 pharmacists.

Supported by the Health Foundation and all 15 Academic Health Science Networks, the initiative, led by PRIMIS at the University of Nottingham, has now been adopted by more than 40% of GP practices in England through a social franchising model. This model has given individual localities the flexibility to tailor the intervention to their needs, which has been critical to its successful scaling. As a result, more than 220,000 at-risk patients have been identified, and analysis of follow-up data from 1,677 practices has shown a reduction of 32% in the number of patients at risk of hazardous prescribing associated with gastrointestinal bleeding – a common cause of medication-related hospital admissions.

Case study 3: CFHealthHub – a digital learning health system

Around 15 million people in England are living with at least one long-term health condition, accounting for 70% of health and care expenditure. However, it is estimated that up to half of all medicines prescribed in the UK for long-term conditions are not taken as recommended, with poor adherence to medical treatment having both a personal and an economic impact.

For the 10,600 people living with cystic fibrosis in the UK, daily inhaled medicines are vital for staying healthy, but only around 36% of people with cystic fibrosis are fully adherent to their complex treatment plans. To address this challenge, CFDigiCare, a collaboration of clinicians and people with cystic fibrosis, developed CFHealthHub – a digital LHS that seeks to optimise cystic fibrosis outcomes by creating a national community of practice that uses data to improve care.

Through a digital platform co-designed with users, people with cystic fibrosis can track their progress by accessing real-time medication data captured by their Bluetooth-enabled nebuliser. The CFHealthHub mobile app shows these data through accessible, colour-coded graphs that give feedback on treatment-taking.

Users can also choose to share the data with their clinicians, who then work with them to support behaviour change, identify barriers to effective treatment and talk through evidence-based strategies for overcoming them. A 19-centre randomised controlled trial showed that CFHealthHub increased adherence to treatment while reducing the burden and effort of self-care.

As of May 2022, CFHealthHub is used by 60% of adult cystic fibrosis units in England, creating a learning community of clinicians, managers, pharmacists and allied health professionals who are sharing their learning and best practice. Using the real-time automatic data capture of CFHealthHub, this community of practice is able to understand how well the system is supporting people with cystic fibrosis.

This has led to, and provided the infrastructure for, several linked systems-optimisation workstreams. For example, the National Efficacy-Effectiveness Modulator Optimisation programme is carrying out a real-time health technology assessment of a new medication that can significantly improve lung function, drawing on data from 1,000 participants. CFHealthHub has also shown how data gathered by technologies can be built into care, without burdening the patient or clinician, and how they can be used both to support system learning and to improve personalised support for people with long-term health conditions.

Case study 4: Nightingale bedside learning coordinator

At the onset of the COVID-19 pandemic, NHS England set up NHS Nightingale Hospital London (the Nightingale) as a temporary facility in an east London convention centre to cope with the rising number of critical care patients in London. The novel setting, set up quickly with newly formed teams, meant the Nightingale had to manage significant risk and potential human error. In light of the knowledge gap surrounding COVID-19 and the need for rapid implementation of learning about the disease, the Nightingale was purposefully designed to be an LHS. The LHS approach enabled the Nightingale to rapidly make decisions backed by data and evidence to improve the delivery of care, quickly monitor the impact and make iterative adjustments where necessary.

A key component of the LHS involved gathering staff insights and ideas for improvement. The bedside learning coordinator role was developed as a mechanism to gather these insights rapidly and continuously without creating a burden for staff. The role involved:

  • capturing staff insights into what was and was not working
  • rapidly feeding these insights back to the leadership teams to review and agree how to respond
  • implementing agreed changes as appropriate
  • enabling robust feedback loops.

Staff from a diverse range of professional backgrounds (both clinical and non-clinical) undertook bedside learning coordinator shifts to give a broad set of perspectives and insights.

Insights captured were triaged into three areas: fix (requiring immediate action), improve (needing suggestions for better ways of doing things) and change (requiring substantial changes). Bedside learning coordinators worked with a central quality and learning team to triangulate insights from the bedside with other data sources, such as incident reports, team debriefs and performance dashboards as well as external evidence, to inform decision making and implement required actions as appropriate. In addition, they carried out focused audits to confirm that implemented changes were successful, satisfactory to staff and sustainable. One example of this in action was the identification of mouth care as an area for improvement. Following concerns that staff had raised, a speech and language therapist completed a bedside learning coordinator shift to give specialist insight and recommendations. These were then adopted as standard operating procedure.

The Nightingale demonstrates that health care staff often have rich insights and ideas for improvement (including how to improve patient care, workplace efficiency and staff wellbeing), which, when analysed alongside other routine data sources, can support improvement work. The bedside learning coordinator role provides a mechanism to gather these insights, as well as giving staff a greater voice and empowering them to deliver tangible improvements as part of a wider LHS.

Since the initial pilot, several other large NHS organisations have adopted the bedside learning coordinator concept.

Case study 5: The Clinical Effectiveness Group

Data sharing between organisations within health and social care is often disjointed, leading to limited sharing of learning and the duplication of work between providers. As general practice moves to a model where bigger operational units – such as integrated care systems, primary care networks and GP federations – support service users with more integrated care, there is an opportunity to pool learning to support continuous improvement as part of an LHS.

The Clinical Effectiveness Group (CEG) at Queen Mary University of London is an academically supported unit that facilitates data-enabled improvement for 272 north-east London GP practices, serving 2.2 million patients. It brings together people from a range of disciplines, including clinicians, data analysts, informaticians, academic researchers and a team of facilitators who conduct around 300 GP practice visits a year.

The CEG builds standardised data entry templates that GP practices use to enter high quality data into their patient records at the point of care. Its software tools, searches and on-screen prompts then turn these data into actionable insights within the practice, for example to stratify patients by risk or to support self-reported measurements such as home blood pressure recording.

The CEG’s cardiovascular disease tools have contributed to improvements in blood pressure control, statin use and the management of other associated long-term conditions in the local population, with pre-pandemic performance among the highest in England. For example, pharmacists in the London Borough of Redbridge, in collaboration with St Bartholomew’s Hospital, are using one such tool – APL-CVD (Active Patient Link tool for Cardiovascular Disease) – to improve statin prescribing and identify suitable patients for a new drug that reduces cholesterol.

CEG analysts also create interactive dashboards showing performance across the region, allowing for the identification of areas requiring improvement. The CEG uses this evidence to design and deliver local guidelines and quality improvement programmes to reduce unwarranted variation in outcomes. The most recent is a programme to reduce inequalities in childhood immunisations. The CEG has championed GP recording of self-reported ethnicity to support the identification and reduction of health inequalities. The dashboards similarly reflect information on a range of equity indicators that local authority public health teams use to inform local initiatives.

Evaluation of the CEG identified key contributors to its success, including:

  • access to high quality coded GP data from across north-east London
  • trust and credibility in its use of data
  • engagement with local clinicians and health care providers
  • the expertise of its clinical leads.

The CEG’s approach has put health data into practice to build an LHS in north-east London. The team is now working with other integrated care systems in London to support this approach in other areas as part of the London Health Data Strategy.

Case study 6: The Children & Young People’s Health Partnership

Research shows that some health systems are struggling to keep pace with the changing health needs of young people, and wide inequities in health remain among this group. With more than 180,000 children and young people living in the densely populated, diverse and fast-growing London boroughs of Lambeth and Southwark, an integrated approach to the delivery and coordination of care for this rapidly evolving population is essential.

The Children & Young People’s Health Partnership (CYPHP), hosted by Evelina London Children’s Hospital and part of King’s Health Partners, is a population-level LHS aiming to deliver better health for children and young people. Bringing together providers, commissioners, local authorities and universities, the CYPHP collaborates on taking care into the community, uncovering unmet need, and targeting care through technology and data-enabled early identification and intervention.

One of the CYPHP’s focuses is asthma. Data are gathered from several sources, including biopsychosocial data through a patient portal, routine clinical interaction data, data on wider determinants of health such as poverty and air quality, and data gathered through research that patients can opt into through the patient portal.

The team of clinicians, managers and researchers then translate these data into action, using them to make personalised decisions about patient care, support triage decisions and inform which packages of care might be needed. The data are also used to inform population health management approaches by identifying which geographic areas have the greatest need, enabling earlier intervention.

The data are also being used for wider quality improvement and research activity. For example, through local test beds, the CYPHP is using a pragmatic but rigorous approach to evaluation by running randomised controlled trials alongside service evaluations that can quickly provide evidence to clinicians to support continuous improvement.

The CYPHP has demonstrated impact through a service evaluation, which showed improved health outcomes and quality of care as well as reductions in emergency department contacts and admissions. Our interviews with the team highlighted that by understanding population need through data, it is possible to deliver care that is proportionate to need and that can therefore help reduce inequalities in access to care among children, alongside reduced associated costs.

Box 5: International examples of learning health systems

While the case studies and examples featured in this report are from the UK, it is worth noting that there are many instances of LHS approaches being taken in other countries. Below we highlight three examples.

ImproveCareNow, United States

ImproveCareNow was set up to improve care for children and adolescents with inflammatory bowel disease in the US, where there was significant variation in both diagnostic testing and treatment. By establishing a ‘collaborative learning network’, ImproveCareNow brought together a community of clinicians, researchers, patients and parents to use routine data for research and continuous improvement. All patients with inflammatory bowel disease within the network are now enrolled in a single patient registry, allowing ImproveCareNow to assess the impact of improvements on outcomes. Since its inception in 2007, ImproveCareNow has seen remission rates increase from 55% to 77%, and the network has grown to provide care for more than 17,000 patients across 30 US states.

Swedish Rheumatology Quality Registry, Sweden

By building on routine care data and their existing ‘outcomes dashboard’, clinicians in Sweden’s Gävle County were able to improve outcomes for patients with rheumatic diseases, going from having the worst outcomes in the country to the best. Patients were supported to use their data at home to understand when they might be out of remission. They were also able to use the information to control their care through an ‘open–tight’ model: when patients were doing well, care was ‘open’ – they could visit a clinician if they felt they needed to and were supported to self-care – but when they were doing less well, they were ‘tightly’ cared for until that level of care was no longer needed. This approach both decreased unnecessary attendances and encouraged self-management, and outcomes improved substantially as a result.

Johns Hopkins Medicine’s ‘learning and improving system’, United States

In recent years, Johns Hopkins Medicine has introduced an organisation-wide LHS approach that seeks to break down traditional silos between research and practice in order to improve patient outcomes and reduce waste. Bringing key leaders together around a clear and compelling, patient-centred purpose, its ‘learning and improving system for quality and safety’ is underpinned by a wide-ranging learning community, but with clear links to management for accountability. By aligning its goals and strengths across a broad range of stakeholders, the approach has seen significant improvements in a range of areas, including reductions in surgical-site infections of more than 50% and significant improvements in patient feedback.
